# French Healthcare NER Model (Educational Version)
This French Healthcare NER model is part of the healthcare NLP case study featured in the book Natural Language Processing on Oracle Cloud Infrastructure: Building Transformer-Based NLP Solutions Using Oracle AI and Hugging Face. Dive into Chapter 6 for a comprehensive, step-by-step guide on building this model.
## Purpose and Scope
This model is designed to complement Chapter 6 of the book, allowing readers to:
- Explore the Model: Experiment with the healthcare NLP model built in the book without needing to train one from scratch.
- Recreate the Case Study: Follow along with the step-by-step implementation detailed in Chapter 6.
- Understand Key Concepts: Learn how to fine-tune and apply a healthcare NER model to French-language data.
This pre-built model simplifies the learning process and enables hands-on practice directly aligned with the book's content.
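For readers who want to try the model before working through the chapter, here is a minimal inference sketch using the standard Transformers token-classification pipeline. The example sentence and printed fields are illustrative; because the repository is gated, loading it requires accepting the access conditions and authenticating with a Hugging Face token.

```python
# Minimal inference sketch (illustrative). The repository is gated, so you must
# accept its access conditions and be logged in (e.g. `huggingface-cli login`)
# before the weights can be downloaded.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="TypicaAI/HealthcareNER-Fr",
    aggregation_strategy="simple",  # merge subword tokens into whole entity spans
)

text = "Le patient présente une hypertension artérielle traitée par amlodipine."
for entity in ner(text):
    # Each prediction exposes the aggregated entity label, the surface text, and a score.
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```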
## Usage Restrictions
This is a demo model provided for educational purposes. It was trained on a limited dataset and is not intended for production use, clinical decision-making, or real-world medical applications.
- Educational and research purposes only
- Not licensed for commercial deployment
- Not for production use
- Not for medical decisions
## Book Reference
This model is built as described in Chapter 6 of the book Natural Language Processing on Oracle Cloud Infrastructure. The book covers the entire NLP solution lifecycle, including data preparation, model fine-tuning, deployment, and monitoring. Chapter 6 specifically focuses on:
- Fine-tuning a pretrained model from Hugging Face Hub for healthcare Named Entity Recognition (NER)
- Training the model using OCI's Data Science service and the Hugging Face Transformers library (a generic fine-tuning sketch follows this list)
- Performance evaluation and best practices for robust and cost-effective NLP models
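The fine-tuning workflow itself is developed step by step in the book. As a rough orientation only, the sketch below shows a generic Hugging Face Trainer setup for token classification starting from the base checkpoint listed in the model tree at the end of this card. The label schema, the one-sentence placeholder corpus, and the hyperparameters are assumptions made for illustration, not the book's actual code or data.

```python
# Generic token-classification fine-tuning sketch (illustrative, not the book's code).
from datasets import Dataset
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    DataCollatorForTokenClassification,
    Trainer,
    TrainingArguments,
)

base_checkpoint = "almanach/camembert-bio-base"
# Placeholder label schema; the book defines its own healthcare entity types.
labels = ["O", "B-PROBLEM", "I-PROBLEM", "B-TREATMENT", "I-TREATMENT"]

tokenizer = AutoTokenizer.from_pretrained(base_checkpoint)
model = AutoModelForTokenClassification.from_pretrained(
    base_checkpoint,
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id={label: i for i, label in enumerate(labels)},
)

# Tiny placeholder corpus: pre-split words with per-word label ids.
raw_train = Dataset.from_dict({
    "tokens": [["Le", "patient", "souffre", "d'", "hypertension", "."]],
    "ner_tags": [[0, 0, 0, 0, 1, 0]],
})

def tokenize_and_align_labels(example):
    # Re-tokenize into subwords and keep the label only on the first subword of each word.
    encoded = tokenizer(example["tokens"], is_split_into_words=True, truncation=True)
    aligned, previous = [], None
    for word_id in encoded.word_ids():
        if word_id is None or word_id == previous:
            aligned.append(-100)  # special tokens and trailing subwords are ignored by the loss
        else:
            aligned.append(example["ner_tags"][word_id])
        previous = word_id
    encoded["labels"] = aligned
    return encoded

train_dataset = raw_train.map(tokenize_and_align_labels, remove_columns=raw_train.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="healthcare-ner-fr",
        num_train_epochs=3,
        per_device_train_batch_size=16,
        learning_rate=2e-5,
    ),
    train_dataset=train_dataset,
    data_collator=DataCollatorForTokenClassification(tokenizer),
    tokenizer=tokenizer,
)
trainer.train()
```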
For more details, you can explore the book and Chapter 6 on the following platforms:
- Full Book on Springer: View Here
- Chapter 6 on Springer: Read Chapter 6
- Amazon: Learn More
## Citation
If you use this model, please cite the following:
    @Inbook{Assoudi2024,
      author="Assoudi, Hicham",
      title="Model Fine-Tuning",
      bookTitle="Natural Language Processing on Oracle Cloud Infrastructure: Building Transformer-Based NLP Solutions Using Oracle AI and Hugging Face",
      year="2024",
      publisher="Apress",
      address="Berkeley, CA",
      pages="249--319",
      abstract="This chapter focuses on the process of fine-tuning a pretrained model for healthcare Named Entity Recognition (NER). This chapter provides an in-depth exploration of training the healthcare NER model using OCI's Data Science platform and Hugging Face tools. It covers the fine-tuning process, performance evaluation, and best practices that contribute to creating robust and cost-effective NLP models.",
      isbn="979-8-8688-1073-2",
      doi="10.1007/979-8-8688-1073-2_6",
      url="https://doi.org/10.1007/979-8-8688-1073-2_6"
    }
## Connect and Contact
Stay updated on my latest models and projects:
Follow me on Hugging Face
For inquiries or professional communication, feel free to reach out:
Email: [email protected]
## Model tree for TypicaAI/HealthcareNER-Fr
- Base model: almanach/camembert-bio-base