---
title: HealthcareNER Fr
emoji: 🩺
colorFrom: blue
colorTo: pink
sdk: gradio
sdk_version: 5.9.1
app_file: app.py
pinned: false
license: apache-2.0
short_description: French Healthcare NER Demo from the Book NLP on OCI
---

# French Healthcare NER Model (Educational Version)

This Hugging Face Space provides a live demonstration of the model developed as part of the healthcare NLP case study featured throughout my book Natural Language Processing on Oracle Cloud Infrastructure: Building Transformer-Based NLP Solutions Using Oracle AI and Hugging Face. Dive into Chapter 6 for a comprehensive, step-by-step guide on building this model.

## 📚 Purpose and Scope

This Hugging Face Space showcases the model built step-by-step in Chapters 4 to 7 of the book, covering everything from healthcare dataset creation to fine-tuning a transformer-based NER model. It provides a practical example of how NLP can be applied in healthcare to extract insights from French medical texts.

### Why Explore This Demo?

  • Experiment with the Model: Interact with the healthcare NLP model from the book without having to train one from scratch (see the minimal inference sketch after this list).
  • Discover What You Can Build: Get a hands-on preview of the process detailed in the book, from healthcare dataset preparation to fine-tuning a pre-trained transformer-based NER model.
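
To give a concrete feel for what the demo wraps, here is a minimal sketch of a Gradio `app.py` serving a token-classification pipeline. It is a sketch only: the checkpoint ID `your-username/healthcare-ner-fr` is a placeholder, not the model actually used by this Space, and the real app may differ in its interface and post-processing.

```python
# Minimal Gradio NER demo sketch (app.py). The model ID below is a
# placeholder, NOT the checkpoint used by this Space.
import gradio as gr
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="your-username/healthcare-ner-fr",  # placeholder checkpoint ID
    aggregation_strategy="simple",            # merge sub-tokens into whole entities
)

def extract_entities(text: str):
    # Return a list of {text, label, score} dicts for display.
    return [
        {
            "text": ent["word"],
            "label": ent["entity_group"],
            "score": round(float(ent["score"]), 3),
        }
        for ent in ner(text)
    ]

demo = gr.Interface(
    fn=extract_entities,
    inputs=gr.Textbox(lines=4, label="French medical text"),
    outputs=gr.JSON(label="Extracted entities"),
    title="French Healthcare NER (educational demo)",
)

if __name__ == "__main__":
    demo.launch()
```

The Space metadata above (`sdk: gradio`, `app_file: app.py`) corresponds to this kind of entry point.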

## ⚠️ Usage Restrictions

This demo is provided for educational purposes only. The underlying model was trained on a limited dataset and is not intended for production use, clinical decision-making, or any real-world medical application.

  • Educational and research purposes only
  • Not licensed for commercial deployment
  • Not for production use
  • Not for medical decisions

## 🎓 Book Reference

This model was built as described in Chapter 6 of the book Natural Language Processing on Oracle Cloud Infrastructure. The book covers the entire NLP solution lifecycle, including data preparation, model fine-tuning, deployment, and monitoring. Chapter 6 specifically focuses on:

  • Fine-tuning a pretrained model from the Hugging Face Hub for healthcare Named Entity Recognition (NER); a generic sketch of this workflow follows this list
  • Training the model using OCI’s Data Science service and the Hugging Face Transformers library
  • Performance evaluation and best practices for building robust and cost-effective NLP models
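
As a rough orientation before diving into the chapter, the sketch below shows a generic fine-tuning loop for token classification with the Hugging Face `Trainer`. It is not the book's code: the `camembert-base` checkpoint, the toy label set, and the tiny in-memory example are illustrative assumptions only.

```python
# Generic token-classification fine-tuning sketch with Hugging Face Transformers.
# The checkpoint, label set, and toy corpus below are illustrative assumptions,
# not the book's exact configuration.
from datasets import Dataset
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    DataCollatorForTokenClassification,
    Trainer,
    TrainingArguments,
)

checkpoint = "camembert-base"            # assumed French base model
label_list = ["O", "B-DRUG", "I-DRUG"]   # assumed (toy) entity schema

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForTokenClassification.from_pretrained(
    checkpoint, num_labels=len(label_list)
)

# Tiny in-memory corpus so the sketch runs end to end; the real corpus is the
# healthcare dataset built in the book's earlier chapters.
raw = Dataset.from_dict({
    "tokens": [["Le", "patient", "prend", "du", "paracétamol", "."]],
    "ner_tags": [[0, 0, 0, 0, 1, 0]],
})

def tokenize_and_align(batch):
    # Tokenize pre-split words and align word-level labels to sub-tokens,
    # using -100 for special tokens so they are ignored by the loss.
    enc = tokenizer(batch["tokens"], truncation=True, is_split_into_words=True)
    enc["labels"] = [
        [-100 if w is None else batch["ner_tags"][i][w]
         for w in enc.word_ids(batch_index=i)]
        for i in range(len(batch["tokens"]))
    ]
    return enc

train_ds = raw.map(tokenize_and_align, batched=True, remove_columns=raw.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="healthcare-ner-fr", num_train_epochs=3),
    train_dataset=train_ds,
    data_collator=DataCollatorForTokenClassification(tokenizer),
)
trainer.train()
```

On OCI, this kind of script would typically run inside a Data Science notebook session or job; the book covers that setup and the evaluation step in detail.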

For more details, you can explore the book and Chapter 6 via the publisher link given in the citation below.

### Citation

If you use this model, please cite the following:

```bibtex
@Inbook{Assoudi2024,
  author="Assoudi, Hicham",
  title="Model Fine-Tuning",
  bookTitle="Natural Language Processing on Oracle Cloud Infrastructure: Building Transformer-Based NLP Solutions Using Oracle AI and Hugging Face",
  year="2024",
  publisher="Apress",
  address="Berkeley, CA",
  pages="249--319",
  abstract="This chapter focuses on the process of fine-tuning a pretrained model for healthcare Named Entity Recognition (NER). This chapter provides an in-depth exploration of training the healthcare NER model using OCI's Data Science platform and Hugging Face tools. It covers the fine-tuning process, performance evaluation, and best practices that contribute to creating robust and cost-effective NLP models.",
  isbn="979-8-8688-1073-2",
  doi="10.1007/979-8-8688-1073-2_6",
  url="https://doi.org/10.1007/979-8-8688-1073-2_6"
}
```

## 📞 Connect and Contact

Stay updated on my latest models and projects:
👉 Follow me on Hugging Face

For inquiries or professional communication, feel free to reach out:
📧 Email: [email protected]