Model Details

Model Name: UMA-IA/LLM_Engine_Finetuned_Aero

Authors:

  • Youri LALAIN, engineering student at the French engineering school ECE
  • Lilian RAGE, engineering student at the French engineering school ECE

Base Model: Mistral-7B-v0.1
Fine-tuned Dataset: UMA-IA/UMA_Dataset_Engine_Aero_LLM
License: Apache 2.0

Model Description

Mistral-7B fine-tuned on aerospace engines

LLM_Engine_Finetuned_Aero is a specialized fine-tuned version of Mistral-7B designed to provide accurate and detailed answers to technical questions related to aerospace and aeronautical engines. The model leverages the UMA-IA/UMA_Dataset_Engine_Aero_LLM to enhance its understanding of complex engineering principles, propulsion systems, and aerospace technologies.

Capabilities

  • Technical Q&A on aerospace and aeronautical engines
  • Analysis and explanations of propulsion system components
  • Assistance in understanding aerospace engineering concepts

Use Cases

  • Aerospace research and engineering support
  • Educational purposes for students and professionals
  • Assisting in aerospace-related R&D projects

Training Details

This model was fine-tuned on UMA-IA/UMA_Dataset_Engine_Aero_LLM, a curated dataset covering aerospace engines, propulsion systems, and general aeronautical engineering. Fine-tuning was performed with supervised learning to adapt Mistral-7B to technical question-and-answer discussions in this domain.
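The card does not publish the prompt template used during supervised fine-tuning. Purely as an illustration, a Q&A pair from the dataset might be serialized into a single training string with a hypothetical helper like the one below (the `### Question:` / `### Answer:` markers are an assumption, not the documented template):

```python
def format_example(question: str, answer: str) -> str:
    # Hypothetical instruction template -- the actual template used with
    # UMA-IA/UMA_Dataset_Engine_Aero_LLM is not documented in this card.
    return f"### Question:\n{question}\n\n### Answer:\n{answer}"

sample = format_example(
    "What limits the bypass ratio of a turbofan?",
    "Fan diameter, nacelle drag, and ground clearance are the main constraints.",
)
print(sample)
```

At inference time, prompting the model with the same markers it was trained on generally yields better-formatted answers, which is why knowing (or standardizing) the template matters.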

How to Use

You can load the model using Hugging Face's transformers library:

from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "UMA-IA/LLM_Engine_Finetuned_Aero"

# Load the fine-tuned model and its tokenizer from the Hugging Face Hub
model = AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Ask a technical question about aerospace propulsion
input_text = "Explain the working principle of a turbofan engine."
inputs = tokenizer(input_text, return_tensors="pt")

# Without max_new_tokens, generate() stops after ~20 new tokens by default
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
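The published weights are FP16, so a 7B model fits comfortably on a single 16 GB+ GPU when loaded in half precision. A sketch, assuming a CUDA-capable GPU and that the `accelerate` package is installed (required for `device_map="auto"`); the sampling parameters shown are illustrative defaults, not values recommended by the authors:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "UMA-IA/LLM_Engine_Finetuned_Aero"

tokenizer = AutoTokenizer.from_pretrained(model_name)
# Load weights in half precision and let accelerate place them on the GPU
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,  # checkpoint is stored in FP16
    device_map="auto",          # requires the accelerate package
)

prompt = "Explain the working principle of a turbofan engine."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Sampling parameters are illustrative; tune temperature/top_p to taste
outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```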
Model size: 7.24B parameters · Tensor type: FP16 (Safetensors)
