Model Usage
I would like to use your model and compare it with other AI text detectors as part of a university project. However, I cannot use your proposed solution as it is not part of huggingface.co/models. Has anything changed recently, or do you have any advice for me?
Hi,
I’m not entirely sure what issue you're facing, but my model is publicly available on Hugging Face:
🔗 AICodexLab/answerdotai-ModernBERT-base-ai-detector.
If you just need to load and use it, here's a simple way with transformers:
from transformers import pipeline

# Load the classifier directly from the Hugging Face Hub
model_name = "AICodexLab/answerdotai-ModernBERT-base-ai-detector"
classifier = pipeline("text-classification", model=model_name)

text = "This text was generated by an AI model."
result = classifier(text)
print(result)
If you meant something else, could you clarify? Happy to help!
Thank you for your fast response.
Below you will find my current code (I use the same pattern with other models via transformers):
from transformers import AutoModelForSequenceClassification, AutoTokenizer, pipeline

model_name_ModernBert = "AICodexLab/answerdotai-ModernBERT-base-ai-detector"
model_ModernBert = AutoModelForSequenceClassification.from_pretrained(model_name_ModernBert)
tokenizer_ModernBert = AutoTokenizer.from_pretrained(model_name_ModernBert)
# Pass the loaded model object (not just the name) so it is actually used
detector_ModernBert = pipeline("text-classification", model=model_ModernBert, tokenizer=tokenizer_ModernBert)
In the attached image, you can see my error message as well. It seems to me that the model might not be accessible via transformers at the moment.
Thank you for your help.
Following what I saw in your screenshot, I changed my Python version from 3.10.6 to 3.11.9 and am now using an updated version of transformers.
With this, I was able to resolve the issue.
Thanks a lot!
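For anyone else who runs into this, a quick way to check the environment before debugging further is to print the Python and transformers versions (my assumption is that the original error came from an older transformers release that did not yet include ModernBERT support; check the transformers release notes for the exact minimum version):

```python
import sys
from importlib.metadata import PackageNotFoundError, version

# Print the interpreter version, e.g. "3.11.9"
print("Python:", sys.version.split()[0])

# Print the installed transformers version, if any
try:
    print("transformers:", version("transformers"))
except PackageNotFoundError:
    print("transformers is not installed")
```

If the reported transformers version predates ModernBERT support, upgrading the package (and, if needed, Python itself) as described above should fix the loading error.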
Nice, good luck with your project.