{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Shasha Demo Notebook\n",
"This notebook shows how to programmatically invoke our AI inference pipeline, run sentiment analysis, and generate code examples."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# 1. Setup inference client\n",
"from hf_client import get_inference_client\n",
"# Initialize client for Qwen3-32B (fallback on Groq if unavailable)\n",
"client = get_inference_client('Qwen/Qwen3-32B', provider='auto')\n",
"# Example chat completion request\n",
"resp = client.chat.completions.create(\n",
" model='Qwen/Qwen3-32B',\n",
" messages=[{'role':'user','content':'Write a Python function to reverse a string.'}]\n",
")\n",
"print(resp.choices[0].message.content)"
]
},
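{
"cell_type": "markdown",
"metadata": {},
"source": [
"The next cell is a minimal sketch of streaming the same completion token by token. It assumes the object returned by `get_inference_client` exposes the OpenAI-compatible streaming interface implied by the call above, so `stream=True` and `chunk.choices[0].delta.content` are assumptions about that wrapper rather than documented behavior of this repo."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# 1b. (Sketch) Stream the reply incrementally instead of waiting for the full message.\n",
"# Assumes the client wraps an OpenAI-compatible API that accepts stream=True.\n",
"stream = client.chat.completions.create(\n",
"    model='Qwen/Qwen3-32B',\n",
"    messages=[{'role':'user','content':'Write a Python function to reverse a string.'}],\n",
"    stream=True\n",
")\n",
"for chunk in stream:\n",
"    delta = chunk.choices[0].delta.content\n",
"    if delta:\n",
"        print(delta, end='')"
]
},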
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# 2. Sentiment Analysis with Transformers.js (Python demo)\n",
"from transformers import pipeline\n",
"# Using OpenAI provider for sentiment\n",
"sentiment = pipeline('sentiment-analysis', model='openai/gpt-4', trust_remote_code=True)\n",
"print(sentiment('I love building AI-powered tools!'))"
]
},
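{
"cell_type": "markdown",
"metadata": {},
"source": [
"The `pipeline` object also accepts a list of strings and returns one label/score dict per input. The cell below is a short illustrative sketch; the example sentences are made up."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# 2b. (Sketch) Batch sentiment analysis: one result dict per input string.\n",
"reviews = [\n",
"    'The new release is fantastic!',\n",
"    'The setup instructions were confusing.'\n",
"]\n",
"for review, result in zip(reviews, sentiment(reviews)):\n",
"    print(result['label'], round(result['score'], 3), '-', review)"
]
},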
{
"cell_type": "markdown",
"metadata": {},
"source": [
"---\n",
"## Next steps:\n",
"- Try different models (Gemini Pro, Fireworks AI) by changing the model= parameter.\n",
"- Explore custom plugins via plugins.py to integrate with Slack or GitHub.\n",
"- Use auth.py to load private files from Google Drive."
]
}
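,
{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a sketch of the first item above: swapping models is just a matter of passing a different model id. The id below is illustrative only; whether a given model (including Gemini Pro or Fireworks-served models) is routable depends on this repo's hf_client and provider configuration."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# (Sketch) Same call shape with a different model id; routing depends on hf_client's providers.\n",
"alt_model = 'Qwen/Qwen2.5-7B-Instruct'  # illustrative id, swap for any model your setup supports\n",
"alt_client = get_inference_client(alt_model, provider='auto')\n",
"alt_resp = alt_client.chat.completions.create(\n",
"    model=alt_model,\n",
"    messages=[{'role':'user','content':'Summarize this notebook in one sentence.'}]\n",
")\n",
"print(alt_resp.choices[0].message.content)"
]
}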
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"name": "python",
"version": "3.x"
}
},
"nbformat": 4,
"nbformat_minor": 5
}