https://huggingface.co/spaces/decodingdatascience/newrag-pine
# Telecom Customer Support LLM with Groq API
This project demonstrates how to build a fast, production-grade AI-powered telecom customer support assistant using the **Groq API** and optimized GenAI configurations.
## Project Overview
A step-by-step guide to:
- Sending POST requests using **Postman**
- Connecting to the **Groq API** (see the request sketch after this list)
- Testing default vs. optimized GenAI configurations
- Applying structured **prompt templates**
- Deploying a simple LLM-powered support API
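The same request can be sent from Postman or from a few lines of Python. The sketch below assumes Groq's OpenAI-compatible chat completions endpoint and a `GROQ_API_KEY` environment variable; the model id is a placeholder, so swap in whichever model your Groq account exposes.

```python
import os

import requests

GROQ_URL = "https://api.groq.com/openai/v1/chat/completions"

payload = {
    "model": "llama-3.1-8b-instant",  # placeholder model id; use any model available on Groq
    "messages": [
        {
            "role": "user",
            "content": "My mobile data stopped working after my last top-up. What should I do?",
        }
    ],
}

response = requests.post(
    GROQ_URL,
    headers={"Authorization": f"Bearer {os.environ['GROQ_API_KEY']}"},
    json=payload,
    timeout=30,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

In Postman, the equivalent call is a POST to the same URL with a `Bearer` token in the `Authorization` header and the JSON payload above as the raw body.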
## Tech Stack
| Layer | Tool/Tech |
|---------------|-------------------------|
| LLM | [Groq API](https://groq.com/) |
| API Platform | FastAPI / Postman |
| Prompt Design | Custom templates |
| Deployment | Localhost / Cloud (optional) |
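As a rough illustration of the API layer listed above, here is a minimal FastAPI endpoint that forwards a customer question to Groq. It is a sketch under the same assumptions as before (OpenAI-compatible endpoint, `GROQ_API_KEY` env var, placeholder model id), not the repository's actual app.

```python
import os

import requests
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Telecom Support LLM")

GROQ_URL = "https://api.groq.com/openai/v1/chat/completions"


class SupportQuery(BaseModel):
    question: str


@app.post("/support")
def support(query: SupportQuery) -> dict:
    # Forward the customer's question to Groq and return the assistant's reply.
    resp = requests.post(
        GROQ_URL,
        headers={"Authorization": f"Bearer {os.environ['GROQ_API_KEY']}"},
        json={
            "model": "llama-3.1-8b-instant",  # placeholder model id
            "messages": [{"role": "user", "content": query.question}],
        },
        timeout=30,
    )
    resp.raise_for_status()
    return {"answer": resp.json()["choices"][0]["message"]["content"]}
```

Assuming the file is saved as `main.py`, run it locally with `uvicorn main:app --reload` and exercise it from Postman with a POST to `http://localhost:8000/support`.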
## AI Configuration
| Parameter | Description |
|---------------------|--------------------------------------|
| `temperature` | Controls randomness (default: 0.7) |
| `top_p` | Nucleus sampling threshold |
| `max_tokens` | Maximum number of tokens to generate |
| `frequency_penalty` | Penalizes repeated tokens |
| `presence_penalty` | Encourages covering new topics |
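These parameters are passed in the request body alongside the chat messages. The values below are illustrative tuned settings, not the project's official configuration:

```python
# Illustrative tuned generation settings (not the project's official values).
generation_config = {
    "temperature": 0.3,        # lower randomness for consistent support answers
    "top_p": 0.9,              # nucleus sampling cutoff
    "max_tokens": 512,         # cap the length of the reply
    "frequency_penalty": 0.5,  # discourage repeated phrasing
    "presence_penalty": 0.2,   # gentle push toward new topics
}

payload = {
    "model": "llama-3.1-8b-instant",  # placeholder model id
    "messages": [{"role": "user", "content": "Why is my bill higher this month?"}],
    **generation_config,
}
```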
## Experiment Setup
### 1. No Prompt Template + Default Config
- Basic user input
- Uses Groq defaults
- Serves as the baseline for comparison
### 2. With Prompt Template + Tuned Config
- Structured input (e.g., role, intent, constraints); see the template sketch after this list
- Custom temperature and token limits
- Optimized for domain-specific responses
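A structured template can be as simple as a system message that fixes the role, intent, and constraints. The sketch below is one possible shape; the company name and intent label are hypothetical placeholders, and the repository's actual template may differ.

```python
# One possible structured prompt template; field names and wording are illustrative.
SYSTEM_TEMPLATE = (
    "You are a customer support agent for {company}, a telecom provider.\n"
    "Intent: {intent}\n"
    "Constraints: answer in at most three sentences, stay polite, "
    "and never invent account or billing details."
)


def build_messages(company: str, intent: str, question: str) -> list[dict]:
    # Assemble the chat messages sent to the Groq API.
    return [
        {"role": "system", "content": SYSTEM_TEMPLATE.format(company=company, intent=intent)},
        {"role": "user", "content": question},
    ]


messages = build_messages(
    company="Acme Telecom",      # hypothetical company name
    intent="billing_enquiry",    # hypothetical intent label
    question="Why is my bill higher this month?",
)
```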
## Quickstart
### Step 1: Clone the repo
```bash
git clone https://github.com/your-username/telecom-support-llm.git
cd telecom-support-llm
```