Commit deb090d — "Add all code"
Committed by vinit5112
Parent(s): a8c436a

Files changed:
- README.md +18 -5
- backend/CONVERSATION_HISTORY_SYSTEM.md +249 -0
- backend/Dockerfile +19 -0
- backend/Qdrant.py +88 -0
- backend/SETUP_OFFLINE.md +68 -0
- backend/STREAMING_ANALYSIS.md +178 -0
- backend/app_v2.py +130 -0
- backend/backend_api.py +310 -0
- backend/download_model.py +66 -0
- backend/rag.py +198 -0
- backend/requirements.txt +12 -0
- backend/vector_store.py +297 -0
- docker-compose.yml +22 -0
- frontend/Dockerfile +34 -0
- frontend/nginx.conf +29 -0
- frontend/package-lock.json +0 -0
- frontend/package.json +55 -0
- frontend/postcss.config.js +6 -0
- frontend/public/index.html +22 -0
- frontend/public/manifest.json +8 -0
- frontend/src/App.js +218 -0
- frontend/src/components/ChatInterface.js +408 -0
- frontend/src/components/FileUploader.js +237 -0
- frontend/src/components/MessageBubble.js +138 -0
- frontend/src/components/Sidebar.js +216 -0
- frontend/src/components/TypingIndicator.js +34 -0
- frontend/src/components/WelcomeScreen.js +152 -0
- frontend/src/index.css +279 -0
- frontend/src/index.js +11 -0
- frontend/src/services/api.js +203 -0
- frontend/src/utils/conversationStorage.js +227 -0
- frontend/tailwind.config.js +78 -0
README.md
CHANGED
@@ -1,11 +1,24 @@
 ---
-title: CA
+title: CA Study Assistant
 emoji: 📚
 colorFrom: blue
-colorTo:
+colorTo: purple
 sdk: docker
-
-license: mit
+app_port: 80
 ---

-
+# CA Study Assistant
+
+This is a full-stack AI-powered study assistant for Chartered Accountant students.
+
+- **Frontend**: React
+- **Backend**: FastAPI
+- **Deployment**: Docker on Hugging Face Spaces
+
+## How it Works
+
+The application is containerized using Docker and orchestrated with Docker Compose.
+
+- The **frontend** is a React app served by Nginx.
+- The **backend** is a FastAPI server running with Uvicorn.
+- Nginx acts as a reverse proxy, forwarding API requests from the frontend to the backend.
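
The README above describes Nginx forwarding API requests from the frontend to the FastAPI backend. A quick wiring check once the docker-compose stack is up might look like this (a sketch: the `localhost` host, port 80, and the assumption that Nginx proxies `/api/*` to the backend come from the README, not from the nginx.conf in this commit):

```python
import requests

# Assumes the stack is running locally and Nginx listens on port 80 (app_port: 80).
resp = requests.get("http://localhost/api/status", timeout=10)
resp.raise_for_status()
print(resp.json())  # expected shape: {"status": ..., "message": ..., "version": "2.0.0"}
```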
backend/CONVERSATION_HISTORY_SYSTEM.md
ADDED
@@ -0,0 +1,249 @@
# Conversation History Management System

## Overview
The conversation history system has been upgraded from a basic memory-only implementation to a comprehensive, persistent storage solution using localStorage with advanced features.

## 🔄 **Previous Implementation (Memory Only)**
```javascript
// ❌ OLD - Lost on page refresh
const [conversations, setConversations] = useState([]);
```

## ✅ **New Implementation (Persistent Storage)**

### 1. **Core Storage Utility** (`utils/conversationStorage.js`)
A comprehensive utility class that handles all conversation persistence:

```javascript
import ConversationStorage from './utils/conversationStorage';

// Load conversations from localStorage
const conversations = ConversationStorage.loadConversations();

// Save conversations to localStorage
ConversationStorage.saveConversations(conversations);
```

### 2. **Enhanced Conversation Structure**
```javascript
{
  id: "timestamp_based_id",
  title: "Conversation Title",
  messages: [
    {
      id: "message_id",
      role: "user" | "assistant",
      content: "message content",
      timestamp: Date
    }
  ],
  createdAt: Date,
  updatedAt: Date // ✅ NEW - Track when conversation was last modified
}
```

### 3. **Automatic Persistence**
- **Load on App Start**: Conversations are automatically loaded from localStorage
- **Save on Changes**: All conversation updates are automatically saved
- **No Manual Intervention**: Everything happens transparently

## 🚀 **Key Features**

### ✅ **Persistent Storage**
- Conversations survive page refreshes
- Conversations persist across browser sessions
- Automatic loading on app startup

### ✅ **Conversation Management**
- **Create**: New conversations are automatically saved
- **Update**: Message additions and title changes are saved
- **Delete**: Conversations can be permanently removed
- **Search**: Full-text search across all conversations

### ✅ **Storage Optimization**
- **Quota Management**: Handles localStorage size limits
- **Conversation Limits**: Maximum 50 conversations (configurable)
- **Automatic Cleanup**: Reduces storage when quota exceeded

### ✅ **Import/Export**
- **Export**: Download all conversations as JSON
- **Import**: Upload and merge conversation files
- **Backup**: Easy backup and restore functionality

### ✅ **Statistics & Monitoring**
- **Storage Usage**: Track localStorage consumption
- **Conversation Count**: Monitor total conversations
- **Message Count**: Track total messages across all conversations

## 🛠 **Implementation Details**

### App.js Integration
```javascript
// Load conversations on app start
useEffect(() => {
  const savedConversations = ConversationStorage.loadConversations();
  if (savedConversations.length > 0) {
    setConversations(savedConversations);
    setChatStarted(true);
    setActiveConversationId(savedConversations[0].id);
  }
}, []);

// Enhanced conversation management
const updateConversations = (updatedConversations) => {
  setConversations(updatedConversations);
  ConversationStorage.saveConversations(updatedConversations);
};
```

### ChatInterface.js Integration
```javascript
// Conversations are automatically saved when updated
setConversations(prev => prev.map(conv =>
  conv.id === conversationId
    ? { ...conv, messages: [...conv.messages, newMessage] }
    : conv
));
```

### Sidebar.js Integration
```javascript
// Delete conversations with confirmation
const handleDelete = (conversationId) => {
  if (window.confirm('Are you sure you want to delete this conversation?')) {
    onDeleteConversation(conversationId);
  }
};
```

## 📊 **Storage Management**

### Local Storage Structure
```
Key: "ca_study_conversations"
Value: JSON array of conversation objects
```

### Storage Limits
- **Maximum Conversations**: 50 (prevents localStorage overflow)
- **Auto-Reduction**: Reduces to 25 conversations if quota exceeded
- **Size Monitoring**: Tracks storage usage in KB

### Error Handling
- **JSON Parse Errors**: Gracefully handles corrupted data
- **Storage Quota**: Automatic handling of localStorage limits
- **Network Issues**: Offline-first design

## 🔧 **Advanced Features**

### 1. **Search Functionality**
```javascript
// Search conversations by title or content
const results = ConversationStorage.searchConversations("accounting");
```

### 2. **Export Conversations**
```javascript
// Download all conversations as JSON file
ConversationStorage.exportConversations();
```

### 3. **Import Conversations**
```javascript
// Import conversations from file
const result = await ConversationStorage.importConversations(file);
console.log(`Imported ${result.count} conversations`);
```

### 4. **Storage Statistics**
```javascript
// Get detailed storage information
const stats = ConversationStorage.getStatistics();
// Returns: { totalConversations, totalMessages, storageSize, ... }
```

## 🔐 **Data Security & Privacy**

### Client-Side Storage
- **No Server Storage**: All data stays in user's browser
- **Privacy First**: No conversation data sent to servers
- **User Control**: Users can export/delete their own data

### Data Format
- **JSON Structure**: Human-readable format
- **Portable**: Easy to migrate between devices
- **Versionable**: Future-proof with version tracking

## 🎯 **User Experience Improvements**

### Before (Memory Only)
❌ Lost conversations on page refresh
❌ No conversation history
❌ No persistent sessions
❌ No conversation management

### After (Persistent Storage)
✅ Conversations survive page refreshes
✅ Full conversation history
✅ Persistent user sessions
✅ Advanced conversation management
✅ Search and filter capabilities
✅ Export/import functionality
✅ Storage monitoring and optimization

## 🚀 **Future Enhancements**

### Planned Features
1. **Cloud Sync**: Optional cloud storage integration
2. **User Authentication**: Multi-device synchronization
3. **Advanced Search**: Semantic search within conversations
4. **Tags/Categories**: Organize conversations by topics
5. **Shared Conversations**: Share conversations with others
6. **Analytics**: Conversation usage analytics

### Backend Integration (Optional)
```javascript
// Future: Optional backend storage
const backendStorage = new BackendConversationStorage();
await backendStorage.syncConversations(localConversations);
```

## 📋 **Migration Guide**

### For Existing Users
1. **Automatic Migration**: Existing conversations will be migrated to new format
2. **No Data Loss**: All existing conversations preserved
3. **Enhanced Features**: Immediate access to new capabilities

### For New Users
1. **Automatic Setup**: No configuration required
2. **Immediate Persistence**: Conversations saved from first use
3. **Full Feature Access**: All features available immediately

## 🔧 **Troubleshooting**

### Common Issues
1. **Storage Quota Exceeded**: Automatically handled with conversation reduction
2. **Corrupted Data**: Graceful fallback to empty conversation list
3. **Import Errors**: Validation and error reporting for file imports

### Debug Information
```javascript
// Check storage status
const stats = ConversationStorage.getStatistics();
console.log('Storage Stats:', stats);

// Clear all conversations (emergency)
ConversationStorage.clearAllConversations();
```

## ✅ **Conclusion**

The conversation history system has been completely upgraded to provide:
- **Persistent Storage**: No more lost conversations
- **Advanced Management**: Full CRUD operations
- **User Control**: Export/import capabilities
- **Performance**: Optimized for large conversation histories
- **Reliability**: Robust error handling and data protection

This system provides a professional-grade conversation management experience while maintaining simplicity and user privacy.
backend/Dockerfile
ADDED
@@ -0,0 +1,19 @@
FROM python:3.9-slim

# Set the working directory in the container
WORKDIR /app

# Copy the requirements file and install dependencies
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the rest of the backend application code
COPY . .

# Expose the port the app runs on
EXPOSE 8000

# Command to run the application
# We use app_v2:app because your main FastAPI instance is in the app_v2.py file.
# --host 0.0.0.0 makes the app accessible from outside the container.
CMD ["uvicorn", "app_v2:app", "--host", "0.0.0.0", "--port", "8000"]
backend/Qdrant.py
ADDED
@@ -0,0 +1,88 @@
import os
from qdrant_client import QdrantClient, models
from qdrant_client.models import PayloadSchemaType
import logging
from dotenv import load_dotenv

# Configure logging
logger = logging.getLogger(__name__)

load_dotenv()

# Configuration
# QDRANT_URL = "https://cc102304-2c06-4d51-9dee-d436f4413549.us-west-1-0.aws.cloud.qdrant.io"
# QDRANT_API_KEY = "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJhY2Nlc3MiOiJtIn0.cHs27o6erIf1BQHCdTxE4L4qZg4vCdrp51oNNNghjWM"
QDRANT_API_KEY = os.getenv("QDRANT_API_KEY")
QDRANT_URL = os.getenv("QDRANT_URL")

class QdrantManager:
    def __init__(self):
        self.qdrant_client = QdrantClient(
            url=QDRANT_URL,
            api_key=QDRANT_API_KEY,
        )
        print("Connected to Qdrant")

    def get_or_create_company_collection(self, collection_name: str) -> str:
        """
        Get or create a collection for a company.

        Args:
            collection_name: Name of the collection

        Returns:
            str: Collection name

        Raises:
            ValueError: If collection creation fails
        """
        try:
            print(f"Creating new collection: {collection_name}")

            # Vector size for text-embedding-3-small is 1536
            vector_size = 1536

            # Create collection with vector configuration
            self.qdrant_client.create_collection(
                collection_name=collection_name,
                vectors_config=models.VectorParams(
                    size=vector_size,
                    distance=models.Distance.COSINE
                ),
                hnsw_config=models.HnswConfigDiff(
                    payload_m=16,
                    m=0,
                ),
            )

            # Create payload indices
            payload_indices = {
                "document_id": PayloadSchemaType.KEYWORD,
                "content": PayloadSchemaType.TEXT
            }

            for field_name, schema_type in payload_indices.items():
                self.qdrant_client.create_payload_index(
                    collection_name=collection_name,
                    field_name=field_name,
                    field_schema=schema_type
                )

            print(f"Successfully created collection: {collection_name}")
            return collection_name

        except Exception as e:
            error_msg = f"Failed to create collection {collection_name}: {str(e)}"
            logger.error(error_msg, exc_info=True)
            raise ValueError(error_msg) from e

# Example usage
if __name__ == "__main__":
    try:
        qdrant_manager = QdrantManager()
        collection_name = "ca-documents"
        result = qdrant_manager.get_or_create_company_collection(collection_name)
        print(f"Collection name: {result}")
    except Exception as e:
        print(f"Error: {e}")
backend/SETUP_OFFLINE.md
ADDED
@@ -0,0 +1,68 @@
# Offline Mode Setup Guide

## Problem
The application fails to start with network connectivity errors when trying to download the sentence transformer model from Hugging Face.

## Error Message
```
Failed to resolve 'huggingface.co' ([Errno 11001] getaddrinfo failed)
```

## Solutions

### Option 1: Download Model When You Have Internet Access
1. When you have internet access, run the download script:
```bash
cd backend
python download_model.py
```

2. This will download and cache the model locally for offline use.

### Option 2: Manual Download
If you have internet access on another machine:

1. On a machine with internet access, run:
```python
from sentence_transformers import SentenceTransformer
model = SentenceTransformer('all-MiniLM-L6-v2')
```

2. Copy the cached model from:
   - Windows: `C:\Users\{username}\.cache\huggingface\transformers\`
   - Linux/Mac: `~/.cache/huggingface/transformers/`

3. Place it in the same location on your offline machine.

### Option 3: Force Offline Mode
If you believe the model is already cached, you can force offline mode by setting environment variables:

```bash
set TRANSFORMERS_OFFLINE=1
set HF_HUB_OFFLINE=1
python backend_api.py
```

### Option 4: Network Troubleshooting
If you should have internet access:

1. Check your internet connection
2. If behind a corporate firewall, ensure `huggingface.co` is accessible
3. Try accessing `https://huggingface.co` in your browser
4. Contact your IT department if needed

## Verification
After setting up offline mode, you can verify the model is working by running:
```bash
python download_model.py
```

This will check if the model is cached and available for offline use.

## Technical Details
The sentence transformer model "all-MiniLM-L6-v2" is approximately 80MB and is used for generating embeddings from text for the vector search functionality.

The application has been modified to:
1. Try loading the model normally first
2. Fall back to offline mode if network fails
3. Provide clear error messages with solutions
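
The try-online-then-fall-back behaviour described in the Technical Details section is not shown in this commit view; a minimal sketch of that pattern, with an illustrative helper name `load_embedding_model`, might look like:

```python
import os
from sentence_transformers import SentenceTransformer

def load_embedding_model(name: str = "all-MiniLM-L6-v2") -> SentenceTransformer:
    """Try a normal (online) load first; fall back to offline mode so a
    previously cached copy of the model can still be used."""
    try:
        return SentenceTransformer(name)
    except Exception as online_error:
        # Network failed (e.g. huggingface.co unreachable) - force offline mode.
        os.environ["TRANSFORMERS_OFFLINE"] = "1"
        os.environ["HF_HUB_OFFLINE"] = "1"
        try:
            return SentenceTransformer(name)
        except Exception:
            raise RuntimeError(
                "Model is neither downloadable nor cached. "
                "Run 'python download_model.py' while online (see above)."
            ) from online_error
```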
backend/STREAMING_ANALYSIS.md
ADDED
@@ -0,0 +1,178 @@
# Streaming Implementation Analysis

## Overview
This document analyzes the streaming implementation across the backend and frontend components of the CA Study Assistant application.

## ✅ Backend Implementation Analysis

### 1. RAG Streaming Function (`rag.py`)
- **Status**: ✅ **GOOD** - Recently updated with latest API
- **Implementation**:
```python
for chunk in self.client.models.generate_content_stream(
    model='gemini-2.5-flash',
    contents=prompt
):
    yield chunk.text
```
- **✅ Improvements Made**:
  - Updated to use `generate_content_stream` instead of deprecated method
  - Uses `gemini-2.5-flash` model (latest)
  - Proper error handling with try-catch

### 2. FastAPI Streaming Endpoint (`backend_api.py`)
- **Status**: ✅ **IMPROVED** - Enhanced with better error handling
- **Implementation**:
```python
@app.post("/api/ask_stream")
async def ask_question_stream(request: QuestionRequest):
    async def event_generator():
        for chunk in rag_system.ask_question_stream(request.question):
            if chunk:  # Only yield non-empty chunks
                yield chunk
    return StreamingResponse(event_generator(), media_type="text/plain")
```
- **✅ Improvements Made**:
  - Added null/empty chunk filtering
  - Enhanced error handling in generator
  - Proper async generator implementation

## ✅ Frontend Implementation Analysis

### 1. API Service (`services/api.js`)
- **Status**: ✅ **IMPROVED** - Enhanced with better error handling
- **Implementation**:
```javascript
export const sendMessageStream = async (message, onChunk) => {
  const response = await fetch(`${API_BASE_URL}/ask_stream`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ question: message }),
  });

  const reader = response.body.getReader();
  const decoder = new TextDecoder();

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    const chunk = decoder.decode(value, { stream: true });
    if (chunk) onChunk(chunk);
  }
};
```
- **✅ Improvements Made**:
  - Added HTTP status code checking
  - Added reader.releaseLock() for proper cleanup
  - Enhanced error handling
  - Added null chunk filtering

### 2. Chat Interface (`components/ChatInterface.js`)
- **Status**: ✅ **GOOD** - Proper real-time UI updates
- **Implementation**:
```javascript
await sendMessageStream(message.trim(), (chunk) => {
  fullResponse += chunk;
  setConversations(prev => prev.map(conv =>
    conv.id === conversationId ? {
      ...conv,
      messages: conv.messages.map(msg =>
        msg.id === assistantMessageId
          ? { ...msg, content: fullResponse }
          : msg
      ),
    } : conv
  ));
});
```
- **✅ Features**:
  - Real-time message updates
  - Proper loading states
  - Error handling with toast notifications
  - Typing indicators during streaming

## 🔧 Additional Improvements Made

### 1. Error Handling Enhancement
- **Backend**: Added comprehensive error handling in streaming generator
- **Frontend**: Added HTTP status checking and proper resource cleanup
- **Both**: Added null/empty chunk filtering

### 2. Testing Infrastructure
- **Created**: `test_streaming.py` - Comprehensive test suite for streaming
- **Features**:
  - API connection testing
  - Streaming functionality testing
  - Error handling verification
  - Performance metrics

### 3. Documentation
- **Created**: `STREAMING_ANALYSIS.md` - This comprehensive analysis
- **Updated**: Inline code comments for better maintainability

## 🚀 How to Test the Implementation

### 1. Test API Connection
```bash
cd backend
python test_streaming.py
```

### 2. Test Full Application
```bash
# Terminal 1 - Backend
cd backend
python backend_api.py

# Terminal 2 - Frontend
cd frontend
npm start
```

### 3. Test Streaming Manually
1. Open the application in browser
2. Ask a question
3. Observe real-time streaming response
4. Check browser dev tools for any errors

## 📊 Performance Characteristics

### Backend
- **Latency**: Low - streams immediately as chunks arrive from Gemini
- **Memory**: Efficient - no buffering, direct streaming
- **Error Recovery**: Graceful - continues streaming even if some chunks fail

### Frontend
- **UI Responsiveness**: Excellent - real-time updates without blocking
- **Memory Usage**: Low - processes chunks as they arrive
- **Error Handling**: Comprehensive - proper cleanup and user feedback

## 🎯 API Compatibility

### Google Generative AI API
- **✅ Model**: `gemini-2.5-flash` (latest)
- **✅ Method**: `generate_content_stream` (current)
- **✅ Parameters**: `model` and `contents` (correct format)

### FastAPI Streaming
- **✅ Response Type**: `StreamingResponse` (correct)
- **✅ Media Type**: `text/plain` (compatible with frontend)
- **✅ Async Generator**: Proper async/await implementation

### Frontend Fetch API
- **✅ ReadableStream**: Proper stream handling
- **✅ TextDecoder**: Correct UTF-8 decoding
- **✅ Resource Management**: Proper cleanup

## ✅ Conclusion

The streaming implementation is **WORKING CORRECTLY** and has been enhanced with:

1. **Latest API compatibility** - Uses gemini-2.5-flash with correct method
2. **Robust error handling** - Comprehensive error management
3. **Performance optimizations** - Efficient streaming without buffering
4. **Proper resource management** - No memory leaks or resource issues
5. **Real-time UI updates** - Smooth user experience
6. **Comprehensive testing** - Test suite for validation

The implementation follows best practices and should provide a smooth, responsive chat experience with real-time streaming responses.
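
`test_streaming.py` is referenced above but is not included in this commit. A minimal client that exercises the `/api/ask_stream` endpoint could look roughly like this (the base URL and sample question are placeholders):

```python
import requests

BASE_URL = "http://localhost:8000"  # assumes the backend is running locally

def stream_question(question: str) -> str:
    """POST a question to /api/ask_stream and print chunks as they arrive."""
    parts = []
    with requests.post(
        f"{BASE_URL}/api/ask_stream",
        json={"question": question},
        stream=True,
        timeout=120,
    ) as response:
        response.raise_for_status()
        for chunk in response.iter_content(chunk_size=None, decode_unicode=True):
            if chunk:
                print(chunk, end="", flush=True)
                parts.append(chunk)
    return "".join(parts)

if __name__ == "__main__":
    stream_question("What is depreciation?")
```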
backend/app_v2.py
ADDED
@@ -0,0 +1,130 @@
import gradio as gr
from rag import RAG  # Assuming your rag.py is in the same directory
import os
import time

# --- Configuration ---
GOOGLE_API_KEY = os.getenv("GOOGLE_API_KEY")
if not GOOGLE_API_KEY:
    raise ValueError("GOOGLE_API_KEY environment variable not set. Please set it before running the app.")

COLLECTION_NAME = os.getenv("COLLECTION_NAME", "ca_documents")

# --- Initialize RAG System ---
try:
    rag = RAG(GOOGLE_API_KEY, COLLECTION_NAME)
except Exception as e:
    print(f"Fatal Error initializing RAG system: {e}")
    raise

# --- Initial Chat State ---
welcome_message = {
    "role": "assistant",
    "content": """
👋 **Welcome to your CA Study Assistant!**
"""
}
initial_chat_history = [welcome_message]

# --- Core Functions ---
def upload_file(file):
    if file is None:
        return """<div class="message-bubble error">
        <p><strong>⚠️ Please select a file.</strong></p>
        <p>Supported formats: PDF, DOCX, TXT.</p>
        </div>"""
    try:
        start_time = time.time()
        file_path = file.name
        success = rag.upload_document(file_path)
        duration = time.time() - start_time
        if success:
            return f"""<div class=\"message-bubble success\">
            <p><strong>✅ Success!</strong></p>
            <p><em>{os.path.basename(file_path)}</em> uploaded.</p>
            <p><small>Processed in {duration:.2f}s</small></p>
            </div>"""
        else:
            return f"""<div class=\"message-bubble error\"><p><strong>❌ Upload Failed.</strong></p><p>Please ensure <em>{os.path.basename(file_path)}</em> is a valid file.</p></div>"""
    except Exception as e:
        return f"""<div class=\"message-bubble error\"><p><strong>❌ An Error Occurred</strong></p><p><small>{str(e)}</small></p></div>"""

def chat_with_docs(message: str, history: list):
    if not message or not message.strip():
        return "", history
    history.append({"role": "user", "content": message})
    answer = rag.ask_question(message)
    history.append({"role": "assistant", "content": answer})
    return "", history

def clear_chat():
    return initial_chat_history, ""

# --- Load Custom CSS ---
with open("style.css") as f:
    css = f.read()

# --- Gradio App Layout ---
with gr.Blocks(css=css, title="CA Study Assistant", theme=gr.themes.Base(), mode="auto") as app:
    with gr.Column(elem_id="app-container"):

        gr.HTML("""
        <div id="header">
            <h1>CA Study Assistant</h1>
            <p>The smartest way to study your materials.</p>
        </div>
        """)

        with gr.Tabs(elem_classes="tab-nav"):
            with gr.TabItem("💬 Ask Questions", id="chat_tab"):
                chatbot = gr.Chatbot(
                    value=initial_chat_history,
                    elem_id="chat-history",
                    show_label=False,
                    type="messages",
                    bubble_full_width=False,
                )
                with gr.Row(elem_id="chat-input-container"):
                    with gr.Column(scale=10):
                        chat_input = gr.Textbox(
                            placeholder="Ask anything about your documents...",
                            show_label=False, elem_id="chat-input", lines=1, max_lines=5,
                        )
                    with gr.Column(scale=1, min_width=60):
                        send_btn = gr.Button('➤', elem_id="send-btn")
                clear_btn = gr.Button("🗑️ Clear", elem_id="clear-btn")
                clear_btn.click(fn=clear_chat, outputs=[chatbot, chat_input])

            with gr.TabItem("📚 Upload Documents", id="upload_tab"):
                with gr.Column(elem_id="upload-area"):
                    gr.HTML("""
                    <div id="upload-icon">
                        <svg xmlns="http://www.w3.org/2000/svg" width="80" height="80" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round"><path d="M21 15v4a2 2 0 0 1-2 2H5a2 2 0 0 1-2-2v-4"></path><polyline points="17 8 12 3 7 8"></polyline><line x1="12" y1="3" x2="12" y2="15"></line></svg>
                    </div>
                    <h3>Upload Your Study Materials</h3>
                    <p>Drag & drop or click below to select a file.</p>
                    <div style="margin-top: 0.5rem;">
                        <span style="background:#c7d2fe;color:#5046e5;padding:0.3rem 0.75rem;border-radius:999px;margin-right:0.5rem;">PDF</span>
                        <span style="background:#c7d2fe;color:#5046e5;padding:0.3rem 0.75rem;border-radius:999px;margin-right:0.5rem;">DOCX</span>
                        <span style="background:#c7d2fe;color:#5046e5;padding:0.3rem 0.75rem;border-radius:999px;">TXT</span>
                    </div>
                    """)
                    upload_input = gr.File(
                        label="", file_count="single", file_types=[".pdf", ".docx", ".txt"],
                        type="filepath",
                    )
                    upload_output = gr.HTML(elem_id="upload-output")

                    upload_input.upload(
                        fn=upload_file, inputs=[upload_input], outputs=[upload_output],
                        api_name="upload_document"
                    )
        send_btn.click(
            fn=chat_with_docs, inputs=[chat_input, chatbot], outputs=[chat_input, chatbot],
        )
        chat_input.submit(
            fn=chat_with_docs, inputs=[chat_input, chatbot], outputs=[chat_input, chatbot],
        )

if __name__ == "__main__":
    app.launch(share=True, debug=True)
backend/backend_api.py
ADDED
@@ -0,0 +1,310 @@
from fastapi import FastAPI, File, UploadFile, HTTPException, Request
from fastapi.middleware.cors import CORSMiddleware
from fastapi.staticfiles import StaticFiles
from fastapi.responses import FileResponse, JSONResponse, StreamingResponse
from pydantic import BaseModel
import os
import tempfile
import uvicorn
from typing import List, Optional
import logging
from contextlib import asynccontextmanager

# Import your existing RAG system
from rag import RAG
from vector_store import VectorStore

# Configure logging
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

# Pydantic models
class QuestionRequest(BaseModel):
    question: str

class QuestionResponse(BaseModel):
    answer: str
    sources: Optional[List[str]] = []

class SearchRequest(BaseModel):
    query: str
    limit: Optional[int] = 5

class StatusResponse(BaseModel):
    status: str
    message: str
    version: str

# Global variables
rag_system = None

@asynccontextmanager
async def lifespan(app: FastAPI):
    # Startup
    global rag_system
    try:
        # Initialize RAG system
        google_api_key = os.getenv("GOOGLE_API_KEY")
        if not google_api_key:
            raise ValueError("GOOGLE_API_KEY environment variable not set")

        collection_name = os.getenv("COLLECTION_NAME", "ca_documents")
        rag_system = RAG(google_api_key, collection_name)
        logger.info("RAG system initialized successfully")

    except Exception as e:
        logger.error(f"Failed to initialize RAG system: {e}")
        raise

    yield

    # Shutdown
    logger.info("Shutting down...")

# Create FastAPI app
app = FastAPI(
    title="CA Study Assistant API",
    description="Backend API for the CA Study Assistant RAG system",
    version="2.0.0",
    lifespan=lifespan
)

# CORS middleware
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)

# Health check endpoint
@app.get("/health")
async def health_check():
    return {"status": "healthy", "message": "CA Study Assistant API is running"}

# API Routes
# @app.post("/api/ask", response_model=QuestionResponse)
# async def ask_question(request: QuestionRequest):
#     """
#     Ask a question to the RAG system
#     """
#     try:
#         if not rag_system:
#             raise HTTPException(status_code=500, detail="RAG system not initialized")

#         logger.info(f"Processing question: {request.question[:100]}...")
#         answer = rag_system.ask_question(request.question)

#         # Extract sources from the answer if they exist
#         sources = []
#         if "Sources:" in answer:
#             parts = answer.split("Sources:")
#             if len(parts) > 1:
#                 answer = parts[0].strip()
#                 sources_text = parts[1].strip()
#                 sources = [s.strip() for s in sources_text.split(",") if s.strip()]

#         return QuestionResponse(answer=answer, sources=sources)

#     except Exception as e:
#         logger.error(f"Error processing question: {e}")
#         raise HTTPException(status_code=500, detail=f"Error processing question: {str(e)}")

@app.post("/api/ask_stream")
async def ask_question_stream(request: QuestionRequest):
    """
    Ask a question to the RAG system and get a streaming response
    """
    try:
        if not rag_system:
            raise HTTPException(status_code=500, detail="RAG system not initialized")

        logger.info(f"Processing streaming question: {request.question[:100]}...")

        async def event_generator():
            try:
                for chunk in rag_system.ask_question_stream(request.question):
                    if chunk:  # Only yield non-empty chunks
                        yield chunk
            except Exception as e:
                logger.error(f"Error during stream generation: {e}")
                # This part may not be sent if the connection is already closed.
                yield f"Error generating answer: {str(e)}"

        return StreamingResponse(event_generator(), media_type="text/plain")

    except Exception as e:
        logger.error(f"Error processing streaming question: {e}")
        raise HTTPException(status_code=500, detail=f"Error processing streaming question: {str(e)}")

@app.post("/api/upload")
async def upload_document(file: UploadFile = File(...)):
    """
    Upload a document to the RAG system
    """
    try:
        if not rag_system:
            raise HTTPException(status_code=500, detail="RAG system not initialized")

        # Validate file type
        allowed_extensions = ['.pdf', '.docx', '.txt']
        file_extension = os.path.splitext(file.filename)[1].lower()

        if file_extension not in allowed_extensions:
            raise HTTPException(
                status_code=400,
                detail=f"Unsupported file type. Allowed types: {', '.join(allowed_extensions)}"
            )

        # Create temporary file
        with tempfile.NamedTemporaryFile(delete=False, suffix=file_extension) as temp_file:
            content = await file.read()
            temp_file.write(content)
            temp_file_path = temp_file.name

        try:
            # Process the uploaded file
            logger.info(f"Processing uploaded file: {file.filename}")
            success = rag_system.upload_document(temp_file_path)

            if success:
                return {
                    "status": "success",
                    "message": f"File '{file.filename}' uploaded and processed successfully",
                    "filename": file.filename,
                    "size": len(content)
                }
            else:
                raise HTTPException(status_code=500, detail="Failed to process uploaded file")

        finally:
            # Clean up temporary file
            if os.path.exists(temp_file_path):
                os.unlink(temp_file_path)

    except HTTPException:
        raise
    except Exception as e:
        logger.error(f"Error uploading document: {e}")
        raise HTTPException(status_code=500, detail=f"Error uploading document: {str(e)}")

@app.post("/api/search")
async def search_documents(request: SearchRequest):
    """
    Search for similar documents
    """
    try:
        if not rag_system:
            raise HTTPException(status_code=500, detail="RAG system not initialized")

        results = rag_system.vector_store.search_similar(request.query, limit=request.limit)

        return {
            "status": "success",
            "results": results,
            "count": len(results)
        }

    except Exception as e:
        logger.error(f"Error searching documents: {e}")
        raise HTTPException(status_code=500, detail=f"Error searching documents: {str(e)}")

@app.get("/api/status", response_model=StatusResponse)
async def get_status():
    """
    Get system status
    """
    try:
        status = "healthy" if rag_system else "unhealthy"
        message = "RAG system is operational" if rag_system else "RAG system not initialized"

        return StatusResponse(
            status=status,
            message=message,
            version="2.0.0"
        )

    except Exception as e:
        logger.error(f"Error getting status: {e}")
        raise HTTPException(status_code=500, detail=f"Error getting status: {str(e)}")

@app.get("/api/collection/info")
async def get_collection_info():
    """
    Get information about the vector collection
    """
    try:
        if not rag_system:
            raise HTTPException(status_code=500, detail="RAG system not initialized")

        info = rag_system.vector_store.get_collection_info()
        return {
            "status": "success",
            "collection_info": info
        }

    except Exception as e:
        logger.error(f"Error getting collection info: {e}")
        raise HTTPException(status_code=500, detail=f"Error getting collection info: {str(e)}")

frontend_build_path = "../frontend/build"
if os.path.exists(frontend_build_path):
    app.mount("/static", StaticFiles(directory=f"{frontend_build_path}/static"), name="static")

@app.get("/{full_path:path}")
async def serve_react_app(request: Request, full_path: str):
    """
    Serve React app for all non-API routes
    """
    # If it's an API route, let FastAPI handle it
    if full_path.startswith("api/"):
        raise HTTPException(status_code=404, detail="API endpoint not found")

    # For static files (images, etc.)
    if "." in full_path:
        file_path = f"{frontend_build_path}/{full_path}"
        if os.path.exists(file_path):
            return FileResponse(file_path)
        else:
            raise HTTPException(status_code=404, detail="File not found")

    # For all other routes, serve index.html (React Router will handle it)
    return FileResponse(f"{frontend_build_path}/index.html")

# Error handlers
@app.exception_handler(404)
async def not_found_handler(request: Request, exc: HTTPException):
    if request.url.path.startswith("/api/"):
        return JSONResponse(
            status_code=404,
            content={"detail": "API endpoint not found"}
        )

    # For non-API routes, serve React app
    if os.path.exists(f"{frontend_build_path}/index.html"):
        return FileResponse(f"{frontend_build_path}/index.html")
    else:
        return JSONResponse(
            status_code=404,
            content={"detail": "React app not built. Run 'npm run build' in the frontend directory."}
        )

@app.exception_handler(500)
async def internal_error_handler(request: Request, exc: Exception):
    logger.error(f"Internal server error: {exc}")
    return JSONResponse(
        status_code=500,
        content={"detail": "Internal server error"}
    )

if __name__ == "__main__":
    # Get port from environment or default to 8000
    port = int(os.getenv("PORT", 8000))
    uvicorn.run(
        "backend_api:app",
        host="0.0.0.0",
        port=port,
        reload=True,
        log_level="info"
    )
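
As a usage sketch (not part of the commit), a document can be sent to the `/api/upload` endpoint from a script like this — the file name and base URL are placeholders:

```python
import requests

BASE_URL = "http://localhost:8000"  # assumes the backend is running locally

# The endpoint accepts .pdf, .docx and .txt files.
with open("sample_notes.pdf", "rb") as f:
    response = requests.post(
        f"{BASE_URL}/api/upload",
        files={"file": ("sample_notes.pdf", f, "application/pdf")},
        timeout=300,
    )
response.raise_for_status()
print(response.json())  # {"status": "success", "message": ..., "filename": ..., "size": ...}
```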
backend/download_model.py
ADDED
@@ -0,0 +1,66 @@
#!/usr/bin/env python3
"""
Download the sentence transformer model for offline use.
Run this script when you have internet access to cache the model locally.
"""

import os
import sys
from sentence_transformers import SentenceTransformer

def download_model():
    """Download and cache the sentence transformer model."""
    try:
        print("Downloading sentence transformer model 'all-MiniLM-L6-v2'...")
        print("This may take a few minutes on first run...")

        # This will download and cache the model
        model = SentenceTransformer("all-MiniLM-L6-v2")

        # Test that it works
        test_text = "This is a test sentence."
        embedding = model.encode([test_text])

        print(f"✓ Model downloaded successfully!")
        print(f"✓ Model tested successfully!")
        print(f"✓ Embedding dimension: {len(embedding[0])}")
        print(f"✓ Model cache location: {model.cache_folder}")

        return True

    except Exception as e:
        print(f"✗ Failed to download model: {e}")
        return False

def check_model_exists():
    """Check if the model is already cached."""
    try:
        # Try to load from cache
        import os
        os.environ['TRANSFORMERS_OFFLINE'] = '1'
        os.environ['HF_HUB_OFFLINE'] = '1'

        model = SentenceTransformer("all-MiniLM-L6-v2")
        print("✓ Model is already cached and available for offline use!")
        return True

    except Exception:
        print("✗ Model is not cached or not available for offline use")
        return False

if __name__ == "__main__":
    print("Sentence Transformer Model Downloader")
    print("=" * 40)

    # Check if model already exists
    if check_model_exists():
        print("\nModel is already available. No download needed.")
        sys.exit(0)

    # Download the model
    print("\nDownloading model...")
    if download_model():
        print("\n✓ Setup complete! You can now run the application offline.")
    else:
        print("\n✗ Download failed. Please check your internet connection.")
        sys.exit(1)
backend/rag.py
ADDED
@@ -0,0 +1,198 @@
1 |
+
from google import genai
|
2 |
+
from vector_store import VectorStore
|
3 |
+
import PyPDF2
|
4 |
+
from docx import Document
|
5 |
+
from typing import List
|
6 |
+
import os
|
7 |
+
from langchain_text_splitters import RecursiveCharacterTextSplitter
|
8 |
+
|
9 |
+
class RAG:
|
10 |
+
def __init__(self, google_api_key: str, collection_name: str = "ca_documents"):
|
11 |
+
# Setup Gemini
|
12 |
+
# The client gets the API key from the environment variable `GOOGLE_API_KEY`
|
13 |
+
# or from the `api_key` argument.
|
14 |
+
self.client = genai.Client(api_key=google_api_key)
|
15 |
+
|
16 |
+
# Setup Vector Store (Qdrant configuration is handled via environment variables)
|
17 |
+
self.vector_store = VectorStore(collection_name)
|
18 |
+
|
19 |
+
# Setup Text Splitter
|
20 |
+
self.text_splitter = RecursiveCharacterTextSplitter(
|
21 |
+
chunk_size=1000,
|
22 |
+
chunk_overlap=200,
|
23 |
+
length_function=len,
|
24 |
+
separators=["\n\n", "\n", ". ", " ", ""]
|
25 |
+
)
|
26 |
+
|
27 |
+
def process_pdf(self, file_path: str) -> List[str]:
|
28 |
+
"""Extract text from PDF and split into chunks using RecursiveTextSplitter"""
|
29 |
+
full_text = ""
|
30 |
+
with open(file_path, 'rb') as file:
|
31 |
+
pdf_reader = PyPDF2.PdfReader(file)
|
32 |
+
for page in pdf_reader.pages:
|
33 |
+
text = page.extract_text()
|
34 |
+
if text.strip():
|
35 |
+
full_text += text + "\n"
|
36 |
+
|
37 |
+
# Use RecursiveCharacterTextSplitter for better chunking
|
38 |
+
chunks = self.text_splitter.split_text(full_text)
|
39 |
+
return [chunk.strip() for chunk in chunks if chunk.strip()]
|
40 |
+
|
41 |
+
def process_docx(self, file_path: str) -> List[str]:
|
42 |
+
"""Extract text from DOCX and split into chunks using RecursiveTextSplitter"""
|
43 |
+
doc = Document(file_path)
|
44 |
+
full_text = "\n".join([paragraph.text for paragraph in doc.paragraphs])
|
45 |
+
|
46 |
+
# Use RecursiveCharacterTextSplitter for better chunking
|
47 |
+
chunks = self.text_splitter.split_text(full_text)
|
48 |
+
return [chunk.strip() for chunk in chunks if chunk.strip()]
|
49 |
+
|
50 |
+
def upload_document(self, file_path: str) -> bool:
|
51 |
+
"""Upload and process document"""
|
52 |
+
try:
|
53 |
+
filename = os.path.basename(file_path)
|
54 |
+
|
55 |
+
if file_path.endswith('.pdf'):
|
56 |
+
chunks = self.process_pdf(file_path)
|
57 |
+
elif file_path.endswith('.docx'):
|
58 |
+
chunks = self.process_docx(file_path)
|
59 |
+
elif file_path.endswith('.txt'):
|
60 |
+
with open(file_path, 'r', encoding='utf-8') as f:
|
61 |
+
full_text = f.read()
|
62 |
+
chunks = self.text_splitter.split_text(full_text)
|
63 |
+
chunks = [chunk.strip() for chunk in chunks if chunk.strip()]
|
64 |
+
else:
|
65 |
+
print("Unsupported file format")
|
66 |
+
return False
|
67 |
+
|
68 |
+
# Store chunks in Qdrant
|
69 |
+
for i, chunk in enumerate(chunks):
|
70 |
+
self.vector_store.add_document(
|
71 |
+
text=chunk,
|
72 |
+
metadata={"source": filename, "chunk_id": i}
|
73 |
+
)
|
74 |
+
|
75 |
+
print(f"Uploaded {len(chunks)} chunks from {filename}")
|
76 |
+
return True
|
77 |
+
|
78 |
+
except Exception as e:
|
79 |
+
print(f"Error uploading document: {e}")
|
80 |
+
return False
|
81 |
+
|
82 |
+
|
83 |
+
def is_casual_conversation(self, question: str) -> bool:
|
84 |
+
"""Determine if the question is casual conversation vs CA-specific query"""
|
85 |
+
|
86 |
+
question_lower = question.lower().strip()
|
87 |
+
|
88 |
+
# Pure casual greetings (exact matches or very short)
|
89 |
+
pure_casual = [
|
90 |
+
'hello', 'hi', 'hey', 'good morning', 'good afternoon', 'good evening',
|
91 |
+
'how are you', 'what\'s up', 'greetings', 'thanks', 'thank you',
|
92 |
+
'bye', 'goodbye', 'see you', 'nice to meet you', 'who are you',
|
93 |
+
'what can you do', 'help me', 'what is your name', 'introduce yourself',
|
94 |
+
'how do you work', 'what are you', 'can you help me'
|
95 |
+
]
|
96 |
+
|
97 |
+
# Check for exact matches first
|
98 |
+
if question_lower in pure_casual:
|
99 |
+
return True
|
100 |
+
|
101 |
+
# Check if it's a very short greeting (≤ 4 words) without technical terms
|
102 |
+
words = question_lower.split()
|
103 |
+
if len(words) <= 4:
|
104 |
+
# Technical/question indicators
|
105 |
+
technical_indicators = [
|
106 |
+
'what', 'how', 'why', 'when', 'where', 'explain', 'define', 'calculate',
|
107 |
+
'accounting', 'audit', 'tax', 'finance', 'depreciation', 'balance', 'sheet',
|
108 |
+
'profit', 'loss', 'asset', 'liability', 'equity', 'revenue', 'expense',
|
109 |
+
'journal', 'ledger', 'trial', 'cash', 'flow', 'ratio', 'analysis'
|
110 |
+
]
|
111 |
+
|
112 |
+
# If no technical indicators and contains casual words, it's casual
|
113 |
+
has_casual = any(casual in question_lower for casual in ['hello', 'hi', 'hey', 'thanks', 'bye'])
|
114 |
+
has_technical = any(tech in question_lower for tech in technical_indicators)
|
115 |
+
|
116 |
+
if has_casual and not has_technical:
|
117 |
+
return True
|
118 |
+
|
119 |
+
# Check for greetings followed by actual questions
|
120 |
+
# Pattern: "hello, what is..." or "hi there, how do..."
|
121 |
+
greeting_patterns = [
|
122 |
+
r'^(hello|hi|hey|good morning|good afternoon|good evening),?\s+(what|how|why|when|where|explain|define|tell|can you)',
|
123 |
+
r'^(hello|hi|hey)\s+(there|everyone)?,?\s+(what|how|why|when|where|explain|define|tell|can you)'
|
124 |
+
]
|
125 |
+
|
126 |
+
import re
|
127 |
+
for pattern in greeting_patterns:
|
128 |
+
if re.search(pattern, question_lower):
|
129 |
+
return False # It's a question with greeting prefix, not pure casual
|
130 |
+
|
131 |
+
return False
|
132 |
+
|
133 |
+
def ask_question_stream(self, question: str):
|
134 |
+
"""Ask a question and get a streaming answer"""
|
135 |
+
try:
|
136 |
+
# 1. Check if this is casual conversation
|
137 |
+
if self.is_casual_conversation(question):
|
138 |
+
# Respond as a friendly CA assistant for casual conversation
|
139 |
+
casual_prompt = f"""You are a friendly CA (Chartered Accountant) study assistant. The user said: "{question}"
|
140 |
+
|
141 |
+
Respond naturally and warmly as a CA study assistant. Be helpful and mention that you can help with CA studies, accounting concepts, financial topics, etc. Keep it brief but friendly."""
|
142 |
+
|
143 |
+
for chunk in self.client.models.generate_content_stream(
|
144 |
+
model='gemini-2.5-flash',
|
145 |
+
contents=casual_prompt
|
146 |
+
):
|
147 |
+
yield chunk.text
|
148 |
+
return
|
149 |
+
|
150 |
+
# 2. For CA-specific questions, search for similar documents
|
151 |
+
similar_docs = self.vector_store.search_similar(question, limit=3)
|
152 |
+
|
153 |
+
if similar_docs and len(similar_docs) > 0:
|
154 |
+
# 3. Create context from similar documents
|
155 |
+
context = "\n\n".join([doc["text"] for doc in similar_docs])
|
156 |
+
|
157 |
+
# 4. Create prompt for Gemini with context
|
158 |
+
prompt = f"""You are a CA study assistant. Based on the following context from uploaded documents, answer the question.
|
159 |
+
|
160 |
+
Context:
|
161 |
+
{context}
|
162 |
+
|
163 |
+
Question: {question}
|
164 |
+
|
165 |
+
Please provide a detailed answer based on the context above. If you need more specific information, suggest what documents might be helpful."""
|
166 |
+
|
167 |
+
else:
|
168 |
+
# 5. No documents found, but still be helpful
|
169 |
+
prompt = f"""You are a CA (Chartered Accountant) study assistant. The user asked: "{question}"
|
170 |
+
|
171 |
+
Even though no specific study materials have been uploaded yet, provide a helpful answer based on your knowledge of CA studies, accounting, finance, taxation, and auditing. Be informative and suggest that uploading relevant study materials would help provide more specific and detailed answers.
|
172 |
+
|
173 |
+
Question: {question}"""
|
174 |
+
|
175 |
+
# 6. Get answer from Gemini
|
176 |
+
for chunk in self.client.models.generate_content_stream(
|
177 |
+
model='gemini-2.5-flash',
|
178 |
+
contents=prompt
|
179 |
+
):
|
180 |
+
yield chunk.text
|
181 |
+
|
182 |
+
except Exception as e:
|
183 |
+
yield f"Error generating answer: {e}"
|
184 |
+
|
185 |
+
# Simple usage example
|
186 |
+
if __name__ == "__main__":
|
187 |
+
# Initialize
|
188 |
+
rag = RAG(
|
189 |
+
google_api_key="your_google_api_key",
|
190 |
+
collection_name="ca_documents"
|
191 |
+
)
|
192 |
+
|
193 |
+
# Upload documents
|
194 |
+
# rag.upload_document("path/to/your/ca_document.pdf")
|
195 |
+
|
196 |
+
# Ask questions
|
197 |
+
# answer = rag.ask_question("What is depreciation?")
|
198 |
+
# print(answer)
|
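
Note that ask_question_stream is a generator yielding text chunks, so a caller (such as the FastAPI backend) can forward partial answers as they arrive. A minimal sketch of driving it directly, where the API key value is a placeholder:

    from rag import RAG

    rag = RAG(google_api_key="your_google_api_key", collection_name="ca_documents")

    # Print the answer as chunks arrive instead of waiting for the full response
    for chunk in rag.ask_question_stream("What is depreciation?"):
        print(chunk, end="", flush=True)
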
backend/requirements.txt
ADDED
@@ -0,0 +1,12 @@
google-genai
qdrant-client
sentence-transformers
PyPDF2
python-docx
python-dotenv
langchain-text-splitters
fastapi
uvicorn[standard]
python-multipart
gradio
pycryptodome

backend/vector_store.py
ADDED
@@ -0,0 +1,297 @@
from qdrant_client import QdrantClient, models
from qdrant_client.models import PointStruct, PayloadSchemaType
from sentence_transformers import SentenceTransformer
import uuid
import os
import logging
from typing import List, Dict, Any
from dotenv import load_dotenv

# Load environment variables
load_dotenv()

# Configure logging
logger = logging.getLogger(__name__)

class VectorStore:
    def __init__(self, collection_name: str = "ca_documents"):
        self.collection_name = collection_name

        # Get Qdrant configuration from environment variables
        qdrant_url = os.getenv("QDRANT_URL")
        qdrant_api_key = os.getenv("QDRANT_API_KEY")

        if not qdrant_url or not qdrant_api_key:
            raise ValueError("QDRANT_URL and QDRANT_API_KEY environment variables are required")

        # Connect to Qdrant cluster with API key
        self.client = QdrantClient(
            url=qdrant_url,
            api_key=qdrant_api_key,
        )
        print("Connected to Qdrant")

        # Initialize embedding model with offline support
        self.embedding_model = self._initialize_embedding_model()

        # Create collection with proper indices
        self._create_collection_if_not_exists()

    def _initialize_embedding_model(self):
        """Initialize the embedding model with offline support"""
        try:
            # Try to load the model normally first
            print("Attempting to load sentence transformer model...")
            model = SentenceTransformer("all-MiniLM-L6-v2")
            print("Successfully loaded sentence transformer model")
            return model

        except Exception as e:
            print(f"Failed to load model online: {e}")
            print("Attempting to load model in offline mode...")

            try:
                # Try to load from cache with offline mode
                import os
                os.environ['TRANSFORMERS_OFFLINE'] = '1'
                os.environ['HF_HUB_OFFLINE'] = '1'

                model = SentenceTransformer("all-MiniLM-L6-v2", cache_folder=None)
                print("Successfully loaded model in offline mode")
                return model

            except Exception as offline_error:
                print(f"Failed to load model in offline mode: {offline_error}")

                # Try to find a local cache directory
                try:
                    import transformers
                    cache_dir = os.path.join(os.path.expanduser("~"), ".cache", "huggingface", "transformers")
                    if os.path.exists(cache_dir):
                        print(f"Looking for cached model in: {cache_dir}")

                        # Try to load from specific cache directory
                        model = SentenceTransformer("all-MiniLM-L6-v2", cache_folder=cache_dir)
                        print("Successfully loaded model from cache")
                        return model

                except Exception as cache_error:
                    print(f"Failed to load from cache: {cache_error}")

                # If all else fails, provide instructions
                error_msg = """
Failed to initialize sentence transformer model. This is likely due to network connectivity issues.

Solutions:
1. Check your internet connection
2. If behind a corporate firewall, ensure huggingface.co is accessible
3. Pre-download the model when you have internet access by running:
   python -c "from sentence_transformers import SentenceTransformer; SentenceTransformer('all-MiniLM-L6-v2')"
4. Or manually download the model and place it in your cache directory

For now, the application will not work without the embedding model.
"""

                print(error_msg)
                raise RuntimeError(f"Cannot initialize embedding model: {str(e)}")

    def _create_collection_if_not_exists(self) -> bool:
        """
        Create collection with proper payload indices if it doesn't exist.

        Returns:
            bool: True if collection exists or was created successfully
        """
        try:
            # Check if collection exists
            collections = self.client.get_collections()
            collection_names = [col.name for col in collections.collections]

            if self.collection_name in collection_names:
                print(f"Collection '{self.collection_name}' already exists")
                return True

            print(f"Creating new collection: {self.collection_name}")

            # Vector size for all-MiniLM-L6-v2 is 384
            vector_size = 384

            # Create collection with vector configuration
            self.client.create_collection(
                collection_name=self.collection_name,
                vectors_config=models.VectorParams(
                    size=vector_size,
                    distance=models.Distance.COSINE
                ),
                hnsw_config=models.HnswConfigDiff(
                    payload_m=16,
                    m=0,
                ),
            )

            # Create payload indices
            payload_indices = {
                "document_id": PayloadSchemaType.KEYWORD,
                "content": PayloadSchemaType.TEXT
            }

            for field_name, schema_type in payload_indices.items():
                self.client.create_payload_index(
                    collection_name=self.collection_name,
                    field_name=field_name,
                    field_schema=schema_type
                )

            print(f"Successfully created collection: {self.collection_name}")
            return True

        except Exception as e:
            error_msg = f"Failed to create collection {self.collection_name}: {str(e)}"
            logger.error(error_msg, exc_info=True)
            print(error_msg)
            return False

    def add_document(self, text: str, metadata: Dict = None) -> bool:
        """Add a document to the collection"""
        try:
            # Generate embedding
            embedding = self.embedding_model.encode([text])[0]

            # Generate document ID
            document_id = str(uuid.uuid4())

            # Create payload with indexed fields
            payload = {
                "document_id": document_id,  # KEYWORD index
                "content": text,  # TEXT index - stores the actual text content
            }

            # Add metadata fields if provided
            if metadata:
                payload.update(metadata)

            # Create point
            point = PointStruct(
                id=document_id,
                vector=embedding.tolist(),
                payload=payload
            )

            # Store in Qdrant
            self.client.upsert(
                collection_name=self.collection_name,
                points=[point]
            )

            return True
        except Exception as e:
            print(f"Error adding document: {e}")
            return False

    def search_similar(self, query: str, limit: int = 5) -> List[Dict]:
        """Search for similar documents"""
        try:
            # Generate query embedding
            query_embedding = self.embedding_model.encode([query])[0]

            # Search in Qdrant
            results = self.client.search(
                collection_name=self.collection_name,
                query_vector=query_embedding.tolist(),
                limit=limit
            )

            # Return results
            return [
                {
                    "text": hit.payload["content"],  # Use content field
                    "document_id": hit.payload.get("document_id"),
                    "score": hit.score,
                    # Include any additional metadata fields
                    **{k: v for k, v in hit.payload.items() if k not in ["content", "document_id"]}
                }
                for hit in results
            ]

        except Exception as e:
            print(f"Error searching: {e}")
            return []

    def search_by_document_id(self, document_id: str) -> Dict:
        """Search for a specific document by its ID using the indexed field"""
        try:
            # Use scroll to find document by document_id
            results = self.client.scroll(
                collection_name=self.collection_name,
                scroll_filter=models.Filter(
                    must=[
                        models.FieldCondition(
                            key="document_id",
                            match=models.MatchValue(value=document_id)
                        )
                    ]
                ),
                limit=1
            )

            if results[0]:  # results is a tuple (points, next_page_offset)
                hit = results[0][0]  # Get first point
                return {
                    "text": hit.payload["content"],  # Use content field
                    "document_id": hit.payload.get("document_id"),
                    # Include any additional metadata fields
                    **{k: v for k, v in hit.payload.items() if k not in ["content", "document_id"]}
                }
            else:
                return None

        except Exception as e:
            print(f"Error searching by document ID: {e}")
            return None

    def search_by_content(self, content_query: str, limit: int = 5) -> List[Dict]:
        """Search for documents by content using the TEXT index"""
        try:
            # Use scroll with text search filter
            results = self.client.scroll(
                collection_name=self.collection_name,
                scroll_filter=models.Filter(
                    must=[
                        models.FieldCondition(
                            key="content",
                            match=models.MatchText(text=content_query)
                        )
                    ]
                ),
                limit=limit
            )

            # Return results
            return [
                {
                    "text": hit.payload["content"],  # Use content field
                    "document_id": hit.payload.get("document_id"),
                    # Include any additional metadata fields
                    **{k: v for k, v in hit.payload.items() if k not in ["content", "document_id"]}
                }
                for hit in results[0]  # results[0] contains the points
            ]

        except Exception as e:
            print(f"Error searching by content: {e}")
            return []

    def get_collection_info(self) -> Dict:
        """Get information about the collection"""
        try:
            collection_info = self.client.get_collection(self.collection_name)
            return {
                "name": collection_info.config.name,
                "vector_size": collection_info.config.params.vectors.size,
                "distance": collection_info.config.params.vectors.distance,
                "points_count": collection_info.points_count,
                "indexed_only": collection_info.config.params.vectors.on_disk
            }
        except Exception as e:
            print(f"Error getting collection info: {e}")
            return {}

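A minimal usage sketch of the class above, assuming QDRANT_URL and QDRANT_API_KEY are set in the environment (for example via the .env file loaded by python-dotenv); the document text and file name are illustrative:

    from vector_store import VectorStore

    store = VectorStore(collection_name="ca_documents")

    # Index one chunk of text together with its source metadata
    store.add_document(
        "Depreciation allocates the cost of a tangible asset over its useful life.",
        metadata={"source": "accounting_notes.pdf", "chunk_id": 0},
    )

    # Semantic search returns the stored content plus similarity score and metadata
    for hit in store.search_similar("How is depreciation defined?", limit=3):
        print(round(hit["score"], 3), hit["text"][:80])
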
docker-compose.yml
ADDED
@@ -0,0 +1,22 @@
version: '3.8'

services:
  backend:
    build:
      context: ./backend
      dockerfile: Dockerfile
    image: ca-study-assistant-backend
    container_name: backend-container
    restart: unless-stopped

  frontend:
    build:
      context: ./frontend
      dockerfile: Dockerfile
    image: ca-study-assistant-frontend
    container_name: frontend-container
    ports:
      - "80:80" # Expose port 80 to the host (Hugging Face Spaces)
    restart: unless-stopped
    depends_on:
      - backend # Ensures the backend starts before the frontend

frontend/Dockerfile
ADDED
@@ -0,0 +1,34 @@
# Stage 1: Build the React application
FROM node:18-alpine AS build

WORKDIR /app

# Copy package.json and package-lock.json
COPY package.json package-lock.json ./

# Install dependencies
RUN npm install

# Copy the rest of the application source code
COPY . .

# Build the application
RUN npm run build

# Stage 2: Serve the application with Nginx
FROM nginx:1.23-alpine

# Copy the build output from the build stage
COPY --from=build /app/build /usr/share/nginx/html

# Remove the default Nginx configuration
RUN rm /etc/nginx/conf.d/default.conf

# Copy our custom Nginx configuration
COPY nginx.conf /etc/nginx/conf.d

# Expose port 80
EXPOSE 80

# Start Nginx
CMD ["nginx", "-g", "daemon off;"]

frontend/nginx.conf
ADDED
@@ -0,0 +1,29 @@
server {
    listen 80;

    # Root directory for the React app
    root /usr/share/nginx/html;
    index index.html index.htm;

    # Handle client-side routing for React
    location / {
        try_files $uri /index.html;
    }

    # Proxy API requests to the backend service
    # Any request to http://<your-space-url>/api/...
    # will be forwarded to http://backend:8000/...
    location /api {
        proxy_pass http://backend:8000; # 'backend' is the service name in docker-compose.yml
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }

    # Optional: Improve performance with caching for static assets
    location ~* \.(css|js|jpg|jpeg|png|gif|ico|svg)$ {
        expires 1y;
        add_header Cache-Control "public";
    }
}

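One detail worth noting: because proxy_pass has no URI part, Nginx preserves the /api prefix when forwarding, so a request to /api/ask reaches the backend as /api/ask rather than /ask; the FastAPI routes therefore need to be mounted under /api (or the proxy_pass adjusted). A minimal sketch of calling the proxied backend from a client, using a hypothetical endpoint name (the real routes live in backend/backend_api.py, not shown here):

    import requests

    # /api/ask is a placeholder; check backend/backend_api.py for the actual route names.
    resp = requests.post(
        "http://localhost/api/ask",
        json={"question": "What is working capital?"},
        timeout=60,
    )
    print(resp.status_code, resp.text[:200])
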
frontend/package-lock.json
ADDED
The diff for this file is too large to render.
frontend/package.json
ADDED
@@ -0,0 +1,55 @@
{
  "name": "ca-study-assistant-frontend",
  "version": "0.1.0",
  "private": true,
  "dependencies": {
    "@headlessui/react": "^1.7.17",
    "@heroicons/react": "^2.0.18",
    "@tailwindcss/typography": "^0.5.10",
    "axios": "^1.6.2",
    "framer-motion": "^10.16.16",
    "react": "^18.2.0",
    "react-dom": "^18.2.0",
    "react-dropzone": "^14.2.3",
    "react-hot-toast": "^2.4.1",
    "react-markdown": "^9.0.1",
    "react-scripts": "5.0.1",
    "react-syntax-highlighter": "^15.5.0",
    "remark-gfm": "^4.0.0",
    "uuid": "^9.0.1"
  },
  "devDependencies": {
    "@types/react": "^18.2.43",
    "@types/react-dom": "^18.2.17",
    "@types/uuid": "^9.0.7",
    "autoprefixer": "^10.4.16",
    "postcss": "^8.4.32",
    "tailwindcss": "^3.3.6",
    "typescript": "^4.9.5"
  },
  "scripts": {
    "start": "react-scripts start",
    "build": "react-scripts build",
    "test": "react-scripts test",
    "eject": "react-scripts eject"
  },
  "eslintConfig": {
    "extends": [
      "react-app",
      "react-app/jest"
    ]
  },
  "browserslist": {
    "production": [
      ">0.2%",
      "not dead",
      "not op_mini all"
    ],
    "development": [
      "last 1 chrome version",
      "last 1 firefox version",
      "last 1 safari version"
    ]
  },
  "proxy": "http://localhost:8000"
}

frontend/postcss.config.js
ADDED
@@ -0,0 +1,6 @@
module.exports = {
  plugins: {
    tailwindcss: {},
    autoprefixer: {},
  },
}

frontend/public/index.html
ADDED
@@ -0,0 +1,22 @@
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="utf-8" />
    <link rel="icon" href="%PUBLIC_URL%/favicon.ico" />
    <meta name="viewport" content="width=device-width, initial-scale=1" />
    <meta name="theme-color" content="#000000" />
    <meta name="description" content="CA Study Assistant - AI-powered study companion for Chartered Accountant exam preparation" />
    <link rel="apple-touch-icon" href="%PUBLIC_URL%/logo192.png" />
    <link rel="manifest" href="%PUBLIC_URL%/manifest.json" />

    <!-- Preconnect to fonts -->
    <link rel="preconnect" href="https://fonts.googleapis.com">
    <link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>

    <title>CA Study Assistant</title>
  </head>
  <body>
    <noscript>You need to enable JavaScript to run this app.</noscript>
    <div id="root"></div>
  </body>
</html>

frontend/public/manifest.json
ADDED
@@ -0,0 +1,8 @@
{
  "short_name": "React App",
  "name": "Create React App Sample",
  "start_url": ".",
  "display": "standalone",
  "theme_color": "#000000",
  "background_color": "#ffffff"
}

frontend/src/App.js
ADDED
@@ -0,0 +1,218 @@
1 |
+
import React, { useState, useEffect } from 'react';
|
2 |
+
import { Toaster } from 'react-hot-toast';
|
3 |
+
import ChatInterface from './components/ChatInterface';
|
4 |
+
import Sidebar from './components/Sidebar';
|
5 |
+
import WelcomeScreen from './components/WelcomeScreen';
|
6 |
+
import { SunIcon, MoonIcon, HomeIcon } from '@heroicons/react/24/outline';
|
7 |
+
import ConversationStorage from './utils/conversationStorage';
|
8 |
+
|
9 |
+
function App() {
|
10 |
+
const [darkMode, setDarkMode] = useState(false);
|
11 |
+
const [sidebarOpen, setSidebarOpen] = useState(false);
|
12 |
+
const [chatStarted, setChatStarted] = useState(false);
|
13 |
+
const [conversations, setConversations] = useState([]);
|
14 |
+
const [activeConversationId, setActiveConversationId] = useState(null);
|
15 |
+
|
16 |
+
useEffect(() => {
|
17 |
+
// Check for saved theme preference or default to light mode
|
18 |
+
const savedTheme = localStorage.getItem('theme');
|
19 |
+
const prefersDark = window.matchMedia('(prefers-color-scheme: dark)').matches;
|
20 |
+
|
21 |
+
if (savedTheme === 'dark' || (!savedTheme && prefersDark)) {
|
22 |
+
setDarkMode(true);
|
23 |
+
document.documentElement.classList.add('dark');
|
24 |
+
}
|
25 |
+
|
26 |
+
// Load conversations from localStorage
|
27 |
+
const savedConversations = ConversationStorage.loadConversations();
|
28 |
+
if (savedConversations.length > 0) {
|
29 |
+
setConversations(savedConversations);
|
30 |
+
setChatStarted(true);
|
31 |
+
// Set the most recent conversation as active
|
32 |
+
setActiveConversationId(savedConversations[0].id);
|
33 |
+
}
|
34 |
+
}, []);
|
35 |
+
|
36 |
+
const toggleDarkMode = () => {
|
37 |
+
setDarkMode(!darkMode);
|
38 |
+
if (darkMode) {
|
39 |
+
document.documentElement.classList.remove('dark');
|
40 |
+
localStorage.setItem('theme', 'light');
|
41 |
+
} else {
|
42 |
+
document.documentElement.classList.add('dark');
|
43 |
+
localStorage.setItem('theme', 'dark');
|
44 |
+
}
|
45 |
+
};
|
46 |
+
|
47 |
+
const startNewChat = () => {
|
48 |
+
const newConversation = {
|
49 |
+
id: Date.now().toString(),
|
50 |
+
title: 'New Conversation',
|
51 |
+
messages: [],
|
52 |
+
createdAt: new Date(),
|
53 |
+
updatedAt: new Date(),
|
54 |
+
};
|
55 |
+
|
56 |
+
// Save to localStorage
|
57 |
+
ConversationStorage.addConversation(newConversation);
|
58 |
+
|
59 |
+
setConversations(prev => [newConversation, ...prev]);
|
60 |
+
setActiveConversationId(newConversation.id);
|
61 |
+
setChatStarted(true);
|
62 |
+
setSidebarOpen(false);
|
63 |
+
};
|
64 |
+
|
65 |
+
const deleteConversation = (conversationId) => {
|
66 |
+
// Delete from localStorage
|
67 |
+
ConversationStorage.deleteConversation(conversationId);
|
68 |
+
|
69 |
+
setConversations(prev => prev.filter(conv => conv.id !== conversationId));
|
70 |
+
|
71 |
+
// If the deleted conversation was active, switch to another one
|
72 |
+
if (activeConversationId === conversationId) {
|
73 |
+
const remainingConversations = conversations.filter(conv => conv.id !== conversationId);
|
74 |
+
if (remainingConversations.length > 0) {
|
75 |
+
setActiveConversationId(remainingConversations[0].id);
|
76 |
+
} else {
|
77 |
+
setActiveConversationId(null);
|
78 |
+
setChatStarted(false);
|
79 |
+
}
|
80 |
+
}
|
81 |
+
};
|
82 |
+
|
83 |
+
const updateConversations = (updatedConversations) => {
|
84 |
+
setConversations(updatedConversations);
|
85 |
+
// Save to localStorage
|
86 |
+
ConversationStorage.saveConversations(updatedConversations);
|
87 |
+
};
|
88 |
+
|
89 |
+
const handleFirstMessage = (message) => {
|
90 |
+
if (!chatStarted) {
|
91 |
+
startNewChat();
|
92 |
+
}
|
93 |
+
};
|
94 |
+
|
95 |
+
const goBackToHome = () => {
|
96 |
+
setActiveConversationId(null);
|
97 |
+
setChatStarted(false);
|
98 |
+
setSidebarOpen(false);
|
99 |
+
};
|
100 |
+
|
101 |
+
return (
|
102 |
+
<div className={`min-h-screen transition-colors duration-200 ${
|
103 |
+
darkMode
|
104 |
+
? 'bg-gray-900 text-white'
|
105 |
+
: 'bg-gray-50 text-gray-900'
|
106 |
+
}`}>
|
107 |
+
{/* Header */}
|
108 |
+
<header className={`fixed top-0 left-0 right-0 z-50 ${
|
109 |
+
darkMode
|
110 |
+
? 'bg-gray-800/95 border-gray-700'
|
111 |
+
: 'bg-white/95 border-gray-200'
|
112 |
+
} backdrop-blur-sm border-b`}>
|
113 |
+
<div className="flex items-center justify-between px-4 py-3">
|
114 |
+
<div className="flex items-center space-x-4">
|
115 |
+
<button
|
116 |
+
onClick={() => setSidebarOpen(!sidebarOpen)}
|
117 |
+
className={`p-2 rounded-lg transition-colors ${
|
118 |
+
darkMode
|
119 |
+
? 'hover:bg-gray-700 text-gray-300'
|
120 |
+
: 'hover:bg-gray-100 text-gray-600'
|
121 |
+
}`}
|
122 |
+
>
|
123 |
+
<svg className="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
|
124 |
+
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M4 6h16M4 12h16M4 18h16" />
|
125 |
+
</svg>
|
126 |
+
</button>
|
127 |
+
|
128 |
+
<button
|
129 |
+
onClick={goBackToHome}
|
130 |
+
className="text-lg font-semibold gradient-text hover:opacity-80 transition-opacity"
|
131 |
+
>
|
132 |
+
CA Study Assistant
|
133 |
+
</button>
|
134 |
+
</div>
|
135 |
+
|
136 |
+
<div className="flex items-center space-x-2">
|
137 |
+
{chatStarted && (
|
138 |
+
<button
|
139 |
+
onClick={goBackToHome}
|
140 |
+
className={`p-2 rounded-lg transition-colors ${
|
141 |
+
darkMode
|
142 |
+
? 'hover:bg-gray-700 text-gray-300'
|
143 |
+
: 'hover:bg-gray-100 text-gray-600'
|
144 |
+
}`}
|
145 |
+
title="Back to Home"
|
146 |
+
>
|
147 |
+
<HomeIcon className="w-5 h-5" />
|
148 |
+
</button>
|
149 |
+
)}
|
150 |
+
|
151 |
+
<button
|
152 |
+
onClick={toggleDarkMode}
|
153 |
+
className={`p-2 rounded-lg transition-colors ${
|
154 |
+
darkMode
|
155 |
+
? 'hover:bg-gray-700 text-gray-300'
|
156 |
+
: 'hover:bg-gray-100 text-gray-600'
|
157 |
+
}`}
|
158 |
+
>
|
159 |
+
{darkMode ? (
|
160 |
+
<SunIcon className="w-5 h-5" />
|
161 |
+
) : (
|
162 |
+
<MoonIcon className="w-5 h-5" />
|
163 |
+
)}
|
164 |
+
</button>
|
165 |
+
</div>
|
166 |
+
</div>
|
167 |
+
</header>
|
168 |
+
|
169 |
+
{/* Sidebar */}
|
170 |
+
<Sidebar
|
171 |
+
open={sidebarOpen}
|
172 |
+
onClose={() => setSidebarOpen(false)}
|
173 |
+
conversations={conversations}
|
174 |
+
activeConversationId={activeConversationId}
|
175 |
+
onConversationSelect={setActiveConversationId}
|
176 |
+
onNewChat={startNewChat}
|
177 |
+
onDeleteConversation={deleteConversation}
|
178 |
+
onBackToHome={goBackToHome}
|
179 |
+
darkMode={darkMode}
|
180 |
+
/>
|
181 |
+
|
182 |
+
{/* Main Content */}
|
183 |
+
<main className={`transition-all duration-200 ${
|
184 |
+
sidebarOpen ? 'md:ml-64' : 'ml-0'
|
185 |
+
} pt-16`}>
|
186 |
+
{chatStarted ? (
|
187 |
+
<ChatInterface
|
188 |
+
conversationId={activeConversationId}
|
189 |
+
conversations={conversations}
|
190 |
+
setConversations={updateConversations}
|
191 |
+
darkMode={darkMode}
|
192 |
+
/>
|
193 |
+
) : (
|
194 |
+
<WelcomeScreen
|
195 |
+
onStartChat={handleFirstMessage}
|
196 |
+
onNewChat={startNewChat}
|
197 |
+
darkMode={darkMode}
|
198 |
+
/>
|
199 |
+
)}
|
200 |
+
</main>
|
201 |
+
|
202 |
+
{/* Toast notifications */}
|
203 |
+
<Toaster
|
204 |
+
position="top-right"
|
205 |
+
toastOptions={{
|
206 |
+
duration: 4000,
|
207 |
+
style: {
|
208 |
+
background: darkMode ? '#374151' : '#ffffff',
|
209 |
+
color: darkMode ? '#f9fafb' : '#111827',
|
210 |
+
border: darkMode ? '1px solid #4b5563' : '1px solid #e5e7eb',
|
211 |
+
},
|
212 |
+
}}
|
213 |
+
/>
|
214 |
+
</div>
|
215 |
+
);
|
216 |
+
}
|
217 |
+
|
218 |
+
export default App;
|
frontend/src/components/ChatInterface.js
ADDED
@@ -0,0 +1,408 @@
1 |
+
import React, { useState, useRef, useEffect } from 'react';
|
2 |
+
import { motion, AnimatePresence } from 'framer-motion';
|
3 |
+
import { PaperAirplaneIcon, StopIcon } from '@heroicons/react/24/solid';
|
4 |
+
import MessageBubble from './MessageBubble';
|
5 |
+
import TypingIndicator from './TypingIndicator';
|
6 |
+
import FileUploader from './FileUploader';
|
7 |
+
import { sendMessage, sendMessageStream } from '../services/api';
|
8 |
+
import toast from 'react-hot-toast';
|
9 |
+
|
10 |
+
const ChatInterface = ({ conversationId, conversations, setConversations, darkMode }) => {
|
11 |
+
const [message, setMessage] = useState('');
|
12 |
+
const [isLoading, setIsLoading] = useState(false);
|
13 |
+
const [showFileUploader, setShowFileUploader] = useState(false);
|
14 |
+
const messagesEndRef = useRef(null);
|
15 |
+
const textareaRef = useRef(null);
|
16 |
+
|
17 |
+
const currentConversation = conversations.find(conv => conv.id === conversationId);
|
18 |
+
const messages = currentConversation?.messages || [];
|
19 |
+
|
20 |
+
const scrollToBottom = () => {
|
21 |
+
messagesEndRef.current?.scrollIntoView({ behavior: "smooth" });
|
22 |
+
};
|
23 |
+
|
24 |
+
useEffect(() => {
|
25 |
+
scrollToBottom();
|
26 |
+
}, [messages]);
|
27 |
+
|
28 |
+
const handleSubmit = async (e) => {
|
29 |
+
e.preventDefault();
|
30 |
+
if (!message.trim() || isLoading) return;
|
31 |
+
|
32 |
+
const userMessage = {
|
33 |
+
id: Date.now().toString(),
|
34 |
+
role: 'user',
|
35 |
+
content: message.trim(),
|
36 |
+
timestamp: new Date(),
|
37 |
+
};
|
38 |
+
|
39 |
+
// Add user message immediately
|
40 |
+
setConversations(prev => prev.map(conv =>
|
41 |
+
conv.id === conversationId
|
42 |
+
? {
|
43 |
+
...conv,
|
44 |
+
messages: [...conv.messages, userMessage],
|
45 |
+
title: conv.messages.length === 0 ? message.slice(0, 50) + '...' : conv.title
|
46 |
+
}
|
47 |
+
: conv
|
48 |
+
));
|
49 |
+
|
50 |
+
setMessage('');
|
51 |
+
setIsLoading(true);
|
52 |
+
const assistantMessageId = (Date.now() + 1).toString();
|
53 |
+
|
54 |
+
try {
|
55 |
+
let fullResponse = '';
|
56 |
+
|
57 |
+
// Add a placeholder for the assistant's message
|
58 |
+
setConversations(prev => prev.map(conv =>
|
59 |
+
conv.id === conversationId
|
60 |
+
? { ...conv, messages: [...conv.messages, { id: assistantMessageId, role: 'assistant', content: '', timestamp: new Date() }] }
|
61 |
+
: conv
|
62 |
+
));
|
63 |
+
|
64 |
+
await sendMessageStream(message.trim(), (chunk) => {
|
65 |
+
fullResponse += chunk;
|
66 |
+
setConversations(prev => prev.map(conv =>
|
67 |
+
conv.id === conversationId
|
68 |
+
? {
|
69 |
+
...conv,
|
70 |
+
messages: conv.messages.map(msg =>
|
71 |
+
msg.id === assistantMessageId
|
72 |
+
? { ...msg, content: fullResponse }
|
73 |
+
: msg
|
74 |
+
),
|
75 |
+
}
|
76 |
+
: conv
|
77 |
+
));
|
78 |
+
});
|
79 |
+
|
80 |
+
} catch (error) {
|
81 |
+
toast.error('Failed to send message. Please try again.');
|
82 |
+
console.error('Error sending message:', error);
|
83 |
+
// Optional: remove placeholder on error
|
84 |
+
setConversations(prev => prev.map(conv =>
|
85 |
+
conv.id === conversationId
|
86 |
+
? { ...conv, messages: conv.messages.filter(msg => msg.id !== assistantMessageId) }
|
87 |
+
: conv
|
88 |
+
));
|
89 |
+
} finally {
|
90 |
+
setIsLoading(false);
|
91 |
+
}
|
92 |
+
};
|
93 |
+
|
94 |
+
const handleKeyDown = (e) => {
|
95 |
+
if (e.key === 'Enter' && !e.shiftKey) {
|
96 |
+
e.preventDefault();
|
97 |
+
handleSubmit(e);
|
98 |
+
}
|
99 |
+
};
|
100 |
+
|
101 |
+
const adjustTextareaHeight = () => {
|
102 |
+
const textarea = textareaRef.current;
|
103 |
+
if (textarea) {
|
104 |
+
textarea.style.height = 'auto';
|
105 |
+
textarea.style.height = Math.min(textarea.scrollHeight, 120) + 'px';
|
106 |
+
}
|
107 |
+
};
|
108 |
+
|
109 |
+
useEffect(() => {
|
110 |
+
adjustTextareaHeight();
|
111 |
+
}, [message]);
|
112 |
+
|
113 |
+
return (
|
114 |
+
<div className="flex flex-col h-screen">
|
115 |
+
{/* Messages Container */}
|
116 |
+
<div className="flex-1 overflow-y-auto px-4 py-6">
|
117 |
+
<div className="max-w-3xl mx-auto">
|
118 |
+
{/* Empty State */}
|
119 |
+
{messages.length === 0 && !isLoading && (
|
120 |
+
<motion.div
|
121 |
+
initial={{ opacity: 0, y: 20 }}
|
122 |
+
animate={{ opacity: 1, y: 0 }}
|
123 |
+
transition={{ duration: 0.6 }}
|
124 |
+
className="flex flex-col items-center justify-center min-h-[60vh] text-center"
|
125 |
+
>
|
126 |
+
{/* CA Assistant Avatar */}
|
127 |
+
<motion.div
|
128 |
+
initial={{ scale: 0.8 }}
|
129 |
+
animate={{ scale: 1 }}
|
130 |
+
transition={{ duration: 0.5, delay: 0.2 }}
|
131 |
+
className={`w-20 h-20 rounded-full flex items-center justify-center mb-6 ${
|
132 |
+
darkMode
|
133 |
+
? 'bg-gradient-to-br from-primary-600 to-purple-600'
|
134 |
+
: 'bg-gradient-to-br from-primary-500 to-purple-500'
|
135 |
+
} shadow-lg`}
|
136 |
+
>
|
137 |
+
<svg className="w-10 h-10 text-white" fill="none" stroke="currentColor" viewBox="0 0 24 24">
|
138 |
+
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2}
|
139 |
+
d="M12 6.253v13m0-13C10.832 5.477 9.246 5 7.5 5S4.168 5.477 3 6.253v13C4.168 18.477 5.754 18 7.5 18s3.332.477 4.5 1.253m0-13C13.168 5.477 14.754 5 16.5 5c1.746 0 3.332.477 4.5 1.253v13C19.832 18.477 18.246 18 16.5 18c-1.746 0-3.332.477-4.5 1.253" />
|
140 |
+
</svg>
|
141 |
+
</motion.div>
|
142 |
+
|
143 |
+
{/* Welcome Message */}
|
144 |
+
<motion.h2
|
145 |
+
initial={{ opacity: 0 }}
|
146 |
+
animate={{ opacity: 1 }}
|
147 |
+
transition={{ delay: 0.4 }}
|
148 |
+
className="text-2xl md:text-3xl font-bold mb-3 gradient-text"
|
149 |
+
>
|
150 |
+
Hello! I'm your CA Study Assistant
|
151 |
+
</motion.h2>
|
152 |
+
|
153 |
+
<motion.p
|
154 |
+
initial={{ opacity: 0 }}
|
155 |
+
animate={{ opacity: 1 }}
|
156 |
+
transition={{ delay: 0.5 }}
|
157 |
+
className={`text-lg mb-8 ${darkMode ? 'text-gray-300' : 'text-gray-600'}`}
|
158 |
+
>
|
159 |
+
I'm here to help you with accounting, finance, taxation, and auditing concepts.
|
160 |
+
Ask me anything or upload your study materials!
|
161 |
+
</motion.p>
|
162 |
+
|
163 |
+
{/* Quick Start Suggestions */}
|
164 |
+
<motion.div
|
165 |
+
initial={{ opacity: 0, y: 20 }}
|
166 |
+
animate={{ opacity: 1, y: 0 }}
|
167 |
+
transition={{ delay: 0.6 }}
|
168 |
+
className="w-full max-w-2xl"
|
169 |
+
>
|
170 |
+
<h3 className={`text-sm font-semibold mb-4 ${
|
171 |
+
darkMode ? 'text-gray-400' : 'text-gray-500'
|
172 |
+
}`}>
|
173 |
+
Try asking me about:
|
174 |
+
</h3>
|
175 |
+
|
176 |
+
<div className="grid grid-cols-1 md:grid-cols-2 gap-3">
|
177 |
+
{[
|
178 |
+
{ icon: "📊", text: "Financial statement analysis", query: "Explain financial statement analysis" },
|
179 |
+
{ icon: "💰", text: "Depreciation methods", query: "What are different depreciation methods?" },
|
180 |
+
{ icon: "🏦", text: "Working capital management", query: "Explain working capital management" },
|
181 |
+
{ icon: "📈", text: "Ratio analysis", query: "How to perform ratio analysis?" },
|
182 |
+
{ icon: "📋", text: "Auditing procedures", query: "What are key auditing procedures?" },
|
183 |
+
{ icon: "💼", text: "Tax planning strategies", query: "Explain tax planning strategies" }
|
184 |
+
].map((suggestion, index) => (
|
185 |
+
<motion.button
|
186 |
+
key={index}
|
187 |
+
initial={{ opacity: 0, x: -20 }}
|
188 |
+
animate={{ opacity: 1, x: 0 }}
|
189 |
+
transition={{ delay: 0.7 + index * 0.1 }}
|
190 |
+
whileHover={{ scale: 1.02, y: -2 }}
|
191 |
+
whileTap={{ scale: 0.98 }}
|
192 |
+
onClick={() => setMessage(suggestion.query)}
|
193 |
+
className={`flex items-center p-4 rounded-xl text-left transition-all ${
|
194 |
+
darkMode
|
195 |
+
? 'bg-gray-800 hover:bg-gray-700 border-gray-700 text-gray-300'
|
196 |
+
: 'bg-gray-50 hover:bg-gray-100 border-gray-200 text-gray-700'
|
197 |
+
} border hover:border-primary-300 hover:shadow-md`}
|
198 |
+
>
|
199 |
+
<span className="text-2xl mr-3">{suggestion.icon}</span>
|
200 |
+
<span className="font-medium">{suggestion.text}</span>
|
201 |
+
</motion.button>
|
202 |
+
))}
|
203 |
+
</div>
|
204 |
+
</motion.div>
|
205 |
+
|
206 |
+
{/* Upload Reminder */}
|
207 |
+
<motion.div
|
208 |
+
initial={{ opacity: 0 }}
|
209 |
+
animate={{ opacity: 1 }}
|
210 |
+
transition={{ delay: 1.2 }}
|
211 |
+
className={`mt-8 p-4 rounded-xl ${
|
212 |
+
darkMode
|
213 |
+
? 'bg-primary-900/20 border-primary-700/30'
|
214 |
+
: 'bg-primary-50 border-primary-200'
|
215 |
+
} border`}
|
216 |
+
>
|
217 |
+
<div className="flex items-center justify-center">
|
218 |
+
<svg className={`w-5 h-5 mr-2 ${
|
219 |
+
darkMode ? 'text-primary-400' : 'text-primary-600'
|
220 |
+
}`} fill="none" stroke="currentColor" viewBox="0 0 24 24">
|
221 |
+
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2}
|
222 |
+
d="M9 12h6m-6 4h6m2 5H7a2 2 0 01-2-2V5a2 2 0 012-2h5.586a1 1 0 01.707.293l5.414 5.414a1 1 0 01.293.707V19a2 2 0 01-2 2z" />
|
223 |
+
</svg>
|
224 |
+
<span className={`text-sm ${
|
225 |
+
darkMode ? 'text-primary-300' : 'text-primary-700'
|
226 |
+
}`}>
|
227 |
+
💡 Upload your study materials for more specific and detailed answers
|
228 |
+
</span>
|
229 |
+
</div>
|
230 |
+
</motion.div>
|
231 |
+
</motion.div>
|
232 |
+
)}
|
233 |
+
|
234 |
+
{/* Messages */}
|
235 |
+
<AnimatePresence>
|
236 |
+
{messages.map((msg, index) => (
|
237 |
+
<MessageBubble
|
238 |
+
key={msg.id}
|
239 |
+
message={msg}
|
240 |
+
darkMode={darkMode}
|
241 |
+
isLast={index === messages.length - 1}
|
242 |
+
/>
|
243 |
+
))}
|
244 |
+
</AnimatePresence>
|
245 |
+
|
246 |
+
{isLoading && <TypingIndicator darkMode={darkMode} />}
|
247 |
+
|
248 |
+
<div ref={messagesEndRef} />
|
249 |
+
</div>
|
250 |
+
</div>
|
251 |
+
|
252 |
+
{/* File Uploader Modal */}
|
253 |
+
<AnimatePresence>
|
254 |
+
{showFileUploader && (
|
255 |
+
<motion.div
|
256 |
+
initial={{ opacity: 0 }}
|
257 |
+
animate={{ opacity: 1 }}
|
258 |
+
exit={{ opacity: 0 }}
|
259 |
+
className="fixed inset-0 bg-black bg-opacity-50 flex items-center justify-center z-50 p-4"
|
260 |
+
onClick={() => setShowFileUploader(false)}
|
261 |
+
>
|
262 |
+
<motion.div
|
263 |
+
initial={{ scale: 0.9, opacity: 0 }}
|
264 |
+
animate={{ scale: 1, opacity: 1 }}
|
265 |
+
exit={{ scale: 0.9, opacity: 0 }}
|
266 |
+
onClick={(e) => e.stopPropagation()}
|
267 |
+
className={`max-w-md w-full p-6 rounded-2xl ${
|
268 |
+
darkMode ? 'bg-gray-800' : 'bg-white'
|
269 |
+
} shadow-2xl`}
|
270 |
+
>
|
271 |
+
<h3 className="text-lg font-semibold mb-4">Upload Document</h3>
|
272 |
+
<FileUploader darkMode={darkMode} onClose={() => setShowFileUploader(false)} />
|
273 |
+
</motion.div>
|
274 |
+
</motion.div>
|
275 |
+
)}
|
276 |
+
</AnimatePresence>
|
277 |
+
|
278 |
+
{/* Input Area */}
|
279 |
+
<div className={`border-t ${
|
280 |
+
darkMode ? 'border-gray-700/50 bg-gray-900/95' : 'border-gray-200/50 bg-white/95'
|
281 |
+
} backdrop-blur-sm p-6`}>
|
282 |
+
<div className="max-w-3xl mx-auto">
|
283 |
+
<form onSubmit={handleSubmit} className="relative">
|
284 |
+
{/* Enhanced Input Container */}
|
285 |
+
<div className={`relative overflow-hidden rounded-2xl border-2 transition-all duration-300 ${
|
286 |
+
darkMode
|
287 |
+
? 'bg-gradient-to-br from-gray-800 to-gray-900 border-gray-600 focus-within:border-primary-500 focus-within:from-gray-700 focus-within:to-gray-800'
|
288 |
+
: 'bg-gradient-to-br from-white to-gray-50 border-gray-300 focus-within:border-primary-500 focus-within:from-blue-50 focus-within:to-white'
|
289 |
+
} focus-within:ring-4 focus-within:ring-primary-500/20 shadow-xl hover:shadow-2xl focus-within:shadow-2xl`}>
|
290 |
+
|
291 |
+
{/* Subtle Inner Glow */}
|
292 |
+
<div className={`absolute inset-0 opacity-0 focus-within:opacity-100 transition-opacity duration-300 ${
|
293 |
+
darkMode
|
294 |
+
? 'bg-gradient-to-br from-primary-900/20 to-purple-900/20'
|
295 |
+
: 'bg-gradient-to-br from-primary-50/50 to-purple-50/50'
|
296 |
+
}`} />
|
297 |
+
|
298 |
+
{/* Input Content */}
|
299 |
+
<div className="relative flex items-end space-x-4 p-4">
|
300 |
+
{/* File Upload Button */}
|
301 |
+
<motion.button
|
302 |
+
type="button"
|
303 |
+
whileHover={{ scale: 1.05 }}
|
304 |
+
whileTap={{ scale: 0.95 }}
|
305 |
+
onClick={() => setShowFileUploader(true)}
|
306 |
+
className={`flex-shrink-0 p-3 rounded-xl transition-all duration-200 ${
|
307 |
+
darkMode
|
308 |
+
? 'hover:bg-gray-700/70 text-gray-400 hover:text-primary-400 hover:shadow-lg'
|
309 |
+
: 'hover:bg-gray-100/70 text-gray-500 hover:text-primary-600 hover:shadow-md'
|
310 |
+
} relative group backdrop-blur-sm`}
|
311 |
+
title="Upload document"
|
312 |
+
>
|
313 |
+
<svg className="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
|
314 |
+
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2}
|
315 |
+
d="M15.172 7l-6.586 6.586a2 2 0 102.828 2.828l6.414-6.586a4 4 0 00-5.656-5.656l-6.415 6.585a6 6 0 108.486 8.486L20.5 13" />
|
316 |
+
</svg>
|
317 |
+
|
318 |
+
{/* Enhanced Tooltip */}
|
319 |
+
<div className={`absolute -top-14 left-1/2 transform -translate-x-1/2 px-3 py-2 rounded-lg text-xs whitespace-nowrap opacity-0 group-hover:opacity-100 transition-all duration-200 ${
|
320 |
+
darkMode ? 'bg-gray-800 text-white shadow-xl border border-gray-700' : 'bg-gray-900 text-white shadow-xl'
|
321 |
+
}`}>
|
322 |
+
Upload documents
|
323 |
+
<div className={`absolute top-full left-1/2 transform -translate-x-1/2 w-0 h-0 border-l-4 border-r-4 border-t-4 border-transparent ${
|
324 |
+
darkMode ? 'border-t-gray-800' : 'border-t-gray-900'
|
325 |
+
}`} />
|
326 |
+
</div>
|
327 |
+
</motion.button>
|
328 |
+
|
329 |
+
{/* Enhanced Text Input */}
|
330 |
+
<div className="flex-1 relative">
|
331 |
+
<textarea
|
332 |
+
ref={textareaRef}
|
333 |
+
value={message}
|
334 |
+
onChange={(e) => setMessage(e.target.value)}
|
335 |
+
onKeyDown={handleKeyDown}
|
336 |
+
placeholder={messages.length === 0 ? "Hi! Ask me about accounting, finance, taxation, or upload your study materials..." : "Ask a follow-up question..."}
|
337 |
+
className={`w-full resize-none border-none outline-none bg-transparent py-3 px-2 text-base leading-relaxed ${
|
338 |
+
darkMode ? 'text-white placeholder-gray-400' : 'text-gray-900 placeholder-gray-500'
|
339 |
+
} placeholder:text-sm placeholder:leading-relaxed`}
|
340 |
+
rows={1}
|
341 |
+
disabled={isLoading}
|
342 |
+
style={{
|
343 |
+
minHeight: '24px',
|
344 |
+
maxHeight: '120px',
|
345 |
+
lineHeight: '1.5'
|
346 |
+
}}
|
347 |
+
/>
|
348 |
+
|
349 |
+
{/* Input Focus Indicator */}
|
350 |
+
<div className={`absolute left-0 bottom-0 h-0.5 w-0 bg-gradient-to-r from-primary-500 to-purple-500 transition-all duration-300 ${
|
351 |
+
message.trim() ? 'w-full' : 'group-focus-within:w-full'
|
352 |
+
}`} />
|
353 |
+
</div>
|
354 |
+
|
355 |
+
{/* Enhanced Send Button */}
|
356 |
+
<motion.button
|
357 |
+
type="submit"
|
358 |
+
disabled={!message.trim() || isLoading}
|
359 |
+
whileHover={message.trim() && !isLoading ? { scale: 1.05 } : {}}
|
360 |
+
whileTap={message.trim() && !isLoading ? { scale: 0.95 } : {}}
|
361 |
+
className={`flex-shrink-0 p-3 rounded-xl transition-all duration-200 relative group ${
|
362 |
+
message.trim() && !isLoading
|
363 |
+
? 'bg-gradient-to-r from-primary-600 to-primary-700 hover:from-primary-700 hover:to-primary-800 text-white shadow-lg hover:shadow-xl'
|
364 |
+
: darkMode
|
365 |
+
? 'bg-gray-600/50 text-gray-400 hover:bg-gray-600/70'
|
366 |
+
: 'bg-gray-300/50 text-gray-500 hover:bg-gray-300/70'
|
367 |
+
} disabled:cursor-not-allowed`}
|
368 |
+
title={isLoading ? "Stop generation" : "Send message"}
|
369 |
+
>
|
370 |
+
{isLoading ? (
|
371 |
+
<div className="relative">
|
372 |
+
<StopIcon className="w-5 h-5" />
|
373 |
+
<div className="absolute inset-0 border-2 border-white border-t-transparent rounded-full animate-spin opacity-50"></div>
|
374 |
+
</div>
|
375 |
+
) : (
|
376 |
+
<PaperAirplaneIcon className="w-5 h-5" />
|
377 |
+
)}
|
378 |
+
|
379 |
+
{/* Enhanced Send Button Glow Effect */}
|
380 |
+
{message.trim() && !isLoading && (
|
381 |
+
<div className="absolute inset-0 rounded-xl bg-gradient-to-r from-primary-600 to-primary-700 opacity-0 group-hover:opacity-30 transition-opacity duration-200 blur-lg -z-10"></div>
|
382 |
+
)}
|
383 |
+
</motion.button>
|
384 |
+
</div>
|
385 |
+
|
386 |
+
{/* Bottom Border Accent */}
|
387 |
+
<div className={`absolute bottom-0 left-0 right-0 h-0.5 bg-gradient-to-r from-transparent via-primary-500 to-transparent opacity-0 focus-within:opacity-100 transition-opacity duration-300`} />
|
388 |
+
</div>
|
389 |
+
</form>
|
390 |
+
|
391 |
+
{/* Footer Text */}
|
392 |
+
<motion.p
|
393 |
+
initial={{ opacity: 0 }}
|
394 |
+
animate={{ opacity: 1 }}
|
395 |
+
transition={{ delay: 0.3 }}
|
396 |
+
className={`text-xs text-center mt-3 ${
|
397 |
+
darkMode ? 'text-gray-500' : 'text-gray-400'
|
398 |
+
}`}
|
399 |
+
>
|
400 |
+
⚡ Powered by AI • CA Study Assistant can make mistakes. Consider checking important information.
|
401 |
+
</motion.p>
|
402 |
+
</div>
|
403 |
+
</div>
|
404 |
+
</div>
|
405 |
+
);
|
406 |
+
};
|
407 |
+
|
408 |
+
export default ChatInterface;
|
frontend/src/components/FileUploader.js
ADDED
@@ -0,0 +1,237 @@
1 |
+
import React, { useCallback, useState } from 'react';
|
2 |
+
import { useDropzone } from 'react-dropzone';
|
3 |
+
import { motion, AnimatePresence } from 'framer-motion';
|
4 |
+
import {
|
5 |
+
CloudArrowUpIcon,
|
6 |
+
DocumentIcon,
|
7 |
+
CheckCircleIcon,
|
8 |
+
XCircleIcon,
|
9 |
+
XMarkIcon
|
10 |
+
} from '@heroicons/react/24/outline';
|
11 |
+
import { uploadDocument } from '../services/api';
|
12 |
+
import toast from 'react-hot-toast';
|
13 |
+
|
14 |
+
const FileUploader = ({ darkMode, onClose }) => {
|
15 |
+
const [uploading, setUploading] = useState(false);
|
16 |
+
const [uploadedFiles, setUploadedFiles] = useState([]);
|
17 |
+
|
18 |
+
const onDrop = useCallback(async (acceptedFiles) => {
|
19 |
+
setUploading(true);
|
20 |
+
|
21 |
+
for (const file of acceptedFiles) {
|
22 |
+
try {
|
23 |
+
const formData = new FormData();
|
24 |
+
formData.append('file', file);
|
25 |
+
|
26 |
+
await uploadDocument(formData);
|
27 |
+
|
28 |
+
setUploadedFiles(prev => [...prev, {
|
29 |
+
name: file.name,
|
30 |
+
size: file.size,
|
31 |
+
status: 'success'
|
32 |
+
}]);
|
33 |
+
|
34 |
+
toast.success(`${file.name} uploaded successfully!`);
|
35 |
+
} catch (error) {
|
36 |
+
setUploadedFiles(prev => [...prev, {
|
37 |
+
name: file.name,
|
38 |
+
size: file.size,
|
39 |
+
status: 'error'
|
40 |
+
}]);
|
41 |
+
|
42 |
+
toast.error(`Failed to upload ${file.name}`);
|
43 |
+
}
|
44 |
+
}
|
45 |
+
|
46 |
+
setUploading(false);
|
47 |
+
}, []);
|
48 |
+
|
49 |
+
const { getRootProps, getInputProps, isDragActive } = useDropzone({
|
50 |
+
onDrop,
|
51 |
+
accept: {
|
52 |
+
'application/pdf': ['.pdf'],
|
53 |
+
'application/vnd.openxmlformats-officedocument.wordprocessingml.document': ['.docx'],
|
54 |
+
'text/plain': ['.txt']
|
55 |
+
},
|
56 |
+
multiple: true
|
57 |
+
});
|
58 |
+
|
59 |
+
const formatFileSize = (bytes) => {
|
60 |
+
if (bytes === 0) return '0 Bytes';
|
61 |
+
const k = 1024;
|
62 |
+
const sizes = ['Bytes', 'KB', 'MB', 'GB'];
|
63 |
+
const i = Math.floor(Math.log(bytes) / Math.log(k));
|
64 |
+
return parseFloat((bytes / Math.pow(k, i)).toFixed(2)) + ' ' + sizes[i];
|
65 |
+
};
|
66 |
+
|
67 |
+
const removeFile = (index) => {
|
68 |
+
setUploadedFiles(prev => prev.filter((_, i) => i !== index));
|
69 |
+
};
|
70 |
+
|
71 |
+
return (
|
72 |
+
<div className="space-y-4">
|
73 |
+
{/* Dropzone */}
|
74 |
+
<motion.div
|
75 |
+
{...getRootProps()}
|
76 |
+
whileHover={{ scale: 1.02 }}
|
77 |
+
whileTap={{ scale: 0.98 }}
|
78 |
+
className={`file-drop-zone border-2 border-dashed rounded-2xl p-8 text-center cursor-pointer transition-all ${
|
79 |
+
isDragActive
|
80 |
+
? darkMode
|
81 |
+
? 'border-primary-400 bg-primary-900/20'
|
82 |
+
: 'border-primary-500 bg-primary-50'
|
83 |
+
: darkMode
|
84 |
+
? 'border-gray-600 hover:border-gray-500 bg-gray-800'
|
85 |
+
: 'border-gray-300 hover:border-gray-400 bg-gray-50'
|
86 |
+
}`}
|
87 |
+
>
|
88 |
+
<input {...getInputProps()} />
|
89 |
+
|
90 |
+
<CloudArrowUpIcon className={`w-12 h-12 mx-auto mb-4 ${
|
91 |
+
isDragActive
|
92 |
+
? darkMode ? 'text-primary-400' : 'text-primary-500'
|
93 |
+
: darkMode ? 'text-gray-400' : 'text-gray-500'
|
94 |
+
}`} />
|
95 |
+
|
96 |
+
<h3 className={`text-lg font-semibold mb-2 ${
|
97 |
+
darkMode ? 'text-white' : 'text-gray-900'
|
98 |
+
}`}>
|
99 |
+
{isDragActive ? 'Drop files here' : 'Upload study materials'}
|
100 |
+
</h3>
|
101 |
+
|
102 |
+
<p className={`mb-4 ${
|
103 |
+
              darkMode ? 'text-gray-400' : 'text-gray-600'
            }`}>
              Drag & drop files here, or click to browse
            </p>

            <div className="flex justify-center space-x-2">
              <span className={`px-3 py-1 rounded-full text-xs font-medium ${
                darkMode
                  ? 'bg-blue-900/30 text-blue-400'
                  : 'bg-blue-100 text-blue-700'
              }`}>
                PDF
              </span>
              <span className={`px-3 py-1 rounded-full text-xs font-medium ${
                darkMode
                  ? 'bg-green-900/30 text-green-400'
                  : 'bg-green-100 text-green-700'
              }`}>
                DOCX
              </span>
              <span className={`px-3 py-1 rounded-full text-xs font-medium ${
                darkMode
                  ? 'bg-purple-900/30 text-purple-400'
                  : 'bg-purple-100 text-purple-700'
              }`}>
                TXT
              </span>
            </div>
          </motion.div>

          {/* Upload Progress */}
          {uploading && (
            <motion.div
              initial={{ opacity: 0, y: 20 }}
              animate={{ opacity: 1, y: 0 }}
              className={`p-4 rounded-lg ${
                darkMode ? 'bg-gray-800' : 'bg-gray-100'
              }`}
            >
              <div className="flex items-center space-x-3">
                <div className="animate-spin rounded-full h-5 w-5 border-b-2 border-primary-500"></div>
                <span className={`${darkMode ? 'text-gray-300' : 'text-gray-700'}`}>
                  Uploading files...
                </span>
              </div>
            </motion.div>
          )}

          {/* Uploaded Files List */}
          <AnimatePresence>
            {uploadedFiles.length > 0 && (
              <motion.div
                initial={{ opacity: 0, height: 0 }}
                animate={{ opacity: 1, height: 'auto' }}
                exit={{ opacity: 0, height: 0 }}
                className="space-y-2"
              >
                <h4 className={`font-medium ${
                  darkMode ? 'text-gray-300' : 'text-gray-700'
                }`}>
                  Uploaded Files
                </h4>

                {uploadedFiles.map((file, index) => (
                  <motion.div
                    key={index}
                    initial={{ opacity: 0, x: -20 }}
                    animate={{ opacity: 1, x: 0 }}
                    className={`flex items-center justify-between p-3 rounded-lg ${
                      darkMode ? 'bg-gray-800' : 'bg-gray-100'
                    }`}
                  >
                    <div className="flex items-center space-x-3">
                      <DocumentIcon className={`w-5 h-5 ${
                        darkMode ? 'text-gray-400' : 'text-gray-500'
                      }`} />

                      <div>
                        <p className={`text-sm font-medium ${
                          darkMode ? 'text-white' : 'text-gray-900'
                        }`}>
                          {file.name}
                        </p>
                        <p className={`text-xs ${
                          darkMode ? 'text-gray-500' : 'text-gray-400'
                        }`}>
                          {formatFileSize(file.size)}
                        </p>
                      </div>
                    </div>

                    <div className="flex items-center space-x-2">
                      {file.status === 'success' ? (
                        <CheckCircleIcon className="w-5 h-5 text-green-500" />
                      ) : (
                        <XCircleIcon className="w-5 h-5 text-red-500" />
                      )}

                      <button
                        onClick={() => removeFile(index)}
                        className={`p-1 rounded transition-colors ${
                          darkMode
                            ? 'hover:bg-gray-700 text-gray-400'
                            : 'hover:bg-gray-200 text-gray-500'
                        }`}
                      >
                        <XMarkIcon className="w-4 h-4" />
                      </button>
                    </div>
                  </motion.div>
                ))}
              </motion.div>
            )}
          </AnimatePresence>

          {/* Close Button */}
          {onClose && (
            <div className="flex justify-end">
              <button
                onClick={onClose}
                className={`px-4 py-2 rounded-lg font-medium transition-colors ${
                  darkMode
                    ? 'bg-gray-700 hover:bg-gray-600 text-white'
                    : 'bg-gray-200 hover:bg-gray-300 text-gray-700'
                }`}
              >
                Close
              </button>
            </div>
          )}
        </div>
      );
    };

export default FileUploader;
frontend/src/components/MessageBubble.js
ADDED
@@ -0,0 +1,138 @@
import React from 'react';
import { motion } from 'framer-motion';
import ReactMarkdown from 'react-markdown';
import remarkGfm from 'remark-gfm';
import { Prism as SyntaxHighlighter } from 'react-syntax-highlighter';
import { tomorrow, prism } from 'react-syntax-highlighter/dist/esm/styles/prism';
import { UserIcon, AcademicCapIcon } from '@heroicons/react/24/solid';

const MessageBubble = ({ message, darkMode, isLast }) => {
  const isUser = message.role === 'user';

  const messageVariants = {
    hidden: { opacity: 0, y: 20 },
    visible: {
      opacity: 1,
      y: 0,
      transition: {
        duration: 0.3,
        ease: "easeOut"
      }
    }
  };

  const formatTime = (timestamp) => {
    return new Date(timestamp).toLocaleTimeString([], {
      hour: '2-digit',
      minute: '2-digit'
    });
  };

  return (
    <motion.div
      variants={messageVariants}
      initial="hidden"
      animate="visible"
      className={`flex gap-4 mb-6 ${isUser ? 'justify-end' : 'justify-start'}`}
    >
      {!isUser && (
        <div className={`flex-shrink-0 w-8 h-8 rounded-full flex items-center justify-center ${
          darkMode ? 'bg-primary-600' : 'bg-primary-500'
        }`}>
          <AcademicCapIcon className="w-5 h-5 text-white" />
        </div>
      )}

      <div className={`max-w-[80%] ${isUser ? 'order-first' : ''}`}>
        <div className={`rounded-2xl px-4 py-3 ${
          isUser
            ? darkMode
              ? 'bg-primary-600 text-white'
              : 'bg-primary-500 text-white'
            : darkMode
              ? 'bg-gray-800 border border-gray-700'
              : 'bg-white border border-gray-200 shadow-sm'
        }`}>
          {isUser ? (
            <p className="whitespace-pre-wrap">{message.content}</p>
          ) : (
            <div className="message-content">
              <ReactMarkdown
                remarkPlugins={[remarkGfm]}
                components={{
                  code({ node, inline, className, children, ...props }) {
                    const match = /language-(\w+)/.exec(className || '');
                    return !inline && match ? (
                      <SyntaxHighlighter
                        style={darkMode ? tomorrow : prism}
                        language={match[1]}
                        PreTag="div"
                        {...props}
                      >
                        {String(children).replace(/\n$/, '')}
                      </SyntaxHighlighter>
                    ) : (
                      <code className={className} {...props}>
                        {children}
                      </code>
                    );
                  },
                  p: ({ children }) => <p className="mb-2 last:mb-0">{children}</p>,
                  ul: ({ children }) => <ul className="list-disc list-inside mb-2">{children}</ul>,
                  ol: ({ children }) => <ol className="list-decimal list-inside mb-2">{children}</ol>,
                  li: ({ children }) => <li className="mb-1">{children}</li>,
                  h1: ({ children }) => <h1 className="text-xl font-bold mb-2">{children}</h1>,
                  h2: ({ children }) => <h2 className="text-lg font-semibold mb-2">{children}</h2>,
                  h3: ({ children }) => <h3 className="text-md font-medium mb-2">{children}</h3>,
                  blockquote: ({ children }) => (
                    <blockquote className={`border-l-4 pl-4 italic my-2 ${
                      darkMode ? 'border-gray-600 text-gray-300' : 'border-gray-300 text-gray-600'
                    }`}>
                      {children}
                    </blockquote>
                  ),
                }}
              >
                {message.content}
              </ReactMarkdown>
            </div>
          )}
        </div>

        {/* Timestamp and Sources */}
        <div className={`text-xs mt-2 ${
          darkMode ? 'text-gray-500' : 'text-gray-400'
        } ${isUser ? 'text-right' : 'text-left'}`}>
          <span>{formatTime(message.timestamp)}</span>

          {!isUser && message.sources && message.sources.length > 0 && (
            <div className="mt-2">
              <span className="font-medium">Sources: </span>
              {message.sources.map((source, index) => (
                <span key={index} className={`inline-block mr-2 px-2 py-1 rounded text-xs ${
                  darkMode
                    ? 'bg-gray-700 text-gray-300'
                    : 'bg-gray-100 text-gray-600'
                }`}>
                  {source}
                </span>
              ))}
            </div>
          )}
        </div>
      </div>

      {isUser && (
        <div className={`flex-shrink-0 w-8 h-8 rounded-full flex items-center justify-center ${
          darkMode ? 'bg-gray-700' : 'bg-gray-300'
        }`}>
          <UserIcon className={`w-5 h-5 ${
            darkMode ? 'text-gray-300' : 'text-gray-600'
          }`} />
        </div>
      )}
    </motion.div>
  );
};

export default MessageBubble;
frontend/src/components/Sidebar.js
ADDED
@@ -0,0 +1,216 @@
import React from 'react';
import { motion, AnimatePresence } from 'framer-motion';
import {
  PlusIcon,
  XMarkIcon,
  ChatBubbleLeftIcon,
  TrashIcon,
  HomeIcon
} from '@heroicons/react/24/outline';

const Sidebar = ({
  open,
  onClose,
  conversations,
  activeConversationId,
  onConversationSelect,
  onNewChat,
  onDeleteConversation,
  onBackToHome,
  darkMode
}) => {
  const formatDate = (date) => {
    const now = new Date();
    const messageDate = new Date(date);
    const diffTime = Math.abs(now - messageDate);
    const diffDays = Math.ceil(diffTime / (1000 * 60 * 60 * 24));

    if (diffDays === 1) return 'Today';
    if (diffDays === 2) return 'Yesterday';
    if (diffDays <= 7) return `${diffDays} days ago`;
    return messageDate.toLocaleDateString();
  };

  const sidebarVariants = {
    open: {
      x: 0,
      transition: {
        type: "spring",
        stiffness: 300,
        damping: 30
      }
    },
    closed: {
      x: -280,
      transition: {
        type: "spring",
        stiffness: 300,
        damping: 30
      }
    }
  };

  const overlayVariants = {
    open: { opacity: 1 },
    closed: { opacity: 0 }
  };

  return (
    <>
      {/* Mobile Overlay */}
      <AnimatePresence>
        {open && (
          <motion.div
            variants={overlayVariants}
            initial="closed"
            animate="open"
            exit="closed"
            className="fixed inset-0 bg-black bg-opacity-50 z-40 md:hidden"
            onClick={onClose}
          />
        )}
      </AnimatePresence>

      {/* Sidebar */}
      <motion.aside
        variants={sidebarVariants}
        initial="closed"
        animate={open ? "open" : "closed"}
        className={`fixed left-0 top-16 h-[calc(100vh-4rem)] w-64 z-50 ${
          darkMode
            ? 'bg-gray-900 border-gray-700'
            : 'bg-white border-gray-200'
        } border-r shadow-lg flex flex-col`}
      >
        {/* Header */}
        <div className="p-4 border-b border-gray-200 dark:border-gray-700">
          <div className="flex items-center justify-between mb-4">
            <h2 className="font-semibold text-lg">Conversations</h2>
            <button
              onClick={onClose}
              className={`p-1 rounded-lg transition-colors md:hidden ${
                darkMode
                  ? 'hover:bg-gray-800 text-gray-400'
                  : 'hover:bg-gray-100 text-gray-500'
              }`}
            >
              <XMarkIcon className="w-5 h-5" />
            </button>
          </div>

          <div className="space-y-2">
            <motion.button
              whileHover={{ scale: 1.02 }}
              whileTap={{ scale: 0.98 }}
              onClick={onNewChat}
              className={`w-full flex items-center gap-3 px-3 py-2 rounded-lg transition-colors ${
                darkMode
                  ? 'bg-gray-800 hover:bg-gray-700 text-white border-gray-600'
                  : 'bg-gray-50 hover:bg-gray-100 text-gray-900 border-gray-300'
              } border`}
            >
              <PlusIcon className="w-4 h-4" />
              <span className="font-medium">New Chat</span>
            </motion.button>

            <motion.button
              whileHover={{ scale: 1.02 }}
              whileTap={{ scale: 0.98 }}
              onClick={onBackToHome}
              className={`w-full flex items-center gap-3 px-3 py-2 rounded-lg transition-colors ${
                darkMode
                  ? 'bg-primary-600 hover:bg-primary-700 text-white'
                  : 'bg-primary-50 hover:bg-primary-100 text-primary-900 border-primary-300'
              } border`}
            >
              <HomeIcon className="w-4 h-4" />
              <span className="font-medium">Back to Home</span>
            </motion.button>
          </div>
        </div>

        {/* Conversations List */}
        <div className="flex-1 overflow-y-auto p-2">
          {conversations.length === 0 ? (
            <div className={`text-center py-8 ${
              darkMode ? 'text-gray-500' : 'text-gray-400'
            }`}>
              <ChatBubbleLeftIcon className="w-12 h-12 mx-auto mb-3 opacity-50" />
              <p className="text-sm">No conversations yet</p>
              <p className="text-xs mt-1">Start a new chat to begin</p>
            </div>
          ) : (
            <div className="space-y-1">
              {conversations.map((conversation) => (
                <motion.button
                  key={conversation.id}
                  whileHover={{ x: 4 }}
                  onClick={() => {
                    onConversationSelect(conversation.id);
                    onClose();
                  }}
                  className={`w-full text-left p-3 rounded-lg transition-all group ${
                    activeConversationId === conversation.id
                      ? darkMode
                        ? 'bg-primary-600 text-white'
                        : 'bg-primary-50 text-primary-900 border-primary-200'
                      : darkMode
                        ? 'hover:bg-gray-800 text-gray-300'
                        : 'hover:bg-gray-50 text-gray-700'
                  }`}
                >
                  <div className="flex items-start justify-between">
                    <div className="flex-1 min-w-0">
                      <p className="font-medium truncate text-sm">
                        {conversation.title}
                      </p>
                      <p className={`text-xs mt-1 ${
                        activeConversationId === conversation.id
                          ? 'text-primary-200'
                          : darkMode
                            ? 'text-gray-500'
                            : 'text-gray-500'
                      }`}>
                        {formatDate(conversation.createdAt)}
                      </p>
                    </div>

                    <button
                      onClick={(e) => {
                        e.stopPropagation();
                        if (window.confirm('Are you sure you want to delete this conversation?')) {
                          onDeleteConversation(conversation.id);
                        }
                      }}
                      className={`opacity-0 group-hover:opacity-100 p-1 rounded transition-opacity ${
                        darkMode
                          ? 'hover:bg-gray-700 text-gray-400'
                          : 'hover:bg-gray-200 text-gray-500'
                      }`}
                    >
                      <TrashIcon className="w-4 h-4" />
                    </button>
                  </div>
                </motion.button>
              ))}
            </div>
          )}
        </div>

        {/* Footer */}
        <div className={`p-4 border-t ${
          darkMode ? 'border-gray-700' : 'border-gray-200'
        }`}>
          <div className={`text-xs ${
            darkMode ? 'text-gray-500' : 'text-gray-400'
          }`}>
            <p>CA Study Assistant v2.0</p>
            <p className="mt-1">Powered by AI</p>
          </div>
        </div>
      </motion.aside>
    </>
  );
};

export default Sidebar;
frontend/src/components/TypingIndicator.js
ADDED
@@ -0,0 +1,34 @@
import React from 'react';
import { motion } from 'framer-motion';
import { AcademicCapIcon } from '@heroicons/react/24/solid';

const TypingIndicator = ({ darkMode }) => {
  return (
    <motion.div
      initial={{ opacity: 0, y: 20 }}
      animate={{ opacity: 1, y: 0 }}
      exit={{ opacity: 0, y: -20 }}
      className="flex gap-4 mb-6"
    >
      <div className={`flex-shrink-0 w-8 h-8 rounded-full flex items-center justify-center ${
        darkMode ? 'bg-primary-600' : 'bg-primary-500'
      }`}>
        <AcademicCapIcon className="w-5 h-5 text-white" />
      </div>

      <div className={`rounded-2xl px-4 py-3 ${
        darkMode
          ? 'bg-gray-800 border border-gray-700'
          : 'bg-white border border-gray-200 shadow-sm'
      }`}>
        <div className="typing-indicator">
          <div className={`dot ${darkMode ? 'text-gray-400' : 'text-gray-500'}`}></div>
          <div className={`dot ${darkMode ? 'text-gray-400' : 'text-gray-500'}`}></div>
          <div className={`dot ${darkMode ? 'text-gray-400' : 'text-gray-500'}`}></div>
        </div>
      </div>
    </motion.div>
  );
};

export default TypingIndicator;
frontend/src/components/WelcomeScreen.js
ADDED
@@ -0,0 +1,152 @@
import React from 'react';
import { motion } from 'framer-motion';
import FileUploader from './FileUploader';
import {
  DocumentTextIcon,
  ChatBubbleBottomCenterTextIcon,
  LightBulbIcon,
  AcademicCapIcon
} from '@heroicons/react/24/outline';

const WelcomeScreen = ({ onStartChat, onNewChat, darkMode }) => {
  const suggestions = [
    "What is depreciation in accounting?",
    "Explain the concept of working capital",
    "What are the different types of financial statements?",
    "How do you calculate return on investment?"
  ];

  const features = [
    {
      icon: DocumentTextIcon,
      title: "Upload Documents",
      description: "Upload PDFs, Word docs, and text files for instant analysis"
    },
    {
      icon: ChatBubbleBottomCenterTextIcon,
      title: "Ask Questions",
      description: "Get detailed answers based on your uploaded study materials"
    },
    {
      icon: LightBulbIcon,
      title: "Smart Insights",
      description: "AI-powered explanations tailored for CA exam preparation"
    }
  ];

  return (
    <div className="min-h-screen flex flex-col items-center justify-center p-4">
      <div className="max-w-4xl w-full">
        {/* Hero Section */}
        <motion.div
          initial={{ opacity: 0, y: 20 }}
          animate={{ opacity: 1, y: 0 }}
          transition={{ duration: 0.6 }}
          className="text-center mb-12"
        >
          <div className="mb-6">
            <AcademicCapIcon className={`w-16 h-16 mx-auto mb-4 ${
              darkMode ? 'text-primary-400' : 'text-primary-600'
            }`} />
          </div>

          <h1 className="text-4xl md:text-6xl font-bold mb-6 gradient-text">
            CA Study Assistant
          </h1>

          <p className={`text-xl md:text-2xl mb-8 ${
            darkMode ? 'text-gray-300' : 'text-gray-600'
          }`}>
            Upload your study materials and get instant, intelligent answers
          </p>

          <motion.button
            whileHover={{ scale: 1.05 }}
            whileTap={{ scale: 0.95 }}
            onClick={onNewChat}
            className="btn-primary px-8 py-4 rounded-xl text-lg font-semibold text-white shadow-lg"
          >
            Start New Conversation
          </motion.button>
        </motion.div>

        {/* Features */}
        <motion.div
          initial={{ opacity: 0, y: 20 }}
          animate={{ opacity: 1, y: 0 }}
          transition={{ duration: 0.6, delay: 0.2 }}
          className="grid md:grid-cols-3 gap-6 mb-12"
        >
          {features.map((feature, index) => (
            <motion.div
              key={index}
              whileHover={{ y: -5 }}
              className={`p-6 rounded-2xl ${
                darkMode
                  ? 'bg-gray-800 border-gray-700'
                  : 'bg-white border-gray-200'
              } border shadow-lg`}
            >
              <feature.icon className={`w-8 h-8 mb-4 ${
                darkMode ? 'text-primary-400' : 'text-primary-600'
              }`} />
              <h3 className="text-lg font-semibold mb-2">{feature.title}</h3>
              <p className={`${
                darkMode ? 'text-gray-400' : 'text-gray-600'
              }`}>
                {feature.description}
              </p>
            </motion.div>
          ))}
        </motion.div>

        {/* File Upload Section */}
        <motion.div
          initial={{ opacity: 0, y: 20 }}
          animate={{ opacity: 1, y: 0 }}
          transition={{ duration: 0.6, delay: 0.4 }}
          className="mb-12"
        >
          <h2 className="text-2xl font-semibold text-center mb-6">
            Upload Your Study Materials
          </h2>
          <FileUploader darkMode={darkMode} />
        </motion.div>

        {/* Suggestion Pills */}
        <motion.div
          initial={{ opacity: 0, y: 20 }}
          animate={{ opacity: 1, y: 0 }}
          transition={{ duration: 0.6, delay: 0.6 }}
          className="text-center"
        >
          <h3 className={`text-lg font-semibold mb-4 ${
            darkMode ? 'text-gray-300' : 'text-gray-700'
          }`}>
            Try asking:
          </h3>

          <div className="flex flex-wrap justify-center gap-3">
            {suggestions.map((suggestion, index) => (
              <motion.button
                key={index}
                whileHover={{ scale: 1.05 }}
                whileTap={{ scale: 0.95 }}
                onClick={() => onStartChat(suggestion)}
                className={`px-4 py-2 rounded-full text-sm font-medium transition-colors ${
                  darkMode
                    ? 'bg-gray-800 hover:bg-gray-700 text-gray-300 border-gray-700'
                    : 'bg-gray-100 hover:bg-gray-200 text-gray-700 border-gray-300'
                } border`}
              >
                {suggestion}
              </motion.button>
            ))}
          </div>
        </motion.div>
      </div>
    </div>
  );
};

export default WelcomeScreen;
frontend/src/index.css
ADDED
@@ -0,0 +1,279 @@
@import 'tailwindcss/base';
@import 'tailwindcss/components';
@import 'tailwindcss/utilities';

@import url('https://fonts.googleapis.com/css2?family=Inter:wght@300;400;500;600;700;800&display=swap');

* {
  box-sizing: border-box;
}

body {
  margin: 0;
  font-family: 'Inter', -apple-system, BlinkMacSystemFont, 'Segoe UI', 'Roboto', 'Oxygen',
    'Ubuntu', 'Cantarell', 'Fira Sans', 'Droid Sans', 'Helvetica Neue',
    sans-serif;
  -webkit-font-smoothing: antialiased;
  -moz-osx-font-smoothing: grayscale;
}

/* Custom scrollbar */
::-webkit-scrollbar {
  width: 6px;
}

::-webkit-scrollbar-track {
  @apply bg-gray-100 dark:bg-gray-800;
}

::-webkit-scrollbar-thumb {
  @apply bg-gray-300 dark:bg-gray-600 rounded-full;
}

::-webkit-scrollbar-thumb:hover {
  @apply bg-gray-400 dark:bg-gray-500;
}

/* Loading dots animation */
.typing-indicator {
  display: flex;
  align-items: center;
  gap: 4px;
}

.typing-indicator .dot {
  width: 6px;
  height: 6px;
  border-radius: 50%;
  background-color: currentColor;
  opacity: 0.4;
  animation: typing-dot 1.4s infinite ease-in-out;
}

.typing-indicator .dot:nth-child(1) {
  animation-delay: 0s;
}

.typing-indicator .dot:nth-child(2) {
  animation-delay: 0.2s;
}

.typing-indicator .dot:nth-child(3) {
  animation-delay: 0.4s;
}

@keyframes typing-dot {
  0%, 60%, 100% {
    opacity: 0.4;
    transform: scale(1);
  }
  30% {
    opacity: 1;
    transform: scale(1.2);
  }
}

/* Message bubble animations */
.message-enter {
  opacity: 0;
  transform: translateY(20px);
}

.message-enter-active {
  opacity: 1;
  transform: translateY(0);
  transition: opacity 300ms ease-out, transform 300ms ease-out;
}

/* File upload hover effects */
.file-drop-zone {
  transition: all 0.2s ease-in-out;
}

.file-drop-zone:hover {
  transform: translateY(-2px);
  box-shadow: 0 10px 25px -5px rgba(0, 0, 0, 0.1);
}

/* Gradient text */
.gradient-text {
  background: linear-gradient(135deg, #667eea 0%, #764ba2 100%);
  -webkit-background-clip: text;
  -webkit-text-fill-color: transparent;
  background-clip: text;
}

/* Glass morphism effect */
.glass {
  backdrop-filter: blur(16px) saturate(180%);
  -webkit-backdrop-filter: blur(16px) saturate(180%);
  background-color: rgba(255, 255, 255, 0.8);
  border: 1px solid rgba(255, 255, 255, 0.125);
}

.glass.dark {
  background-color: rgba(17, 24, 39, 0.8);
  border: 1px solid rgba(75, 85, 99, 0.3);
}

/* Button hover effects */
.btn-primary {
  @apply bg-primary-600 hover:bg-primary-700 focus:ring-primary-500;
  transition: all 0.2s ease-in-out;
}

.btn-primary:hover {
  transform: translateY(-1px);
  box-shadow: 0 4px 12px rgba(59, 130, 246, 0.3);
}

/* Code syntax highlighting overrides */
.prose pre {
  @apply bg-gray-900 dark:bg-gray-950;
}

.prose code {
  @apply bg-gray-100 dark:bg-gray-800 text-gray-800 dark:text-gray-200 px-1 py-0.5 rounded text-sm;
}

/* Message content styling */
.message-content {
  @apply prose prose-sm max-w-none dark:prose-invert;
}

.message-content p:last-child {
  margin-bottom: 0;
}

.message-content ul, .message-content ol {
  @apply my-2;
}

.message-content li {
  @apply my-1;
}

/* Enhanced Chat Input Styles */
.chat-input-container {
  background: linear-gradient(135deg, rgba(255, 255, 255, 0.1) 0%, rgba(255, 255, 255, 0.05) 100%);
  backdrop-filter: blur(10px);
  border-radius: 16px;
  transition: all 0.3s cubic-bezier(0.4, 0, 0.2, 1);
}

.chat-input-container:focus-within {
  transform: translateY(-1px);
  box-shadow: 0 20px 40px -12px rgba(0, 0, 0, 0.25);
}

.chat-input {
  font-family: 'Inter', system-ui, -apple-system, sans-serif;
  font-weight: 400;
  line-height: 1.6;
  letter-spacing: 0.01em;
}

.chat-input::placeholder {
  font-weight: 400;
  opacity: 0.6;
  transition: opacity 0.3s ease;
}

.chat-input:focus::placeholder {
  opacity: 0.4;
}

/* Enhanced scrollbar for textarea */
.chat-input::-webkit-scrollbar {
  width: 4px;
}

.chat-input::-webkit-scrollbar-track {
  background: transparent;
}

.chat-input::-webkit-scrollbar-thumb {
  background: rgba(156, 163, 175, 0.5);
  border-radius: 2px;
}

.chat-input::-webkit-scrollbar-thumb:hover {
  background: rgba(156, 163, 175, 0.8);
}

/* Input focus indicator animation */
.input-focus-indicator {
  transform-origin: left;
  animation: expand 0.3s ease-out;
}

@keyframes expand {
  from {
    transform: scaleX(0);
  }
  to {
    transform: scaleX(1);
  }
}

/* Enhanced button glow effect */
.btn-glow {
  position: relative;
  overflow: visible;
}

.btn-glow::before {
  content: '';
  position: absolute;
  inset: -2px;
  border-radius: inherit;
  background: linear-gradient(45deg, transparent, rgba(59, 130, 246, 0.3), transparent);
  z-index: -1;
  opacity: 0;
  transition: opacity 0.3s ease;
}

.btn-glow:hover::before {
  opacity: 1;
}

/* Smooth transitions for dark mode */
* {
  transition: background-color 0.3s ease, border-color 0.3s ease, color 0.3s ease;
}

/* Enhanced tooltip styling */
.tooltip {
  animation: tooltip-appear 0.2s ease-out;
}

@keyframes tooltip-appear {
  from {
    opacity: 0;
    transform: translateX(-50%) translateY(4px);
  }
  to {
    opacity: 1;
    transform: translateX(-50%) translateY(0);
  }
}

/* Floating effect for input container */
.floating-input {
  box-shadow:
    0 10px 25px -5px rgba(0, 0, 0, 0.1),
    0 4px 6px -2px rgba(0, 0, 0, 0.05);
  transition: all 0.3s cubic-bezier(0.4, 0, 0.2, 1);
}

.floating-input:hover {
  box-shadow:
    0 20px 40px -12px rgba(0, 0, 0, 0.15),
    0 8px 16px -4px rgba(0, 0, 0, 0.1);
}

.floating-input:focus-within {
  box-shadow:
    0 25px 50px -12px rgba(59, 130, 246, 0.25),
    0 10px 20px -5px rgba(0, 0, 0, 0.1);
}
frontend/src/index.js
ADDED
@@ -0,0 +1,11 @@
import React from 'react';
import ReactDOM from 'react-dom/client';
import './index.css';
import App from './App';

const root = ReactDOM.createRoot(document.getElementById('root'));
root.render(
  <React.StrictMode>
    <App />
  </React.StrictMode>
);
frontend/src/services/api.js
ADDED
@@ -0,0 +1,203 @@
import axios from 'axios';

// Base URL for the API (will use proxy from package.json)
const API_BASE_URL = process.env.REACT_APP_API_URL || '/api';

// Create axios instance with default config
const api = axios.create({
  baseURL: API_BASE_URL,
  headers: {
    'Content-Type': 'application/json',
  },
});

// Request interceptor
api.interceptors.request.use(
  (config) => {
    // Add auth token if available
    const token = localStorage.getItem('auth_token');
    if (token) {
      config.headers.Authorization = `Bearer ${token}`;
    }
    return config;
  },
  (error) => {
    return Promise.reject(error);
  }
);

// Response interceptor
api.interceptors.response.use(
  (response) => {
    return response.data;
  },
  (error) => {
    // Handle common errors
    if (error.response?.status === 401) {
      // Handle unauthorized access
      localStorage.removeItem('auth_token');
      // Redirect to login if needed
    }

    const errorMessage = error.response?.data?.message || error.message || 'An error occurred';
    return Promise.reject(new Error(errorMessage));
  }
);

// API Functions

/**
 * Send a message/question to the RAG system
 */
export const sendMessage = async (message) => {
  try {
    const response = await api.post('/ask', {
      question: message,
    });

    return response;
  } catch (error) {
    console.error('Error sending message:', error);
    throw error;
  }
};

/**
 * Send a message/question to the RAG system and get a streaming response
 */
export const sendMessageStream = async (message, onChunk) => {
  try {
    const response = await fetch(`${API_BASE_URL}/ask_stream`, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({ question: message }),
    });

    if (!response.ok) {
      throw new Error(`HTTP error! status: ${response.status}`);
    }

    if (!response.body) {
      throw new Error("ReadableStream not yet supported in this browser.");
    }

    const reader = response.body.getReader();
    const decoder = new TextDecoder();

    try {
      while (true) {
        const { done, value } = await reader.read();
        if (done) {
          break;
        }
        const chunk = decoder.decode(value, { stream: true });
        if (chunk) {
          onChunk(chunk);
        }
      }
    } finally {
      reader.releaseLock();
    }
  } catch (error) {
    console.error('Error sending streaming message:', error);
    throw error;
  }
};

/**
 * Upload a document to the system
 */
export const uploadDocument = async (formData) => {
  try {
    const response = await api.post('/upload', formData, {
      headers: {
        'Content-Type': 'multipart/form-data',
      },
      // Upload progress callback
      onUploadProgress: (progressEvent) => {
        const percentCompleted = Math.round(
          (progressEvent.loaded * 100) / progressEvent.total
        );
        console.log(`Upload progress: ${percentCompleted}%`);
      },
    });

    return response;
  } catch (error) {
    console.error('Error uploading document:', error);
    throw error;
  }
};

/**
 * Get system status and information
 */
export const getSystemStatus = async () => {
  try {
    const response = await api.get('/status');
    return response;
  } catch (error) {
    console.error('Error getting system status:', error);
    throw error;
  }
};

/**
 * Get collection information
 */
export const getCollectionInfo = async () => {
  try {
    const response = await api.get('/collection/info');
    return response;
  } catch (error) {
    console.error('Error getting collection info:', error);
    throw error;
  }
};

/**
 * Delete a document from the collection
 */
export const deleteDocument = async (documentId) => {
  try {
    const response = await api.delete(`/documents/${documentId}`);
    return response;
  } catch (error) {
    console.error('Error deleting document:', error);
    throw error;
  }
};

/**
 * Search documents
 */
export const searchDocuments = async (query, limit = 5) => {
  try {
    const response = await api.post('/search', {
      query,
      limit,
    });

    return response;
  } catch (error) {
    console.error('Error searching documents:', error);
    throw error;
  }
};

/**
 * Health check endpoint
 */
export const healthCheck = async () => {
  try {
    const response = await api.get('/health');
    return response;
  } catch (error) {
    console.error('Error checking health:', error);
    throw error;
  }
};

export default api;
frontend/src/utils/conversationStorage.js
ADDED
@@ -0,0 +1,227 @@
/**
 * Conversation Storage Utility
 * Handles conversation persistence with localStorage
 */

const STORAGE_KEY = 'ca_study_conversations';
const MAX_CONVERSATIONS = 50; // Limit to prevent localStorage overflow

export class ConversationStorage {

  /**
   * Load all conversations from localStorage
   */
  static loadConversations() {
    try {
      const stored = localStorage.getItem(STORAGE_KEY);
      if (!stored) return [];

      const conversations = JSON.parse(stored);

      // Convert date strings back to Date objects
      return conversations.map(conv => ({
        ...conv,
        createdAt: new Date(conv.createdAt),
        messages: conv.messages.map(msg => ({
          ...msg,
          timestamp: new Date(msg.timestamp)
        }))
      }));
    } catch (error) {
      console.error('Error loading conversations:', error);
      return [];
    }
  }

  /**
   * Save conversations to localStorage
   */
  static saveConversations(conversations) {
    try {
      // Limit the number of conversations to prevent localStorage overflow
      const limitedConversations = conversations.slice(0, MAX_CONVERSATIONS);
      localStorage.setItem(STORAGE_KEY, JSON.stringify(limitedConversations));
      return true;
    } catch (error) {
      console.error('Error saving conversations:', error);
      // Handle localStorage quota exceeded
      if (error.name === 'QuotaExceededError') {
        // Try to save with fewer conversations
        const reducedConversations = conversations.slice(0, 25);
        try {
          localStorage.setItem(STORAGE_KEY, JSON.stringify(reducedConversations));
          return true;
        } catch (retryError) {
          console.error('Error saving reduced conversations:', retryError);
        }
      }
      return false;
    }
  }

  /**
   * Add a new conversation
   */
  static addConversation(conversation) {
    const conversations = this.loadConversations();
    const newConversations = [conversation, ...conversations];
    return this.saveConversations(newConversations);
  }

  /**
   * Update an existing conversation
   */
  static updateConversation(conversationId, updates) {
    const conversations = this.loadConversations();
    const updatedConversations = conversations.map(conv =>
      conv.id === conversationId ? { ...conv, ...updates } : conv
    );
    return this.saveConversations(updatedConversations);
  }

  /**
   * Delete a conversation
   */
  static deleteConversation(conversationId) {
    const conversations = this.loadConversations();
    const filteredConversations = conversations.filter(conv => conv.id !== conversationId);
    return this.saveConversations(filteredConversations);
  }

  /**
   * Add a message to a conversation
   */
  static addMessage(conversationId, message) {
    const conversations = this.loadConversations();
    const updatedConversations = conversations.map(conv => {
      if (conv.id === conversationId) {
        return {
          ...conv,
          messages: [...conv.messages, message],
          updatedAt: new Date()
        };
      }
      return conv;
    });
    return this.saveConversations(updatedConversations);
  }

  /**
   * Search conversations by title or content
   */
  static searchConversations(query) {
    const conversations = this.loadConversations();
    const lowercaseQuery = query.toLowerCase();

    return conversations.filter(conv =>
      conv.title.toLowerCase().includes(lowercaseQuery) ||
      conv.messages.some(msg =>
        msg.content.toLowerCase().includes(lowercaseQuery)
      )
    );
  }

  /**
   * Get conversation statistics
   */
  static getStatistics() {
    const conversations = this.loadConversations();
    const totalMessages = conversations.reduce((sum, conv) => sum + conv.messages.length, 0);

    return {
      totalConversations: conversations.length,
      totalMessages,
      storageSize: this.getStorageSize(),
      oldestConversation: conversations.length > 0 ?
        conversations[conversations.length - 1].createdAt : null,
      newestConversation: conversations.length > 0 ?
        conversations[0].createdAt : null
    };
  }

  /**
   * Get storage size in KB
   */
  static getStorageSize() {
    try {
      const stored = localStorage.getItem(STORAGE_KEY);
      return stored ? Math.round(new Blob([stored]).size / 1024) : 0;
    } catch (error) {
      return 0;
    }
  }

  /**
   * Export conversations as JSON
   */
  static exportConversations() {
    const conversations = this.loadConversations();
    const exportData = {
      exportDate: new Date().toISOString(),
      version: '1.0',
      conversations
    };

    const blob = new Blob([JSON.stringify(exportData, null, 2)], {
      type: 'application/json'
    });

    const url = URL.createObjectURL(blob);
    const a = document.createElement('a');
    a.href = url;
    a.download = `ca_study_conversations_${new Date().toISOString().split('T')[0]}.json`;
    document.body.appendChild(a);
    a.click();
    document.body.removeChild(a);
    URL.revokeObjectURL(url);
  }

  /**
   * Import conversations from JSON file
   */
  static importConversations(file) {
    return new Promise((resolve, reject) => {
      const reader = new FileReader();
      reader.onload = (e) => {
        try {
          const importData = JSON.parse(e.target.result);

          if (!importData.conversations || !Array.isArray(importData.conversations)) {
            reject(new Error('Invalid conversation file format'));
            return;
          }

          const existingConversations = this.loadConversations();
          const mergedConversations = [...importData.conversations, ...existingConversations];

          // Remove duplicates based on ID
          const uniqueConversations = mergedConversations.filter((conv, index, self) =>
            index === self.findIndex(c => c.id === conv.id)
          );

          const success = this.saveConversations(uniqueConversations);
          resolve({ success, count: importData.conversations.length });
        } catch (error) {
          reject(error);
        }
      };
      reader.onerror = () => reject(new Error('Error reading file'));
      reader.readAsText(file);
    });
  }

  /**
   * Clear all conversations
   */
  static clearAllConversations() {
    try {
      localStorage.removeItem(STORAGE_KEY);
      return true;
    } catch (error) {
      console.error('Error clearing conversations:', error);
      return false;
    }
  }
}

export default ConversationStorage;
frontend/tailwind.config.js
ADDED
@@ -0,0 +1,78 @@
/** @type {import('tailwindcss').Config} */
module.exports = {
  content: [
    "./src/**/*.{js,jsx,ts,tsx}",
  ],
  darkMode: 'class',
  theme: {
    extend: {
      colors: {
        primary: {
          50: '#f0f9ff',
          100: '#e0f2fe',
          200: '#bae6fd',
          300: '#7dd3fc',
          400: '#38bdf8',
          500: '#0ea5e9',
          600: '#0284c7',
          700: '#0369a1',
          800: '#075985',
          900: '#0c4a6e',
          950: '#082f49',
        },
        gray: {
          50: '#f9fafb',
          100: '#f3f4f6',
          200: '#e5e7eb',
          300: '#d1d5db',
          400: '#9ca3af',
          500: '#6b7280',
          600: '#4b5563',
          700: '#374151',
          800: '#1f2937',
          900: '#111827',
          950: '#030712',
        },
        success: {
          50: '#f0fdf4',
          500: '#22c55e',
          600: '#16a34a',
        },
        error: {
          50: '#fef2f2',
          500: '#ef4444',
          600: '#dc2626',
        }
      },
      fontFamily: {
        sans: ['Inter', 'system-ui', 'sans-serif'],
      },
      animation: {
        'fade-in': 'fadeIn 0.5s ease-in-out',
        'slide-up': 'slideUp 0.3s ease-out',
        'pulse-slow': 'pulse 3s cubic-bezier(0.4, 0, 0.6, 1) infinite',
        'typing': 'typing 1.5s steps(20, end) infinite',
      },
      keyframes: {
        fadeIn: {
          '0%': { opacity: '0' },
          '100%': { opacity: '1' },
        },
        slideUp: {
          '0%': { transform: 'translateY(10px)', opacity: '0' },
          '100%': { transform: 'translateY(0)', opacity: '1' },
        },
        typing: {
          '0%, 50%': { opacity: '1' },
          '51%, 100%': { opacity: '0' },
        }
      },
      backdropBlur: {
        xs: '2px',
      }
    },
  },
  plugins: [
    require('@tailwindcss/typography'),
  ],
}