Zelyanoth committed
Commit 1d6d1e6 · 1 Parent(s): 0e3b2a1

Implement immediate Celery Beat schedule updates


Add immediate Celery Beat schedule updates on schedule creation and deletion
Update Celery Beat configuration to use full task path
Refactor Celery task imports and configurations
Add logging for Celery tasks
Update frontend components to reflect changes

CELERY_SCHEDULING_SETUP.md ADDED
@@ -0,0 +1,241 @@
# Celery Scheduling Setup Guide

This guide explains how to set up and use the Celery scheduling system with your Lin application.

## Overview

The updated `start_app.py` now automatically starts both the Flask application and the Celery components (worker and beat scheduler) when you run the application. This ensures that your scheduled tasks execute properly.

## Prerequisites

### 1. Redis Server
Celery requires Redis as a message broker. Make sure Redis is installed and running:

**Windows:**
```bash
# Install Redis (if not installed)
# Download from https://github.com/microsoftarchive/redis/releases

# Start Redis server
redis-server
```

**Linux:**
```bash
# Install Redis
sudo apt-get install redis-server   # Ubuntu/Debian

# Start Redis
sudo systemctl start redis
sudo systemctl enable redis
```

**macOS:**
```bash
# Install Redis
brew install redis

# Start Redis (managed by Homebrew services)
brew services start redis
```

### 2. Python Dependencies
Install the required packages:
```bash
pip install -r backend/requirements.txt
```

## Starting the Application

### Using start_app.py (Recommended)
```bash
python start_app.py
```

This will:
1. Check the Redis connection
2. Start the Celery worker in the background
3. Start the Celery beat scheduler in the background
4. Start the Flask application

### Using Backend Scripts (Alternative)
```bash
# Start both worker and beat
cd backend
python start_celery.py all

# Or start individually
python start_celery.py worker   # Start Celery worker
python start_celery.py beat     # Start Celery Beat scheduler
```

## Configuration

### Environment Variables
Make sure these are set in your `.env` file:

```env
# Supabase configuration
SUPABASE_URL="your_supabase_url"
SUPABASE_KEY="your_supabase_key"

# Redis configuration (if not using the defaults)
CELERY_BROKER_URL="redis://localhost:6379/0"
CELERY_RESULT_BACKEND="redis://localhost:6379/0"

# Scheduler configuration
SCHEDULER_ENABLED=True
```

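Note that everything in `.env` arrives in the process as a string, so a flag like `SCHEDULER_ENABLED=True` needs explicit parsing. A minimal sketch of how these settings might be read (the parsing shown here is illustrative, not the application's actual config code):

```python
import os

# Same defaults as shown in the .env example above; os.environ only ever
# returns strings, never booleans.
broker_url = os.environ.get("CELERY_BROKER_URL", "redis://localhost:6379/0")
result_backend = os.environ.get("CELERY_RESULT_BACKEND", "redis://localhost:6379/0")

# Treat "True"/"true"/"1" as enabled; anything else as disabled.
scheduler_enabled = os.environ.get("SCHEDULER_ENABLED", "True").strip().lower() in ("true", "1")

print(broker_url, scheduler_enabled)
```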
### Celery Configuration
The unified configuration is in `backend/celery_config.py`:

```python
celery_app.conf.beat_schedule = {
    'load-schedules': {
        'task': 'backend.celery_tasks.schedule_loader.load_schedules_task',
        'schedule': crontab(minute='*/5'),  # Every 5 minutes
    },
}
```

## How Scheduling Works

### 1. Schedule Loading
- **Immediate updates**: when you create or delete a schedule via the API, Celery Beat is updated immediately.
- **Periodic updates**: Celery Beat also runs `load_schedules_task` every 5 minutes as a backup. The task fetches schedules from the Supabase database and creates an individual periodic task for each schedule.

### 2. Task Execution
- **Content generation**: runs 5 minutes before the scheduled time.
- **Post publishing**: runs at the scheduled time.
- Tasks are routed to the appropriate queues (`content`, `publish`).

### 3. Database Integration
- Uses Supabase for schedule storage.
- Automatically creates tasks based on schedule data.
- Handles social network authentication.

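The 5-minute lead for content generation can be derived directly from the stored `HH:MM` schedule time. A minimal sketch (the helper name and the fixed 5-minute offset are illustrative; the application's actual scheduler code may differ):

```python
from datetime import datetime, timedelta

def content_generation_time(schedule_time: str, lead_minutes: int = 5) -> str:
    """Compute when content generation should fire for a post scheduled at HH:MM."""
    t = datetime.strptime(schedule_time, "%H:%M")
    return (t - timedelta(minutes=lead_minutes)).strftime("%H:%M")

print(content_generation_time("09:00"))  # 08:55
```

Times near midnight wrap around, so a `00:02` post generates content at `23:57` the previous day.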
## Monitoring and Debugging

### Checking Celery Status
```bash
# Check worker status
celery -A celery_config inspect stats

# Check scheduled tasks
celery -A celery_config inspect scheduled

# Check active tasks
celery -A celery_config inspect active
```

### Viewing Logs
- **Flask application**: check the console output.
- **Celery worker**: look for the worker process logs.
- **Celery Beat**: look for the beat process logs.

### Common Issues

**1. Redis connection failed**
Solution: start the Redis server first.

**2. Tasks not executing**
```bash
# Check if the Celery worker is running
celery -A celery_config inspect ping

# Check if the beat scheduler is running
ps aux | grep celery
```

**3. Schedules not loading**
- Check the Supabase database connection.
- Verify the schedule data in the database.
- Check task registration in Celery.

## Testing the Scheduling System

### Manual Testing
```python
# Test the schedule loading task
from backend.celery_tasks.schedule_loader import load_schedules_task
result = load_schedules_task()
print(result)
```

### API Testing (Recommended)
1. **Create a schedule via the API**:
   ```bash
   curl -X POST http://localhost:5000/api/schedules/ \
     -H "Content-Type: application/json" \
     -H "Authorization: Bearer YOUR_JWT_TOKEN" \
     -d '{
       "social_network": "1",
       "schedule_time": "09:00",
       "days": ["Monday", "Wednesday", "Friday"]
     }'
   ```
2. **Check the response**: you should see a `celery_update_task_id` field indicating the scheduler was updated immediately.
3. **Verify in Celery**: check that the individual tasks were created:
   ```bash
   celery -A celery_config inspect scheduled
   ```

### Database Testing
1. Add a schedule directly in the Supabase database
2. Wait 5 minutes for the loader task to run (or trigger it via the API)
3. Check that individual tasks were created
4. Verify the task execution times

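The payload above (an `HH:MM` time plus weekday names) has to be translated into crontab fields before a periodic task can be registered, presumably by the loader task. A minimal sketch of that mapping (the helper name is illustrative; the day numbering follows `celery.schedules.crontab`, where 0 is Sunday):

```python
# Day numbering as used by celery.schedules.crontab: 0 = Sunday ... 6 = Saturday.
DAY_NUMBERS = {
    "Sunday": 0, "Monday": 1, "Tuesday": 2, "Wednesday": 3,
    "Thursday": 4, "Friday": 5, "Saturday": 6,
}

def to_crontab_fields(schedule_time: str, days: list) -> dict:
    """Translate an HH:MM time plus weekday names into crontab keyword arguments."""
    hour, minute = schedule_time.split(":")
    return {
        "minute": int(minute),
        "hour": int(hour),
        "day_of_week": ",".join(str(DAY_NUMBERS[d]) for d in days),
    }

print(to_crontab_fields("09:00", ["Monday", "Wednesday", "Friday"]))
# {'minute': 0, 'hour': 9, 'day_of_week': '1,3,5'}
```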
## Production Deployment

### Using Docker
```bash
# Build and start all services
docker-compose up -d

# Check logs
docker-compose logs -f
```

### Using Supervisor (Linux)
Create `/etc/supervisor/conf.d/lin.conf`:
```ini
[program:lin_worker]
command=python start_app.py
directory=/path/to/lin
autostart=true
autorestart=true
user=www-data
environment=PATH="/path/to/venv/bin"

[program:lin_beat]
command=python -m celery -A celery_config beat
directory=/path/to/lin
autostart=true
autorestart=true
user=www-data
environment=PATH="/path/to/venv/bin"
```

## Troubleshooting Checklist

1. ✅ Redis server is running
2. ✅ All Python dependencies are installed
3. ✅ Environment variables are set correctly
4. ✅ Supabase database connection works
5. ✅ Celery worker is running
6. ✅ Celery beat scheduler is running
7. ✅ Schedule data exists in the database
8. ✅ Tasks are properly registered
9. ✅ Task execution permissions are correct

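The first checklist item can be verified programmatically even without the `redis` package installed. A minimal sketch using only the standard library (a plain TCP connect test, which confirms something is listening on the port but not that it is actually Redis):

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds (e.g. Redis on 6379)."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Checklist item 1: is anything listening on the default Redis port?
print("Redis port open:", port_open("localhost", 6379))
```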
## Support

If you encounter issues:
1. Check this guide first
2. Review the logs for error messages
3. Verify all prerequisites are met
4. Test components individually
5. Check the Celery documentation

For additional help, refer to the Celery documentation at https://docs.celeryq.dev/
backend/api/schedules.py CHANGED
@@ -1,6 +1,7 @@
  from flask import Blueprint, request, jsonify, current_app
  from flask_jwt_extended import jwt_required, get_jwt_identity
  from backend.services.schedule_service import ScheduleService
+ from backend.celery_tasks.schedule_loader import load_schedules_task

  schedules_bp = Blueprint('schedules', __name__)

@@ -131,6 +132,21 @@ def create_schedule():
      result = schedule_service.create_schedule(user_id, social_network, schedule_time, days)

      if result['success']:
+         # Trigger immediate Celery Beat schedule update
+         try:
+             print("[INFO] Triggering immediate Celery Beat schedule update...")
+             # Execute the schedule loader task immediately to update Celery Beat
+             celery_result = load_schedules_task.delay()
+             print(f"[INFO] Celery Beat update task queued: {celery_result.id}")
+
+             # Add the task ID to the response for tracking
+             result['celery_update_task_id'] = celery_result.id
+             result['message'] += ' (Scheduler updated immediately)'
+         except Exception as e:
+             print(f"[WARNING] Failed to trigger immediate Celery update: {str(e)}")
+             # Don't fail the schedule creation if Celery update fails
+             result['message'] += ' (Note: Scheduler update will occur in 5 minutes)'
+
          # Add CORS headers to success response
          response_data = jsonify(result)
          response_data.headers.add('Access-Control-Allow-Origin', 'http://localhost:3000')

@@ -209,6 +225,21 @@ def delete_schedule(schedule_id):
      result = schedule_service.delete_schedule(schedule_id)

      if result['success']:
+         # Trigger immediate Celery Beat schedule update
+         try:
+             print("[INFO] Triggering immediate Celery Beat schedule update after deletion...")
+             # Execute the schedule loader task immediately to update Celery Beat
+             celery_result = load_schedules_task.delay()
+             print(f"[INFO] Celery Beat update task queued: {celery_result.id}")
+
+             # Add the task ID to the response for tracking
+             result['celery_update_task_id'] = celery_result.id
+             result['message'] += ' (Scheduler updated immediately)'
+         except Exception as e:
+             print(f"[WARNING] Failed to trigger immediate Celery update: {str(e)}")
+             # Don't fail the schedule deletion if Celery update fails
+             result['message'] += ' (Note: Scheduler update will occur in 5 minutes)'
+
          # Add CORS headers to success response
          response_data = jsonify(result)
          response_data.headers.add('Access-Control-Allow-Origin', 'http://localhost:3000')
backend/celery_beat_config.py CHANGED
@@ -2,6 +2,9 @@ from celery import Celery
  from celery.schedules import crontab
  import os

+ # Import the task function
+ from backend.celery_tasks.schedule_loader import load_schedules_task
+
  # Create Celery instance for Beat scheduler
  celery_beat = Celery('lin_scheduler')

@@ -13,7 +16,7 @@ celery_beat.conf.result_backend = os.environ.get('CELERY_RESULT_BACKEND', 'redis
  celery_beat.conf.beat_schedule = {
      # This task will run every 5 minutes to load schedules from the database
      'load-schedules': {
-         'task': 'load_schedules_task',
+         'task': 'backend.celery_tasks.schedule_loader.load_schedules_task',
          'schedule': crontab(minute='*/5'),
      },
  }
backend/celery_config.py ADDED
@@ -0,0 +1,75 @@
"""
Unified Celery configuration for the Lin application.
This centralizes all Celery configuration to avoid conflicts.
"""

import os
from celery import Celery
from celery.schedules import crontab
from backend.config import Config

# Create Celery instance
celery_app = Celery('lin_app')

# Configure Celery with broker and result backend
celery_app.conf.broker_url = os.environ.get('CELERY_BROKER_URL', 'redis://localhost:6379/0')
celery_app.conf.result_backend = os.environ.get('CELERY_RESULT_BACKEND', 'redis://localhost:6379/0')

# Additional Celery configuration
celery_app.conf.update(
    # Task serialization
    task_serializer='json',
    accept_content=['json'],
    result_serializer='json',
    timezone='UTC',
    enable_utc=True,

    # Task routing
    task_routes={
        'backend.celery_tasks.content_tasks.generate_content_task': {'queue': 'content'},
        'backend.celery_tasks.content_tasks.publish_post_task': {'queue': 'publish'},
        'backend.celery_tasks.schedule_loader.load_schedules_task': {'queue': 'scheduler'},
    },

    # Worker configuration
    worker_prefetch_multiplier=1,
    task_acks_late=True,
    worker_max_tasks_per_child=100,

    # Beat scheduler configuration
    beat_scheduler='django_celery_beat.schedulers:DatabaseScheduler',
    beat_schedule={
        # This task will run every 5 minutes to load schedules from the database
        'load-schedules': {
            'task': 'backend.celery_tasks.schedule_loader.load_schedules_task',
            'schedule': crontab(minute='*/5'),
        },
    },

    # Task result expiration
    result_expires=3600,  # 1 hour

    # Task time limits
    task_soft_time_limit=300,  # 5 minutes
    task_time_limit=600,  # 10 minutes

    # Rate limiting
    task_annotations=(
        ('backend.celery_tasks.content_tasks.generate_content_task', {'rate_limit': '10/h'}),
        ('backend.celery_tasks.content_tasks.publish_post_task', {'rate_limit': '30/h'}),
    ),

    # Error handling
    task_reject_on_worker_lost=True,
    worker_disable_rate_limits=False,

    # Security
    result_backend_transport_options={'visibility': 'hidden'},
    broker_connection_max_retries=3,
    broker_connection_retry_delay=5,
)

# Import tasks to ensure they're registered
from backend import celery_tasks

__all__ = ['celery_app']
backend/celery_tasks/content_tasks.py CHANGED
@@ -1,12 +1,14 @@
- from celery import Celery, current_task
+ from celery import current_task
  from backend.services.content_service import ContentService
  from backend.services.linkedin_service import LinkedInService
  from backend.utils.database import init_supabase
+ from backend.celery_config import celery_app

  # Configure logging
+ import logging
  logger = logging.getLogger(__name__)

- @celery.task(bind=True)
+ @celery_app.task(bind=True)
  def generate_content_task(self, user_id: str, schedule_id: str, supabase_client_config: dict):
      """
      Celery task to generate content for a scheduled post.
backend/celery_tasks/schedule_loader.py CHANGED
@@ -3,9 +3,9 @@ from celery.schedules import crontab
  from datetime import datetime
  import logging
  from backend.utils.database import init_supabase
- # Use relative import for the Config class to work with Hugging Face Spaces
  from backend.config import Config
  from backend.celery_tasks.scheduler import schedule_content_generation, schedule_post_publishing
+ from backend.celery_config import celery_app

  # Configure logging
  logger = logging.getLogger(__name__)
backend/start_celery.bat CHANGED
@@ -11,14 +11,14 @@ if not exist "app.py" (
  REM Function to start Celery worker
  :start_worker
  echo Starting Celery worker...
- start "Celery Worker" cmd /k "celery -A celery_app worker --loglevel=info"
+ start "Celery Worker" cmd /k "python start_celery.py worker"
  echo Celery worker started
  goto :eof

  REM Function to start Celery Beat scheduler
  :start_beat
  echo Starting Celery Beat scheduler...
- start "Celery Beat" cmd /k "celery -A celery_beat_config beat --loglevel=info"
+ start "Celery Beat" cmd /k "python start_celery.py beat"
  echo Celery Beat scheduler started
  goto :eof

@@ -30,11 +30,15 @@ if "%1"=="worker" (
  ) else if "%1"=="all" (
      call :start_worker
      call :start_beat
+ ) else if "%1"=="check" (
+     echo Checking system requirements...
+     python start_celery.py check
  ) else (
-     echo Usage: %0 {worker^|beat^|all}
+     echo Usage: %0 {worker^|beat^|all^|check}
      echo   worker - Start Celery worker
      echo   beat   - Start Celery Beat scheduler
      echo   all    - Start both worker and scheduler
+     echo   check  - Check system requirements
      pause
      exit /b 1
  )
backend/start_celery.py ADDED
@@ -0,0 +1,116 @@
#!/usr/bin/env python3
"""
Script to start Celery components for the Lin application.
This script provides a unified way to start Celery worker and beat scheduler.
"""

import os
import sys
import subprocess
import platform
from pathlib import Path

# Add the backend directory to Python path
backend_dir = Path(__file__).parent
sys.path.insert(0, str(backend_dir))

def check_redis():
    """Check if Redis is running."""
    try:
        import redis
        client = redis.Redis(host='localhost', port=6379, db=0)
        client.ping()
        print("✓ Redis connection successful")
        return True
    except Exception as e:
        print(f"✗ Redis connection failed: {e}")
        print("Please start Redis server first:")
        print("  Windows: redis-server")
        print("  Linux/Mac: sudo systemctl start redis")
        return False

def start_worker():
    """Start Celery worker."""
    print("Starting Celery worker...")
    cmd = [
        sys.executable, "-m", "celery",
        "-A", "celery_config",
        "worker",
        "--loglevel=info",
        "--pool=solo",
        "--max-tasks-per-child=100"
    ]

    if platform.system() == "Windows":
        subprocess.Popen(cmd, cwd=backend_dir)
    else:
        subprocess.Popen(cmd, cwd=backend_dir)

    print("Celery worker started")

def start_beat():
    """Start Celery Beat scheduler."""
    print("Starting Celery Beat scheduler...")
    cmd = [
        sys.executable, "-m", "celery",
        "-A", "celery_config",
        "beat",
        "--loglevel=info",
        "--scheduler=django_celery_beat.schedulers:DatabaseScheduler"
    ]

    if platform.system() == "Windows":
        subprocess.Popen(cmd, cwd=backend_dir)
    else:
        subprocess.Popen(cmd, cwd=backend_dir)

    print("Celery Beat scheduler started")

def start_all():
    """Start both worker and beat."""
    if not check_redis():
        return False

    print("Starting all Celery components...")
    start_worker()
    start_beat()
    print("All Celery components started")
    return True

def main():
    """Main function."""
    if len(sys.argv) < 2:
        print("Usage: python start_celery.py <command>")
        print("Commands:")
        print("  worker - Start Celery worker only")
        print("  beat   - Start Celery Beat scheduler only")
        print("  all    - Start both worker and beat")
        print("  check  - Check system requirements")
        sys.exit(1)

    command = sys.argv[1].lower()

    if command == "worker":
        if not check_redis():
            sys.exit(1)
        start_worker()
    elif command == "beat":
        if not check_redis():
            sys.exit(1)
        start_beat()
    elif command == "all":
        if not start_all():
            sys.exit(1)
    elif command == "check":
        print("Checking system requirements...")
        if check_redis():
            print("✓ All requirements met")
        else:
            print("✗ Some requirements not met")
            sys.exit(1)
    else:
        print(f"Unknown command: {command}")
        sys.exit(1)

if __name__ == "__main__":
    main()
backend/start_celery.sh CHANGED
@@ -10,14 +10,14 @@ fi
  # Function to start Celery worker
  start_worker() {
      echo "Starting Celery worker..."
-     celery -A celery_app worker --loglevel=info &
+     python start_celery.py worker &
      echo "Celery worker started with PID $!"
  }

  # Function to start Celery Beat scheduler
  start_beat() {
      echo "Starting Celery Beat scheduler..."
-     celery -A celery_beat_config beat --loglevel=info &
+     python start_celery.py beat &
      echo "Celery Beat scheduler started with PID $!"
  }

@@ -27,6 +27,12 @@ start_all() {
      start_beat
  }

+ # Function to check system requirements
+ check_requirements() {
+     echo "Checking system requirements..."
+     python start_celery.py check
+ }
+
  # Main script logic
  case "$1" in
      worker)
@@ -38,11 +44,15 @@ case "$1" in
      all)
          start_all
          ;;
+     check)
+         check_requirements
+         ;;
      *)
-         echo "Usage: $0 {worker|beat|all}"
+         echo "Usage: $0 {worker|beat|all|check}"
          echo "  worker - Start Celery worker"
          echo "  beat   - Start Celery Beat scheduler"
          echo "  all    - Start both worker and scheduler"
+         echo "  check  - Check system requirements"
          exit 1
          ;;
  esac
frontend/src/components/Sidebar/Sidebar.jsx CHANGED
@@ -84,7 +84,6 @@ const Sidebar = ({ isCollapsed, toggleSidebar }) => {
      label: 'Dashboard',
      icon: 'dashboard',
      description: 'Overview and analytics',
-     badge: 'New',
      iconColor: 'text-primary-600',
      animationDelay: 0,
      ariaLabel: 'Dashboard - Overview and analytics',
@@ -105,7 +104,6 @@ const Sidebar = ({ isCollapsed, toggleSidebar }) => {
      label: 'Accounts',
      icon: 'account_circle',
      description: 'Social media accounts',
-     count: 3,
      iconColor: 'text-success-600',
      animationDelay: 200,
      ariaLabel: 'Accounts - Social media accounts',
@@ -116,7 +114,6 @@ const Sidebar = ({ isCollapsed, toggleSidebar }) => {
      label: 'Posts',
      icon: 'post_add',
      description: 'Content posts',
-     count: 12,
      iconColor: 'text-warning-600',
      animationDelay: 300,
      ariaLabel: 'Posts - Content posts',
frontend/src/pages/Home.jsx CHANGED
@@ -186,29 +186,6 @@ const Home = () => {
          </Link>
        </div>

-       {/* Trust indicators */}
-       <div className={`mt-12 sm:mt-16 flex flex-wrap justify-center items-center gap-4 sm:gap-6 lg:gap-8 text-secondary-600 transition-all duration-1000 ${
-         isVisible ? 'animate-slide-up opacity-100' : 'opacity-0 translate-y-8'
-       }`} style={{ animationDelay: '0.5s' }}>
-         <div className="flex items-center gap-2">
-           <svg className="w-5 h-5 sm:w-6 sm:h-6 text-primary-600" fill="currentColor" viewBox="0 0 20 20">
-             <path fillRule="evenodd" d="M6.267 3.455a3.066 3.066 0 001.745-.723 3.066 3.066 0 013.976 0 3.066 3.066 0 001.745.723 3.066 3.066 0 012.812 2.812c.051.643.304 1.254.723 1.745a3.066 3.066 0 010 3.976 3.066 3.066 0 00-.723 1.745 3.066 3.066 0 01-2.812 2.812 3.066 3.066 0 00-1.745.723 3.066 3.066 0 01-3.976 0 3.066 3.066 0 00-1.745-.723 3.066 3.066 0 01-2.812-2.812 3.066 3.066 0 00-.723-1.745 3.066 3.066 0 010-3.976 3.066 3.066 0 00.723-1.745 3.066 3.066 0 012.812-2.812zm7.44 5.252a1 1 0 00-1.414-1.414L9 10.586 7.707 9.293a1 1 0 00-1.414 1.414l2 2a1 1 0 001.414 0l4-4z" clipRule="evenodd" />
-           </svg>
-           <span className="text-sm sm:text-base font-medium">10,000+ Active Users</span>
-         </div>
-         <div className="flex items-center gap-2">
-           <svg className="w-5 h-5 sm:w-6 sm:h-6 text-primary-600" fill="currentColor" viewBox="0 0 20 20">
-             <path fillRule="evenodd" d="M3 17a1 1 0 011-1h12a1 1 0 110 2H4a1 1 0 01-1-1zm3.293-7.707a1 1 0 011.414 0L9 10.586V3a1 1 0 112 0v7.586l1.293-1.293a1 1 0 111.414 1.414l-3 3a1 1 0 01-1.414 0l-3-3a1 1 0 010-1.414z" clipRule="evenodd" />
-           </svg>
-           <span className="text-sm sm:text-base font-medium">50M+ Posts Managed</span>
-         </div>
-         <div className="flex items-center gap-2">
-           <svg className="w-5 h-5 sm:w-6 sm:h-6 text-primary-600" fill="currentColor" viewBox="0 0 20 20">
-             <path d="M9.049 2.927c.3-.921 1.603-.921 1.902 0l1.07 3.292a1 1 0 00.95.69h3.462c.969 0 1.371 1.24.588 1.81l-2.8 2.034a1 1 0 00-.364 1.118l1.07 3.292c.3.921-.755 1.688-1.54 1.118l-2.8-2.034a1 1 0 00-1.175 0l-2.8 2.034c-.784.57-1.838-.197-1.539-1.118l1.07-3.292a1 1 0 00-.364-1.118L2.98 8.72c-.783-.57-.38-1.81.588-1.81h3.461a1 1 0 00.951-.69l1.07-3.292z" />
-           </svg>
-           <span className="text-sm sm:text-base font-medium">4.9/5 User Rating</span>
-         </div>
-       </div>
      </div>
    </div>
  </section>
start_app.py CHANGED
@@ -1,10 +1,72 @@
  #!/usr/bin/env python
  """
  Entry point for the Lin application.
- This is an alternative entry point that can be used if needed.
+ This script starts both the Flask application and Celery scheduler components.
  """
  import os
  import sys
+ import subprocess
+ import platform
+ import time
+ from pathlib import Path
+
+ def check_redis():
+     """Check if Redis is running."""
+     try:
+         import redis
+         client = redis.Redis(host='localhost', port=6379, db=0)
+         client.ping()
+         print("✓ Redis connection successful")
+         return True
+     except Exception as e:
+         print(f"✗ Redis connection failed: {e}")
+         print("Please start Redis server first:")
+         print("  Windows: redis-server")
+         print("  Linux/Mac: sudo systemctl start redis")
+         return False
+
+ def start_celery_components():
+     """Start Celery worker and beat scheduler in background processes."""
+     print("Starting Celery components...")
+
+     backend_dir = Path(__file__).parent / "backend"
+
+     # Start Celery worker
+     worker_cmd = [
+         sys.executable, "-m", "celery",
+         "-A", "celery_config",
+         "worker",
+         "--loglevel=info",
+         "--pool=solo",
+         "--max-tasks-per-child=100"
+     ]
+
+     # Start Celery beat
+     beat_cmd = [
+         sys.executable, "-m", "celery",
+         "-A", "celery_config",
+         "beat",
+         "--loglevel=info",
+         "--scheduler=django_celery_beat.schedulers:DatabaseScheduler"
+     ]
+
+     if platform.system() == "Windows":
+         # Windows: Use subprocess to start background processes
+         subprocess.Popen(worker_cmd, cwd=backend_dir,
+                          stdout=subprocess.PIPE, stderr=subprocess.PIPE,
+                          creationflags=subprocess.CREATE_NEW_PROCESS_GROUP)
+         subprocess.Popen(beat_cmd, cwd=backend_dir,
+                          stdout=subprocess.PIPE, stderr=subprocess.PIPE,
+                          creationflags=subprocess.CREATE_NEW_PROCESS_GROUP)
+     else:
+         # Linux/Mac: Use subprocess with proper signal handling
+         subprocess.Popen(worker_cmd, cwd=backend_dir,
+                          stdout=subprocess.PIPE, stderr=subprocess.PIPE)
+         subprocess.Popen(beat_cmd, cwd=backend_dir,
+                          stdout=subprocess.PIPE, stderr=subprocess.PIPE)
+
+     print("Celery worker and beat scheduler started in background")
+     time.sleep(2)  # Give Celery components time to start

  if __name__ == "__main__":
      # Set the port for Hugging Face Spaces
@@ -12,16 +74,41 @@ if __name__ == "__main__":
      os.environ.setdefault('PORT', port)

      print(f"Starting Lin application on port {port}...")
+     print("=" * 60)
+
+     # Check if Redis is available
+     if not check_redis():
+         print("Warning: Redis not available. Celery may not function properly.")
+         print("Continuing with Flask app only...")
+     print("=" * 60)

      try:
+         # Start Celery components first
+         start_celery_components()
+
          # Import and run the backend Flask app directly
          from backend.app import create_app
          app = create_app()
+
+         print("=" * 60)
+         print("Flask application starting...")
+         print("Access the application at:")
+         print(f"  http://localhost:{port}")
+         print(f"  http://127.0.0.1:{port}")
+         print("=" * 60)
+
          app.run(
              host='0.0.0.0',
              port=int(port),
-             debug=False
+             debug=False,
+             threaded=True
          )
+
+     except KeyboardInterrupt:
+         print("\nShutting down application...")
+         sys.exit(0)
      except Exception as e:
          print(f"Failed to start Lin application: {e}")
+         import traceback
+         traceback.print_exc()
          sys.exit(1)