mohammed-aljafry committed
Commit 779f141 · verified · 1 Parent(s): ff44cfa

Upload README.md with huggingface_hub

Files changed (1):
  1. README.md +206 -5
README.md CHANGED
@@ -1,10 +1,211 @@
  ---
- title: Baseer Api Server
- emoji: 🐨
+ title: Baseer Self-Driving API
+ emoji: 🚗
  colorFrom: blue
- colorTo: indigo
+ colorTo: red
  sdk: docker
- pinned: false
+ app_port: 7860
+ pinned: true
+ license: mit
+ short_description: A RESTful API for an InterFuser-based self-driving model.
+ tags:
+ - computer-vision
+ - autonomous-driving
+ - deep-learning
+ - fastapi
+ - pytorch
+ - interfuser
+ - graduation-project
+ - carla
+ - self-driving
  ---

- Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
+ # 🚗 Baseer Self-Driving API
+
+ | Service | Status |
+ |---|---|
+ | **API Status** | [![Status](https://img.shields.io/website?up_message=online&down_message=offline&url=https%3A%2F%2FBaseerAI-baseer-server.hf.space)](https://BaseerAI-baseer-server.hf.space) |
+ | **Model** | [![Model](https://img.shields.io/badge/Model-Interfuser--Baseer--v1-blue)](https://huggingface.co/BaseerAI/Interfuser-Baseer-v1) |
+ | **Frameworks** | [![FastAPI](https://img.shields.io/badge/FastAPI-005571?style=flat&logo=fastapi)](https://fastapi.tiangolo.com/) [![PyTorch](https://img.shields.io/badge/PyTorch-%23EE4C2C.svg?style=flat&logo=PyTorch&logoColor=white)](https://pytorch.org/) |
+
+ ## 📋 Project Description
+
+ **Baseer** is an advanced self-driving system that provides a robust, real-time API for autonomous vehicle control. This Space hosts the FastAPI server that acts as an interface to the fine-tuned **[Interfuser-Baseer-v1](https://huggingface.co/BaseerAI/Interfuser-Baseer-v1)** model.
+
+ The system is designed to take a live camera feed and vehicle measurements, process them through the deep learning model, and return actionable control commands and a comprehensive scene analysis.
+
+ ---
+
+ ## 🏗️ Architecture
+
+ This project follows a decoupled client-server architecture, where the model and the application are managed separately for better modularity and scalability.
+
+ ```
+ +-----------+      +------------------------+      +--------------------------+
+ |           |      |                        |      |                          |
+ |  Client   |  ->  |   Baseer API (Space)   |  ->  |  Interfuser Model (Hub)  |
+ |(e.g.CARLA)|      |    (FastAPI Server)    |      | (Private/Gated Weights)  |
+ |           |      |                        |      |                          |
+ +-----------+      +------------------------+      +--------------------------+
+     HTTP                  Loads Model                  Model Repository
+    Request
+ ```
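+
+ The Space pulls the private/gated weights from the model repository when it starts. Purely as an illustration of that hand-off (the real loading code lives in the server and may differ, and the checkpoint filename below is a guess), this is roughly what it looks like with `huggingface_hub`:
+
+ ```python
+ # Illustrative only: how a Space could pull private/gated weights from the Hub
+ # at startup. The checkpoint filename and state-dict layout are assumptions,
+ # not taken from the actual server code.
+ import os
+
+ import torch
+ from huggingface_hub import snapshot_download
+
+
+ def load_interfuser_weights(repo_id: str = "BaseerAI/Interfuser-Baseer-v1") -> dict:
+     """Download the model repository and load a checkpoint into memory."""
+     # HF_TOKEN must be available (e.g. as a Space secret) for gated repositories.
+     local_dir = snapshot_download(repo_id=repo_id, token=os.environ.get("HF_TOKEN"))
+     checkpoint_path = os.path.join(local_dir, "pytorch_model.bin")  # hypothetical filename
+     return torch.load(checkpoint_path, map_location="cpu")
+ ```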
+
+ ## ✨ Key Features
+
+ ### 🧠 **Advanced Perception Engine**
+ - **Powered by:** The [Interfuser-Baseer-v1](https://huggingface.co/BaseerAI/Interfuser-Baseer-v1) model.
+ - **Focus:** High-accuracy traffic object detection and safe waypoint prediction.
+ - **Scene Analysis:** Real-time assessment of junctions, traffic lights, and stop signs.
+
+ ### ⚡ **High-Performance API**
+ - **Framework:** Built with **FastAPI** for high throughput and low latency.
+ - **Stateful Sessions:** Manages multiple, independent driving sessions, each with its own tracker and controller state.
+ - **RESTful Interface:** Intuitive and easy-to-use API design.
+
+ ### 📊 **Comprehensive Outputs**
+ - **Control Commands:** `steer`, `throttle`, `brake`.
+ - **Scene Analysis:** Probabilities for junctions, traffic lights, and stop signs.
+ - **Predicted Waypoints:** The model's intended path for the next 10 steps.
+ - **Visual Dashboard:** A generated image that provides a complete, human-readable overview of the current state.
+
+ ---
+
+ ## 🚀 How to Use
+
+ Interact with the API by making HTTP requests to its endpoints. The typical workflow is to start a session, run steps in a loop, and then end the session.
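+
+ Each step is shown with `curl` below. Purely as an illustration (this client is not part of the repository), the whole loop can be written in Python with `requests`; the dummy frame stands in for a live CARLA camera image, and the base64-encoded JPEG format for `image_b64` is an assumption:
+
+ ```python
+ # Illustrative client-side sketch of the session lifecycle described above.
+ import base64
+
+ import cv2
+ import numpy as np
+ import requests
+
+ BASE_URL = "https://BaseerAI-baseer-server.hf.space"
+
+ # 1. Start a session.
+ session_id = requests.post(f"{BASE_URL}/start_session", timeout=30).json()["session_id"]
+
+ try:
+     # 2. Run steps in a loop (a single dummy step is shown here).
+     frame = np.zeros((600, 800, 3), dtype=np.uint8)  # placeholder BGR frame
+     _, jpeg = cv2.imencode(".jpg", frame)
+     payload = {
+         "session_id": session_id,
+         "image_b64": base64.b64encode(jpeg.tobytes()).decode("ascii"),
+         "measurements": {
+             "pos_global": [105.0, -20.0],
+             "theta": 1.57,
+             "speed": 5.5,
+             "target_point": [10.0, 0.0],
+         },
+     }
+     result = requests.post(f"{BASE_URL}/run_step", json=payload, timeout=60).json()
+     print(result["control_commands"], result.get("reason"))
+ finally:
+     # 3. Always clean up the session on the server.
+     requests.post(f"{BASE_URL}/end_session", params={"session_id": session_id}, timeout=30)
+ ```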
+
+ ### 1. Start a New Session
+ This will initialize a new set of tracker and controller instances on the server.
+
+ **Request:**
+ ```bash
+ curl -X POST "https://BaseerAI-baseer-server.hf.space/start_session"
+ ```
+
+ **Example Response:**
+ ```json
+ {
+   "session_id": "a1b2c3d4-e5f6-7890-1234-567890abcdef"
+ }
+ ```
+
+ ### 2. Run a Simulation Step
+
+ Send the current camera view and vehicle measurements to be processed. The API will return control commands and a full analysis.
+
+ **Request:**
+ ```bash
+ curl -X POST "https://BaseerAI-baseer-server.hf.space/run_step" \
+   -H "Content-Type: application/json" \
+   -d '{
+     "session_id": "a1b2c3d4-e5f6-7890-1234-567890abcdef",
+     "image_b64": "your-base64-encoded-bgr-image-string",
+     "measurements": {
+       "pos_global": [105.0, -20.0],
+       "theta": 1.57,
+       "speed": 5.5,
+       "target_point": [10.0, 0.0]
+     }
+   }'
+ ```
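+
+ In practice the camera frame comes from the simulator. A sketch of turning a CARLA camera image into the `image_b64` field (CARLA delivers BGRA buffers and OpenCV works in BGR; the base64-encoded JPEG format is again an assumption about what the server expects):
+
+ ```python
+ # Sketch: converting a carla.Image from a camera sensor into the `image_b64`
+ # request field. The JPEG-in-base64 encoding is an assumption.
+ import base64
+
+ import cv2
+ import numpy as np
+
+
+ def carla_image_to_b64(image) -> str:
+     """Convert a carla.Image (BGRA raw buffer) into a base64-encoded BGR JPEG."""
+     bgra = np.frombuffer(image.raw_data, dtype=np.uint8).reshape((image.height, image.width, 4))
+     bgr = np.ascontiguousarray(bgra[:, :, :3])  # drop alpha, keep OpenCV's BGR order
+     ok, jpeg = cv2.imencode(".jpg", bgr)
+     if not ok:
+         raise RuntimeError("JPEG encoding failed")
+     return base64.b64encode(jpeg.tobytes()).decode("ascii")
+ ```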
+
+ **Example Response:**
+ ```json
+ {
+   "control_commands": {
+     "steer": 0.05,
+     "throttle": 0.6,
+     "brake": false
+   },
+   "scene_analysis": {
+     "is_junction": 0.02,
+     "traffic_light_state": 0.95,
+     "stop_sign": 0.01
+   },
+   "predicted_waypoints": [
+     [1.0, 0.05],
+     [2.0, 0.06],
+     [3.0, 0.07],
+     [4.0, 0.07],
+     [5.0, 0.08],
+     [6.0, 0.08],
+     [7.0, 0.09],
+     [8.0, 0.09],
+     [9.0, 0.10],
+     [10.0, 0.10]
+   ],
+   "dashboard_b64": "a-very-long-base64-string-representing-the-dashboard-image...",
+   "reason": "Red Light"
+ }
+ ```
+
+ **Response Fields:**
+ - **`control_commands`**: The final commands to be applied to the vehicle.
+ - **`scene_analysis`**: Probabilities for different road hazards. A high `traffic_light_state` value (e.g., > 0.5) indicates a red light.
+ - **`predicted_waypoints`**: The model's intended path, relative to the vehicle.
+ - **`dashboard_b64`**: A Base64-encoded JPEG image of the full dashboard view, which can be directly displayed in a client application (see the decoding sketch after this list).
+ - **`reason`**: A human-readable string explaining the primary reason for the control action (e.g., "Following ID 12", "Red Light", "Cruising").
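+
+ As an example of consuming these fields, a client might decode the dashboard and apply the red-light threshold mentioned above (a sketch; `result` stands for the parsed JSON returned by `/run_step`):
+
+ ```python
+ # Sketch: decoding `dashboard_b64` and reading the scene analysis.
+ # `result` is assumed to be the parsed JSON response from /run_step.
+ import base64
+
+ import cv2
+ import numpy as np
+
+ jpeg_bytes = base64.b64decode(result["dashboard_b64"])
+ dashboard = cv2.imdecode(np.frombuffer(jpeg_bytes, dtype=np.uint8), cv2.IMREAD_COLOR)
+ cv2.imwrite("dashboard.jpg", dashboard)  # or cv2.imshow(...) in an interactive client
+
+ # A high traffic_light_state means a red light, per the field description above.
+ if result["scene_analysis"]["traffic_light_state"] > 0.5:
+     print("Red light detected:", result["reason"])
+ ```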
+
+ ### 3. End the Session
+
+ This will clean up the session data from the server.
+
+ **Request:**
+ ```bash
+ curl -X POST "https://BaseerAI-baseer-server.hf.space/end_session?session_id=a1b2c3d4-e5f6-7890-1234-567890abcdef"
+ ```
+
+ **Example Response:**
+ ```json
+ {
+   "message": "Session a1b2c3d4-e5f6-7890-1234-567890abcdef ended."
+ }
+ ```
+
+ ---
+
+ ## 📡 API Endpoints
+
+ | Endpoint | Method | Description |
+ |---|---|---|
+ | `/` | GET | Landing page with API status. |
+ | `/docs` | GET | Interactive API documentation (Swagger UI). |
+ | `/start_session` | POST | Initializes a new driving session. |
+ | `/run_step` | POST | Processes a single frame and returns control commands. |
+ | `/end_session` | POST | Terminates a specific session. |
+ | `/sessions` | GET | Lists all currently active sessions. |
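+
+ For orientation only, the endpoint table and the stateful-session design map onto a FastAPI skeleton roughly like the one below. This is an illustrative sketch, not the actual server code, and the empty per-session dict stands in for the real tracker and controller objects:
+
+ ```python
+ # Illustrative skeleton mirroring the endpoint table above - not the real server.
+ import uuid
+
+ from fastapi import FastAPI, HTTPException
+
+ app = FastAPI(title="Baseer Self-Driving API")
+ sessions: dict[str, dict] = {}  # session_id -> per-session state (tracker, controller, ...)
+
+ @app.get("/")
+ def root():
+     return {"status": "online"}
+
+ @app.post("/start_session")
+ def start_session():
+     session_id = str(uuid.uuid4())
+     sessions[session_id] = {}  # would hold tracker/controller instances
+     return {"session_id": session_id}
+
+ @app.post("/run_step")
+ def run_step(payload: dict):
+     if payload.get("session_id") not in sessions:
+         raise HTTPException(status_code=404, detail="Unknown session")
+     # Model inference, tracking and control would happen here.
+     return {"control_commands": {"steer": 0.0, "throttle": 0.0, "brake": True}}
+
+ @app.post("/end_session")
+ def end_session(session_id: str):
+     sessions.pop(session_id, None)
+     return {"message": f"Session {session_id} ended."}
+
+ @app.get("/sessions")
+ def list_sessions():
+     return {"active_sessions": list(sessions)}
+ ```
+
+ FastAPI generates the interactive `/docs` page automatically, so it does not appear as an explicit route.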
+
+ ---
+
+ ## 🎯 Intended Use Cases & Limitations
+
+ ### ✅ Optimal Use Cases
+ - Simulating driving in CARLA environments.
+ - Research in end-to-end autonomous driving.
+ - Testing perception and control modules in a closed-loop system.
+ - Real-time object detection and trajectory planning.
+
+ ### ⚠️ Limitations
+ - **Simulation-Only:** Trained exclusively on CARLA data. Not suitable for real-world driving.
+ - **Vision-Based:** Relies on a single front-facing camera and has inherent blind spots.
+ - **No LiDAR:** Lacks the robustness of sensor fusion in adverse conditions.
+
+ ---
+
+ ## 🛠️ Development
+
+ This project is part of a graduation thesis in Artificial Intelligence. The core stack:
+ - **Deep Learning:** PyTorch
+ - **API Server:** FastAPI
+ - **Image Processing:** OpenCV
+ - **Scientific Computing:** NumPy
+
+ ## 📞 Contact
+
+ For inquiries or support, please use the **Community** tab in this Space or open an issue in the project's GitHub repository (if available).
+
+ ---
+
+ **Developed by:** Adam Altawil
+ **License:** MIT