---
license: gpl-3.0
title: VIDEO MANIPULATION DETECTION
sdk: streamlit
emoji: 🎥
colorFrom: purple
colorTo: blue
pinned: true
sdk_version: 1.41.1
---
# Video Manipulation Detection - Streamlit App
## Overview
This project is an AI-powered tool designed to detect video manipulations like deepfakes using object detection and optical flow analysis. It provides a web interface where users can upload videos, and the tool analyzes the video for potential manipulations, giving a score to indicate the likelihood of manipulation.
## Configuration
```yaml
title: Video Manipulation Detection
emoji: 🎥
colorFrom: blue
colorTo: green
sdk: streamlit
python_version: 3.10
sdk_version: 1.41.1
suggested_hardware: "t4-medium"
suggested_storage: "medium"
app_file: app.py
app_port: 8501
base_path: /
fullWidth: true
header: default
short_description: Detect deepfakes and video manipulations using AI-based analysis.
models:
- facebook/detr-resnet-50
tags:
- deepfake-detection
- video-analysis
- AI
thumbnail: https://example.com/thumbnail.png
pinned: true
hf_oauth: false
hf_oauth_scopes: []
hf_oauth_expiration_minutes: 480
hf_oauth_authorized_org: []
disable_embedding: false
startup_duration_timeout: 30m
custom_headers: {}
preload_from_hub:
- facebook/detr-resnet-50
```

## Description
This Streamlit app allows users to upload a video, which is then analyzed for potential manipulation. It uses the following key features:
- Frame Extraction: Captures frames from the video at defined intervals.
- Object Detection: Uses the DETR model (a transformer-based object detection model) to detect objects in each frame.
- Optical Flow Calculation: Computes motion patterns between consecutive frames to detect any abnormal motion (indicative of manipulation).
- Manipulation Score: Based on the results of object detection and motion analysis, the app generates a manipulation score to indicate the likelihood of tampering.
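The motion-analysis step above can be sketched as follows. This is a simplified, self-contained illustration that uses NumPy frame differencing as a stand-in for the dense Farneback optical flow (`cv2.calcOpticalFlowFarneback`) the app computes; the z-score threshold and the outlier-fraction heuristic are assumptions for illustration, not the app's actual parameters.

```python
import numpy as np

def motion_anomaly_score(frames, z_thresh=1.5):
    """Score abnormal motion across consecutive frames.

    frames: list of HxW grayscale arrays.
    Returns the fraction of frame transitions whose mean
    inter-frame change is a statistical outlier (a crude
    stand-in for optical-flow magnitude analysis).
    """
    diffs = np.array([
        np.abs(frames[i + 1].astype(float) - frames[i].astype(float)).mean()
        for i in range(len(frames) - 1)
    ])
    mu, sigma = diffs.mean(), diffs.std()
    if sigma == 0:
        return 0.0  # perfectly uniform motion: nothing anomalous
    outliers = np.abs(diffs - mu) > z_thresh * sigma
    return float(outliers.mean())

# Synthetic example: steady brightness drift with one abrupt jump
# simulating a tampered frame.
frames = [np.full((8, 8), i * 2, dtype=np.uint8) for i in range(10)]
frames[5] = np.full((8, 8), 200, dtype=np.uint8)  # simulated tampered frame
score = motion_anomaly_score(frames)
```

In the real app the per-transition statistic would be the mean optical-flow magnitude rather than a raw pixel difference, but the outlier logic is the same.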
## Steps to Deploy

1. **Clone the repository:**

   ```bash
   git clone https://github.com/yourusername/video-manipulation-detection.git
   cd video-manipulation-detection
   ```

2. **Create a virtual environment** (optional but recommended):

   ```bash
   python3 -m venv venv
   ```

   Activate it:
   - On Windows: `venv\Scripts\activate`
   - On Mac/Linux: `source venv/bin/activate`

3. **Install dependencies:**

   ```bash
   pip install -r requirements.txt
   ```

4. **Run the Streamlit app locally:**

   ```bash
   streamlit run app.py
   ```

   This starts the app at http://localhost:8501.

5. **Deploy on Hugging Face Spaces:**
   - Push the repository to Hugging Face Spaces under your account.
   - Ensure the `app.py` file is located at the root of the repository.
   - The app will then run automatically in the cloud as a Streamlit Space.
## Requirements
- Python 3.10
- Streamlit
- OpenCV
- PyTorch
- Hugging Face Transformers
- PIL (Python Imaging Library)
- tqdm
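A matching `requirements.txt` might look like the following. The package names are the usual PyPI distributions for the libraries listed above; exact version pins are left to the maintainer.

```
streamlit
opencv-python
torch
transformers
Pillow
tqdm
```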
## Usage

1. Upload a video file (e.g., `.mp4`, `.mov`).
2. The app will process the video and display:
   - The manipulation score based on the likelihood of manipulation.
   - Frame-based analysis and optical flow insights.
3. View the final result on the app interface, indicating whether the video is likely manipulated.
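The final verdict combines the per-signal results into one score. A minimal sketch of how such a combination might work is shown below; the equal weights and the 0.5 decision threshold are illustrative assumptions, not the app's actual tuning.

```python
def manipulation_verdict(detection_score, motion_score,
                         w_detection=0.5, w_motion=0.5, threshold=0.5):
    """Combine per-signal scores (each in [0, 1]) into a final verdict.

    The equal weights and 0.5 threshold are illustrative defaults.
    Returns (combined score, human-readable label).
    """
    combined = w_detection * detection_score + w_motion * motion_score
    label = "likely manipulated" if combined >= threshold else "likely authentic"
    return combined, label

# Example: strong object-detection anomalies, moderate motion anomalies.
score, label = manipulation_verdict(0.8, 0.6)
```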
## License
This project is licensed under the GPL-3.0 License - see the LICENSE file for details.
## Acknowledgments
- DETR (facebook/detr-resnet-50) model from Hugging Face for object detection.
- OpenCV for frame extraction and optical flow calculations.
- Streamlit for creating the interactive web interface.
## Deployment Notes

1. **`sdk: streamlit`** declares this as a Streamlit-based application.
2. **`app_file: app.py`** points the Space at the app's main file.
3. The configuration YAML above ensures the app is set up correctly on **Hugging Face Spaces**: once you push the repository, it runs there automatically as a **Streamlit** app.