Overview • Features • How It Works • Installation • Usage • Contributing • License
VideoMuse is an AI-powered application that generates custom music for your videos, creating unique and engaging music videos.
VideoMuse automates the process of creating music videos by analyzing the content of your video, generating lyrics, selecting a music style, and creating a custom music track to accompany it. Using advanced AI models and audio-processing techniques, VideoMuse produces high-quality music that complements your video's mood, style, and content.
- Video Analysis: Utilizes Google's Gemini AI to analyze video content and generate music prompts.
- Custom Music Generation: Creates unique music tracks based on video analysis using Suno AI.
- Audio Overlay: Seamlessly combines the original video with the generated music track.
- User-Friendly Interface: Easy-to-use Streamlit app for uploading videos and generating music videos.
- Video Upload: Users upload their video through the Streamlit interface.
- Video Analysis: The video is analyzed using Google's Gemini AI to generate a music prompt.
- Music Generation: Based on the prompt, custom music is created using Suno AI's API.
- Audio Processing: The generated music is downloaded and overlaid onto the original video.
- Final Output: A new music video is created, combining the original video with the custom music track.
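The flow above can be sketched as a small orchestration function. The helpers (`analyze`, `generate_music`, `overlay`) are hypothetical stand-ins passed in as callables, not actual names from this repo; real implementations would call Gemini, Suno's API, and an audio-processing tool.

```python
def make_music_video(video_path, analyze, generate_music, overlay, user_prompt=""):
    """Sketch of the VideoMuse pipeline with injected step functions.

    analyze(video_path)        -> music prompt (str)     # e.g. via Gemini
    generate_music(prompt)     -> audio file path (str)  # e.g. via Suno
    overlay(video, audio)      -> output video path (str)
    """
    prompt = analyze(video_path)                 # step 2: video -> music prompt
    if user_prompt:
        # Optional user guidance is appended to the AI-generated prompt.
        prompt = f"{prompt}. Style hint: {user_prompt}"
    audio_path = generate_music(prompt)          # step 3: prompt -> music track
    return overlay(video_path, audio_path)       # step 4: combine video + music
```

Injecting the steps keeps the pipeline easy to test with fakes before wiring in the real API clients.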
- Clone the repository:

  ```shell
  git clone https://github.com/amadad/videomuse.git
  ```

- Install the required dependencies:

  ```shell
  pip install -r requirements.txt
  ```

- Set up environment variables:
  - `GOOGLE_API_KEY`: your Google API key for Gemini AI
  - `SUNO_API_KEY`: your Suno AI API key
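As a sanity check, a snippet like the following (a hypothetical helper, not part of the repo) fails fast with a clear message when a key is missing:

```python
import os

# Keys the app expects, per the environment-variable setup above.
REQUIRED_VARS = ("GOOGLE_API_KEY", "SUNO_API_KEY")

def check_env():
    """Raise a descriptive error if any required API key is unset or empty."""
    missing = [name for name in REQUIRED_VARS if not os.environ.get(name)]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")
```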
- Run the Streamlit app:

  ```shell
  streamlit run main.py
  ```

- Upload your video file through the web interface.
- Enter a prompt to guide the music generation (optional).
- Click "Generate Music Video" to start the process.
- Wait for the process to complete, and enjoy your new music video!
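Under the hood, the audio-overlay step might look like the sketch below, assuming moviepy is the processing library (an assumption for illustration; the project may use a different tool):

```python
def overlay_music(video_path: str, audio_path: str, out_path: str) -> str:
    """Replace a video's audio track with the generated music.

    Assumes moviepy is installed (hypothetical dependency choice).
    """
    from moviepy.editor import AudioFileClip, VideoFileClip  # lazy import

    video = VideoFileClip(video_path)
    music = AudioFileClip(audio_path)
    # Trim the music so it does not run past the end of the video.
    if music.duration > video.duration:
        music = music.subclip(0, video.duration)
    video.set_audio(music).write_videofile(
        out_path, codec="libx264", audio_codec="aac"
    )
    return out_path
```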
Contributions to VideoMuse are welcome. Please follow the project's code standards and submit pull requests for review.
This project is licensed under the MIT License - see the LICENSE file for details.