AI-powered podcast generator with a microservices architecture that creates podcast conversations from text topics, PDF documents, photos, and website links.
- Spring Boot Server (Java 21) - REST API, authentication, and RabbitMQ publisher
- Python FastAPI Service - AI processing using OpenAI GPT-4 and ElevenLabs for voice synthesis
- React Native Mobile App (Expo) - Cross-platform mobile client with NativeWind styling
- PostgreSQL - Database with persistent storage
- RabbitMQ - Message queue for async processing
- K3s - Kubernetes deployment with Traefik ingress
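The request path through this architecture is asynchronous: the Spring Boot server accepts the REST request and publishes a job to RabbitMQ, which the Python AI service consumes. As a rough sketch, the job on the queue might look like the JSON message below; the field names here are illustrative assumptions, not the project's actual schema.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical shape of the job the Spring Boot server publishes to RabbitMQ
# for the FastAPI worker to consume. Field names are illustrative only.
@dataclass
class PodcastJob:
    podcast_id: str
    type: str             # TOPIC, PDF_DOCUMENT, PHOTO, or WEBSITE_LINK
    source_content: str
    duration: str         # SHORT, MEDIUM, or LONG

job = PodcastJob("42", "TOPIC", "The future of AI in healthcare", "MEDIUM")
payload = json.dumps(asdict(job))             # what would go on the queue
restored = PodcastJob(**json.loads(payload))  # what the worker would decode
```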
- TOPIC - Generate from text topics or prompts
- PDF_DOCUMENT - Analyze and discuss PDF content (with page range selection)
- PHOTO - Create discussions about images using GPT-4 Vision
- WEBSITE_LINK - Scrape and analyze web content
- SHORT (~2 minutes) - Quick summaries
- MEDIUM (~5 minutes) - Balanced discussions
- LONG (~10 minutes) - In-depth analysis
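Input type and duration combine in the body of a `POST /api/podcasts` request (see the curl examples later in this README). A minimal sketch of building and sanity-checking that body:

```python
import json

VALID_TYPES = {"TOPIC", "PDF_DOCUMENT", "PHOTO", "WEBSITE_LINK"}
VALID_DURATIONS = {"SHORT", "MEDIUM", "LONG"}

def build_podcast_request(type_, source_content, duration, speakers):
    """Build the JSON body for POST /api/podcasts, rejecting unknown enum values."""
    if type_ not in VALID_TYPES:
        raise ValueError(f"unknown type: {type_}")
    if duration not in VALID_DURATIONS:
        raise ValueError(f"unknown duration: {duration}")
    return {
        "type": type_,
        "sourceContent": source_content,
        "duration": duration,
        "speakers": speakers,  # e.g. [{"name": "Lisa", "role": "Healthcare Expert"}]
    }

body = build_podcast_request(
    "TOPIC",
    "The future of artificial intelligence in healthcare",
    "MEDIUM",
    [{"name": "Dr. Smith", "role": "AI Researcher"}],
)
print(json.dumps(body))
```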
- Real-time search functionality
- Swipe-to-delete podcast cards (Apple Music style)
- Type-based icons for different content types
- Background audio playback
- Optimistic UI updates with React Query
- Docker and Docker Compose
- Node.js and npm (for mobile app)
- Java 21 and Maven (for server development)
- Python 3.11 (for AI service development)
Create a .env file in the project root with your API keys:
# AI Services
OPENAI_API_KEY="your-openai-api-key"
ELEVENLABS_API_KEY="your-elevenlabs-api-key"
# AWS S3 Storage
AWS_ACCESS_KEY_ID="your-aws-access-key"
AWS_SECRET_ACCESS_KEY="your-aws-secret-key"
AWS_REGION="ap-south-1"
AWS_S3_BUCKET_NAME="your-s3-bucket-name"
# Clerk Authentication
CLERK_WEBHOOK_SECRET="your-clerk-webhook-secret"
CLERK_JWKS_URI="https://your-clerk-instance.clerk.accounts.dev/.well-known/jwks.json"
# Message Queue (for local development)
RABBITMQ_HOST="localhost"
RABBITMQ_PORT="5672"
RABBITMQ_USERNAME="guest"
RABBITMQ_PASSWORD="guest"

Option A: One-command setup
./scripts/k3s-dev.sh

Stop the K3s cluster:
./scripts/k3s-dev.sh down

Option B: Manual steps
# Start K3s cluster
docker-compose -f infra/docker-compose.k3s.yaml up -d
# Wait for K3s to be ready (about 30 seconds)
sleep 30
# Create secrets from .env file
./scripts/create-secrets.sh
# Build and deploy all services
./scripts/build-deploy.sh

- API health check: http://localhost/actuator/health
- API Endpoints: http://localhost/api/podcasts
- RabbitMQ Management: http://localhost:15672 (guest/guest)
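The health endpoint can also be polled from a script while the cluster comes up; Spring Boot Actuator reports `{"status": "UP"}` once the service is healthy. A minimal sketch using only the standard library:

```python
import json
from urllib.request import urlopen
from urllib.error import URLError

def api_is_up(base_url="http://localhost"):
    """Return True if the Spring Boot actuator endpoint reports status UP."""
    try:
        with urlopen(f"{base_url}/actuator/health", timeout=5) as resp:
            return json.loads(resp.read()).get("status") == "UP"
    except (URLError, OSError):
        return False  # cluster still starting, or ingress not ready yet

# Actuator returns a small JSON document; "UP" means the service is healthy:
sample = json.loads('{"status": "UP"}')
assert sample["status"] == "UP"
```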
# Install dependencies
cd mobile
npm install
# Start Expo development server
npm start
# Run on specific platforms
npm run ios # iOS simulator
npm run android # Android emulator
npm run web     # Web browser

# Start K3s cluster
docker-compose -f infra/docker-compose.k3s.yaml up -d
# Stop and cleanup
docker-compose -f infra/docker-compose.k3s.yaml down -v
# View logs
docker exec infra-k3s-server-1 kubectl logs -f deployment/notepod-server -n notepod
docker exec infra-k3s-server-1 kubectl logs -f deployment/notepod-ai -n notepod
# Check pod status
docker exec infra-k3s-server-1 kubectl get pods -n notepod
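For scripted checks, the same pod status is available as JSON via `kubectl get pods -o json`; a small sketch that filters out pods not yet in the Running phase (the sample below is trimmed to just the fields the check reads):

```python
import json

def not_running(pod_list_json):
    """Names of pods whose phase is not Running, from `kubectl get pods -o json`."""
    pods = json.loads(pod_list_json)["items"]
    return [p["metadata"]["name"] for p in pods
            if p["status"]["phase"] != "Running"]

# Sample of the kubectl JSON shape, trimmed to the fields used above:
sample = json.dumps({"items": [
    {"metadata": {"name": "notepod-server-abc"}, "status": {"phase": "Running"}},
    {"metadata": {"name": "notepod-ai-xyz"}, "status": {"phase": "Pending"}},
]})
print(not_running(sample))  # → ['notepod-ai-xyz']
```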
# Recreate secrets (if .env changes)
./scripts/create-secrets.sh
# Rebuild and redeploy
./scripts/build-deploy.sh

Option A: One-command setup
./scripts/local-dev.sh

Option B: Manual steps
# Start PostgreSQL and RabbitMQ only
docker-compose -f infra/docker-compose.yaml up -d
# Run Spring Boot server (Terminal 1)
cd server && ./mvnw spring-boot:run
# Run Python AI service (Terminal 2)
cd podcast-ai && pip install -r requirements.txt && python app.py
# Run mobile app (Terminal 3)
cd mobile && npm install && npm start

# Spring Boot tests
cd server
./mvnw test
# Python service tests
cd podcast-ai
python tests/test_pdf_podcast.py
python tests/test_image_podcast.py
python tests/test_website_podcast.py
# Mobile app linting
cd mobile
npm run lint

Example: create a podcast from a text topic

curl -X POST http://localhost/api/podcasts \
-H "Content-Type: application/json" \
-d '{
"type": "TOPIC",
"sourceContent": "The future of artificial intelligence in healthcare",
"duration": "MEDIUM",
"speakers": [
{"name": "Dr. Smith", "role": "AI Researcher"},
{"name": "Lisa", "role": "Healthcare Expert"}
]
}'

Example: create a podcast from a PDF document

curl -X POST http://localhost/api/podcasts \
-H "Content-Type: application/json" \
-d '{
"type": "PDF_DOCUMENT",
"sourceContent": "https://example.com/research-paper.pdf",
"duration": "LONG",
"speakers": [
{"name": "Prof. Johnson", "role": "Researcher"},
{"name": "Sarah", "role": "Student"}
]
}'

# Restart K3s cluster
docker-compose -f infra/docker-compose.k3s.yaml restart k3s-server
# Check K3s cluster status
docker exec infra-k3s-server-1 kubectl cluster-info
# View ingress status
docker exec infra-k3s-server-1 kubectl get ingress -n notepod

# Check if secrets exist
docker exec infra-k3s-server-1 kubectl get secrets -n notepod
# Recreate secrets
docker exec infra-k3s-server-1 kubectl delete secret api-secrets -n notepod
./scripts/create-secrets.sh

# Check PostgreSQL logs
docker exec infra-k3s-server-1 kubectl logs -f deployment/postgres -n notepod
# Check if PVCs are bound
docker exec infra-k3s-server-1 kubectl get pvc -n notepod

# Check notepod-ai logs
kubectl -n notepod logs -f -l app=notepod-ai --all-containers --prefix --tail=1000

# Tunnel to the RDS PostgreSQL instance
ssh -L 5432:notepod-postgres.cnmw0e08ykfs.ap-south-1.rds.amazonaws.com:5432 -i ~/.ssh/notepod-key.pem ubuntu@notepod.saheen.dev

# Tunnel to the RabbitMQ management UI
ssh -L 15672:10.42.0.123:15672 -i ~/.ssh/notepod-key.pem ubuntu@notepod.saheen.dev