A comprehensive smart city platform that uses Graph Neural Networks to process real-time IoT sensor data, detect events, and provide intelligent recommendations for urban management.
🚀 Live Demo - Try the interactive demo!
Visit the live demo to see UrbanSense in action immediately.
```bash
# Clone the repository
git clone https://github.com/VrindaBansal/urbansense.git
cd urbansense

# Navigate to the web demo
cd web_demo

# Start a local server
python -m http.server 8080

# Open http://localhost:8080 in your browser
```
That's it! No dependencies required for the basic demo.
UrbanSense transforms how cities understand and respond to urban events by connecting thousands of IoT sensors through intelligent AI. Instead of treating sensors as isolated data points, our Graph Neural Network creates a "social network" for sensors, understanding relationships and patterns across the entire urban ecosystem.
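To make the "social network" idea concrete, here is a minimal, illustrative sketch of how sensors can be turned into a graph by linking nearby devices. The sensor IDs, coordinates, and threshold are hypothetical, and the real GNN pipeline in `src/models/gnn/` may construct edges differently:

```python
import math

# Hypothetical sensor registry: id -> (latitude, longitude).
sensors = {
    "temp_001":    (40.7589, -73.9851),
    "traffic_001": (40.7580, -73.9855),
    "air_001":     (40.7700, -73.9700),
}

def distance(a, b):
    """Euclidean distance in degrees -- adequate for nearby city sensors."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def build_edges(sensors, threshold=0.005):
    """Connect every pair of sensors closer than the threshold."""
    ids = list(sensors)
    return [
        (u, v)
        for i, u in enumerate(ids)
        for v in ids[i + 1:]
        if distance(sensors[u], sensors[v]) < threshold
    ]

edges = build_edges(sensors)
# temp_001 and traffic_001 sit about 0.001 degrees apart, so they share
# an edge; air_001 is too far from both and stays disconnected.
```

A GNN then propagates readings along these edges, so a spike at one sensor is interpreted in the context of its neighbors rather than in isolation.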
UrbanSense can automatically detect and respond to a wide range of urban events.
```bash
git clone https://github.com/VrindaBansal/urbansense.git
cd urbansense
cd web_demo
python -m http.server 8080
# Visit http://localhost:8080
```
```bash
# Core dependencies
pip install -r requirements.txt

# Full ML dependencies (optional)
pip install -r requirements-full.txt
```
4. **Environment Setup** (if using backend features):
```bash
# Copy the example environment file
cp .env.example .env

# Edit .env with your configurations
```
```bash
# Build and run with Docker
docker build -t urbansense .
docker run -p 8080:8080 urbansense
```
```python
from src.applications.smart_city.sensor_fusion import SmartCitySensorFusion

# Initialize the system
fusion = SmartCitySensorFusion()

# Add sensors
fusion.add_sensor('temp_001', 'temperature', location=(40.7589, -73.9851))
fusion.add_sensor('traffic_001', 'traffic', location=(40.7580, -73.9855))

# Process real-time data
fusion.process_sensor_data('temp_001', 28.5)
fusion.process_sensor_data('traffic_001', 75.0)

# Get AI insights
anomalies = fusion.detect_anomalies()
events = fusion.detect_events()
recommendations = fusion.get_recommendations()
```
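For intuition about what `detect_anomalies()` might flag, here is a small, self-contained sketch of one common approach: a rolling z-score per sensor. This is an illustrative stand-in only; the class name and thresholds are hypothetical, and `SmartCitySensorFusion` may use a different method internally:

```python
from collections import deque
from statistics import mean, stdev

class RollingAnomalyDetector:
    """Flag readings that deviate sharply from a sensor's recent history."""

    def __init__(self, window=20, z_threshold=3.0):
        self.window = window
        self.z_threshold = z_threshold
        self.history = {}  # sensor_id -> deque of recent values

    def add_reading(self, sensor_id, value):
        buf = self.history.setdefault(sensor_id, deque(maxlen=self.window))
        buf.append(value)

    def is_anomalous(self, sensor_id, value):
        buf = self.history.get(sensor_id, ())
        if len(buf) < 3:
            return False  # too little history to judge
        mu, sigma = mean(buf), stdev(buf)
        if sigma == 0:
            return value != mu
        return abs(value - mu) / sigma > self.z_threshold

detector = RollingAnomalyDetector()
for v in [28.1, 28.4, 28.2, 28.3, 28.5]:
    detector.add_reading("temp_001", v)
print(detector.is_anomalous("temp_001", 45.0))  # prints True: a 45° spike stands out
```

The graph structure lets the real system go further: a reading that looks anomalous in isolation can be confirmed or discounted by comparing it against neighboring sensors.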
```
urbansense/
├── web_demo/              # Interactive web interface
│   ├── index.html         # Main interface
│   ├── simulation.js      # Core simulation engine
│   ├── style.css          # UI styling
│   └── quick_demo.py      # Standalone demo server
├── src/
│   ├── models/gnn/        # Graph Neural Network models
│   ├── applications/      # Smart city applications
│   ├── fusion/            # Sensor fusion algorithms
│   └── utils/             # Utility functions
├── docs/                  # Documentation
├── tests/                 # Unit tests
├── requirements.txt       # Core dependencies
├── requirements-full.txt  # Full ML dependencies
├── .env.example           # Environment template
└── README.md              # This file
```
The project automatically deploys to GitHub Pages on every push to the main branch:
```bash
# Build for production
npm run build

# Deploy to your hosting platform
npm run deploy
```
```bash
# Run all tests
python -m pytest tests/

# Run a specific test category
python -m pytest tests/test_gnn.py
python -m pytest tests/test_fusion.py

# Run with coverage
python -m pytest --cov=src tests/
```
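New tests follow the standard pytest pattern: plain functions whose names start with `test_`. As a hedged illustration of the style used in `tests/test_fusion.py`, here is a sketch; `fuse_readings` is a hypothetical stand-in helper, not the project's real API:

```python
def fuse_readings(readings):
    """Average duplicate readings per sensor into a single fused value."""
    totals = {}
    for sensor_id, value in readings:
        totals.setdefault(sensor_id, []).append(value)
    return {sid: sum(vals) / len(vals) for sid, vals in totals.items()}

def test_fuse_readings_averages_duplicates():
    readings = [("temp_001", 28.0), ("temp_001", 30.0), ("traffic_001", 75.0)]
    fused = fuse_readings(readings)
    assert fused["temp_001"] == 29.0
    assert fused["traffic_001"] == 75.0
```

pytest discovers and runs any such function automatically when you invoke `python -m pytest tests/`.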
We welcome contributions! Here's how to get started:
```bash
git checkout -b feature/amazing-feature
git commit -m 'Add amazing feature'
git push origin feature/amazing-feature
```

This project is licensed under the MIT License - see the LICENSE file for details.
Built with inspiration from multimodal AI research in medical applications, adapted for smart city use cases. This project demonstrates how AI can make urban environments more responsive, safe, and efficient.
Ready to transform your city with AI? Try the demo now!