A real-time vehicle tracking system that ingests GTFS data from TransLink's API and streams vehicle positions through Kafka.
```
├── README.md                      # This file
├── pom.xml                        # Maven parent configuration
├── .env                           # Environment variables (not in repo)
├── google_transit.zip             # GTFS static data
├── config/                        # Configuration templates
│   ├── .env.template              # Environment variables template
│   └── application-template.yml
├── deployment/                    # Docker and deployment files
│   ├── docker-compose.yml         # Base compose with Kafka
│   ├── docker-compose.local.yml   # Local development overlay
│   ├── docker-compose.cloud.yml   # Confluent Cloud overlay
│   ├── docker-compose.base.yml    # Services only (no Kafka)
│   ├── docker-compose.prod.yml    # Production deployment
│   ├── start.sh                   # Deployment script
│   ├── setup.sh                   # Setup script
│   └── nginx/                     # Nginx configuration
├── docs/                          # Documentation
│   ├── DEPLOYMENT.md              # Comprehensive deployment guide
│   ├── API_DOCUMENTATION.md       # API documentation
│   └── DEPLOYMENT_GUIDE.md        # Additional deployment info
├── data-ingestion-service/        # Data ingestion microservice
├── transit-tracker-service/       # Transit tracking microservice
└── shared-models/                 # Common models and protobuf
```
The system consists of 3 main modules:
- shared-models: Common data models and protobuf definitions for GTFS data
- data-ingestion-service: Spring Boot service that polls TransLink GTFS API every 30 seconds and publishes vehicle positions to Kafka (runs on port 8081)
- transit-tracker-service: Spring Boot service that consumes Kafka events, loads GTFS static data, and provides REST API with WebSocket support (runs on port 8082)
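As an architectural sketch of the ingestion flow (illustrative only; the class name, property names, and feed URL handling below are assumptions, not this repo's actual code), a scheduled task fetches the GTFS-realtime feed and publishes one Kafka record per vehicle to the `vehicle-positions` topic:

```java
// Illustrative sketch of the ingestion flow -- not the repo's actual service code.
// Assumes spring-kafka and com.google.transit:gtfs-realtime-bindings on the classpath,
// and @EnableScheduling on the application class.
import com.google.transit.realtime.GtfsRealtime.FeedEntity;
import com.google.transit.realtime.GtfsRealtime.FeedMessage;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;
import org.springframework.web.client.RestTemplate;

@Component
public class VehiclePositionPoller {

    private final KafkaTemplate<String, byte[]> kafka;
    private final RestTemplate http = new RestTemplate();

    // Hypothetical property: full GTFS-realtime vehicle positions URL, including the API key.
    @Value("${translink.feed-url}")
    private String feedUrl;

    public VehiclePositionPoller(KafkaTemplate<String, byte[]> kafka) {
        this.kafka = kafka;
    }

    // Fetch the feed on a fixed interval (30 s, per the service description above)
    // and publish one record per vehicle, keyed by vehicle id.
    @Scheduled(fixedRateString = "${translink.poll-interval-ms:30000}")
    public void poll() {
        try {
            byte[] body = http.getForObject(feedUrl, byte[].class);
            FeedMessage feed = FeedMessage.parseFrom(body);
            for (FeedEntity entity : feed.getEntityList()) {
                if (entity.hasVehicle()) {
                    kafka.send("vehicle-positions",
                            entity.getVehicle().getVehicle().getId(),
                            entity.getVehicle().toByteArray());
                }
            }
        } catch (Exception e) {
            // In the real service this would be logged and metered; keep the poller alive either way.
            System.err.println("Poll failed: " + e.getMessage());
        }
    }
}
```

Keying records by vehicle id keeps each vehicle's updates ordered within a Kafka partition, which is what a downstream consumer needs to hold the latest position per vehicle.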
Prerequisites:
- Java 17+
- Maven 3.6+
- Docker and Docker Compose
- TransLink API key (request one through TransLink's developer portal)
- (Optional) Confluent Cloud account for production
```bash
# Clone repository
git clone <your-repo-url>
cd real-time-transit

# Copy environment template and fill in your values
cp config/.env.template .env
# Edit .env with your actual API keys
```

```bash
# Start everything locally with Kafka UI
cd deployment
./start.sh local up

# Services will be available at:
# - Data Ingestion Service: http://localhost:8081
# - Transit Tracker Service: http://localhost:8082
# - Kafka UI: http://localhost:8080
```

```bash
# Use your Confluent Cloud credentials from .env
cd deployment
./start.sh cloud up
```

```bash
# Verify the services are healthy
curl http://localhost:8081/actuator/health
curl http://localhost:8082/actuator/health
```

Local development mode (`./start.sh local up`):
- Uses containerized Kafka (no external dependencies)
- Kafka UI available for monitoring
- Fast polling intervals (10 seconds) for development
- All logs visible via `./start.sh local logs`
Confluent Cloud mode (`./start.sh cloud up`):
- Uses your Confluent Cloud Kafka
- Production-like configuration
- Tests integration with managed Kafka
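Connecting to Confluent Cloud comes down to SASL_SSL with an API key and secret. Below is a minimal connectivity-check sketch; `KAFKA_BOOTSTRAP_SERVERS` matches the variable referenced in the troubleshooting notes further down, but the `KAFKA_API_KEY`/`KAFKA_API_SECRET` names are assumptions, so check `config/.env.template` for the names this repo actually uses.

```java
// Minimal Confluent Cloud connectivity check; env var names other than
// KAFKA_BOOTSTRAP_SERVERS are assumptions -- see config/.env.template for the real ones.
import java.util.Properties;
import org.apache.kafka.clients.CommonClientConfigs;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.config.SaslConfigs;
import org.apache.kafka.common.serialization.ByteArrayDeserializer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class CloudKafkaCheck {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(CommonClientConfigs.BOOTSTRAP_SERVERS_CONFIG, System.getenv("KAFKA_BOOTSTRAP_SERVERS"));
        props.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SASL_SSL");
        props.put(SaslConfigs.SASL_MECHANISM, "PLAIN");
        props.put(SaslConfigs.SASL_JAAS_CONFIG,
                "org.apache.kafka.common.security.plain.PlainLoginModule required"
                        + " username=\"" + System.getenv("KAFKA_API_KEY") + "\""
                        + " password=\"" + System.getenv("KAFKA_API_SECRET") + "\";");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "connectivity-check");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ByteArrayDeserializer.class.getName());

        // Listing topics is enough to prove the bootstrap servers and credentials work.
        try (KafkaConsumer<String, byte[]> consumer = new KafkaConsumer<>(props)) {
            System.out.println("Visible topics: " + consumer.listTopics().keySet());
        }
    }
}
```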
```bash
# Build all modules
mvn clean compile

# Run tests
mvn test

# Package for deployment
mvn clean package
```

Data Ingestion Service (port 8081):
- `GET /health` - Health check
- `GET /stats` - Ingestion statistics

Transit Tracker Service (port 8082):
- `GET /api/routes/{routeId}/directions/{directionId}/arrivals` - Get arrival predictions
- `GET /health` - Health check
- `WS /ws` - WebSocket for real-time updates
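As a usage sketch (the route and direction ids are placeholders, and the response shape is whatever the transit-tracker-service returns), the arrivals endpoint can be called with the JDK's built-in `HttpClient`:

```java
// Placeholder route/direction ids; adjust to a real TransLink route before running.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ArrivalsClient {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8082/api/routes/099/directions/0/arrivals"))
                .GET()
                .build();
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode());  // expect 200 once GTFS static data is loaded
        System.out.println(response.body());        // arrival predictions payload
    }
}
```

For continuous updates, clients would subscribe to the `/ws` WebSocket endpoint instead of polling this URL.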
- Kafka UI: http://localhost:8080 (monitor topics, consumers, messages)
- Application Health: Check the `/health` endpoints
- Logs: `./start.sh local logs` or `./start.sh local logs | grep [service-name]`
- Health checks built into Docker Compose
- Structured logging with rotation
- Metrics available via actuator endpoints
See docs/DEPLOYMENT.md for comprehensive deployment documentation.
```bash
cd deployment

# Local development with full Kafka stack
./start.sh local up

# Production-like testing with Confluent Cloud
./start.sh cloud up

# Full production deployment
./start.sh prod up

# View logs
./start.sh [env] logs

# Stop services
./start.sh [env] down
```

The system supports multiple environment profiles:
- local: Development with containerized Kafka
- prod: Production with Confluent Cloud and security
Configuration files are in each service's `src/main/resources/`:
- `application.yml` - Base configuration
- `application-local.yml` - Local development overrides
- `application-prod.yml` - Production configuration with security
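As a sketch of how these profile-specific files typically feed the services (the `translink` prefix and property names below are assumptions, not necessarily this repo's keys), a `@ConfigurationProperties` class binds the base values and `application-local.yml` overrides only what differs, such as the shorter development polling interval mentioned earlier:

```java
// Hypothetical binding class; the real property names live in each service's application*.yml.
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.stereotype.Component;

@Component
@ConfigurationProperties(prefix = "translink")
public class TranslinkProperties {

    /** API key, normally supplied via the TRANSLINK_API_KEY environment variable from .env. */
    private String apiKey;

    /** Poll interval in milliseconds; application-local.yml can drop this to 10000 for development. */
    private long pollIntervalMs = 30_000;

    public String getApiKey() { return apiKey; }
    public void setApiKey(String apiKey) { this.apiKey = apiKey; }

    public long getPollIntervalMs() { return pollIntervalMs; }
    public void setPollIntervalMs(long pollIntervalMs) { this.pollIntervalMs = pollIntervalMs; }
}
```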
Common issues:
- Services won't start: Check that the `.env` file has valid credentials
- Kafka connection failed: Verify `KAFKA_BOOTSTRAP_SERVERS` and credentials
- TransLink API errors: Check that `TRANSLINK_API_KEY` is valid
- Port conflicts: Ensure ports 8080, 8081, 8082, and 9092 are available
```bash
# Check service status
cd deployment && ./start.sh local status

# View detailed logs
cd deployment && ./start.sh local logs

# Test Kafka connectivity (if using local Kafka)
docker exec -it transit-kafka kafka-console-consumer --bootstrap-server localhost:9092 --topic vehicle-positions
```

When contributing:
- Follow the modular architecture
- Use Spring profiles for environment-specific config
- Add tests for new functionality
- Update documentation for API changes
- Use the provided Docker setup for consistent environments
[Your License]