
Backend, modern REST API for obtaining match and odds data crawled from multiple sites. Using FastAPI, MongoDB as database, Motor as async MongoDB client, Scrapy as crawler and Docker.


img/logo.png (Apiestas logo)

Build status badge (CircleCI): https://circleci.com/gh/franloza/apiestas/tree/master.svg?style=shield

Introduction

Apiestas is a project composed of a backend powered by the awesome framework FastAPI and a crawler powered by Scrapy.

This project follows code examples from the RealWorld example apps.

The crawler inserts and updates data in the MongoDB database through the Apiestas REST API, and the same API exposes the data. The REST API communicates with the database using Motor, the async Python driver for MongoDB. Finally, the application uses Typer to create the Apiestas CLI, which is the main entrypoint of the application.
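
As a minimal sketch of the FastAPI + Motor pattern described above (the endpoint path, collection name, and document shape are illustrative assumptions, not the project's actual API):

import os

from fastapi import FastAPI
from motor.motor_asyncio import AsyncIOMotorClient

app = FastAPI()
# DB_CONNECTION is the same variable set in the .env file in the Quickstart.
client = AsyncIOMotorClient(os.environ["DB_CONNECTION"])
db = client.get_default_database()

@app.get("/matches")
async def list_matches(limit: int = 10):
    # Motor returns an async cursor; to_list materializes up to `limit` documents.
    return await db["matches"].find({}, {"_id": 0}).to_list(length=limit)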

Quickstart

First, set the environment variables and create the database, for example using Docker (note that MongoDB's default port is 27017 and the official mongo image uses the MONGO_INITDB_* variables):

export MONGO_DB=rwdb MONGO_PORT=27017 MONGO_USER=mongo MONGO_PASSWORD=mongo
docker run --name mongodb --rm -d -e MONGO_INITDB_ROOT_USERNAME="$MONGO_USER" -e MONGO_INITDB_ROOT_PASSWORD="$MONGO_PASSWORD" -e MONGO_INITDB_DATABASE="$MONGO_DB" mongo
export MONGO_HOST=$(docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' mongodb)
mongo --host=$MONGO_HOST --port=$MONGO_PORT --username=$MONGO_USER --password=$MONGO_PASSWORD --authenticationDatabase admin $MONGO_DB

Then run the following commands to bootstrap your environment with pipenv:

git clone https://github.com/franloza/apiestas
cd apiestas
pipenv install
pipenv shell

Then create a .env file (or rename and modify .env.example) in the api and crawling folders and set the environment variables for each application:

cd api
touch .env
echo DB_CONNECTION=mongodb://$MONGO_USER:$MONGO_PASSWORD@$MONGO_HOST:$MONGO_PORT/$MONGO_DB >> .env

To run the web application in debug mode, use:

python main.py api --reload
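
The Typer wiring behind this command could look roughly like the following sketch (the module path "api.app:app" and the option name are assumptions, not the project's actual main.py):

import typer
import uvicorn

cli = typer.Typer()

@cli.command()
def api(reload: bool = typer.Option(False, "--reload", help="Enable auto-reload.")):
    """Run the Apiestas REST API."""
    uvicorn.run("api.app:app", host="0.0.0.0", port=9000, reload=reload)

if __name__ == "__main__":
    cli()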

Development with Docker

You must have the docker and docker-compose tools installed to work with the material in this section. Then just run:

cd docker
docker-compose up -d

The API will be available at localhost:9000 in your browser.

If you want to enable the surebets calculation feature, you need to use the extended Docker Compose file for the Kafka environment, docker-compose.kafka.yml. However, instead of executing this file directly along with docker-compose.yml, execute run-with-kafka.sh, as it is necessary to set up Kafka Connect and the MongoDB replica set and to wait for the containers to finish initializing.

If you run Apiestas with Kafka and Kafka Connect, you will also get a Kafka UI at http://localhost:9021 or http://localhost:8001/, where you can examine the topics and other information; a minimal consumer sketch follows the topic list below.

  • The matches topic should have the crawled bets and matches.
  • The mongo.apiestas.matches topic should contain the change events.
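
For example, a quick way to peek at the change events from Python is a small consumer like this (a sketch using kafka-python; the broker address localhost:9092 and JSON deserialization are assumptions about the Compose setup, not documented settings):

import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "mongo.apiestas.matches",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for event in consumer:
    # Each message is a change event emitted by Kafka Connect for the matches collection.
    print(event.value)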

You can also examine the collections in MongoDB by executing:

docker-compose exec mongo /usr/bin/mongo

To see the logs of the different services, you can execute the following command:

docker-compose -f docker-compose.yml -f docker-compose.kafka.yml logs -f api surebets crawler

Run tests with Docker

cd docker
docker-compose -f docker-compose-test.yml run tests

Web routes

All routes are available on the /docs and /redoc paths, rendered with Swagger UI and ReDoc respectively.

Docs

img/docs.png

Redoc

img/redoc.png

Data sources

Currently the application implements three working crawlers (a hypothetical spider sketch follows this list):

  • oddsportal.com - used as ground truth for matches and odds
  • elcomparador.com - for odds data
  • Codere - for odds data
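
As a rough illustration of the crawler side, a Scrapy spider yields items scraped from a bookmaker page; the spider name, URL, and selectors below are hypothetical (the real spiders live in the crawling folder):

import scrapy

class OddsSpider(scrapy.Spider):
    # Hypothetical spider: the name, start URL, and selectors are illustrative only.
    name = "odds_example"
    start_urls = ["https://www.example.com/football/matches"]

    def parse(self, response):
        for row in response.css("tr.match"):
            yield {
                "home_team": row.css("td.home::text").get(),
                "away_team": row.css("td.away::text").get(),
                "odds": row.css("td.odds::text").getall(),
            }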

Architecture

img/apiestas_arch.png

TODO

  1. Add support for calculating more bet types
  2. Support time series visualization
