A simple application that crawls the websites of my local theatres and sends me updates about OV (original version) films.

The application scrapes the cinemas' websites at an interval of a few days. It then persists any entries that were not previously scraped in a database and notifies me via email about these new entries.
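The heart of that flow is diffing freshly scraped entries against what is already stored. A minimal sketch of that step (the data shape and function name here are illustrative, not the app's real API):

```clojure
;; Hypothetical sketch: given freshly scraped screenings and the set of ids
;; already stored in the database, keep only the entries that should be
;; persisted and mailed out.
(defn new-entries [scraped known-ids]
  (remove #(contains? known-ids (:id %)) scraped))

(new-entries [{:id 1 :title "Dune"} {:id 2 :title "Alien"}]
             #{1})
;; => ({:id 2, :title "Alien"})
```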
```sh
# Run tests
./bin/kaocha
./bin/kaocha --watch
```
```clojure
(dev)         ; switch to the dev namespace
(dev/restart) ; restart the web server
```
The application is split into two "processes": the crawler and the web app.
The crawler:

- Crawls the cineplex website
- Stores newly found entries in the database
- Is triggered by URL invocation (via an IFTTT webhook)
The web app:

- Blacklists a movie (hides it from the user)
- Lists all upcoming screenings
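The listing side essentially filters blacklisted movies out of the stored screenings. A rough illustration (all names here are made up, not the app's real functions):

```clojure
;; Hypothetical sketch: upcoming screenings minus blacklisted movies,
;; ordered by date.
(defn visible-screenings [screenings blacklisted-movie-ids]
  (->> screenings
       (remove #(contains? blacklisted-movie-ids (:movie-id %)))
       (sort-by :date)))

(visible-screenings [{:movie-id "a" :date "2021-06-02"}
                     {:movie-id "b" :date "2021-06-01"}]
                    #{"a"})
;; => ({:movie-id "b", :date "2021-06-01"})
```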
Overview page with future screenings of OV movies.
Triggers a crawl of the cineplex website. Protected by the passphrase query parameter `?passphrase=$PASSPHRASE`.
Hides a movie from future notifications.
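A crawl can be triggered manually along these lines; note that both values and the `/crawl` path below are placeholders, not the real deployment details:

```shell
# Placeholder values -- substitute the real deployment URL and passphrase.
BASE_URL="https://ov-movies.example.com"
PASSPHRASE="changeme"

# Trigger a crawl; the IFTTT webhook would call the same kind of URL.
curl "$BASE_URL/crawl?passphrase=$PASSPHRASE"
```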
The external services used:

- Postgres

Run `docker-compose up -d` to start them locally for testing.
We have a few helper functions available to create, roll back, and execute migrations from the REPL. Additionally, migrations are executed on application startup.
```clojure
(ov-movies.database/migrate!)
(ov-movies.database/rollback!)
(ov-movies.database/create-migration! "create-my-new-table")
```
| Name | Usage |
|---|---|
| `ENV` | The environment the app is running under: `"dev"` or `"prod"`. |
| `BASE_URL` | HTTPS URL of the application. |
| `PASSPHRASE` | User-supplied passphrase required by the protected endpoints. |
| `DATABASE_URL` | JDBC URL of the Postgres database. |
| `PORT` | Port the application will run on. |
| `MOVIE_DB_API_KEY` | API key for TMDB. |
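These variables might be read along the following lines; this is a sketch only (the function name and the `3000` port default are assumptions), and the app's actual config loading may differ:

```clojure
;; Hypothetical sketch of loading the environment-based configuration.
(defn load-config []
  {:env              (or (System/getenv "ENV") "dev")
   :base-url         (System/getenv "BASE_URL")
   :passphrase       (System/getenv "PASSPHRASE")
   :database-url     (System/getenv "DATABASE_URL")
   ;; assumed fallback port for local development
   :port             (Integer/parseInt (or (System/getenv "PORT") "3000"))
   :movie-db-api-key (System/getenv "MOVIE_DB_API_KEY")})
```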
The application is deployed continuously on every successful master build.
We're using antq to find outdated dependencies.
```sh
# List outdated dependencies
clojure -M:outdated

# Upgrade interactively
clojure -M:outdated --upgrade
```
- Add more cinemas (Gilching, Seefeld, Gauting)
- Monitoring to detect when a scraper breaks