diff --git a/README.md b/README.md
index 3b38c5d..c03a1a1 100644
--- a/README.md
+++ b/README.md
@@ -12,6 +12,7 @@ This project contains different subprojects:
 - **[avail-da](./avail-da/)**: Docker containers to run an Avail DA node locally.
 - **[da-sender](./da-sender/)**: A Rust script to submit pubdata from [ZKSync Era](https://github.com/matter-labs/zksync-era) to the local Avail DA node.
 - **[da-getter](./da-getter/)**: A Rust script to retrieve data from the local Avail DA node.
+- **[tools/web_scraper](./tools/web_scraper/)**: A Python script to automate claiming AVAIL tokens from the AVAIL Faucet website.
 
 ## πŸ›  Requirements
 
@@ -20,6 +21,7 @@ To run this project, you need to have installed:
 - [Docker and Docker Compose](https://www.docker.com/products/docker-desktop).
 - [Deno](https://deno.com).
 - [Rust](https://www.rust-lang.org/tools/install).
+- [Python 3](https://www.python.org/downloads/).
 
 ## πŸ‘¨β€πŸ’» Usage
 
@@ -57,6 +59,16 @@ make send-data
 ```
 This command uses the da-sender script to submit the pubdata to your locally running Avail DA node.
 
+### πŸ•ΈοΈ Web Scraper
+
+To set up the web scraper, navigate to the `tools/web_scraper` directory and run:
+
+```sh
+make web-scraper
+```
+
+This command creates a virtual environment, installs dependencies, and runs the web scraper.
+
 ## 🧞 Commands
 
 All commands are run from the root of the project, from a terminal:
@@ -74,3 +86,9 @@
 | `make format` | Formats the code in `da-sender`, `da-getter`, and `deno` directories |
 | `make validium` | Runs the Deno script that retrieves pubdata from `ZKSync Era`, sends it and verifies it to Avail |
 | `make validium-test` | Runs Deno tests |
+| `make web-scraper` | Sets up the web scraper virtual environment, installs dependencies, and runs the scraper |
+| `make web-scraper-install` | Installs dependencies for the web scraper |
+| `make web-scraper-run` | Runs the web scraper to claim AVAIL tokens from the AVAIL Faucet website |
+| `make web-scraper-venv` | Sets up the virtual environment for the web scraper |
+| `make web-scraper-clean` | Cleans up the web scraper virtual environment and removes generated files |
+| `make web-scraper-clean-logs` | Cleans up the logs generated by the web scraper |
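For context on the `web-scraper` target documented above: it wraps a standard Python virtual-environment workflow (create the venv, install dependencies, run the scraper). Below is a minimal sketch of the equivalent manual steps, assuming a conventional `requirements.txt` and a hypothetical `main.py` entry point under `tools/web_scraper/`; the actual file names and venv path are defined by the scraper's Makefile, not by this diff.

```sh
# Rough manual equivalent of `make web-scraper` (sketch only; the venv path,
# requirements.txt, and main.py are assumptions, not taken from this diff).
cd tools/web_scraper              # the scraper lives in its own subdirectory
python3 -m venv .venv             # create an isolated virtual environment
. .venv/bin/activate              # activate it in the current shell
pip install -r requirements.txt   # install the scraper's Python dependencies
python3 main.py                   # run the scraper against the AVAIL Faucet
```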