This project automates the scraping of stock market data every 30 minutes during NYSE hours and sends a daily summary email after market close.
## Scraper workflow

- Runs every 30 minutes from 9:30 AM to 4:00 PM EST (13:00–21:30 UTC)
- Scrapes stock data using `main.py`
- Saves output `.csv` files to the repository's `data-store` branch using GitHub Actions
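A minimal sketch of what `scraper.yml` might look like, assuming the schedule above; the step names, action versions, and git commands are illustrative, not the repository's actual file:

```yaml
name: scraper

on:
  schedule:
    # Every 30 minutes in the 13:00–21:30 UTC window, weekdays only
    - cron: "0,30 13-21 * * 1-5"
  workflow_dispatch:

jobs:
  scrape:
    runs-on: ubuntu-latest
    permissions:
      contents: write  # lets GITHUB_TOKEN push to data-store
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: pip install selenium
      - run: python main.py  # writes a stock_data_*.csv file
      - name: Push CSVs to data-store
        run: |
          git config user.name "github-actions"
          git config user.email "actions@users.noreply.github.com"
          git fetch origin data-store
          git checkout data-store
          git add *.csv
          git commit -m "Add scraped data" || echo "Nothing to commit"
          git push origin data-store
```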
## Email workflow

- Runs at 4:30 PM EST (22:00 UTC)
- Checks out the `data-store` branch
- Retrieves `send_and_clean.py` from the `main` branch
- Sends the collected `.csv` files as email attachments
- Cleans up the `data-store` branch by deleting the `.csv` files
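A sketch of how the attachment step of `send_and_clean.py` could work using only the standard library; the function names and SMTP details are assumptions, not the actual script:

```python
import smtplib
from email.message import EmailMessage
from pathlib import Path


def build_summary_email(sender: str, recipient: str, csv_dir: str = ".") -> EmailMessage:
    """Build the daily summary email with every .csv file in csv_dir attached."""
    msg = EmailMessage()
    msg["Subject"] = "Daily stock data summary"
    msg["From"] = sender
    msg["To"] = recipient
    msg.set_content("Attached are today's scraped stock data files.")
    for path in sorted(Path(csv_dir).glob("*.csv")):
        msg.add_attachment(
            path.read_bytes(),
            maintype="text",
            subtype="csv",
            filename=path.name,
        )
    return msg


def send_via_gmail(msg: EmailMessage, password: str) -> None:
    """Send the message through Gmail SMTP over SSL (needs an app password)."""
    with smtplib.SMTP_SSL("smtp.gmail.com", 465) as smtp:
        smtp.login(msg["From"], password)
        smtp.send_message(msg)
```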
## Branches

- `main`: Source code and workflow definitions
- `data-store`: Stores temporary `.csv` files collected during market hours
## Workflow files

Two workflows are defined in `.github/workflows/`:

- `scraper.yml`: Periodic scraping and CSV upload
- `send_email.yml`: Email dispatch and cleanup after close
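A sketch of what `send_email.yml` might contain, pieced together from the steps described above; the cron line, action versions, and the way the script is fetched from `main` are assumptions:

```yaml
name: send-email

on:
  schedule:
    - cron: "0 22 * * 1-5"  # 22:00 UTC, weekdays

jobs:
  email:
    runs-on: ubuntu-latest
    permissions:
      contents: write  # cleanup commits back to data-store
    steps:
      - uses: actions/checkout@v4
        with:
          ref: data-store
      - name: Fetch send_and_clean.py from main
        run: |
          git fetch origin main
          git checkout FETCH_HEAD -- send_and_clean.py
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - name: Send summary email and clean up
        run: python send_and_clean.py
        env:
          PP: ${{ secrets.PP }}
          SMAIL: ${{ secrets.SMAIL }}
          RMAIL: ${{ secrets.RMAIL }}
```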
## Secrets

The following secrets must be configured in the repository:

- `PP`: Password or app password for SMTP email
- `SMAIL`: Sender email address
- `RMAIL`: Recipient email address
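Secrets like these are typically passed to the script as environment variables by the workflow; a minimal sketch of reading them, assuming the variable names match the secret names (the actual mapping in the scripts may differ):

```python
import os


def load_email_config() -> dict:
    """Read SMTP credentials from environment variables set by the workflow."""
    return {
        "password": os.environ["PP"],      # SMTP / app password
        "sender": os.environ["SMAIL"],     # sender address
        "recipient": os.environ["RMAIL"],  # recipient address
    }
```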
## Tech stack

- Python 3.12
- Selenium
- GitHub Actions
- Gmail SMTP
## File naming

Scraped files are named like:

`stock_data_29-06-25_04-45-42_AM.csv`

and are emailed as daily attachments after the market closes.
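The example suggests a `DD-MM-YY_HH-MM-SS_AM/PM` timestamp; a sketch of how such a name could be generated, assuming that format (the exact format string used by `main.py` is not shown in this README):

```python
from datetime import datetime


def make_filename(now=None):
    """Build a timestamped CSV name like stock_data_29-06-25_04-45-42_AM.csv."""
    now = now or datetime.now()
    # %d-%m-%y = day-month-year, %I = 12-hour clock, %p = AM/PM
    return now.strftime("stock_data_%d-%m-%y_%I-%M-%S_%p.csv")
```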
## Notes

- All automation is handled via GitHub Actions (no server needed)
- The built-in `GITHUB_TOKEN` is used to authenticate pushes to `data-store`
- Be sure to keep the email secrets secure
## License

MIT License