
Free Web Scraping: Run Scrapers on GitHub Actions (No Server, No Cost)
I used to pay $5/month for a server to run my scrapers. Then I realized GitHub Actions gives me 2,000 free minutes per month. That's enough to run a scraper every hour, 24/7, for $0.

The Setup (5 Minutes)

You need:

- A GitHub repo
- A Python script
- A YAML config file

That's it.

Step 1: The Scraper

```python
# scraper.py
import json
import os
from datetime import datetime

import requests


def scrape():
    # Example: track Bitcoin and Ethereum prices via the CoinGecko API
    data = requests.get(
        "https://api.coingecko.com/api/v3/simple/price",
        params={"ids": "bitcoin,ethereum", "vs_currencies": "usd"},
    ).json()

    entry = {
        "timestamp": datetime.utcnow().isoformat(),
        "bitcoin": data["bitcoin"]["usd"],
        "ethereum": data["ethereum"]["usd"],
    }

    os.makedirs("data", exist_ok=True)

    # Load the existing history, starting fresh if the file is missing or corrupt
    try:
        with open("data/prices.json") as f:
            history = json.load(f)
    except (FileNotFoundError, json.JSONDecodeError):
        history = []

    history.append(entry)
    with open("data/prices.json", "w") as f:
        json.dump(history, f, indent=2)


if __name__ == "__main__":
    scrape()
```
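The YAML config file mentioned above is a GitHub Actions workflow that runs the scraper on a schedule and commits the results back to the repo. A minimal sketch of what that could look like — the filename `scrape.yml`, the hourly cron, the dependency install, and the commit step are my assumptions, since the article is cut off before its later steps:

```yaml
# .github/workflows/scrape.yml (hypothetical filename)
name: Scrape prices

on:
  schedule:
    - cron: "0 * * * *"   # every hour (GitHub may delay scheduled runs)
  workflow_dispatch:       # allow manual runs from the Actions tab

permissions:
  contents: write          # lets the workflow push the updated data file

jobs:
  scrape:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"

      - run: pip install requests

      - run: python scraper.py

      # Commit the updated data file back to the repo, skipping
      # the commit when nothing changed
      - run: |
          git config user.name "github-actions[bot]"
          git config user.email "github-actions[bot]@users.noreply.github.com"
          git add data/prices.json
          git diff --cached --quiet || git commit -m "Update prices"
          git push
```

Each hourly run takes well under a minute, so even at 24 runs a day this stays far inside the 2,000-minute free tier.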




