
I Built a Bot That Keeps My Resume Always Up to Date on GitHub
I use Overleaf to write my resume in LaTeX. Every time I made an edit, I had to manually compile it, download the PDF, and push it to my GitHub repo so my portfolio website could link to it. After doing this one too many times, I decided to automate the whole thing. Here's the full story - the scraper, the GitHub Actions workflow, the bugs I hit, and how I eventually wired it to my portfolio site.

The Problem

My portfolio at nakuldev.vercel.app links directly to my resume PDF. For that link to always point to the latest version, I'd have to:

- Open Overleaf, compile, download
- Replace the old PDF in my repo
- Commit and push

Boring. Repetitive. Easy to forget. So I automated it.

The Plan

1. Write a Node.js script that opens my Overleaf share link in a headless browser
2. Find the PDF download link in the DOM
3. Download the PDF and save it locally
4. Run this on a schedule via GitHub Actions
5. Auto-commit and push the new PDF back to the repo

Step 1 - Scraping Overleaf with Playwright

Overleaf is a React
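The scheduling and auto-commit steps of the plan might look something like the workflow below. The cron expression, script name (`scrape.js`), secret name, and commit message are placeholders, not the author's actual configuration.

```yaml
name: Update resume
on:
  schedule:
    - cron: "0 6 * * *"   # placeholder: once a day
  workflow_dispatch:        # also allow manual runs

jobs:
  update:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm ci
      - run: node scrape.js   # hypothetical script name
        env:
          SHARE_URL: ${{ secrets.OVERLEAF_SHARE_URL }}
      - name: Commit and push if the PDF changed
        run: |
          git config user.name "github-actions[bot]"
          git config user.email "github-actions[bot]@users.noreply.github.com"
          git add resume.pdf
          git diff --cached --quiet || git commit -m "chore: update resume PDF"
          git push
```

The `git diff --cached --quiet ||` guard skips the commit when the downloaded PDF is byte-identical to the one already in the repo, so scheduled runs don't produce empty commits.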
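The scraping part of the plan could be sketched roughly like this. This is a sketch, not the author's actual script: the function name, the `aria-label` selector, and the share-link behavior are all assumptions, and Overleaf's DOM changes over time, so the selector would need to be checked against the live page.

```javascript
// Sketch of the scraper step, assuming Playwright is installed (npm i playwright).
async function downloadResume(shareUrl, outPath) {
  const { chromium } = require('playwright');
  const browser = await chromium.launch(); // headless by default
  try {
    const page = await browser.newPage();
    await page.goto(shareUrl, { waitUntil: 'networkidle' });
    // Start waiting for the download before clicking, so the event isn't missed.
    const [download] = await Promise.all([
      page.waitForEvent('download'),
      page.click('a[aria-label="Download PDF"]'), // assumed selector -- inspect the real DOM
    ]);
    await download.saveAs(outPath);
  } finally {
    await browser.close();
  }
}

module.exports = { downloadResume };
```

A caller would then do something like `downloadResume(process.env.SHARE_URL, 'resume.pdf')`, keeping the share link out of the source.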
Continue reading on Dev.to Webdev


