
# Docker said build succeeded. Image was 4GB.
Built a scraper container last week. Docker said the build succeeded. Pushed it to the registry, then tried pulling it on the server: a 4.2GB download started. My internet peaked at 2MB/s. That's 35 minutes of waiting on every single deploy.

## The dumb mistake

I threw everything into the Dockerfile:

```dockerfile
FROM python:3.11
COPY . /app
WORKDIR /app
RUN pip install -r requirements.txt
CMD ["python", "scraper.py"]
```

Simple, right? It worked locally. The build succeeded. My scraper was containerized.

Except `python:3.11` is the full Debian image: a 1GB base that includes compilers, build tools, and stuff I never touched.

Then I copied my entire project folder. That included old test data (400MB), scraped results from local runs (800MB), node_modules from when I tested a JS library once (600MB), and my venv folder (200MB). `COPY .` grabbed everything. Docker doesn't ignore files the way git does.

Then pip installed requests, beautifulsoup4, and selenium. Selenium pulled in chromium. Another 500MB. Fun times.

## Fixing it

Switched to `python:3.11-slim` first. Sam
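The `COPY .` problem is usually handled with a `.dockerignore` file, which excludes paths from the build context before `COPY` ever sees them. A sketch matching the junk folders mentioned above; the exact directory names (`test_data/`, `results/`) are illustrative, not from the original project:

```
# .dockerignore — these never enter the build context
venv/
node_modules/
.git/
__pycache__/
*.pyc
# local artifacts; rename to match your project
test_data/
results/
```

Note that `.dockerignore` lives next to the Dockerfile and uses its own glob syntax, which is similar to but not identical to `.gitignore`.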
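The post cuts off right after the switch to the slim base, so here is a hedged sketch of where the fix was presumably headed. The `--no-cache-dir` flag and the copy-requirements-first layer ordering are my additions, not necessarily the author's:

```dockerfile
# slim base: Debian without compilers or build tools (~120MB vs ~1GB)
FROM python:3.11-slim
WORKDIR /app
# copy only the dependency list first, so this layer stays cached
# when the application code changes
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
# now copy the code (with a .dockerignore keeping local junk out)
COPY . .
CMD ["python", "scraper.py"]
```

To see which layers are actually eating the space, `docker history <image>` lists each layer with its size.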
*Continue reading on Dev.to*



