
# How I Built an AI Pipeline That Converts Floor Plans to 3D Printable Models
## The Problem

Turning a 2D floor plan into a 3D model is tedious. Architects spend hours tracing walls in SketchUp or Blender, and non-technical users have no viable path at all. I wanted to automate the entire pipeline: upload a floor plan image, get a 3D model back.

## The Stack

- **Mobile:** Flutter (iOS; Android coming soon)
- **API:** Flask monolith on a single GCE VM
- **3D Pipeline:** Blender (headless, via bpy)
- **Database:** PostgreSQL + SQLAlchemy
- **Auth:** Firebase Admin SDK
- **Storage:** Google Cloud Storage + Cloudflare CDN
- **Queue:** Redis-backed worker pool

## How the AI Detection Works

The floor plan image goes through a detection pipeline that identifies:

- Wall segments (position, thickness, orientation)
- Door openings (location, swing direction)
- Window openings (location, sill height, header height)
- Room boundaries (polygons + room type classification)

The output is structured JSON that drives the Blender pipeline.

## The Blender Pipeline

Blender runs headless on the server. The JSON layout drives a Python script.
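To make the hand-off between detection and Blender concrete, here is a minimal sketch of what that structured JSON might look like. The field names (`walls`, `doors`, `windows`, `rooms`, `swing`, `sill`, `header`) are illustrative assumptions; the article does not publish the exact schema.

```python
import json

# Hypothetical detection output: two walls, one door, one window, one room.
# Coordinates are in meters; field names are assumptions, not the real schema.
DETECTION_JSON = """
{
  "walls": [
    {"start": [0.0, 0.0], "end": [5.0, 0.0], "thickness": 0.2},
    {"start": [5.0, 0.0], "end": [5.0, 4.0], "thickness": 0.2}
  ],
  "doors": [
    {"wall": 0, "offset": 1.0, "width": 0.9, "swing": "inward"}
  ],
  "windows": [
    {"wall": 1, "offset": 1.5, "width": 1.2, "sill": 0.9, "header": 2.1}
  ],
  "rooms": [
    {"type": "kitchen", "polygon": [[0, 0], [5, 0], [5, 4], [0, 4]]}
  ]
}
"""

layout = json.loads(DETECTION_JSON)
print(len(layout["walls"]), layout["rooms"][0]["type"])
```

Keeping every detected feature in one flat JSON document means the detection service and the Blender worker can evolve independently, as long as both sides agree on this contract.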
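The geometry step the headless script has to perform can be sketched without Blender at all: each 2D wall segment becomes a cuboid defined by a center, dimensions, and a rotation about the Z axis. The helper below is an assumed illustration (the function name and the 2.7 m default ceiling height are my own, not from the article).

```python
import math

def wall_to_box(start, end, thickness, height=2.7):
    """Convert a 2D wall segment into cuboid placement parameters.

    Returns (center, dimensions, z_rotation) suitable for placing a
    scaled cube in a 3D scene. `height` is an assumed ceiling height.
    """
    (x0, y0), (x1, y1) = start, end
    length = math.hypot(x1 - x0, y1 - y0)          # wall length along its axis
    center = ((x0 + x1) / 2, (y0 + y1) / 2, height / 2)
    angle = math.atan2(y1 - y0, x1 - x0)           # yaw around the Z axis
    dims = (length, thickness, height)
    return center, dims, angle

center, dims, angle = wall_to_box((0.0, 0.0), (5.0, 0.0), 0.2)
print(center, dims, angle)  # (2.5, 0.0, 1.35) (5.0, 0.2, 2.7) 0.0
```

Inside Blender, each result would then drive something like `bpy.ops.mesh.primitive_cube_add(...)` followed by setting the object's scale and rotation, but the transform math above is the part that is independent of the bpy API.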
*Continue reading on Dev.to.*

