
Visualizing Pedestrian Trajectories on a Map with MapLibre
Note: I use AI assistance to draft and polish the English, but the analysis, interpretation, and core ideas are my own. Learning to write technical English is itself part of this project.

Motivation

When I stood alone in protest at stations along the Yamanote Line, I watched people walk past me. Some glanced at my sign and looked away with a frown. Others gave a small nod. But I had no way to measure how many people actually reacted to my message, or how they reacted. That question is what started this project.

I wanted to capture not just whether people reacted, but how their movement changed: did they slow down, step aside, or adjust their path? To do that, I needed to track pedestrian trajectories from video and place them on a real map. I filmed at several stations in Tokyo using a smartphone and built the pipeline from open-source tools: YOLOX for detection, ByteTrack for tracking, and MapLibre for visualization.

Introduction

In the previous article, I extracted pedestrian trajectories from
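As a rough illustration of the visualization step, a tracked pedestrian's path can be encoded as a GeoJSON LineString and drawn with MapLibre GL JS. The sketch below is a minimal assumption of my own, not the article's actual code: the track data shape, coordinates, and layer styling are all illustrative.

```javascript
// Hedged sketch: turn one pedestrian track into a GeoJSON Feature.
// `points` is an array of [longitude, latitude] pairs in visit order
// (assumed to come from a pixel-to-map projection step, not shown here).
function trackToFeature(trackId, points) {
  return {
    type: "Feature",
    properties: { trackId },
    geometry: { type: "LineString", coordinates: points },
  };
}

// Example trajectory near a Tokyo station (coordinates are made up).
const feature = trackToFeature(7, [
  [139.7004, 35.6585],
  [139.7006, 35.6586],
  [139.7009, 35.6588],
]);

// In the browser, the feature could then be added to a MapLibre map
// roughly like this (requires maplibre-gl and an HTML container):
//
// const map = new maplibregl.Map({
//   container: "map",
//   style: styleUrl, // hypothetical style URL
//   center: [139.7004, 35.6585],
//   zoom: 17,
// });
// map.on("load", () => {
//   map.addSource("tracks", {
//     type: "geojson",
//     data: { type: "FeatureCollection", features: [feature] },
//   });
//   map.addLayer({
//     id: "track-lines",
//     type: "line",
//     source: "tracks",
//     paint: { "line-width": 2 },
//   });
// });
```

Keeping the feature-building step as a pure function makes it easy to test offline, independently of any map rendering.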
Continue reading on Dev.to



