
How to Train a Model in TensorFlow.js on Massive Datasets (Out-of-Core Learning)
You wake up one day and decide: "I'm going to create a machine learning model." On day one, you have a JSON file with 100 entries and 100 WebP images to train a Convolutional Neural Network (CNN) for detection. You read the JSON, load the 100 images, put everything in memory (which consumes about 100 MB), and everything works perfectly.

After a week, you've taken a lot of photos and your data has grown tenfold. Now, with 1,000 entries, your model performs much better. But then you glance at your task manager: your Node.js training process is already consuming 1 GB. "Not a problem," you think, "my computer has 16 GB of RAM," and you carry on happily with your model. However... how much can the memory take?

After a month of collecting images and filling out your JSON, you end up with a 10 GB .zip file of photos and thousands of entries. The modus operandi is the same: you extract the files to the usual folder and run the training script. Kaboom!!! Your computer freezes.
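The way out of this trap is to stop loading the whole dataset up front and instead stream it in small batches, which is what TensorFlow.js's `tf.data.generator` and `model.fitDataset` exist for. Below is a minimal sketch of that batching pattern in plain Node.js; the file names and batch size are hypothetical, and in real training code the generator body would read each image from disk on demand and the generator would be wrapped with `tf.data.generator(...)`.

```javascript
// Sketch of out-of-core batching: instead of reading the whole
// dataset into memory, yield one small batch at a time.
// In TensorFlow.js this generator would be wrapped with
// tf.data.generator(...) and passed to model.fitDataset(...).
function* batchGenerator(entries, batchSize) {
  for (let i = 0; i < entries.length; i += batchSize) {
    // Only `batchSize` items are resident in memory at once;
    // in real code each entry's image would be loaded from disk here.
    yield entries.slice(i, i + batchSize);
  }
}

// Hypothetical file list standing in for the JSON entries.
const entries = Array.from({ length: 10 }, (_, i) => `img_${i}.webp`);
const batches = [...batchGenerator(entries, 4)];

console.log(batches.length); // 3 (batches of 4, 4, and 2 items)
console.log(batches[2]);     // [ 'img_8.webp', 'img_9.webp' ]
```

Because the generator is lazy, memory usage stays proportional to one batch rather than to the whole dataset, no matter how large the .zip grows.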



