Image Classification with Teachable Machine, ml5.js and p5.js

Teachable Machine is a cool new and easy way to create machine learning models for your sites, apps, and more, with zero code. As of this writing, there are three types of projects you can create.

For this tutorial we will use the ‘Image Project’, but you can select any other project as well. We are going to recognize hand gestures using image classification.

From this interface you can either upload a set of images that belong to a specific class or record them using your webcam. I will go with the webcam and record 10 seconds of images of the sign language equivalent of the letter ‘a’ [source].

Notice that I have changed the class label from Class 1 to A. Make sure to use a plain background like the white one I have. Repeat the process for all of your classes. In my case I’m going to do this for the letters B and C as well.

Once you have all the classes recorded and the class labels renamed, go ahead and click on Train Model.

Note that you can change some settings by clicking on Advanced if you know what you’re doing; otherwise you don’t have to touch anything. Also keep in mind that the more images you have, the better the accuracy of the final model will be.

Once the training is complete, you can check whether you get the correct output using the following interface. If you are not happy with the outcome, you can train the model again.

If you are okay with the output, click on Export Model.

On the Tensorflow.js tab (because ml5.js is built on top of TensorFlow.js), select the Download radio button and click on Download my model. This will download a zip file to your local PC.

Next, we will use a Node development server (with Express) to run our code (install Node and Git if you haven’t already). Clone this repo using,

git clone

Go inside the p5js-server folder and run npm i using a command prompt or terminal. Then execute node server.js to start the server. Your file structure should look like this.

Create a folder named model inside the public folder and copy the files from the previously downloaded zip file (metadata.json, model.json, weights.bin) into that model folder.
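After copying, the project layout should look roughly like this (index.html and the exact tree are assumptions based on the repo description; the model file names come from the downloaded zip):

```
p5js-server/
├── server.js
├── package.json
└── public/
    ├── index.html
    ├── sketch.js
    └── model/
        ├── model.json
        ├── metadata.json
        └── weights.bin
```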

Open up sketch.js in any code editor and replace its contents with the following code snippet.
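Assembled from the step-by-step pieces explained below, the full sketch looks roughly like this (the modelURL value, the ‘waiting...’ placeholder, and the video.hide() call are my assumptions):

```javascript
// Classify webcam frames against the Teachable Machine model with ml5.js + p5.js.
let classifier;               // the ml5 image classifier
let video;                    // the p5 webcam capture
let label = "waiting...";     // latest predicted class label

const modelURL = "./model/";  // folder holding model.json, metadata.json, weights.bin

// STEP 1: Load the model!
function preload() {
  classifier = ml5.imageClassifier(modelURL + "model.json");
}

function setup() {
  createCanvas(640, 520);
  // Create the video
  video = createCapture(VIDEO);
  video.hide(); // draw the feed on the canvas instead of a second DOM element
  // STEP 2.1: Start classifying
  classifyVideo();
}

// STEP 2.2: Classify!
function classifyVideo() {
  classifier.classify(video, gotResults);
}

function draw() {
  background(0);
  // Draw the video
  image(video, 0, 0);
  // STEP 4: Draw the label
  fill(255);
  textAlign(CENTER, CENTER);
  text(label, width / 2, height - 16);
}

// STEP 3: Get the classification!
function gotResults(error, results) {
  // Something went wrong!
  if (error) {
    console.error(error);
    return;
  }
  // Store the label and classify again!
  label = results[0].label;
  classifyVideo();
}
```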

Once that is done, navigate to http://localhost:3000 (make sure you have started the server using the command node server.js).

This will result in the following output.

Let me explain the code.

// STEP 1: Load the model!
function preload() {
  classifier = ml5.imageClassifier(modelURL + 'model.json');
}

The p5 preload() function handles asynchronous loading of external files in a blocking way. In our case, we use it together with ml5’s imageClassifier() to load the model before the sketch starts.

function setup() {
  createCanvas(640, 520);
  // Create the video
  video = createCapture(VIDEO);
  video.hide(); // draw the feed on the canvas instead of a second DOM element
  // STEP 2.1: Start classifying
  classifyVideo();
}

The setup() function is called once when the program starts. It’s used to define initial environment properties such as screen size and background color and to load media such as images and fonts as the program starts.

So in our case we create a canvas using createCanvas(), grab the webcam feed using createCapture(), and call classifyVideo() to start classifying.

// STEP 2.2: Classify!
function classifyVideo() {
  classifier.classify(video, gotResults);
}

Here we use the .classify(input, ?callback) method of the ml5 imageClassifier() to pass in the video and receive the results in the gotResults callback.

function draw() {
  background(0);
  // Draw the video
  image(video, 0, 0);
  // STEP 4: Draw the label
  fill(255);
  textAlign(CENTER, CENTER);
  text(label, width / 2, height - 16);
}

Called directly after setup(), the p5 draw() function continuously executes the lines of code contained inside its block until the program is stopped or noLoop() is called.

Here we set the background to black, show the video input using image(), and draw a label to display the predicted class.

// STEP 3: Get the classification!
function gotResults(error, results) {
  // Something went wrong!
  if (error) {
    console.error(error);
    return;
  }
  // Store the label and classify again!
  label = results[0].label;
  classifyVideo();
}

This is the implementation of the callback function passed to classifier.classify() in classifyVideo().

Here, if you add something like console.log(results); and open up the browser console, you will see an array with the following structure printed.
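The printed array looks something like this (the confidence values here are made up for illustration):

```javascript
// Example console.log(results) output from the ml5 image classifier;
// each entry pairs a class label with a confidence score.
const results = [
  { label: "C", confidence: 0.9812 },
  { label: "A", confidence: 0.0151 },
  { label: "B", confidence: 0.0037 },
];
```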

This array is ordered by confidence level; label C is the first element because it has the highest confidence. That is why we take the first element and display its label in gotResults().

label = results[0].label;

I hope you now have a basic understanding of how to use Teachable Machine, p5.js, and ml5.js together.

Read some more about creative programming with p5.js by following my audio visualization tutorial.

This project is on GitHub —

Reference —


