TensorFlow.js is an open-source JavaScript library for machine learning that enables developers to train and deploy ML models in the browser or in Node.js.
In this post, we'll learn how to use TensorFlow.js to deploy a pre-trained ML model in a Node.js application. We'll also learn how to use the Node.js API to run inference on the model.
Before getting started, you'll need the following: a recent version of Node.js and npm installed on your machine.
First, we need to install the TensorFlow.js library. We can do this using the Node Package Manager (npm):
npm install @tensorflow/tfjs
(For better performance in Node.js, you can install @tensorflow/tfjs-node instead, which uses native TensorFlow bindings.)
Next, we need to download the pre-trained ML model. For this example, we'll use the MobileNet image classification model.
Once the model is downloaded, we can deploy it in our Node.js application.
To deploy the model, we need to create a new file called index.js. In this file, we'll use the tf.loadLayersModel() function (called tf.loadModel() in older releases) to load the pre-trained model:
const tf = require('@tensorflow/tfjs');

// Load the model.
tf.loadLayersModel('https://storage.googleapis.com/tfjs-models/tfjs/mobilenet_v1_0.25_224/model.json')
  .then(model => {
    // Use the model.
  });
Once the model is loaded, we can use it to classify an image. First, we need to load the image's pixel data into a tf.Tensor:
const tf = require('@tensorflow/tfjs');

// Load the model.
tf.loadLayersModel('https://storage.googleapis.com/tfjs-models/tfjs/mobilenet_v1_0.25_224/model.json')
  .then(model => {
    // imageData is a Float32Array (or plain number array) holding
    // 224 * 224 * 3 RGB pixel values.
    const img = tf.tensor3d(imageData, [224, 224, 3]);
  });
Next, we need to pre-process the image. MobileNet expects pixel values normalized to the range [-1, 1] and a leading batch dimension; TensorFlow.js has no mobilenet.preprocessInput() helper, so we do this directly with tensor ops:
const tf = require('@tensorflow/tfjs');

// Load the model.
tf.loadLayersModel('https://storage.googleapis.com/tfjs-models/tfjs/mobilenet_v1_0.25_224/model.json')
  .then(model => {
    const img = tf.tensor3d(imageData, [224, 224, 3]);
    // Pre-process the image: map [0, 255] pixel values to [-1, 1] and
    // add a batch dimension, giving a [1, 224, 224, 3] tensor.
    const preprocessedImg = img.div(127.5).sub(1).expandDims(0);
  });
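The div(127.5).sub(1) chain implements MobileNet's standard input normalization, mapping 8-bit pixel values from [0, 255] to [-1, 1]. In plain JavaScript, the same per-pixel math looks like this (a sketch for illustration only; in the real code the tensor ops apply it across the whole image at once):

```javascript
// MobileNet input normalization: map a [0, 255] pixel value to [-1, 1].
function preprocessPixel(p) {
  return p / 127.5 - 1;
}

console.log(preprocessPixel(0));     // -1 (black)
console.log(preprocessPixel(127.5)); // 0  (mid-gray)
console.log(preprocessPixel(255));   // 1  (white)
```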
Finally, we can use the model.predict() function to run inference on the image. Note that predict() returns a tf.Tensor synchronously; to read the raw values out of it, we call its asynchronous data() method:
const tf = require('@tensorflow/tfjs');

// Load the model.
tf.loadLayersModel('https://storage.googleapis.com/tfjs-models/tfjs/mobilenet_v1_0.25_224/model.json')
  .then(model => {
    const img = tf.tensor3d(imageData, [224, 224, 3]);
    // Pre-process the image.
    const preprocessedImg = img.div(127.5).sub(1).expandDims(0);
    // Run inference. predict() returns a tensor of class probabilities.
    const predictions = model.predict(preprocessedImg);
    predictions.data().then(values => {
      // Use the values.
    });
  });
The values array contains the result of the inference: one probability per class.
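To turn those probabilities into a classification, a common next step is to find the index of the highest value and look it up in the model's label list. A minimal sketch (argmax here is a small hypothetical helper; TensorFlow.js also offers a tf.argMax() op that does this on the tensor directly):

```javascript
// Return the index of the largest value in an array of class probabilities.
function argmax(values) {
  let best = 0;
  for (let i = 1; i < values.length; i++) {
    if (values[i] > values[best]) best = i;
  }
  return best;
}

// With a 3-class toy output, class 1 has the highest probability.
console.log(argmax([0.1, 0.7, 0.2])); // 1
```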
In this post, we learned how to use TensorFlow.js to deploy a pre-trained ML model in a Node.js application. We also learned how to use the Node.js API to run inference on the model.