9

I have converted a Keras model to TensorFlow.js JSON format and saved it locally on my computer. I am trying to load that JSON model in JavaScript code with the following command:

model = await tf.loadModel('web_model')

But the model is not getting loaded. Is there a way to load tensorflow json model from local file system?

user2693313
  • I guess you are not serving from a dev server? Using your browser to just open the HTML file will result in problems with the XHR requests used to fetch the file. Maybe try out https://www.npmjs.com/package/http-server – Max Dec 05 '18 at 20:05
  • I have just started exploring TensorFlow.js and am using my browser to test things – user2693313 Dec 05 '18 at 20:06

9 Answers

18

I know you're trying to load your model in a browser but if anybody lands here that's trying to do it in Node, here's how:

const tf = require("@tensorflow/tfjs");
const tfn = require("@tensorflow/tfjs-node");
// fileSystem() returns an IO handler that reads the model from disk
const handler = tfn.io.fileSystem("./path/to/your/model.json");
const model = await tf.loadModel(handler);
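Note: in newer tfjs versions (1.x and later) the method was renamed, so the last line becomes const model = await tf.loadLayersModel(handler);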
jafaircl
9

loadModel uses fetch under the hood, and fetch cannot access local files directly; it is meant to retrieve files served over HTTP. More on this here. To load a local file in the browser, there are two approaches: asking the user to upload the file with

<input type="file"/>

or serving the file from a server.

For both scenarios, tf.js provides a way to load the model.

  1. Load the model by asking the user to upload the file

html

<input type="file" id="upload-json"/>
<input type="file" id="upload-weights"/>

js

const uploadJSONInput = document.getElementById('upload-json');
const uploadWeightsInput = document.getElementById('upload-weights');
// browserFiles expects the model.json first, then the weight file(s)
const model = await tf.loadModel(tf.io.browserFiles(
  [uploadJSONInput.files[0], uploadWeightsInput.files[0]]));
  2. Serving the local files using a server

To do so, one can use the npm module http-server to serve the directory containing both the weights and the model. It can be installed with the following command:

 npm install http-server -g

Inside the directory, one can run the following command to launch the server:

http-server -c1 --cors .
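(The -c1 flag sets the cache time to one second, effectively disabling caching, and --cors sends the Access-Control-Allow-Origin header so that the files can be fetched from another origin.)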

Now the model can be loaded:

 // load model in js script
 (async () => {
   ...
   // for the Keras-converted JSON model from the question, the layers
   // loader works the same way over HTTP:
   // const model = await tf.loadModel('http://localhost:8080/model.json')
   const model = await tf.loadFrozenModel('http://localhost:8080/model.pb', 'http://localhost:8080/weights.json')
 })()
edkeveked
  • Thanks a lot for the help. But I am getting the below error while trying to load the files in the model: Uncaught (in promise) TypeError: Failed to execute 'readAsText' on 'FileReader': parameter 1 is not of type 'Blob'. – user2693313 Dec 11 '18 at 07:48
  • Maybe you have to read the file after it has been loaded, using an event listener – edkeveked Dec 14 '18 at 20:44
2

const tf = require('@tensorflow/tfjs');
const tfnode = require('@tensorflow/tfjs-node');

async function loadModel(){
    // the fileSystem handler reads the model directly from disk
    const handler = tfnode.io.fileSystem('tfjs_model/model.json');
    const model = await tf.loadLayersModel(handler);
    console.log("Model loaded");
}

loadModel();

This worked for me in Node. Thanks to jafaircl.

1

You could try:

const model = await tf.models.modelFromJSON(myModelJSON)

Here it is in the tensorflow.org docs
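A hedged sketch of how that might be used — the URL is hypothetical, and note that modelFromJSON restores the model architecture only, so the weights have to be loaded separately:

// fetch the converted model.json yourself, then hand the parsed object
// to modelFromJSON (topology only; weights are not loaded this way)
const res = await fetch('http://localhost:8080/model.json'); // hypothetical URL
const myModelJSON = await res.json();
const model = await tf.models.modelFromJSON(myModelJSON);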

0

Check out our documentation for loading models: https://js.tensorflow.org/api/latest/#Models-Loading

You can use tf.loadModel, which takes a string URL to your model definition; that definition needs to be served over HTTP. This means you need to start an http-server to serve those files (the browser will not allow you to make a request to your filesystem because of CORS).

This package can do that for you: npmjs.com/package/http-server
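For example, once http-server is running in the directory that holds the converted files, a minimal load could look like this (the port and file name assume the defaults):

// assumes `http-server -c1 --cors .` is running where model.json lives
const model = await tf.loadModel('http://localhost:8080/model.json');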

Nikhil
0

You could use an insecure Chrome instance:

C:\Program Files (x86)\Google\Chrome\Application>chrome.exe --disable-web-security --disable-gpu --user-data-dir=C:/Temp

Then you can add this script to redefine the fetch function:

async function fetch(url) {
  // fall back to XHR, which can read local files once web security is disabled
  return new Promise(function(resolve, reject) {
    var xhr = new XMLHttpRequest();
    xhr.onload = function() {
      resolve(new Response(xhr.responseText, {status: 200}));
    };
    xhr.onerror = function() {
      reject(new TypeError('Local request failed'));
    };
    xhr.open('GET', url);
    xhr.send(null);
  });
}

After that, be sure that you use the right model loader (see my comment about the loader issue).

BUT your weights will be incorrect; as I understand it, there are some encoding problems (the binary weight file is read through responseText, which corrupts it).

Mahalov Ivan
0

If you are trying to load it on the server side, use @tensorflow/tfjs-node instead of @tensorflow/tfjs and update to version 0.2.1 or higher to resolve this issue.
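With a recent @tensorflow/tfjs-node (1.x and later), requiring the package also registers a file:// handler, so a local path can be passed straight to the loader. A minimal sketch, with a hypothetical model path:

const tf = require('@tensorflow/tfjs-node');

(async () => {
  // file:// URLs work because tfjs-node registers a filesystem IO handler
  const model = await tf.loadLayersModel('file://./web_model/model.json');
  model.summary();
})();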

Nima
0

I am using React.js for loading the model (for image classification and more machine learning stuff).

TensorFlow.js does not provide an API to read a previously trained model straight from the local file system, but tf.io.browserFiles accepts File objects:

    // wrap the JSON topology and the binary weights in File objects so
    // that tf.io.browserFiles can read them (a plain Blob has no src property)
    const file = new File([JSON.stringify(modelJSON)], 'model.json', {type: 'application/json'});
    const files = new File([modelWeights], 'model.weights.bin');
    console.log(files);
    const model = await tf.loadLayersModel(tf.io.browserFiles([file, files]));


You can create an API in Express.js to serve your model (model.json and weights.bin) if you use a web app (for TensorFlow Lite you could use OpenCV's readNetFromTensorflow(model.pb, weights.pbtxt)).

References: How to load tensorflow-js weights from express using tf.loadLayersModel()?
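A minimal hedged sketch of such an Express server, assuming the converted files (model.json plus its .bin weight shards) sit in a local ./model folder:

const express = require('express');
const cors = require('cors');

const app = express();
app.use(cors()); // let the tf.js client fetch from another origin
app.use('/model', express.static('model')); // serves model.json and the *.bin shards
app.listen(8080, () =>
  console.log('Model at http://localhost:8080/model/model.json'));

The client can then load and use the model as in the original snippet: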

const classifierModel = await tf.loadLayersModel(
  "https://rp5u7.sse.codesandbox.io/api/pokeml/classify"
);
const im = new Image();
im.src = imagenSample; // '../../../../../Models/ShapesClassification/Samples/images (2).png';
const abc = this.preprocessImage(im);
const preds = await classifierModel.predict(abc); // .argMax(-1);
console.log('<Response>', preds, 'Principal', preds.shape[0], 'DATA', preds.dataSync());
const responde = [...preds.dataSync()];
console.log('Maximum value', Math.max.apply(Math, responde.map(function(o) { return o; })));
let indiceMax = this.indexOfMax(responde);
console.log(indiceMax);
console.log('<<<LABEL>>>', this.labelsReturn(indiceMax));
0

If you are using Django, you should:

  1. Create a directory static in your app and put your model there.

  2. Load that static directory in the template where you want to use your model:

    var modelPath = "{% static 'sampleModel.json' %}";

Don't forget to also load the tensorflow.js library:

<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs"></script>

  3. Now you can load your model (a module script allows the top-level await):

    <script type="module">model = await tf.loadGraphModel(modelPath)</script>
Sayyor Y