I need help with async/await.
I'm currently studying https://github.com/tensorflow/tfjs-converter,
and I'm stumped at this part of the code (loading my Python-converted SavedModel for use in the browser):
import * as tf from '@tensorflow/tfjs';
import {loadFrozenModel} from '@tensorflow/tfjs-converter';
/*1st model loader*/
const MODEL_URL = './model/web_model.pb';
const WEIGHTS_URL = './model/weights_manifest.json';
const model = await loadFrozenModel(MODEL_URL, WEIGHTS_URL);
/*2nd model execution in browser*/
const cat = document.getElementById('cat');
model.execute({input: tf.fromPixels(cat)});
I noticed it uses ES6 modules (import/export) and ES2017 async/await, so I've set up Babel with babel-preset-env, babel-polyfill, and babel-plugin-transform-runtime. I started with webpack but switched to Parcel as my bundler (as suggested by the TensorFlow.js devs). With both bundlers I kept getting an error that the await must be wrapped in an async function, so I wrapped the first part of the code in an async function, hoping to get a Promise:
async function loadMod(){
const MODEL_URL = './model/web_model.pb';
const WEIGHTS_URL = './model/weights_manifest.json';
const model = await loadFrozenModel(MODEL_URL, WEIGHTS_URL);
}
loadMod();
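To check my understanding of the pattern itself, here is a minimal standalone sketch of what I expect to happen, with a plain Promise (a hypothetical fakeLoadFrozenModel stub) standing in for the real loadFrozenModel call so it runs without tfjs:

```javascript
// Stand-in for loadFrozenModel: just resolves with the URLs it was given.
// (Hypothetical stub, not the real tfjs-converter API.)
function fakeLoadFrozenModel(modelUrl, weightsUrl) {
  return Promise.resolve({ modelUrl, weightsUrl });
}

async function loadMod() {
  const MODEL_URL = './model/web_model.pb';
  const WEIGHTS_URL = './model/weights_manifest.json';
  // await is only legal inside an async function
  const model = await fakeLoadFrozenModel(MODEL_URL, WEIGHTS_URL);
  return model; // resolves the returned Promise with the loaded model
}

// The async function returns a Promise, so the result is consumed here:
loadMod().then(model => {
  console.log('loaded', model.modelUrl);
});
```

This version runs fine in plain Node, which makes me suspect my bundler/Babel config rather than the pattern itself.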
Now both bundlers say that 'await is a reserved word'. The VS Code ESLint extension says loadMod() returns Promise<void>. (So did the promise fail or get rejected?) I'm also trying to reference the JavaScript model files using a relative path; is this wrong? Do I have to serve the ML model from the cloud, or can it be loaded from a relative local path?
Any suggestions would be much appreciated. Thanks!