I know this might seem like a duplicate, but I have not been able to find a solution that works for me. Or maybe I just need a complete example.
Here is the problem: I want to implement a webpage that predicts the class of an input text using a pre-trained model. I have the JSON file for the TensorFlow.js model and both
- tokeniser.json (saved with Keras Tokenizer().to_json())
- vocab.json (saved as in this question, corresponding to tokenizer.word_index)
Now, I know how to load the model into a JavaScript object with the async function of TensorFlow.js. How can I do the same for the tokeniser, and how can I then tokenise the input text with the imported tokeniser?
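For reference, this is roughly how I load the model. It is just a sketch: I am assuming tf.loadLayersModel from TensorFlow.js, and modelPath is a placeholder for the URL of my model.json:

async function loadModel() {
  // modelPath is a placeholder string, e.g. "https://.../model.json"
  const model = await tf.loadLayersModel(modelPath);
  return model;
}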
======================= Clarification ===========================
Examples of my JSON files can be found at these links.
I tried the following code:
// loadVocab: load the vocabulary (a word -> index map) from the JSON file
async function loadVocab() {
  var word2index = await JSON.parse(await JSON.stringify(vocabPath));
  return word2index;
}
where vocabPath is a string containing the URL above.
At the end of my script I call an init() function:
async function init(){
    model = await loadModel();
    word2index = await loadVocab();
    console.log(word2index["the"]); // I expect 1
}
But of course I get undefined, since I guess the code treats the URL string itself as the JSON, rather than fetching the JSON at that URL.
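My guess is that I actually need to fetch the file and parse the response body, along these lines (just a sketch, assuming the JSON is served from a URL the page is allowed to fetch, e.g. with CORS enabled):

async function loadVocab() {
  // download the JSON at vocabPath instead of treating the URL string as JSON
  const response = await fetch(vocabPath);
  // parse the response body into a plain object mapping word -> index
  const word2index = await response.json();
  return word2index;
}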
Any ideas? Also, once the vocabulary is loaded, is something like the sketch below the right way to tokenise the input text before calling model.predict()?
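This is only my naive idea of what a JavaScript equivalent of Keras' texts_to_sequences plus pad_sequences might look like: it lower-cases and splits on whitespace, ignores the Tokenizer's filters, num_words and oov_token settings, and maxLen is a placeholder for the sequence length my model expects:

function tokenise(text, word2index, maxLen) {
  // naive preprocessing: lower-case and split on whitespace only
  const words = text.toLowerCase().split(/\s+/);
  // map each word to its index and drop words missing from the vocabulary
  let sequence = words.map(w => word2index[w]).filter(i => i !== undefined);
  // truncate from the front and pad with zeros at the front,
  // mimicking the Keras defaults truncating='pre' and padding='pre'
  if (sequence.length > maxLen) {
    sequence = sequence.slice(sequence.length - maxLen);
  } else {
    sequence = new Array(maxLen - sequence.length).fill(0).concat(sequence);
  }
  // shape [1, maxLen] so it can be passed directly to model.predict()
  return tf.tensor2d([sequence], [1, maxLen]);
}

// e.g. const prediction = model.predict(tokenise("some input text", word2index, 100));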