We are looking for a freelancer who is able to implement a TensorFlow C++ inference engine for a Natural Language Understanding task. For this, we provide the existing Python implementation, including all pre- and post-processing steps, as well as pre-trained models to test the implementation. The Natural Language Understanding module uses TensorFlow models for character-based embeddings, intent detection and slot filling. The models contain the following layers:
- Bi-LSTM (Flair Embeddings)
- CRF (from tensorflow_addons)
In addition, we currently perform tokenization using the word_tokenize method of the NLTK tokenizer package. This tokenizer can be replaced if required.
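Since no C++ port of NLTK exists, a replacement tokenizer would likely be hand-rolled. As a hedged illustration (not the project's actual tokenizer), the sketch below splits on whitespace and treats punctuation as separate tokens, which approximates but does not exactly reproduce NLTK's word_tokenize; any replacement would have to be validated against the provided Python pipeline.

```cpp
#include <cctype>
#include <string>
#include <vector>

// Illustrative stand-in for NLTK's word_tokenize: whitespace split,
// with each punctuation character emitted as its own token.
std::vector<std::string> simple_tokenize(const std::string& text) {
    std::vector<std::string> tokens;
    std::string current;
    auto flush = [&]() {
        if (!current.empty()) { tokens.push_back(current); current.clear(); }
    };
    for (unsigned char c : text) {
        if (std::isspace(c)) {
            flush();
        } else if (std::ispunct(c)) {
            flush();  // punctuation becomes its own token
            tokens.push_back(std::string(1, static_cast<char>(c)));
        } else {
            current.push_back(static_cast<char>(c));
        }
    }
    flush();
    return tokens;
}
```

For example, `simple_tokenize("Hello, world!")` yields the four tokens `Hello`, `,`, `world`, `!`; edge cases such as contractions ("don't") are handled differently by NLTK and would need special treatment.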
For the implementation of this task we have the following functional and non-functional requirements:
Functional requirement 1.1: Implemented inference engine that is initialized from the following arguments (example files can be found in the provided package):
- the embeddings model
- the intent/out-of-domain detection model
- the BIO-tagging model
- the intent rules
and contains at least the following methods:
- initialize(std::string embeddings_model_path, std::string intent_model_path, std::string bio_tagging_model_path, std::string intent_rules_path)
-- input arguments can be extended by required configuration parameters
-- method loads all three models and the intent_rules
- forward(std::string utterance, EmbModel *embeddings_model, IntModel *intent_model, BIOModel *bio_tagging_model, Rules *intent_rules, std::string &slots, std::string &intent)
-- method fills in "slots" and "intent" based on the provided input arguments
-- the slots datatype is a serialized JSON string with keys and values identical to the provided Python implementation
-- intent is either the intent class name or "OutOfDomain"
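The interface described in 1.1 can be sketched as the following class skeleton. This is a hedged illustration only: the class name is hypothetical, the model types are opaque placeholders (a real implementation would wrap SavedModel bundles loaded through the TensorFlow 2.4 C++ API), and the method bodies are stubs marking where the actual loading and graph execution would go.

```cpp
#include <string>

// Placeholder model/rule types; real ones would wrap loaded TF models.
struct EmbModel {};
struct IntModel {};
struct BIOModel {};
struct Rules {};

class InferenceEngine {
public:
    // Loads all three models and the intent rules from the given paths.
    // Stubbed here: returns true when every path is non-empty.
    bool initialize(std::string embeddings_model_path,
                    std::string intent_model_path,
                    std::string bio_tagging_model_path,
                    std::string intent_rules_path) {
        // Real implementation: load each SavedModel and parse the rules.
        return !embeddings_model_path.empty() &&
               !intent_model_path.empty() &&
               !bio_tagging_model_path.empty() &&
               !intent_rules_path.empty();
    }

    // Fills "slots" (serialized JSON) and "intent" (class name or
    // "OutOfDomain") for the given utterance, per requirement 1.1.
    void forward(std::string utterance,
                 EmbModel* embeddings_model, IntModel* intent_model,
                 BIOModel* bio_tagging_model, Rules* intent_rules,
                 std::string& slots, std::string& intent) {
        // Real implementation: tokenize, embed, run intent/OOD
        // classification, run BIO tagging with CRF decoding, apply
        // the intent rules, then serialize the slots as JSON.
        (void)utterance; (void)embeddings_model; (void)intent_model;
        (void)bio_tagging_model; (void)intent_rules;
        slots = "{}";
        intent = "OutOfDomain";
    }
};
```

Keeping the loaded models inside the engine after initialize (rather than threading pointers through forward) would also satisfy 1.3, since supporting design choices are free.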
Functional requirement 1.2: Implemented interactive demo script that uses the inference engine and:
- asks for an input utterance
- outputs the detected slots, intents/out-of-domain
- repeats this until a user enters "q" as an input utterance
A Python version of this is provided.
Functional requirement 1.3: All required additional functions/methods/classes to make 1.1 and 1.2 possible, can be freely designed.
Non-functional requirement 2.1: The inference engine uses the TensorFlow 2.4 C++ API (not TensorFlow Lite)
Non-functional requirement 2.2: TensorFlow 2.4 is built as a shared library ([login to view URL]). A Makefile and build instructions are provided
Non-functional requirement 2.3: Everything is computed on a CPU (no GPU)
Non-functional requirement 2.4: The entire source code and pre-built binaries of the inference engine and the interactive demo script are provided
Non-functional requirement 2.5: Build instructions with all Makefiles are provided
Non-functional requirement 2.6: Build instructions for used third-party libraries are provided
Non-functional requirement 2.7: Third-party libraries are provided as a shared library (.so)
Non-functional requirement 2.8: Implementation follows the C++11 standard
Non-functional requirement 2.9: The CRF layer implementation provided in tensorflow-addons is used
Non-functional requirement 2.10: Everything runs on a Linux system (x86_64, Ubuntu 16 or newer)
Non-functional requirement 2.11: Each function in the inference engine contains in-code documentation describing:
- The purpose of the function
- The input arguments
- The return arguments