A Demo to Experience the Results

A previous blog post describes how to set up a successful GenAI-based assistant/chatbot with minimal data and training/evaluation effort. The key is to build your chatbot on a Verticalized Language Model as the base model, rather than on a general-purpose model like GPT or Mistral. Since the fine-tuned vertical model has already learnt the language of your domain (vocabulary, semantics…), learning the specifics of your business takes much less effort and requires far less data.
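For readers who want to see what that second, client-specific customization step might look like in code, here is a minimal sketch using the Hugging Face transformers, datasets and peft libraries. It is an illustration under assumptions, not Bitext's actual pipeline: the base-model name and the client dataset file are hypothetical placeholders.

```python
# A minimal sketch of customizing an already-verticalized base model with LoRA.
# Model name and dataset file below are hypothetical placeholders.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

# Step 1 is already done for you: a base model verticalized for the domain.
BASE_MODEL = "your-org/banking-verticalized-llm"  # hypothetical name

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

# Step 2: fine-tune on a small, client-specific dataset (e.g. Q&A pairs about
# the client's own products and procedures). LoRA keeps the update lightweight.
model = get_peft_model(model, LoraConfig(r=8, lora_alpha=16, task_type="CAUSAL_LM"))

dataset = load_dataset("json", data_files="client_banking_qa.jsonl")["train"]

def tokenize(example):
    # Concatenate question and answer into a single training text.
    text = example["question"] + "\n" + example["answer"]
    return tokenizer(text, truncation=True, max_length=512)

tokenized = dataset.map(tokenize, remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="customized-banking",
                           num_train_epochs=3,
                           per_device_train_batch_size=4),
    train_dataset=tokenized,
    # Causal-LM collator builds the labels from the input tokens.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

Because the base model already speaks "Banking", a few hundred client-specific examples are typically enough for this step, which is exactly the data reduction the approach is built around.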

In this post we are going to see a few examples of the different types of answers we get. These examples are captured from the public demo we have published at https://www.bitext.com/sales-demos/

The demos allow for the comparison of 3 models:
  • GPT-3.5: a general purpose LLM
  • Pre-trained Banking: a Pre-trained Banking Model, verticalized for the Banking domain
  • Customized Banking: a version of Pre-trained Banking Model customized for a specific client

To compare the different answers we get from the three models, we will use a common question: “I want to open an account”.
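A comparison like the one in the demo can be reproduced by sending the same prompt to each model. The sketch below assumes, purely for illustration, that the two Banking models are served behind OpenAI-compatible endpoints; the local URLs and model identifiers are hypothetical, while GPT-3.5 is called through the official OpenAI API.

```python
# Send the same question to the three models compared in the demo.
# The local endpoints and Banking model names are hypothetical placeholders.
from openai import OpenAI

QUESTION = "I want to open an account"

endpoints = {
    "GPT-3.5 (general purpose)": ("https://api.openai.com/v1", "gpt-3.5-turbo"),
    "Pre-trained Banking": ("http://localhost:8001/v1", "banking-pretrained"),
    "Customized Banking": ("http://localhost:8002/v1", "banking-customized-bbi"),
}

for label, (base_url, model_name) in endpoints.items():
    client = OpenAI(base_url=base_url)  # API key is read from the environment
    reply = client.chat.completions.create(
        model=model_name,
        messages=[{"role": "user", "content": QUESTION}],
    )
    print(f"--- {label} ---")
    print(reply.choices[0].message.content)
```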

Answer from GPT-3.5, the General Purpose Model

As expected, the General Purpose Model asks for the type of account we want to open, since GPT (and most, if not all, other Models) is generic by definition. The account could be an email account, a social media account… or a bank account. The lack of context makes it impossible for the non-verticalized Model to provide helpful information about opening a bank account.

Answer from Pre-trained Banking Model

Since this Model is already verticalized for Banking, the Model safely assumes that the account we are asking about is a bank account and easily provides proper instructions on how to proceed.

In other words, the Model already knows the vocabulary and expressions from the Banking domain, and can solve semantic ambiguities that the generic model cannot, like what the meaning of “account” is in this request.

Additionally, the Model answers in a specific style, following standard corporate language rules for tone, vocabulary, sentence length…, as is common practice in customer support.

Answer from Client-Specific Customized Banking Model

Since this Model is already customized for a specific bank (in this case, the fictitious BBI bank), it provides correct, client-specific instructions on how to open an account at BBI.

This customized Model already knows not only the vocabulary and expressions in Banking but also the specifics of one particular bank. Additionally, the Model uses the particular tone and style of BBI, following its corporate communication rules.

Conclusion

As we have shown, customizing Large Language Models in 2 steps via fine-tuning is a very efficient way to reduce data needs, as well as training and evaluation efforts, when building customized Conversational Assistants. Bitext provides these Pre-Built Datasets and Models in 20 verticals. Some examples of data, models and demos can be found here.
