
How to make your chatbot more human-like

The potential of chatbots is nothing new, and here at Bitext we have been talking about it for a while. We keep stressing the importance of Natural Language Processing to overcome the limitations that both users and developers of bots face when trying to create human-like chatbots.

We have tested a number of bots and bot-building platforms, such as wit.ai, api.ai, and LUIS, and found several issues that do not seem to be fully solved yet. These issues are fundamental to human language, and our linguistic technology is a natural fit for solving them.

We decided to put our resources to work, and today we are proud to introduce our new chatbot so you can get an idea of what our platform can do.

The best way to put it to the test is to try some of the hard issues we detected while testing other bots and platforms:

  • Negation: many bots do not understand negation within a phrase because they are built on a keyword approach, which makes it hard for users to ask for something as simple as “I want a barbeque pizza with no pork”. What matters is the scope of the negation, as in the examples below (and in the first sketch after this list):
    • “I want a barbeque pizza with no pork” (only negates pork).
    • “We don’t want any drinks” (negates the whole event).
    • “I’m not sure… I’ll take a beer” (does not negate the main event).
  • Coordination: it is one of the most common devices in human speech, yet our research found that the most relevant platforms do not support requests in which elements are joined by a coordinating conjunction. Our linguistic knowledge enables us to solve this issue (see the second sketch after this list):
    • “[[I want a Hawaiian pizza] and [my wife will have a Margherita]]” (two main events).
    • “I’ll have a Hawaiian [with [extra cheese] and [onion]]” (two changes in ingredients).
    • “I’ll take [[a Hawaiian [with [extra cheese] and [onion]]] and [a Margherita]]” (two pizzas, the first one with two extra ingredients).
  • Connection between different phrases: most chatbots are designed around a tree model, so users cannot modify a request mid-conversation and are forced to start over. As a solution, we propose supporting discourse connectors, as in the following examples (and in the third sketch after this list):
    • “I want a Margherita with onion… Moreover, add extra cheese” (adds info to the first sentence: adds an ingredient).
    • “I want a Hawaiian with extra pineapple. However, I prefer it with no ham” (amends the previous sentence: removes an ingredient).
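
To make the negation point concrete, here is a minimal sketch of negation-scope detection written with spaCy and its small English model. It is only an illustration of the idea, not the Bitext engine; the NEGATORS list and the negated_heads function are toy names of our own.

# Toy negation-scope detector: a hypothetical illustration, not the Bitext engine.
# Assumes spaCy and its "en_core_web_sm" model are installed.
import spacy

nlp = spacy.load("en_core_web_sm")

NEGATORS = {"no", "not", "n't", "never"}

def negated_heads(text):
    """Return the lemmas of the words each negator attaches to."""
    doc = nlp(text)
    negated = set()
    for token in doc:
        if token.lower_ in NEGATORS:
            # The dependency head tells us what is being negated:
            # "no" attaches to "pork", while "n't" attaches to "want".
            negated.add(token.head.lemma_)
    return negated

print(negated_heads("I want a barbeque pizza with no pork"))  # expected: {'pork'}
print(negated_heads("We don't want any drinks"))              # expected: {'want'}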
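
Coordination can be made visible in a similar way. The sketch below, again only a spaCy-based approximation rather than our platform, groups each noun with the elements conjoined to it, which is the first step towards turning “extra cheese and onion” into two separate modifications.

# Toy coordination grouping: a hypothetical illustration, not the Bitext engine.
import spacy

nlp = spacy.load("en_core_web_sm")

def conjoined_groups(text):
    """Group each noun with the nouns coordinated with it, e.g. "cheese and onion"."""
    doc = nlp(text)
    groups = []
    for token in doc:
        # Start a group only at the first conjunct to avoid duplicates.
        if token.pos_ in ("NOUN", "PROPN") and token.dep_ != "conj" and token.conjuncts:
            groups.append([token.text] + [c.text for c in token.conjuncts])
    return groups

print(conjoined_groups("I'll have a Hawaiian with extra cheese and onion"))
# expected (parse-dependent): [['cheese', 'onion']]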
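
Finally, handling connectors is less about parsing a single sentence and more about keeping dialogue state: a follow-up that opens with “moreover” or “however” should amend the pending request instead of resetting the conversation. The toy sketch below shows the idea; the connector lists and the handle function are illustrative assumptions, not our production flow.

# Toy dialogue-state update driven by discourse connectors (illustrative only).
ADDITIVE = {"moreover", "also", "besides"}
CONTRASTIVE = {"however", "but", "actually"}

def starts_with_connector(utterance):
    first = utterance.lower().split()[0].strip(",.")
    return first in ADDITIVE | CONTRASTIVE

def handle(utterance, pending_order):
    if pending_order and starts_with_connector(utterance):
        # Merge: keep the existing request and record the amendment.
        pending_order.setdefault("amendments", []).append(utterance)
        return pending_order
    # Otherwise treat the utterance as a brand-new request.
    return {"request": utterance, "amendments": []}

order = handle("I want a Margherita with onion", {})
order = handle("Moreover, add extra cheese", order)
print(order)
# {'request': 'I want a Margherita with onion', 'amendments': ['Moreover, add extra cheese']}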

If you want to start trying our demo chatbot, click here: 
