GPT, like other generative models, tends to provide disparate answers for the same question. Sometimes answers vary only slightly, but sometimes they are very inconsistent or even contradictory.
Below are a few different answers provided by GPT 3.5 for the question “How much does it cost to open an international banking account?”:
Gaining control over what GPT answers is technically called fine-tuning; fine-tuning is about “taming the beast” and making the model answer what you need in your particular setup. There are different approaches to fine-tuning, typically focused on the software side, where parameters such as learning rate, weight decay, batch size, number of epochs, momentum, and dropout rate play crucial roles.
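As an illustration of where such parameters surface in practice, below is a minimal sketch of launching a fine-tuning job for GPT 3.5 through the OpenAI Python client. The file name and the hyperparameter values are assumptions for illustration only; hosted fine-tuning exposes just a subset of the knobs listed above (epochs, batch size, and a learning-rate multiplier).

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Upload a training file in the chat fine-tuning JSONL format
# ("training_data.jsonl" is a placeholder name).
training_file = client.files.create(
    file=open("training_data.jsonl", "rb"),
    purpose="fine-tune",
)

# Launch the fine-tuning job. Hosted GPT-3.5 fine-tuning only exposes a
# few of the classic knobs; the values below are illustrative.
job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-3.5-turbo",
    hyperparameters={
        "n_epochs": 3,
        "batch_size": 8,
        "learning_rate_multiplier": 2.0,
    },
)
print(job.id)
```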
We’ve approached the problem using data: can we make GPT answer some questions properly? By properly, we mean in the right tone and corporate style, with the right content, and without glitches/contradictions/hallucinations.
For this experiment, we’ve generated a dataset verticalized for the banking sector (retail banking, loans, wealth management, etc.) and customized for Customer Support. The dataset we used is freely available on Hugging Face here.
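For readers who want to explore the data, a minimal sketch of loading it with the Hugging Face datasets library is shown below; the repository identifier is a hypothetical placeholder, so substitute the actual name from the dataset page linked above.

```python
from datasets import load_dataset

# Hypothetical repository id; replace it with the dataset name
# published on Hugging Face (linked above).
dataset = load_dataset("bitext/banking-customer-support-dataset", split="train")

# Inspect the available fields and a sample record.
print(dataset.column_names)
print(dataset[0])
```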
We call this dataset hybrid because it has all the advantages of synthetic text (low cost, high scalability and speed, privacy, and corporate identity) while avoiding the disadvantages (hallucination and bias). It contains 30M tokens which have been infused with anonymized PII (Personally Identifiable Information), establishing an empathic connection with users while being tailored to the sector’s target demographic. Below is a sample from the dataset:
Question:
“how much does it cost to open an account for international banking.”
Response:
“Thank you for considering opening an account for international banking! I can provide you with the information you need.
The cost of opening an account for international banking may vary depending on the financial institution and the specific account type you choose. To get the most accurate and up-to-date information, I recommend […]
You can reach our customer service team at {{Customer Support Phone Number}} or through the Live Chat on our website at {{Company Website URL}}. They are available {{Customer Support Working Hours}} to […]
We look forward to the possibility of serving you and providing a seamless international banking experience!”
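To give a concrete idea of how such a question/response pair can be prepared for training, here is a hedged sketch that fills the {{…}} placeholders with company-specific values and writes the example in the chat-format JSONL that GPT 3.5 fine-tuning expects. The placeholder values, system prompt, and file name are assumptions for illustration; another valid workflow is to keep the placeholders in the training data and substitute them at inference time.

```python
import json
import re

# Example placeholder values (assumptions for illustration).
PLACEHOLDERS = {
    "Customer Support Phone Number": "+1-800-555-0100",
    "Company Website URL": "https://www.example-bank.com",
    "Customer Support Working Hours": "Monday to Friday, 9am-6pm CET",
}

def fill_placeholders(text: str) -> str:
    """Replace {{Placeholder Name}} tokens with concrete values."""
    return re.sub(
        r"\{\{(.+?)\}\}",
        lambda m: PLACEHOLDERS.get(m.group(1).strip(), m.group(0)),
        text,
    )

def to_chat_example(question: str, response: str) -> dict:
    """Build one training example in the chat fine-tuning format."""
    return {
        "messages": [
            {"role": "system", "content": "You are a banking customer support assistant."},
            {"role": "user", "content": question},
            {"role": "assistant", "content": fill_placeholders(response)},
        ]
    }

# Write one example to the JSONL file used as the fine-tuning training file.
with open("training_data.jsonl", "w", encoding="utf-8") as f:
    example = to_chat_example(
        "how much does it cost to open an account for international banking",
        "You can reach us at {{Customer Support Phone Number}} or via {{Company Website URL}}.",
    )
    f.write(json.dumps(example, ensure_ascii=False) + "\n")
```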
These are the answers provided by the data-finetuned version of GPT 3.5 for the same question ‘How much does it cost to open an international banking account?’:
As you can see in the screenshots, the behavior of GPT can be modified (fine-tuned) for specific purposes. In our case, it’s for Customer Support Banking.
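For completeness, querying the fine-tuned model is a standard chat completion call with the fine-tuned model identifier; the model id below is a made-up placeholder in the format OpenAI returns once a fine-tuning job finishes.

```python
from openai import OpenAI

client = OpenAI()

# Made-up placeholder; use the id returned when the fine-tuning job
# completes (fine-tuned model ids start with "ft:").
FINE_TUNED_MODEL = "ft:gpt-3.5-turbo-0125:my-org::abc12345"

completion = client.chat.completions.create(
    model=FINE_TUNED_MODEL,
    messages=[
        {"role": "system", "content": "You are a banking customer support assistant."},
        {"role": "user", "content": "How much does it cost to open an international banking account?"},
    ],
    temperature=0,  # reduce answer variability when testing
)
print(completion.choices[0].message.content)
```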
When the test questions are part of the training data, it is less surprising that the answers are positively influenced by the training.
It is more surprising with questions that were not used in training, like the one we used: “how much does it cost to open an account for international banking.”
As we can see, they all:
With data-centric fine-tuning, we get the best of both worlds: