This post originally appeared on the Finn AI blog, which is now part of Glia.
Artificial intelligence (AI) and financial services have one thing in common—consumers don’t trust them. According to the 2019 Edelman Trust Barometer, finance is still the least trusted industry globally. Consumers are also on the fence when it comes to putting their trust in conversational AI banking, mainly because of the lack of clarity regarding the processes, performance, and intentions of the provider.
So how can two very different industries—banking and AI—use their unique resources and skills to rebuild consumer trust?
Engagement fosters trust (and vice versa)
Forrester’s 2018 Customer Experience Index found that the banking industry is struggling to create and maintain a human connection with customers. This is understandable, since the endeavor is new to banks. In the past, banks had a captive audience: people joined the same bank as their parents…and stayed with it forever. But times have changed.
We’re living in an era where our Netflix account chooses our entertainment and our Amazon account knows when we’re running low on toilet paper. It’s hard to believe that our banks—the institutions we trust with our livelihoods—are the most silent of all the companies we interact with every day. A recent study by Celent found that only 44 percent of people feel that their bank knows them.
Millennials have come of age in an era where personalized, responsive services, like those delivered by Uber, Amazon, and Google, are the norm. It’s no wonder they’re 2 to 3 times more likely to switch banks than people in other age groups.
As banks pivot to meet the demands of the evolved consumer, they’re looking to conversational AI banking to increase customer engagement through personalized services. The end result: a banking chatbot that acts as a conduit between the bank and its users.
As with any new relationship, there’s a period of trust-building that needs to happen before value can be exchanged. Before banking chatbots can become the status quo, banks need to learn the language of trust—and their conversational AI needs to enable it.
Building trust in virtual assistants
It takes time to build trust. Virtual assistants are capable of performing everyday banking tasks for customers, but it all depends on one thing—the trust of the user.
If the banking chatbot begins to make decisions about your finances without your consent, you will be uncomfortable—and rightly so. You haven’t reached the right level of comfort with the chatbot to trust it to take such actions. But if the chatbot starts by giving you useful advice that helps you reach your financial goals, you’ll slowly begin to trust that it has your best interests in mind.
Customer trust journey with a banking chatbot
| Level 1 | Level 2 | Level 3 | Level 4 | Level 5 |
| --- | --- | --- | --- | --- |
| You interact with the banking chatbot. You open a dialogue because you trust it will be worth your time. | Depending on what channel you’re in, you permit the chatbot to access your account to see what’s going on. | You listen to insights from the chatbot such as, “Your spending is on track this week; think about saving some money.” | You allow the chatbot to take action with your approval, for example, “You’ve just been paid, can I move $50 into savings?” | The chatbot knows you so well, it completes pre-approved tasks for you proactively. |
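To make this journey concrete, here is a minimal sketch of how a chatbot could gate its actions behind explicit trust levels. The level names, action names, and permission mapping are illustrative assumptions, not Finn AI’s implementation.

```python
from enum import IntEnum

class TrustLevel(IntEnum):
    """The five stages of the customer trust journey described above."""
    ENGAGES = 1           # opens a dialogue with the chatbot
    GRANTS_ACCESS = 2     # permits read access to account data
    ACCEPTS_INSIGHTS = 3  # listens to spending and saving insights
    APPROVES_ACTIONS = 4  # approves individual actions (e.g., move $50)
    DELEGATES = 5         # lets the bot complete pre-approved tasks proactively

# Hypothetical mapping: the minimum trust level required for each bot action.
REQUIRED_LEVEL = {
    "read_balance": TrustLevel.GRANTS_ACCESS,
    "send_insight": TrustLevel.ACCEPTS_INSIGHTS,
    "transfer_with_approval": TrustLevel.APPROVES_ACTIONS,
    "transfer_proactively": TrustLevel.DELEGATES,
}

def is_permitted(action: str, user_level: TrustLevel) -> bool:
    """An action is allowed only once the user has reached the required level."""
    return user_level >= REQUIRED_LEVEL[action]

# A Level 3 user accepts insights but has not yet delegated transfers.
assert is_permitted("send_insight", TrustLevel.ACCEPTS_INSIGHTS)
assert not is_permitted("transfer_proactively", TrustLevel.ACCEPTS_INSIGHTS)
```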
Achieving the privacy–value exchange
In the modern economy, consumers expect to trade a certain amount of personal privacy for value from the organizations they deal with. However, privacy is inherently trust-based. And after several decades of complacency, the world is now hyper-focused on privacy. The steady flood of data breaches has washed away people’s trust in organizations. People no longer trust the system to self-regulate, so they want the power to regain control of their data at the push of a button.
Governments around the world have stepped up to meet these consumer demands by implementing sweeping regulations like the EU General Data Protection Regulation (GDPR) that give people more control over the personal information collected about them and grant them the right to be forgotten. These regulations are good for society. They’ve made us think about the value that others extract from our data and the value that we get in return.
As we build relationships with chatbots (and banks), the privacy–value exchange becomes key since value exchange is completely dependent on trust. Banks must guide customers on a journey to build trust in the virtual financial assistant. For the assistant to deliver true value, the customer must trust that their privacy is respected and that the bot has their best interests in mind—and not the interests of the bank.
Finn AI has defined five key principles—or tenets—of trust. If a user is confident that the chatbot follows these principles, they’ll be more likely to trust it to take valuable actions on their behalf.
Finn AI tenets of trust
“I trust you to be competent”
If you tell the chatbot to move $50 from checking, the bot does this correctly.
“I trust you to be well intentioned”
The chatbot is not sneaky. It is working for you, and only you.
“I trust you to know me”
The chatbot understands your unique needs and only recommends actions, products, or services that will benefit you financially.
“I trust you to be reliable”
The chatbot is available whenever you need it.
“I trust you to be discreet”
The chatbot only uses the information you’ve shared for the purposes for which it was shared. It will not use this information in the future, for example, to prevent you from being approved for a loan.
Quantifying trust and making it actionable
The ability to quantify trust will be the biggest differentiator in the banking industry of the future. Finn AI is working with customers to achieve this. Leveraging data from user behavior, we are building a framework that banks can use to determine the trust level of a given set of users, and that Finn AI can use to build better products.
Going beyond customer surveys and CSAT measurements, data scientists can correlate behaviors with trust so that banks can surface insights like:
- “If a user expresses this sentiment in this context, it demonstrates that trust levels are declining and we should do something to improve this.”
- “If a user takes this action, it demonstrates that trust levels are increasing and they may be ready to try feature X.”
In this way, banks can infer a user’s needs from how they interact with their virtual banking assistant.
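As an illustration only, a first pass at such a framework could weight behavioral signals and aggregate them into a per-user score. The signal names, weights, and threshold below are hypothetical, not Finn AI’s production model.

```python
# Hypothetical behavioral signals and weights; positive signals build trust,
# negative signals erode it.
SIGNAL_WEIGHTS = {
    "granted_account_access": 2.0,
    "accepted_insight": 1.0,
    "approved_suggested_transfer": 1.5,
    "declined_suggested_transfer": -1.0,
    "expressed_negative_sentiment": -2.0,
}

def trust_score(events: list[str]) -> float:
    """Aggregate a user's logged behaviors into a single trust score."""
    return sum(SIGNAL_WEIGHTS.get(event, 0.0) for event in events)

def ready_for_feature(events: list[str], threshold: float = 3.0) -> bool:
    """Surface an insight like 'this user may be ready to try feature X'."""
    return trust_score(events) >= threshold

# Example: this user's recent behavior suggests rising trust.
history = ["granted_account_access", "accepted_insight", "approved_suggested_transfer"]
print(trust_score(history))        # 4.5
print(ready_for_feature(history))  # True
```

In practice, the weights would be learned by correlating these signals with observed outcomes rather than set by hand.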
In the experience economy, profitability and happy customers are not mutually exclusive. Smart banks will work to bridge the trust gap with personalized services, delivered via virtual banking assistants, to help customers achieve their financial goals. In turn, banks will increase retention and customer lifetime value. Trust is a win-win, and banks must invest heavily in it.