March 4, 2024



Many financial advisors acknowledge that cognitive biases affect their practice, but few tools exist to combat them. Artificial intelligence may offer a solution, researchers say.

A study recently published in the Journal of Business Research identifies 11 cognitive biases that financial advisors exhibit when advising clients. Among the 21 advisors interviewed by an international team of scholars from four universities, the most common were the types the researchers categorized as confirmation bias, affinity bias, and priming bias.

“[Confirmation bias] involves financial planners or investors holding preconceived beliefs about investments,” the researchers said. “When information emerges from a scenario that seems to align with their thought process, that information is reinforced. The information may be true or false, but financial advisors believe it to be true and use it to support their decision-making.”

Advisors exhibit affinity bias when they favor people with interests or experiences similar to their own. Priming bias, meanwhile, occurs when one stimulus influences an advisor’s response to a second stimulus.

This bias can lead advisors to make incorrect assumptions about financial markets, the researchers said. “If the stock market has ever crashed following a sudden surge in stock purchases, financial planners may be tempted to associate such surges with stock market crashes in the future,” the researchers said. “However, the stock market may crash for a number of other reasons.”

The remaining eight biases observed included familiarity, unconscious, overconfidence, framing, self-serving, belief, anchoring, and endowment bias.

After 25 years of financial education and work experience, “some of my views seem to be ingrained,” said Sean Lovison, a financial advisor at WJL Financial Advisors in Moorestown, New Jersey.

“I often find myself gaining confirmation by seeking out information that confirms my pre-existing beliefs, and possibly ignoring information that contradicts them,” he said.

The researchers found that advisors generally see AI, which they view as “entirely data and computationally based,” as valuable in helping them avoid these cognitive biases, despite a possible fear of being replaced by the technology.

Familiarity bias can be another common problem for time-pressed advisors, said Sindhu Joseph, founder of CogniCor, a San Francisco-based wealth management AI company.

While busy advisors may rely on the few investment vehicles they typically recommend to clients, artificial intelligence can provide advisors with a wider range of potential options and help them evaluate the best investments for specific client needs, Joseph said.

Sarah Fallaw, president of Atlanta-area behavioral finance firm DataPoints, said advisor bias extends beyond investment decisions; advisors often fall into the same biases when interacting with clients.

By using a “narrow” client discovery process, advisors may gather only basic demographic information, leading them to provide guidance that ignores clients’ “personality or behavior-based traits such as attitudes, beliefs and values,” Fallaw said.

“Confirmation bias can lead advisors to seek out information that supports their view of a client, while ignoring information that is inconsistent with that view,” Fallaw said. “Advisors must be aware of this bias, especially around client demographics: consciously seeking to confirm a spouse’s ‘typical’ gender role in investment decisions, or stereotyped money-related beliefs based on a client’s race.”

Client context matters not only so advisors avoid making biased assumptions about clients, but also so they can better account for possible biases in the clients themselves.

“To some extent, both client bias and advisor bias can be overcome by (AI) providing relevant information about both situations, and I think AI can go a step further and provide the next best action,” Joseph said.

While advisors participating in the study were generally willing to incorporate AI into their practice as a means of combating cognitive bias, they also expressed concern about bias in AI itself.

Since AI is trained on real-life example data, any bias present in the training data can carry over into the AI itself. For example, when asked to define return on investment (ROI) for a woman and for a man, ChatGPT responded differently.

While the male-oriented response readily used gendered pronouns (“he invests”), the female-oriented response used the gender-neutral second person (“you invest”).

There are several different approaches developers can take to de-bias AI, Joseph said.

First, they can weed out information from the data that is not important to the decisions being made, so factors like gender or age don’t unnecessarily affect the model’s output. Second, developers can create “synthetic data” to train AI.
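As a rough illustration of the first approach, the sketch below drops sensitive attributes from a hypothetical advisory dataset before fitting a simple model. The column names, data values and model choice are illustrative assumptions, not a description of any vendor’s actual system.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical advisory dataset (illustrative values only): each row is a
# client, and the label records whether a past recommendation worked out.
clients = pd.DataFrame({
    "age": [34, 58, 45, 29, 63, 41],
    "gender": ["F", "M", "F", "M", "M", "F"],
    "risk_tolerance": [7, 3, 5, 8, 2, 6],
    "portfolio_size": [120_000, 800_000, 250_000, 40_000, 1_200_000, 300_000],
    "recommendation_succeeded": [1, 0, 1, 1, 0, 1],
})

# Weed out attributes that should not drive the decision, so factors such as
# gender or age cannot unnecessarily influence the model's output.
SENSITIVE = ["gender", "age"]
features = clients.drop(columns=SENSITIVE + ["recommendation_succeeded"])
labels = clients["recommendation_succeeded"]

model = LogisticRegression().fit(features, labels)

# Score a new (hypothetical) client using only the non-sensitive features.
new_client = pd.DataFrame({"risk_tolerance": [4], "portfolio_size": [500_000]})
print(model.predict(new_client))
```

The point is simply that anything removed before training cannot be used by the model directly, although proxies for those attributes can still leak in through other columns, which is one reason de-biasing rarely stops at feature removal.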

“If you have a large population of one class and much less data for another class, you can supplement that population with artificially created synthetic data, and then make it a balanced training set,” Joseph said.
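A minimal sketch of that second idea follows, under the assumption that “synthetic data” here can mean something as simple as noisy copies of under-represented rows; real systems would use more sophisticated generators, and all feature names and values below are made up for illustration.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=42)

# Hypothetical imbalanced training set: 90 rows of class "A" and only
# 10 rows of class "B" (feature names and values are made up).
data = pd.DataFrame({
    "feature_1": rng.normal(0.0, 1.0, size=100),
    "feature_2": rng.normal(5.0, 2.0, size=100),
    "label": ["A"] * 90 + ["B"] * 10,
})

def synthesize(minority: pd.DataFrame, n_needed: int) -> pd.DataFrame:
    """Create synthetic rows by resampling minority-class rows and adding
    small Gaussian noise to the numeric features (a crude stand-in for
    more sophisticated generators such as SMOTE or generative models)."""
    sampled = minority.sample(n=n_needed, replace=True, random_state=0).copy()
    numeric = sampled.select_dtypes(include="number").columns
    sampled[numeric] = sampled[numeric] + rng.normal(0.0, 0.05, size=(n_needed, len(numeric)))
    return sampled

counts = data["label"].value_counts()
minority_label = counts.idxmin()
minority = data[data["label"] == minority_label]

# Supplement the under-represented class with synthetic rows so the
# training set ends up balanced, as Joseph describes.
synthetic = synthesize(minority, counts.max() - len(minority))
balanced = pd.concat([data, synthetic], ignore_index=True)
print(balanced["label"].value_counts())  # both classes now have 90 rows
```

The balanced set can then be fed to whatever model is being trained; the design choice is that the minority class is padded out rather than the majority class being thrown away, so no real data is discarded.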

Sophie recently announced efforts to generate a synthetic dataset for training an image-generation AI on a more balanced set of photos.

According to the company, the output of its image-generating AI tool showed gender bias: when asked to create realistic images of people who are good with money, only 2% of the thousands of images generated depicted women.

Experts say that while AI can be very useful in helping advisors avoid cognitive biases in practice, programmers must work to mitigate any bias in the AI programs themselves. In the absence of regulatory standards, users must rely on AI companies to remove bias from their tools without any outside oversight.

“It would be very interesting if some regulator could step in and say, ‘These categories of decisions should exhibit these aspects so that it qualifies as an unbiased AI system,'” Joseph said. “Otherwise, you’re putting too much trust in the solution provider to build these things, and most people don’t do that.”