Technology

Protecting Clients, Firms In Age Of AI And Proliferating Tech

Tom Burroughes, Group Editor, September 28, 2023


As we continue to examine the theme of "protecting the client," FWR talks to a large US wealth manager and lawyers in Asia about their views on AI, cybersecurity, and ways to protect firms and their end-clients.

(This article is part of a series examining the ways wealth advisors can “protect the client” that go beyond the traditional areas of investment and tax. See previous examples here and here.)

Cybersecurity continues to be a challenge for advisors working with HNW clients, a group that criminals deliberately target. With AI and other technologies on the rise, threats proliferate – but hopefully, so do the means of defense.

“A decade ago, most organized crime was about going for lots of credit card fraud, and it was about going after where capital lay,” Scott Bush, chief client officer at Geller, the US wealth manager, told this publication recently. “Not enough of them [HNW families] understand the importance of changing behaviors to secure their footprints.”

Geller’s multi-family office business provides investment and tax advice, yet more and more of its work is about helping clients to be more secure, Bush continued. “We have a robust cybersecurity team that our clients engage with. We can monitor behavior that is out of the ordinary,” he said, saying that AI tools can help flag patterns.

Family Wealth Report asked Bush about AI. (At the recent family office fintech summit in New York that this publication hosted, AI and the threats it poses were a major talking point.)

“AI is starting to be embedded in systems looking at activity and looking for anomalies,” he continued. 
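
To illustrate the kind of anomaly flagging Bush describes – and only as an illustration, since Geller has not disclosed its actual tooling – a monitoring system might score new activity against a client’s transaction history and flag outliers for review. The sketch below uses invented features and scikit-learn’s IsolationForest; every name and number in it is hypothetical.

```python
# Illustration only: the features, thresholds and use of scikit-learn's
# IsolationForest are assumptions for this sketch, not Geller's actual tooling.
import numpy as np
from sklearn.ensemble import IsolationForest

# Hypothetical per-transaction features for one client:
# amount (USD), hour of day, days since the payee was last used.
history = np.array([
    [120.0, 10, 2],
    [85.5, 14, 1],
    [240.0, 11, 5],
    [60.0, 16, 3],
    [150.0, 9, 2],
])

model = IsolationForest(contamination=0.1, random_state=0).fit(history)

# A large overnight payment to a payee not used in over a year.
new_activity = np.array([[25_000.0, 3, 400]])
if model.predict(new_activity)[0] == -1:
    print("Flag for review: activity is out of the ordinary for this client")
```

In practice, models like this tend to sit alongside rule-based checks and human review rather than replacing them.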

Geller has plenty of reason to be tight on cybersecurity. As of December 31, 2022, Geller had $5 billion in assets under management and $3 billion in assets under advisement. 

The view from Hong Kong
And it is not just a US issue. Kenix Yuen, partner at Gall, a Hong Kong law firm, said AI is a double-edged sword when it comes to cybersecurity.

“It depends on how we use it, and how much we understand it. On the one hand, AI can generate false information, images, and documents, and impersonate others. It may make scams look more real or sophisticated. It can also be a tool to help analyze and detect false information as well. We are yet to see how AI is being used by scammers,” she said. 

One trend that isn’t changing, with or without AI, is that malicious actors target the vulnerable, and that doesn’t simply mean the elderly, Yuen continued. 

“Alongside the usual investment, employment and romance scams on instant messaging apps, we have seen a recent hijacking tactic – sending SMS messages to people asking them to verify [their] WhatsApp account by inputting or sending back the security code sent to them, failing which the account is deleted permanently,” she said. “Once the hackers receive the security codes, the WhatsApp account is hijacked and used to defraud family and friends of the account owner for funds.”

“Similar tactics have been used to hijack social media accounts. The elderly are easy targets as they are usually less sophisticated with technology, but they are not the only group,” Yuen said. “To reduce the chance of the public being defrauded, it is important to raise awareness of the public via different channels about the latest tactics deployed by hackers. At the same time, law enforcement agencies can work together with the developers or owners of popular online platforms to devise ways to combat the latest scam tactics deployed by fraudsters.”

The scale of the cybersecurity problem remains daunting, and investment in the space is now a major line item on tech budgets. The global cybersecurity market is worth roughly twice its “physical” security counterpart – around $202 billion in 2022, and estimated to grow at a compound annual growth rate of 12.3 per cent from 2023 to 2030 (source: Grand View Research).
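
That growth rate implies roughly a two-and-a-half-fold increase by the end of the decade. The back-of-the-envelope arithmetic below simply compounds the quoted figures; it is not a Grand View Research projection.

```python
# Back-of-the-envelope arithmetic only, using the figures quoted above.
base_2022 = 202.0          # global market, $ billion, 2022
cagr = 0.123               # compound annual growth rate, 2023-2030
years = 8                  # 2023 through 2030 inclusive
implied_2030 = base_2022 * (1 + cagr) ** years
print(f"Implied 2030 market size: ~${implied_2030:.0f}bn")  # roughly $511bn
```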

Getting cybersecurity right is crucial if the business model sometimes known as “open banking” is to take off. The term describes how financial data can be shared between banks and third-party service providers using application programming interfaces (APIs). Traditionally, banks have kept customer financial data within their own closed systems. A concern is that if crooks can penetrate the “walls” of one system, clients could be vulnerable. 
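
As a purely illustrative sketch of what that API-based sharing looks like from the third-party side – the endpoint, token and field names below are invented, though they loosely echo the shape of published open banking account APIs – a provider might pull a customer’s account list only after the customer has granted consent through the bank’s own authorization flow:

```python
# Hypothetical example: the endpoint, token and field names are invented,
# loosely echoing the shape of published open banking account APIs.
import requests

# In a real flow this token comes from the bank's OAuth consent process,
# after the customer authorizes the third-party provider.
ACCESS_TOKEN = "token-from-the-banks-consent-flow"

response = requests.get(
    "https://api.example-bank.com/open-banking/v3.1/accounts",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=10,
)
response.raise_for_status()

for account in response.json().get("Data", {}).get("Account", []):
    print(account.get("AccountId"), account.get("Currency"))
```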
 
Getting craftier
“They [attackers] have gotten very creative in ways to penetrate accounts, IT networks and people’s homes,” Mona Manahi, managing director, head of Personal CFO Services at Geller, told Family Wealth Report.

Both Manahi and Bush are, nonetheless, far from being AI pessimists.

They said that despite the recent hype around artificial intelligence and machine learning, they are not new concepts, especially in the technology and cybersecurity space. Geller, for example, has used AI and ML within a variety of IT/cyber applications (such as spam filters and security monitoring tools) for years.
 
However, they said, what is different today is that advances in basic AI/ML functionality and data access are making these tools useful across a broader range of applications. But that doesn’t mean AI/ML can be deployed on everything and produce accurate results.

Artificial intelligence and machine learning can be used successfully to increase efficiency, spark innovation, spot items that might otherwise be overlooked, and perform other genuinely useful tasks. However, these tools must be used carefully to avoid “unintended consequences.”

Working from home
Working from home was a growing trend even before the pandemic hit in early 2020 – and it creates new vulnerabilities.

Another Hong Kong-based lawyer, Wendy Wong, partner at Simmons & Simmons, reflected on the change and what it meant for firms as well as their end clients.

“From an employment perspective, we advise clients to implement a remote working policy or data security policy which clearly sets out employees’ obligations when working remotely, such as locking a device when not in use, ensuring that data is encrypted and not disclosing any passwords, PINs or encryption keys to third parties,” Wong said. “Regular training of employees would also be important. Another step that employers can consider taking is to revise the employment contract to expressly provide that employees must always comply with the relevant policies when working remotely.”

Talk of the workplace leads back to Geller, and an insight from Bush that families need to be aware of whom they employ in their households.

“A difference between an average family and UHNW family is that they [UHNWs] have people working for them who are not family members,” Bush said, talking of the importance of background checks on staff, ongoing vetting and monitoring, and being aware of what sort of systems children and relatives use, etc. 

And maybe there is a need to keep the younger generation aware of the risks, he said.

“The younger generation often feels more comfortable with technology, and they aren’t paying attention to the threats of tech,” Bush said.
