92% of firms do not oversee third-party use of AI, according to survey


An overwhelming number of advisory firms have not adopted policies and procedures related to the use of AI among third parties and service providers, according to results from a study conducted by compliance firm ACA Group and the National Association of Compliance Professionals.

Overall, the survey found that 92% of respondents do not have policies in place for the use of AI by third parties and service providers, and only 32% have an AI governance committee or group. Additionally, nearly seven in 10 firms have not developed or implemented policies and procedures governing the use of artificial intelligence by employees, while only 18% have a formal testing system for AI tools.

The results showed that while there is “broad interest” in AI across the space, there is also a “clear disconnect when it comes to putting the necessary safeguards in place,” according to NSCP Executive Director Lisa Crossley.

The survey was conducted online in June and July, with responses from 219 compliance professionals detailing how their firms use AI. About 40% of respondents were from firms with between 11 and 50 employees and with $1 billion to $10 billion in assets under management.

Although a previous ACA Group survey this year found that 64% of advisory firms had no plans to introduce AI tools, that survey focused on the use of AI for client interactions. According to Aaron Pinnick, senior manager of thought leadership at ACA, the current survey covers both internal and external uses of AI.

According to results from the current survey, 50% of respondents did not have any policies and procedures for employee use of AI finalized or in progress, while 18% responded that they were “in the process of drafting” such policies.

While 67% of respondents said they were using AI to “increase efficiency in compliance processes,” 68% of AI users reported seeing “no impact” on the efficiency of their compliance programs (survey respondents indicated the most common uses of AI were research, marketing, compliance, risk management and operations support).

Compliance professionals reported that the two biggest barriers to adopting AI tools remained cybersecurity or privacy concerns and uncertainty about regulations and examinations, at 45% and 42%, respectively (while a lack of AI-savvy talent ranked third).

About 50% of respondents said their employee training covered AI cyber risks and “appropriate use of AI and data protection.” At the same time, some firms encrypted data and performed “regular vulnerability and penetration testing” on AI tools. Some 44% of firms reported allowing only “non-public” AI tools, while 33% of compliance professionals said they conduct a “privacy impact assessment” on a tool before their firm adopts it.

The results of the survey come a week after the SEC's Division of Examinations released its priorities for 2025, which indicated that examiners would be looking into advisers' integration of AI into their operations, including portfolio management, trading, marketing and compliance, as well as their disclosures to investors. Along with a previously reported SEC sweep, the survey is the latest indication of regulators' growing focus on how advisers use AI in day-to-day practice.
