In a move to safeguard financial markets and protect the retirement savings system from being unduly influenced by AI-generated content, a bipartisan pair of senators has introduced legislation to address the potential threats.
Virginia Democratic Sen. Mark Warner and Louisiana Republican Sen. John Kennedy, both members of the Senate Committee on Banking, Housing, and Urban Affairs, introduced the Financial Artificial Intelligence Risk Reduction Act (S. 3554) on Dec. 18. The act directs financial regulators to address uses of AI-generated content that could disrupt financial markets.
Institutions could face hefty penalties for using deepfakes or other artificial intelligence tools to illegally manipulate markets or engage in securities fraud. More specifically, the legislation requires the Financial Stability Oversight Council (FSOC) to coordinate financial regulators’ response to threats that AI poses to market stability, including the use of deepfakes by malign actors and other AI-related practices, such as trading algorithms, that could undermine the financial system.
“AI has tremendous potential but also enormous disruptive power across a variety of fields and industries, perhaps none more so than our financial markets,” Sen. Warner said in a statement introducing the legislation. “The time to address those vulnerabilities is now.”
The legislation directs FSOC to identify gaps in existing regulations and examination standards that could hinder effective responses to AI threats and to implement specific recommendations to address those gaps. In fact, in its annual report released Dec. 14, the FSOC for the first time identified the use of AI in financial services as a vulnerability in the financial system.
While AI offers potential benefits such as reducing costs and improving efficiency, the report warns that its use can introduce certain risks, including safety-and-soundness risks such as cyber and model risks.
The bill also directs FSOC to coordinate various government efforts to study AI’s risks and to identify concrete recommendations that agencies could implement as rules or regulations, the senators said in a statement.
“AI is moving quickly, and our laws should do the same to prevent AI manipulation from rattling our financial markets,” Kennedy said. “Our bill would help ensure that AI threats do not put Americans’ investments and retirement dreams at risk.”
In addition to allowing the Securities and Exchange Commission to seek triple penalties against companies that use AI to violate the agency’s rules, the legislation would authorize multiple regulatory agencies to explore new ways to govern AI’s use in the financial sector.
The legislation follows earlier congressional attention to the issue: on September 20, 2023, the United States Senate Committee on Banking, Housing, and Urban Affairs held a hearing on “Artificial Intelligence in Financial Services.”
The hearing examined current and future applications of AI in the financial services industry, with the aim of assessing the technology’s potential risks and benefits in areas such as credit underwriting, algorithmic trading, fraud prevention, and consumer lending.
The hearing highlighted the need for policies and regulations to ensure that AI is used responsibly and for the benefit of consumers and society. Witnesses and senators emphasized the importance of transparency, accountability, and finding the right balance between regulation and innovation.