A cross-party group of UK lawmakers is calling for urgent AI-focused stress testing in the financial sector. They warn that without early action, artificial intelligence could create market disruptions and harm consumers if left unchecked.
The warning comes in a Treasury Select Committee report, which states that over 75% of UK financial institutions now use AI in areas like trading, lending, and insurance, and argues that the technology is spreading through financial services without proper oversight.
These tools support fast decision-making and automate key functions, yet most firms have not tested how their systems behave under pressure. Lawmakers say this raises the risk of sudden system failures: because AI tools are often interconnected across platforms, a single failure could cascade through many institutions at once.
The Committee criticised the Bank of England and the Financial Conduct Authority (FCA) for adopting a “wait-and-see” stance. The report warns that the rapid pace of AI adoption is outstripping the response from regulators, putting markets and consumers at risk.
The Committee argued that waiting for problems to appear may leave no time to fix them, and that stress testing would allow early detection of weaknesses in AI systems before they trigger larger disruptions in the financial sector.
The report recommends that the Bank of England and FCA introduce AI-specific stress tests similar to those already used for banks. These tests would simulate market pressure and assess how AI systems respond. Lawmakers also call on the FCA to issue practical guidance by the end of the year.
This guidance should explain how current consumer protection rules apply when decisions are made by automated systems, and clearly state who is responsible when things go wrong. The Committee wrote, “Only through such trials can authorities see exactly how algorithms might spark disruption or amplify turmoil once markets shift.”
Lawmakers also raised concerns over the lack of oversight for technology providers that support the financial system. The Critical Third Parties Regime was created to give the Bank of England and FCA powers over companies such as cloud services and AI vendors.
Yet no firms have been designated under this regime, despite its introduction more than a year ago. Lawmakers warn that large parts of the sector depend on a small number of providers, such as Amazon Web Services and Google Cloud, so a technical outage at one of these firms could trigger failures across several financial institutions at once.
Chair of the Treasury Committee, Dame Meg Hillier, said, “The use of AI in the City has quickly become widespread and it is the responsibility of the Bank of England, the FCA and the Government to ensure the safety mechanisms within the system keeps pace.” She added that she is not confident the current system could manage a major AI failure.
The Committee is pressing regulators to act now, not later, by testing AI systems under stress and clearly defining responsibilities when systems fail. The report encourages cooperation between firms and regulators to manage AI risks while ensuring the UK benefits from emerging technologies.
The post Treasury Committee Urges AI Oversight to Avoid UK Market Disruption Risks appeared first on CoinCentral.


