New Free Tool Paves the Way for Safe AI Use
The University of Glasgow, which leads the PHAWM project, has launched a free tool to support the safe use of AI applications. The framework lets users audit an AI app's strengths and weaknesses, supporting compliance with regulations such as the EU's AI Act. By involving affected audiences in the audit process, the tool aims to deliver better AI outcomes across sectors.
Country: United Kingdom
Amid the rapid adoption of AI technology, the University of Glasgow has launched a free tool developed by the PHAWM project to assist in the safe use of AI applications. The tool is designed to help organizations, policymakers, and the public harness AI while identifying potential harms.
Developed in partnership with the University of Strathclyde, the initiative addresses the urgent need for comprehensive risk assessment of AI applications. In alignment with the EU's AI Act, which came into force in 2024, the tool enables users to conduct thorough audits of AI apps, balancing innovation with protection against harmful consequences.
Designed to include people who are usually excluded from the audit process, the tool aims to produce better outcomes for end users in sectors such as health and cultural heritage. Backed by £3.5 million from Responsible AI UK, the project brings together more than 30 researchers and multiple partners, and offers extensive training to support effective adoption of the tools.