New Free Tool Paves the Way for Safe AI Use

The University of Glasgow, which leads the PHAWM project, has launched a free tool to support the safe use of AI applications. The framework lets users audit the strengths and weaknesses of AI apps, supporting compliance with regulations such as the EU's AI Act, and involves affected audiences across sectors to foster better AI outcomes.


Devdiscourse News Desk | London | Updated: 19-02-2026 20:10 IST | Created: 19-02-2026 20:10 IST

Amid the rapid adoption of AI technology, the University of Glasgow has launched a free tool developed by the PHAWM project to assist in the safe use of AI applications. The tool is designed to help organizations, policymakers, and the public harness AI while identifying potential harms.

Developed in partnership with the University of Strathclyde, the initiative addresses the urgent need for comprehensive risk assessment of AI applications. In line with the EU's AI Act, which came into effect in 2024, the tool enables users to conduct thorough audits of AI apps, balancing innovation with protection against negative consequences.

Designed to include people usually excluded from the audit process, the tool aims to produce better outcomes for end users in sectors such as health and cultural heritage. Backed by £3.5 million from Responsible AI UK, the project involves more than 30 researchers and multiple partners, and offers extensive training to support effective adoption of the tools.
