Researchers Create An Analytical Tool To Determine How Much AI Knows

By Toby T

Over the past few months, there has been a noticeable spike in interest in generative artificial intelligence (AI) systems across different fields, and a great deal of work has gone into improving these systems.

Source: @markusspiske/Unsplash

Recently, researchers at the University of Surrey developed software that can discern just how much an AI system has learned from an organization’s database. According to the researchers, the software can help improve a company’s online security by clarifying what an AI system has learned and showing how vulnerable the company is to sensitive data exposure through the technology.

Besides this, the software can also determine whether AI can exploit possible flaws in a company’s code. For instance, in the case of a digital banking platform, it could determine whether an AI system has found loopholes in the app’s code that could be exploited by hackers.

Speaking on the tool, Dr. Solofomampionona Fortunat Rajaona, a research fellow and the lead author of the research paper, explained that AI systems have been woven into many aspects of daily life, from robotics to transportation and more. And with tools like ChatGPT, humans have begun to interact more directly with AI systems.

However, one major problem with these systems is that there has been no general way to determine how much they actually know.

Source: @campaign_creators/Unsplash

As Dr. Rajaona explained, the verification software can deduce just how much an AI system has learned from its interactions with humans, and whether that knowledge is sufficient or excessive. In the latter case, the system may be capable of compromising users’ privacy across the board.

With this ability, the researchers believe they can help organizations use AI safely in secure settings. The study has already gained prominence, with judges crowning it the best paper at the International Symposium on Formal Methods.