UN Rights Council Calls for AI Transparency

The UN Human Rights Council called last week (Friday) for transparency on the risks of artificial intelligence and for responsible use of the data harvested by AI.

Since generative AI content became publicly available, its use has grown rapidly, and authorities have been scrambling to figure out how to regulate such chatbots and ensure the technology does not endanger humanity.

According to TechXplore, the UN’s top rights body adopted a resolution calling for the “adequate explainability” of AI-supported decisions, considering “human rights risks arising from these technologies”, as well as calling for the use of data in AI systems to be in line with international human rights law.

The resolution was co-sponsored by Austria, Brazil, Denmark, Morocco, Singapore and South Korea, and adopted by consensus in the 47-country council.

Yun Seong-deok, the South Korean ambassador, said the resolution underlined the importance of “ensuring, promoting and protecting human rights throughout the life cycle of artificial intelligence systems”.

Michele Taylor, US ambassador, called it a step forward for the council, saying “This resolution recognizes both the harms and benefits that new emerging digital technologies, especially artificial intelligence, can bring to the field of human rights.”

Since its launch last year, ChatGPT has become a global sensation for its ability to produce human-like content, which can include essays, poems and conversations from simple prompts.

Nevertheless, although AI systems are extremely helpful in many fields, and could even save lives in medical diagnosis, experts and users alike fear they could be exploited by authoritarian regimes to conduct mass surveillance of citizens.

As Europe grapples with this new technology and UN rules slowly take shape, it remains to be seen how the situation unfolds.