Victims of violence who seek support often have to recount their experience several times, to multiple people, which can exacerbate trauma.
A project in Mexico proposes to address this with the help of artificial intelligence (AI).
Figures from national statistics agency INEGI show that two-thirds of women in Mexico have experienced some form of violence, with almost 44 percent suffering abuse from a partner.
The project is exploring the use of AI-based speech-to-text transcription to reduce the need to share distressing details over and over again, and to speed up the reporting process.
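The project has not yet published technical details of its transcription pipeline. As a purely illustrative sketch of the kind of step involved, a reporting tool could run a recorded statement through an open-source speech-to-text model such as Whisper; the model size, file name and language setting below are assumptions, not the project's actual stack:

```python
# Illustrative sketch only: fAIr LAC Jalisco has not named its
# speech-to-text system. Whisper is used here as a stand-in.
import whisper

model = whisper.load_model("base")   # small multilingual model
result = model.transcribe(
    "testimony.mp3",                 # hypothetical recording of a statement
    language="es",                   # Spanish-language audio
)
print(result["text"])                # transcript that staff could reuse,
                                     # so the victim recounts it only once
```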
Guadalajara pilot programme
A pilot in the City of Guadalajara is in the planning phase. Information would be stored in a platform accessible by authorised public servants. AI could also be used to automate assessments of the level of violence to ensure women receive the right support quickly. In future, it could be developed to simplify the process of submitting a legal report, which is currently lengthy.
“The proposal is to fill out the required forms automatically to make the experience shorter, and to avoid revictimising women,” said Juan Roberto Hernandez, General Coordinator at fAIr LAC Jalisco, which is leading the project. He was speaking at the recent capitaltribunenews.com Institute City Leadership Forum in Barcelona.
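Neither the assessment step nor the form-filling has been specified publicly. As a hedged sketch of one common pattern, a multilingual zero-shot classifier could estimate a risk level from the transcript and pre-populate a draft record for an authorised official to verify; the model, labels and field names below are assumptions for illustration only:

```python
# Illustrative sketch: not the project's published method.
from transformers import pipeline

# Multilingual zero-shot classifier (model choice is an assumption).
classifier = pipeline(
    "zero-shot-classification",
    model="joeddav/xlm-roberta-large-xnli",
)

transcript = "Texto de la declaración transcrita."  # output of the speech-to-text step

# Hypothetical risk labels; the real assessment criteria are not public.
risk_levels = ["riesgo bajo", "riesgo medio", "riesgo alto"]
scores = classifier(transcript, candidate_labels=risk_levels)

draft_report = {
    "transcript": transcript,
    "assessed_risk": scores["labels"][0],        # highest-scoring label
    "confidence": round(scores["scores"][0], 2),
    "reviewed_by_staff": False,                  # a human must confirm before any action
}
```

In any real deployment, output like this would only ever be a prompt for a trained official to review, not an automatic decision.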
Regional alliance fAIr LAC Jalisco is an Inter-American Development Bank (IDB) initiative that brings together the public and private sectors and civil society to promote the ethical and responsible use of AI in Latin America. It includes Tecnológico de Monterrey university, innovation agency C Minds, and the Mexican state of Jalisco. Projects are particularly focused on social services.
Public trust
Despite AI's potential benefits, there is growing awareness of the need for caution when using it in public services. Moreover, trust – including in government – is lower in Latin America and the Caribbean than in any other region of the world, according to an IDB report.
“When we are developing pilots a key question is how to protect people’s digital rights, privacy and confidentiality,” said Hernandez. “There were many resources telling us that this is very important but nothing that showed us how.”
Other fAIr LAC Jalisco pilots have used AI to identify the factors behind pupils dropping out of junior high school and to detect and prevent diabetic retinopathy. The diabetes trial, which concluded recently, ran in three health centres; earlier screening of 1,000 participants detected more than 100 cases.
As a first step, fAIr LAC Jalisco established an Ethical Risks and Governance Committee made up of experts in ethics, law, data governance and cybersecurity. The committee advises on and oversees the implementation of AI use cases.
Toolkit
Hernandez said existing tools aren’t detailed enough.
“They didn’t cover all the aspects of digital rights and ethical governance,” he said.
The partners are putting together a toolkit based on best practices from organisations such as IDB and UNESCO.
It includes an ethical assessment matrix, an ethics report, an implementation matrix, informed consent and privacy notices, and data suspension procedure templates.
When communicating with the public, a key challenge, said Hernandez, is “finding the balance” between not overloading people with technical information and still showing them how privacy, security and ethics are being managed effectively.
“We need to analyse every situation to find the best moment and way to explain,” he said.