

Key details

  • 12 January 2022
  • RCA

Read time

  • 2 minutes

Our MA/MSc Global Innovation Design is all about design as a catalyst for social change, with students looking at how global issues can be resolved using new systems and technologies. Some of the biggest social movements of recent years, like Black Lives Matter and the Arab Spring, have spread globally through digital spaces. At the same time, new technologies in those spaces have been used to sanction state violence.

For the AI-Witness module last term, our students turned their attention to how technology might help the most vulnerable people in our world, challenging themselves to consider ways in which vulnerable communities and legal experts could use big data and AI to deter human rights abuses and increase convictions for them.

Uncover: Avoiding Modern Slavery in Online Shopping


Uncover is a browser extension that uses machine learning to show shoppers how much modern slavery contributed to the items they’re browsing and to suggest alternatives. With 40 million adults in slave labour worldwide and 150 million child labourers, Linda and Ali wanted to create a product that takes the pressure off the consumer and puts it back on the producers of goods.

Using machine learning and data, Uncover gathers evidence from reports, media sentiment and a company’s website to score retailers on their efforts to keep slave labour out of their products – giving consumers an easy way to choose ethical alternatives.

Mocra: Using AI to Report Abuse

Mocra is an AI-infused platform that aims to help caregivers and professionals improve their communication skills when children disclose sexual abuse to them. Using China as their case study, Cathie and Luxian designed a platform to address under-reporting in rural areas, where only 34.06% of incidents are reported due to anxieties around power relations.

Mocra is a supportive tool that helps caregivers recognise and evaluate the appropriateness of language used during disclosure. Using four key indicators – ‘preconception’, ‘tendency to evaluate’, ‘tone’ and ‘verbal language’ – the portal offers instant suggestions for wording to smooth communication and support an empathic response.

Clear Card: Machine Learning Tool for Asylum Seekers

Clear Card is a narrative reconstruction tool that helps asylum seekers give testimony while automatically sourcing third party evidence relevant to their claim. Intended as a guide to the emotionally demanding asylum interview process, Clear Card helps users give the right evidence to the authorities using machine learning to identify keywords and corroborate them with trusted sources.

Through a visual timeline of the asylum journey, Clear Card aims to check that claims are a legal fit for the asylum process – a checklist, or Clear Card, is produced for the user to ensure they have met the criteria. Arnau, Aura and Ahad designed the system in response to claims that up to 80% of refused asylum cases are overturned due to a flawed credibility assessment.

Seen:Voices: Data as Self-expression for Refugee Children

Seen:Voices is an art-based mental health support system for refugee children aged 5–10, aiming to represent their unheard voices as a collective experience. Initially intended as a way of exploring traumatic experiences in a non-threatening manner, Louise and Tong also envision the collection of works as a way of supporting the project whilst fulfilling its aims. Gathering the images provides qualitative data for reports on refugee children's experiences, whilst public exhibitions of work by anonymous refugee children raise awareness. A feedback loop is central to the project, with sales from exhibitions and NFTs of artworks going back into funds for refugee charities.

AI-Witness was led by Senior Tutor Dr John Stevens and Visiting Tutor Maxim Dedushkov, with guest contributors including Professor Gareth Loudon (Head of Programme, IDE/GID, RCA), Professor Yvonne MacDermott-Rees (Professor of Law, Swansea University), Professor Lex Paulson (Executive Director, UM6P School of Collective Intelligence), Maria Fernanda Felix de la Luz (UN Development Program), James Maltby (Save the Children UK) and Liz O’Driscoll (Head of Innovation, Civica).

Interested in how technology could create a more humane world?

Find out about our MA/MSc Global Innovation Design.
Fernanda Dobal