by Viviana Bono — 30 April 2026
Topics: Report
On April 16th, I attended the ETAPS 2026 session “Diversity, Equity, and Inclusion”, an event to “promote the fair treatment and full participation of all individuals, especially those historically underrepresented or discriminated against”.
The first thing that came to my mind during this event is that the body is our I/O device. Through it, we exchange information with the environment and with other individuals. We express ourselves by means of what we do, but also by means of the way we look. Anyone who does not conform to the dominant standard of male, white, and in his prime has experienced discrimination of some sort at least once in life: women, people of colour, disabled people, non-binary people, the elderly, people coming from underdeveloped countries, and I am sure I am missing someone in this (unordered) list. Often, discrimination arises just from non-verbal communication, and sometimes political correctness hides it. I am not against being politically correct, when it is useful to stop and think before hurting others. However, it can be a weapon to conceal biases, and biases imply distrust.
Professor Viviana Patti of the University of Turin, in her talk “Is artificial intelligence neutral? Gen(d)erally no. Reimagining research in STEM through epistemic justice and intersectionality”, gave the name “testimonial injustice” (introduced by Miranda Fricker) to the phenomenon of giving less weight to someone’s word not because of the evidence, but because of prejudices about their identity. Having the precise words to express injustices is of paramount importance, which is why Viviana also talked about “hermeneutical injustice” (again introduced by Miranda Fricker), which describes situations where individuals or groups lack the conceptual tools to make sense of and communicate their own experiences because the dominant frameworks fail to include them.
The second thing I thought about is how important it is to have a job, possibly a fulfilling one. Work makes people economically and socially independent, and proud of themselves, because it is a means of personal promotion and growth. Unfortunately, unemployment is a general problem that hits minorities in particular. One cannot talk about work without mentioning technology in general, and artificial intelligence in particular. However, is AI neutral?
To give some perspective in this direction, there was the talk by Professor Rosa Meo of the University of Turin, “Detecting a disparate treatment of genders in job interviews with generative AI chatbot”. The experiment Rosa and her team conducted was to have a (male) student act as a job seeker, once as a man and once as a woman, with a chatbot playing the role of the interviewer. When interviewing the female persona, the chatbot showed all the typical prejudices about women, notably the bias about their supposed lack of reliability and the possibility that at a certain point the female worker would privilege her family life over her loyalty to the job. This experiment is only at the beginning: the audience offered suggestions, such as making the work more interdisciplinary by involving psychologists and sociologists, having people of different genders act in the role of the interviewee, and trying the same approach to measure the chatbot’s reaction to other underrepresented groups.
The presentation “AI: Artificially Inaccessible”, given by Dajana Gioffrè (Chief Vision Officer at Accessiway) and Jacopo Deyla (Chief Accessibility Officer at Accessiway), brought together my two key issues, the body and work. Dajana and Jacopo represent a company that turns digital accessibility from a one-off project into an ongoing practice, with a platform that combines certified expert audits and AI automation. At Accessiway, people can get a job not only despite their body, but because of their body, since no one understands the requirements of a category of customers with special needs better than those customers themselves. In their talk, they highlighted the increasing importance of AI in coding, which makes it possible to build websites and applications from specifications in natural language (vibe coding). However, they underlined the necessity of training AIs in such a way that they learn to take accessibility into account: “every prompt is a choice for inclusion”.
My last observation concerns intersectionality. Together with interdisciplinarity, it is the key to keeping attention on bias high. Since nobody is immune from prejudice, multiple, diverse voices can contribute to equity and inclusion, towards making the world a place where everyone belongs.