A Group Of Aussie Women Have Created A New Chatbot To Help Women Of Colour Report Racism

Maya cares

The conversation around online chatbots is often a fearful one, full of concerns about AI takeover and cheating students. But what about all the ways chatbots can actually improve our lives? Enter Maya, a new chatbot designed specifically to support women of colour who experience racism.

If you’re a woman of colour, and you’ve experienced racism, did you report it? Did you take further action after the experience? Did you even know where to start when it came to seeking support?

Priyanka Ashraf didn’t when she was verbally abused at a supermarket and told to go back to her country, an experience many women of colour, including myself, can relate to.

“It never even occurred to me that I should report it, and it was only much later when I spoke to a friend who asked me, ‘Why didn’t you report it?’” she told ABC News.

“I used to be a practising lawyer, with access to information and knowledge.

“If I didn’t know how to report it, then clearly people who have less access to that information [face] even more barriers?”

And so Ashraf created Maya Cares, a chatbot made specifically to help Aboriginal and Torres Strait Islander women (and any other women of colour) “heal from and report racism”.

The chatbot is designed to talk to you like “your digital big sister”. It asks questions about what happened, and then depending on the nature of the incident, will offer options on how and where to report racism and direct you to counselling and support resources.

A young Muslim woman, who goes by the name Ayan, has been beta-testing the chatbot after her own experience of racism, in which she suspected a prospective employer rejected her because of her hijab.

“I got a call saying: ‘Look the manager really likes you, but we were wondering if you could remove your headscarf or headwrap?’” she recalled, per ABC News.

She said the employer claimed white people in the area would be offended by her hijab, but she refused to remove it. She was then told there was an error in her application and she was rejected for the job.

“I felt that the reason I didn’t get the callback is because I refused to take my headscarf off,” she said.

However, since the conversation took place over the phone and she was officially rejected over a supposed error in her application, Ayan was left unsure of how to proceed. Maya Cares has been helpful to her, and she believes it will be helpful to others.

“It’s giving the user the option to decide how they want to deal with it,” she said.

“Some people might want to see a counsellor or psychologist, other [people] might want to talk to others about it and get some validation … and some want to take more action.”

The other benefit of Maya Cares is that it would also act as a database of racist incidents — an area where statistics are lacking, given people are unlikely to report experiencing racism.

“There are big missing patches of data of what racism really looks like and that constantly feeds into this narrative that racism doesn’t really happen,” said Wendi Qi Zhang, service designer of Maya Cares, per ABC News.

“What we’re really hoping for is, like, a big surge of undeniable database and [an] evidence base to give to different policy-makers.”

A wide array of answers will also train Maya Cares to give more specific and personalised responses to people experiencing racism, which is ideal given that different types of racism (in the workplace versus at a supermarket, for example) require different avenues of reporting.

All I know is that, as a fellow woman of colour, I have never officially reported any of the racism I have experienced in my 24 years of life.

From security guards at pubs telling me to remove my hijab, to random strangers on the street telling me to go back to where I came from, to vitriolic racist abuse sent straight to my inbox, it’s rough out here — and I dealt with most of that alone.

Let’s hope Maya Cares changes that.
