Week 4

By Adrianna

Thanks Jen and Sean for all the readings! I especially enjoyed “Dismantling Tech as a Bad Romance in Its Continued Master-Slave Relationship”. The Bina48 section gave me a lot to consider and I went down a rabbit hole researching other takes on the term “Black Siri”. In my search I came across this GIF (sorry! I could only attach its image format) and article. In a nutshell, it discusses how some people have felt validated by the updated “diverse” versions of virtual assistant voices. Yet, it has also brought up concerns about gendered and racial biases. Is it really better to have a woman or a person of color act as your virtual assistant? I’m not sure where I stand with this. Either way, the article reinforces what the Bad Romance piece tells us. It’s essential that we start opening up spaces for more diverse persons in these positions of power, so that they (with their own lived experiences) can improve the technology we have and hopefully even make it accessible to more people.

4 thoughts on “Week 4”

  1. Jen Hoyer (she/her)

    Thank you so much for sharing that article, Adrianna! It makes me think about the ways we unconsciously relate to technology as human even if we know it’s not. (I’ve often felt weird when friends choose a specific siri voice because it sounds “sexy,” for example, but then, what does it *really* mean to objectify a literal object…)

  2. Brieanna Scolaro (They)

    Adrianna, this concept of racial/gender biases in tech voices is extremely interesting. I have thought about this a lot, but from a beneficial perspective: what voices are available for meditation apps. In my experience the default is a robotic voice that sounds like a White man, and many of my clients, as victims of various types of trauma, do not feel soothed by the voice of a man. Now, the apps that I use have the option to select your voice, and more and more meditation + fitness + yoga classes are including other types of voices, which can make folx feel welcomed into a typically White-dominated space. This is where it is validating. However, what is the inverse? What happens when this is not the case, or when it is a way to reinforce systems of power and privilege? What happens if Google Maps defaults to the voice of a Black woman, and folx immediately have an internal reaction and seek to change it? What happens if the voice of a virtual assistant is that of a man (not what we would expect) versus that of a woman (echoing traditional secretarial roles of the past)?

  3. Sean Patrick Palmer

    For the tenth anniversary of the establishment of our major, we had a guest speaker come in. He was from Cameroon, and he pointed out that most of the emojis that have been created were designed for people in the West, not just in the skin tone of the emojis but also in the facial expressions. Many of the facial expressions he saw in them would be, if not outright offensive, at least not well-received in his culture.

    I had never thought about the facial expression aspect of it before then, though once he said it, it made perfect sense.

    (I wish I remembered this person’s name. It was an interesting lecture, which doesn’t always happen, but it was also six or seven years ago.)

  4. Katina Rogers (she/her)

    Such a good point, Adrianna—and I think it’s another way of examining the false perception of neutrality in tech (which is often a very white version of what constitutes “neutrality”, as Sean’s example about emojis illustrates so well).
