Things That Matter

There’s a Twitter Bot That Will Correct You If You Tweet “Illegal Immigrant”

If you’re going to use the phrase “illegal immigrant” on Twitter, expect a prompt reply from @DroptheIBot. The brainchild of Fusion reporters Patrick Hogan and Jorge Rivas, @DroptheIBot is programmed to recognize when someone tweets the words “illegal immigrant,” then reply with a suggestion to use “undocumented immigrant” or “unauthorized immigrant” instead.
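The bot’s actual source code isn’t shown here, but the match-and-reply behavior the article describes can be sketched in a few lines of Python. Everything below (the function name, the exact reply wording) is an illustrative assumption, not @DroptheIBot’s real code:

```python
import re

# Case-insensitive match for "illegal immigrant" or "illegal immigrants".
PHRASE = re.compile(r"\billegal\s+immigrants?\b", re.IGNORECASE)

# Hypothetical reply text, paraphrasing the bot's suggestion.
REPLY = ("People aren't illegal. Try 'undocumented immigrant' "
         "or 'unauthorized immigrant' instead.")

def reply_for(tweet_text):
    """Return the canned correction if the tweet uses the phrase, else None."""
    return REPLY if PHRASE.search(tweet_text) else None
```

A real bot would pair a check like this with Twitter’s streaming or search API to find matching tweets and then post the reply, which is exactly the kind of automated, repetitive replying that Twitter’s spam rules frown on.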

@DroptheIBot only has one thing to say.

https://twitter.com/DroptheIBot/status/627022713811521536

But people who get called out are getting pissed.

Like, insult-slinging pissed because bots have feelings, you know?

Some have tried to fight the bot.

Others have taken the opportunity to learn and grow.

Several Twitter users are excited that someone has taken a stand.

Even undocumented activist Julissa Arce gave her praises.

Yet, some still don’t get it’s a bot.

Update:

The @DroptheIBot account has been suspended by Twitter. According to Twitter, accounts are suspended for three reasons: sending spam, being hacked, or posting abusive tweets. It is anyone’s guess which applied to @DroptheIBot, but its messages were likely flagged by users as spam.

What do you think about @DroptheIBot’s attempt to correct Twitter users? mitú wants to know. Let us know in the comments below!

Notice any needed corrections? Please email us at corrections@wearemitu.com

Twitter’s AIs Prefer Ted Cruz With Boobs And White Skin Over Black

Things That Matter


Ever notice how, on some social platforms like Twitter or Instagram, you’re mysteriously unable to crop your display images yourself? That’s because Twitter prefers to let its algorithms make the decision. Over the weekend, users on Twitter discovered the dangers of letting an algorithm crop your images for you.

Education tech researcher Colin Madland drew attention to the issue while pointing out that the video-calling program Zoom often cropped out the head of his Black colleague on calls.

It didn’t take long for Madland and other users to discover that Twitter’s AI shows a similar bias when deciding which faces to prioritize. In short, the social platform’s cropping algorithm prefers white faces over Black ones.
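Twitter’s preview crops are chosen by a learned saliency model; its internals aren’t described in this article, so the following is only a toy illustration of the mechanism. A cropper keeps the window whose model-assigned saliency scores sum highest, which means any racial skew in the scores flows directly into which face survives the crop:

```python
def best_crop(saliency, window):
    """Return the start index of the fixed-width window with the highest
    total saliency (a toy stand-in for a saliency-based image cropper)."""
    best_start, best_score = 0, float("-inf")
    for start in range(len(saliency) - window + 1):
        score = sum(saliency[start:start + window])
        if score > best_score:
            best_start, best_score = start, score
    return best_start

# If a biased model scores one face higher, the crop simply follows the
# scores: here the "face" at indices 2-3 wins the preview window.
scores = [1, 1, 9, 9, 1, 1]
```

The point of the controversy isn’t this selection step, which is mechanical, but the scores feeding it: a model trained on skewed data can systematically rate some faces as less “salient” than others.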

In response to the discoveries, a Twitter spokesperson acknowledged that the company was looking into the issue: “Our team did test for bias before shipping the model and did not find evidence of racial or gender bias in our testing. But it’s clear from these examples that we’ve got more analysis to do. We’re looking into this and will continue to share what we learn and what actions we take,” they stated.

Of course, Madland’s discovery is nothing new. In 2019, test results from the National Institute of Standards and Technology revealed that some of the leading facial-recognition algorithms were much more likely to confuse the faces of Black women than those of white women, or of Black or white men. “The NIST test challenged algorithms to verify that two photos showed the same face, similar to how a border agent would check passports,” Wired points out. “At sensitivity settings where Idemia’s algorithms falsely matched different white women’s faces at a rate of one in 10,000, it falsely matched black women’s faces about once in 1,000—10 times more frequently. A one in 10,000 false match rate is often used to evaluate facial recognition systems.”
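The arithmetic behind Wired’s “10 times more frequently” checks out. Assuming the two quoted rates are taken as exact, a quick computation with exact fractions shows the gap:

```python
from fractions import Fraction

# False-match rates quoted by Wired, per comparison.
white_fmr = Fraction(1, 10_000)  # white women's faces
black_fmr = Fraction(1, 1_000)   # Black women's faces

# How many times more often Black women's faces were falsely matched.
ratio = black_fmr / white_fmr

# Expected false matches in 10,000 comparisons at each rate.
white_expected = 10_000 * white_fmr
black_expected = 10_000 * black_fmr
```

Using `Fraction` avoids floating-point rounding, so the ratio comes out as exactly 10: one expected false match per 10,000 comparisons for white women versus ten for Black women.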

Still, it didn’t take long for users on the platform to ask what other physical preferences Twitter has.

Turns out the AIs prefer Ted Cruz with large anime breasts over a normal-looking Ted Cruz.


The user who tested the image of Cruz found that Twitter’s back-end algorithm, which selects what part of a picture to showcase in the preview, ultimately chose the image of Cruz with a large anime chest both times.

Twitter having massive problems is nothing new.

For a platform that controls and oversees so much of what we consume and how we now operate, it’s scary to know how Twitter chooses to display people with different skin tones. The round of jokes and user-run experiments has only revived concerns about how “learning” computer algorithms fuel real-world biases like racism and sexism.


Report Shows That Immigration Narratives On TV Are Latinx-Focused And Over-Emphasize Crime

Entertainment


The media advocacy group Define American recently released a study that focused on the way immigrant characters are depicted on television. The second annual study is titled “Change the Narrative, Change the World.”

Although the study reports progress in some areas of onscreen representation, there is still a long way to go.

For example, the study reported that half of the immigrant characters depicted on television are Latino, which is consistent with reality. What is not consistent with reality, however, is how heavily crime still figures into those characters’ storylines.

The study shows that 22% of immigrant characters on television have crime storylines as part of their narratives. These storylines further peddle the false narrative that immigrants are criminals when, in reality, they’re just everyday people trying to live their best lives. Still, this figure is an improvement over the previous year, when crime themes made up 34% of immigrants’ stories on TV.

These numbers are further proof that the media feels stories of Latino immigration have to be about sadness and hardship in order to be worth watching.

According to Define American’s website, their organization believes that “powerful storytelling is the catalyst that can reshape our country’s immigration narrative and generate significant cultural change.”

“We wanted to determine if seeing the specific immigration storylines influenced [viewers’] attitudes, behavior, or knowledge in the real world,” said Sarah Lowe, the associate director of research and impact at Define American to Variety. “And we were reassured and inspired to see the impact it had.” 

Define American’s founder, Jose Antonio Vargas, is relatively optimistic about the study’s outcomes, saying that the report has “some promising findings” and that the numbers “provide [him] with hope.” He added that there are still “many areas in which immigrant representation can improve.”


Namely, Vargas was disappointed in television’s failure to take an intersectional approach to immigration in regards to undocumented Black immigrants. 

“Black undocumented immigrants are detained and deported at higher rates than other ethnic groups,” Vargas told Variety. “But their stories are largely left off-screen and left out of the larger narrative around immigration.” 

“Change the Narrative, Change the World” also showed that Asian and Pacific Islander immigrants are under-represented on television relative to reality. Also worth noting: male immigrants are over-represented compared to reality, while immigrants with disabilities are under-represented as well.

The study also showed that when viewers are exposed to TV storylines that humanize immigrants, they’re more likely to take action on immigration issues themselves. 

The effect that fictional entertainment narratives have on viewers further proves that representation does, indeed, matter. What we watch as entertainment changes the way we think about other people’s lived experiences. And that, in turn, can change the world.
