“Algorithms are opinions embedded in code”

Tech companies have assumed the power to make decisions for us. That can be convenient when it comes to playlists or navigation. But under the guise of “objectivity”, their algorithms also categorize people and reinforce social inequality.

published on the NUDGED blog, February 28, 2020 >>

When we moved to New York City in the middle of the January 2018 snowstorms, we were hit by an algorithm. After offering us three identical apartments – hideous, overpriced studios in a shabby high-rise that had been on the market for months – our broker shrugged. “That’s all I have for you”, he said offhandedly. “If you had a credit history, well, that would be a different story. At the same price, I could show you plenty of well-kept apartments down there.” He pointed out the window at the brownstones of Hamilton Heights that I had been looking at all along.

Targeted advertisement gone wrong (screenshot)

A credit what? Oh my, I should have seen this coming. Algorithms like the one behind the US credit score already rule many aspects of our everyday lives – yet they are nowhere to be seen. We only notice them when they hit us or when they fail – you all know these situations. Early in my pregnancy, before I had told my friends and family, an ad popped up on my computer: a giant moose standing on a woman’s belly, ironically advertising smarter “ways to lose stubborn belly fat”. Apparently, buying a bathroom scale on Amazon (while being careful enough to google anything pregnancy-related in private mode) had landed me in the “dieting” category.

“That algorithm had a very bad day”, I said to myself and laughed it off, as I often do. Because despite all the data mining, I felt I had little to worry about – a sign of privilege, as I now understand. But what if an algorithm gets a woman’s early pregnancy right and alerts her health insurer or employer? Or if it inadvertently lets her parents know before she can do anything about it – yes, this has happened.

 

Algorithms simplify our lives – or turn them upside down

Tech companies have assumed the power to influence the basic functions of society. Some of their algorithms control what information and political advertising we are fed, enclosing us in a filter bubble that reinforces what we already believe (confirmation bias) – one of the reasons for the growing divide in our societies. Proponents argue that algorithms make our lives easier, at least for most people most of the time: they sort out spam e-mails, suggest our favorite songs on Pandora, or find the safest bike route to work. As Abeba Birhane has pointed out, this reveals a utilitarian perspective in which the impact on minorities can be ignored – they become “collateral damage”.

That impact can be devastating, because the most influential algorithms do not categorize e-mails and songs but us – human beings. They calculate our supposed probability of becoming a risk to a company or a government, and their verdicts can have life-changing consequences: Will you get a loan to go to college? Will your family be investigated over child welfare concerns? And how will police officers be deployed in your neighborhood?
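To make the title of this post concrete, here is a deliberately simplified, hypothetical sketch of such a risk score. Every feature, weight, and threshold below is an assumption invented for illustration – it is not taken from any real credit or scoring model – but it shows how a designer’s choices end up as opinions embedded in code.

```python
# Hypothetical toy "risk score" (all features, weights, and cutoffs are made up).
# The point: each line encodes a human judgment about who counts as "risky".

def risk_score(applicant: dict) -> float:
    """Return a score between 0 and 1; higher means the applicant is treated as riskier."""
    score = 0.5  # baseline chosen by the designer

    # Opinion 1: no credit history counts as high risk,
    # which penalizes newcomers like the narrator in the snowstorm anecdote.
    if applicant.get("years_of_credit_history", 0) == 0:
        score += 0.3

    # Opinion 2: income below an arbitrary cutoff adds risk.
    if applicant.get("annual_income", 0) < 40_000:
        score += 0.1

    # Opinion 3: using the home zip code as a proxy can smuggle in
    # neighborhood-level (and often racial) bias.
    if applicant.get("zip_code") in {"10031"}:  # arbitrary, invented list
        score += 0.1

    return min(score, 1.0)


if __name__ == "__main__":
    newcomer = {"years_of_credit_history": 0, "annual_income": 55_000, "zip_code": "10031"}
    print(risk_score(newcomer))  # 0.9 -> likely rejected, regardless of actual behavior
```

Nothing in such a score is “objective”: whoever picks the features, the weights, and the cutoff decides who gets the well-kept apartment and who gets the shabby high-rise.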

Continue reading on the NUDGED blog >>