Algorithm

Flanked by two monitors, software engineering student Paul Ho’s eyes reflect the strings of characters that make up a set of instructions. These instructions do one thing: print the phrase “Hello, World!”, the first program most computer scientists write when they begin learning a programming language:
>>> print("Hello, World!")
Hello, World!
This language is Python, but it is one of many programming languages exerting a growing influence over every aspect of your life.

If you’ve used Google, you’ve relied on languages like these and on Google’s PageRank algorithm. PageRank scans the web for links to a page, ranking sites by the number and quality of the links pointing back to them. Using this data, along with roughly 200 other signals, Google serves up the most relevant search results in record time.
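The core idea behind PageRank can be sketched in a few lines of Python. This is a simplified illustration, not Google’s actual code; the link graph below is a made-up example, and real PageRank handles far larger graphs with many refinements.

```python
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to.

    Each page's score is repeatedly redistributed along its outgoing
    links, so pages that receive links from well-linked pages rank higher.
    """
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page keeps a small baseline score (the "damping" term).
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            if not outgoing:
                continue
            # Split this page's current score among the pages it links to.
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:
                new_rank[target] += share
        rank = new_rank
    return rank

# Tiny made-up link graph: both "about" and "blog" link to "home".
links = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}
ranks = pagerank(links)
# "home" ends up with the highest score, since every other page links to it.
```

The key insight, and the reason PageRank was hard to game, is that a link only carries weight if the page casting it has earned weight itself.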

Generally, algorithms are made for data processing, computation, and reasoning problems. An algorithm can be understood simply as a recipe: a sequence of understandable instructions for a specified purpose. With the maelstrom of information found on the web, the problem has become getting us the content we want to see. Everything we interact with online informs an algorithm, which refines its instructions and feeds us more content that we’ll hopefully interact with.
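The recipe framing becomes concrete with a small example: an ordered list of steps that solves one stated problem, here finding the largest of some numbers.

```python
def largest(numbers):
    # Step 1: assume the first number is the largest so far.
    biggest = numbers[0]
    # Step 2: compare every remaining number against the current best.
    for n in numbers[1:]:
        # Step 3: whenever a bigger number appears, remember it instead.
        if n > biggest:
            biggest = n
    # Step 4: report the answer.
    return biggest

print(largest([4, 17, 9, 2]))  # 17
```

Every algorithm, from this four-step recipe to the ones curating a news feed, is the same in kind: precise steps, applied to input, producing a result.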

For the people creating the algorithms, this is the goal. “I’m working on an app that helps people and hopefully makes their lives easier. I’ve studied this because I love computers, but you can do a lot of good with it as well,” said Paul, who is building an app that uses algorithms to track road conditions and make our travels easier.

To sort through the constant storm of information, a site like Facebook tracks the activity from your account and your friends’ accounts, amassing this data and using it to curate your feed. The more information it receives from users, the more specialized and efficient the algorithm becomes.
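A toy sketch shows how interaction history can drive a feed ranking. This is an illustration of the principle, not Facebook’s real system; the authors, posts, and interaction log below are invented for the example.

```python
from collections import Counter

def rank_feed(posts, interactions):
    """posts: list of (author, text) pairs.
    interactions: authors the user has engaged with (clicks, likes, comments).

    Posts from the sources the user engages with most are shown first.
    """
    engagement = Counter(interactions)
    # Score each post by how often the user engaged with its author.
    scored = sorted(posts, key=lambda p: engagement[p[0]], reverse=True)
    return [text for _, text in scored]

posts = [
    ("alice", "vacation photos"),
    ("bob", "news article"),
    ("carol", "recipe"),
]
interactions = ["bob", "bob", "alice"]  # two engagements with bob, one with alice
print(rank_feed(posts, interactions))
# bob's post ranks first, then alice's, then carol's
```

The feedback loop is visible even at this scale: whoever you engage with today rises in tomorrow’s feed, which makes further engagement with them more likely.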

As we’re fed more and more content tailored to us, we develop an information desert, devoid of dissenting opinions or viewpoints. We create this echo chamber ourselves every time we hand the algorithm more information; it simply uses that input to produce the best response it can. Still, many increasingly blame sites like Facebook and Twitter for pushing an agenda.

“That idea (that Facebook is purposefully influencing the agenda) is ridiculous; they don’t care or have the time to micro-manage what we’re seeing on our feeds,” said Paul.
“People have to search out other information on their own. We rely on computers to do everything for us and that will eventually come with problems. This is one of those problems we have to deal with as a society,” explained Paul.

A study conducted by the Pew Research Center in 2014 found political polarization at an all-time high in the United States. The share of Americans holding consistent political views jumped from 10 to 21 percent, while animosity toward the opposite party also rose. The information deserts Pew calls “ideological silos” are on the rise as well: the study found that the most polarized Americans are also the most apt to stay solely within their own political community.

Parallel to our political communities are our online communities. Catering to this divisive environment, websites often let you hide or downvote content, most likely signaling the algorithm to hide similar content from you in the future. We are in part creating the information bubble ourselves, then complaining when the algorithms do their job too well.
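That hide-and-downvote feedback loop can be sketched as a running preference score per topic. This is a hypothetical illustration of the mechanism described above, not any site’s actual implementation; the topics and weight values are invented.

```python
def update_weights(weights, topic, action):
    """Adjust a per-topic preference score after a user action.

    Negative actions (downvote, hide) lower a topic's weight, so related
    posts sink in the feed; positive actions raise it.
    """
    delta = {"like": 1.0, "downvote": -1.0, "hide": -2.0}[action]
    weights[topic] = weights.get(topic, 0.0) + delta
    return weights

weights = {}
update_weights(weights, "politics", "hide")
update_weights(weights, "cooking", "like")
# Posts would then be sorted by these weights: "politics" now ranks
# below "cooking", and each hide pushes it further down.
```

Hide enough of one viewpoint and its weight drops below everything else, which is the echo chamber built one click at a time.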

Increasingly, the information and data being fed to the algorithms are illegitimate. Sharing an article from, or even interacting with, one false website can spread that same propaganda to all your friends and family. This is no fault of the algorithm but of the user, and it further underscores the need for a critical eye.

“(Fake news) is no fault of the programmer,” said Paul. “We need to check out the websites we’re getting our information from and make sure it’s legitimate. We’re taught to have a critical eye and check sources in middle school.”

The recent indictments of 13 Russian nationals provide an example of this bad data. A hostile foreign power exploited human tendencies with fake news and the algorithms that spread it, further worsening political polarization. By spreading and sharing unscrupulous links, we played right into the hands of a nation looking to subvert our democracy.

In the wake of criticism from their users, sites like Facebook and Twitter are trying to quell the wave of “fake news” by changing their algorithms to deal with the problem. At the same time, the developers of these tools are wrestling with the real-world consequences of their decisions. Every interaction we have online influences what we’ll see in the future, and it’s important for users, as well as the people behind the algorithms, to realize that.

Instead of solving traditional computing problems, algorithms are now being used to solve human ones. When an algorithm fails to calculate a basic function, the error can be obvious and quickly fixed. But how can we fix an error that isn’t so easily identified? In large part, the error comes from the user, not the algorithm.

People in the computer science field like Paul hold progressively more power in the digital age, but they aren’t looking to exploit it. On the contrary, they want to make users’ everyday lives easier and more efficient.