
Time to Make Search Algorithms Illegal


The Thrive Global Community welcomes voices from many spheres on our open platform. We publish pieces as written by outside contributors with a wide range of opinions, which don’t necessarily reflect our own. Community stories are not commissioned by our editorial team and must meet our guidelines prior to being published.

After being victimized by an algorithm and writing about it earlier this year, I have become more aware of these invisible mischief-makers and the negative impact some of them are having in the world. As I have become more informed, I have come to advocate that we label some of these algorithms criminal. Here's why.

One of the most serious threats to modern civilization is the growing divide between various groups on matters such as abortion, gun control, racism, politics, healthcare, and the more recent additions to that list – global warming and the COVID-19 pandemic.

Some of the biggest contributors to this growing divide are not obvious. The culprits are the algorithms that influence our Internet searches, steering us to sites that will further radicalize our personal biases.

Yes, biases. We all have them. It cannot be helped. Each of us has been influenced by our life experiences: where we grew up, who our parents were, our teachers, friends, and schoolmates. Even best friends have slightly different biases, or points of view ("POVs").

Back to algorithms. Some historical perspective might be helpful in seeing the insidious nature of these devilish artificial and invisible human constructs.

Before technology became ubiquitous in our lives, anyone could seek out groups to join, periodicals to read, and people to befriend who shared their biases, or POV. This might have reinforced their POV, but it was the result of their own initiative: they consciously sought out confirmation of their biases. Their outreach was willful and deliberate.

With the advent of search engines, and subsequently search algorithms, our free will is being bypassed. We are unaware of how we are being steered toward information that exaggerates our biases and inflames our POVs. Since we see only POVs that agree with our own, we tend to become more extreme in our biases, to the point where we might even become radicalized. Our POVs become truth, and we confuse opinion with fact, as I wrote in my October newsletter [link to "Finding Truth" editorial]. This could be called "brainwashing," a term coined to describe how the Chinese subjected American POWs to communist propaganda during the Korean War.

Most people assume that if two people do a Google search on the same subject, each will get the same list of webpages, photos, articles, etc. Not so! You can demonstrate this by asking someone with a different POV from yours (a different political party, profession, country of residence, or race) to run the same search, then comparing the results each of you gets.
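To make the mechanism concrete, here is a deliberately simplified sketch of personalized ranking. This is not any real search engine's code; the function, the sample results, and the "click history" profiles are all invented for illustration. The only point it demonstrates is that once past engagement feeds back into ranking, two users typing the identical query see different orderings:

```python
# Toy personalized ranker (illustrative only, not a real engine's logic):
# results whose topic matches the user's past clicks are ranked first.

def rank(results, clicked_topics):
    """Order results so topics the user previously engaged with come first.

    sorted() is stable; the key is False (sorts first) for a topic match.
    """
    return sorted(results, key=lambda r: r["topic"] not in clicked_topics)

# The same two results are candidates for the same query.
RESULTS = [
    {"title": "Study questions policy X", "topic": "skeptic"},
    {"title": "Why policy X works",       "topic": "supporter"},
]

alice_history = {"skeptic"}    # Alice previously clicked skeptical articles
bob_history = {"supporter"}    # Bob previously clicked supportive articles

# Identical query, different top result for each user:
# Alice sees the skeptical article first; Bob sees the supportive one first.
print([r["title"] for r in rank(RESULTS, alice_history)])
print([r["title"] for r in rank(RESULTS, bob_history)])
```

Note the feedback loop this creates: whatever a user clicks is more likely to be shown again, which invites more clicks on the same kind of content. Real systems weigh hundreds of signals, but the reinforcement dynamic is the same.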

Why am I thinking of these search algorithms as criminal?

I suppose there is no existing law these algorithms can be accused of breaking, so they may not be doing anything currently illegal. But perhaps the crime they are committing simply hasn't been declared illegal yet. Perhaps it is time for a law establishing that influencing public opinion in this anonymous and insidious way is wrong and, therefore, a criminal act.

In the early days of aviation, there were no laws requiring pilots to file flight plans. A law had to be enacted so that planes flying under instrument flight rules (IFR) could avoid colliding with one another.

Laws have been enacted to protect the public as society has evolved. Now is the time to make these search algorithms illegal.

Set in motion by these algorithms, widening divides are fueling rioting in our streets, where people on both sides are harmed and property is damaged; all of which is criminal activity. Outside the U.S., these same "persuasive technologies" (the industry's own term for how these algorithms change how we think) are used to radicalize young people who rely on social media and to recruit them into terrorist groups like ISIS and Al Qaeda.

Then there is the psychological stress and anguish among huge populations around the world, including divides within families, where each side escalates its artificially manipulated POV and tries to push it on the other. Many family members won't even speak to one another over some of these radicalized POVs.

These divides are eroding our civilization.

What is driving companies like Google, Facebook, and Apple to create these algorithms? Are they bad people trying to drive a wedge between our citizens? I doubt it. I believe their motive is to sell information to both sides, so the intent is likely to build markets; but the effect is the erosion of our civilization. Giving the engineers who create these nasty puppeteers the benefit of the doubt, let's call the consequences "unintended," but the negative effect they are having on the world can still be defined as a crime.

Algorithms are generally proprietary and closely guarded by their owners and developers. Some algorithms are so complex that even their creators don't know exactly how they work. This is what is known inside Silicon Valley as the "black box problem," according to Bahar Gholipour.

Shoshana Zuboff1 has articulately named the breed of capitalism coming out of the Valley: "surveillance capitalism." These technology giants are in the business of selling our personal data. As the publisher states on her book's front jacket flap, "The stakes could not be higher: a global architecture of behavior modification threatens human nature in the twenty-first century just as industrial capitalism disfigured the natural world in the twentieth."

While drafting this article, I learned of a new Netflix film called "The Social Dilemma." It offers a chilling perspective on the drive to monetize our personal information with persuasive technology. As one person in the film puts it, "we are being programmed and we don't even know it."

Zuboff is featured in the film and makes a case for the “inevitable destructive consequences” of this malicious technology and calls for it to be outlawed.

So let us start standing up for ourselves and outlaw these gap-spreaders, so we can stop the massive widening of the divisions that have infected our world. Let's do what we must, sooner rather than later, to see laws enacted that prevent this insidious manipulation of our opinions.

1 Shoshana Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power, PublicAffairs, New York, 2019.
