
Beyond Digital Ethics

Extremist Suggestions

Earlier this week, Zeynep Tufekci appeared on Ezra Klein’s podcast. If you don’t know Tufekci, you should: she’s one of my favorite academic thinkers on the intersection of technology and society.

During the interview, Tufekci discussed her investigation of YouTube’s autoplay recommendation algorithm. She noticed that YouTube tends to push users toward increasingly extreme content.

If you start with a mainstream conservative video, for example, and let YouTube’s autoplay feature keep loading your next video, it doesn’t take long until you’re watching white supremacists.

Similarly, if you start with a mainstream liberal video, it doesn’t take long until you’re mired in a swamp of wild government and health conspiracies.

Tufekci is understandably concerned about this state of affairs. But what’s the solution? She offers a suggestion that has become increasingly popular in recent years:

“We owe it to ourselves to [ask], how do we design our systems so they help us be our better selves, [rather] than constantly tempting us with things that, if we sat down and were asked about, would probably say ‘that’s not what we want.’”

This represents a standard response from the growing digital ethics movement, which believes that if we better train engineers about the ethical impact of their technology design choices, we can usher in an age in which our relationship with these tools is more humane and positive.

A Pragmatic Alternative

I agree that digital ethics is an important area of inquiry; perhaps one of the more exciting topics within modern philosophical thought.

But I don’t share the movement’s optimism that more awareness will influence the operation of major attention economy conglomerates such as YouTube. The algorithm that drives this site’s autoplay toward extremes does so not because it’s evil, but because it was tasked to optimize user engagement, which in turn optimizes revenue — the primary objective of a publicly traded corporation.

It’s hard to imagine companies of this size voluntarily reducing revenue in response to a new brand of ethics. It’s unclear, given their fiduciary responsibility to their shareholders, if they’re even allowed to do so.

By contrast, I’ve long supported a focus on culture over corporations. Instead of quixotically trying to convince some of the most valuable business enterprises in the history of the world to act against their own interests, we should convince individuals to adopt a much more skeptical and minimalist approach to the digital junk these companies peddle.

We don’t need to convince YouTube to artificially constrain the effectiveness of its autoplay algorithm; we should instead convince users of the life-draining inanity of idly browsing YouTube.

I’m not alone in holding this position.

Consider Tristan Harris, who, to quote his website, spent three years “as a Google Design Ethicist developing a framework for how technology should ‘ethically’ steer the thoughts and actions of billions of people from screens.”

After realizing that Google actually had very little interest in making their technology more ethical, he quit to start a non-profit that eventually became the Center for Humane Technology.

I’m both a fan and a close observer of Harris, so I’ve been intrigued to watch how his focus has shifted increasingly away from promoting better digital ethics and toward other ways of defending against the worst excesses of the attention economy.

The current website for his center, for example, now emphasizes political pressure, cultural change (the approach I promote), and smartphone manufacturers, who don’t directly profit from exploiting user attention and might therefore be persuaded to introduce more bulwarks against cognitive incursions.

I appreciate this pragmatism and think it hints at a better technological future. We need to harness the discomfort we increasingly feel toward the current crop of tech giants and redirect it toward an honest examination of our own behavior.

29 thoughts on “Beyond Digital Ethics”

  1. I don’t know…people are not very good at resisting addictive substances (like YouTube surfing). And we regulate addictive substances (high taxes on alcohol, tobacco, controls on distribution of addictive drugs, etc.). I wonder if there isn’t some combination of government action and advocacy for cultural change (what you’re arguing for) that might be effective. (I realize that government action would require a sea change in public attitudes towards what are often seen as harmless behaviors.)

    • I could imagine an age restriction on certain types of online interactive media gaining traction, though this presents ontological difficulties (e.g., what exactly makes an app “social media”?).

      The lesson I take away from the fight against youth tobacco use, for example, is that the legal age restrictions helped, but what ended up much more effective were efforts to change culture (I vividly remember the Truth campaign from my childhood).

  2. I agree that people are not very good at resisting addiction, but why? Many have broken addictions once they saw that the substance was bad for them. The trouble with technology is that it’s not consumed the same way alcohol and drugs are consumed. Although alcoholism is billed as a disease, it’s also billed as a disease a person can get control of. If only the public sought such self-awareness, the addiction to technologies could be broken. I think it’s a lack of self-awareness, and a willingness to embrace whatever comes along (or is sold to us), that is at the root of a lot of societal problems. Corporations are not interested in people taking control of their own lives, but taking control of one’s own life is exactly what each of us needs to do.

  3. Agreed. I like this. I think it’s right. I tend to believe the change you mention will come about organically. In this case “organically” is what you suggest needs to happen. Us helping each other. That’s what we’re here for. So, what can I do for you?

  4. Great article, Cal.

    It seems straightforwardly true that the key is in changing culture and policy. To continue with the smoking example, can you imagine tobacco companies in the ’60s voluntarily engineering their products to be less addictive?

    The change in attitude towards smoking has come about only through external forces, mostly through the government.

    One of the challenges that makes big tech so unique is the death grip it appears to have on everyone, including politics. Social media is such a powerful political tool that governments have as much to gain from its attention-stealing qualities as the companies themselves. And with no ‘obvious’ health concerns (i.e., no lung cancer), governments have no incentive to tackle the issue.

    Big tech is a different beast. I think ultimately the answer is going to lie in people power. The silver lining is that we may be seeing the beginnings of this now (maybe?).

  5. Cal, you introduced me to Tristan Harris by linking to his famous article in one of your blog posts. I have been following him ever since. His podcast with Sam Harris was really eye-opening. You might also want to listen to his latest appearance on the Neurohacker Collective Insight podcast.

  6. Cal,

    I commented several days ago and do not see my post listed. Therefore, if you have an objection to what I wrote, please contact me so we can discuss it and avoid a misunderstanding. Again, I just commented that your example of conservative and liberal was incorrect. If you disagree, let’s have a constructive discussion.

    Sincerely,

    Luis

  7. People vs. Profits is a common refrain… and companies rarely choose people no matter how many free lunches or ping pong tables or unlimited vacation days they provide to prove their amazing culture.

    Common Sense (where I work) is and has been working on this front for a while now as it relates to school-aged kids and their families. There is currently a wave of apps and other products or services that make avoiding social media and other addictive tech part of their interaction.

    I’d welcome a short conversation on the topic if you have a few minutes… I am working on a project that will pull some of these founders into roundtable working sessions to see how different companies are approaching these same underlying problems in unique ways. Something tells me you’d have some great insights and comments to share with a group like that!

  8. Hi Cal,
    I just came across this article in Nautilus about a new digital magazine called “The Disconnect”, “containing commentary, fiction, and poetry”, which can only be read by disconnecting from the internet!
    I find this idea interesting. It tallies with your thoughts on how we should use technology when it benefits us and not simply become pawns in the attention-grabbing wars being staged online by tech companies.

    https://nautil.us/issue/63/horizons/the-online-magazine-you-cant-read-online

  9. It is important to realise that there is no such thing as a simple search on the Internet. In reality, all data are tracked, analysed, retrieved and reused.

  10. I don’t quite understand. Tristan Harris himself is talking about four distinct ways to have “humane technology”: political change, better consumer awareness, better design for devices, and engaging the employees who create these software products. Your claim is “fix yourself and it’ll stop,” which seems a rather limited solution to the sheer scale of this problem.

  11. This is a minor quibble but given the increased availability of large scale private financing, one that seems important to get right. You write, “The algorithm that drives this site’s autoplay toward extremes does so not because it’s evil, but because it was tasked to optimize user engagement, which in turn optimizes revenue — the primary objective of a publicly traded corporation.” I would argue that the last clause should read, “the primary objective of a for profit corporation.” I also think the phrase “optimizes revenue” is opaque but agree it is fundamentally accurate.

  12. Focusing on culturally powerful individuals… I think this is an excellent theory. One example: apparently Ian Fleming’s books (about James Bond) didn’t become truly popular in the United States until President John F. Kennedy said “From Russia with Love” was one of his favorite books.

