Everywhere you look, high tech is in somebody’s bullseye. Take Apple. On the inside, top investors are worried about its products’ effects on children. On the outside, liberal activists are grousing about the offshore billions it can now bring home under GOP tax reform. Even the usually anti-regulation conservatives at the National Review are asking why Big Tech isn’t regulated like Big Oil or Big Tobacco.
These examples, all recently in the news, confirm the trend but only skim the surface. New national polling finds public opinion shifting from a warm embrace to growing skepticism. It’s not just that so-called fake news on social media played a role in recent U.S. elections and led to congressional inquiries. And it’s not just the calls for federal anti-trust action aimed at the most popular information curators, Facebook and Google.
Beyond the dots that attest to a backlash lies a harder task: understanding what’s really going on beneath the screens and in the minds of Facebook’s 2 billion users and Google-owned YouTube’s 1.5 billion users. A new phrase describes this sphere of human activity, the technology behind it and its effects. What’s being called the attention economy is coming under new scrutiny because it is seen as undermining the journalism profession, as well as trust in public institutions and democracy.
“We come here in friendship,” said Anthony Marx, president of the New York Public Library and co-chair of the Knight Commission on Trust, Media and Democracy, speaking at Stanford University this week. The panel was created last fall to try to fix the attention economy’s biggest problems, including the way Google search and Facebook have demoted the visibility of independent media under the guise of fighting fake news.
Marx’s comments elicited nervous laughter, because he had just presided over a panel that laid out in vivid and disturbing detail how Silicon Valley’s best minds have created brain-tracking, brain-mimicking and brain-triggering computational formulas. These algorithms have turned billions of digital device users into information addicts, and when put at the service of supercomputers targeting online advertising or content placement, they have fractured society as never before.
“We’re at Stanford, the belly of the beast. This is where it all started. Which is why we’re here, why we need to understand what you all are thinking,” Marx said, speaking to the panelists and an audience filled with tech executives and commission members drawn from some of the monopolistic companies under attack.
“Let’s be very clear about two things,” he said, proceeding diplomatically. “One, you all have created this amazing tool. If you had said to me as a child that I would have something in my pocket that could connect me to all the information in the world, potentially, I wouldn’t have believed you. That’s astonishing. So thank you. That’s the good news.”
“The less good news is this is not taking us to a good place,” he continued. “It has not taken us to a good place. And I’m not supposed to say any of this, but your industry, the industry that you all are a part of, I think the world has changed its view of the industry in the last year, in a way that I have never seen the likes of before. Meaning what was ‘thank you’ is now ‘uh-oh.’ Bad things are coming of this, and that puts us all on the spot, which is why we are here. We want to understand what is possible, what can be better.”
The Attention Economy
Those outside Silicon Valley’s innermost circles cannot access or evaluate the algorithms powering Facebook’s news feeds and advertising-driven content placement, or YouTube’s engine that recommends videos individual users might like. However, panelists at the Commission’s Stanford University session were exceptionally articulate and forthcoming about the nature and goals of those algorithms, better described as brain-mimicking artificial intelligence.
One of the most outspoken explainers and critics was Tristan Harris, a former “design ethicist” at Google (which acquired his company in 2011) who now runs a non-profit, Time Well Spent, that seeks to improve Big Tech’s impact on society. What this under-40 ex-CEO said was as stunning as the seemingly blasé reaction from his industry colleagues.
Harris said the attention economy, or the media on everyone’s smartphones and computers, is not just the endless marketing we all see. There’s a deeper reason why many established news sources can be supplanted by shadowy propaganda on major platforms, why facts can be outrun by opinions and lies, and why narrower tribal loyalties can usurp democratic institutions.
Harris pointed the finger of blame at the heart and circulatory system of Silicon Valley itself. Its artificial intelligence algorithms are designed to trigger brain responses and to be addictive, he said. They power a business model loosely called online advertising, but that model is really a superstructure that cashes in by targeting and provoking shared interests via curated content, and in doing so it separates society into disconnected spheres.
“There’s the public rhetoric about what [information] technology ought to do and what the positive intentions are. But then there’s the reality, if you actually go inside the companies and hear the engineers and designers talk about their daily objectives: everything comes down to what’s going to hook people into staying on the screen,” Harris said. “No matter what the positive intentions are, 2 billion people do wake up in the morning right now and they have one of these things in their pockets, and they do use one of a handful of services. As my colleague Roger McNamee, who is [Facebook founder Mark] Zuckerberg’s mentor, likes to say, there’s 2 billion people who use Facebook; that’s more than the number of followers of Christianity. 1.5 billion people use YouTube; that’s about the number of followers of Islam. These products have more influence over our daily thoughts than many religions, and certainly more than any government.”
When John Lennon said in 1966 that the Beatles were bigger than Jesus, he created an international uproar. The band received death threats and had to stop touring. But when Harris said Facebook was more popular than Jesus and YouTube served more people than entire continents, those breathtaking assertions barely raised eyebrows. That scale underscores why high tech’s biggest successes are facing a reckoning, from congressional inquiries into Russian meddling in the 2016 presidential election, to new calls to break up tech monopolies under anti-trust laws, to solution-seeking forums like the Knight Commission.
Harris studied how brains make choices at Stanford and went on to create technologies and a company that tapped the “invisible influences that hijack human thinking,” as his bio puts it. But that technology now poses an existential threat to humanity, he said, because it is growing beyond the ability of any one company, or even a handful of attention economy monopolies, to control.
“We’re a species that…can study our own ability to be manipulated,” he said. “We have to talk about the advertising-based business model, which, paired with artificial intelligence, poses an existential threat. We have to get really serious about this. If you think about where are the most powerful AIs in the world located right now? Arguably, at two companies: Google and Facebook. The most powerful AIs in the world.
“Instead of pointing them at a challenge like climate change, and saying, let’s solve that, or pointing them at drug discovery for cancer, and saying, let’s solve that, we have pointed the most powerful AI supercomputers in the world at your brain. And we basically said, play chess against this brain and figure out what will engage it the best. And so every time we open up a news feed, we’re playing chess against a supercomputer that’s designed to see 50 million steps ahead on the chessboard of your mind, and figure out what will perfectly engage you.”
The results are not always pretty, he said, a remark others echoed.
“When you think about the global consequences of this: the fact that this supercomputer is doing this in languages, and in countries, the engineers and the companies don’t even speak [or live in], which is how you get the Rohingya genocide in Burma. And how you get some fake news pod creating certain deaths in India, in South Sudan. The engineers can’t put this thing back in the bag. We have created exponential impact without exponential sensitivity.”
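None of this engagement machinery is public, so it cannot be quoted directly. But the loop Harris describes can be sketched in a few lines of Python. Everything below is hypothetical: a minimal illustration of the pattern he criticizes (score candidate items by predicted engagement, show the winners, reinforce whatever held attention), not the actual code of any platform.

```python
# Hypothetical sketch of an engagement-optimized feed ranker.
# This is NOT Facebook's or YouTube's code (which is not public);
# it only illustrates the loop Harris describes: predict what will
# keep each user on the screen, show it, and learn from the result.

from dataclasses import dataclass, field

@dataclass
class Item:
    item_id: str
    topic: str

@dataclass
class UserModel:
    # Learned per-user weights: how engaging each topic has proven to be.
    topic_weight: dict = field(default_factory=dict)

    def predicted_engagement(self, item: Item) -> float:
        # Higher score means the model predicts a longer stay on the screen.
        return self.topic_weight.get(item.topic, 0.1)

    def update(self, item: Item, seconds_watched: float) -> None:
        # Reinforce whatever held attention, regardless of its accuracy
        # or social value. The objective is time on screen, nothing else.
        w = self.topic_weight.get(item.topic, 0.1)
        self.topic_weight[item.topic] = 0.9 * w + 0.1 * seconds_watched

def rank_feed(user: UserModel, candidates: list[Item], k: int = 3) -> list[Item]:
    # Show the k items the model predicts will best engage this user.
    return sorted(candidates, key=user.predicted_engagement, reverse=True)[:k]
```

Nothing in a loop like this rewards accuracy, civic value or the user’s well-being; outrage that holds attention is reinforced exactly as if it were quality journalism. That, in compressed form, is the mechanism behind Harris’s warning.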
It’s even worse than that, explained panelist Gina Bianchini, the founder and CEO of Mighty Networks, which specializes in niche social networks. She repeatedly said there is a race in Silicon Valley to break Facebook and Google’s information monopolies, a race that involves algorithms teaching (in effect, programming) themselves to execute a range of tasks, including bringing content to people who will become hubs in their own information networks. (Silicon Valley calls this “machine learning.”)
While she lauded the virtues of more competition, she and others described artificial intelligence as nearing a threshold where getting the biggest players into a room to agree on solutions will no longer be possible. That’s because artificial intelligence is becoming so decentralized that the ethical problems highlighted by Harris will be beyond anyone’s ability to rein in; Silicon Valley and Big Tech are not a monolithic entity.
“There’s actually a scarier thing happening, which is today you can talk to two companies. Somebody shows up from Google. Somebody shows up from Facebook and wants to have the conversation, because they have the monopoly over attention today and over the advertising revenue,” she said. “The natural progression of software and where technology goes is it bends toward decentralization. It bends toward distributed technologies. Who do you talk to at that point?”
Bianchini gave an example that underscored why traditional anti-trust laws and government regulation are hopelessly outdated and ill-equipped to deal with the attention economy’s dark side. She cited Napster, which let music lovers share audio files until the recording industry sued and shut it down.
“We were able to shut down Napster, and the next thing that happened was BitTorrent [software], where there was nothing to shut down. That is where the world is going.”
What Would George Orwell Do?
These criticisms and explanations were not entirely rejected by their targets in the room. But as is often the case in high-stakes hearings, core issues can be sidetracked by expanding the focus rather than sticking with key questions, such as whether the attention economy’s biggest players would change what powers their addictive algorithms and micro-targeted advertising.
Take commission member Richard Gingras, vice president of news for Google. Before asking questions, he said there are two historic developments to keep in mind. First, the internet put the “means of communication, the printing press, in everyone’s hands.” That has history-making benefits and challenges.
“We have diversity of information like we have never seen before. Some of that diversity is troubling. That’s par for the course, what freedom of the press is all about. Such that I’ve often posed the question of, is the true challenge to democracy the fact that we have unfettered free expression… that’s one component,” said Gingras.
“The second is, I think the points about [ad and content] targeting are fair in the sense that we do have (and by ‘we,’ it’s permeated throughout the ecosystem) companies with the ability to target, or leverage targeting, beyond the dreams of any direct marketer or anyone in the history of politics. Here, again, it’s not that the behaviors are necessarily different, it’s just that they are more efficient,” he said. “These seem to me to be the key changes.”
Big technological changes always have intended and unintended consequences, Gingras said. History is filled with examples of professions that had to adapt, he added, and that’s what the media and political culture need to do now.
“It’s not sufficient to simply talk about this through the lens of technology,” Gingras said. “It is incumbent on the rest of society and its institutions to think about and address it as well. When you look at [an] environment where people are consuming information in different ways, forming opinions in different ways, this seems to me to suggest that we should rethink the mechanisms of journalism.”
“How we interact with our audiences,” he continued. “How we formulate content. The content models we use. Even the business models we use to get there… How do these other institutions have to change? How do our basic cultural approaches to transparency and trust need to change to help folks understand why they are seeing what they are seeing?”
Gingras does not make this comment in a vacuum. He co-founded an initiative called the Trust Project, based at Santa Clara University’s journalism school, which urges news organizations to better label their online content and revise their websites so that search algorithms can elevate more authoritative content. That will help media stand out in the attention economy. Of course, it also helps Google do better searches; because Google search, unlike Facebook, directs users away from its own website, better results will fortify its search monopoly.
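The Trust Project’s full indicator specification isn’t spelled out here, but the underlying mechanism of this labeling is machine-readable metadata: structured data embedded in article pages so crawlers can see who produced a story and under what editorial standards. A minimal, hypothetical sketch in Python follows, using standard schema.org vocabulary with invented example values:

```python
# Hypothetical sketch of the kind of machine-readable labeling the
# Trust Project encourages. The property names below are standard
# schema.org vocabulary; the specific values are invented examples.

import json

def news_article_jsonld(headline: str, author: str, published: str,
                        publisher: str, ethics_url: str) -> str:
    """Build a JSON-LD block a news site could embed in a page so
    search crawlers can see who produced the story and under what
    editorial standards."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "NewsArticle",
        "headline": headline,
        "author": {"@type": "Person", "name": author},
        "datePublished": published,
        "publisher": {
            "@type": "Organization",
            "name": publisher,
            # Links the outlet's ethics/corrections policy, the sort of
            # trust signal that lets an algorithm favor accountable sources.
            "publishingPrinciples": ethics_url,
        },
    }, indent=2)

print(news_article_jsonld(
    "Example headline", "Jane Reporter", "2018-01-22",
    "Example Daily News", "https://example.com/ethics-policy"))
```

A site would embed the resulting JSON in a script tag of type “application/ld+json.” The point is simply that trust signals legible to an algorithm have to be structured data, not prose.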
As the Knight Commission’s public sessions came to a close, the very issue Silicon Valley opposes most, second only to revealing its secret computational formulas, was raised. What would be the result of government regulation, including the possibility of anti-trust actions breaking up the attention economy monopolies?
That question prompted one of the fiercest exchanges, and while unresolved, it suggests that Facebook and Google are going to have to become more transparent or face even greater backlash.
Gina Bianchini: “I have very low confidence that the solutions are going to come from regulation. The solutions are going to come from the fact that we are building a grassroots mass motivation to move around centralization, which is going to be a whole different conversation.”
Richard Gingras: “I find this thread a little bit puzzling. If I had heard the discussion about possible solutions to the problem, absent any knowledge of [the] problem, I would have thought that we were talking about the fact that we actually have a problem with monolithic information in a society which is over-guided and controlled in one direction. Right? But of course that’s actually not the problem we’re facing. In fact, the problem we’re facing is one that is completely the opposite. We have tremendous diversity of points of view, silos of thought, reinforced silos of thought, from one end of the spectrum to the other and around and back again. So when I look at that problem, I wonder what problem are we really trying to solve, and how? I’m failing to see the dots connected on this.”
Gina Bianchini: “From a monolithic perspective, who’s controlling that algorithm?”
Richard Gingras: “But the algorithm…”
Gina Bianchini: “It’s two companies [Facebook and Google].”
Richard Gingras: “This putative control isn’t controlling people’s points of view. If anything, it’s commending various points of view beyond their own level of comfort.”
Ethan Zuckerman, director of the Center for Civic Media at MIT and a commission consultant: “It’s impossible to know that from the outside world. It’s literally impossible.”
Richard Gingras: “Outside world. It’s not hard looking at our world today to say we have a society that’s less unified than ever before.”
Ethan Zuckerman: “And you can ask a question… about whether this information environment, around Facebook and Google, took a very extreme part of that and made it much, much more powerful. But we have a very, very hard time auditing that… All I am trying to say is that one thing short of regulation, and actually breaking up these entities, would be paths to a great deal more transparency, so we can ask these hard questions about how these platforms are shaping the information and knowledge that we are getting.”
The Knight Commission will continue meeting through 2018 before issuing a report and recommendations next fall. But in a few brief hours at Stanford University’s alumni center, it laid out the issues, challenges and stakes of an attention economy in which psychological manipulation and micro-targeting are the tools of the top information curators.
Notably, late Friday, Mark Zuckerberg announced that Facebook would soon ask its 2 billion users to rate the trustworthiness of the media in their news feeds. That may help identify more and less trustworthy news sources according to each user’s values. But it won’t get at the “invisible influences that hijack human thinking,” as Tristan Harris put it. Nor will it address the societal segmentation accelerated by online ad technology that Gingras acknowledged. Nor would it add the transparency to the algorithms powering these information monopolies that MIT’s Zuckerman called for.
Indeed, as New York Times tech columnist Farhad Manjoo noted this week in a piece pondering whether Apple would save the day by adding elegant product features to blunt the excesses of the digital ad business, “I’m skeptical they’ll [the leaders of the attention economy] be able to suppress their economic interests.”