Does Facebook Even Know How to Control Facebook?


Later today, executives from Facebook, Google, and Twitter will go before the Senate Intelligence Committee to testify about the ways that Russian operatives used these platforms to plant and spread disinformation, and generally wreak havoc on the 2016 presidential election.

There will be plenty of discussion of the specifics of the troll campaigns—which could have reached 126 million users on Facebook alone—as well as how the companies hunted down the evidence they have so far. Expect Virginia Senator Mark Warner to give them hell.

The companies will argue that the scale of the interference was small and that it was unlikely to have swung the election. “This equals about four-thousandths of one percent (0.004 percent) of content in News Feed, or approximately 1 out of 23,000 pieces of content,” wrote Colin Stretch, Facebook’s general counsel, in prepared testimony to the committee.

They’re also likely to argue that, in any case, there’s not much that they could have done to stop Russian trolls without unduly affecting the public sphere now orbiting their advertising machines.

Based on what we know, these are not unreasonable positions. There was so much chaos and misinformation on these platforms in the run-up to the election, it probably would be hard to disentangle the independent effects of Russian agents from all the scammers, hucksters, self-made pundits, opportunists, conspiracy theorists, activists, citizens, and partisan media businesses.

But that should make us step back and enlarge the question: If this is what electoral campaigns look like online, and especially on the largest social platform, Facebook, then what can be done about that?

Put simply: Is Facebook too big to work?

* * *

When I say Facebook, I don’t mean exclusively the corporate entity, Facebook Inc. While Facebook has built the system, it now contains vastly more actors contributing to its functioning than Facebook employs or directly controls.

No one has ever seen a system like this because none has ever existed. It’s software and interfaces and tools. It’s a social network. It’s an advertising vehicle. It’s the key media distributor. But, like a city or a nation, it’s also human behaviors, habits, norms. Because much of it is built on machine learning, which transforms user behavior into new software adjustments, users and algorithms drive each other by design.

If everyone stops clicking on something, soon the software will stop showing it to anyone. If Facebook pushes video into feeds, people will watch more video. These feedback loops are what Facebook is as an attention-gathering machine. They ensure that the system is always calibrated for you and you and you. In the lethal struggle between Silicon Valley’s most valuable companies, which are also America’s most valuable companies, this ability to hold human attention as well as any invention since television is Facebook’s competitive advantage.
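
To make that loop concrete, here is a toy sketch in Python. Everything in it (the item types, the click rates, the update rule) is invented for illustration; this is not Facebook’s code, just the shape of the mechanism:

    # Toy model of an engagement feedback loop. All names and numbers
    # are hypothetical; the point is the mechanism, not the specifics.
    import random

    scores = {"video": 0.5, "news_link": 0.5, "status_update": 0.5}

    def click_probability(item):
        # Pretend users click videos more often than links or statuses.
        return {"video": 0.30, "news_link": 0.10, "status_update": 0.05}[item]

    for _ in range(5000):
        shown = max(scores, key=scores.get)  # show the top-scored item
        clicked = random.random() < click_probability(shown)
        # Nudge the shown item's score toward the observed behavior.
        scores[shown] += 0.01 * ((1.0 if clicked else 0.0) - scores[shown])

    print(scores)  # "video" settles on top and keeps getting shown

Run it and the system converges on showing whatever gets clicked the most. No one made an editorial decision; behavior tuned the machine, and the machine then concentrated attention.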

Because users are so important in this system, Andreessen Horowitz’s Ben Evans argues that Facebook has only very limited control over it. Facebook is, in his terms, “extremely good at surfing user behavior.” He notes that Facebook’s News Feed algorithm doesn’t make “editorial” rankings, merely mechanistic ones, based on a variety of factors including but not limited to how many people click, like, and share a story.

“This applies at every level of scale—whether it’s creating an entirely new product or tuning some small feature based on a daily or hourly feedback loop—Facebook doesn’t determine what the feedback tells it,” Evans writes. He compares Facebook to a fashion designer: designers don’t create the zeitgeist, but merely try to capture it. Every failed Facebook product—and there are some—is evidence for this proposition.

This is an important point in understanding the beast. Facebook’s control over its platform has significant limitations.

But Evans’s model of the interaction between Facebook Inc. and Facebook is scoped too narrowly. It’s a good way of thinking about Facebook the business, but not a good way of thinking about Facebook the phenomenon as we experience it. Which makes sense: Evans is a business analyst.

However, the product-sales model of Facebook Inc. doesn’t capture everything about Facebook’s broader system.

First, the network effects of Facebook are sweeping and intense. They’ve got users pretty well locked in, because their friends and their data are both inside. Nobody really believes that Microsoft Word or Adobe PDFs are the platonic ideal of word processing and document sharing, but they work well enough, and everybody else is using them. Facebook has become this type of utility for most social networking.

Second, the dominance of the network has also changed the dynamics of the broader internet. As News Feed takes up more people’s attention, it has acted like the interstate system, siphoning people away from the old ways that they used to find stories and connect with people on the internet. Other sites and apps became ghost towns, digital equivalents of the towns off the old state highways. Two that didn’t—Instagram and WhatsApp—Facebook snatched up. One that didn’t—Snapchat—Facebook cloned feature by feature.

But the biggest difference is that like any application that rewires social networks and attention, Facebook changes people. There’s nothing mystical about this statement. The platform shapes the nature of people’s relationships to each other and with news about the world. Their social lives are increasingly mediated by software. Their expectations of feedback from others change.

Facebook shapes the psychology of people to be more engaged Facebook users. And this is its greatest form of control.

When people say (or joke) that when you’re on Facebook, you’re not the customer but the product, it’s worth noting that the product keeps getting better.

People know that they have to make themselves “algorithmically recognizable,” as Microsoft Research’s Tarleton Gillespie puts it. If they do not do so, they will find themselves left out of their friends’ feeds, which means left out of a big chunk of their friends’ lives. University of Copenhagen researcher Taina Bucher calls this the “threat of invisibility.”

And so, people use their social intelligence to make sense of Facebook. They develop folk mythologies that Bucher calls “the algorithmic imaginary,” which offer them some sense of control over how Facebook might read them.

This has consequences. In September, Yale psychologist M.J. Crockett published a paper in Nature Human Behaviour arguing that moral outrage changes when extruded through social networks. “Digital media may promote the expression of moral outrage by magnifying its triggers, reducing its personal costs and amplifying its personal benefits,” Crockett writes. “At the same time, online social networks may diminish the social benefits of outrage by reducing the likelihood that norm-enforcing messages reach their targets, and could even impose new social costs by increasing polarization.”

That’s just one example. There are many others. And all of them, from Facebook Inc.’s perspective, would be mere externalities in its drive for attention. For you and me, they might be something more.

But individual users are not the only players in the Facebook information environment. While people like to think that things get popular by “going viral,” my colleague Derek Thompson has pointed out that a body of research shows that distribution power really matters. Most of the time when something goes big, it’s because a site or page or person or app with a large audience features it. While the nature of the story or video matters somewhat, these key distribution moments determine popularity.

On Facebook, every serious media company must maintain a presence. The same is true for advocacy organizations, activists, and other groups that depend on reaching an audience. These institutions undergo a more conscious optimization process than individual users. They hire consultants and train social media managers. They talk with Facebook and read research on what’s working. They monitor CrowdTangle for things that many people are sharing. They know that their survival depends, in large part, on generating stories that can be popular on Facebook and then getting them in front of large audiences of people who will share these stories.

It’s common practice to “swap” links between different media outlets. Sometimes these relationships are formalized. Other times it’s just friends or former colleagues helping each other out to pop a story.

Each and every player in this increasingly large part of the media world has to be intimately familiar with every facet of Facebook in order to develop an algorithmic intuition about which stories might get shared, and under which headlines. Everyone monitors everyone else for breakthroughs in style, art, story type, and anything else that could provide a scrap of advantage.

And of course, the same social media people who work for media companies can move over to industry as well, where there is a whole other set of tools and optimizations. To update the old adage that every company is a media company: every company is also a social media company.

If normal users are evolving creatures in a well-resourced environment, the media (and other) organizations are like bacteria under brutal selective pressure. They are evolving, transmuting, swapping DNA at tremendous speed. And of course if regular users come up with something, they adopt it immediately.

This memetic competition and co-evolution is one reason the media landscape is so dizzying. Every single node in the network is changing constantly, which generates unpredictable effects within the larger entity.

And there is even more optimization going on within each media organization: writers and videomakers—journalists and other content producers—are also learning, both as regular users themselves and through the best practices that filter down to them, officially or unofficially, from the social media people.

This is to say nothing of scammers, alt-right memelords, hoaxers, Russian agents, or anything of the sort. This is just your base case Facebook.

It exists in this precise configuration because of decisions that Facebook has made. In particular, the design of the site flattens brands out. People see things “on Facebook,” as opposed to in The Atlantic or on Fox News. Each post looks similar. The feed doesn’t differentiate between news and news-like content.

The unit of distribution for media used to be a bundle: magazine, newspaper, television show, etc. The internet tore that apart. The URL became the atomic unit, but there was a central place that editors controlled called a home page (or, as we once envisioned, an app). Facebook ripped those apart, too, and few homepages have real distribution power (though they’re important for other reasons). The ease of digital publishing, this great unbundling, and Facebook’s design have all flattened the distance between what used to be called news or journalism and everything else.

And it is within this wild, dynamic system that our political process now plays out. Facebook can easily fix some of the ad transparency problems that the 2016 election brought to light. They can simply disclose political ads and, crucially, some sense of how they were targeted. They have already announced they are going in this direction, which is good.

But the larger problems require that Facebook try to exert power within this extremely complex ecosystem, on which billions of people and the world’s journalism outfits have come to depend.

* * *

I made a list of the other kinds of control that Facebook Inc. has over Facebook the system. It is extensive.

Of course, there is the thing that everyone talks about: the way that Facebook ranks stories that show up on News Feed. They have machine learning rules that predict what people will click, like, and share. They run other software over the top based on surveys and “information integrity” work. And then boom: You’ve got a ranking. They could change those weights. They certainly have through time, as we saw with the massive burst in video in 2015 and 2016.
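
As a crude illustration of what “changing those weights” would mean, here is a hypothetical ranking function in Python. The signals, weights, and predicted probabilities are all invented; Facebook’s actual model is vastly more complex and not public:

    # Hypothetical feed ranking: predicted engagement probabilities
    # combined through tunable weights. Every value here is invented.
    WEIGHTS = {"click": 1.0, "like": 2.0, "share": 4.0}

    stories = [
        {"title": "Local election results", "click": 0.10, "like": 0.02, "share": 0.01},
        {"title": "Outrageous hot take",    "click": 0.25, "like": 0.08, "share": 0.12},
        {"title": "Friend's baby photos",   "click": 0.30, "like": 0.20, "share": 0.03},
    ]

    def score(story):
        # Weighted sum of the model's predicted engagement probabilities.
        return sum(WEIGHTS[signal] * story[signal] for signal in WEIGHTS)

    for story in sorted(stories, key=score, reverse=True):
        print(f"{score(story):.2f}  {story['title']}")

With these made-up weights, the shareable hot take outranks the baby photos; halve the share weight and the order flips. That is the sense in which a handful of internal numbers can reorder everyone’s feed at once.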

They also have models of your social network that are based on what they’ve learned about how people interact with each other on the network. They could change those to emphasize closer relations or more recent connections or hometown affiliation. The relationship between how Facebook models your social “graph” and what the people on it share is obviously a key component of what shows up on News Feed, too.

There is the graphical and user interaction design of the site and the News Feed. Each piece of the design obviously can encourage or discourage certain types of behavior. Add a little Live button and more people will go live.

There are the mechanics of the tools that Facebook provides to allow people to organize, which sociologist Zeynep Tufekci notes have lasting consequences.

There are standards. No porn, for example, and no pirated copyrighted material. People must also go by their “real names.” Agree or disagree, these standards certainly shape the community’s functioning in important ways.

There are corporate practices. Facebook salespeople have brought many organizations onto Facebook. They’ve done outreach with media companies and politicians. They’ve marketed the company to certain crowds. They maintain a large “policy” arm, which is to say a lobbying arm, that fights to quash bills that might hamper Facebook’s business. There are the decisions Facebook has made about the opacity of its News Feed ranking algorithms and ad targeting. All of these things shape who is on Facebook, how they use the product, and what decisions Facebook makes.

And then there are the axioms that undergird the product. Facebook has never fully laid them out, but there are three that I feel confident advancing:

  1. Facebook Inc. should not determine what is true and what is false information.
  2. Engagement measures quality of content.
  3. The most engaging content should be spread to the most people.

Each of these axioms structures the media ecosystem as we know it. These are the ground rules governing the actual product development at Facebook, and therefore the ground rules for the evolving use of Facebook by everyone else. These rules make sense given how Facebook was created: It was a way to connect friends. And why should some company try to fact-check your father-in-law’s posts? That would be crazy.

But, given Facebook’s global, massive expansion, if you just stare at these axioms for a moment, you might see why the company might be encountering—and genuinely trying to solve—“information integrity” problems.

What Facebook—or government regulators—should now do, however, is far less clear. The past year has been the company’s “Frankenstein moment,” as Kevin Roose put it in The New York Times. “The company has been hit with a series of scandals that have bruised its image, enraged its critics and opened up the possibility that in its quest for global dominance, Facebook may have created something it can’t fully control,” Roose wrote.

Facebook’s own employees have warned about the consequences that could result from changing Axiom 1. For example, Chief Security Officer Alex Stamos tweeted that Facebook shouldn’t become the “Ministry of Truth.”

So many of the problems are hard to fix precisely because Facebook is so big and so dominant. Even small experiments to improve the system can reshape whole countries’ media ecosystems as just happened in six small nations, including Cambodia, which was already undergoing press repression by an authoritarian regime.

This scale did not come by accident. The company spread as aggressively as it could to grab all existing internet users and bring a Facebook variation of the internet to those without access.

That’s because the most obvious answer to the question of who controls Facebook is written into the company’s shareholding structure: Mark Zuckerberg. In his letter to prospective shareholders before the company’s IPO, Zuckerberg laid out the company’s goals, among them:

  • “We hope to rewire the way people spread and consume information.”
  • “We hope to improve how people connect to businesses and the economy.”
  • “We hope to change how people relate to their governments and social institutions.”

All of these goals were structured around the desire to “have the biggest impact” by “solving the most important problems.”

Mark Zuckerberg has been compared to many figures. In a 2010 New York Review of Books essay, Charles Petersen off-handedly compared Zuckerberg to Robert Moses, New York’s most powerful and feared 20th-century urban planner. As time goes on, the comparison seems ever more apt.

Moses was the subject of Robert Caro’s first masterwork, The Power Broker. In it, Caro details how Moses rewired the city, building bridges, tunnels, parks, housing, and more. He also razed neighborhoods and bulldozed opponents, deftly using power to create a city that was both rationally planned and idiosyncratically crafted to Moses’ standards.

In April, science and technology scholar David Banks expanded on the comparison in a post on Cyborgology. “Mark Zuckerberg will be our generation’s Robert Moses. Fueled by a deep belief in rational systems’ ability to reward the best with power over the rest, Moses commanded massive budgets, built enormous public works projects, and never ran for a single elected office,” he wrote. “Moses wielded bureaucracy and technocratic authority the way Obama could use rhetorical flourish and moral authority. He was a master of the craft and changed what it meant to use it. Zuckerberg could carve out a similar role for himself on a bigger scale.”

Zuckerberg has cultivated a nerdy, engineer-hacker image that seems the very opposite of the imposing Moses bureaucrat. But both were builders. They believed they were making the world a better place by building (roughly) democratic infrastructure. One might even say that they both wanted a more connected world.

And builders need power. As the years have gone by, I’ve come to find it remarkable that most people did not translate Zuckerberg’s (and Silicon Valley’s) “have the biggest impact” to “amass the most power.” But power is a necessary step on the way to impact.

Facebook Inc. has it now. They got what they wanted, and now, to twist Stamos’s point, they’re acting as if it were a punishment.

The status quo may be tenable—just look at the upward march of Facebook’s share prices—but if Mark Zuckerberg had been OK with the status quo, we wouldn’t all be living in Facebook world.

And as we are likely to see in the hearings this week, even if he tries to keep the company more or less on the path it has been, its home country’s legislature may just have something to say about that.




