For nearly two weeks, Facebook has been at the center of a media firestorm about whether its "human editors" have been inappropriately tampering with the "Trending Topics" seen by millions.
A U.S. senator has pressured Facebook for answers. Top Facebook executives have met with leading conservative figures like Glenn Beck to assure them that no systemic liberal bias reaches the users of its social network. And CEO Mark Zuckerberg has put his own integrity on the line to vouch that everything is aboveboard.
This is Facebook in 2016: a sprawling $300 billion giant with 1.6 billion users, under such intense scrutiny that even an unsubstantiated rumor of political bias by a single contracted editor, working on a section many users never look at, can set the political and media world on fire for weeks.
The real takeaway from this controversy, according to some longtime Facebook watchers, is less about the whiff of bias than the reminder that Facebook is now in uncharted waters, with no clear guidebook for how to manage itself and the expectations of its community.
"No matter how much you think, 'Some day, we'll get to everyone on the planet,' no company in any industry has ever been in that kind of position. Therefore there is no precedent for the structure, systems, responsibilities and controls that are necessary for an organization that has that degree of influence," David Kirkpatrick, author of The Facebook Effect, the definitive account of Facebook's rise, told Mashable's Biz Please podcast, which you can listen to below, or download on iTunes and Stitcher.
"[Mark] is still a relatively young person who is learning as he goes," says Kirkpatrick, who also founded the Techonomy conference, "as are all the people there because nothing like this has ever existed before."
Should Facebook have a public editor to vet its media efforts, as Mashable has suggested? Should it invite government oversight, or push for radical transparency with the public on this and other projects? These are all fair questions, but there is little precedent for the right answer.
Facebook could change a few lines of code and affect the personal lives of millions, an extreme level of influence it demonstrated, perhaps unintentionally, with the emotion manipulation study that came to light in 2014.
Zuckerberg, who famously wrote "I'm CEO, bitch" on his business cards in the early days, has remodeled himself in recent years as a young statesman, philanthropist and father of the year, effectively becoming a polished ambassador for the powerful business. Facebook, likewise, has attempted to build trust with a level playing field governed by algorithms. More than anything, the reports of Trending Topics bias threatened that trust.
"I consider them a highly ethical company that takes issues like this deeply seriously," Kirkpatrick says. "The fact that Zuckerberg met with these conservative leaders.... is a sign of how seriously he takes it and how much he feels misunderstood and how much he wants to demonstrate that he is fair-minded."
It seems to have worked: Beck praised Zuckerberg for looking him in the eye and appearing "sincere" in the desire to create a fair public space.
Perhaps that will end this particular controversy, perhaps not. Either way, the episode is still far from the worst that Zuckerberg and his team have experienced.
When Kirkpatrick first met Zuckerberg, it was over lunch in 2006, "right in the middle of the News Feed controversy," when a big chunk of Facebook's user base revolted over the feature's introduction. "But he just rolled right through it. Because being the data geek he was, he knew that people were actually using this new feature, and what they said was much less important than what they did."
As Facebook continues to stretch deeper into media, politics and the very core of our lives, data alone may not be enough to weather the storms that will inevitably come.