Facebook has vowed to restore confidence in the social-media platform after a series of scandals. The company admitted to a massive privacy breach in April 2018, after it was revealed that it had shared the user data of millions of people with the data firm Cambridge Analytica.
Subsequent reports revealed how extensively the company tracks user preferences and identity attributes such as ethnicity, religion, and political affiliation. Although Zuckerberg has continued to promise to improve the platform and rein in its data-collection practices, some security experts remain skeptical, and the company faces several federal probes into how it tracks users.
And as a company whose stated mission is connecting people, Facebook clearly does not want to be held accountable for the connections that users genuinely want, regardless of whether Facebook surfaces those connections or users find them themselves. As a strategy professor, I am probably more empathetic to Facebook than most.
Facebook has a strategy of connecting people that has created a tremendous amount of value, but that same strategy is getting Facebook into a lot of trouble today. There are hard tradeoffs on all sides. My view is that there is no clear solution, but there are three broad routes that Facebook can pursue, potentially in conjunction.
As one route, Facebook can be more transparent about the fundamental tradeoffs that come with social networking, releasing research that documents specific harms, such as Instagram's effects on body image, alongside its ongoing advocacy for the value of connecting people.
These insights can guide regulators and position Facebook to steer regulation in a direction favorable to the industry; regulation that imposes costly compliance requirements can even act as a barrier to entry that protects incumbents like Facebook.
To comprehensively moderate all of its content, Facebook would need to keep advancing the frontier of algorithmic detection of undesirable content and increase its number of human moderators by an order of magnitude or more.
Facebook currently employs about 15,000 human moderators, each of whom views hundreds of content items daily, and it will need many more. This effort will cost billions of dollars and, perhaps more painfully for Facebook, force it to decide what content to restrict: curating for one person is censoring another.
However, no moderation effort can do much about the content flowing through encrypted WhatsApp or Messenger communications. For algorithm-originated connections, delegating accountability is impractical: the process is often a black box, and the underlying technology is a core piece of Facebook's intellectual property. Facebook therefore needs to be ready to take responsibility for the connections its algorithms promote.
For user-originated connections to undesirable content, however, Facebook has been unclear about who is accountable. The quasi-independent Oversight Board moves Facebook in the direction of delegating accountability, but it remains evasive and incomplete: the board reviews Facebook's content decisions only after the fact, on appeal, and it is still financially dependent on Facebook and too small to operate at scale.
Moving forward, Facebook can take on genuine accountability itself by massively ramping up its own moderation efforts; publicly and credibly hand that accountability to an outside authority; or leave accountability in the hands of individual users by taking a stand and fighting for its original mission of letting people connect freely, however they want.
Right now, Facebook is ambiguously doing all three, leaving no one accountable at the end of the day. Facebook serves as a convenient lightning rod for ire, but if Facebook disappeared off the face of the earth tomorrow, we would still face these problems again and again.
The Facebook Trap is intrinsic to social networking as a whole, and reflects the consequences of digital technology facilitating a more connected world.
Twitter has evolved along the same path as Facebook, using algorithms to connect people globally, with many of the same adverse consequences.
Snapchat, originally built on connecting friends, drastically redesigned its platform to drive indirect network effects that increase the amount of time users spend watching professional content. TikTok has rapidly become a powerhouse by using best-in-class algorithms to connect users to the most engaging content globally, without having to build from a network of real-life friends.
We all need to reckon with the consequences of what it means to connect more people more intensely.

Under the tech-friendly Obama administration, the Justice Department and the Federal Trade Commission allowed Facebook to swallow up quick-growing potential rivals. The breakup plan also faces steep hurdles: over the last few decades, American antitrust law has grown fecklessly friendly to corporations.
In June, a federal judge threw out sprawling antitrust cases against Facebook brought by the FTC and 40 states, saying that they had failed to prove that Facebook is a social media monopoly.
Imposing rules for what Facebook can and cannot publish or amplify has been a hot topic among politicians. Democrats in Congress have introduced proposals to police misinformation on Facebook, while lawmakers in Texas and Florida have attempted to bar social media companies from kicking people off for speech offenses, among them Trump.
As I wrote last week, these policies give me the creeps, since they inevitably involve the government imposing rules on speech. Just about all of them seem to violate the First Amendment.
Here is a seemingly obvious way to cut off Facebook at the knees: prohibit it from collecting and saving the data it has on us, thereby severely hampering its primary business, targeted advertising. The rationale for this is straightforward.

Last year we made our privacy settings easier to find and rolled out GDPR-style controls around the world. As part of this, we asked people to review what personal information they share and what data they want us to use for ad targeting.
We also built better tools for people to access and download their data. We now have over 30,000 people working on safety and security, about half of whom are content reviewers working out of 20 offices around the world. We regularly share our latest enforcement numbers and more in our Enforcement Report. Last year, for instance, we updated our policy against credible violence to include misinformation that has the potential to contribute to imminent violence or physical harm.
This is a particularly important change in countries like Myanmar where people have used our platform to share fake news meant to foment division and incite violence. Myanmar — and countries around the world where some people seek to weaponize our platform — remain a huge area of concern for us. We want to make sure people have the power to decide when and how they engage with our services. To do this, we worked with leading mental health experts, organizations and academics to launch dashboards in both Facebook and Instagram that track the amount of time spent on each app.
The aim here is to give people more insight into how they spend their time. This is particularly true with ads related to politics.