Format | Price |
---|---|
PDF Download | $6.95 |
Printed Black & White Copy | $7.25 |
This public-sourced case describes the evolution of Facebook's approach to monitoring user activity: deciding which potentially offensive or disturbing material could stay posted and which should be removed. Following intense media and regulatory (e.g., Congressional) scrutiny, the company revised its policies and reorganized its content-review teams. The case offers an opportunity to debate a key question the company faced: What should platform governance look like? As Facebook faced exponential growth in the number of user-generated posts each day, posts containing harmful or inappropriate content also increased, pressing Facebook to devise a set of guidelines that were not only applicable across many different types of posts and situations but also perceived as acceptable by Facebook users. Students put themselves in CEO Mark Zuckerberg's shoes to redesign Facebook's platform-governance policies, exploring the tradeoffs a platform business makes among user characteristics, intent, outcome, and norms. Students will also examine how platform-governance policies can shape Facebook's future strategy: for instance, the more Facebook curated content for its users, thereby acting as an arbiter, the greater the risk of shifting Facebook from a tech business to a news publisher, subjecting the company to a different set of regulations.
- Explore policies and tools to moderate user content on open platforms
- Practice conducting a 5Cs marketing framework analysis
- Identify challenges of managing free speech, community standards, and censorship worldwide