I highly recommend everyone read Mark Zuckerberg's Building Global Community note. These ideas raise challenging / fascinating / relevant questions for the rules of our connected world.
It's also interesting, from a strategic perspective, to see that Facebook regards basically all aspects of community as falling under its purview. On the one hand this is ambitious in a positive way and shows Facebook taking responsibility for its influence; on the other, I struggle to see whether it leaves room for non-FB-mediated community.[1]
As someone who spent a lot of time building community products for a specific audience with specific needs (students and educators), I know what it feels like to watch Facebook undermine those efforts by launching tools and features that only loosely meet those needs but win because of scale. I worry that the subtext of this note (or at least its realized eventuality) is basically if it's not on Facebook, it's not a community[2], and that the tools available to our communities will be limited to the lowest common denominator of what Facebook offers.
The Growing Importance of Sophisticated Social Tools
The many tools and capabilities described in Zuck's piece (much of which is AI-based content analysis) are incredibly important, and sometimes downright necessary, as the information age advances and more human activity is mediated digitally. This new digital medium we've arrived at comes with many flaws - a lack of empathetic cues in text that makes conversation more divisive, a penchant for reactivity driven by the abundance of always-on content, an addiction cycle brought on by the reward structure of content, and even vulnerability to manipulation en masse by bots. The more our activity moves to this digital medium, the more we need sophisticated tools to elevate discourse and keep communities cohesive, rather than resign ourselves to living with the flaws of the medium and being overwhelmed by the toxic raw data of the internet.
Now compare this to Facebook's strategy, which is to own the entire consumer social experience, because ad models require eyeballs. As Ben Thompson of Stratechery puts it, Facebook is a walled garden and not a platform because it captures the vast majority of the value itself, leaving little room for other companies and organizations to capture value. Meanwhile, by Ben's (and Bill Gates') definition, you only have a platform when you capture less value than you create for others. If Facebook succeeds in owning the majority share of the many facets of human community, a walled-garden strategy is extremely problematic, because Facebook may not leave enough room for others to shape communities in ways that Facebook itself does not support[3] - unless, that is, you exit the walled garden.
Diverse, Healthy Community Cannot Coexist with a Monopoly Social Network
Exiting the walled garden, though, means being subjected to that toxic raw data of the internet. The only way to avoid that is to use the same types of sophisticated tools that Facebook is describing here. However, if Facebook successfully leverages its scale to become a near-monopoly across all possible community activity (which I think it will), it may not leave room for other communities to build these tools themselves. There may not be the market, or the available talent, to reach the required sophistication. And so if you exit the walled garden, you enter the ghetto of poorly filtered internet communities struggling to keep at bay the toxic bots and the less-than-stellar human behaviors that emerge from the medium. Many an online community has struggled with this, going all the way back to Usenet and BBSes.
I know this is a bit of a dramatic line of thinking. It supposes little technology and knowledge transfer between Facebook and others, assumes there is no market outside Facebook for community products that can reach this level of sophistication, ignores the possibility of open source, and so on.
The reason I bring it up is that I read Mark's post as Community = Facebook. Again, both positive and negative things will arise if that comes to pass, but it is a huge shift that we should not underestimate. In that reality, I am concerned about what this means for the diversity of human community structure and for those who wish to operate differently from Facebook - either in a highly divergent way or, more likely, because their communities have small nuances[4] that are not well captured by Facebook's present definition of and toolset for community.
The idea that as AI gets better we will have tools that elevate communities and discourse is very exciting. I hope Facebook doesn't keep these tools to themselves, and that communities outside Facebook will have the power to overcome the weight of the toxic raw data of the internet and act as counterbalances to any potential community monoculture that may arise.
Nonetheless, I find Zuck's note to be largely positive and quite inspiring.
UPDATE: As time has passed my feelings of inspiration have (unsurprisingly) waned.
Zuck mentions 'physical community', but let's face it, given our Augmented Reality future, that will converge with Facebook-mediated community. ↩︎
Aligning Facebook with community is similar to how Google has aligned itself with the web. Companies aligning themselves with larger themes is a great tool for instilling corporate values and keeping incentives pure, as long as the directionality is preserved.
Good for Internet --> Good for Google
Good for Community --> Good for Facebook
However, if the directionality changes, you have a recipe for all sorts of misalignment of incentives and employee brainwashing. Imagine the horrible decisions and implications that would follow if companies believed:
Good for Facebook --> Good for Community
Good for Google --> Good for Internet ↩︎
Early on, Facebook did make an effort at becoming a platform and had a great deal more social data interoperability happening. In fact, I personally built two startups around this ecosystem. Unfortunately, because Facebook did not adequately police the use of this data, Facebook Apps became a privacy nightmare and synonymous with scams and users (rightly) feeling exploited, not to mention the nefarious things certain companies did with this data. ↩︎
Nuances like how new community members are onboarded with increasing freedoms over time, defining allowable content and configuring the methods of enforcing it, scaffolding community governance practices like voting, housekeeping of past content, etc. ↩︎