It's almost a cliché in Silicon Valley now. "The pivot" — turning a company around to focus on a new goal or message — is something every young firm seems to go through. But when the company in question is Facebook, a pivot is no joke. With what is possibly history's largest customer base, changes at Facebook affect people across the globe.

So it seems worth poking at what happened this week, when Facebook presented a new vision for itself as a company focused on privacy.

The shiny new image was presented at the company's F8 developer conference, and the change itself is unsurprising. Facebook has had a horrible couple of years of privacy gaffes and bad press. So, in response, CEO Mark Zuckerberg insisted he is now reshaping the entire company from top to bottom around privacy and community, primarily by shifting to encrypted messages and focusing on groups.

As first reported in January, the company is doing two major things concurrently: unifying the technical infrastructure behind Facebook, Instagram, and WhatsApp and encrypting all communications across the platforms; and making private groups as central to the experience as one's friends are now. The aim is ostensibly not only to focus on community, but also to pare back the scale of most people's experience of Facebook: think neighborhood group or family chat, rather than posting for everyone you've ever added as a friend. Zuckerberg highlighted one example of how this might work during his presentation: WhatsApp users can send money to each other or share their location privately, features that will eventually extend, interoperably, to Instagram and Facebook Messenger as well.

Yet ironically, in attempting to fix its privacy issues, Facebook may only exacerbate its other big problem. If privacy gaffes are one part of what makes Facebook a danger to its users, the other part is how it enables the spread of disinformation: a friendly home to misleading claims, bigotry, and closed enclaves of groupthink. In helping users congregate in private groups and messaging, Facebook could easily make that dynamic worse.

The example of integrating WhatsApp in particular is both instructive and worrying. WhatsApp has already been the site of a series of misinformation problems across the globe. In India especially, it has been used to spread falsehoods and to help foster communal violence. Part of the reason is that WhatsApp is by nature not only private but encrypted: what happens on the platform is essentially impossible to police, even when the effects are clearly harmful.

The trouble with expanding this model to all of Facebook's properties should be clear. As activity shifts into siloed, private groups and messaging, misinformation becomes even harder to counter. What happens, for example, when anti-vax or racist groups move into closed, private chats? The potential for falsehoods to travel unchallenged rises dramatically.

This isn't simple malice on the part of Facebook; the company is in a real catch-22. On the one hand, public visibility of extreme views helps normalize them, according to recent data from Harvard. The seeming return of white nationalism, for example, is arguably being helped along by the implicit and explicit expression of such views by prominent politicians and online figures. If that's the case, putting up barriers to the public broadcast of such opinions seems a reasonable response.

Yet, on the other hand, there is still some truth in the adage that sunlight is the best disinfectant. The moral and social opprobrium that can come from expressing fringe views in public can help clamp down on them, too. In creating more enclosed, private spaces in which communities dedicated to misinformation or hate can operate unimpeded, Facebook may in fact be sacrificing social health.

What that suggests is that the private-versus-public dichotomy may be the wrong lens through which to view Facebook's overarching problems as a global platform. Privacy is vital, and Facebook's missteps on that front have been serious and need to be addressed; but fixing privacy alone is insufficient to tackle misinformation or hatred. In fact, those problems may simply be too big for Facebook as it is currently structured: a social network with billions of users that runs on ads and thrives on engagement.

The change that needs to happen is not in how Facebook approaches privacy, but in how both the company and the rest of us think about digital networks' place in our social fabric, recognizing that they can, when unchecked, exacerbate our worst tendencies and foster division. The solutions to those problems will be far more difficult, and may not be technological at all: human moderators, regulation, or even breaking up the big tech companies. A redesigned app or website, after all, can only do so much.

Perhaps it is our politics that are in need of a major pivot.