Facebook is in hot water. Public criticism is at a roiling boil following a series of data-sharing scandals. Multiple investigations are underway. Facebook's stock just took a sizeable hit, and its user growth is looking shaky.

Obviously, the company has courted controversy before. But this feels like a tipping point.

Which raises the question: Could the internet's premier social platform actually destroy itself?

The big news, of course, is the Cambridge Analytica scandal, involving a Trump-linked consulting firm working on new ways to shape political marketing through big-data analysis. Back in 2014, it created a Facebook app that let people take a personality quiz. Around 270,000 users obliged, allowing Cambridge Analytica to accumulate data on their characteristics and habits. But through those users' connections to other Facebook users, the firm was able to expand its data-gathering reach to around 50 million profiles. It also looks like Cambridge Analytica violated Facebook's terms of service in the process.

On Monday, the Federal Trade Commission (FTC) officially announced an investigation. Members of Congress asked Facebook CEO Mark Zuckerberg to come and testify. Multiple state attorneys general are launching investigations. No fewer than eight lawsuits have been filed against Facebook in just the last week.

The scandal gained a lot of steam from the connections between Cambridge Analytica and the Trump campaign. But it also just piled atop a long-running series of scandals dogging Facebook.

Just within the last few days, news broke that Facebook has been collecting text and call data from users on Android smartphones. The social media platform is being sued for allowing discrimination in online housing advertisements. In Britain, Facebook has been extremely willing to hand user data over to the police even when a legal warrant isn't involved. In fact, going back to 2007, Facebook has repeatedly faced controversies over how it shares user data with third parties. And then, of course, there's the platform's role in the dissemination of Russian propaganda in the 2016 U.S. presidential election.

There's a lot swirling around here. But I think the way to cut through the fluff is this: People who use Facebook think the platform is one thing. Facebook itself thinks it's something else.

Users believe they're signing up for a private, personal experience: a way to connect with friends and family. But Facebook sees itself as providing a way to connect with the entire world. That's why everything you broadcast — from articles to personal thoughts and photos — is available as data to third parties unless you specify otherwise. And if you shout your click habits or phone call records or thoughts on politics to the world, it's sort of silly to get upset when other people and institutions make use of that information.

The question is how honest and upfront Facebook is with users about its intentions. That was the cause of a spat between Facebook and the FTC back in 2011. As a result, Facebook agreed to do a lot more to explicitly ask users before it shared their data in various ways. The new FTC investigation is looking into whether the access Facebook gave Cambridge Analytica violated the 2011 deal, even if through negligence.

Here's the fundamental problem, though: Facebook benefits from this gap between its purpose and its users' perceptions. And the wider the gap, the more it benefits.

Users aren't Facebook's customers. They use the site for free. Advertisers and other third parties are Facebook's customers. And what Facebook provides them is data on its users. But if users realized what they were actually signing up for, it seems likely far fewer of them would do it. Facebook's creepy nature can fly under the radar as long as its data sharing is as hard to understand as possible. And turning data management into an annoying chore only helps its cause.

Tougher regulations on Facebook and similar social media platforms are certainly called for. But Facebook has always dragged its feet on efforts to safeguard data, because it has to: The more Facebook's actual purpose coheres with users' perception of that purpose, the less profitable Facebook becomes.

But even if more regulation isn't forthcoming, Facebook still could face an existential crisis. Users may not be its customers, but it still needs them to make its business model work.

We're not at a critical mass of user departures yet, but Facebook's growth is slowing down. In the U.S. and Canada, total users actually declined for the very first time at the end of 2017. The latest surveys suggest a further decline since the fourth quarter of last year, lining up with the deluge of data-sharing concerns.

And it's easy to see how things could quickly get worse for the social network. As Facebook grows ever larger, and the temptation to mine its data grows with it, its pile of offenses may become too large for most users to ignore. And why wouldn't they just leave?

Facebook boomed on its convenience and connectivity, but it doesn't provide anything fundamentally necessary. That, as much as the threat of regulation, is probably why the company's stock value collapsed by 16 percent in just a few days.

Facebook should face the music from regulators. But it may destroy itself all on its own.