Between data privacy problems and Russian bots, big tech platforms like Facebook are giving U.S. policymakers a serious headache. And that goes double for the Democratic Party. According to Axios, Sen. Mark Warner (D-Va.) has a new paper laying out multiple policy options for how to rein them in.

Kudos to Warner at the outset: The paper is clearly intended as a conversation starter; it's a conversation we need to have; and some of its ideas are genuinely worthwhile. But the paper also dances around the central problem presented by Facebook and its ilk, probably because that problem suggests solutions too radical for many Democrats to stomach.

In their desire to avoid looking radical, Democrats could wind up pushing solutions that are actually more dangerous than the alternative.

The Democratic Party is particularly concerned about the issue because of the 2016 election. Almost two years later, we're still piecing together how disinformation campaigns used Facebook and other platforms. But the problem is also deeper than politics. Warner's paper describes it well: There's the potential for manipulating democracy, the issues of user privacy, and the question of whether the gargantuan size of such platforms chokes off competition. Warner doesn't aim to prescribe a definitive policy, but to propose a menu of options.

The thing is, before we start picking from the menu, we have to decide what Facebook is. As Ezra Klein recently laid out, Facebook can't really be treated as an "open" platform in the traditional sense anymore: "Facebook is making critical choices all of the time," Klein wrote. "The visibility of posts is driven by Facebook's newsfeed algorithms, the content is governed by Facebook's code of conduct, and a publisher like Infowars uses a different kind of Facebook page altogether." These same caveats, particularly the point about algorithms, apply to other major tech companies like Google as well.

That means you have to think of Facebook and other platforms as publishers.

If Facebook is a publisher, then demanding it do something about "fake news" and disinformation makes sense. We all expect other publishers, like Klein's Vox or ourselves at The Week, to police ourselves and to issue corrections when errors occur. The problem is that many of these decisions are ultimately subjective, and the line between self-policing and censorship is thin.

The way the publishing industry solves this problem is through market competition. But Facebook is something like a monopoly publisher.

This brings us to what Warner doesn't recommend: Just breaking Facebook up. By using aggressive antitrust enforcement to smash it into multiple companies, you could solve most of its issues in one fell swoop. A competitive market would ensure users could punish poor data privacy or information filtering by simply moving to a competitor. But Warner obviously views this solution as too radical.

Instead, he seems to want to treat the big platforms the same way we treat other private companies with monopoly positions providing crucial services: as tightly-regulated public utilities.

It's not a terrible idea. The best parts of Warner's paper point in this direction: forcing platforms above a certain size to make activity data publicly available to researchers (they already know how to do this while keeping users' anonymity and privacy, they just don't like to do it for proprietary reasons); making disclosure requirements for online political ads more akin to those for radio and TV ads; creating a kind of information tech equivalent to fiduciary duties in the financial world, which would apply to search engines, social platforms, ISPs, and so forth; giving the Federal Trade Commission privacy rulemaking authority; and passing laws on privacy protection and algorithmic fairness similar to what's been enacted in Europe.

But effectively applying such rules is where the rubber meets the road.

As Axios pointed out, the paper doesn't include creating a new federal regulatory agency for big tech platforms and digital issues. That's not the end of the world. But the history of consumer protections in the financial industry prior to the creation of the Consumer Financial Protection Bureau, for instance, shows how diffusing these rules among many agencies — rather than giving them to a single enforcer with a single mission — can lead to lax enforcement. In the antitrust realm as well, regulators have often relied on these kinds of behavioral remedies as alternatives to just breaking companies up, and it hasn't always worked well. So I wouldn't put much stock in Warner's fix having much bite.

It could even backfire. That's because his recommendations are a minefield of potential free speech infringements.

Right now, federal law — Section 230 of the Communications Decency Act — protects tech platforms from tort and criminal liability for stuff people post on them. In the name of free speech and robust public discourse, media outlets also have wide legal latitude to publish analysis and information on matters deemed of public concern.

One of Warner's options is to reform that law to make platforms liable for stuff like "defamation, invasion of privacy, false light, and public disclosure of private facts." That would be a radical change, one that could bring a huge swath of public discourse under much stricter standards. For instance, what distinguishes "public disclosure of private facts" from a standard "scoop" by a newspaper? The option includes caveats acknowledging that we'll need to draw those distinctions, but it doesn't really say how. Lots of powerful people would probably love to bring America's public discourse under tighter control. And the confusion over what the tech platforms are, and how to treat them, could be a backdoor for them to silence the media.

To put it mildly, this seems like a lot to give up to keep Facebook intact.

Radical and aggressive solutions can be politically scary. But by going to the root of an issue they can also help us avoid creating whole new problems.