from the regulating-on-myths dept
It’s kind of crazy how many regulatory proposals appear to be based on myths and moral panics. The latest, just introduced, is the House version of the Filter Bubble Transparency Act, the companion to the Senate bill of the same name. Both bills are “bipartisan,” which makes it worse, not better. The Senate version was introduced by Senator John Thune, and co-sponsored by a bevy of anti-tech grandstanding Senators: Richard Blumenthal, Jerry Moran, Marsha Blackburn, Brian Schatz, and Mark Warner. The House version was introduced by Ken Buck, and co-sponsored by David Cicilline, Lori Trahan, and Burgess Owens.
While some of the reporting on this suggests that the bill “targets” algorithms, it only does so in the stupidest, most ridiculous ways. The bill is poorly drafted, poorly thought out, and exposes an incredible amount of ignorance about how any of this works. It doesn’t target all algorithms — and explicitly exempts search based on direct keywords, or algorithms that try to “protect the children.” Instead, it has a weird attack on what it calls “opaque algorithms.” The definition itself is a bit opaque:
The term “opaque algorithm” means an algorithmic ranking system that determines the order or manner that information is furnished to a user on a covered internet platform based, in whole or part, on user-specific data that was not expressly provided by the user to the platform for such purpose.
The fact that it then immediately includes an exemption for “age-appropriate content filters” only hints at some of the problems with this bill — which start with the fact that there are all sorts of reasons why algorithms recommending things to you based on more information than you provide directly might be kinda useful.

For example, a straightforward reading of this bill would mean that no site can automatically determine you’re visiting with a mobile device and format the page accordingly. After all, that’s an algorithmic system that uses information not expressly provided by the user in order to present information to you ranked in a different way (for example, moving ads to a different spot). What’s more, “inferences about the user’s connected device” are explicitly excluded from being used even if they are based on data expressly provided by the user — so even allowing a user to set a preference for their device type, and serve optimized pages based on that preference, would appear to still count as an “opaque algorithm” under the bill’s definitions.

You could argue that a mobile-optimized page is not necessarily a “ranking” system, except the bill defines “algorithmic ranking system” as “a computational process … used to determine the order or manner that a set of information is provided to a user.” At the very least, there are enough arguments either way that someone will sue over it.
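To see how broad that definition is, here’s a purely illustrative sketch of what mobile-device formatting looks like in code (the function names, section names, and User-Agent tokens are all invented for the example). Device type is inferred from the User-Agent header — user-specific data the user never “expressly provided … for such purpose” — and it determines the “order or manner” that page sections are furnished:

```python
# Hypothetical example: a trivial "algorithmic ranking system" under the
# bill's definition. The User-Agent string is user-specific data the user
# never expressly provided to the site for this purpose.

MOBILE_TOKENS = ("Mobile", "Android", "iPhone", "iPad")

def is_mobile(user_agent: str) -> bool:
    """Infer device type from data the user did not expressly provide."""
    return any(token in user_agent for token in MOBILE_TOKENS)

def order_page_sections(user_agent: str) -> list[str]:
    """Determine the 'order or manner' that page sections are furnished."""
    sections = ["article", "sidebar_ads", "comments"]
    if is_mobile(user_agent):
        # On small screens, move the ads below the comments.
        sections = ["article", "comments", "sidebar_ads"]
    return sections
```

Under a plain reading of the definition, even this ten-line responsive-layout helper “determines the order or manner that information is furnished … based, in whole or part, on user-specific data.”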
Similarly, lots of media websites offer you a certain number of free articles before you hit their register or paywall — and again, that’s based on information not expressly provided by the user — meaning that such a practice might be in trouble (which will be fun to watch when media orgs who use those kinds of paywall tricks but are cheering this on as an “anti-big-tech” measure discover what they’re really supporting).
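A metered paywall works the same way: the decision of what to furnish (the article or the paywall) turns on a server-set counter, not on anything the user expressly provided. A minimal, hypothetical sketch (the cookie name and free-article limit are invented):

```python
# Hypothetical metered paywall: the article count lives in a cookie the
# site set, not in data the user expressly provided, so a plain reading
# of the bill's definition would sweep this in too.

FREE_ARTICLES = 3

def should_show_paywall(cookies: dict[str, str]) -> bool:
    """Decide whether to furnish the article or the paywall, based on a
    server-set visit counter the user never supplied."""
    count = int(cookies.get("articles_read", "0"))
    return count >= FREE_ARTICLES

def record_visit(cookies: dict[str, str]) -> dict[str, str]:
    """Return updated cookies with the visit counter incremented."""
    count = int(cookies.get("articles_read", "0"))
    return {**cookies, "articles_read": str(count + 1)}
```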
The point here is that lots of algorithm/ranking systems that work based on information not expressly provided by the user are actually doing important things that would be missed if they suddenly couldn’t be done any more.
And, even if the bill were clarified in a bill-of-attainder fashion to make it clear it only applies to social media news feeds, it still won’t do much good. Both Facebook and Twitter already let you set up a chronological feed if you want it. But, more to the point, the very rationale behind this bill makes no sense and is not based in reality.
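The toggle those platforms already offer is conceptually simple: the same set of posts, furnished either in reverse-chronological order (which uses no user-specific data) or ranked by a predicted-engagement score (the “opaque” version). A hypothetical sketch, with invented field names and scores:

```python
# Hypothetical sketch of a feed toggle like the one Facebook and Twitter
# already offer: identical posts, two orderings.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    timestamp: int               # seconds since epoch
    predicted_engagement: float  # score derived from user-specific data

def rank_feed(posts: list[Post], chronological: bool) -> list[Post]:
    if chronological:
        # Transparent ordering: newest first, no user-specific data.
        return sorted(posts, key=lambda p: p.timestamp, reverse=True)
    # "Opaque" ordering: driven by inferences about the user.
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)
```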
Cicilline’s quote about the bill demonstrates just how ignorant he is of how all of this stuff actually works:
“Facebook and other dominant platforms manipulate their users through opaque algorithms that prioritize growth and profit over everything else. And due to these platforms’ monopoly power and dominance, users are stuck with few alternatives to this exploitative business model, whether it is in their social media feed, on paid advertisements, or in their search results.”
Except… as already noted, you can already turn off the algorithmic feed in Facebook, and as the Facebook Papers just showed, when Facebook experimented with turning off the algorithmic rankings in its newsfeed it actually made the company more money, not less.
Also, the name of the bill is based on the idea of “filter bubbles,” and many of its co-sponsors claim that these websites purposefully drive people deeper into such bubbles. However, as we just recently discussed, new research shows that social media tends to expose people to a wider set of ideas and viewpoints, rather than more narrowly constraining them. In fact, people are much more likely to face a “filter bubble” in their local community than by being exposed to the wider world through the internet and social media.
So, in the end, we have a well-hyped bill based on the (false) idea of filter bubbles and the (false) idea of algorithms only serving corporate profit, which would require websites to give users a chance to turn off an algorithm — which they already allow — and which would effectively kill off other useful tools like mobile optimization. It seems like the only purpose this legislation actually serves is to let these politicians stand up in front of the news media, claim they’re “taking on big tech!”, and smile disingenuously.
Filed Under: algorithms, antitrust, big tech, david cicilline, filter bubble transparency act, filter bubbles, john thune, ken buck, opaque algorithms, ranking, richard blumenthal