The guys from Transparify have called my attention to an interesting article by Grant Gross published at pcworld.com. Gross used a Transparify-style DIY approach to analyze the financial transparency of different groups (non-profits, think tanks, advocacy associations) engaged in the net neutrality debate. There is a lot of diversity: from organizations that disclose their donors and the amounts donated (rated “A”) to others that provide no information whatsoever about their contributors. His summary: most of these groups get poor grades for financial transparency.
This rating gives me a valuable opportunity to show why financial transparency, though important, is never enough in itself. Blind ratings can endorse the poor, biased work of policy research institutions or fail to recognize good organizations; the accountability process is much broader, and it needs informed reporting eyes that look into goals, strategies and research output.
Let’s take the Center for Democracy and Technology, a DC-based group that received an A in this informal rating. Among others, they are funded by Google, Facebook and Verizon. But they are transparent, so they have nothing to hide, right? At the very least, they can’t be accused of astroturfing. Says Gross:
Astroturfing is commonly defined as a lack of funding transparency, paired with the appearance of grassroots support
With good reason, though, CDT might be accused of half of it: the appearance of grassroots support. According to my own analysis of one of its policy papers from last year, The Importance of Internet Neutrality to Protecting Human Rights Online, CDT has masked its advocacy in favor of net neutrality as an issue of civil rights and freedom of expression in order to garner support from the general public that it would be unable to generate if it spoke openly on behalf of its donors.
If you want the details of my analysis, you can read the text that follows. You will find a qualitative discussion of the meaning of the concepts and definitions used in CDT’s paper. No need to dive into data.
Policy paper analysis. Masking motivations in the net neutrality debate.
How does CDT represent Internet neutrality? The working definition appears somewhat late in the text, at the end of page 3:
the principle that providers of internet access should not discriminate in their carriage of Internet traffic on the basis of its source, destination, content, or associated application.
In other words, networks have traditionally been blind to the use made of the ones and zeros that run through their veins. This has been possible because networks belong to a different layer than user-related functions, leaving all the work of transforming information into usable content to the terminals held by users (the layering and end-to-end architecture principles).
According to this model, all internet nodes are structurally equivalent. By architectural design, this amorphous structure has no pillars, centers, meeting halls or walls. Any node can connect with any other and become a temporary nucleus that will eventually be reabsorbed into the horizontality of the architecture. For CDT, this consequence of the two architecture principles matters greatly for users: the internet is free of constraints; there are no limits or borders on how far information can travel, and no limits on what citizens can make of that information. These are the human principles of borderlessness and free choice.
Compared to the threefold model of communication, internet neutrality thus simplifies, at the microscopic level, to transmission between symmetrical endpoints:
The threefold model: Sender>Message>Receiver
The neutral network transmission: Node<>Transmission<>Node
Looking more closely (and noting the distinction between free expression and free choice), each iteration runs two ways: a user exerts free choice by requesting information, and then receives the information requested, fulfilling the free expression of others.
NNT W1: User > Request (Free choice) > Content
NNT W2: User < Transmission (Free Expression) < Content
Of course, content can take many different forms (software, audio, video, apps, written text, website contents, VoIP protocols, etc.), with different sizes and associated difficulties, but the enforcement of the end-to-end principle should guarantee that access providers only have to worry about the capacity of their networks.
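The end-to-end idea described above can be sketched in a few lines of code. This is a toy conceptual model, not real networking code; all names here are illustrative, and the `blocked` set stands in for any discrimination policy an ISP might apply.

```python
from dataclasses import dataclass
from typing import Optional, Set

@dataclass
class Packet:
    source: str
    destination: str
    payload: bytes  # opaque to the carrier: just "ones and zeros"

def neutral_forward(packet: Packet) -> Packet:
    # A neutral carrier: delivery depends only on network capacity,
    # never on source, destination, or content.
    return packet

def discriminating_forward(packet: Packet, blocked: Set[str]) -> Optional[Packet]:
    # A non-neutral carrier: traffic from disfavored sources is dropped,
    # violating the no-discrimination and no-blocking principles.
    if packet.source in blocked:
        return None
    return packet

p = Packet("small-startup.example", "user.example", b"video stream")
assert neutral_forward(p) is p
assert discriminating_forward(p, {"small-startup.example"}) is None
```

The contrast is the whole point: the neutral carrier never inspects the payload, so the startup and the incumbent are indistinguishable to it; the discriminating carrier is exactly what the no-blocking rule is meant to forbid.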
Problems with CDT’s account of internet neutrality
I have found CDT’s account to suffer from three major shortcomings.
1. Human Rights or Fair Market?
In my opinion, a very weak point in CDT’s position is the persistent focus on human rights even though the problem at hand matters most because of its economic implications. In the words of Tim Wu, who coined the term “net neutrality”:
The promotion of network neutrality is no different than the challenge of promoting fair evolutionary competition in any privately owned environment, whether a telephone network, operating system, or even a retail store. Government regulation in such contexts invariably tries to help ensure that the short-term interests of the owner do not prevent the best products or applications becoming available to end-users. The same interest animates the promotion of network neutrality: preserving a Darwinian competition among every conceivable use of the Internet so that only the best survive.
There. The network neutrality issue is not (eminently) about human rights; it’s not (eminently) about freedom of expression and freedom of choice; it’s not (eminently) about promoting civil liberties. CDT’s paper gives the impression that we need the FCC’s no-blocking rules to guarantee access to the websites of activist organizations, bloggers from Iran, or tweets from Venezuela and Ukraine. Instead, the current debate is about preserving a fair market where innovation is possible and the best new service can replace today’s technological giants; where a small company won’t lose to the big ones because it can’t afford a peering agreement with a major ISP.
This shortcoming comes first on my list because it is a hard blow against the rhetorical foundations of CDT’s position. Even though the paper opens with a strong humanist appeal to the importance of education and culture, ultimately the human right best defended is not freedom of expression but its pseudo-synonym, freedom of choice, which implies viewing internet users as users and customers rather than as citizens and creators.
2. The reality of networks
With a graphic from an article that appeared in Wired last June, I would like to challenge the notion that the mature internet as we know it has a neutral architecture.
Now, even if this image can be considered a distorted and overly simplified model of how the internet functions, it does bring into play the existence of backbone services, which are missing from CDT’s account, and it places Google’s search engine at the user’s end of the process. Given that major content players sit closer to the backbone than any other node in the network and have a disproportionate influence on how users access what they want, the symmetrical model above is probably misleading.
Opposed to this likelier picture, CDT’s discourse seems on the verge of falling into essentialism. It does acknowledge that certain characteristics of the internet “are not immutable”, but only after establishing that they are what “have defined the internet since its inception”. In the next section I will discuss some of the reasons why I don’t think the internet has always been, by definition, user-controlled and decentralized, which are for CDT two of “the defining attributes of the internet”.
3. What is an access provider, really?
Google is in the business of accessing the internet. So is Facebook. Both are major donors to CDT. Arguably, all these companies discriminate among internet traffic on the basis of its content, source, destination, or a combination of the above. Google’s algorithm ranks pages, creates an index, and prioritizes their relevance to the user. Facebook’s feed has great power to shape user behavior, as a recent and controversial study linking A/B tests to emotions has shown. Naver requires creating affiliated content in one of its Cafés in order to appear on its homepage. They all enjoy quasi-monopolistic positions in different markets and clash directly with Wu’s demand for more competition.
Arguably, these and other companies long ago disrupted the internet as a place where decentralized, user-controlled communication can happen. Yet (together with CDT) they are among the important names advocating for net neutrality.
If what we really want is a neutral network, we should probably take a few steps back and look at what normal user behavior is really like. The request-and-transmission model above describes an unrealistic relationship between users and the content of the internet, in which the user can freely decide what he or she desires and magically retrieve it. Instead, a few actors play a mediating role that has bent the way the internet works.
Google access: User > searches terms > chooses from indexed information
Facebook access: User < enters the feed < clicks on assorted contents
If Facebook decided to erase the updates of feminist pages from the feeds of all male users (and there are several reasons why it could be interested in such a move), or if Google decided to give priority to commercial websites, they could radically change the identity and ideology of users without their consent, and come into major conflict with their freedom of choice. To an important extent, this is already happening. So much so that, according to some, social media could be one of the places where freedom of expression is losing its democratic value, one of the strong claims of Eli Pariser’s The Filter Bubble.
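The gap between the idealized request-and-transmission model and mediated access can be made concrete with a small sketch. Again, this is a hypothetical illustration, not a description of any real platform’s algorithm; `catalog`, `inferred_interests` and `suppressed_topics` are invented names.

```python
def direct_retrieval(catalog, query):
    # The idealized model: the user freely requests and receives content.
    return [item for item in catalog if query in item["title"]]

def mediated_feed(catalog, inferred_interests, suppressed_topics=()):
    # The mediated model: an intermediary filters and ranks content
    # before the user ever sees it, silently shaping "free choice".
    visible = [c for c in catalog if c["topic"] not in suppressed_topics]
    return sorted(visible, key=lambda c: -inferred_interests.get(c["topic"], 0))

catalog = [
    {"title": "Campaign update", "topic": "activism"},
    {"title": "New phone review", "topic": "commerce"},
]
feed = mediated_feed(catalog, {"commerce": 5}, suppressed_topics=("activism",))
assert all(c["topic"] != "activism" for c in feed)  # the topic simply vanishes
```

The suppressed topic never reaches the user, and the user has no way to notice its absence: the feed still looks complete. That invisibility is what makes algorithmic mediation a problem for free choice.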
Currently, based on the Communications Act of 1934,
- ISPs are considered providers of information services and not providers of telecommunications services.
- Telecommunication service providers are common carriers, information services providers are not.
- Only common carriers, due to their close link to the public interest, can be subject to restrictions such as the no-discrimination and the no-blocking rule.
An obvious measure is amending the Communications Act of 1934. It is common sense that ISPs provide telecommunications services consisting in the transmission of information, so that distinction needs to be eroded. Moreover, in the twenty-first century more and more basic everyday actions, like looking for a job, purchasing health insurance or enrolling in college, require a connection to the internet, so the declaration of internet access as a utility (and of ISPs as common carriers) can easily be advocated with support from vast constituencies.
What we still need […] is a better way of regulating internet service providers. One way of doing this is through common carrier law —as defined in the Title II section of the 1934 Communications Act. Basically, this would treat ISPs as utilities. This would allow the government to prevent them from blocking or degrading traffic, but it would also force the ISPs to offer their internet lines to other companies. That creates competition, which is really the best way of ensuring that ISPs behave. As it stands, there’s very little competition.
The next step is actually tougher, because it involves regulating the companies that are in the business of accessing content. Can we imagine a federal agency or an international organization dedicated to the creation and updating of indexes that users could employ to navigate the internet? Google is currently fulfilling a job that in the 1920s would have belonged to a Ministry of Information, in what Siva Vaidhyanathan (The Googlization of Everything… and Why We Should Worry, p. 41) has called a public failure:
Public failure […] occurs when instruments of the state cannot satisfy public needs and deliver services effectively
Google does its job very well, but it would not be absurd to subject it to reasonable public controls.