Deep packet inspection and bandwidth management: Battles over BitTorrent in Canada and the United States

https://doi.org/10.1016/j.telpol.2012.04.003

Abstract

Two case studies explore the reciprocal influence between technological change and Internet governance. Both focus on the use by Internet service providers of a new capability known as deep packet inspection (DPI). DPI was used by major network operators in the U.S. and Canada to block or restrict the speed of peer-to-peer file sharing traffic by their customers. In both cases, DPI implementations led to public protests, litigation and major regulatory proceedings. In both cases, network neutrality norms were used to challenge DPI deployments. The paper's descriptive comparison is supplemented by quantitative data drawn from the use of Glasnost, a network test that allows third parties to detect BitTorrent throttling via DPI.

The paper asks whether the use of DPI by ISPs disrupted the way the Internet is regulated, and whether political and institutional factors alter or constrain DPI use. It finds that the power to shape traffic flows redistributes control among actors in the Internet ecosystem, generating broad political economy debates about efficiency, fairness, innovation and transparency. But the actual results of those conflicts are indeterminate, reflecting institutional and historical contingencies.

Highlights

► The use of DPI disrupted established methods of Internet governance in Canada and the U.S.

► Use of DPI for BitTorrent throttling led to public mobilizations, litigation and major regulatory proceedings.

► In the U.S., ISPs backed away from application-specific throttling despite the FCC's lack of authority.

► In Canada, CRTC regulations did not have any impact on prior use of DPI and actually led to an increase in its use.

Introduction

Deep packet inspection (DPI) is a technology for scanning and analyzing Internet traffic and making decisions about how to handle it in real time. In theory, DPI can be used to address various Internet governance problems. These include the security problems associated with malware (Kim & Lee, 2007), copyright protection on networks (de Beer and Clemmer, 2009, Rossenhövel, 2008), and the need to optimize or monetize Internet services (Aghasaryan et al., 2010, Vorhaus and Bieberich, 2007). But the most popular and important application of DPI is to analyze and manage bandwidth utilization (Coward, 2009, Finnie, 2009, Mochalski and Schulze, 2009).

Bandwidth is usually a shared, scarce resource on the Internet. The growing number and diversity of broadband applications, especially ones involving video, are increasing bandwidth consumption. Internet service providers (ISPs) must invest in expanding bandwidth and/or economize on its use. The pressures are especially intense for the mobile Internet. As a tool of bandwidth management, DPI introduces ‘intelligence’ into what has often been called a ‘dumb’ network (Isenberg, 1997). Data packets are inspected at Layers 2 through 7 as they move through the network, allowing the operator to discriminate among applications, services or users in the treatment their traffic receives (Proch & Truesdell, 2009). DPI can be used to block applications, to rate-limit or slow down packet flows, or to prioritize some users or applications over others.
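The classification step underlying this kind of bandwidth management can be sketched in a few lines. The BitTorrent handshake signature used below is real (a 0x13 length byte followed by the ASCII string "BitTorrent protocol"); the function names and the policy mapping are purely illustrative, not drawn from any vendor's implementation.

```python
# Minimal sketch of signature-based payload classification, the core idea
# behind DPI bandwidth management. Real DPI engines also correlate header
# fields and track flow state; this shows only the payload-matching step.

# Genuine BitTorrent handshake prefix: 0x13 + "BitTorrent protocol"
BITTORRENT_HANDSHAKE = b"\x13BitTorrent protocol"

def classify_payload(payload: bytes) -> str:
    """Return an application label for a TCP payload (hypothetical labels)."""
    if payload.startswith(BITTORRENT_HANDSHAKE):
        return "bittorrent"
    if payload.startswith(b"GET ") or payload.startswith(b"POST "):
        return "http"
    return "unknown"

def policy_action(app: str) -> str:
    """Map an application label to an illustrative traffic-management action."""
    return {"bittorrent": "rate-limit", "http": "forward"}.get(app, "forward")

# A BitTorrent handshake begins with the signature, 8 reserved bytes,
# then the 20-byte info hash and 20-byte peer id.
payload = BITTORRENT_HANDSHAKE + b"\x00" * 8 + b"x" * 40
print(policy_action(classify_payload(payload)))  # rate-limit
```

In practice, operators moved beyond such static signatures to behavioral and flow-statistics heuristics precisely because applications can obscure or encrypt their payloads, a limitation noted throughout the DPI literature.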

While some would restrict the definition of DPI to applications involving inspection of layer 7 (i.e., the packet payload), most others – including the authors – consider “whole packet inspection” to be a more accurate label for the technology. Most DPI applications inspect the whole packet, not just the data payload, and correlate information in the payload with information in headers (see Mueller (2011) and Parsons (2008) for further discussion of this definitional issue).

This new technological capability seems to carry the potential to fundamentally alter the governance of the Internet. Classically, the Internet was based on best-effort packet forwarding. By making the network ‘aware’ of what is going through it, and by linking traffic content to specific subscribers, DPI challenges three traditional principles of Internet governance. The first is the end to end principle, which places control of applications at the endpoints (Saltzer, Reed, & Clark, 1984); the second is the expectation that communications are confidential (Collins, 2010, Meyer and Van Audenhove, 2010, Topolski, 2008). Third, DPI challenges the idea that ISPs, as ‘mere conduits,’ cannot and should not be held responsible for regulating activity on their networks (Bendrath and Mueller, 2011, Frieden, 2007).

DPI for bandwidth management impacts the end-to-end principle most directly, allowing the ISP to interfere with or alter the behavior of applications or services run from the edges. For some, this raises fears that ISPs will discriminate between services and users on their network in ways that will stifle innovation and competition (Riley & Scott, 2009). Others counter that network operators, who must make large, risky investments in bandwidth, have the right and the obligation to manage their capacity efficiently as bandwidth scarcity becomes a bigger concern (Yoo, 2006). The long-running controversy over network neutrality is a symptom of the profundity and high stakes of this change.

Section snippets

Research questions and study method

DPI's enhanced capacity to monitor and manipulate Internet traffic creates a broad potential to change the architecture and governance of the Internet. But it is also possible that DPI use could be regulated and limited in ways that make its use consistent with the laws and norms of the existing Internet. The focus of this paper, therefore, is on whether a change in network technology leads to change in the way the Internet is regulated and governed. It seeks to answer two specific research

USA: The Comcast case

In 2002 the FCC, chaired by deregulation supporter Michael Powell, classified cable modem Internet service as an ‘information service.’ In the U.S. regulatory regime, this had far-reaching consequences. It meant that cable-supplied Internet service was largely deregulated and exempt from the common carrier obligations in Title II of the Communications Act. Although the decision was hotly challenged, the FCC was upheld by the U.S. Supreme Court in 2005.

Canada: P2P throttling blessed

Canada's institutional setting is similar in some respects to that of the United States. It is a federal system and the authority of the federal regulator, the Canadian Radio-television and Telecommunications Commission (CRTC), closely parallels that of the United States' FCC. But there are two key differences. One is that Canada's Telecommunications Act does not exempt facilities-based broadband Internet service providers from common carrier regulation. Section 36 of its Telecommunications Act

The comparison as reflected in Glasnost data

The Glasnost data provides a quantitative measure of DPI use for BitTorrent blocking or throttling. Data can be aggregated by country or by ISP, using a procedure outlined below. This aggregated data reveals how the percentage of positive tests changed over time; these changes can then be correlated with external events, such as regulatory decisions or publicity.
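The aggregation described above — grouping crowd-sourced test results and tracking the share of positive detections over time — can be sketched as follows. The field names and sample records are hypothetical; Glasnost's actual data schema is documented by Dischinger et al. (2010).

```python
# Illustrative sketch of aggregating throttling-test results by ISP and
# month to get the percentage of positive (throttling-detected) tests.
from collections import defaultdict

def positive_rate(tests):
    """tests: iterable of (isp, month, detected) tuples.
    Returns {(isp, month): fraction of tests where throttling was detected}."""
    counts = defaultdict(lambda: [0, 0])  # (isp, month) -> [positive, total]
    for isp, month, detected in tests:
        counts[(isp, month)][0] += int(detected)
        counts[(isp, month)][1] += 1
    return {key: pos / total for key, (pos, total) in counts.items()}

# Hypothetical records, not real Glasnost data.
tests = [
    ("isp_a", "2008-03", True),
    ("isp_a", "2008-03", True),
    ("isp_a", "2008-03", False),
    ("isp_b", "2008-03", True),
]
rates = positive_rate(tests)
print(round(rates[("isp_a", "2008-03")], 2))  # 0.67
```

A time series of these per-ISP fractions is what allows changes to be lined up against external events such as the FCC's Comcast order or the CRTC's traffic-management rulings.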

Glasnost is crowd-sourced data; people on the Internet have to choose to run the tests. The Glasnost client transfers data between the

Concluding analysis

This paper tried to answer two research questions:

  1. Has the use of DPI by ISPs disrupted the way the Internet is regulated?

  2. Do political and institutional factors alter or constrain DPI use?

The evidence presented here suggests an affirmative answer to the first question and leaves us with a puzzle regarding the second.

Regarding RQ1, in both the U.S. and Canada there was a direct causal relationship between DPI deployment and disruptive change in Internet regulation. A key fact is that both U.S.

References (31)

  • S. Kim et al. A system architecture for high-speed deep packet inspection in signature-based network intrusion prevention. Journal of Systems Architecture (2007)
  • A. Aghasaryan et al. Personalized application enablement by web session analysis and multisource user profiling. Bell Labs Technical Journal (2010)
  • Anderson, N. (2008). New filings reveal extent, damage of Bell Canada throttling. Ars Technica. Retrieved from...
  • R. Bendrath et al. The end of the net as we know it? Deep packet inspection and Internet governance. New Media and Society (2011)
  • P. Brey. Artifacts as social agents
  • R. Collins. The privacy implications of deep packet inspection technology: Why the next wave in online advertising shouldn't rock the self-regulatory boat. Georgia Law Review (2010)
  • M. Coward. Deep packet inspection optimizes mobile applications. EDN (2009)
  • J. de Beer et al. Global trends in online copyright enforcement: A non-neutral role for network intermediaries? Jurimetrics (2009)
  • Dischinger, M., Marcon, M., Guha, S., Gummadi, K. P., Mahajan, R., & Saroiu, S. (2010). Glasnost: Enabling end users to...
  • Finnie, G. (2009). ISP traffic management technologies: The state of the art. Report prepared for The Canadian Radio...
  • Frieden, R. M. (2007). Internet packet sniffing and its impact on the balance of power. Retrieved from...
  • Glasner, J. (2005). P2P fuels global bandwidth binge. Wired. Retrieved from...
  • Isenberg, D. (1997). Rise of the stupid network. Retrieved from...
  • Lemley, M. A., & Lessig, L. (2000). The end of end-to-end: Preserving the architecture of the Internet in the broadband...
  • L. Lessig. The future of ideas: The fate of the commons in a connected world (2001)