
Network Traffic Exposed and Concealed

Cyberspace: a world at war. Our privacy, freedom of speech, and with them the very foundations of democracy are under attack. In the virtual world, frontiers are not set by nations or states; they are set by those who control the flows of information. And control is what everybody wants. The Five Eyes are watching, storing, and evaluating every transmission. Internet corporations compete for our data and decide if, when, and how we gain access to that data and to their purportedly free services. Search engines control what information we are allowed - or want - to consume. Network access providers and carriers fight for control of larger networks and for better ways to shape the traffic. Interest groups and copyright holders struggle to limit access to specific content. Network operators try to keep their networks and their data safe from outside - or inside - adversaries. And users? Many of them just don't care; their trust in concepts and techniques is implicit. Those who do care try to take back control of the Internet through privacy-preserving techniques. This leads to an arms race between those who try to classify the traffic and those who try to obfuscate it. But good or bad lies in the eye of the beholder, and one may find oneself fighting on both sides.

Network traffic classification is an important tool for network security. It allows identification of malicious traffic and possible intruders, and can also optimize network usage. Network traffic obfuscation is required to protect transmissions of important data from unauthorized observers and to keep the information private. However, with security and privacy both crumbling under the grip of legal and illegal black hat crackers, we dare say that contemporary traffic classification and obfuscation techniques are fundamentally flawed. The underlying concepts cannot keep up with technological evolution. Their implementations are insufficient, inefficient, and require too many resources.
We provide (1) a unified view on the apparently opposed fields of traffic classification and obfuscation, their deficiencies and limitations, and how they can be improved. We show that (2) using multiple classification techniques, each optimized for a specific task, reduces overall resource requirements and subsequently increases classification speed. (3) Classification based on application domain behavior yields more accurate information than trying to identify communication protocols. (4) Current approaches to identifying signatures in packet content are slow and require large amounts of space or memory; enhanced methods reduce these requirements and allow faster matching. (5) Simple, easy-to-implement obfuscation techniques allow circumvention of even sophisticated contemporary classification systems. (6) Trust and privacy can be increased by reducing communication to a required minimum and limiting it to known and trustworthy communication partners. Our techniques improve both security and privacy and can be applied efficiently on a large scale. It is but a small step in taking back the Web.
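To illustrate what signature matching in packet content (point 4) involves, here is a minimal multi-pattern matcher in the Aho-Corasick style: all signatures are compiled into one automaton, so a payload is scanned in a single pass regardless of how many signatures are loaded. This is a generic textbook sketch for context only, not the enhanced methods developed in the thesis; the class name and example signatures are illustrative.

```python
from collections import deque

class MultiPatternMatcher:
    """Aho-Corasick-style matcher: build once, scan payloads in one pass."""

    def __init__(self, patterns):
        self.goto = [{}]   # per-node transition table (trie edges)
        self.fail = [0]    # failure links
        self.out = [[]]    # patterns ending at each node
        for pat in patterns:
            self._insert(pat)
        self._build_failure_links()

    def _insert(self, pat):
        node = 0
        for ch in pat:
            if ch not in self.goto[node]:
                self.goto.append({})
                self.fail.append(0)
                self.out.append([])
                self.goto[node][ch] = len(self.goto) - 1
            node = self.goto[node][ch]
        self.out[node].append(pat)

    def _build_failure_links(self):
        # BFS from the root; a node's failure link points to the longest
        # proper suffix of its path that is also a prefix of some pattern.
        q = deque(self.goto[0].values())
        while q:
            node = q.popleft()
            for ch, child in self.goto[node].items():
                q.append(child)
                f = self.fail[node]
                while f and ch not in self.goto[f]:
                    f = self.fail[f]
                self.fail[child] = self.goto[f].get(ch, 0)
                # Inherit matches reachable via the failure link.
                self.out[child] += self.out[self.fail[child]]

    def search(self, payload):
        """Return (offset, pattern) for every signature hit in payload."""
        node, hits = 0, []
        for i, ch in enumerate(payload):
            while node and ch not in self.goto[node]:
                node = self.fail[node]
            node = self.goto[node].get(ch, 0)
            for pat in self.out[node]:
                hits.append((i - len(pat) + 1, pat))
        return hits
```

The key property is that scan time depends on the payload length, not on the number of signatures; the cost moves into the automaton's memory footprint, which is exactly the space/speed trade-off the thesis targets. For example, `MultiPatternMatcher(["he", "she", "his", "hers"]).search("ushers")` reports overlapping hits for "she", "he", and "hers" in one pass.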

Author: Thomas Zink
Referees: Marcel Waldvogel, Oliver Haase
Document Type: Doctoral Thesis
Year of Publication: 2014
Date of final exam: 2014/12/18
Corporation: Universität Konstanz, Faculty of Sciences, Department of Computer and Information Science
Release Date: 2018/02/26
Tags: network traffic; traffic classification; traffic obfuscation; P2P; string matching
Page Number: XIV, 161 pages
DDC functional group: 004 Computer science
Open Access?: Yes
Licence: Creative Commons Attribution (CC BY)