We propose and apply a requirements engineering approach that focuses on security and privacy properties and takes various stakeholder interests into account. The proposed methodology facilitates the integration of security and privacy by design into the requirements engineering process, so that specific, detailed security and privacy requirements can be implemented from the very beginning of a software project. The method is applied to an exemplary application scenario in the logistics industry. The approach includes the application of threat and risk rating methodologies, a technique for deriving technical requirements from legal texts, and a matching process that avoids duplication while consolidating all essential requirements.
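The matching step described above could, in a simple form, be implemented as near-duplicate detection over the collected requirement texts. The following is a minimal, hypothetical Python sketch; the Requirement type, the token-overlap metric, and the similarity threshold are illustrative assumptions, not the authors' actual procedure.

```python
from dataclasses import dataclass

@dataclass
class Requirement:
    source: str  # e.g. a threat analysis, a legal text, a stakeholder interview
    text: str

def _tokens(text: str) -> set[str]:
    return {t.lower().strip(".,;:") for t in text.split()}

def similarity(a: str, b: str) -> float:
    """Jaccard overlap of word tokens; 1.0 means identical token sets."""
    ta, tb = _tokens(a), _tokens(b)
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

def merge_requirements(reqs: list[Requirement],
                       threshold: float = 0.7) -> list[Requirement]:
    """Keep one representative per group of near-duplicate requirements."""
    merged: list[Requirement] = []
    for r in reqs:
        if not any(similarity(r.text, m.text) >= threshold for m in merged):
            merged.append(r)
    return merged
```

In practice the threshold and the similarity measure would need tuning, and flagged near-duplicates would more likely be reviewed by an analyst than merged automatically.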
Cyberspace: a world at war. Our privacy, freedom of speech, and with them the very foundations of democracy are under attack. In the virtual world, frontiers are not set by nations or states; they are set by those who control the flows of information. And control is what everybody wants.
The Five Eyes are watching, storing, and evaluating every transmission. Internet corporations compete for our data and decide if, when, and how we gain access to it and to their supposedly free services. Search engines control what information we are allowed - or want - to consume. Network access providers and carriers fight for control of larger networks and for better ways to shape traffic. Interest groups and copyright holders struggle to limit access to specific content. Network operators try to keep their networks and their data safe from outside - or inside - adversaries.
And users? Many of them just don’t care. Trust in concepts and techniques is implicit. Those who do care try to take back control of the Internet through privacy-preserving techniques.
This leads to an arms race between those who try to classify traffic and those who try to obfuscate it. But good or bad lies in the eye of the beholder, and one will find oneself fighting on both sides.
Network Traffic Classification is an important tool for network security. It allows the identification of malicious traffic and possible intruders, and can also be used to optimize network usage. Network Traffic Obfuscation is required to protect transmissions of important data from unauthorized observers and to keep the information private. However, with security and privacy both crumbling under the grip of legal and illegal black hat crackers, we dare say that contemporary traffic classification and obfuscation techniques are fundamentally flawed. The underlying concepts cannot keep up with technological evolution. Their implementations are insufficient and inefficient, and require too many resources.
We provide (1) a unified view of the apparently opposed fields of traffic classification and obfuscation, their deficiencies and limitations, and how they can be improved. We show that (2) using multiple classification techniques, each optimized for a specific task, reduces overall resource requirements and thereby increases classification speed. (3) Classification based on application-domain behavior yields more accurate information than attempts to identify communication protocols. (4) Current approaches to identifying signatures in packet content are slow and require large amounts of space or memory; enhanced methods reduce these requirements and allow faster matching. (5) Simple, easy-to-implement obfuscation techniques allow circumvention of even sophisticated contemporary classification systems. (6) Trust and privacy can be increased by reducing communication to a required minimum and limiting it to known and trustworthy communication partners.
Our techniques improve both security and privacy and can be applied efficiently on a large scale. It is but a small step in taking back the Web.
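To make claim (2) concrete, here is a minimal, hypothetical sketch of a staged classification pipeline: cheap checks run first, so most packets never reach the more expensive stages. The stages shown (port lookup, payload prefix signatures) are common generic techniques used for illustration, not the specific methods developed in this work.

```python
from typing import Callable, Optional

# A stage inspects (payload, destination port) and returns a label or None.
Classifier = Callable[[bytes, int], Optional[str]]

def by_port(payload: bytes, dst_port: int) -> Optional[str]:
    """Cheapest stage: well-known port numbers."""
    return {80: "http", 443: "tls", 53: "dns", 22: "ssh"}.get(dst_port)

# Byte prefixes that identify a protocol regardless of the port used.
SIGNATURES = {b"\x16\x03": "tls", b"GET ": "http", b"SSH-": "ssh"}

def by_signature(payload: bytes, dst_port: int) -> Optional[str]:
    """More expensive stage: inspect packet content."""
    for prefix, label in SIGNATURES.items():
        if payload.startswith(prefix):
            return label
    return None

STAGES: tuple[Classifier, ...] = (by_port, by_signature)

def classify(payload: bytes, dst_port: int) -> str:
    # Stop at the first stage that produces an answer.
    for stage in STAGES:
        label = stage(payload, dst_port)
        if label is not None:
            return label
    return "unknown"

print(classify(b"GET /index.html HTTP/1.1", 8080))  # -> "http"
```

A real pipeline would add confidence scores and behavioral features (claim (3)); the point here is only the ordering of stages from cheap to expensive.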
The IT unit is not the only provider of information technology (IT) used in business processes. Aiming for increased performance, many business workgroups autonomously implement IT resources that are not covered by their organization's IT service management. This is called shadow IT. The risks and inefficiencies associated with this phenomenon challenge organizations: they need to decide how to deal with identified shadow IT and whether the business or the IT unit should be responsible for the corresponding tasks and components. This study proposes design principles for a method to control identified shadow IT, developed through action design research in four organizational settings. The procedure results in an allocation of IT task responsibilities between the business workgroups and the IT unit based on risk considerations and transaction cost economics, leading to a governance of IT services. This contributes to governance research on adaptive and efficient arrangements with reduced risks for business-located IT activities.
In many organizations, business workgroups autonomously implement information technology (IT) outside the purview of the IT department. Shadow IT, which evolves as a type of workaround from nontransparent and unapproved end-user computing (EUC), is the term used for this phenomenon, which challenges norms of IT controllability. This report describes shadow IT based on case studies of three companies and investigates its management. In 62% of the cases, the companies decided to reengineer detected instances or to reallocate related subtasks to their IT department. Considerations of risk and of transaction cost economics with regard to specificity, uncertainty, and scope explain these actions and the resulting coordination of IT responsibilities between the business workgroups and the IT departments. This turns shadow IT into controlled, business-managed IT activities and enhances EUC management. The results contribute to the governance of IT task responsibilities and provide a way to formalize the role of workarounds in business workgroups.
InBetween
(2017)
Traditional Western philosophy, cognitive science and traditional HCI frameworks approach the term digital and its implications with an implicit dualism (nature/culture, theory/practice, body/mind, human/machine). What lies between is a feature of our postmodern times, in which different states, conditions or positions merge and co-exist in a new, hybrid reality, a “continuous beta” (Mühlenbeck & Skibicki, 2007) version of becoming. Post-digitality involves the physical dimensions of spatio-temporal engagements. This new ontological paradigm reconceptualizes digital technology through the experience of the human body and its senses, thus emphasizing form-taking, situational engagement and practice rather than symbolic, disembodied rationality. This raises two questions in particular: how to encourage curiosity, playfulness, serendipity, emergence, discourse and collectivity? How to construct working methods without foregrounding and dividing the subject into an individual that already takes position? This paper briefly outlines the rhizomatic framework that I developed within my PhD research. This attempts to overcome two prevailing tendencies: first, the one-sided view of scientific approaches to knowledge acquisition and the purely application-oriented handling of materials, technologies and machines; second, the distanced perception of the world. In contrast, my work involves project-driven alchemic curiosity and doing research through artistic design practice. This means thinking through materials, technologies and machinic interactions. Now, at the end of this PhD journey, 10 interdisciplinary projects have emerged from this ontological queer-paradigm that is post-digital crafting 4.0. Below I illustrate this approach and its outcomes.
This PhD investigation lies at the intersection of Architecture, Textile Design and Interaction Design and speculates about sustainable forms of future living, focussing on bionic principles to create alternative lightweight building structures with textiles and digital fabrication techniques. In an interdisciplinary, practice-based design approach, informed by radical case studies on soft architectures from the 1960s to 80s by Archigram, Buckminster Fuller, Cedric Price, or Yona Friedman, by critical theory on new materialism (D. Haraway 1997, K. Barad 1998, J. Bennett 2007), and by sociological, philosophical (B. Latour 2005, G. Deleuze & F. Guattari 1987) and phenomenological thinkers (L. Malafouris 2005, J. Rancière 2004, B. Massumi 2002, N. Bourriaud 2002, M. Merleau-Ponty 1963), this research investigates the cultural and social rootedness (Verortung) of novel materials and technologies, exploring in-between prosthetic relations between the body and the environment.
A central task of electrocardiographic examinations is to increase the reliability of diagnosing the condition of the heart. Within this task, an important direction is the solution of the inverse problem of electrocardiography, based on the processing of electrocardiographic signals from multichannel cardiac leads with known electrode coordinates for these leads (Titomir et al., Noninvasive Electrocardiotopography, 2003; Macfarlane et al., Comprehensive Electrocardiology, 2nd ed., Chapter 9, 2011).
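In its standard formulation, this inverse problem is ill-posed and is usually stabilized by regularization. The following is a minimal numpy sketch of zero-order Tikhonov regularization, a generic textbook approach rather than the specific method of this work. The transfer matrix A, which maps cardiac source potentials to body-surface potentials at the known electrode positions, is assumed to be given (e.g. from a torso model); here it is generated randomly only to keep the example self-contained.

```python
import numpy as np

def tikhonov_inverse(A: np.ndarray, b: np.ndarray, lam: float) -> np.ndarray:
    """Solve min_x ||A x - b||^2 + lam * ||x||^2 in closed form."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# Synthetic demo: recover 10 source potentials from 64 noisy surface leads.
rng = np.random.default_rng(0)
A = rng.standard_normal((64, 10))      # placeholder for a real transfer matrix
x_true = rng.standard_normal(10)       # "true" cardiac source potentials
b = A @ x_true + 0.01 * rng.standard_normal(64)  # noisy surface measurements
x_est = tikhonov_inverse(A, b, lam=1e-3)
print(np.linalg.norm(x_est - x_true))  # small reconstruction error
```

The regularization parameter lam trades data fidelity against solution size and is typically chosen by methods such as the L-curve or generalized cross-validation.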