For European space missions the importance of electric propulsion is growing strongly, and the technology has recently seen a surge of adoption in the telecom market. The initial drivers of this development were programs of the European Space Agency and projects of the European national space agencies. In addition, electric propulsion is now on the priority list of European commercial satellite manufacturers. Current programs target orbit raising and station keeping with full electric propulsion for telecom satellites. The European space industry, represented by individual companies, has developed specific and generic solutions for the electronics dedicated to powering and controlling electric propulsion systems. The European Space Agency and the European Commission provide support for enabling technology related to Power Processing Units (PPUs) and for increasing competitiveness.
This paper studies suitable models for the identification of nonlinear acoustic systems. A cascaded structure of nonlinear filters is proposed that contains several parallel branches, consisting of polynomial functions followed by a linear filter for each order of nonlinearity. The second order of nonlinearity is additionally modelled with a parallel branch containing a Volterra filter. These are followed by a long linear FIR filter that is able to model the room acoustics. The model is applied to the identification of a tube power amplifier feeding a guitar loudspeaker cabinet in an acoustic room. The adaptive identification is performed by the normalized least mean square (NLMS) algorithm. Compared with a generalized polynomial Hammerstein (GPH) model, the accuracy in modelling the dedicated real-world system can be improved to a greater extent than by increasing the order of nonlinearity in the GPH model.
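The NLMS update at the core of the adaptive identification can be sketched as follows; this is a minimal illustration on a synthetic purely linear system, not the cascaded nonlinear model from the paper, and all signal and filter names are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Unknown system to identify (here: a short linear FIR filter).
h_true = np.array([0.6, -0.3, 0.1])

# Input signal and noisy observation of the system output.
x = rng.standard_normal(5000)
d = np.convolve(x, h_true)[:len(x)] + 1e-3 * rng.standard_normal(len(x))

# NLMS adaptive identification.
L = len(h_true)          # filter length
mu = 0.5                 # step size, 0 < mu < 2
eps = 1e-8               # regularization against division by zero
w = np.zeros(L)
for n in range(L, len(x)):
    u = x[n - L + 1:n + 1][::-1]      # most recent L input samples
    e = d[n] - w @ u                  # a-priori error
    w += mu * e * u / (u @ u + eps)   # normalized update

print(w)  # converges toward h_true
```

The normalization by the input energy `u @ u` is what distinguishes NLMS from plain LMS and makes the step size insensitive to the input signal level.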
The multichannel Wiener filter (MWF) is a well-established noise reduction technique for speech processing. Most commonly, the speech component in a selected reference microphone is estimated. The choice of this reference microphone influences the broadband output signal-to-noise ratio (SNR) as well as the speech distortion. Recently, a generalized formulation for the MWF (G-MWF) was proposed that uses a weighted sum of the individual transfer functions from the speaker to the microphones to form a better speech reference, resulting in an improved broadband output SNR. For the MWF, the influence of the phase reference is often neglected because it has no impact on the narrow-band output SNR. The G-MWF allows an arbitrary choice of the phase reference, which is especially relevant in the context of spatially distributed microphones.
In this work, we demonstrate that the phase reference determines the overall transfer function and hence has an impact on both the speech distortion and the broadband output SNR. We propose two speech references that achieve a better signal-to-reverberation ratio (SRR) and an improvement in the broadband output SNR. Both proposed references are based on the phase of a delay-and-sum beamformer. Hence, the time-difference-of-arrival (TDOA) of the speech source is required to align the signals. The different techniques are compared in terms of SRR and SNR performance.
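The delay-and-sum alignment underlying both proposed phase references can be illustrated with integer sample delays; this is a toy sketch with assumed TDOAs and two microphones, not the G-MWF itself:

```python
import numpy as np

rng = np.random.default_rng(1)

# Clean source signal and two microphone signals with known integer TDOAs.
s = rng.standard_normal(2000)
delays = [0, 5]                      # assumed time-differences-of-arrival (samples)
mics = [np.roll(s, d) + 0.5 * rng.standard_normal(len(s)) for d in delays]

# Delay-and-sum: undo each TDOA, then average the aligned channels.
aligned = [np.roll(m, -d) for m, d in zip(mics, delays)]
beamformed = np.mean(aligned, axis=0)

def snr(y):
    """Broadband SNR in dB against the known clean signal."""
    noise = y - s
    return 10 * np.log10(np.sum(s**2) / np.sum(noise**2))

print(snr(mics[0]), snr(beamformed))
```

Because the speech components add coherently while the noise components add incoherently, averaging two aligned channels gains roughly 3 dB of broadband SNR over a single microphone.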
The guidance document „Unternehmensintegrität & Compliance – Was wirklich wichtig ist" published by the Forum Compliance & Integrity (FCI) aims to familiarize decision-makers in companies – above all executive boards, managing directors, supervisory boards and senior executives – with the essential foundations, theoretical background and application-oriented concepts of corporate integrity. It appears to be an indispensable prerequisite for responsible corporate governance that company leaders engage systematically with the moral side of doing business, in order to address legal and reputational risks preventively and to be able to draw lasting benefit from trust-based cooperation with the company's stakeholders.
Many negative examples from recent years in the areas of corporate corruption, money laundering, competition offences, violations of environmental law, etc. have shown that existing compliance systems could neither prevent systematic misconduct in or by companies nor detect it at an early stage. The "VW emissions scandal" epitomizes this "compliance failure". The cause frequently lies in the lack of seriousness and credibility of a company's integrity efforts.
The FCI guidance document therefore firstly aims to contribute to the discussion of how measures of "integrity management" can improve the effectiveness of conventional compliance systems. It is clear that these considerations are not final truths; rather, they are meant to encourage a serious effort towards better solutions. A second, further-reaching objective is to foster reflection on corporate responsibility beyond legally codified standards. A values-driven style of corporate governance of this kind recognizes the moral questions that are critical for the company, sets standards for its own conduct, and clearly and confidently delineates both its own ambition and the limits of the responsibility it can (possibly) assume.
The conceptual remarks, like the included recommendations for action in the form of dos and don'ts, are addressed to managers who are themselves convinced that sustainable success is supported by corporate governance with integrity. This also implies that there are companies without integrity that earn a great deal of money, and that the guidance document has no missionary ambitions whatsoever. It addresses managers and companies that have long committed themselves to the ideal of the "honourable merchant" and are looking for suggestions and proposals on how this ideal can be put into practice.
One thing is clear at the same time: even managers and companies that make a serious effort to lead and act responsibly and with integrity can fail in individual cases. The FCI guidance document aims to help ensure that no "systemic misconduct" arises in these organizations.
Integritätsmanagement
(2016)
Compliance has often failed in the past; the current "VW emissions scandal" is only one example. Not everything has to be reinvented, and not everything that companies did in the area of compliance in the past is bad. But it is time to rethink compliance. What really matters in Compliance Management 2.0? My answer is that seriousness and credibility must be fostered and can be fostered. How, I attempt to outline in six theses.
Digitally printed surfaces have to meet strict functional and aesthetic requirements. These properties are verified during quality inspection. Surface defects often only matter once they can be perceived by humans. Because of the high production speed, such an assessment of defect visibility has so far only been possible outside the production flow, through manual – and thus subjective – inspection. The goals of the project are (1) to model textures in a form adapted to the human visual system and (2) to automatically assess the perceptibility of texture defects. Within the project, a prototype system for the inline acquisition of textured surfaces was developed. Based on real images of industrially produced wood decors, a representative texture database was created. First results in defect detection based on statistical features are shown. These results serve as the basis for the later perception-oriented assessment. Ultimately, the results obtained in the project are to be incorporated into a prototype setup for the inspection of digitally printed decors.
Digital cameras are used in a large variety of scientific and industrial applications. For most applications the acquired data should represent the real light intensity per pixel as accurately as possible. However, digital cameras are subject to different sources of noise which distort the resulting image. Noise includes photon noise, fixed pattern noise and read noise. The aim of the radiometric calibration is to improve the quality of the resulting images by reducing the influence of the different types of noise on the measured data. In this paper, a new approach for the radiometric calibration of digital cameras using sparse Gaussian process regression is presented. Gaussian process regression is a kernel based supervised machine learning technique. It is used to learn the response of a camera system from a set of training images to allow for the calibration of new images. Compared to the standard Gaussian process method or flat field correction our sparse approach allows for faster calibration and higher reconstruction quality.
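The Gaussian process regression step can be sketched in closed form; this is a generic 1-D illustration with an RBF kernel and made-up response data, not the paper's sparse variant or its actual camera data:

```python
import numpy as np

def rbf(a, b, length=0.15):
    """Squared-exponential kernel matrix between point sets a and b."""
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / length**2)

rng = np.random.default_rng(2)

# Training data: noisy observations of a hypothetical smooth sensor response.
X = np.linspace(0, 1, 20)
y = np.sin(2 * np.pi * X) + 0.05 * rng.standard_normal(len(X))

# GP posterior mean at test points: K(X*, X) (K + sigma^2 I)^-1 y
sigma2 = 0.05**2
K = rbf(X, X) + sigma2 * np.eye(len(X))
Xs = np.linspace(0, 1, 101)
mean = rbf(Xs, X) @ np.linalg.solve(K, y)
```

Sparse GP methods replace the full kernel matrix `K` with a low-rank approximation built from a small set of inducing points, which is what makes the calibration faster than the standard method.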
FishNet
(2016)
The corrosion resistance of stainless steels is strongly influenced by the condition of their surface. Surface quality includes the topography of the surface, the structure and composition of the passive layer, and the near-surface structure of the base material. These factors are influenced by final physical/chemical surface treatments. The presented work shows significantly lower corrosion resistance for mechanically machined specimens than for etched specimens. It also turns out that the rougher the surface, the lower the corrosion resistance. However, there is no general finding as to whether blasted or ground surfaces are more appropriate; rather, the corrosion behavior depends on the process parameters and the characteristics of the corrosive exposure. The results show that not only the surface roughness Ra has an influence on corrosion behavior but also the shape of the peaks and valleys produced by the surface treatments. Imperfections in the base material, such as sulfidic inclusions, lead to a weaker passive layer and hence to a decrease in corrosion resistance. By using special passivating techniques, the corrosion resistance of stainless steels can be increased beyond the level achieved by common passivation.
Even though immutability is a desirable property, especially in a multi-threaded environment, implementing immutable Java classes is surprisingly hard because of a lack of language support. We present a static analysis tool using abstract bytecode interpretation that checks Java classes for compliance with a set of rules that together constitute state-based immutability. Being realized as a FindBugs plug-in, the tool can easily be integrated into most IDEs and hence into the software development process. Our evaluation on a large, real-world codebase shows that the average run-time effort for a single class is in the range of a few milliseconds, with only very few statistical outliers.
ERP systems integrate a major part of all business processes, and organizations include them in their IT service management. Besides these formal systems, there are additional systems that are rather stand-alone and not included in IT management tasks. These so-called 'shadow systems' also support business processes but hinder a high degree of enterprise integration. Shadow systems come to light during explicit detection efforts or during software maintenance projects such as enhancements or release changes of enterprise systems. Organizations then have to decide if and to what extent they integrate the identified shadow systems into their ERP systems. For this decision, organizations have to compare the capabilities of each identified shadow system with their ERP systems. Based on multiple case studies, we provide a dependency approach to enable this comparison. We derive categories for different stages of dependency and base insights into integration possibilities on these stages. Our results show that 64% of the shadow systems in our case studies are related to ERP systems, meaning that they share parts or all of their data and/or functionality with the ERP system. Our research contributes to the field of integration as well as to the discussion about shadow systems.
Measuring the natural frequency of buildings and bridges is one possibility to obtain information about the stiffness of a construction. Decreasing stiffness can be detected by repeated measurements. Damaged parts of the construction or excessive wood moisture can be reasons for decreasing stiffness. The earlier a failure is detected, the better the chance to repair it at low cost. The method of monitoring by repeatedly measuring the natural frequency is applied to timber bridges, especially footbridges. As damage due to high wood moisture cannot easily be seen, measuring the natural frequency is a good way to detect it and then repair it. Equations to calculate the natural frequencies taking damaged parts into account are shown and applied to a simply supported beam.
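For the undamaged simply supported beam, the baseline natural frequencies follow from the classical Euler–Bernoulli formula f_n = (n²π / 2L²)·√(EI/μ), against which measured frequencies can be compared. The numbers below are illustrative values for a hypothetical glulam footbridge girder, not from the paper:

```python
import math

def natural_frequency(n, L, E, I, mu):
    """n-th natural frequency (Hz) of a simply supported Euler-Bernoulli beam.
    L: span (m), E: Young's modulus (Pa), I: second moment of area (m^4),
    mu: mass per unit length (kg/m)."""
    return (n**2 * math.pi / (2 * L**2)) * math.sqrt(E * I / mu)

# Hypothetical glulam girder (assumed dimensions and material values).
L = 12.0          # span in m
E = 11e9          # Young's modulus in Pa
b, h = 0.2, 0.6   # cross-section width and height in m
I = b * h**3 / 12
mu = 450 * b * h  # density 450 kg/m^3 times cross-section area

f1 = natural_frequency(1, L, E, I, mu)
f2 = natural_frequency(2, L, E, I, mu)
print(f1, f2)
```

A drop in the measured fundamental frequency relative to this baseline indicates a loss of effective stiffness EI, e.g. from damage or moisture; note that the mode frequencies scale with n², so the second mode is exactly four times the first.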
adidas and Reebok
(2016)
ISO 26000 provides an internationally recognized frame of reference for the exercise of social responsibility by organizations of all kinds (companies, state institutions, associations). The EFQM model is a suitable frame of reference for designing an organization so that corporate goals are achieved sustainably while legal requirements are complied with and the legitimate demands of relevant stakeholders are met.
Social Compliance
(2016)
The business plan is one of the most frequently available artifacts that technology-based ventures present to innovation intermediaries in their early stages [1]–[4]. Agreement on evaluations of venturing projects based on business plans depends highly on the individual perspective of the readers [5], [6]. One reason is that little empirical evidence exists on which descriptions in business plans suggest survival of early-stage technology ventures [7]–[9]. We identified descriptions of transaction relations [10]–[13] as an anchor linking the business plan, a snapshot model, to business reality [13]. In the early stage, surviving ventures build transaction relations to human resources, financial resources, and suppliers on the input side, and to customers on the output side of the business, towards a stronger ego-centric value network [10]–[13]. We conceptualized a multidimensional measurement instrument that evaluates the maturity of this ego-centric value network based on the transaction relations of different strength levels described in business plans of early-stage technology ventures [13]. In this paper, the research design and the instrument are purified to achieve high agreement in the evaluation of business plans [14]–[16]. As a result, we present an overall research design that can reach acceptable quality for quantitative research. The paper thus contributes to the literature on business analysis in the early stage of technology-based ventures and to the research technique of content analysis.
Sprachliche Anforderungen in verschiedenen Fächern, Vermittlungskonzepte und Kursorganisation
(2016)
The term „alltägliche Wissenschaftssprache" (everyday academic language) introduced by Ehlich has significantly shaped the discussion on language teaching for academic study. The abbreviation "AWS" is by now quite widespread as well. It refers to linguistic elements that, apart from subject-specific terminology, are largely similar across different disciplines. It is assumed that these elements are used in several disciplines. This line of argument can be used to justify cross-disciplinary language teaching. In this contribution, the AWS concept is subjected to a fresh examination.
To this end, the following questions are posed:
1. Is a cross-disciplinary approach helpful for course organization? The attractiveness of the AWS concept also derives from its practical value for planning language courses, since it is often difficult to offer separate language courses for different disciplines.
2. Is the AWS concept accurate from a linguistic point of view? On the question of how much the linguistic requirements of the individual academic disciplines differ, the literature offers diverging views. Some research approaches and results are sketched in this contribution.
3. Is the AWS concept convincing as a basis for teaching? A look at teaching materials illustrates how academic language can be taught across disciplines and which problems arise.
The considerations presented here ultimately lead to the appeal to take the differing linguistic requirements of the various subjects into account wherever possible.
Lernfabrik
(2016)
The introduction of cyber-physical systems into manufacturing will profoundly change working conditions and processes as well as business models. In practice, a growing discrepancy between large enterprises and SMEs can be observed. The learning factory presented in the following is intended to bridge exactly this discrepancy: it offers companies a platform for experimentation, creates opportunities for training students and employees, and provides consulting services. For its realization, an integrated, open and standardized automation concept is presented that spans individual devices and entire production lines up to higher-level automation systems, provides a community, and serves the implementation of new business models.
Smart factory and education
(2016)
The introduction of cyber-physical systems into production companies is profoundly changing working conditions and processes as well as business models. In practice, a growing discrepancy between big and small or medium-sized companies can be observed. To bridge that gap, a university smart factory is introduced that gives these companies a platform for trials, educates employees, and offers access to consultancy. For the realization of the smart factory, a highly integrated, open and standardized automation concept is shown, comprising single devices and production lines up to a higher-level automation system, while maintaining a community and supporting new business models.
Creating cages that enclose a 3D model of some sort is part of many preprocessing pipelines in computational geometry. Creating a cage of lower resolution than the original model is of special interest when performing an operation on the original model might be too costly: the desired operation can be applied to the cage first and then transferred to the enclosed model. With this paper the authors present a short survey of recent and well-known methods for cage computation.
The authors would like to give the reader an insight into common methods and their differences.
To learn from the past, we analyse 1,088 "computer as a target" judgements for evidential reasoning by extracting four case elements: decision, intent, fact, and evidence. Analysing the decision element is essential for studying the scale of sentence severity for cross-jurisdictional comparisons. Examining the intent element can facilitate future risk assessment. Analysing the fact element can enhance an organization's capability of analysing criminal activities for future offender profiling. Examining the evidence used against a defendant in previous judgements can facilitate the preparation of evidence for upcoming legal disclosure. Following the concepts of argumentation diagrams, we develop an automatic judgement summarizing system to enhance the accessibility of judgements and avoid repeating past mistakes. Inspired by the feasibility of extracting legal knowledge for argument construction and employing grounds of inadmissibility for probability assessment, we conduct evidential reasoning of kernel traces for forensic readiness. We integrate the narrative methods from attack graphs/languages for preventing confirmation bias, the argumentative methods from argumentation diagrams for constructing legal arguments, and the probabilistic methods from Bayesian networks for comparing hypotheses.
In this paper we provide a performance analysis framework for wireless industrial networks by deriving a service curve and a bound on the delay violation probability. For this purpose we use the (min,×) stochastic network calculus as well as a recently presented recursive formula for an end-to-end delay bound of wireless heterogeneous networks. The derived results are mapped to WirelessHART networks used in process automation and were validated via simulations. In addition to WirelessHART, our results can be applied to any wireless network whose physical layer conforms to the IEEE 802.15.4 standard and whose MAC protocol incorporates TDMA and channel hopping, such as ISA100.11a or TSCH-based networks. The provided delay analysis is especially useful during the network design phase, offering further research potential towards optimal routing and power management in QoS-constrained wireless industrial networks.
Computer analysis of ECG (electrocardiogram) signals is common these days. There are many real-time QRS recognition algorithms; one of them is the Pan-Tompkins algorithm, which detects the QRS complexes of ECG signals. The proposed algorithm analyses the heartbeat data stream based on digital analysis of amplitude, bandwidth, and slope. In addition, after detecting the QRS complexes, the stress algorithm compares whether the current heartbeat is similar to or different from the last heartbeat. The algorithm performs stress detection for the patient in real time. In order to implement the new algorithm with higher performance, the parallel programming platform CUDA is used. The algorithm determines stress while simultaneously determining the RR interval, using different functions as beat detector and as a beat classifier for stress.
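The stages of a Pan-Tompkins-style detector (derivative, squaring, moving-window integration, thresholding) can be sketched on a synthetic signal. This is a simplified single-threshold illustration with a made-up spike train, not the full published algorithm with its adaptive thresholds:

```python
import numpy as np

fs = 250                               # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)

# Synthetic "ECG": one narrow spike per second plus baseline noise.
beat_times = np.arange(0.5, 10, 1.0)
ecg = 0.02 * np.random.default_rng(3).standard_normal(len(t))
for bt in beat_times:
    i = int(bt * fs)
    ecg[i - 2:i + 3] += np.array([0.2, 0.6, 1.0, 0.6, 0.2])

# Pan-Tompkins-style stages: derivative -> squaring -> moving-window integration.
deriv = np.diff(ecg, prepend=ecg[0])
squared = deriv**2
window = int(0.15 * fs)
mwi = np.convolve(squared, np.ones(window) / window, mode="same")

# Fixed threshold plus refractory period (the real algorithm adapts both).
threshold = 0.5 * mwi.max()
refractory = int(0.2 * fs)
peaks, last = [], -refractory
for i, v in enumerate(mwi):
    if v > threshold and i - last > refractory:
        peaks.append(i)
        last = i

rr = np.diff(peaks) / fs               # RR intervals in seconds
print(len(peaks), rr.mean())
```

The RR-interval sequence computed in the last line is exactly the quantity the stress classifier compares from beat to beat.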
The use of lake water offers considerable potential for cooling and heating purposes. Lake-water-driven heat pumps that have been in operation in Switzerland for quite some time continually prove their practicality. In Germany, however, this technology has hardly been used so far. Using an interdisciplinary geoscientific approach, the existing potential in Germany is currently being quantified and the obstacles to its use are being identified, in order to develop, in a further step, options for action for an increased deployment of this technology.
Realistic traffic modeling plays a key role in efficient Dynamic Spectrum Access (DSA), which is considered an enabler for the employment of wireless technologies in critical industrial automation applications (IAA). The majority of spectrum-usage models are not suitable for this specific use case, as they are based on measurement campaigns conducted in urban or controlled laboratory environments. In this work we present a time-domain traffic model for industrial communication in the 2.4 GHz industrial, scientific and medical (ISM) band based on measurements in an industrial automotive production site. As DSA is usually implemented on Software Defined Radios (SDR), our measurement campaign is based on SDR platforms rather than sophisticated spectrum analyzers. We show through estimation of the Hurst parameter that industrial wireless traffic possesses inherent self-similarity that could be exploited for efficient DSA. We also show that wireless traffic can be modeled as a semi-Markov model with channel on and off durations that are Log-normally and Pareto distributed, respectively. We finally estimate the parameters of the derived models using maximum likelihood estimation.
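The reported semi-Markov on/off model can be sketched by sampling busy and idle durations from the two named distributions and recovering the parameters by maximum likelihood. The parameter values below are illustrative assumptions, not the measured ones:

```python
import numpy as np

rng = np.random.default_rng(4)

# Semi-Markov on/off traffic: Log-normal "on" durations, Pareto "off" durations.
mu_on, sigma_on = 1.0, 0.5            # assumed log-normal parameters
alpha, xm = 2.5, 1.0                  # assumed Pareto shape and scale
on = rng.lognormal(mu_on, sigma_on, 10000)
off = xm * (1 - rng.random(10000)) ** (-1 / alpha)   # inverse-CDF sampling

# Maximum-likelihood estimates from the sampled durations.
mu_hat = np.log(on).mean()                       # log-normal MLE: mean of logs
sigma_hat = np.log(on).std()                     # log-normal MLE: std of logs
alpha_hat = len(off) / np.log(off / xm).sum()    # Pareto MLE with known scale xm

print(mu_hat, sigma_hat, alpha_hat)
```

The heavy Pareto tail of the idle durations is what produces long exploitable spectrum holes, and the same closed-form MLEs apply when the durations come from real channel measurements instead of this simulation.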
Cognitive radio (CR) is a key enabler of wireless communication in industrial applications, especially those with strict quality-of-service (QoS) requirements. The cornerstone of CR is spectrum occupancy prediction, which enables agile and proactive spectrum access and efficient utilization of spectral resources. Hidden Markov models (HMMs) provide powerful and flexible tools for statistical spectrum prediction. In this paper we introduce an HMM-based spectrum prediction algorithm for industrial applications that accurately predicts multiple slots into the future. Traditional HMM prediction approaches use two hidden states, enabling the prediction of only one step ahead; this one step is most often not enough, because internal hardware delays render it outdated. We show in this work that extending the number of hidden states and formulating the prediction problem as a maximum likelihood (ML) classification approach enables a prediction span of multiple slots into the future, even with fine spectrum sensing resolution. We verify the suitability of our approach for industrial wireless through extensive simulations that utilize a realistic measurement-based traffic model specifically tailored to industrial automotive settings.
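Multi-slot occupancy prediction can be illustrated with a plain two-state Markov chain, where the k-step forecast is the transition matrix raised to the k-th power and the ML classification picks the most probable state. This is a toy stand-in for the paper's HMM, with assumed transition probabilities:

```python
import numpy as np

# States: 0 = channel idle, 1 = channel busy (assumed transition probabilities).
P = np.array([[0.9, 0.1],
              [0.3, 0.7]])

def predict(state, k):
    """Distribution over states k slots ahead, given the current state,
    and the maximum-likelihood classification of that slot."""
    dist = np.linalg.matrix_power(P, k)[state]
    return dist, int(np.argmax(dist))

dist, ml_state = predict(1, 3)   # channel currently busy; forecast 3 slots ahead
print(dist, ml_state)
```

With these numbers a currently busy channel is already more likely idle than busy three slots ahead, illustrating why a forecast horizon longer than one slot is needed to compensate for hardware delays.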
Increasing robustness of handwriting recognition using character N-Gram decoding on large lexica
(2016)
Offline handwriting recognition systems often include a decoding step, that is, retrieving the most likely character sequence from the underlying machine learning algorithm. Decoding is sensitive to ranges of weakly predicted characters, caused e.g. by obstructions in the scanned document. We present a new algorithm for robust decoding of handwriting recognizer outputs using character n-grams. Multidimensional hierarchical subsampling artificial neural networks with Long Short-Term Memory cells have been successfully applied to offline handwriting recognition. Output activations from such networks, trained with Connectionist Temporal Classification, can be decoded with several different algorithms in order to retrieve the most likely literal string they represent. We present a new algorithm for decoding the network output while restricting the possible strings to a large lexicon. The index used for this work is an n-gram index, with trigrams used for the experimental comparisons. N-grams are extracted from the network output using a backtracking algorithm, and each n-gram is assigned a mean probability. The decoding result is obtained by intersecting the n-gram hit lists while calculating the total probability for each matched lexicon entry. We conclude with an experimental comparison of different decoding algorithms on a large lexicon.
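The trigram index and hit-list scoring can be sketched as follows; the lexicon and the per-trigram probabilities are made up for illustration, whereas the real system extracts them from the network output with a backtracking algorithm:

```python
from collections import defaultdict

# Build a trigram index over a (toy) lexicon.
lexicon = ["street", "stream", "steam", "tree"]

def trigrams(word):
    return {word[i:i + 3] for i in range(len(word) - 2)}

index = defaultdict(set)
for word in lexicon:
    for g in trigrams(word):
        index[g].add(word)

def decode(ngram_probs):
    """Score lexicon entries by the probability mass of their matched trigrams.
    ngram_probs: {trigram: mean probability} taken from the recognizer output."""
    scores = defaultdict(list)
    for g, p in ngram_probs.items():
        for word in index.get(g, ()):
            scores[word].append(p)
    # Rank entries by total matched probability.
    return max(scores, key=lambda w: sum(scores[w]))

# Weakly predicted characters yield trigrams with lower probability.
best = decode({"str": 0.9, "tre": 0.8, "ree": 0.7, "eam": 0.4})
print(best)
```

Because every entry accumulates evidence from all of its matched trigrams, a single weakly predicted region (here the low-probability "eam") cannot by itself override the stronger matches, which is the source of the robustness claimed above.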
Recent years have seen the proposal of several different gradient-based optimization methods for training artificial neural networks. Traditional methods include steepest descent with momentum; newer methods are based on per-parameter learning rates, and some approximate Newton-step updates. This work contains the results of several experiments comparing different optimization methods. The experiments targeted offline handwriting recognition using hierarchical subsampling networks with recurrent LSTM layers. We present an overview of the optimization methods used, the results that were achieved, and a discussion of why the methods lead to different results.
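The two families of update rules being compared can be sketched side by side on a simple quadratic loss; this is a minimal numpy illustration with standard textbook hyperparameters, not the paper's LSTM experiments:

```python
import numpy as np

def grad(w):
    """Gradient of the toy loss 0.5 * (w - 3)^2, minimized at w = 3."""
    return w - 3.0

# Steepest descent with momentum.
w_m, v = 0.0, 0.0
for _ in range(200):
    v = 0.9 * v - 0.1 * grad(w_m)
    w_m += v

# Adam: per-parameter rates from bias-corrected first/second moment estimates.
w_a, m, s = 0.0, 0.0, 0.0
beta1, beta2, lr, eps = 0.9, 0.999, 0.1, 1e-8
for t in range(1, 201):
    g = grad(w_a)
    m = beta1 * m + (1 - beta1) * g          # first moment (mean of gradients)
    s = beta2 * s + (1 - beta2) * g * g      # second moment (mean of squares)
    m_hat = m / (1 - beta1**t)               # bias correction
    s_hat = s / (1 - beta2**t)
    w_a -= lr * m_hat / (np.sqrt(s_hat) + eps)

print(w_m, w_a)  # both end near the minimum at w = 3
```

The second-moment normalization in Adam makes the effective step size roughly independent of the gradient magnitude per parameter, which is precisely the property that distinguishes it from plain momentum.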
The magneto-mechanical behavior of magnetic shape memory (MSM) materials has been investigated by means of different simulation and modeling approaches by several research groups. The target of this paper is to simulate actuators driven by MSM alloys and to understand the MSM element behavior during actuation, which shall lead to an increased performance of the actuator. It is shown that internal and external stresses should be taken into consideration using numerical computation tools for magnetic fields in an efficient way.
So is About Urbanity
(2016)
Südstadt Tübingen is a successful military land redevelopment project in which functional mix and urban diversity have been identified as the main goals. Throughout the development process, joint building ventures have been utilized as an important urban development instrument. On the one hand, this instrument helps citizens develop adaptive and affordable living space and lets them find their own identity in the new quarter; on the other hand, it is also important for cultivating urban diversity, a suitable spatial feeling, and a mixed relationship between living and working conditions, so that the quarters have a very flexible structure to accommodate diversified urban lifestyles.
Stress is recognized as a predominant disease with rising costs for rehabilitation and treatment. Currently there are several different approaches that can be used for determining and calculating stress levels. Usually the methods for determining stress are divided into two categories. The first category does not require any special equipment for measuring stress; it uses the variations in behaviour patterns that occur under stress. The core disadvantage of this category is its limitation to specific use cases. The second category uses laboratory instruments and biological sensors. This category allows measuring stress precisely and proficiently, but at the same time these setups are neither mobile nor transportable and do not support real-time feedback. This work presents a mobile system that provides the calculation of stress. To achieve this, the data of a mobile ECG sensor is analysed, processed and visualised on a mobile system such as a smartphone. This work also explains the stress measurement algorithm used. The result of this work is a portable system that can be used with a mobile device such as a smartphone as a visual interface for reporting the current stress level.
Stress is recognized as a predominant disease with growing costs of treatment. The approach presented here aims to detect stress using a lightweight, mobile, cheap and easy-to-use system. The results show that stress can be detected even when a person's natural bio-vital data is outside the normal range. The system enables storage of measured data while maintaining communication channels for online processing and post-processing.
„Das Gedächtnis des Sees“
(2016)
How can archaeological research results be conveyed to a broad public as vividly and comprehensibly as possible? Using the example of the UNESCO World Heritage site of the Hornstaad pile-dwelling settlement, students of the Hochschule für Technik, Wirtschaft und Gestaltung Konstanz (HTWG) demonstrate the visualization of complex scientific content with the help of modern virtual reality (VR) headset technology. A three-dimensional digital reconstruction of the pile-dwelling settlement and a simulation of the course of the day and of the seasons, combined with real movement in the exhibition space, allow viewers to experience everyday life in a Neolithic settlement at first hand.
The exhibition project „Das Gedächtnis des Sees" will be on display as part of the Great State Exhibition „4000 Jahre Pfahlbauten".
Navigation on the Danube
(2016)
This report contains two parts: the first part presents an overview of studies concerning the Danube, inland navigation, or the impact of climate change on either of those. The second part gives a more detailed analysis of inland navigation on the Danube, partly based on the studies presented in part one. Part two covers the current situation along the Danube, addressing bottlenecks and other limitations for shipping. Based on this information, an estimation of the economic impact of low-water periods on inland navigation is made. As a last step, measures to reduce the impact of low water on inland navigation are presented. The report shows that inland navigation is still an important transport mode, along the Danube as well as in other European regions. Especially in Romania, inland navigation still has a large and rising share of total transport of more than 20%. However, inland navigation depends strongly on good conditions of its infrastructure. These good conditions are limited mainly by two factors. One is the so-called bottlenecks: areas with sub-optimal shipping conditions, e.g. due to solid rock formations in the river that lead to reduced water depths. The other factor is the weather (and, on a longer time scale, the climate), which, depending mostly on precipitation and evaporation, can seasonally lead to low water levels. In addition to these two natural factors, there are laws which, for example, regulate the maximum number of barges allowed. Man-made structures like locks limit the size of vessels as well as the speed at which they can travel. These limiting factors are identified and located in the first chapter of part two of this report, before the water depths needed by several ship sizes as well as the cargo fleet available along the Danube are presented. One of the targets of this report is to estimate the economic impact of low-water periods.
All the factors named above, as well as the freight prices charged for connections along the Danube, are used to reach this target in chapter II.4. To estimate the impact of low-water periods on freight prices, a method developed by Jonkeren et al. (2007) for the Rhine is transferred to the Danube. Using this method, regression equations for several transport connections along the Danube are identified that give a first estimate of the relation between freight prices and water levels. With the help of these regression equations, the total expenses for transport via inland navigation can be estimated for several years. The yearly and seasonal variability is identified, as well as the additional expenses incurred when water levels fall below 280 cm. Additional expenses, however, are not the only impact of changing water levels on inland navigation: while the demand for transport stays at the same level, the water levels are sometimes insufficient to use the full capacity of the fleet. Therefore, the (theoretical) amount of cargo that could not be transported due to low water levels is calculated as well and presented in chapter II.5. Finally, measures to overcome some of the problems caused by low water levels are presented. These fall into two general approaches: change the ship or change the river. Both have advantages and disadvantages for technical as well as regulatory and other reasons. The list presented here is incomplete, however, and only gives a few ideas of how some problems can be overcome. In the end, an individual mix must be found for the different regions along the river and sometimes for individual companies.
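The water-level/freight-price relation described above can be sketched as a simple one-variable regression, one equation per transport connection. The sketch below uses purely hypothetical numbers, not the report's data:

```python
# Minimal sketch of a freight-price regression on water levels (one
# connection). Data values below are hypothetical, for illustration only.

def fit_ols(x, y):
    """Ordinary least squares fit of y = a + b*x for two equal-length lists."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

# Hypothetical observations: water level (cm) vs. freight price (EUR/t).
levels = [180, 220, 260, 280, 320, 360]
prices = [14.2, 12.9, 11.8, 11.1, 10.4, 10.0]

a, b = fit_ols(levels, prices)
# A negative slope b reflects rising prices as water levels fall:
# below the reference level, less cargo fits per vessel.
predicted_at_250 = a + b * 250
```

With such a fitted equation per connection, total transport expenses for a year can be estimated by evaluating it over the observed water-level series.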
Present demographic change and a growing elderly population lead to new medical needs. Meeting these with state-of-the-art technology is consequently a rapidly growing market. This work therefore aims at taking modern concepts of mobile and sensor technology and putting them into a medical context. By measuring a user's vital signs with sensors whose data are processed on an Android smartphone, the target system is able to determine the current health state of the user and to visualize the gathered information. The system also includes a weather-forecasting functionality that alerts the user to potentially dangerous meteorological events. All information is collected centrally and distributed to users based on their location. Further, the system can correlate the client-side measurement of vital signs with a server-side weather history. This enables personalized forecasting for each user individually. Finally, a portable and affordable application was developed that continuously monitors the health status via multiple vital-sign sensors, all united on a common smartphone.
The corporate entrepreneur
(2016)
Corporate entrepreneurship is one tool for established companies to strengthen their capabilities for strategic renewal and innovativeness. The question of which factors influence the success of a corporate entrepreneurship initiative, however, requires further attention. The corporate entrepreneur, who acts both as the leader of embedded entrepreneurial teams and as the linking pin to the corporation, provides one possible perspective. Based on six interviews conducted in six German organizations, this study contributes to the understanding of the role of the corporate entrepreneur and how this role can be distinguished from other roles in the context of innovation.
Know-how für die Weltmeere
(2016)
This paper proposes a soft input decoding algorithm and a decoder architecture for generalized concatenated (GC) codes. The GC codes are constructed from inner nested binary Bose-Chaudhuri-Hocquenghem (BCH) codes and outer Reed-Solomon codes. In order to enable soft input decoding for the inner BCH block codes, a sequential stack decoding algorithm is used. Ordinary stack decoding of binary block codes requires the complete trellis of the code. In this paper, a representation of the block codes based on the trellises of supercodes is proposed in order to reduce the memory requirements for the representation of the BCH codes. This enables an efficient hardware implementation. The results for the decoding performance of the overall GC code are presented. Furthermore, a hardware architecture of the GC decoder is proposed. The proposed decoder is well suited for applications that require very low residual error rates.
Many procedures for estimating the spool position in linear electromagnetic actuators using only voltage and current measurements can be found in the literature. With respect to the accuracy of the estimated spool position, some achieve better, some worse results. However, in almost every approach, hysteresis has a large impact on the estimation accuracy that can be achieved. Regardless of whether these effects are caused by magnetic or mechanical hysteresis, they will limit the accuracy of the position estimate if not taken into account. In this paper, a model is introduced which covers the hysteresis effects as well as other nonlinearities occurring in estimated position-dependent parameters. A classical Preisach model is deployed first, which is then adjusted by using novel elementary preceding relay operators. The resulting model for the estimated position-dependent parameters, including the adjusted Preisach model, can easily be applied to position estimation tasks. It is shown that the considered model distinctly improves the accuracy of the spool position estimate, while being kept as simple as possible for real-time implementation reasons.
When mobile devices at the network edge want to communicate with each other, they too often depend on the availability of faraway resources. For direct communication, feasible user-friendly service discovery is essential. DNS Service Discovery over Multicast DNS (DNS-SD/mDNS) is widely used for configurationless service discovery in local networks, due in no small part to the fact that it is based on the well-established DNS and is efficient in small networks. In our research, we enhance DNS-SD/mDNS to provide versatility, user control, efficiency, and privacy, while maintaining its deployment simplicity and backward compatibility. These enhancements are necessary to make it a solid, flexible foundation for device communication at the edge of the Internet. In this paper, we focus on providing multi-link capabilities and scalable scopes for DNS-SD while being mindful of both user-friendliness and efficiency. We propose DNS-SD over Stateless DNS (DNS-SD/sDNS), a solution that allows configurationless service discovery in arbitrary self-named scopes, largely independent of the physical network layout, by leveraging our Stateless DNS technique and the Raft consensus algorithm.
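For context on the DNS-SD/mDNS baseline being enhanced: a client enumerates service types by multicasting the standard DNS-SD meta-query to 224.0.0.251:5353. The sketch below builds that query packet by hand (standard DNS wire format; it is not the sDNS mechanism of the paper):

```python
# Construct the DNS-SD service-type enumeration query
# ("_services._dns-sd._udp.local", type PTR) in DNS wire format.
import struct

def dns_sd_query(name="_services._dns-sd._udp.local"):
    # Header: ID=0, flags=0, QDCOUNT=1, AN/NS/ARCOUNT=0 (mDNS queries use ID 0).
    header = struct.pack("!6H", 0, 0, 1, 0, 0, 0)
    # QNAME: length-prefixed labels, terminated by the root label (0x00).
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in name.split(".")
    ) + b"\x00"
    question = qname + struct.pack("!2H", 12, 1)  # QTYPE=PTR(12), QCLASS=IN(1)
    return header + question

packet = dns_sd_query()
# Sending it requires a network, e.g.:
#   import socket
#   s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   s.sendto(packet, ("224.0.0.251", 5353))
```

Responders answer with PTR records naming the service types they offer, which the client can then browse and resolve further.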
Loch im Putz = alles neu?
(2016)
Corrosion
(2016)
Tango Argentino
(2016)
Geography lessons are particularly well suited to raising awareness of the relationships between humans and their natural and social environments. This volume aims to contribute to that, for music has a holistic effect and lets us perceive geographical subject matter not only visually but makes it acoustically "tangible", thus bringing a new dimension of experience into the classroom. Since music is also closely tied to culture, the twelve teaching examples additionally open up access to cultural conceptions through diverse spatial examples.
Purpose: The purpose of this paper is to identify tourism movement patterns by tracking tourists with positioning systems such as GPS in the rural Lake Constance destination in Germany. In doing so, the past, present and future of tourist tracking are illustrated.
Design/methodology/approach: The tracking is realized via common smartphones extended by an app, with dedicated sensors such as position loggers, and a survey. The three different approaches are applied in order to compare and cross-check results (triangulation of data and methods).
Findings: Movement patterns turned out to be diverse and individualistic within the rural destination of Lake Constance, while following an ant trail in sub-destinations such as the city of Constance. Repeat visitors and first-time visitors alike always visit the bigger cities and the main day-trip destinations of the lake. A possible prediction tool opens new avenues for governing tourism movement patterns.
Research limitations/implications: The tracking techniques can be developed further in the direction of the "quantified self", using gamification to make the tracking app even more attractive.
Practical implications: An algorithm-based prediction tool would offer new perspectives for the management of tourism movements.
Social implications: Further research is needed to overcome the perceived invasiveness of the app so that tracking with this approach is accepted.
Originality/value: This study is original and innovative because of the first-time use of a smartphone app in tourist tracking, its application to a rural destination, and the conceptual description of a prediction tool.
This paper builds upon the widely used resource-based approach to explaining the survival of new technology-based firms (NTBFs). However, instead of looking at the NTBF's initial resource configuration, a process-oriented perspective is taken by focusing on the entrepreneur's ability to transform resources in response to triggers resulting from market interactions. Transaction relations reflect these interactions and are thus operationalized with a suggested method for measuring the status of venture emergence (VE) applicable to early-stage NTBFs. An NTBF's value network maturity is reflected in the number and strength of its transaction relations in the four market dimensions customer, investor, partner, and human resources. Business plans of NTBFs represent the artifact that contains this data in the form of transaction relation descriptions. Using content analysis, a multi-step combined human and computer coding process has been developed to annotate and classify transaction relations from business plans in order to empirically determine an NTBF's status of VE. Results of the business plan analysis suggest that the level of transaction relations allows conclusions to be drawn about the VE status. Moreover, applying the developed process, a first analysis of a business plan coding test shows that the transaction-relation-based VE status significantly relates to NTBF survival capability.
Software startups
(2016)
Software startup companies develop innovative, software-intensive products within limited time frames and with few resources, searching for sustainable and scalable business models. Software startups are quite distinct from traditional mature software companies, but also from micro-, small-, and medium-sized enterprises, introducing new challenges relevant for software engineering research. This paper's research agenda focuses on software engineering in startups, identifying, in particular, 70+ research questions in the areas of supporting startup engineering activities, startup evolution models and patterns, ecosystems and innovation hubs, human aspects in software startups, applying startup concepts in non-startup environments, and methodologies and theories for startup research. We connect and motivate this research agenda with past studies in software startup research while pointing out possible future directions. While all authors of this research agenda have their main background in software engineering or computer science, their interest in software startups broadens the perspective to the challenges, but also to the opportunities, that emerge from multi-disciplinary research. Our audience is therefore primarily software engineering researchers, even though we aim at stimulating collaborations and research that crosses disciplinary boundaries. We believe that with this research agenda we cover a wide spectrum of the software startup industry's current needs.
Nowadays there is a rich diversity of sleep monitoring systems available on the market. They promise to offer information about the sleep quality of the user by recording a limited number of vital signals, mainly heart rate and body movement. Typically, fitness trackers, smart watches, smart shirts, smartphone applications or patches do not provide access to the raw sensor data. Moreover, the sleep classification algorithm and its agreement ratio with the gold standard, polysomnography (PSG), are not disclosed. Some commercial systems record and store the data on the wearable device, but the user needs to transfer and import it into specialised software applications or return it to the doctor for clinical evaluation of the data set. Thus, an immediate feedback mechanism or the possibility of remote control and supervision is lacking. Furthermore, many such systems only distinguish between sleep and wake states, or between wake, light sleep and deep sleep. It is not always clear how these stages are mapped to the four known sleep stages: REM, NREM1, NREM2 and NREM3-4 [1]. The goal of this research is to find a reduced-complexity method to process a minimum number of vital signals while providing accurate sleep classification results. The model we propose offers remote control and real-time supervision capabilities by using Internet of Things (IoT) technology. This paper focuses on the data processing method and the sleep classification logic. The body sensor network representing our data acquisition system will be described in a separate publication. Our solution showed promising results and a good potential to overcome the limitations of existing products. Further improvements will be made, and subjects of different ages and health conditions will be tested.
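To illustrate the kind of coarse staging such consumer systems perform on heart rate and movement, here is a deliberately simple rule-based epoch classifier. The thresholds and rules are hypothetical and are not the classification logic proposed in the paper:

```python
# Illustrative rule-based sleep/wake classifier over 30-second epochs of
# heart rate and movement counts. Thresholds are hypothetical examples.

def classify_epoch(heart_rate_bpm, movement_count,
                   hr_sleep_threshold=60, move_threshold=5):
    """Label one epoch as 'wake', 'light', or 'deep'."""
    if movement_count > move_threshold:
        return "wake"       # strong body movement: assumed awake
    if heart_rate_bpm < hr_sleep_threshold:
        return "deep"       # low, stable heart rate: assumed deep sleep
    return "light"

# Hypothetical night fragment: (heart rate in bpm, movement count) per epoch.
epochs = [(72, 12), (64, 3), (55, 0), (52, 1), (70, 9)]
hypnogram = [classify_epoch(hr, mv) for hr, mv in epochs]
```

Such two-signal rules cannot separate REM from NREM stages, which is exactly the limitation of consumer devices the abstract points out.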
An approach for an adaptive position-dependent friction estimation for linear electromagnetic actuators with altered characteristics is proposed in this paper. The objective is to obtain a friction model that can be used to describe different stages of aging of magnetic actuators. It is compared to a classical Stribeck friction model by means of model fit, sensitivity, and parameter correlation. The identifiability of the parameters in the friction model is of special interest since the model is supposed to be used for diagnostic and prognostic purposes. A method based on the Fisher information matrix is employed to analyze the quality of the model structure and the parameter estimates.
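The classical Stribeck model used above as the comparison baseline combines Coulomb, Stribeck (breakaway) and viscous friction terms. A minimal sketch with illustrative parameter values, not the identified ones:

```python
# Classical Stribeck friction model: Coulomb + exponential Stribeck
# + viscous term. Parameter values below are illustrative only.
import math

def stribeck(v, F_c=0.3, F_s=0.5, v_s=0.01, sigma_v=2.0):
    """Friction force as a function of sliding velocity v.

    F_c: Coulomb friction, F_s: static (breakaway) friction,
    v_s: Stribeck velocity, sigma_v: viscous coefficient.
    """
    if v == 0.0:
        return 0.0  # the sticking regime is handled separately in practice
    sign = 1.0 if v > 0 else -1.0
    return (F_c + (F_s - F_c) * math.exp(-(v / v_s) ** 2)) * sign + sigma_v * v

# Near zero velocity friction approaches the breakaway force F_s; at higher
# velocity it drops toward F_c before the viscous term dominates.
low = stribeck(1e-4)
mid = stribeck(0.05)
```

For diagnostics, the identifiability question raised in the abstract amounts to asking whether these parameters can be distinguished from velocity/force data, which is what the Fisher information matrix analysis quantifies.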
Finite-Element-Methode
(2016)
Baudynamik
(2016)
In gait laboratories, medical gait and running analyses are carried out. A force plate embedded in a floor slab enables precise measurement of the forces a person transmits to the ground. The accelerations of the slab in which the force plate is embedded must be limited so that the measurement accuracy of the force plate is not impaired.
For the scientific gait laboratory in the new building of the Medical Training and Rehabilitation Centre at the University Hospital Tübingen, it was investigated to what extent the measurement accuracy of a force plate is influenced by human-induced floor vibrations. As a result, the measurement error can be stated as a function of the mass of the test subject in the gait laboratory for different scenarios.
In this paper, the use of an Unscented Kalman Filter for concurrently performing disturbance estimation and wave filtering is investigated. Experimental results are provided that demonstrate very good performance in both tasks. For the filter, a dynamic model was used which had been optimised via correlation analysis in order to obtain a minimum set of relevant parameters. This model has also been validated by experiments with a small vessel. A simulation study is presented to evaluate the performance using known quantities. Experimental trials have been performed on the Rhine river. The results show that, for instance, the flow direction and varying current velocities can be estimated continuously with decent precision, even while the boat is performing turning manoeuvres. Moreover, the filtering properties are very satisfactory. This makes the filter suitable for use, for instance, in autonomous vessel applications or assistance systems.
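The sigma-point mechanics of an Unscented Kalman Filter can be shown in a scalar toy problem. The sketch below estimates a constant disturbance (think: current speed) from noisy measurements; it is a pure illustration of the UKF machinery, not the vessel model or parameterisation of the paper, and it skips re-drawing sigma points after the predict step:

```python
# Minimal scalar Unscented Kalman Filter step (state dimension n = 1).
import math

def ukf_step(x, P, z, f, h, Q, R, alpha=1.0, kappa=0.0):
    n = 1
    lam = alpha**2 * (n + kappa) - n
    c = math.sqrt(n + lam)
    # Sigma points around the current estimate.
    sigmas = [x, x + c * math.sqrt(P), x - c * math.sqrt(P)]
    wm = [lam / (n + lam)] + [1.0 / (2 * (n + lam))] * 2
    # Predict: propagate sigma points through the process model f.
    xs = [f(s) for s in sigmas]
    x_pred = sum(w * s for w, s in zip(wm, xs))
    P_pred = sum(w * (s - x_pred) ** 2 for w, s in zip(wm, xs)) + Q
    # Update: propagate through the measurement model h.
    zs = [h(s) for s in xs]
    z_pred = sum(w * s for w, s in zip(wm, zs))
    S = sum(w * (s - z_pred) ** 2 for w, s in zip(wm, zs)) + R
    C = sum(w * (sx - x_pred) * (sz - z_pred) for w, sx, sz in zip(wm, xs, zs))
    K = C / S
    return x_pred + K * (z - z_pred), P_pred - K * S * K

x, P = 0.0, 1.0  # initial guess and variance for the unknown current speed
for z in [0.9, 0.7, 0.85, 0.78, 0.82]:  # noisy measurements around 0.8
    x, P = ukf_step(x, P, z, f=lambda s: s, h=lambda s: s, Q=1e-4, R=0.05)
# x converges toward the underlying value of about 0.8, and P shrinks.
```

The value of the UKF over a linear Kalman filter only appears when f and h are nonlinear, as in the vessel dynamics of the paper; the scalar linear case above merely makes the weights and sigma-point sums easy to follow.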
The IT unit is not the only provider of information technology (IT) used in business processes. Aiming for increased performance, many business workgroups autonomously implement IT resources not covered by their organizational IT service management; this is called shadow IT. Risks and inefficiencies associated with this phenomenon challenge organizations. Organizations need to decide how to deal with identified shadow IT and whether the business or the IT unit should be responsible for the corresponding tasks and components. This study proposes design principles for a method to control identified shadow IT, following action design research in four organizational settings. The procedure results in an allocation of IT task responsibilities between the business workgroups and the IT unit based on risk considerations and transaction cost economics, leading to an IT service governance. This contributes to governance research regarding adaptive and efficient arrangements with reduced risks for business-located IT activities.
This study accounts for the greenhouse gas emissions (CO2e) of the Faculty of Civil Engineering at Hochschule Konstanz in the academic year 2014/2015. The Greenhouse Gas Protocol: A Corporate Accounting and Reporting Standard is used as the underlying methodology. Emissions from the generation of heat and electricity and from the vehicle fleet are accounted for. With the help of a survey, the emissions caused by commuting, paper consumption and cafeteria meals of faculty members are calculated. The largest emitters are identified, and possible saving measures are derived from the inventory and from comparison with other universities.
In the academic year 2014/2015, a total of 264.8 t of CO2e emissions arose at the Faculty of Civil Engineering of Hochschule Konstanz. 120.6 t CO2e was caused by commuting by students, staff and professors. The generation of 238.1 MWh of heat caused 51.4 t CO2e. 11.8 t CO2e was emitted for the provision of 80.5 MWh of electricity. At 76.1 t CO2e, the second-largest share of annual CO2e emissions arose from the preparation of cafeteria meals for faculty members. The faculty's vehicle fleet, consisting of one VW bus, emitted 3.5 t CO2e. The smallest share of emissions, 1.4 t CO2e, was generated by paper consumption.
It is recommended to implement saving measures in the individual areas in order to reduce annual greenhouse gas emissions. The motivation for this can be increased by an environmental strategy at the university, potential cost savings, and the promotion of a positive public image of the university.
With the preparation of the CO2 inventory, the climate impact of the Faculty of Civil Engineering has been quantified. The present results can serve as a basis for investigating further reduction potentials.
Stadt entwerfen
(2018)
Second, completely revised, updated and expanded edition in a new layout, a larger format and with new projects.
Urban design is based on principles of order and composition that must fulfil functional demands while joining the design elements into a distinctive whole. Even though designs are almost always shaped by the zeitgeist, the basic compositional principles are largely timeless. Stadt entwerfen explains the most important principles of design and representation in urban planning using selected historical examples and international contemporary competition entries. At the centre of the publication is the question of how the projects were designed and which methods and instruments are available to the designer: alongside the classical design conceived in the designer's head, the repertoire is currently being expanded by new, computer-aided methods such as parametric design, in which adjustable parameters influence the design automatically and offer a variety of possible solutions. Three best-practice examples, Hafencity Hamburg, Belval-Ouest in Luxembourg and the Südstadt in Tübingen, show in the final chapter how award-winning urban design concepts are successfully realized.
With contributions by Oliver Fritz, Rolo Fütterer and Markus Neppl.
Publication history:
Issue No. 1, Autumn 2012
Issue No. 2, Spring 2013
Issue No. 3, Autumn 2013
Issue No. 4, Spring 2014
Issue No. 5, Autumn 2014
Issue No. 6, Spring 2015
Issue No. 7, Autumn 2015
Double issue No. 8 and 9, Autumn 2016
Issue No. 10, Spring 2017
Issue No. 11, Autumn 2017
Double issue No. 12 and 13, Spring 2018
Double issue No. 14 and 15, Autumn 2019
Double issue No. 16 and 17, Autumn 2020
Issue No. 18, Spring 2021
Double issue No. 19 and 20, Spring 2022
Issue No. 21, Autumn 2022
Issue No. 22, Spring 2023
Issue No. 23, Autumn 2023
Architectura
(2015)
This paper compares the surface morphology of differently finished austenitic stainless steel AISI 316L, also in combination with low-temperature carburization. Milled and tumbled surfaces were analyzed with respect to corrosion resistance and surface morphology. The results of potentiodynamic measurements show that professional grinding operations with SiC and Al2O3 always lead to a better corrosion resistance of low-temperature carburized surfaces compared to the untreated reference in the acidified chloride solution used. The amount of plastic deformation during machining has a big influence on the corrosion resistance of vibratory ground or tumbled surfaces and has to be kept low for austenitic stainless steels. Due to the high ductility, plastic deformation can lead to the formation of metastable pits that can be initiation points of corrosion. The formation of metastable pits can be aggravated by low-temperature diffusion processes.
Effect of cold working on the localized corrosion behavior of CrNi and CrNiMnN metastable austenites
(2015)
Differences in the pitting resistance between cold worked CrNi and CrNiMnN metastable austenites
(2015)
Technical presentation at the conference CORROSION 2015, 15-19 March, Dallas, Texas, USA. NACE International.
Hot isostatic pressing (HIP) allows the production of components with complex geometries. Generally, a high quality of the components is achieved due to the well-managed composition of the metal powder and the isotropic properties. If a duplex stainless steel is produced, a heat treatment after the HIP process is necessary to remove precipitates such as carbides, nitrides and intermetallic phases. In a new process, the sintering step is to be combined with the heat treatment. In this case, a high cooling rate is necessary to avoid precipitates in duplex stainless steels. In this work, the influence of the HIP temperature and the wall thickness on corrosion resistance, microstructure and impact strength was investigated. The results should help to optimize process parameters such as temperature and cooling rate. For the investigation, two HIP temperatures were tested in a classical HIP process step with a defined cooling rate. No additional heat treatment was conducted. The specimens were cut from different sectors of the HIP block. To investigate the corrosion resistance, the critical pitting temperature was determined with an electrochemical method according to EN ISO 17864. An impact test was used to determine the impact transition temperature. Metallographic investigations show the microstructure in the different sectors of the HIP block.
Also published as a doctoral dissertation at Carl von Ossietzky Universität Oldenburg, 2015, under the title: Werteorientiertes Management von Chancen und Risiken in der kommunalen Energieversorgung. Eine Untersuchung der Herausforderungen und Handlungsmöglichkeiten von Stadtwerken in der Energiewende aus ökonomischer, moralischer und kultureller Sicht.
Using the approach of values-oriented corporate management, the author investigates the challenges and options for action of (municipal) energy providers in the course of the German energy transition since 2011. She shows that municipal energy providers, through their interlinking of the private sector and the public sector, occupy a special role which, with regard to the new challenges of the energy transition, leads to particular opportunities. With the help of values-oriented management, decentrality, a head start in trust, and other assets can be recognized and leveraged.
3-Stufen-Pulswechselrichter
(2016)
Moderne als Geschichte
(2015)
Arbeit an der Zukunft
(2016)
Digitalization influences almost all areas of life. Which of the "futures" it makes possible will come to pass is hard to foresee today. What is clear, however, is that the HTWG must prepare its students for it, and that it wants to help shape the future itself. One path towards this is the Modellfabrik 4.0, which offers cross-faculty approaches to teaching and research.
Film documentation of an interdisciplinary student university project of the Department of Architecture at HTWG Konstanz (Bachelor's and Master's students) and BTU Cottbus-Senftenberg; exhibition of contemporary art, Gallery Weekend Berlin, 29 April to 1 May 2016; duration: 4:35 min.
Volum III is a site-specific, audio-visual staging of an extraordinary and historically imposing architectural monument hidden underground in the middle of the city.