Von wegen Bauschutt
(2020)
Recycled-aggregate concretes (RC concretes) are not a new development, but for roughly 15 years they have been experiencing a renaissance in Germany, with material compositions that meet today's requirements for normal concretes. Throughout (building) history there have repeatedly been periods in which buildings were erected from crushed-brick concrete, such as the Max Kade student residence in Stuttgart and the Technical Town Hall in Tübingen. Both date from the post-war period and are in a good state of preservation. They are examples of the proven performance of "historical" crushed-brick concretes in building practice and of their long technical service life.
There are no fundamental objections to the use, or continued use, in building construction of either modern recycled concretes that comply with current standards or post-war crushed-brick concretes. The authors call for greater acceptance of and confidence in recycled building materials, and hope that "vintage" in construction will eventually attract the same kind of interest as vintage furniture or used-look clothing, not only with regard to the reuse of doors and staircases, but also for mineral bulk building materials such as concrete. Using successfully realized example buildings, the article illustrates how high-rise buildings (e.g. the Max Kade student residence in Stuttgart, 1953, built with rubble concrete), sacred buildings (the Fatima Church in Kassel, built 60 years ago in exposed concrete with brick aggregate) and administrative buildings (the Technical Town Hall in Tübingen, from the 1950s) were erected successfully and sustainably with recycled materials.
Owing to the economic size of the United States, its economic policy measures, in particular its trade policies, have a far‐reaching impact on global economic developments. This chapter quantifies the economic consequences of US protectionist trade aspirations. It focuses on trade policy scenarios that have been communicated by the current US administration as potential new trade policies. The chapter draws on the results of a study by the ifo Institute conducted on behalf of the Bertelsmann Foundation. In the first simulation, a withdrawal from the North American Free Trade Agreement is considered. The chapter then illustrates the potential consequences of a "border tax adjustment" policy. It also simulates further measures to protect the US market by assuming an increase in American duties. The chapter presents robust quantitative results that can be expected if an increasingly protectionist US trade policy were to be implemented.
This article introduces the Global Sanctions Data Base (GSDB), a new dataset of economic sanctions that covers all bilateral, multilateral, and plurilateral sanctions in the world during the 1950–2016 period across three dimensions: type, political objective, and extent of success. The GSDB features by far the most cases among databases that focus on effective sanctions (i.e., excluding threats) and is particularly useful for the analysis of bilateral international transaction data (such as trade flows). We highlight five important stylized facts: (i) sanctions are increasingly used over time; (ii) European countries are the most frequent users and African countries the most frequent targets; (iii) sanctions are becoming more diverse, with the share of trade sanctions falling and that of financial or travel sanctions rising; (iv) the main objectives of sanctions are increasingly related to democracy or human rights; (v) the success rate of sanctions rose until 1995 and has fallen since then. Using state-of-the-art gravity modeling, we highlight the usefulness of the GSDB in the realm of international trade. Trade sanctions have a negative but heterogeneous effect on trade, which is most pronounced for complete bilateral sanctions, followed by complete export sanctions.
This paper examines the corporate organisational aspects of the implementation of Industry 4.0. Industry 4.0 builds on new technologies and appears as a disruptive innovation to manufacturing firms. Although we have a good understanding of the technical components, the management and organisational aspects of implementing Industry 4.0 are under-researched, and qualitative empirical evidence that provides comprehensive insights into real implementation cases is hard to find. Based on a case study in a German high-value manufacturing firm, we explore the corporate organisation and implementation of Industry 4.0. Using the framework of Complex Adaptive Systems (CAS), we have identified three key factors which facilitate the implementation of Industry 4.0, namely (1) changes to the organisational structure, such as the foundation of a central department for digital transformation; (2) the appointment of a Chief Digital Officer as a personnel change; and (3) a corporate opening-up towards cooperation with partners as a cultural change. We have furthermore found that Lean Management is an important enabler that ensures readiness for the adoption of Industry 4.0.
Globalization has increased the number of road trips and vehicles. The result has been an intensification of traffic accidents, which are becoming one of the most important causes of death worldwide. Traffic accidents are often due to human error, the probability of which increases when the cognitive ability of the driver decreases. Cognitive capacity is closely related to the driver’s mental state, as well as other external factors such as the CO2 concentration inside the vehicle. The objective of this work is to analyze how these elements affect driving. We have conducted an experiment with 50 drivers who have driven for 25 min using a driving simulator. These drivers completed a survey at the start and end of the experiment to obtain information about their mental state. In addition, during the test, their stress level was monitored using biometric sensors and the state of the environment (temperature, humidity and CO2 level) was recorded. The results of the experiment show that the initial level of stress and tiredness of the driver can have a strong impact on stress, driving behavior and fatigue produced by the driving test. Other elements such as sadness and the conditions of the interior of the vehicle also cause impaired driving and affect compliance with traffic regulations.
The expansion of a given multivariate polynomial into Bernstein polynomials is considered. Matrix methods are provided for the calculation of the Bernstein expansion of the product of two polynomials and of the Bernstein expansion of a polynomial from the expansion of one of its partial derivatives; these methods also allow symbolic computation.
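As a concrete univariate illustration of the basis conversion treated here in matrix form, the Bernstein coefficients of a degree-n polynomial with monomial coefficients a_0, …, a_n satisfy b_i = Σ_{j=0}^{i} (C(i,j)/C(n,j)) a_j. A minimal sketch (the function name is ours; the paper's matrix methods and the multivariate case are not reproduced):

```python
from math import comb

def bernstein_coeffs(a):
    """Monomial coefficients a[j] (degree n = len(a) - 1) to Bernstein
    coefficients b[i] on [0, 1] via b_i = sum_j C(i,j)/C(n,j) * a[j]."""
    n = len(a) - 1
    return [sum(comb(i, j) / comb(n, j) * a[j] for j in range(i + 1))
            for i in range(n + 1)]

# p(x) = x written in degree 2: x = 0*B_0 + 0.5*B_1 + 1*B_2
print(bernstein_coeffs([0, 1, 0]))  # [0.0, 0.5, 1.0]
```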
Let A = [a_ij] be a real symmetric matrix. If f: (0, ∞) → [0, ∞) is a Bernstein function, a sufficient condition for the matrix [f(a_ij)] to have only one positive eigenvalue is presented. Using this result, new results for a symmetric matrix with exactly one positive eigenvalue, e.g., properties of its Hadamard powers, are derived.
In today's volatile world, established companies must be capable of optimizing their core business with incremental innovations while simultaneously developing discontinuous innovations to maintain their long-term competitiveness. Balancing both is a major challenge for companies, since different types of innovation require different organizational structures, operational modes and management styles. Established companies tend to excel in improving their current business through incremental innovations which are closely related to their current knowledge base and competencies. However, this often goes hand in hand with challenges in the exploration of knowledge that is new to the company and that is essential for the development of discontinuous innovations. In this respect, the concept of corporate entrepreneurship is recognized as a way to strengthen the exploration of new knowledge and to support the development of discontinuous innovation. For managing corporate entrepreneurship more effectively, it is crucial to understand which types of knowledge can be created through corporate entrepreneurship and which organizational designs are more suited to gain certain types of knowledge. To answer these questions, this study analyzed 23 semi-structured interviews conducted with established companies that are running such entrepreneurial activities. The results show (1) that three general types of knowledge can be explored through corporate entrepreneurship and (2) that some organizational designs are more suited to explore certain knowledge types than others are.
Soft-input decoding of concatenated codes based on the Plotkin construction and BCH component codes
(2020)
Low latency communication requires soft-input decoding of binary block codes with small to medium block lengths.
In this work, we consider generalized multiple concatenated (GMC) codes based on the Plotkin construction. These codes are similar to Reed-Muller (RM) codes. In contrast to RM codes, BCH codes are employed as component codes. This leads to improved code parameters. Moreover, a decoding algorithm is proposed that exploits the recursive structure of the concatenation. This algorithm enables efficient soft-input decoding of binary block codes with small to medium lengths. The proposed codes and their decoding achieve significant performance gains compared with RM codes and recursive GMC decoding.
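The Plotkin (u | u+v) construction underlying these codes can be sketched in a few lines (a toy illustration under our own naming, not the paper's GMC decoder): each codeword consists of a word u of the first component code followed by u + v for a word v of the second.

```python
from itertools import product

def plotkin(code_u, code_v):
    """(u | u+v) construction over GF(2): length doubles, sizes multiply.

    The minimum distance of the result is min(2*d_u, d_v)."""
    return [tuple(u) + tuple(a ^ b for a, b in zip(u, v))
            for u, v in product(code_u, code_v)]

# RM(1,2) from RM(1,1) = GF(2)^2 and RM(0,1) = length-2 repetition code
code = plotkin([(0, 0), (0, 1), (1, 0), (1, 1)], [(0, 0), (1, 1)])
print(len(code), len(code[0]))  # 8 4
```

Applying the same recursion with BCH codes in place of the Reed-Muller component codes is the modification the abstract describes.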
Deep neural networks (DNNs) are known for their high prediction performance, especially in perceptual tasks such as object recognition or autonomous driving. Still, DNNs are prone to yield unreliable predictions when encountering completely new situations without indicating their uncertainty. Bayesian variants of DNNs (BDNNs), such as MC dropout BDNNs, do provide uncertainty measures. However, BDNNs are slow during test time because they rely on a sampling approach. Here we present a single shot MC dropout approximation that preserves the advantages of BDNNs without being slower than a DNN. Our approach is to analytically approximate for each layer in a fully connected network the expected value and the variance of the MC dropout signal. We evaluate our approach on different benchmark datasets and a simulated toy example. We demonstrate that our single shot MC dropout approximation resembles the point estimate and the uncertainty estimate of the predictive distribution that is achieved with an MC approach, while being fast enough for real-time deployments of BDNNs.
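The layer-wise moment propagation described above can be sketched as follows (a toy sketch under the stated independence assumptions; the function names are ours, and nonlinearities are omitted). For inverted dropout with drop rate p, the mean passes through unchanged while the variance becomes (var + p·mean²)/(1 − p):

```python
import numpy as np

def dropout_moments(mu, var, p):
    """Moments after inverted dropout x * B / (1 - p), B ~ Bernoulli(1 - p):
    mean unchanged; variance becomes (var + p * mu**2) / (1 - p)."""
    return mu, (var + p * mu**2) / (1 - p)

def linear_moments(mu, var, W, b):
    """Moments after a dense layer, assuming independent input units."""
    return W @ mu + b, (W**2) @ var

# single shot: propagate moments once instead of averaging many dropout samples
mu, var = dropout_moments(np.array([1.0, -2.0]), np.zeros(2), p=0.5)
mean_out, var_out = linear_moments(mu, var, np.array([[0.5, 0.5]]), np.zeros(1))
print(mean_out, var_out)  # [-0.5] [1.25]
```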
Side Channel Attack Resistance of the Elliptic Curve Point Multiplication using Gaussian Integers
(2020)
Elliptic curve cryptography is a cornerstone of embedded security. However, hardware implementations of the elliptic curve point multiplication are prone to side channel attacks. In this work, we present a new key expansion algorithm which improves the resistance against timing and simple power analysis attacks. Furthermore, we consider a new concept for calculating the point multiplication, where the points of the curve are represented as Gaussian integers. Gaussian integers are a subset of the complex numbers such that the real and imaginary parts are integers. Since Gaussian integer fields are isomorphic to prime fields, this concept is suitable for many elliptic curves. Representing the key by a Gaussian integer expansion is beneficial for reducing the computational complexity and the memory requirements of a secure hardware implementation.
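Arithmetic modulo a Gaussian integer, as used here, can be sketched with Python's complex type (an illustrative helper under our own naming, not the paper's secure implementation): round the exact quotient to the nearest Gaussian integer and keep the remainder.

```python
def gauss_mod(a, m):
    """Reduce Gaussian integer a modulo m: subtract the nearest multiple of m.

    The remainder has norm at most norm(m)/2; when norm(m) is prime,
    Z[i]/(m) is a field isomorphic to the prime field GF(norm(m))."""
    q = a / m                                    # exact complex quotient
    q = complex(round(q.real), round(q.imag))    # nearest Gaussian integer
    return a - q * m

def gauss_mul_mod(a, b, m):
    """Multiply two Gaussian integers and reduce modulo m."""
    return gauss_mod(a * b, m)

# Z[i] modulo 2+1j has norm 5, so the residues form a field with 5 elements
print(gauss_mod(3 + 0j, 2 + 1j))  # 1j
```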
Side Channel Attack Resistance of the Elliptic Curve Point Multiplication using Eisenstein Integers
(2020)
Asymmetric cryptography empowers secure key exchange and digital signatures for message authentication. Nevertheless, consumer electronics and embedded systems often rely on symmetric cryptosystems because asymmetric cryptosystems are computationally intensive. Besides, implementations of cryptosystems are prone to side-channel attacks (SCA). Consequently, the secure and efficient implementation of asymmetric cryptography on resource-constrained systems is demanding. In this work, elliptic curve cryptography is considered. A new concept for an SCA resistant calculation of the elliptic curve point multiplication over Eisenstein integers is presented and an efficient arithmetic over Eisenstein integers is proposed. Representing the key by Eisenstein integer expansions is beneficial to reduce the computational complexity and the memory requirements of an SCA protected implementation.
This paper presents the goals, the service design approach, and the results of the project "Accessible Tourism around Lake Constance", which is currently run by different universities, industrial partners and selected hotels in Switzerland, Germany and Austria. In the first phase, interviews with persons with disabilities and elderly persons were conducted to identify the barriers and pains faced by tourists who want to spend their holidays in the region of Lake Constance, as well as possible assistive technologies that help to overcome these barriers. The analysis of the interviews shows that one third of the pains and barriers are due to missing, insufficient, wrong or inaccessible information about the accessibility of the accommodation, surroundings, and points of interest during the planning phase of the holidays. Digital assistive technologies hence play a major role in bridging this information gap. In the second phase, so-called Hotel Living Labs (HLL) were established in which the identified assistive technologies can be evaluated. Based on these HLLs, an overall service for accessible holidays has been designed and developed. In the last phase, this service was implemented based on the HLLs as well as the identified assistive technologies, and it is currently being field-tested with tourists with disabilities from the three participating countries.
Seamless-Learning-Plattform
(2020)
Schreiben im Studium
(2020)
Should academic writing be taught as a general language of science, or should the linguistic requirements of the individual disciplines be taken into account? The answer to this question also depends on how strongly the academic languages differ. The present study examined discipline-specific language features using two corpora of German-language doctoral dissertations in business administration (BWL) and mechanical engineering (MB), in which the use of multi-word units (e.g. "im vergleich zu den", "in comparison with the") was compared. The analysis reveals clear differences between the disciplines in the use of multi-word units: only fourteen of the 50 most frequent multi-word units were used in both corpora. The results are discussed with regard to the teaching of academic writing. Considerations for the further development of the method are presented, and it is discussed how corpus linguistics could support the teaching of writing in higher education.
Methods based exclusively on heart rate hardly allow differentiation between physical activity, stress, relaxation, and rest, which is why an additional sensor, such as an activity/movement sensor, is added for detection and classification. The heart's response to physical activity, stress, relaxation, and inactivity can be very similar. In this study, we observe the influence of induced stress and analyze which metrics could be considered for its detection. Changes in the Root Mean Square of Successive Differences (RMSSD) provide information about physiological changes. A set of measurements collecting the RR intervals was taken; the intervals are used as a parameter to distinguish four different stages. Parameters such as skin conductivity or skin temperature were not used because the main aim is to keep the number of sensors and devices to a minimum and thereby increase wearability in the future.
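The RMSSD metric mentioned above is straightforward to compute from a sequence of RR intervals; a minimal sketch (the function name is ours):

```python
import math

def rmssd(rr_ms):
    """Root Mean Square of Successive Differences of RR intervals (in ms)."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

print(round(rmssd([800, 810, 790, 805, 795]), 2))  # 14.36
```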
Totally nonnegative matrices, i.e., matrices having all their minors nonnegative, and matrix intervals with respect to the checkerboard partial order are considered. It is proven that if the two bound matrices of such a matrix interval are totally nonnegative and satisfy certain conditions, then all matrices from this interval are also totally nonnegative and satisfy the same conditions.
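For reference, the checkerboard partial order used here is the entrywise order with alternating signs (standard definition, stated in our own notation): for real n×n matrices A = [a_ij] and B = [b_ij],

```latex
A \leq^{*} B
\quad :\Longleftrightarrow \quad
(-1)^{i+j} a_{ij} \leq (-1)^{i+j} b_{ij} \quad \text{for all } i, j,
```

so that the matrix interval [A, B] consists of all matrices Z with A ≤* Z ≤* B.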
The recovery of our body and brain from fatigue directly depends on the quality of sleep, which can be determined from the results of a sleep study. The classification of sleep stages is the first step of this study and includes the measurement of vital data and their further processing. The non-invasive sleep analysis system is based on a hardware sensor network of 24 pressure sensors providing sleep phase detection. The pressure sensors are connected to an energy-efficient microcontroller via a system-wide bus. A significant difference between this system and other approaches is the innovative way in which the sensors are placed under the mattress. This feature facilitates the continuous use of the system without any noticeable influence on the sleeping person. The system was tested by conducting experiments that recorded the sleep of various healthy young people. Results indicate the potential to capture respiratory rate and body movement.
Probabilistic Deep Learning
(2020)
Probabilistic Deep Learning is a hands-on guide to the principles that support neural networks. Learn to improve network performance with the right distribution for different data types, and discover Bayesian variants that can state their own uncertainty to increase accuracy. This book provides easy-to-apply code and uses popular frameworks to keep you focused on practical applications.
Bridging the gap (seam) between the university context and professional practice in students' learning is a major challenge, because the relevant actors (including teachers, learners, and company representatives) are separated in time, space, and organization (Milrad et al., 2013). A seamless-learning-based course design built on agile values and methods (including incremental procedures, a focus on learner-centered sessions, and individualized learner feedback) can help to bridge this significant gap. The poster discusses the basic design of such an agile SL concept, based on an iterative, incremental approach within a semester cycle of 15 weeks organized in three learning sprints. In addition, it reports on the initial teaching experiences of lecturers from both the university and industry, as well as on students' learning experiences over the past two years.
Pop-up Workshopreihe
(2020)
Philosophie & Rhetorik
(2020)
Due to the coronavirus pandemic, the use of home office and teleworking increased sharply in 2020. Path dependencies exist both in the choice of workplace and in the communication technologies used. This article examines these path dependencies systematically, in particular their causes and consequences as well as the possibilities for breaking them.
The IETF, concerned with the evolution of the Internet architecture, nowadays also looks into industrial automation processes. The contributions of a variety of IETF activities initiated during the last ten years now enable the replacement of proprietary standards by an open standardized protocol stack. This stack, denoted in the following as the 6TiSCH stack, is tailored for the industrial internet of things (IIoT). The suitability of the 6TiSCH stack for Industry 4.0 has yet to be explored. In this paper, we identify four challenges that, in our opinion, may delay or hinder its adoption. As a prime example, we focus on the initial 6TiSCH network formation, highlighting the shortcomings of the default procedure and introducing our current work towards a fast and reliable formation of dense networks.
Polysomnography is the gold standard for a sleep study and provides very accurate results, but its costs (both personnel and material) are quite high. Therefore, the development of a low-cost system for overnight breathing and heartbeat monitoring, which provides more comfort while recording the data, is a well-motivated challenge. The system proposed in this manuscript is based on resistive pressure sensors installed under the mattress. These sensors can measure the slight pressure changes provoked by breathing and heartbeat. The captured signal requires advanced processing, such as filtering and amplification, before the analog signal is ready for the next step. The output signal is then digitalized and further processed by an algorithm that performs custom filtering before it can recognize breathing and heart rate in real time. The result can be directly visualized. Furthermore, a CSV file is created containing the raw data, timestamps, and unique IDs to facilitate further processing. The achieved results are promising, and the average deviation from a reference device is about 4 bpm.
The Montgomery multiplication is an efficient method for modular arithmetic. Typically, it is used for modular arithmetic over integer rings to avoid the expensive inversion for the modulo reduction. In this work, we consider modular arithmetic over rings of Gaussian integers. Gaussian integers are a subset of the complex numbers such that the real and imaginary parts are integers. In many cases, Gaussian integer rings are isomorphic to ordinary integer rings. We demonstrate that the concept of the Montgomery multiplication can be extended to Gaussian integers. Due to the independent calculation of the real and imaginary parts, the computational complexity of the multiplication is reduced compared with ordinary integer modular arithmetic. This concept is suitable for coding applications as well as for asymmetric key cryptographic systems, such as elliptic curve cryptography or the Rivest-Shamir-Adleman system.
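Over ordinary integers, the Montgomery multiplication the abstract builds on replaces reduction modulo n by cheap operations modulo a power of two R (a textbook sketch, not the paper's Gaussian-integer extension; the names are ours):

```python
def montgomery_multiply(a, b, n, R, n_neg_inv):
    """Return a*b*R^{-1} mod n using only mod/div by R (a power of two).

    Requires gcd(R, n) = 1 and n_neg_inv = -n^{-1} mod R."""
    t = a * b
    m = (t * n_neg_inv) % R          # cheap: R is a power of two
    u = (t + m * n) // R             # t + m*n is divisible by R
    return u - n if u >= n else u

n, R = 97, 128
n_neg_inv = (-pow(n, -1, R)) % R

def to_mont(x):
    return (x * R) % n               # enter Montgomery form

res = montgomery_multiply(to_mont(5), to_mont(6), n, R, n_neg_inv)
print(montgomery_multiply(res, 1, n, R, n_neg_inv))  # 30, i.e. 5*6 mod 97
```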
In this thesis, the recognition problem and the properties of eigenvalues and eigenvectors of matrices which are strictly sign-regular of a given order, i.e., matrices whose minors of a given order have the same strict sign, are considered. The results are extended to matrices which are sign-regular of a given order, i.e., matrices whose minors of a given order have the same sign or are allowed to vanish. As a generalization, a new type of matrices, called oscillatory of a specific order, is introduced, and the properties of this type are investigated. Some applications to dynamic systems are also given.
Pascal Laube presents machine learning approaches for three key problems of reverse engineering of defective structured surfaces: parametrization of curves and surfaces, geometric primitive classification and inpainting of high-resolution textures. The proposed methods aim to improve the reconstruction quality while further automating the process. The contributions demonstrate that machine learning can be a viable part of the CAD reverse engineering pipeline.
The effectiveness of different machine learning algorithms on a publicly available database of signals derived from wearable devices is evaluated with the goal of optimizing human activity recognition and classification. Among the wide range of body signals, we chose two signals, namely the photoplethysmographic (optically detected subcutaneous blood volume) and tri-axial acceleration signals, which are easy to acquire simultaneously using widespread commercial devices (e.g. smartwatches) as well as custom wearable wireless devices designed for sport, healthcare, or clinical purposes. To this end, two widely used algorithms (decision tree and k-nearest neighbor) were tested, and their performance was compared to that of two recent algorithms (particle Bernstein and a Monte Carlo-based regression), both in terms of accuracy and processing time. A data preprocessing phase was also considered to improve the performance of the machine learning procedures and to reduce the problem size; a detailed analysis of the compression strategy and its results is also presented.
Production and marketing of cereal grains are some of the main activities in developing countries to ensure food security. However, the food gap is complicated further by high postharvest loss of grains during storage. This study aimed to compare low‐cost modified‐atmosphere hermetic storage structures with traditional practice to minimize quantitative and qualitative losses of grains during storage. The study was conducted in two phases: in the first phase, seven hermetic storage structures with or without smoke infusion were compared, and one selected structure was further validated at scaled‐up capacity in the second phase.
Multi-dimensional spatial modulation is a multiple-input/multiple-output wireless transmission technique that uses only a few active antennas simultaneously. The computational complexity of the optimal maximum-likelihood (ML) detector at the receiver increases rapidly as more transmit antennas or larger modulation orders are employed. ML detection may be infeasible for higher bit rates. Many suboptimal detection algorithms for spatial modulation use two-stage detection schemes where the set of active antennas is detected in the first stage and the transmitted symbols in the second stage. Typically, these detection schemes use the ML strategy for the symbol detection. In this work, we consider a suboptimal detection algorithm for the second detection stage. This approach combines equalization and list decoding. We propose an algorithm for multi-dimensional signal constellations with a reduced search space in the second detection stage through set partitioning. In particular, we derive a set partitioning from the properties of Hurwitz integers. Simulation results demonstrate that the new algorithm achieves near-ML performance. It significantly reduces the complexity when compared with conventional two-stage detection schemes. Multi-dimensional constellations in combination with suboptimal detection can even outperform conventional signal constellations in combination with ML detection.
Lean Digitalization
(2020)
Sleep apnea is a common sleep disorder with a variety of effects on everyday life; for example, daytime sleepiness has been reported in about 25% of patients with obstructive sleep apnea (OSA). The aim of this work is the development of a system that enables non-invasive detection of sleep apnea in a home environment.
At present, the majority of the proposed Deep Learning (DL) methods provide point predictions without quantifying the model's uncertainty. However, a quantification of the reliability of automated image analysis is essential, in particular in medicine, where physicians rely on the results for making critical treatment decisions. In this work, we provide an entire framework to diagnose ischemic stroke patients incorporating Bayesian uncertainty into the analysis procedure. We present a Bayesian Convolutional Neural Network (CNN) yielding a probability for a stroke lesion on 2D Magnetic Resonance (MR) images with corresponding uncertainty information about the reliability of the prediction. For patient-level diagnoses, different aggregation methods are proposed and evaluated, which combine the individual image-level predictions. These methods take advantage of the uncertainty in the image predictions and report model uncertainty at the patient level. In a cohort of 511 patients, our Bayesian CNN achieved an accuracy of 95.33% at the image level, representing a significant improvement of 2% over a non-Bayesian counterpart. The best patient aggregation method yielded 95.89% accuracy. Integrating uncertainty information about image predictions into the aggregation models resulted in higher uncertainty measures for false patient classifications, which made it possible to filter out critical patient diagnoses that should be examined more closely by a medical doctor. We therefore recommend using Bayesian approaches not only for improved image-level prediction and uncertainty estimation but also for the detection of uncertain aggregations at the patient level.
Which testing formats are most appropriate for assessing academic language skills? Christian Krekeler addresses this question for informal language assessment in the classroom. In the first part of the chapter, three formats that are frequently used in standardised assessment (discrete items, isolated tests of writing, and integrated tests) are investigated, and their usefulness for classroom assessment, where diagnosis, achievement and learning are important test functions, is discussed. Because academic language use is cognitively demanding, it is argued that the assessment of academic language skills should include authentic and complex tasks as well as subject-specific content. The resulting tensions between complex, authentic and context-sensitive tasks and the requirements of language assessment methodology are discussed in the second part of the chapter.
Traditional Western philosophy, cognitive science and traditional HCI frameworks approach the term digital and its implications with an implicit dualism (nature/culture, theory/practice, body/mind, human/machine). What lies between is a feature of our postmodern times, in which different states, conditions or positions merge and co-exist in a new, hybrid reality, a "continuous beta" (Mühlenbeck & Skibicki, 2007) version of becoming. Post-digitality involves the physical dimensions of spatio-temporal engagements. This new ontological paradigm reconceptualizes digital technology through the experience of the human body and its senses, thus emphasizing form-taking, situational engagement and practice rather than symbolic, disembodied rationality. This raises two questions in particular: how to encourage curiosity, playfulness, serendipity, emergence, discourse and collectivity? How to construct working methods without foregrounding and dividing the subject into an individual that already takes position? This paper briefly outlines the rhizomatic framework that I developed within my PhD research. This attempts to overcome two prevailing tendencies: first, the one-sided view of scientific approaches to knowledge acquisition and the purely application-oriented handling of materials, technologies and machines; second, the distanced perception of the world. In contrast, my work involves project-driven alchemic curiosity and doing research through artistic design practice. This means thinking through materials, technologies and machinic interactions. Now, at the end of this PhD journey, 10 interdisciplinary projects have emerged from this ontological queer-paradigm that is post-digital crafting 4.0. Below I illustrate this approach and its outcomes.
In modern fruit processing technology, non-destructive quality measuring techniques are sought for determining and controlling changes in the optical, structural, and chemical properties of the products. In this context, changes inside the product can be measured during processing. Especially for industrial use, fast, precise, but robust methods are particularly important to obtain high-quality products. In this work, a newly developed multi-spectral imaging system was implemented and adapted for drying processes. Furthermore, it was investigated whether the system could be used to link changes in the surface spectral reflectance during mango drying with changes in moisture content and contents of chemical components. This was achieved by recovering the spectral reflectance from multi-spectral image data and comparing the spectral changes with changes in the total soluble solids (TSS), pH value and the relative moisture content x_wb of the products. In a first step, the camera was modified to be used in drying; then the changes in the spectra and quality criteria during mango drying were measured. For this, mango slices were dried at air temperatures of 40–80 °C and relative air humidities of 5%–30%. Samples were analyzed and pictures were taken with the multi-spectral imaging system. The quality criteria were then predicted from spectral data. It could be shown that the newly developed multi-spectral imaging system can be used for quality control in fruit drying. There are strong indications as well that it can be employed for the prediction of chemical quality criteria of mangoes during drying. This way, quality changes can be monitored inline during the process using only one single measuring device.
Ballistocardiography is a technique that measures the heart rate from the mechanical vibrations of the body caused by the heart's movement. In this work, a novel noninvasive device placed under the mattress of a bed estimates the heart rate using ballistocardiography. Different algorithms for heart rate estimation have been developed.
As integrity management, compliance is today an essential aspect of successful entrepreneurial activity. Under the conditions of the digital transformation, which enables new business models, connects the world and harbours manifold unprecedented risks, robust compliance management is required that does not deal only with classical anti-corruption work: topics such as cyber and privacy compliance, product and technology compliance, money laundering, and the observance of human and labour rights in international value chains are increasingly coming into focus. The revised edition of this handbook reflects this unprecedented dynamic of recent years in a practice-oriented manner: with an international outlook (it includes country studies on China, Latin America, Russia and Africa, among others) and more than 40 new contributions on the currently most important compliance topics.
Compliance is on everyone's lips, not least owing to the emissions scandal in the automotive industry. Numerous further and often spectacular corporate scandals of recent years (above all involving cartel violations, corruption, fraud and accounting offences) have triggered a worldwide wave of regulation and of stricter enforcement of already existing legal and regulatory requirements. Companies are therefore increasingly looking for ways to prevent legally sanctionable misconduct in the corporate sphere, not least in order to avoid liability for the company, its governing bodies and its employees. It is important to recognize that, beyond the legal liability risk (fines and administrative penalties, forfeiture/disgorgement of profits, prison sentences), this also means avoiding the economic liability arising from non-compliance. This economic liability manifests itself as reputational damage and can show concretely in, for example, falling share prices, the termination of business relationships or exclusion from public tenders, as well as in problems recruiting top personnel or the resignation of managers and employees. Added to this are the costs of investigating detected cases of non-compliance (costs for external lawyers, auditors, etc.), the costs of eliminating the causes (e.g. closing gaps in the internal control system: so-called "remediation") and the costs caused by the absorption of management and employee capacity in working through a case of non-compliance.
"All organizations are inherently criminogenic." Even though this statement may sound exaggerated, recent studies show that more than half of managers in Germany have already been confronted with unethical behaviour in companies. Current scandals in the automotive industry (the so-called "emissions scandal") and in the financial industry (most recently the so-called "cum-ex" and "cum-cum" transactions) moreover seem to demonstrate that even organizations with established compliance management are often left merely reacting to delinquent acts. The prevention of economic crime and immoral behaviour, by contrast, regularly fails, despite in some cases elaborately designed compliance management systems (CMS).
Gesellschaftsrecht
(2020)
This publication provides a compact overview of the essential provisions of the law of partnerships and corporations. After the fundamental principles of company law, it first covers partnerships (GbR, OHG, KG, silent partnership and professional partnership) and subsequently corporations (GmbH and AG). Finally, it addresses the relevant influences of European law. Written in textbook form, it is aimed primarily at students of universities, universities of applied sciences, cooperative state universities and other educational institutions.
This paper proposes a novel transmission scheme for generalized multistream spatial modulation. This new approach uses Mannheim error-correcting codes over Gaussian or Eisenstein integers as multidimensional signal constellations. These codes enable a suboptimal decoding strategy with near-maximum-likelihood performance for transmission over the additive white Gaussian noise channel. In this contribution, this decoding algorithm is generalized to the detection for generalized multistream spatial modulation. The proposed method can outperform conventional generalized multistream spatial modulation with respect to decoding performance, detection complexity, and spectral efficiency.
In this article, we give the construction of new four-dimensional signal constellations in the Euclidean space, which represent a certain combination of binary frequency-shift keying (BFSK) and M-ary amplitude-phase-shift keying (MAPSK). A description of such signals and the formulas for calculating the minimum squared Euclidean distance are presented. We have developed an analytical construction method for even and odd values of M; hence, no computer search and no heuristic methods are required. The new optimized BFSK-MAPSK (M = 5, 6, ..., 16) signal constellations are built for modulation indices h = 0.1, 0.15, ..., 0.5 and their parameters are given. The results of computer simulations are also provided. Based on the obtained results we can conclude that BFSK-MAPSK systems outperform similar four-dimensional systems both in terms of minimum squared Euclidean distance and simulated symbol error rate.
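The figure of merit used above, the minimum squared Euclidean distance of a constellation, can be checked by brute force over all point pairs. The four-point set below is a toy example for illustration, not one of the optimized BFSK-MAPSK constellations from the article.

```python
import numpy as np

def min_squared_distance(points):
    """Minimum squared Euclidean distance over all constellation point pairs."""
    P = np.asarray(points, dtype=float)
    # broadcast to an (n, n) matrix of pairwise squared distances
    d2 = np.sum((P[:, None, :] - P[None, :, :]) ** 2, axis=-1)
    np.fill_diagonal(d2, np.inf)  # ignore the zero self-distances
    return float(d2.min())

# toy 4D constellation: the four unit vectors
const = np.array([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]])
print(min_squared_distance(const))  # → 2.0
```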
The few literature references on sorption isotherms of mineral screeds relate essentially to calcium sulphate screeds and standardized cement screeds, and as a rule only to one fixed air temperature (20 °C). The aim of the investigation described in this article was therefore to characterize the moisture properties of screeds under different climates by means of sorption isotherms. In addition, the ternary rapid cements that have been commercially available for about 20 years were to be included, as well as the temperatures of 15 °C and 25 °C, which are of practical interest on site. The effects of the climatic conditions on the construction site (season, air humidity, temperature) on the hydration process of the screeds were also investigated. In combination with the results of the microstructural investigations (including mercury porosimetry), it is demonstrated why cement-bound and rapid-cement-bound screeds behave completely differently from calcium-sulphate-bound screeds. This different behaviour is also one of the reasons why the moisture content of screeds cannot be assessed with the KRL method. A comparison of the "KRL method" of material moisture measurement with the "CM method", which is customary in the trade and has proven itself in practice for decades, therefore follows.
This document presents a new complete standalone system for the recognition of sleep apnea using signals from pressure sensors placed under the mattress. The developed hardware part of the system is tuned to filter and amplify the signal. Its software part performs more accurate signal filtering and identification of apnea events. The overall achieved accuracy of the recognition of apnea occurrence is 91%, with an average measured recognition delay of about 15 seconds, which confirms the suitability of the proposed method for future employment. The main aim of the presented approach is to support the healthcare system with a cost-efficient tool for the recognition of sleep apnea in the home environment.
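A minimal sketch of the software side's basic idea: an apnea event shows up as a sustained collapse of respiratory movement amplitude in the pressure signal. The window length, threshold and synthetic signal below are assumptions for illustration, not the system's actual algorithm.

```python
import numpy as np

def detect_apnea(signal, fs, win_s=10.0, thresh=0.1):
    """Flag fixed-length windows whose RMS amplitude falls below `thresh`,
    indicating absent respiratory movement (hypothetical criterion)."""
    win = int(win_s * fs)
    flags = []
    for i in range(len(signal) // win):
        seg = signal[i * win:(i + 1) * win]
        flags.append(float(np.sqrt(np.mean(seg ** 2))) < thresh)
    return flags

fs = 50                                         # sampling rate in Hz
t = np.arange(0, 30, 1 / fs)
breathing = np.sin(2 * np.pi * 0.25 * t)        # normal breathing, 15/min
breathing[int(10 * fs):int(20 * fs)] = 0.01     # 10 s pause: apnea event
print(detect_apnea(breathing, fs))  # → [False, True, False]
```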
The topics of compliance and value-based entrepreneurial conduct have been more present than ever, including in medium-sized companies and among consumers, at the latest since the so-called "emissions scandal", which has already cost the Volkswagen Group fines and damages amounting to billions.
In June 2018, the Braunschweig public prosecutor's office imposed a fine of one billion euros on Volkswagen in connection with the manipulation of emission values. At the beginning of 2017, VW reached a settlement with the US Department of Justice providing for the payment of 4.3 billion US dollars (2.8 billion dollars in criminal penalties, 1.5 billion dollars in civil fines). Through further settlements with customers and car dealers in the USA, the total costs for penalties, fines and settlement payments amount to about 25 billion dollars. In addition, further costs from the diesel affair are to be expected, arising essentially from capital market law (possibly omitted ad hoc disclosures), damages claims, engine retrofits and possibly further penalties in other jurisdictions. Added to this are the costs of investigating the events and of the so-called "remediation" of the compliance management system (CMS), which are just as attributable to the total damage as the internal costs (process improvements, management attention, etc.).
The "diesel scandal" and its consequences thus impressively demonstrate the relevance of compliance.
Ein Beitrag zum Beobachterentwurf und zur sensorlosen Folgeregelung translatorischer Magnetaktoren
(2020)
In 1863, the popular author Wilhelm Busch wrote the story of Max and Moritz, two rascals who violate the rules and customs of the village community and severely disrupt communal life. Busch identifies the core of the problem already in the introduction to his verse tale: "Ach, was muß man oft von bösen Kindern hören oder lesen! Wie zum Beispiel hier von diesen, Welche Max und Moritz hießen;
Die, anstatt durch weise Lehren Sich zum Guten zu bekehren, Oftmals noch darüber lachten Und sich heimlich lustig machten. Ja, zur Übeltätigkeit, Ja, dazu ist man bereit!" Max and Moritz refuse to attend school and to behave properly and in conformity with the rules. Their obstinacy and recklessness end fatally for the two, and only with this end does calm return to the village. The connection between Wilhelm Busch's well-known children's story "Max und Moritz" and current business and corporate scandals may at first glance seem somewhat far-fetched. As a "modern fairy tale", however, this story and its moral ultimately speak of exactly what troubles companies worldwide today. Without wishing to equate the famous pranks with varieties of modern white-collar crime, the following can be derived from the seven pranks for our topic:
– Rule-breaking behaviour can appear anywhere
– Everyone suffers from the misconduct of individuals
– The functioning of the overarching system is lastingly disturbed
– And: classical "attempts at education" frequently prove ineffective
In an exemplary cooperation between industry (Geobrugg AG, Romanshorn, Switzerland) and science (WITg Institut für Werkstoffsystemtechnik Thurgau at the Hochschule Konstanz, Tägerwilen, Switzerland), a new material and manufacturing concept for the construction of fish-farming nets made of high-strength stainless steel wires was developed with the support of the Swiss federal innovation agency (KTI, now Innosuisse).
In 2019, this development was awarded the Swiss innovation prize Prix Inox by Swiss Inox.
Multi-Dimensional Connectionist Classification is a method for weakly supervised training of Deep Neural Networks for segmentation-free multi-line offline handwriting recognition. MDCC applies Conditional Random Fields as an alignment function for this task. We discuss the structure and patterns of handwritten text that can be used for building a CRF. Since CRFs are cyclic graphical models, we have to resort to approximate inference when calculating the alignment of multi-line text during training, here in the form of Loopy Belief Propagation. This work concludes with experimental results for transcribing small multi-line samples from the IAM Offline Handwriting DB which show that MDCC is a competitive methodology.
We provide an overview of the ongoing discussions on the objectives of the energy transition in the form of a conceptual framework, intending to facilitate the search for the most viable options for a successful transformation of the energy system. For this purpose, we examine the development of energy policy goals in Germany in the past and present, whereby we give an overview of objectives and assessment approaches from politics, economics, and science. Moreover, we then merge the different views into a common framework and analyze the central conflict between the wholeness of a hypothetical target circle and the simplification in favor of a hypothetical target point in more detail.
We present source code patterns that are difficult for modern static code analysis tools. Our study comprises 50 different open source projects in both a vulnerable and a fixed version for XSS vulnerabilities reported with CVE IDs over a period of seven years. We used three commercial and two open source static code analysis tools. Based on the reported vulnerabilities we discovered code patterns that appear to be difficult to classify by static analysis. The results show that code analysis tools are helpful, but still have problems with specific source code patterns. These patterns should be a focus in training for developers.
Die Nibelungenbrücke Worms
(2020)
Research question: Which roles can be identified in corporate entrepreneurship? How do they differ in terms of various characteristics, and which skills appear particularly relevant to performing them successfully?
Methodology: Exploratory study with 56 semi-structured interviews on corporate entrepreneurship activities in the DACH region
Practical implications: A precise understanding of the respective roles, their differences and their requirements is necessary in order to staff the various corporate entrepreneurship activities with suitable personnel.
Non-invasive methods are particularly well suited for monitoring sleep at home. The signals that are frequently monitored are heart rate and respiratory rate. Ballistocardiography (BCG) is a technique in which the heart rate is measured from the mechanical oscillations of the body during each cardiac cycle. Review articles have recently been published. In a first approach, this study aims to assess whether the heart rate can be detected by means of BCG. The essential boundary conditions are whether this succeeds when the sensor is positioned under the mattress and low-cost sensors are used.
Mapping of tree seedlings is useful for tasks ranging from monitoring natural succession and regeneration to effective silvicultural management. Development of methods that are both accurate and cost-effective is especially important considering the dramatic increase in tree planting that is required globally to mitigate the impacts of climate change. The combination of high-resolution imagery from unmanned aerial vehicles and object detection by convolutional neural networks (CNNs) is one promising approach. However, unbiased assessments of these models and methods to integrate them into geospatial workflows are lacking. In this study, we present a method for rapid, large-scale mapping of young conifer seedlings using CNNs applied to RGB orthomosaic imagery. Importantly, we provide an unbiased assessment of model performance by using two well-characterised trial sites together containing over 30,000 seedlings to assemble datasets with a high level of completeness. Our results showed CNN-based models trained on two sites detected seedlings with sensitivities of 99.5% and 98.8%. False positives due to tall weeds at one site and naturally regenerating seedlings of the same species led to slightly lower precision of 98.5% and 96.7%. A model trained on examples from both sites had 99.4% sensitivity and precision of 97%, showing applicability across sites. Additional testing showed that the CNN model was able to detect 68.7% of obscured seedlings missed during the initial annotation of the imagery but present in the field data. Finally, we demonstrate the potential to use a form of weakly supervised training and a tile-based processing chain to enhance the accuracy and efficiency of CNNs applied to large, high-resolution orthomosaics.
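The tile-based processing chain mentioned at the end can be sketched as follows: a large orthomosaic is cut into overlapping tiles so that each tile fits the CNN input and seedlings on tile borders appear whole in at least one tile. Tile size, overlap and all names below are illustrative assumptions, not the study's actual parameters.

```python
import numpy as np

def iter_tiles(image, tile=256, overlap=32):
    """Yield (row, col, tile_view) covering a large orthomosaic.
    Tiles overlap by `overlap` pixels so border objects are not cut in
    every tile; edge tiles are simply clipped to the image extent."""
    step = tile - overlap
    h, w = image.shape[:2]
    for r in range(0, max(h - overlap, 1), step):
        for c in range(0, max(w - overlap, 1), step):
            yield r, c, image[r:r + tile, c:c + tile]

# a 600 x 600 RGB dummy image yields a 3 x 3 grid of overlapping tiles
img = np.zeros((600, 600, 3), dtype=np.uint8)
tiles = list(iter_tiles(img))
print(len(tiles))  # → 9
```

Detections from each tile would then be mapped back to mosaic coordinates via the (row, col) offsets, with duplicates in the overlap zones merged.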
Der digitale Seilüberwacher
(2020)
Location-aware mobile devices are becoming increasingly popular, and GPS sensors are built into nearly every portable unit with computational capabilities. At the same time, the emergence of location-aware virtual services and ideas calls for new efficient spatial real-time queries. Communication latency in mobile environments, combined with high decentralization and the need for scalability in high-density systems with immense client counts, leads to major challenges. In this paper we describe a decentralized architecture for continuous range queries in settings in which both the requested and the requesting clients are mobile. While prior works commonly use a request-response approach, we provide a stream-based adaptive grid solution that deals with arbitrarily high client counts and improves communication latency to meet given hard real-time constraints.
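The grid idea behind such architectures can be illustrated with a fixed-size grid: each client position maps to a cell, and a range query only needs to touch the cells its radius overlaps. Cell size and function names are hypothetical, and the paper's grid additionally adapts to client density.

```python
def cell_of(x, y, cell_size):
    """Map a client position to its grid cell (hypothetical indexing)."""
    return (int(x // cell_size), int(y // cell_size))

def candidate_cells(x, y, radius, cell_size):
    """Cells that may contain clients within `radius` of (x, y); only the
    streams of these cells need to be evaluated for the range query."""
    cmin = cell_of(x - radius, y - radius, cell_size)
    cmax = cell_of(x + radius, y + radius, cell_size)
    return [(i, j)
            for i in range(cmin[0], cmax[0] + 1)
            for j in range(cmin[1], cmax[1] + 1)]

print(cell_of(105.0, 42.0, 50))                    # → (2, 0)
print(len(candidate_cells(105.0, 42.0, 60, 50)))   # → 16
```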
The ageing infrastructure in ports requires regular inspection. This inspection is currently carried out manually by divers who sense by hand the entire underwater infrastructure. This process is cost-intensive as it involves a lot of time and human resources. To overcome these difficulties, we propose to scan the above- and underwater port structure with a multi-sensor system and, by a fully automated process, to classify the obtained point cloud into damaged and undamaged zones. We make use of simulated training data to test our approach, since not enough training data with corresponding class labels are available yet. To that aim, we build a rasterised heightfield of a point cloud of a sheet pile wall by cutting it into vertical slices. The distance from each slice to the corresponding line generates the heightfield. The latter is propagated through a convolutional neural network which detects anomalies. We use the VGG19 deep neural network model pretrained on natural images. This neural network has 19 layers and is often used for image recognition tasks. We showed that our approach can achieve a fully automated, reproducible, quality-controlled damage detection which is able to analyse the whole structure instead of the sample-wise manual method with divers. The mean true positive rate is 0.98, which means that we detected 98% of the damages in the simulated environment.
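A simplified sketch of the heightfield construction described above: points of one vertical slice are binned into a raster, and each cell stores the point's distance to the wall's reference plane, so dents and other damage appear as deviations in the resulting image. Cell size, plane handling and all names are assumptions, not the paper's exact procedure.

```python
import numpy as np

def slice_heightfield(points, plane_x, cell=0.05, extent=2.0):
    """Rasterise one vertical slice of a wall point cloud: for each
    (z, y) cell, store the smallest distance of its points to the
    reference plane x = plane_x (simplified illustration)."""
    dist = np.abs(points[:, 0] - plane_x)
    rows = (points[:, 2] / cell).astype(int)   # height bins
    cols = (points[:, 1] / cell).astype(int)   # along-wall bins
    n = int(extent / cell)
    grid = np.full((n, n), np.nan)
    for r, c, d in zip(rows, cols, dist):
        if 0 <= r < n and 0 <= c < n:
            if np.isnan(grid[r, c]) or d < grid[r, c]:
                grid[r, c] = d
    return grid

# two points: one near the plane, one 0.3 m off (a hypothetical dent)
pts = np.array([[0.02, 0.10, 0.10], [0.30, 0.50, 0.50]])
g = slice_heightfield(pts, plane_x=0.0)
print(round(float(g[2, 2]), 2), round(float(g[10, 10]), 2))  # → 0.02 0.3
```

Such a raster can then be fed as an image into a pretrained CNN like VGG19.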
Uncertainty about the future requires companies to create discontinuous innovations. Established companies, however, struggle to do so, whereas independent startups seem to cope with this better. Consequently, established companies set up entrepreneurial initiatives to make use of startups' benefits. Among other things, this has led to great interest in so-called corporate entrepreneurship (CE) programs and to the development and characterization of several different forms. Yet their processes for achieving certain objectives are still rather ineffective. Considering the actions performed in preparation for and during CE programs could be one approach to improve this, but such considerations are still absent today. Furthermore, the increasing use of several CE programs in parallel seems to bear the potential for synergies and, thus, a more efficient use of resources. Aiming to provide insights into both issues, this study analyzes actions of CE programs by looking at interviews with managers of seven corporate incubators and accelerator programs of five established German tech companies.
Creative industry and cultural tourism destination Lake Constance - a media discourse analysis
(2020)
The following media discourse analysis examines the news media coverage of four regional online newspapers about the topics "creative industries" and "cultural tourism" in the Lake Constance region in the period from 2006 until 2016. The results show that, besides event-related reporting, there is currently no vibrant media discourse on the topics "creative industries" and "cultural tourism". Even though the image of the Lake Constance region is heavily influenced by tourism, "cultural tourism" also plays a secondary role when it comes to regional news reporting. Moreover, the discourses do not overlap, and thus no synergies within the local media discourse are formed. This result is relevant for regional tourism development, because cooperation between "creative industries" and "cultural tourism" creates opportunities such as the expansion of the tourism offer and an extension of the tourist season. To activate unused opportunities at the different destinations of the region, supra-regional visibility of the "creative industries" sector should be developed and the sector's cooperation with local stakeholders of cultural tourism should be promoted.
To answer the question of the necessity of compliance management for medium-sized companies right away: yes, medium-sized companies, too, are well advised to engage with compliance management. The necessity of compliance management, understood as ensuring compliance, i.e. adherence to legal regulations, soft law and internal rules and standards of conduct, already follows from the responsibility of the company's management to protect the company from threats and to secure the continued existence of the company and its cooperative relationships with the various stakeholder groups in the long term. Put differently: compliance management is risk management and thus of strategic importance for responsible corporate management (good corporate governance), in large companies as in medium-sized ones. Only the question of the "how" of compliance management in medium-sized companies is justified.
Compliance Governance
(2020)
Compliance governance is of fundamental importance for the functioning and effectiveness of a compliance management system (CMS) and, connected with this, for good and responsible corporate governance. This means suitable compliance structures and a positive compliance culture, which move into the centre of attention. This contribution examines this complex of topics from different perspectives. First, Stephan Grüninger defines the term "compliance governance" (1.3.1) and, building on this, works out the essential principles of success of an effective compliance structure and culture from a business administration and organizational theory perspective (1.3.2). Control and supervision play an important role in good compliance governance, so Roland Steinmeyer subsequently (1.3.3) describes, from a legal perspective, the responsibilities of the supervisory board within compliance management (CM). Finally (1.3.4), Christian Strenger offers insights into the practice of compliance supervision and names the requirements that supervisory boards have to fulfil in exercising their monitoring function and the challenges they face in doing so.
Good sleep is crucial for a healthy life of every person. Unfortunately, its quality often decreases with aging. A common approach to measuring sleep characteristics is based on interviews with the subjects or letting them fill in a daily questionnaire and afterwards evaluating the obtained data. However, this method entails time and personnel costs for the interviewer and the evaluator of the responses. Therefore, it would be important to perform the collection and evaluation of sleep characteristics automatically. To do that, it is necessary to investigate the level of agreement between measurements performed in the traditional way using questionnaires and measurements obtained using electronic monitoring devices. The study presented in this manuscript performs this investigation, comparing such sleep characteristics as "time going to bed", "total time in bed", "total sleep time" and "sleep efficiency". A total number of 106 night records of elderly persons (aged 65+) were analyzed. The results achieved so far reveal that the degree of agreement between the two measurement methods varies substantially for different characteristics, from 31 minutes of mean difference for "time going to bed" to 77 minutes for "total sleep time". For this reason, a direct exchange of objective and subjective measuring methods is currently not possible.
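The agreement figures quoted above are mean differences between the two methods; such a comparison can be sketched as follows. The five night records are invented illustration data, not values from the study.

```python
import numpy as np

def agreement(questionnaire_min, device_min):
    """Mean difference and its sample standard deviation between
    self-reported and device-measured sleep times (both in minutes)."""
    d = np.asarray(questionnaire_min, float) - np.asarray(device_min, float)
    return float(np.mean(d)), float(np.std(d, ddof=1))

# hypothetical "total sleep time" records for five nights
subj = [420, 390, 450, 400, 430]   # questionnaire, minutes
dev  = [350, 340, 370, 330, 360]   # monitoring device, minutes
mean_diff, sd = agreement(subj, dev)
print(round(mean_diff, 1))  # → 68.0
```

A full agreement analysis would typically add Bland-Altman limits of agreement (mean difference ± 1.96 × SD).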
Spatial modulation is a low-complexity multiple-input/multiple-output transmission technique. The recently proposed spatial permutation modulation (SPM) extends the concept of spatial modulation. It is a coding approach where the symbols are dispersed in space and time. In the original proposal of SPM, short repetition codes and permutation codes were used to construct a space-time code. In this paper, we propose a similar coding scheme that combines permutation codes with codes over Gaussian integers. Short codes over Gaussian integers have good distance properties. Furthermore, the code alphabet can directly be applied as a signal constellation, hence no mapping is required. Simulation results demonstrate that the proposed coding approach outperforms SPM with repetition codes.
Three-dimensional ship localization with only one camera is a challenging task due to the loss of depth information caused by perspective projection. In this paper, we propose a method to measure distances based on the assumption that ships lie on a flat surface. This assumption makes it possible to recover depth from a single image using the principle of inverse perspective. For the 3D ship detection task, we use a hybrid approach that combines image detection with a convolutional neural network, camera geometry and inverse perspective. Furthermore, a novel calculation of object height is introduced. Experiments show that the monocular distance computation compares well to a Velodyne lidar. Due to its robustness, this could be an easy-to-use baseline method for detection tasks in navigation systems.
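Under the flat-surface assumption, the depth recovery reduces to one line of pinhole geometry. The focal length, camera height and pixel coordinates below are invented values for illustration, not the paper's calibration.

```python
def ground_distance(v_px, v_horizon_px, focal_px, cam_height_m):
    """Distance to a point on a flat water surface from its image row.
    For a pinhole camera of height h and focal length f (in pixels),
    a surface point at distance d projects (v - v_horizon) = f * h / d
    pixels below the horizon, so d = f * h / (v - v_horizon)."""
    dv = v_px - v_horizon_px
    if dv <= 0:
        raise ValueError("point must lie below the horizon")
    return focal_px * cam_height_m / dv

# hypothetical values: 1000 px focal length, camera 10 m above the water
print(ground_distance(v_px=540, v_horizon_px=500,
                      focal_px=1000.0, cam_height_m=10.0))  # → 250.0
```

Note the sensitivity near the horizon: small pixel errors in `dv` translate into large distance errors, which is one reason a lidar comparison is needed.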
Climate protection in Seychelles through tourism: the advantages of a small-sized destination
(2020)
CO2 abatement costs are often low in developing countries. This is why most carbon offset projects are being implemented there. Nevertheless, this does not mean that the holiday resort and the project country are in any way related to each other. Linking compensation projects with the destination country could increase the willingness of air travellers to finance voluntary CO2 compensation measures.
This paper describes how a possible combination of CO2 compensation projects with the Seychelles as a destination could affect the voluntary carbon offset behaviour of Seychelles tourists. It examines, on the one hand, whether the voluntary willingness of Seychelles travellers to compensate can be increased and, on the other hand, whether tourists would be willing to visit a co-financed project in the Seychelles.
As a result, the willingness of tourists to offset air-travel carbon emissions can be increased. Important factors for this are e.g. that all persons have adequate information and that the carbon offset providers display a high degree of transparency. In addition, a broad interest in visiting the projects in the Seychelles during the holiday was expressed. An important condition for this is the spatial vicinity to the project. Due to its small size, the Seychelles are an ideal location for fulfilling this premise.
NAND flash memory is widely used for data storage due to low power consumption, high throughput, short random access latency, and high density. The storage density of the NAND flash memory devices increases from one generation to the next, albeit at the expense of storage reliability.
Our objective in this dissertation is to improve the reliability of NAND flash memory at a low hardware implementation cost. We investigate the error characteristics, i.e. the various noise effects, of the NAND flash memory. Based on the error behavior at different life-aging stages, we develop offset calibration techniques that minimize the bit error rate (BER).
Furthermore, we introduce data compression to reduce the write amplification effect and to support the error correction codes (ECC) unit. In the first scenario, the numerical results show that data compression can reduce the wear-out by minimizing the amount of data that is written to the flash. In the ECC scenario, the compression gain is used to improve the ECC capability. Based on the first scenario, the write amplification effect can be halved for the considered target flash and data model. By combining ECC and data compression, the NAND flash memory lifetime improves threefold compared with uncompressed data for the same data model.
In order to improve the data reliability of the NAND flash memory, we investigate different ECC schemes based on concatenated codes like product codes, half-product codes, and generalized concatenated codes (GCC). We propose a construction for high-rate GCC for hard-input decoding. ECC based on soft-input decoding can significantly improve the reliability of NAND flash memories. Therefore, we propose a low-complexity soft-input decoding algorithm for high-rate GCC.
Despite the importance of Social Life Cycle Sustainability Assessment (S-LCSA), little research has addressed its integration into Product Lifecycle Management (PLM) systems. This paper presents a structured review of relevant research and practice. Also, to address practical aspects in more detail, it focuses on the challenges and potential for the adoption of such an integrated system at an electronics company.
We began by reviewing literature on implementations of S-LCSA and identifying research needs. Then we investigated the status of S-LCSA within the electronics industry, both by reviewing literature and by interviewing decision makers, to identify the challenges and potential for adopting S-LCSA at an electronics company. We found low maturity of S-LCSA, particularly difficulty in quantifying social sustainability. Adoption of S-LCSA was less common among electronics industry suppliers, especially mining and smelting plants. Our results could provide a basis for conducting case studies that could further clarify the issues involved in integrating S-LCSA into PLM systems.
A residual neural network was adapted and applied to the PhysioNet/Computing in Cardiology Challenge 2020 data to detect 24 different classes of cardiac abnormalities from 12-lead ECGs. Additive Gaussian noise, signal shifting, and the classification of signal sections of different lengths were applied to prevent the network from overfitting and to facilitate generalization. Due to the use of a global pooling layer after the feature extractor, the network is independent of the signal's length. On the hidden test set of the challenge, the model achieved a validation score of 0.656 and a full test score of 0.27, placing us 15th out of 41 officially ranked teams (team name: UC_Lab_Kn). These results show the potential of deep neural networks for application to raw data and a complex multi-class multi-label classification problem, even if the training data come from diverse datasets and are of differing lengths.
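Why global pooling makes the network independent of signal length can be seen in two lines of numpy: averaging a (T, C) feature map over the time axis yields a fixed C-dimensional vector for any T, so the classifier head never sees the length. The shapes below are illustrative, not the model's actual dimensions.

```python
import numpy as np

def global_avg_pool(features):
    """Collapse a (T, C) feature tensor over the time axis to a fixed
    C-dimensional vector, regardless of the sequence length T."""
    return features.mean(axis=0)

short = np.ones((300, 64))    # features of a short ECG segment
long_ = np.ones((5000, 64))   # features of a long recording
print(global_avg_pool(short).shape, global_avg_pool(long_).shape)  # → (64,) (64,)
```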
In previous studies, we used a method for detecting stress that was based exclusively on heart rate and ECG to differentiate between situations such as mental stress, physical activity, relaxation, and rest. As a response of the heart to these situations, we observed different behavior in the root mean square of successive differences of heartbeat intervals (RMSSD). This study aims to analyze virtual reality, presented via a virtual reality headset, as an effective stressor for future work. The RMSSD value is an important marker for the parasympathetic effect on the heart and can provide information about stress. For these measurements, the RR intervals were collected using a chest belt; additional sensors were not used for the analysis. We conducted experiments with ten subjects who had to drive a simulator for 25 minutes using monitors and 25 minutes using a virtual reality headset. Before starting and after finishing each simulation, the subjects had to complete a survey in which they had to describe their mental state. The experimental results show that driving with a virtual reality headset has some influence on the heart rate and RMSSD, but it does not significantly increase the stress of driving.
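The RMSSD marker used above has a simple closed form, the root mean square of successive RR-interval differences. The five RR intervals below are invented example values, not data from the study.

```python
import numpy as np

def rmssd(rr_ms):
    """RMSSD: root mean square of successive differences of RR intervals,
    given in milliseconds."""
    d = np.diff(np.asarray(rr_ms, dtype=float))
    return float(np.sqrt(np.mean(d ** 2)))

# hypothetical RR intervals (ms) from a chest-belt recording
print(round(rmssd([800, 810, 790, 805, 795]), 2))  # → 14.36
```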
The reliability of flash memories suffers from various error causes. Program/erase cycles, read disturb, and cell-to-cell interference impact the threshold voltages and cause bit errors during the read process. Hence, error correction is required to ensure reliable data storage. In this work, we investigate the bit-labeling of triple level cell (TLC) memories. This labeling determines the page capacities and the latency of the read process. The page capacity defines the redundancy that is required for error correction coding. Typically, Gray codes are used to encode the cell state such that the codes of adjacent states differ in a single digit. These Gray codes minimize the latency for random access reads but cannot balance the page capacities. Based on measured voltage distributions, we investigate the page capacities and propose a labeling that provides a better rate balancing than Gray labeling.
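For reference, the standard binary reflected Gray code used as the baseline labeling can be generated as n XOR (n >> 1); the balanced labeling proposed in the work is not reproduced here. A short sketch:

```python
def gray(n):
    """n-th codeword of the binary reflected Gray code."""
    return n ^ (n >> 1)

# The 8 threshold-voltage states of a TLC cell, labeled with 3 bits
# (one bit per page). Adjacent states differ in exactly one bit:
labels = [format(gray(s), "03b") for s in range(8)]
print(labels)  # ['000', '001', '011', '010', '110', '111', '101', '100']

# Verify the Gray property:
for a, b in zip(labels, labels[1:]):
    assert sum(x != y for x, y in zip(a, b)) == 1
```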
This distinctive textbook introduces the calculation of internal combustion engines, starting from current questions, e.g., on vehicle dynamics or engine thermodynamics, and providing the necessary theory along with the solutions. So that newcomers to the field can also succeed, an index lists which theoretical knowledge is required to solve each problem and in which section of the book it is derived. All calculations are carried out in Excel.
The present edition adds current examples on engine efficiency, the engine map, load-point shifting, the WLTC driving cycle, and real driving cycles (RDE).
Contents:
- Driving resistance and engine power
- Fuels and stoichiometry
- Engine power and mean effective pressure
- Engine thermodynamics
- Engine mechanics
- Vehicle dynamics
- Hybrid and electric vehicles
- Supercharging of internal combustion engines
Target groups:
Students of mechanical engineering and automotive engineering in Bachelor's and Master's programs at universities; engine engineers dealing with combustion-engine questions; graduates of advanced training courses at master-craftsman level
The author, Dr.-Ing. Klaus Schreiner, is a professor at HTWG Konstanz and head of the Internal Combustion Engines Laboratory in the Faculty of Mechanical Engineering. For many years he was his university's didactics officer. In 2008 he received the teaching award of the state of Baden-Württemberg.
What drives entrepreneurial action to create a lasting impact? New ventures that aim to have an impact beyond their financial performance face additional challenges: achieving economic sustainability while at the same time addressing social or environmental issues. Little is known about how these new hybrid organizations, aiming for multiple impact dimensions, manage to be congruent with their blended values. A dataset of 4,125 early-stage ventures is used to gain insights into how blended values are converted into financial, social, and environmental impacts, giving shape to different types of hybrid organizations. Our findings suggest that new hybrid organizations may opt to sacrifice financial impact to achieve social impact, yet this is not the case when they aim to generate environmental or sustainable impact. Therefore, the tensions and sacrifices related to holding blended values are not homogeneous across all types of new hybrid organizations.
This article examines the influence of basement modeling on the shear force and deformations of reinforced concrete shear walls fixed in the basement. The shear force distribution of the wall and the displacement at the top of the wall are compared with the corresponding values obtained from the simplified cantilever wall model, in which the wall is fully fixed at the level of the basement ceiling and the transfer of internal forces within the basement is not considered in detail. To account for the basement, a pinned support by the basement ceiling and the base slab is considered first, which leads to an unrealistically large wall shear force in the basement. A refined model with partial fixity in the base slab and a flexible support by the basement ceiling is then examined. Recommended values are given for the spring constant of the rotational spring at the level of the base slab and of the horizontal translational spring at the level of the basement ceiling, which can be used in practice. Particular attention is paid to the question of what influence accounting for the basement has on the seismic design of the shear walls. Both the system stiffness, on which the equivalent seismic loads depend, and the achievable displacement ductility, on which the possible behavior factor q depends, are considered.
In this article, a machine learning method for sleep stage detection is developed. Common methods of sleep analysis are based on polysomnography (PSG). The presented approach is based on signals that can be measured entirely non-invasively in a home environment. Movement, heartbeat, and respiration signals are comparatively easy to acquire, but they make the detection of sleep stages more difficult. The signals are structured as time series and divided into epochs. The performance of the machine learning approach is compared against polysomnography and evaluated.
Digital technology and architecture have become inseparable, with new approaches and methodologies not just affecting the workflows and practice of architects but shaping the very character of architecture.
This compendious work offers a wide-ranging orientation to the new landscape with its opportunities, its challenges, and its vast potential.
The project task was to modernize the current laboratory experiment so that communication between the experimental setup and the laboratory computer no longer takes place via converter cards, as before, but via EtherCAT and TwinCAT 3.
Installing TwinCAT 3 with the associated extensions and required programs proves to be very extensive and difficult, as the installation instructions show. In addition, there were many sources of error that were not immediately apparent, such as updating the current MATLAB version. Once the installation is complete, communication between MATLAB and TwinCAT can be implemented relatively easily.
At the beginning of the project, the communication was checked with several tests and optimizations were made; for example, the travel limit was adjusted. Difficulties arose with operation via MATLAB and with MATLAB crashes: when MATLAB stops or crashes, the last transmitted value is still applied to TwinCAT 3, so the actuator would continue to move. This very dangerous situation would be a serious disadvantage compared to the old communication via a converter card. To guarantee a safe stop, the MATLAB status is monitored with a toggle bit via a new TcCOM object; if the value of the bit no longer changes, the system stops safely.
To allow a comparison with the previous master's lab experiment, the plant was examined with the new communication and a suitable controller was designed for it.
The evaluation of the impulse response and the spectrum analysis showed identical results when comparing the interfaces, so the experiments of the laboratory course can be carried out without restrictions. Contrary to the predictions of the Beckhoff experts, the controller design showed very good results, and communication via the interface showed no problems.
However, limitations arose with the sampling time to be set, since a sampling time below 2 ms is not possible. A lower sampling time can be configured, but the evaluation shows that the interface has problems with sampling times below 2 ms: the computation time becomes significantly longer, and the larger number of measurement points cannot be processed correctly. A controller therefore cannot be implemented under these conditions.
The project was thus completed successfully, and apart from the laborious installation, the Beckhoff extensions are very reliable and easy to operate. The first preliminary investigations were positive, so converting the interface on further laboratory computers can also be considered.
This work is a comparative study of survey tools intended to help developers select a suitable tool for application in an AAL environment. The first step was to identify the basic required functionality of the survey tools used for AAL technologies and to compare these tools by their functionality and intended uses. The comparative study was derived from the data obtained, previous literature studies, and further technical data. A list of requirements was compiled and ordered by relevance to the target application domain. With the help of an integrated assessment method, a generalized estimate value was calculated, and the result is explained. Finally, the planned application of this tool in a running project is described.
For a long time, the use of intermediate products in production has been growing more rapidly in most countries than domestic production. This is a strong indication of increasing interdependency in production. The main purpose of input-output analysis is to study the interdependency of industries in an economy; often the term interindustry analysis is also used. The exchange of intermediate products is therefore a key issue of input-output analysis. For this study, we use input–output data that the author prepared for the new 'Handbook on Supply, Use and Input–Output Tables with Extensions and Applications' of the United Nations. The supply, use, and input–output tables contain separate valuation matrices for trade margins, transport margins, value added tax, other taxes on products, and subsidies on products. For the study, two input–output models were developed to evaluate the impact of fuel subsidy and taxation reform on output, gross domestic product, inflation, and trade. Six scenarios are discussed covering different aspects of the reform.
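The interdependency studied by input-output analysis is captured by the classical Leontief model x = (I - A)^-1 f, which converts final demand f into total output x via the technical coefficient matrix A. A minimal two-industry sketch (the coefficients and demand figures are invented for illustration and are not taken from the UN tables or the study's models):

```python
def leontief_output(A, f):
    """Total output x solving x = A x + f, i.e. x = (I - A)^-1 f,
    for a 2-industry economy (closed-form 2x2 inverse)."""
    a, b = 1.0 - A[0][0], -A[0][1]
    c, d = -A[1][0], 1.0 - A[1][1]
    det = a * d - b * c
    return [(d * f[0] - b * f[1]) / det,
            (a * f[1] - c * f[0]) / det]

# Hypothetical technical coefficients and final demand:
A = [[0.2, 0.3],
     [0.1, 0.4]]      # A[i][j]: input from industry i per unit output of j
f = [100.0, 50.0]     # final demand per industry
x = leontief_output(A, f)
print([round(v, 2) for v in x])  # [166.67, 111.11]
```

Raising an industry's final demand here propagates through both industries' outputs, which is exactly the interdependency effect the models in the study quantify for the subsidy and taxation scenarios.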
Many resource-constrained systems still rely on symmetric cryptography for verification and authentication. Asymmetric cryptographic systems provide higher security levels but are computationally intensive. Hence, embedded systems can benefit from hardware assistance, i.e., coprocessors optimized for the required public-key operations. In this work, we propose an elliptic curve cryptographic coprocessor design for resource-constrained systems. Many such coprocessor designs consider only special (Solinas) prime fields, which enable a low-complexity modulo arithmetic. Other implementations support arbitrary prime curves using the Montgomery reduction, but these typically require more time for the point multiplication. We present a coprocessor design that has low area requirements and enables a trade-off between performance and flexibility: the point multiplication can be performed either using fast arithmetic based on Solinas primes or using a slower but flexible Montgomery modular arithmetic.
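Montgomery modular arithmetic, the flexible option mentioned above, avoids trial division by working in a residue representation scaled by R = 2^k, so reduction needs only shifts, multiplications, and additions. A software sketch of the idea (educational only; the coprocessor implements this at the hardware word level, and the parameters below are illustrative):

```python
def montgomery_mul_mod(a, b, p, rbits=None):
    """Compute a*b mod p via Montgomery reduction (educational sketch).
    Requires an odd modulus p so that gcd(R, p) = 1."""
    if rbits is None:
        rbits = p.bit_length() + 1    # any R = 2^k with R > p works
    R = 1 << rbits
    p_inv = pow(-p, -1, R)            # p' = -p^{-1} mod R (Python 3.8+)

    def redc(T):                      # returns T * R^{-1} mod p, for T < R*p
        m = (T * p_inv) % R           # cheap: masking by R-1 in hardware
        t = (T + m * p) >> rbits      # exact division by R (low bits are zero)
        return t - p if t >= p else t

    a_bar = (a << rbits) % p          # to Montgomery domain: a*R mod p
    b_bar = (b << rbits) % p
    return redc(redc(a_bar * b_bar))  # two reductions leave plain a*b mod p
```

In a real point multiplication, operands would stay in the Montgomery domain across many multiplications, so the domain conversions are paid only once.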
In this work, we investigate a hybrid decoding approach that combines algebraic hard-input decoding of binary block codes with soft-input decoding. In particular, an acceptance criterion is proposed which determines the reliability of a candidate codeword. For many received codewords the stopping criterion indicates that the hard-decoding result is sufficiently reliable, and the costly soft-input decoding can be omitted. The proposed acceptance criterion significantly reduces the decoding complexity. For simulations we combine the algebraic hard-input decoding with ordered statistics decoding, which enables near maximum likelihood soft-input decoding for codes of small to medium block lengths.
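One common way to instantiate such an acceptance criterion is to sum the channel reliabilities |LLR| at the positions where the hard-decoded candidate disagrees with the channel hard decisions, and accept the candidate when this discrepancy is small. A sketch of that idea (an assumption for illustration; the paper's exact criterion and threshold are not reproduced here):

```python
def accept_hard_decision(candidate_bits, llrs, threshold):
    """Accept a hard-decoding candidate if the total reliability of the
    positions it flips relative to the channel hard decisions is small.
    Convention: LLR >= 0 means the channel favors bit 0."""
    penalty = sum(abs(l) for c, l in zip(candidate_bits, llrs)
                  if c != (0 if l >= 0 else 1))
    return penalty <= threshold

llrs = [2.5, -1.8, 0.2, 3.0]            # hypothetical channel LLRs

# Candidate flips only one unreliable position -> accept, skip soft decoding:
print(accept_hard_decision([0, 1, 1, 0], llrs, threshold=0.5))  # True

# Candidate flips a highly reliable position -> reject, run soft decoding:
print(accept_hard_decision([1, 1, 1, 0], llrs, threshold=0.5))  # False
```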
We propose and apply a requirements engineering approach that focuses on security and privacy properties and takes into account various stakeholder interests. The proposed methodology facilitates the integration of security and privacy by design into the requirements engineering process. Thus, specific, detailed security and privacy requirements can be implemented from the very beginning of a software project. The method is applied to an exemplary application scenario in the logistics industry. The approach includes the application of threat and risk rating methodologies, a technique to derive technical requirements from legal texts, as well as a matching process to avoid duplication and accumulate all essential requirements.
Cardiovascular diseases are directly or indirectly responsible for up to 38.5% of all deaths in Germany and thus represent the most frequent cause of death. At present, heart diseases are mainly discovered by chance during routine visits to the doctor or when acute symptoms occur. However, there is no practical method to proactively detect diseases or abnormalities of the heart in the daily environment and to take preventive measures for the person concerned. Long-term ECG devices, as currently used by physicians, are simply too expensive, impractical, and not widely available for everyday use. This work aims to develop an ECG device suitable for everyday use that can be worn directly on the body. For this purpose, an already existing hardware platform will be analyzed, and the corresponding potential for improvement will be identified. A precise picture of the existing data quality is obtained by metrological examination, and corresponding requirements are defined. Based on these identified optimization potentials, a new ECG device is developed. The revised ECG device is characterized by a high integration density and combines all components directly on one board except the battery and the ECG electrodes. The compact design allows the device to be attached directly to the chest. An integrated microcontroller allows digital signal processing without the need for an additional computer. Central features of the evaluation are a peak detection for detecting R-peaks and a calculation of the current heart rate based on the RR interval. To ensure the validity of the detected R-peaks, a model of the anatomical conditions is used. Thus, unrealistic RR-intervals can be excluded. The wireless interface allows continuous transmission of the calculated heart rate. Following the development of hardware and software, the results are verified, and appropriate conclusions about the data quality are drawn. 
As a result, a very compact and wearable ECG device with different wireless technologies, data storage, and evaluation of RR intervals was developed. Some tests yielded runtimes of up to 24 hours with wireless LAN activated and streaming enabled.
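The plausibility check on RR intervals described above can be sketched as a simple bounds test before converting an interval into a heart rate (the bounds and values here are illustrative assumptions, not the device's actual anatomical model):

```python
def heart_rate_bpm(rr_ms, lo_ms=300, hi_ms=2000):
    """Heart rate from a single RR interval (ms). Intervals outside a
    physiologically plausible window are rejected as artifacts.
    The window [300 ms, 2000 ms] (30-200 bpm) is an illustrative choice."""
    if not (lo_ms <= rr_ms <= hi_ms):
        return None                 # implausible beat -> discard
    return 60000.0 / rr_ms          # 60,000 ms per minute / interval length

print(heart_rate_bpm(800))   # 75.0
print(heart_rate_bpm(120))   # None (would imply 500 bpm -> artifact)
```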