Totally nonnegative matrices, i.e., matrices having all their minors nonnegative, and matrix intervals with respect to the checkerboard partial order are considered. It is proven that if the two bound matrices of such a matrix interval are totally nonnegative and satisfy certain conditions, then all matrices from this interval are also totally nonnegative and satisfy the same conditions.
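A brute-force check of total nonnegativity follows directly from the definition: enumerate all square submatrices and verify that each minor is nonnegative. The sketch below is a naive illustration of the definition (exponential in the matrix size, not one of the recognition algorithms from the literature):

```python
import numpy as np
from itertools import combinations

def is_totally_nonnegative(A, tol=1e-12):
    """Return True if every minor (determinant of a square submatrix)
    of A is nonnegative, up to a numerical tolerance."""
    n, m = A.shape
    for k in range(1, min(n, m) + 1):
        for rows in combinations(range(n), k):
            for cols in combinations(range(m), k):
                if np.linalg.det(A[np.ix_(rows, cols)]) < -tol:
                    return False
    return True
```

For example, the upper bidiagonal matrix [[1, 1], [0, 1]] is totally nonnegative, while the permutation matrix [[0, 1], [1, 0]] is not, since its determinant is -1.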
Let A = [a_ij] be a real symmetric matrix. If f : (0, ∞) → [0, ∞) is a Bernstein function, a sufficient condition for the matrix [f(a_ij)] to have only one positive eigenvalue is presented. By using this result, new results for a symmetric matrix with exactly one positive eigenvalue, e.g., properties of its Hadamard powers, are derived.
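The flavour of this result can be illustrated numerically with the Bernstein function f(x) = √x. The matrix M with entries m_ij = i + j has positive entries, rank two, and exactly one positive eigenvalue, and applying f entrywise preserves this property. This is only a numerical illustration of the kind of statement described above, not the general sufficient condition itself:

```python
import numpy as np

# m_ij = i + j for i, j = 1..3: positive entries, exactly one positive eigenvalue
M = np.array([[2.0, 3.0, 4.0],
              [3.0, 4.0, 5.0],
              [4.0, 5.0, 6.0]])

# entrywise application of the Bernstein function f(x) = sqrt(x)
B = np.sqrt(M)

for X in (M, B):
    eig = np.linalg.eigvalsh(X)
    print((eig > 1e-9).sum())  # number of positive eigenvalues
```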
In this thesis, the recognition problem and the properties of eigenvalues and eigenvectors of matrices which are strictly sign-regular of a given order, i.e., matrices whose minors of a given order have the same strict sign, are considered. The results are extended to matrices which are sign-regular of a given order, i.e., matrices whose minors of a given order have the same sign or are allowed to vanish. As a generalization, a new type of matrix, called oscillatory of a specific order, is introduced, and the properties of this type are investigated. Also, some applications to dynamic systems are given.
Soft-input decoding of concatenated codes based on the Plotkin construction and BCH component codes
(2020)
Low latency communication requires soft-input decoding of binary block codes with small to medium block lengths.
In this work, we consider generalized multiple concatenated (GMC) codes based on the Plotkin construction. These codes are similar to Reed-Muller (RM) codes. In contrast to RM codes, BCH codes are employed as component codes. This leads to improved code parameters. Moreover, a decoding algorithm is proposed that exploits the recursive structure of the concatenation. This algorithm enables efficient soft-input decoding of binary block codes with small to medium lengths. The proposed codes and their decoding achieve significant performance gains compared with RM codes and recursive GMC decoding.
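The recursive structure can be sketched as follows. A Plotkin codeword has the form (u, u + v); given channel LLRs for both halves, v is decoded first from a soft combination of the halves, then u from their v-corrected sum. The sketch below uses brute-force soft ML decoders over toy component codebooks (a repetition code and a single-parity-check code standing in for the paper's BCH components) and the min-sum LLR combination; it illustrates the recursion, not the authors' decoder:

```python
import numpy as np

def ml_soft_decode(llr, codebook):
    # brute-force soft-decision ML: pick the codeword whose BPSK image
    # (1 - 2c) has maximal correlation with the LLRs
    scores = [np.dot(1 - 2 * c, llr) for c in codebook]
    return codebook[int(np.argmax(scores))]

def decode_plotkin(llr, cb_u, cb_v):
    # codeword structure: (u | u + v)
    half = len(llr) // 2
    l1, l2 = llr[:half], llr[half:]
    # v = u XOR (u + v): min-sum soft combination of the two halves
    lv = np.sign(l1) * np.sign(l2) * np.minimum(np.abs(l1), np.abs(l2))
    v = ml_soft_decode(lv, cb_v)
    # with v known, both halves carry soft information about u
    lu = l1 + (1 - 2 * v) * l2
    u = ml_soft_decode(lu, cb_u)
    return np.concatenate([u, (u + v) % 2])

# toy components of length 4: single-parity-check code for u, repetition for v
cb_u = np.array([c for c in np.ndindex(2, 2, 2, 2) if sum(c) % 2 == 0])
cb_v = np.array([[0, 0, 0, 0], [1, 1, 1, 1]])
```

The resulting length-8 code is equivalent to RM(1, 3) with minimum distance 4, so with ideal LLRs the decoder recovers a codeword after a single flipped hard decision.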
For a long time, the use of intermediate products in production has been growing more rapidly in most countries than domestic production. This is a strong indication of increasing interdependency in production. The main purpose of input–output analysis is to study the interdependency of industries in an economy; often the term interindustry analysis is also used. The exchange of intermediate products is therefore a key issue of input–output analysis. We use input–output data for this study that the author prepared for the new 'Handbook on Supply, Use and Input–Output Tables with Extensions and Applications' of the United Nations. The supply, use and input–output tables contain separate valuation matrices for trade margins, transport margins, value added tax, other taxes on products and subsidies on products. For the study, two input–output models were developed to evaluate the impact of fuel subsidy and taxation reform on output, gross domestic product, inflation and trade. Six scenarios are discussed covering different aspects of the reform.
The evaluation of the effectiveness of different machine learning algorithms on a publicly available database of signals derived from wearable devices is presented, with the goal of optimizing human activity recognition and classification. From the wide range of body signals, we chose two that are easy to acquire simultaneously with widespread commercial devices (e.g. smartwatches) as well as with custom wearable wireless devices designed for sport, healthcare, or clinical purposes: photoplethysmographic signals (optically detected subcutaneous blood volume) and tri-axial acceleration signals. To this end, two widely used algorithms (decision tree and k-nearest neighbor) were tested, and their performance was compared to that of two recent algorithms (particle Bernstein and a Monte Carlo-based regression), both in terms of accuracy and processing time. A data preprocessing phase was also included to improve the performance of the machine learning procedures and to reduce the problem size, and a detailed analysis of the compression strategy and its results is presented.
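To make the classification setting concrete, here is a minimal k-nearest-neighbor sketch on synthetic two-feature data (mean acceleration magnitude and a PPG-derived pulse rate, invented for illustration; the study uses a public database and also evaluates decision trees, particle Bernstein and Monte Carlo-based regression):

```python
import numpy as np

rng = np.random.default_rng(1)
# synthetic features per window: (mean |acceleration| in g, pulse rate in bpm)
walk = rng.normal([1.0, 90.0], [0.2, 5.0], size=(100, 2))
rest = rng.normal([0.1, 60.0], [0.05, 5.0], size=(100, 2))
X = np.vstack([walk, rest])
y = np.array([1] * 100 + [0] * 100)   # 1 = walking, 0 = resting

def knn_predict(X_train, y_train, query, k=5):
    """Majority vote among the k training samples closest to the query."""
    dist = np.linalg.norm(X_train - query, axis=1)
    nearest = y_train[np.argsort(dist)[:k]]
    return int(np.bincount(nearest).argmax())
```

A query with high movement and pulse rate lands in the walking class, a calm one in the resting class.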
Deep neural networks (DNNs) are known for their high prediction performance, especially in perceptual tasks such as object recognition or autonomous driving. Still, DNNs are prone to yield unreliable predictions when encountering completely new situations without indicating their uncertainty. Bayesian variants of DNNs (BDNNs), such as MC dropout BDNNs, do provide uncertainty measures. However, BDNNs are slow during test time because they rely on a sampling approach. Here we present a single shot MC dropout approximation that preserves the advantages of BDNNs without being slower than a DNN. Our approach is to analytically approximate for each layer in a fully connected network the expected value and the variance of the MC dropout signal. We evaluate our approach on different benchmark datasets and a simulated toy example. We demonstrate that our single shot MC dropout approximation resembles the point estimate and the uncertainty estimate of the predictive distribution that is achieved with an MC approach, while being fast enough for real-time deployments of BDNNs.
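The core idea, analytically propagating the first two moments of the dropout signal instead of sampling, can be sketched for a single fully connected layer. With inverted dropout (mask m ∈ {0, 1/(1 − p)}), E[m] = 1 and Var[m] = p/(1 − p), so for a deterministic input x the pre-activations have a closed-form mean and variance. A minimal numpy illustration (not the authors' implementation), checked against MC sampling:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.3                                   # dropout rate
x = rng.normal(size=5)                    # deterministic layer input
W = rng.normal(size=(4, 5))
b = rng.normal(size=4)

# single-shot moments: the inverted-dropout mask has E[m] = 1, Var[m] = p/(1-p)
mean_analytic = W @ x + b
var_analytic = (W ** 2) @ (x ** 2 * p / (1 - p))

# Monte Carlo reference with explicit dropout sampling
samples = 200_000
masks = rng.binomial(1, 1 - p, size=(samples, 5)) / (1 - p)
ys = (masks * x) @ W.T + b
```

In a deeper network with nonlinearities, these moments would have to be propagated from layer to layer, which is what the paper's approximation does.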
The few references in the literature on sorption isotherms of mineral screeds relate essentially to calcium sulphate screeds and standardized cement screeds, and as a rule only to one fixed air temperature (20 °C). The aim of the investigation described in this paper was therefore to characterize the moisture properties of screeds under different climatic conditions by means of sorption isotherms. In addition, the ternary rapid cements that have been on the market for about 20 years were to be included, as well as the temperatures of 15 °C and 25 °C, which are of practical interest on construction sites. The effects of the climatic conditions on site (season, air humidity, temperature) on the hydration process of the screeds were also investigated. In combination with the results of the microstructural investigations (including mercury porosimetry), it is demonstrated why cement- and rapid-cement-bound screeds behave completely differently from calcium-sulphate-bound screeds. This different behaviour is also one of the reasons why the moisture content of screeds cannot be assessed with the KRL method. A comparison of the "KRL method" of material moisture measurement with the "CM method", which is customary in the trade and has proven itself in practice for decades, therefore follows.
Ein Beitrag zum Beobachterentwurf und zur sensorlosen Folgeregelung translatorischer Magnetaktoren
(2020)
Ballistocardiography is a technique that measures the heart rate from the mechanical vibrations of the body caused by the heart's movement. In this work, a novel noninvasive device placed under the mattress of a bed estimates the heart rate using ballistocardiography. Different algorithms for heart rate estimation have been developed.
Globalization has increased the number of road trips and vehicles. The result has been an intensification of traffic accidents, which are becoming one of the most important causes of death worldwide. Traffic accidents are often due to human error, the probability of which increases when the cognitive ability of the driver decreases. Cognitive capacity is closely related to the driver’s mental state, as well as other external factors such as the CO2 concentration inside the vehicle. The objective of this work is to analyze how these elements affect driving. We have conducted an experiment with 50 drivers who have driven for 25 min using a driving simulator. These drivers completed a survey at the start and end of the experiment to obtain information about their mental state. In addition, during the test, their stress level was monitored using biometric sensors and the state of the environment (temperature, humidity and CO2 level) was recorded. The results of the experiment show that the initial level of stress and tiredness of the driver can have a strong impact on stress, driving behavior and fatigue produced by the driving test. Other elements such as sadness and the conditions of the interior of the vehicle also cause impaired driving and affect compliance with traffic regulations.
Die Nibelungenbrücke Worms
(2020)
Probabilistic Deep Learning
(2020)
Probabilistic Deep Learning is a hands-on guide to the principles that support neural networks. Learn to improve network performance with the right distribution for different data types, and discover Bayesian variants that can state their own uncertainty to increase accuracy. This book provides easy-to-apply code and uses popular frameworks to keep you focused on practical applications.
The IETF, concerned with the evolution of the Internet architecture, nowadays also looks into industrial automation processes. The contributions of a variety of IETF activities initiated during the last ten years now enable the replacement of proprietary standards by an open, standardized protocol stack. This stack, denoted in the following as the 6TiSCH stack, is tailored for the industrial internet of things (IIoT). The suitability of the 6TiSCH stack for Industry 4.0 has yet to be explored. In this paper, we identify four challenges that, in our opinion, may delay or hinder its adoption. As a prime example, we focus on the initial 6TiSCH network formation, highlighting the shortcomings of the default procedure and introducing our current work towards fast and reliable formation of dense networks.
This document presents a new, complete standalone system for the recognition of sleep apnea using signals from pressure sensors placed under the mattress. The developed hardware part of the system is tuned to filter and amplify the signal. Its software part performs more accurate signal filtering and the identification of apnea events. The overall achieved accuracy of the recognition of apnea occurrence is 91%, with an average measured recognition delay of about 15 seconds, which confirms the suitability of the proposed method for future employment. The main aim of the presented approach is to support the healthcare system with a cost-efficient tool for the recognition of sleep apnea in the home environment.
The recovery of our body and brain from fatigue directly depends on the quality of sleep, which can be determined from the results of a sleep study. The classification of sleep stages is the first step of this study and includes the measurement of vital data and their further processing. The non-invasive sleep analysis system is based on a hardware sensor network of 24 pressure sensors providing sleep phase detection. The pressure sensors are connected to an energy-efficient microcontroller via a system-wide bus. A significant difference between this system and other approaches is the innovative way in which the sensors are placed under the mattress. This feature facilitates the continuous use of the system without any noticeable influence on the sleeping person. The system was tested by conducting experiments that recorded the sleep of various healthy young people. Results indicate the potential to capture respiratory rate and body movement.
Good sleep is crucial for everyone's health. Unfortunately, its quality often decreases with aging. A common approach to measuring sleep characteristics is based on interviews with the subjects or letting them fill in a daily questionnaire and afterwards evaluating the obtained data. However, this method entails time and personnel costs for the interviewer and the evaluator of the responses. It would therefore be important to collect and evaluate sleep characteristics automatically. To do that, it is necessary to investigate the level of agreement between measurements performed in the traditional way using questionnaires and measurements obtained using electronic monitoring devices. The study presented in this manuscript performs this investigation, comparing such sleep characteristics as "time going to bed", "total time in bed", "total sleep time" and "sleep efficiency". A total of 106 night records of elderly persons (aged 65+) were analyzed. The results achieved so far reveal that the degree of agreement between the two measurement methods varies substantially for different characteristics, from 31 minutes of mean difference for "time going to bed" to 77 minutes for "total sleep time". For this reason, a direct exchange of objective and subjective measuring methods is currently not possible.
Sleep apnea is a common sleep disorder with various effects on our everyday life; for example, daytime sleepiness has been reported for about 25% of patients with obstructive sleep apnea (OSA). The aim of this work is the development of a system that enables non-invasive detection of sleep apnea in the home environment.
Polysomnography is the gold standard for a sleep study and provides very accurate results, but its costs (both personnel and material) are quite high. Therefore, the development of a low-cost system for overnight breathing and heartbeat monitoring, which provides more comfort while recording the data, is a well-motivated challenge. The system proposed in this manuscript is based on resistive pressure sensors installed under the mattress. These sensors can measure the slight pressure changes provoked by breathing and heartbeat. The captured signal requires advanced processing, such as filtering and amplification, before the analog signal is ready for the next step. The output signal is then digitalized and further processed by an algorithm that performs custom filtering before it can recognize breathing and heart rate in real time. The result can be directly visualized. Furthermore, a CSV file is created containing the raw data, timestamps, and unique IDs to facilitate further processing. The achieved results are promising, and the average deviation from a reference device is about 4 bpm.
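The kind of processing involved can be sketched on a synthetic pressure signal: a strong breathing oscillation with a weak heartbeat ripple on top, separated by a band-pass filter before peak counting. This is an illustrative toy pipeline with made-up frequencies and filter settings, not the system's actual custom filtering:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, find_peaks

fs = 100.0                                 # sampling rate in Hz
t = np.arange(0.0, 30.0, 1.0 / fs)
# synthetic mattress signal: 0.25 Hz breathing + weak 1.2 Hz heartbeat (72 bpm)
signal = 1.0 * np.sin(2 * np.pi * 0.25 * t) + 0.1 * np.sin(2 * np.pi * 1.2 * t)

# a 0.8-2.0 Hz band-pass isolates the heartbeat component from breathing
sos = butter(4, [0.8, 2.0], btype="bandpass", fs=fs, output="sos")
heart = sosfiltfilt(sos, signal)

# count heartbeat peaks with a ~0.4 s refractory period and convert to bpm
peaks, _ = find_peaks(heart, distance=0.4 * fs)
bpm = len(peaks) * 60.0 / t[-1]
```

On this clean synthetic signal the estimate lands close to the simulated 72 bpm; real under-mattress signals additionally need artifact handling and adaptive thresholds.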
We provide an overview of the ongoing discussions on the objectives of the energy transition in the form of a conceptual framework, intending to facilitate the search for the most viable options for a successful transformation of the energy system. For this purpose, we examine the development of energy policy goals in Germany in the past and present, whereby we give an overview of objectives and assessment approaches from politics, economics, and science. Moreover, we then merge the different views into a common framework and analyze the central conflict between the wholeness of a hypothetical target circle and the simplification in favor of a hypothetical target point in more detail.
What drives entrepreneurial action to create a lasting impact? The creation of new ventures that aim at having an impact beyond their financial performance faces additional challenges: achieving economic sustainability while at the same time addressing social or environmental issues. Little is known about how these new hybrid organizations, aiming for multiple impact dimensions, manage to be congruent with their blended values. A dataset of 4,125 early-stage ventures is used to gain insights into how blended values are converted into financial, social and environmental impacts, giving shape to different types of hybrid organizations. Our findings suggest new hybrid organizations might opt to sacrifice financial impact to achieve social impact, yet this is not the case when they aim to generate environmental or sustainable impact. Therefore, the tensions and sacrifices related to holding blended values are not homogeneous across all types of new hybrid organizations.
Three-dimensional ship localization with only one camera is a challenging task due to the loss of depth information caused by perspective projection. In this paper, we propose a method to measure distances based on the assumption that ships lie on a flat surface. This assumption allows depth to be recovered from a single image using the principle of inverse perspective. For the 3D ship detection task, we use a hybrid approach that combines image detection with a convolutional neural network, camera geometry and inverse perspective. Furthermore, a novel calculation of object height is introduced. Experiments show that the monocular distance computation works well in comparison to a Velodyne lidar. Due to its robustness, this could be an easy-to-use baseline method for detection tasks in navigation systems.
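The flat-surface assumption reduces monocular ranging to a one-line formula: for a pinhole camera mounted at height H with a horizontal optical axis, a surface point that projects to image row y below the principal point c_y lies at depth Z = f·H/(y − c_y), with f the focal length in pixels. A small sketch with invented camera parameters (the paper combines this with CNN detections and a novel height computation):

```python
def ground_plane_depth(y, f=800.0, c_y=360.0, cam_height=10.0):
    """Depth of a flat-surface point from its image row, assuming a pinhole
    camera with horizontal optical axis; y is measured downwards in pixels."""
    if y <= c_y:
        raise ValueError("row is at or above the horizon; no surface intersection")
    return f * cam_height / (y - c_y)

# forward projection for a sanity check: a surface point at depth Z maps to
# row y = c_y + f * cam_height / Z, and inverting that row recovers Z
```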
Compliance is on everyone's lips, not least thanks to the emissions scandal in the automotive industry. Numerous further, often spectacular corporate scandals of recent years (above all involving cartel violations, corruption, fraud and accounting offences) have triggered a worldwide wave of regulation and stricter enforcement of existing legal and regulatory requirements. Companies are therefore increasingly looking for ways to prevent legally sanctionable misconduct in the corporate sphere, not least in order to avoid liability for the company, its governing bodies and its employees. It is important to recognize that, beyond the legal liability risk (fines and administrative penalties, forfeiture and disgorgement of profits, prison sentences), this also means avoiding the economic consequences of non-compliance. This economic liability takes the form of reputational damage and can manifest itself concretely in, for example, falling share prices, the termination of business relationships or exclusion from public tenders, as well as in problems recruiting top personnel or the resignation of managers and employees. Added to this are the costs of investigating detected cases of non-compliance (costs for external lawyers, auditors, etc.), the costs of eliminating the causes (e.g. closing gaps in the internal control system: so-called "remediation") and the costs caused by the absorption of management and employee capacities in dealing with a case of non-compliance.
To answer the question of whether compliance management is necessary for small and medium-sized enterprises right away: yes, medium-sized companies are also well advised to engage with compliance management. The necessity of compliance management, understood as ensuring compliance, i.e. adherence to statutory regulations, soft law, and internal rules and standards of conduct, follows from the very responsibility of company management to protect the company from threats and to secure its continued existence and its cooperative relationships with the various stakeholder groups in the long term. In other words: compliance management is risk management and thus of strategic importance for responsible corporate management (good corporate governance), in large companies as well as in medium-sized ones. Only the question of "how" compliance management should be implemented in medium-sized companies is a legitimate one.
Compliance Governance
(2020)
Compliance governance is of fundamental importance for the functioning and effectiveness of a compliance management system (CMS) and, connected with this, for good and responsible corporate governance. This refers to suitable compliance structures and a positive compliance culture, which move into the centre of attention. This paper examines the topic from different perspectives. First, Stephan Grüninger defines the term "compliance governance" (1.3.1) and, building on this, works out the essential principles of effective compliance structures and culture from a business administration and organizational theory perspective (1.3.2). Control and oversight play an important role in good compliance governance, so Roland Steinmeyer subsequently (1.3.3) describes, from a legal perspective, the responsibilities of the supervisory board within compliance management (CM). Finally (1.3.4), Christian Strenger provides insights into the practice of compliance oversight and names the requirements supervisory boards must fulfil in exercising their monitoring function, as well as the challenges they face in doing so.
At the latest since the so-called "emissions scandal", which has already cost the Volkswagen Group billions in fines and damages, the topics of compliance and value-based entrepreneurial conduct have been more present than ever, including in medium-sized companies and among consumers.
In June 2018, the public prosecutor's office in Braunschweig imposed a fine of one billion euros on Volkswagen in connection with the manipulation of emission values. At the beginning of 2017, VW reached a settlement with the US Department of Justice providing for the payment of 4.3 billion US dollars (2.8 billion dollars in criminal penalties, 1.5 billion dollars in civil fines). Further settlements with customers and car dealers in the USA bring the total costs for penalties, fines and settlement payments to around 25 billion dollars. In addition, further costs from the diesel affair are to be expected, arising essentially from capital market law (possibly omitted ad hoc disclosures), claims for damages, engine retrofits and possibly further penalties in other jurisdictions. Added to this are the costs of investigating the events and of the so-called "remediation" of the compliance management system (CMS), which must be attributed to the total loss, as must the internal costs (process improvements, management attention, etc.).
The "diesel scandal" and its consequences thus impressively demonstrate the relevance of compliance.
"All organizations are inherently criminogenic." Even though this statement may sound exaggerated, recent studies show that more than half of the managers in Germany have already been confronted with unethical behaviour in companies. Current scandals in the automotive industry (the so-called "emissions scandal") or the financial industry (most recently the so-called "cum-ex" and "cum-cum" transactions) moreover seem to demonstrate that even organizations with an established compliance management are repeatedly left merely reacting to delinquent acts. The prevention of economic crime and immoral behaviour, by contrast, regularly fails, despite compliance management systems (CMS) that are in some cases elaborately designed.
40 Jahre Neuland des Denkens
(2020)
Frederic Vester's principal work "Neuland des Denkens" was published 40 years ago. This article examines the main themes of this programmatic book with regard to Vester's biocybernetics and its application to numerous current questions in the sustainability debate, e.g. climate change and the energy transition.
Due to the corona pandemic, the use of home office and teleworking increased in 2020. Path dependencies exist both with regard to the choice of workplace and to the communication technologies used. This article systematically addresses these path dependencies, in particular their causes and consequences as well as the possibilities for breaking them.
Uncertainty about the future requires companies to create discontinuous innovations. Established companies, however, struggle to do so, whereas independent startups seem to cope with this better. Consequently, established companies set up entrepreneurial initiatives to make use of startups' benefits. Among other things, this has led to great interest in so-called corporate entrepreneurship (CE) programs and to the development and characterization of several different forms. Yet the processes by which these programs pursue their objectives are still rather ineffective. Considering the actions performed in preparation for and during CE programs could be one approach to improving this, but such considerations are still absent today. Furthermore, the increasing use of several CE programs in parallel seems to bear the potential for synergies and thus a more efficient use of resources. Aiming to provide insights into both issues, this study analyzes the actions of CE programs by looking at interviews with managers of seven corporate incubator and accelerator programs of five established German tech companies.
The ageing infrastructure in ports requires regular inspection. This inspection is currently carried out manually by divers who sense by hand the entire underwater infrastructure. This process is cost-intensive as it involves a lot of time and human resources. To overcome these difficulties, we propose to scan the above- and underwater port structure with a multi-sensor system and, by a fully automated process, to classify the obtained point cloud into damaged and undamaged zones. We make use of simulated training data to test our approach, since not enough training data with corresponding class labels are available yet. To that aim, we build a rasterised heightfield of a point cloud of a sheet pile wall by cutting it into vertical slices. The distance from each slice to the corresponding line generates the heightfield. The latter is propagated through a convolutional neural network which detects anomalies. We use the VGG19 deep neural network model pretrained on natural images. This neural network has 19 layers and is often used for image recognition tasks. We showed that our approach can achieve a fully automated, reproducible, quality-controlled damage detection which is able to analyse the whole structure, instead of the sample-wise manual method with divers. The mean true positive rate is 0.98, which means that we detected 98% of the damages in the simulated environment.
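The rasterisation step can be sketched as follows: points of the wall cloud are binned into a 2D grid over the wall plane, and each cell stores the mean out-of-plane deviation, yielding a heightfield image in which bumps and holes stand out. A simplified numpy version (the project fits a reference line per vertical slice; here the nominal wall plane is simply x = 0):

```python
import numpy as np

def heightfield(points, y_edges, z_edges):
    """Rasterise a wall point cloud (columns x, y, z, where x is the
    out-of-plane deviation) into a grid of mean deviations per (y, z) cell."""
    hf = np.zeros((len(y_edges) - 1, len(z_edges) - 1))
    for i in range(len(y_edges) - 1):
        for j in range(len(z_edges) - 1):
            in_cell = ((points[:, 1] >= y_edges[i]) & (points[:, 1] < y_edges[i + 1]) &
                       (points[:, 2] >= z_edges[j]) & (points[:, 2] < z_edges[j + 1]))
            if in_cell.any():
                hf[i, j] = points[in_cell, 0].mean()
    return hf
```

On a synthetic flat wall with a localized bump, the cell containing the bump dominates the heightfield; in the project such images are then fed to the CNN-based anomaly detector.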
At present, the majority of the proposed Deep Learning (DL) methods provide point predictions without quantifying the model's uncertainty. However, a quantification of the reliability of automated image analysis is essential, in particular in medicine, where physicians rely on the results for making critical treatment decisions. In this work, we provide an entire framework to diagnose ischemic stroke patients incorporating Bayesian uncertainty into the analysis procedure. We present a Bayesian Convolutional Neural Network (CNN) yielding a probability for a stroke lesion on 2D Magnetic Resonance (MR) images with corresponding uncertainty information about the reliability of the prediction. For patient-level diagnoses, different aggregation methods are proposed and evaluated, which combine the individual image-level predictions. Those methods take advantage of the uncertainty in the image predictions and report model uncertainty at the patient level. In a cohort of 511 patients, our Bayesian CNN achieved an accuracy of 95.33% at the image level, representing a significant improvement of 2% over a non-Bayesian counterpart. The best patient aggregation method yielded 95.89% accuracy. Integrating uncertainty information about image predictions in aggregation models resulted in higher uncertainty measures for false patient classifications, which made it possible to filter out critical patient diagnoses that should be examined more closely by a medical doctor. We therefore recommend using Bayesian approaches not only for improved image-level prediction and uncertainty estimation but also for the detection of uncertain aggregations at the patient level.
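One simple instance of such an aggregation is to average a patient's image-level lesion probabilities and report the predictive entropy of the result as a patient-level uncertainty score; uncertain patients can then be flagged for closer examination. This is a deliberately reduced sketch of the idea, not one of the aggregation models evaluated in the paper:

```python
import numpy as np

def aggregate_patient(image_probs, flag_threshold=0.6):
    """Average image-level lesion probabilities; return the patient-level
    probability, its predictive entropy (in nats), and a review flag."""
    p = float(np.mean(image_probs))
    eps = 1e-12                            # guards log(0)
    entropy = -(p * np.log(p + eps) + (1 - p) * np.log(1 - p + eps))
    return p, entropy, entropy > flag_threshold
```

A patient whose images all agree yields low entropy and is not flagged, while conflicting image predictions drive the entropy towards log 2 and trigger the flag.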
Modeling a suitable birth density is a challenge when using Bernoulli filters such as the Labeled Multi-Bernoulli (LMB) filter. The birth density of newborn targets is unknown in most applications, but must be given as a prior to the filter. Usually the birth density stays unchanged or is designed based on the measurements from previous time steps.
In this paper, we assume that the true initial state of new objects is normally distributed. The expected value and covariance of the underlying density are unknown parameters. Using the estimated multi-object state of the LMB and the Rauch-Tung-Striebel (RTS) recursion, these parameters are recursively estimated and adapted after a target is detected.
The main contribution of this paper is an algorithm to estimate the parameters of the birth density and its integration into the LMB framework. Monte Carlo simulations are used to evaluate the detection driven adaptive birth density in two scenarios. The approach can also be applied to filters that are able to estimate trajectories.
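The recursive parameter estimation can be sketched with a Welford-style online update of the mean and covariance of detected initial states (a generic illustration of the recursion; the paper combines it with RTS-smoothed LMB estimates):

```python
import numpy as np

class BirthDensityEstimator:
    """Online estimate of the mean and covariance of newborn-target states."""

    def __init__(self, dim):
        self.n = 0
        self.mean = np.zeros(dim)
        self._m2 = np.zeros((dim, dim))   # sum of outer products of deviations

    def update(self, x):
        """Incorporate one smoothed initial-state estimate of a detected target."""
        x = np.asarray(x, dtype=float)
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self._m2 += np.outer(delta, x - self.mean)

    def cov(self):
        return self._m2 / max(self.n - 1, 1)
```

After each confirmed track, `update` refines the birth prior that is handed to the LMB filter at the next time step.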
Seamless-Learning-Plattform
(2020)
This paper presents the goals, service design approach, and the results of the project “Accessible Tourism around Lake Constance”, which is currently run by different universities, industrial partners and selected hotels in Switzerland, Germany and Austria. In the first phase, interviews with different persons with disabilities and elderly persons were conducted to identify the barriers and pains faced by tourists who want to spend their holidays in the region of Lake Constance, as well as possible assistive technologies that help to overcome these barriers. The analysis of the interviews shows that one third of the pains and barriers are due to missing, insufficient, wrong or inaccessible information about the accessibility of the accommodation, surroundings, and points of interest during the planning phase of the holidays. Digital assistive technologies hence play a major role in bridging this information gap. In the second phase, so-called Hotel Living Labs (HLL) were established in which the identified assistive technologies can be evaluated. Based on these HLLs, an overall service for accessible holidays has been designed and developed. In the last phase, this service was implemented based on the HLLs as well as the identified assistive technologies and is currently being field-tested with tourists with disabilities from the three participating countries.
Der digitale Seilüberwacher
(2020)
In modern fruit processing technology, non-destructive quality measuring techniques are sought for determining and controlling changes in the optical, structural, and chemical properties of the products. In this context, changes inside the product can be measured during processing. Especially for industrial use, fast, precise, but robust methods are particularly important to obtain high-quality products. In this work, a newly developed multi-spectral imaging system was implemented and adapted for drying processes. Further, it was investigated whether the system could be used to link changes in the surface spectral reflectance during mango drying with changes in moisture content and contents of chemical components. This was achieved by recovering the spectral reflectance from multi-spectral image data and comparing the spectral changes with changes of the total soluble solids (TSS), pH value and the relative moisture content x_wb of the products. In a first step, the camera was modified to be used in drying; then the changes in the spectra and quality criteria during mango drying were measured. For this, mango slices were dried at air temperatures of 40–80 °C and relative air humidities of 5%–30%. Samples were analyzed and pictures were taken with the multi-spectral imaging system. The quality criteria were then predicted from spectral data. It could be shown that the newly developed multi-spectral imaging system can be used for quality control in fruit drying. There are strong indications as well that it can be employed for the prediction of chemical quality criteria of mangoes during drying. This way, quality changes can be monitored inline during the process using only one single measuring device.
Location-aware mobile devices are becoming increasingly popular, and GPS sensors are built into nearly every portable unit with computational capabilities. At the same time, the emergence of location-aware virtual services calls for new, efficient spatial real-time queries. Communication latency in highly decentralized mobile environments and the need for scalability in high-density systems with immense client counts lead to major challenges. In this paper we describe a decentralized architecture for continuous range queries in settings in which both the requested and the requesting clients are mobile. While prior works commonly use a request-response approach, we provide a stream-based adaptive grid solution that handles arbitrarily high client counts and improves communication latency so that given hard real-time constraints are met.
In 1863, the popular author Wilhelm Busch wrote the story of Max and Moritz, two rascals who violate the rules and customs of their village community and severely disrupt communal life. Busch points to the core of the problem right in the introduction to his verse tale: "Ah, how often one must hear or read of wicked children! As, for example, of these two, who were called Max and Moritz;
who, instead of turning to the good through wise teachings, often laughed at them and secretly made fun of them. Yes, for mischief, yes, for that one is ready!" Max and Moritz refuse to attend school and to behave decently and by the rules. Their obstinacy and recklessness end fatally for both of them, and only with this ending does peace return to the village. At first glance, the connection between Wilhelm Busch's well-known children's story "Max und Moritz" and current economic and corporate scandals may seem somewhat far-fetched. As a "modern fairy tale", however, this story and its moral ultimately speak of exactly what troubles companies worldwide today. Without wanting to equate the famous boyish pranks with forms of modern white-collar crime, the following can be derived from the seven pranks for our topic:
– Rule-breaking behaviour can surface anywhere
– Everyone suffers from the misconduct of individuals
– The functioning of the overarching system is lastingly disrupted
– And: classic attempts at "re-education" often prove ineffective
Schreiben im Studium
(2020)
Should academic writing be taught as a general language of scholarship, or should the linguistic requirements of the individual disciplines be taken into account? The answer to this question also depends on how much the academic languages of the disciplines differ. In the present study, features of academic language were examined on the basis of two corpora comprising German-language dissertations from business administration (BWL) and mechanical engineering (MB), in which the use of multi-word units (e.g. "im vergleich zu den") was compared. The analysis highlights how differently multi-word units are used in the two disciplines: only fourteen of the 50 most frequent multi-word units occurred in both corpora. The results are discussed with regard to the teaching of academic writing. Considerations for refining the method are presented, and it is discussed how corpus linguistics could support the teaching of writing at university.
Which testing formats are most appropriate for assessing academic language skills? Christian Krekeler addresses this question for informal language assessment in the classroom. The first part of the chapter examines three formats frequently used in standardised assessment (discrete items, isolated tests of writing, and integrated tests) and discusses their usefulness for classroom assessment, where diagnosis, achievement and learning are important test functions. Because academic language use is cognitively demanding, it is argued that the assessment of academic language skills should include authentic and complex tasks as well as subject-specific content. The resulting tensions between complex, authentic and context-sensitive tasks and the requirements of language assessment methodology are discussed in the second part of the chapter.
Pascal Laube presents machine learning approaches for three key problems of reverse engineering of defective structured surfaces: parametrization of curves and surfaces, geometric primitive classification and inpainting of high-resolution textures. The proposed methods aim to improve the reconstruction quality while further automating the process. The contributions demonstrate that machine learning can be a viable part of the CAD reverse engineering pipeline.