Reliability Assessment of an Unscented Kalman Filter by Using Ellipsoidal Enclosure Techniques
(2022)
The Unscented Kalman Filter (UKF) is widely used for the state, disturbance, and parameter estimation of nonlinear dynamic systems, for which both process and measurement uncertainties are represented in a probabilistic form. Although the UKF can often be shown to be more reliable for nonlinear processes than the linearization-based Extended Kalman Filter (EKF) due to the enhanced approximation capabilities of its underlying probability distribution, it is not a priori obvious whether its strategy for selecting sigma points is sufficiently accurate to handle nonlinearities in the system dynamics and output equations. Such inaccuracies may arise for sufficiently strong nonlinearities in combination with large state, disturbance, and parameter covariances. Then, computationally more demanding approaches such as particle filters or the representation of (multi-modal) probability densities with the help of (Gaussian) mixture representations are possible ways to resolve this issue. To systematically detect cases that are not reliably handled by a standard EKF or UKF, this paper proposes the computation of outer bounds for state domains that are compatible with a certain percentage of confidence under the assumption of normally distributed states with the help of a set-based ellipsoidal calculus. The practical applicability of this approach is demonstrated for the estimation of state variables and parameters for the nonlinear dynamics of an unmanned surface vessel (USV).
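The sigma-point selection strategy questioned in the abstract can be illustrated with a minimal sketch of the scaled unscented transform. This is a generic textbook construction (Julier/van der Merwe convention), not the paper's own implementation; the parameter names `alpha`, `beta`, and `kappa` are the common tuning parameters, assumed here for illustration.

```python
import numpy as np

def sigma_points(mean, cov, alpha=1e-3, beta=2.0, kappa=0.0):
    """Generate the 2n+1 sigma points of the scaled unscented transform.

    Illustrative sketch only; follows the common Julier/van der Merwe
    parameterization, not necessarily the paper's notation.
    """
    n = len(mean)
    lam = alpha**2 * (n + kappa) - n
    # Matrix square root of (n + lambda) * P via Cholesky decomposition
    S = np.linalg.cholesky((n + lam) * cov)
    pts = np.vstack([mean, mean + S.T, mean - S.T])
    # Weights for reconstructing mean and covariance from the points
    w_m = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))
    w_c = w_m.copy()
    w_m[0] = lam / (n + lam)
    w_c[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
    return pts, w_m, w_c
```

Propagating these points through the nonlinear dynamics and recombining them with the weights yields the UKF's approximation of the predicted mean and covariance; the paper's ellipsoidal enclosures serve as a check on how accurate that approximation actually is.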
Experimental Validation of Ellipsoidal Techniques for State Estimation in Marine Applications
(2022)
A reliable quantification of the worst-case influence of model uncertainty and external disturbances is crucial for the localization of vessels in marine applications. This is especially true if uncertain GPS-based position measurements are used to update predicted vessel locations that are obtained from the evaluation of a ship’s state equation. To reflect real-life working conditions, these state equations need to account for uncertainty in the system model, such as imperfect actuation and external disturbances due to effects such as wind and currents. As an application scenario, the GPS-based localization of autonomous DDboat robots is considered in this paper. Using experimental data, the efficiency of an ellipsoidal approach, which exploits a bounded-error representation of disturbances and uncertainties, is demonstrated.
Extended Target Tracking With a Lidar Sensor Using Random Matrices and a Virtual Measurement Model
(2022)
Random matrices are widely used to estimate the extent of an elliptically contoured object. Usually, it is assumed that the measurements follow a normal distribution, with its standard deviation being proportional to the object’s extent. However, the random matrix approach can filter the center of gravity and the covariance matrix of measurements independently of the measurement model. This work considers the whole chain from data acquisition to the linear Kalman Filter with extension estimation as a reference plant. The input is the (unknown) ground truth (position and extent). The output is the filtered center of gravity and the filtered covariance matrix of the measurement distribution. A virtual measurement model emulates the behavior of the reference plant. The input of the virtual measurement model is adapted using the proposed algorithm until the output parameters of the virtual measurement model match the result of the reference plant. After the adaptation, the input to the virtual measurement model is considered an estimation for position and extent. The main contribution of this paper is the reference model concept and an adaptation algorithm to optimize the input of the virtual measurement model.
The digital transformation of business processes and the increasing integration of IT systems create both opportunities and risks for small and medium-sized enterprises (SMEs) — risks that can result in particular from a lack of IT compliance. As studies show, however, SMEs lag behind capital-market-oriented companies with respect to IT compliance measures [1]. Using expert interviews and a qualitative data analysis, this paper investigates which measures are currently implemented and how the compliance maturity level is perceived. Furthermore, the reasons and motives that led to this state are discussed. Finally, drivers are identified that may lead to greater awareness in the future. The work offers interesting insights from practice, as the expert interviews provide a view of the current status quo with regard to IT compliance.
Business Partner Compliance
(2022)
In tomato drying, degradation in final quality may occur depending on the drying method used and the predrying preparation. Hence, this research was conducted to evaluate the effect of different predrying treatments on the physicochemical quality and drying kinetics of twin-layer solar-tunnel-dried tomato slices. During the experimental work, tomato slices of var. Galilea were used. As predrying treatments, 0.5% calcium chloride (CaCl2), 0.5% ascorbic acid (C6H8O6), 0.5% citric acid (C6H8O7), and 0.5% sodium chloride (NaCl) were used. The tomato samples were sliced to 5 mm thickness, soaked in the pretreatments for ten minutes, and dried in a twin-layer solar tunnel dryer under the weather conditions of Jimma, Ethiopia. Untreated samples were used as a control. The moisture losses from the samples were monitored by weighing samples from each treatment at 2 h intervals. SAS statistical software version 9.2 was used for analyzing data on the physicochemical quality of tomato slices in a completely randomized design (CRD) with three replications. From the experimental results, it was observed that dried tomato slices pretreated with 0.5% ascorbic acid gave the best retention of vitamin C and total phenolic content with a high sugar/acid ratio. Better retention of lycopene and faster drying were observed in dried tomato slices pretreated with 0.5% sodium chloride, and pretreating tomatoes with 0.5% citric acid resulted in better color values than the other treatments. Compared to the control, pretreating significantly preserved the overall quality of dried tomato slices and increased the moisture removal rate in the twin-layer solar tunnel dryer.
The scoring of sleep stages is an essential part of sleep studies. The main objective of this research is to provide an algorithm for the automatic classification of sleep stages using signals that may be obtained in a non-obtrusive way. After reviewing the relevant research, the authors selected multinomial logistic regression as the basis for their approach. Several parameters were derived from movement and breathing signals, and their combinations were investigated to develop an accurate and stable algorithm. The implemented algorithm produced successful results: the accuracy of the recognition of Wake/NREM/REM stages is 73%, with Cohen's kappa of 0.44 for the analyzed 19,324 sleep epochs of 30 seconds each. This approach has the advantage of using only movement and breathing signals, which can be recorded with less effort than heart or brainwave signals, and of requiring only four derived parameters for the calculations. Therefore, the new system is a significant improvement for non-obtrusive sleep stage identification compared to existing approaches.
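The classification step described above — a multinomial logistic regression mapping a few derived parameters to one of three stages — can be sketched generically. The feature vector, weights, and stage labels below are placeholders; the paper's actual parameters and trained coefficients are not given in the abstract.

```python
import numpy as np

STAGES = ["Wake", "NREM", "REM"]

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def predict_stage(features, W, b):
    """Classify one 30-second epoch from its derived parameters.

    features: (d,) vector of movement/breathing-derived parameters
    W: (d, 3) weight matrix and b: (3,) bias, assumed to come from a
    previously fitted multinomial logistic regression.
    """
    probs = softmax(features @ W + b)
    return STAGES[int(np.argmax(probs))], probs
```

In practice the weights would be fitted by maximum likelihood on scored training epochs; the sketch only shows the inference step that turns four derived parameters into stage probabilities.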
As interest in the investigation of possible sources and environmental sinks of technology-critical elements (TCEs) continues to grow, the demand for reliable background level information of these elements in environmental matrices increases. In this study, a time series of ten years of sediment samples from two different regions of the German North Sea were analyzed for their mass fractions of Ga, Ge, Nb, In, REEs, and Ta (grain size fraction < 20 µm). Possible regional differences were investigated in order to determine preliminary reference values for these regions. Throughout the investigated time period, only minor variations in the mass fractions were observed and both regions did not show significant differences. Calculated local enrichment factors ranging from 0.6 to 2.3 for all TCEs indicate no or little pollution in the investigated areas. Consequently, reference values were calculated using two different approaches (Median + 2 median absolute deviation (M2MAD) and Tukey inner fence (TIF)). Both approaches resulted in consistent threshold values for the respective regions ranging from 158 µg kg−1 for In to 114 mg kg−1 for Ce. As none of the threshold values exceed the observed natural variation of TCEs in marine and freshwater sediments, they may be considered baseline values of the German Bight for future studies.
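The two threshold approaches named in the abstract are standard robust statistics and can be sketched directly: Median + 2·MAD (M2MAD) and the upper Tukey inner fence Q3 + 1.5·IQR. The sample data below are arbitrary placeholders, not the sediment measurements from the study.

```python
import numpy as np

def m2mad_threshold(x):
    """Upper reference threshold as median + 2 * median absolute deviation."""
    x = np.asarray(x, dtype=float)
    med = np.median(x)
    mad = np.median(np.abs(x - med))
    return med + 2.0 * mad

def tif_threshold(x):
    """Upper Tukey inner fence: Q3 + 1.5 * (Q3 - Q1)."""
    x = np.asarray(x, dtype=float)
    q1, q3 = np.percentile(x, [25, 75])
    return q3 + 1.5 * (q3 - q1)
```

Both statistics are robust to outliers, which is why they are commonly used to derive background reference values from environmental monitoring data; values above the threshold would then flag potential enrichment.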
Die GIGA-Adaptionsmethode
(2022)
The essay presents a teaching method for writing didactics based on a Cicero quotation about the aptum, the appropriateness of style. After some preliminary psychological considerations on the difference between speaking and writing situations, the quotation is analyzed in university writing instruction with regard to the style factors it names. The resulting list serves as the basis of a method with which the stylistic fit of texts of all kinds can be greatly improved. The goal is to motivate a clientele possibly distant from writing to find their way to exemplary, purposeful, factually appropriate, and audience-oriented writing.
In this paper, a novel feature-based sampling strategy for nonlinear Model Predictive Path Integral (MPPI) control is presented. Using the MPPI approach, the optimal feedback control is calculated by solving a stochastic optimal control problem (OCP) online by evaluating the weighted inference of sampled stochastic trajectories. While the MPPI algorithm can be excellently parallelized, the closed-loop performance strongly depends on the information quality of the sampled trajectories. To draw samples, a proposal density is used. The solver’s and thus, the controller’s performance is of high quality if the sampled trajectories drawn from this proposal density are located in low-cost regions of state-space. In classical MPPI control, the explored state-space is strongly constrained by assumptions that refer to the control value’s covariance matrix, which are necessary for transforming the stochastic Hamilton–Jacobi–Bellman (HJB) equation into a linear second-order partial differential equation. To achieve excellent performance even with discontinuous cost functions, in this novel approach, knowledge-based features are introduced to constitute the proposal density and thus the low-cost region of state-space for exploration. This paper addresses the question of how the performance of the MPPI algorithm can be improved using a feature-based mixture of base densities. Furthermore, the developed algorithm is applied to an autonomous vessel that follows a track and concurrently avoids collisions using an emergency braking feature. Therefore, the presented feature-based MPPI algorithm is applied and analyzed in both simulation and full-scale experiments.
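The "weighted inference of sampled stochastic trajectories" at the core of MPPI can be sketched as a softmax-style averaging of control perturbations by their trajectory costs. This is the generic path-integral update, not the paper's feature-based variant; the function name, shapes, and the temperature `lam` are illustrative assumptions.

```python
import numpy as np

def mppi_control_update(u_nominal, noise, costs, lam=1.0):
    """Weight sampled control perturbations by exp(-cost / lambda)
    and return the updated nominal control sequence.

    u_nominal: (T, m) nominal control sequence
    noise:     (K, T, m) sampled control perturbations
    costs:     (K,) total cost of each rolled-out trajectory
    lam:       temperature of the path-integral weighting
    """
    costs = np.asarray(costs, dtype=float)
    # Subtract the minimum cost so the exponential stays numerically stable
    w = np.exp(-(costs - costs.min()) / lam)
    w /= w.sum()
    # Cost-weighted average of the perturbations, added to the nominal control
    return u_nominal + np.einsum("k,ktm->tm", w, noise)
```

The paper's contribution concerns where the `noise` samples come from: instead of a single Gaussian proposal tied to the control covariance, a feature-based mixture of base densities concentrates samples in low-cost regions, which this generic update then exploits unchanged.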
Technology-based startups make a substantial contribution to economic and societal development. In the course of their founding, they require support in the form of venture capital, which in the seed and early stage is primarily provided by business angels (BAs). However, the procedures and evaluation criteria of the BA investment process have so far been insufficiently researched. This paper uses expert interviews within a case study of the entrepreneurial ecosystem of Baden-Württemberg to identify how BAs proceed when evaluating and selecting technology-based startups. In addition, the criteria by which BAs distinguish promising from failing startups are derived. The paper thus contributes to opening the "black box" of investment activities in the earliest founding phases.
Evaluation of tech ventures’ evolving business models: rules for performance-related classification
(2022)
At the early stage of a successful tech venture's life cycle, it is assumed that the business model will evolve to higher quality over time. However, there are few empirical insights into business model evolution patterns for the performance-related classification of early-stage tech ventures. We created relevant variables evaluating the evolution of the venture-centric network and the technological proposition of both digital and non-digital ventures' business models using the text of submissions to the official business plan award in the German State of Baden-Württemberg between 2006 and 2012. Applying a principal component analysis/rough set theory mixed methodology, we explore performance-related business model classification rules in the heterogeneous sample of business plans. We find that ventures need to demonstrate real engagement with their customers' needs to survive. The distinguishing success rules are related to patent applications, risk capital, and scaling of the organisation. The rules help practitioners to classify business models in a way that allows them to prioritise action for performance.
The goal of the research project "Ekont" is to develop a hand-guided device for removing concrete at inner edges and obstructions in nuclear power plants (NPPs). To reduce the reaction forces, the novel approach of a counter-rotating milling process is investigated. The result is a gear solution in which a central milling disc rotates at approximately the same circumferential speed in the opposite direction to the other milling discs.
The present contribution proposes a novel method for the indirect measurement of the ground reaction forces (GRF) induced by a pedestrian during walking on a vibrating structure. Its main idea is to formulate and solve an inverse problem in the time domain with the aim of finding the optimal time-dependent moving point force describing the GRF of a pedestrian (input data), which minimizes the difference between a set of computed and a set of measured structural responses (output data). The solution of the inverse problem is addressed by means of the gradient-based trust region optimization strategy. The moving force identification process uses output data from a set of acceleration and displacement time histories recorded at different locations on the structure. The practicability and the accuracy of the proposed GRF identification method are first evaluated using simulated measurements, which revealed high accuracy, robustness, and stability of the results even at high noise levels. Subsequently, a comprehensive experimental validation process using real measurement data recorded on the HUMVIB experimental footbridge on the campus of the Technical University of Darmstadt (Germany) was carried out. Besides the conventional sensors for the acquisition of structural responses, an array of biomechanical force plates as well as classical load cells at the supports were used to measure the reference GRFs needed in the experimental validation process. The results show that the proposed method delivers a very accurate estimation of the GRF induced by a subject during walking on the experimental structure.
Outcomes with a natural order commonly occur in prediction problems and often the available input data are a mixture of complex data like images and tabular predictors. Deep Learning (DL) models are state-of-the-art for image classification tasks but frequently treat ordinal outcomes as unordered and lack interpretability. In contrast, classical ordinal regression models consider the outcome’s order and yield interpretable predictor effects but are limited to tabular data. We present ordinal neural network transformation models (ontrams), which unite DL with classical ordinal regression approaches. ontrams are a special case of transformation models and trade off flexibility and interpretability by additively decomposing the transformation function into terms for image and tabular data using jointly trained neural networks. The performance of the most flexible ontram is by definition equivalent to a standard multi-class DL model trained with cross-entropy while being faster in training when facing ordinal outcomes. Lastly, we discuss how to interpret model components for both tabular and image data on two publicly available datasets.
Sustainable technologies are being increasingly used in various areas of human life. While they have a multitude of benefits, they are especially useful in health monitoring, particularly for certain groups of people, such as the elderly. However, there are still several issues that need to be addressed before their use becomes widespread. This work aims to clarify the aspects that are of great importance for increasing the acceptance of the use of this type of technology by the elderly. In addition, we aim to clarify whether the technologies that are already available are able to ensure acceptable accuracy and whether they could replace some of the manual approaches that are currently being used. A two-week study with people 65 years of age and over was conducted to address the questions posed here, and the results were evaluated. It was demonstrated that simplicity of use and automatic functioning play a crucial role. It was also concluded that technology cannot yet completely replace traditional methods such as questionnaires in some areas. Although the technologies that were tested were classified as being "easy to use", the elderly population in the current study indicated that they were not sure that they would use these technologies regularly in the long term because the added value is not always clear, among other issues. Therefore, awareness-raising must take place in parallel with the development of technologies and services.
Generalized Concatenated Codes over Gaussian and Eisenstein Integers for Code-Based Cryptography
(2021)
The code-based McEliece and Niederreiter cryptosystems are promising candidates for post-quantum public-key encryption. Recently, q-ary concatenated codes over Gaussian integers were proposed for the McEliece cryptosystem together with the one-Mannheim error channel, where the error values are limited to Mannheim weight one. Due to the limited error values, the codes over Gaussian integers achieve a higher error correction capability than maximum distance separable (MDS) codes with bounded minimum distance decoding. This higher error correction capability improves the work factor regarding decoding attacks based on information-set decoding. The codes also enable a low-complexity decoding algorithm for decoding beyond the guaranteed error correction capability. In this work, we extend this coding scheme to codes over Eisenstein integers. These codes have advantages for the Niederreiter system. Additionally, we propose an improved code construction based on generalized concatenated codes. These codes extend the rate region where the work factor is beneficial compared to MDS codes. Moreover, generalized concatenated codes are more robust against structural attacks than ordinary concatenated codes.
Many countries offer state credit guarantees to support credit-constrained exporters. The policy instrument is commonly justified by governments as a means to mitigating adverse outcomes of financial market frictions for exporting firms. Accumulated returns to the German state credit guarantee scheme deriving from risk-compensating premia have outweighed accumulated losses over the past 60 years. Why do private financial agents not step in and provide insurance given that the state-run program yields positive returns? We argue that costs of risk diversification, liquidity management, and coordination among creditors limit the ability of private financial agents to offer comparable insurance products. Moreover, we suggest that the government’s greater effectiveness in recovering claims in foreign countries endows the state with a cost advantage in dealing with the risks involved in large export projects. We test these hypotheses using monthly firm-level data combined with official transaction-level data on covered exports of German firms and find suggestive evidence that positive effects on trade are due to mitigated financial constraints: State credit guarantees benefit firms that are dependent on external finance, if the value at risk which they seek to cover is large, and at times when refinancing conditions on the private financial market are tight.