Virtual measurement models (VMMs) can be used to generate artificial measurements and emulate complex sensor models such as LiDAR. The input of the VMM is an estimate, and the output is the set of measurements this estimate would cause. A Kalman filter with extension estimation based on random matrices is used to filter the mean and covariance of the real measurements. If these match the mean and covariance of the artificial measurements, the given estimate is appropriate. The optimal input of the VMM is found using an adaptation algorithm. In this paper, the VMM approach is extended to multi-extended object tracking, where objects can be occluded and are only partially visible. The occlusion can be compensated if the extension estimation is performed for all objects jointly. The VMM now receives an estimate of the multi-object state as input, and its output is the set of measurements that this multi-object state would cause.
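The moment-matching criterion described above can be sketched as follows. The function names and the scalar mismatch measure are illustrative assumptions for a 2D case, not the paper's implementation:

```python
import numpy as np

def moments(measurements):
    """Mean and covariance of a set of measurements (one row per point)."""
    mean = measurements.mean(axis=0)
    cov = np.cov(measurements, rowvar=False)
    return mean, cov

def moment_mismatch(real, artificial):
    """Scalar mismatch between real and VMM-generated measurements.

    A value near zero indicates that the estimate fed into the VMM
    reproduces the observed measurement statistics (hypothetical
    acceptance criterion, for illustration only).
    """
    m_r, c_r = moments(real)
    m_a, c_a = moments(artificial)
    return np.linalg.norm(m_r - m_a) + np.linalg.norm(c_r - c_a, ord="fro")
```

An adaptation algorithm, as described in the abstract, would then adjust the VMM input to drive such a mismatch toward zero.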
With the high resolution of modern sensors such as multilayer LiDARs, estimating the 3D shape in an extended object tracking procedure is possible. In recent years, 3D shapes have been estimated in spherical coordinates using Gaussian processes, spherical double Fourier series, or spherical harmonics. However, observations have shown that in many scenarios only a few measurements are obtained from top or bottom surfaces, leading to error-prone estimates in spherical coordinates. Therefore, in this paper we propose to estimate the shape in cylindrical coordinates instead, using harmonic functions. Specifically, we derive an expansion for 3D shapes in cylindrical coordinates by solving a boundary value problem for the Laplace equation. This shape representation is then integrated into a plain greedy association model and compared to shape estimation procedures in spherical coordinates. Since the shape representation is only integrated into a basic estimator, the results are preliminary, and a detailed discussion of future work is presented at the end of the paper.
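For reference, the separable solutions of the Laplace equation in cylindrical coordinates, from which such a harmonic shape expansion can be built, take the standard form below; the paper's exact basis and boundary conditions are not reproduced here:

```latex
\nabla^2 f
  = \frac{1}{\rho}\frac{\partial}{\partial\rho}
      \left(\rho\,\frac{\partial f}{\partial\rho}\right)
  + \frac{1}{\rho^2}\frac{\partial^2 f}{\partial\varphi^2}
  + \frac{\partial^2 f}{\partial z^2} = 0,
\qquad
f(\rho,\varphi,z) = R(\rho)\,\Phi(\varphi)\,Z(z),
```

with angular factors $\Phi_m(\varphi) = e^{im\varphi}$, axial factors $Z_k(z) = e^{\pm kz}$, and radial factors solving Bessel's equation, $R(\rho) = A\,J_m(k\rho) + B\,Y_m(k\rho)$.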
The use of deep learning models with medical data is becoming more widespread. However, although numerous models have achieved high accuracy in medical tasks such as medical image recognition (e.g. radiographs), many obstacles still prevent these models from operating in real healthcare environments. This article presents a series of basic requirements that must be taken into account when developing deep learning models for biomedical time series classification tasks, with the aim of facilitating the subsequent deployment of these models in healthcare. These requirements range from correct data collection to existing techniques for properly explaining the results obtained by the models. Indeed, one of the main reasons the use of deep learning models is not more widespread in healthcare settings is their lack of clarity when it comes to explaining decision making.
Nowadays, the importance of early active patient mobilization in the recovery and rehabilitation phase has increased significantly. One way to involve patients in their treatment is a gamification-like approach, one of several methods of motivation in various life processes. This article presents a system prototype for patients who require physical activity as part of active early mobilization after medical interventions or during illness. Bedridden patients and people with a sedentary lifestyle (predominantly lying in bed) are also potential users. The central idea of the concept was a non-contact implementation that patients can use effortlessly. The system consists of three related parts: hardware, software, and a game application. To test the relevance and coherence of the system, it was used by 35 people. The participants were asked to play a video game requiring body movements while lying down and then to take part in a small survey to evaluate the system's usability. As a result, we offer a prototype consisting of hardware and software that can increase and diversify physical activity during active early mobilization of patients and prevent possible health problems caused by predominantly low activity. The proposed design could be implemented in hospitals, rehabilitation centers, and even at home.
Home health applications have evolved over the last few decades. Assistive systems such as a data platform in connection with health devices can allow health-related data to be automatically transmitted to a database. However, there remain significant challenges concerning intermodular communication. Central among them is the challenge of achieving interoperability, the ability of devices to communicate and share data with each other. A major goal of this project was to extend an existing data platform (COMES®) and establish working interoperability by connecting assistive devices with differing approaches. We describe this process for a sleep monitoring and a physical exercise device. Furthermore, we aimed to test this setup and the implementation with a data platform in both a laboratory and an in-home setting with 11 elderly participants. The platform modification was realized, and the relevant changes were made so that the incoming data could be processed by the data platform as well as visually displayed in real time. Data was recorded by the respective device and transmitted to the data server with minor disruptions. Our observations affirmed that difficulties and data loss are far more likely to occur with increasing technical complexity, in the event of an unstable internet connection, or when the device setup requires (elderly) subjects to take specific steps for proper functioning. We emphasize the importance of testing and evaluating home health technologies under real-life circumstances.
Healthy sleep is required for sufficient restoration of the human body and brain. Therefore, in the case of sleep disorders, appropriate therapy should be applied in a timely manner, which requires a prompt diagnosis. Traditionally, a sleep diary is part of the diagnosis and therapy monitoring for some sleep disorders, for example in cognitive behaviour therapy for insomnia. To automate sleep monitoring and make it more comfortable for users, substituting the sleep diary with a smartwatch measurement could be considered. To assess the accuracy of this approach, a study with a total of 30 night recordings was conducted. Objective sleep measurement with a Samsung Galaxy Watch 4 was compared with a subjective approach (sleep diary), evaluating four relevant sleep characteristics: time of falling asleep, wake-up time, sleep efficiency (SE), and total sleep time (TST). The analysis demonstrated median differences between the two measurement approaches of 7 and 3 minutes for the time of falling asleep and the wake-up time, respectively, which supports substituting the subjective measurement with a smartwatch. The SE was determined with a median difference between the two measurement methods of 5.22%; this result also indicates that substitution is possible. Some individual recordings showed a higher variance between the two approaches. Therefore, we conclude that substitution provides reliable results primarily in the case of long-term monitoring. The results of the evaluation of the TST measurement do not allow us to recommend substituting the measurement method.
In order to support entrepreneurs in the Business Model Innovation (BMI) process, practice-oriented frameworks such as the St. Gallen Business Model Navigator™ (BMN) are considered powerful tools. The aim of this paper is to identify strengths and limitations of the BMN when applied to start-ups in their early stages and to contribute to the optimization of the BMN in terms of applicability for start-ups. Furthermore, the paper aims to emphasize the importance of BMI for start-ups and the relevance of a supporting framework, and to formulate a unified catalogue of requirements for BMI frameworks for start-ups.
Because process and product innovations are usually no longer sufficient to establish a company in the market or to generate a competitive advantage, Business Model Innovation is considered a powerful tool, especially for start-ups for which innovation is at the core of their business. Due to the complexity of this process, frameworks should help entrepreneurs with executing Business Model Innovation. However, theory and practice diverge. The aim of this paper is to identify the needs of a start-up regarding Business Model Innovation frameworks, underlining the importance of Business Model Innovation for start-ups as well as the relevance of a supporting framework. The research results aim to contribute to an ideal process for Business Model Innovation when applied to start-ups.
The citizen-centered health platform project is intended to provide a platform that can be used in EU cross-border regions, where social and economic exchange occurs across national borders. The overriding challenges are: (a) social: improving citizen-centered health and care provision; (b) technical: providing a digital platform for networking citizens, service providers, and municipal actors; (c) economic: developing long-term successful (sustainable) business models/value chains. The platform should strengthen and expand existing networks and establish new regional networks. Each network addresses particular challenges and tackles them in a region-specific manner. Here, the national boundary conditions and the interregional needs play an essential role. These objectives require sufficient participation of civil society representatives. Furthermore, the platform will establish an overarching, sustainable, and knowledge-based network of health experts. The platform is to be jointly developed and implemented in the regions and follows an open-access approach. In this way, synergies can be shared more quickly, strengthening competencies and competitiveness. In addition to practice partners, scientific and municipal institutions and SMEs are involved. The actors thus contribute to scientific performance, innovative strength, and resilience.
Feature-Based Proposal Density Optimization for Nonlinear Model Predictive Path Integral Control
(2022)
This paper presents a novel feature-based sampling strategy for nonlinear Model Predictive Path Integral (MPPI) control. In MPPI control, the optimal control is calculated by solving a stochastic optimal control problem online using the weighted inference of stochastic trajectories. While the algorithm can be parallelized excellently, the closed-loop performance depends on the information quality of the drawn samples. Because these samples are drawn using a proposal density, its quality is crucial for the solver and thus for the controller performance. In classical MPPI control, the explored state-space is strongly constrained by assumptions on the control value variance, which are necessary for transforming the Hamilton-Jacobi-Bellman (HJB) equation into a linear second-order partial differential equation. To achieve excellent performance even with discontinuous cost functions, in this novel approach, knowledge-based features are used to determine the proposal density and thus the region of state-space to explore. This paper addresses the question of how the performance of the MPPI algorithm can be improved using a feature-based mixture of base densities. Further, the developed algorithm is applied to an autonomous vessel that follows a track and concurrently avoids collisions using an emergency braking feature.
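The weighted inference over sampled trajectories at the core of MPPI can be sketched as follows; this is a minimal illustration with assumed names and an inverse-temperature parameter `lam`, not the paper's feature-based proposal density:

```python
import numpy as np

def mppi_weights(costs, lam):
    """Exponentially weighted trajectory weights used in MPPI.

    costs: per-sample trajectory costs; lam: inverse-temperature
    parameter. Subtracting the minimum cost stabilises the exponential.
    """
    shifted = costs - costs.min()
    w = np.exp(-shifted / lam)
    return w / w.sum()

def mppi_update(nominal_controls, noise, costs, lam=1.0):
    """Update the nominal control sequence with the weighted average
    of the sampled control perturbations.

    noise has shape (samples, horizon, control_dim).
    """
    w = mppi_weights(np.asarray(costs, dtype=float), lam)
    return nominal_controls + np.tensordot(w, noise, axes=1)
```

Low-cost samples dominate the weighted average, so the nominal sequence is pulled toward the best-performing explored trajectories; the proposal density determines where these samples land in state-space.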
This paper presents a systematic comparison of different advanced approaches for motion prediction of vessels in docking scenarios. To this end, a conventional nonlinear gray-box model, its extension to a hybrid model using an additional regression neural network (RNN), and a black-box model based only on an RNN are compared. The optimal hyperparameters are found by grid search. The training and validation data for the different models are collected in full-scale experiments using the solar research vessel Solgenia. The performances of the different prediction models are compared in full-scale scenarios. The presented models can improve advanced control strategies, e.g., nonlinear model predictive control (NMPC) or reinforcement learning (RL). This paper explores the question of what the advantages and disadvantages of the different presented prediction approaches are and how they can be used to improve the docking behavior of a vessel.
Personalized remote healthcare monitoring is in continuous development due to technological improvements in sensors and wearable electronic systems. This work presents a state of the art of research on wearable sensors for healthcare applications. Furthermore, it surveys wearable devices available on the market for health and sport monitoring, including chest and wrist bands and smartwatches. Many activity trackers are commercially available; their prices are continuously decreasing and their performance is improving, but commercial devices do not provide raw data and are therefore not useful for research purposes.
Gamification is one of the recognized methods of motivating people in various life processes, and it has spread to many spheres of life, including healthcare. This article proposes a system design for long-term care patients using this method. The proposed system aims to increase patient engagement in the treatment and rehabilitation process via gamification. Literature research on available and earlier proposed systems was conducted to develop a suitable system design. The primary target group includes bedridden patients and people with a sedentary lifestyle (predominantly lying in bed). One of the main criteria for selecting a suitable option was a contactless realization for the mentioned target groups in long-term care cases. As a result, we developed a system design for hardware and software that could prevent bedsores and other health problems from occurring because of low activity. The proposed design can be tested in hospitals, nursing homes, and rehabilitation centers.
In recent decades, a steady increase in the volume of tourism has been a stable trend. To offer travel opportunities to all groups, it is also necessary to prepare offers for people in need of long-term care or people with disabilities. One way to improve accessibility could be digital technologies, which can help in planning as well as in carrying out trips. In the work presented, a study of barriers was first conducted, which, after analysis, led to selecting technologies for a test setup. The main focus was on a mobile app with travel information and 360° tours. The evaluation results showed that both technologies could increase accessibility, but some essential aspects (such as usability, completeness, and relevance) need to be considered when implementing them.
This paper describes the effectiveness and efficiency of Virtual Reality training during a commissioning process. To this end, 500 picking orders with more than 2000 part-picking operations were conducted and analyzed with 30 test persons in the Modellfabrik Bodensee. The study points out the advantages and disadvantages of virtual training compared with real execution of a picking process, with and without prior training.
The digital twin concept has long been widely known for asset monitoring in industry; a clear example is the automotive industry. Recently, there has also been significant interest in the application of digital twins in healthcare, especially in genomics, in what is known as precision medicine. This work focuses on another medical speciality where digital twins can be applied: sleep medicine. However, there is still great controversy about the fundamentals that constitute digital twins, such as what the concept is based on and how it can be included in healthcare effectively and sustainably. This article reviews digital twins and their role so far in what is known as personalized medicine. In addition, a series of steps is presented for a possible implementation of a digital twin for a patient suffering from sleep disorders. For this, artificial intelligence techniques, clinical data management, and possible approaches for explaining the results derived from artificial intelligence models are addressed.
In many cases, continuous monitoring of vital signs is required, and low intrusiveness is an important requirement. Incorporating monitoring systems into the hospital or home bed could benefit patients and caregivers. The objective of this work is the definition of a measurement protocol and the creation of a data set of measurements using commercial devices and low-cost prototypes to estimate heart rate and breathing rate. The experimental data will be used to compare the results achieved by the devices and to develop algorithms for feature extraction from vital signs.
There have been substantial research efforts in recent years on algorithms to improve the continuous and automated assessment of various health-related questions. This paper addresses the deployment gap between those improving algorithms and their usability in care and mobile health applications. In practice, most algorithms require significant technical knowledge to be deployed at home or to support healthcare professionals. As a result, persons in need of health care lack a usable interface to current technological advances. In this paper, we propose deploying algorithms from research as web-based microservices following the common RESTful approach, bridging the gap and making algorithms accessible to caregivers and patients without technical knowledge or extended hardware capabilities. We address implementation details, the interpretation and realization of guidelines, and privacy concerns using our own example implementation. We also address further usability guidelines and our approach to them.
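The idea of wrapping a research algorithm as a RESTful microservice can be sketched as below; the framework choice (Flask), the `/analyze` route, and the JSON payload are illustrative assumptions, not the paper's actual services:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/analyze", methods=["POST"])
def analyze():
    """Hypothetical endpoint wrapping a research algorithm: accepts a
    JSON list of numeric samples and returns a simple summary statistic
    in place of a real assessment algorithm."""
    payload = request.get_json(force=True)
    samples = payload.get("samples", [])
    if not samples:
        return jsonify(error="no samples provided"), 400
    mean = sum(samples) / len(samples)
    return jsonify(mean=mean, n=len(samples))
```

A caregiver-facing application would only need to POST data to such an endpoint, keeping all algorithmic and hardware requirements on the server side.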
Market research institutes forecast a growing relevance of Low-Code Development Platforms (LCDPs) for organizations. Moreover, the rising number of scientific publications in recent years shows the increasing interest of the academic community. However, an overview of current research focuses and fruitful future research topics is missing. This paper presents a first scientific literature review on LCDPs to close this gap. The socio-technical system (STS) model, which categorizes information systems into a social and a technical system, serves to analyze the 32 identified publications. Most current research focuses on the technical system (technology or task); in contrast, only three publications explicitly target the social system (structure or people). Hence, this paper enables future research to address the identified research gaps. Additionally, practitioners gain awareness of the technical and social aspects involved in the development, implementation, and application of LCDPs.
Low-Code Development Platforms (LCDPs) enable non-information technology (IT) personnel to develop applications and workflows independently of the IT department. Consequently, these digital platforms help to address the growing need for software development. However, science and practice warn of several barriers that slow down or hinder the usage of LCDPs. This publication identifies, analyzes, and discusses challenges during the implementation and application of LCDPs from both perspectives in a holistic manner. To this end, we conduct an exploratory study (data from scientific literature, expert interviews, and practical studies) and assign the challenges to the socio-technical system model. The results show that the scientific and practical communities recognize common challenges (especially knowledge transfer) but also perceive differences related to technological (science) and social (practice) aspects. This paper proposes future research directions for academia, such as governance, culture change, and value evaluation of LCDPs. Additionally, practitioners can prepare for possible challenges when using LCDPs.
In tourism, energy demands are particularly high. Tourism facilities such as hotels require large amounts of electric, heating, and cooling energy. Their supply, however, is usually still based on fossil energies. This research approach analyses the potential of promoting renewable energies in Black Forest tourism. It focuses on a combined and hence highly efficient production of both electric and thermal energy by biogas plants on the one hand, and its provision to local tourism facilities via short-distance networks on the other. Based on surveys and qualitative empiricism, and considering regional resource availability as well as socio-economic aspects, it examines the strengths, weaknesses, opportunities, and threats that can arise from such a cooperation.
Docking Control of a Fully-Actuated Autonomous Vessel using Model Predictive Path Integral Control
(2022)
This paper presents the docking control of an autonomous vessel using the nonlinear Model Predictive Path Integral (MPPI) approach. This algorithm is based on a path integral over stochastic trajectories and can be parallelized easily. The controller parameters are tuned offline using knowledge of the system and simulations, including a nonlinear state and disturbance observer. The cost function implicitly contains information about the surroundings of the docking position. This approach allows continuous optimization of the trajectory with respect to the system state, disturbance state, and actuator dynamics. The control strategy has been tested in full-scale experiments using the solar research vessel Solgenia. The investigated MPPI controller has demonstrated excellent performance in both simulation and real-world experiments. This paper addresses the question of how the MPPI algorithm can be applied to dock a fully-actuated vessel and what benefits its application achieves.
This paper presents the swing-up and stabilization control of a Furuta pendulum using the recently published nonlinear Model Predictive Path Integral (MPPI) approach. This algorithm is based on a path integral over stochastic trajectories and can be parallelized easily. The controller parameters are tuned offline using the nonlinear system dynamics and simulations. State and input constraints are taken into account in the cost function. The presented approach sequentially computes a control sequence that solves this optimal control problem online. The control strategy has been tested in full-scale experiments using a pendulum prototype. The investigated MPPI controller has demonstrated excellent performance in simulation for the swing-up and stabilization task. In order to also achieve outstanding performance in a real-world experiment using a controller with limited computing power, a linear quadratic regulator (LQR) is designed for the stabilization task. In this paper, the determination of the controller parameters for the MPPI algorithm is described in detail. Further, a discussion treats the advantages of nonlinear MPPI control.
In this paper, approximating the shape of a sailing boat using elliptic cones is investigated. Measurements are assumed to be gathered from the target's surface recorded by 3D scanning devices such as multilayer LiDAR sensors. Therefore, different models for estimating the sailing boat's extent are presented and evaluated in simulated and real-world scenarios. In particular, the measurement source association problem is addressed in the models. Simulated investigations are conducted with a static and a moving elliptic cone. The real-world scenario was recorded with a Velodyne Alpha Prime (VLP-128) mounted on a ferry of Lake Constance. Final results of this paper constitute the extent estimation of a single sailing boat using LiDAR data applying various measurement models.
Multi-object tracking filters require a birth density to detect new objects from measurement data. If the initial positions of new objects are unknown, it may be useful to choose an adaptive birth density. In this paper, a circular birth density is proposed, which is placed like a band around the surveillance area. This allows for 360° coverage. The birth density is described in polar coordinates and considers all point-symmetric quantities such as radius, radial velocity and tangential velocity of objects entering the surveillance area. Since it is assumed that these quantities are unknown and may vary between different targets, detected trajectories, and in particular their initial states, are used to estimate the distribution of initial states. The adapted birth density is approximated as a Gaussian mixture, so that it can be used for filters operating on Cartesian coordinates.
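One way to picture the final approximation step described above is the following sketch, which places Gaussian mixture components evenly on a circle in Cartesian coordinates; the parameterisation (equal weights, isotropic position covariance) is an illustrative assumption, not the paper's exact construction:

```python
import numpy as np

def circular_birth_mixture(radius, n_components, pos_var, center=(0.0, 0.0)):
    """Approximate a circular (band-shaped) birth density by a Gaussian
    mixture in Cartesian coordinates.

    Component means are spaced evenly on a circle of the given radius
    around the surveillance area; each component gets an isotropic
    position covariance pos_var * I and an equal weight.
    """
    angles = np.linspace(0.0, 2.0 * np.pi, n_components, endpoint=False)
    means = np.stack([center[0] + radius * np.cos(angles),
                      center[1] + radius * np.sin(angles)], axis=1)
    weights = np.full(n_components, 1.0 / n_components)
    covs = np.tile(pos_var * np.eye(2), (n_components, 1, 1))
    return weights, means, covs
```

In an adaptive scheme, the radius, covariances, and any velocity components would be re-estimated from the initial states of detected trajectories rather than fixed by hand.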
In the last decade, both sustainability (Green & Blue Economies) and business models for sustainability (BMfS) have increased in importance. Social life cycle sustainability assessment has not fully achieved its goal, mainly because sustainability-oriented business is very complex and dynamic. System Dynamics (SD) is a powerful methodology and computer simulation modeling technique for framing, understanding, and discussing complex issues and problems. This paper responds to the urgent need for a new business model by presenting a concept for dynamic business modeling for sustainability using system dynamics. The paper illustrates the key operating principles through an application from the smartphone industry, using STELLA® software for simulation. Simulations suggest that dynamic business modeling for sustainability may contribute to sustainable business model research and practice by introducing a systemic design tool that frames environmental, social, and economic drivers of value generation into a dynamic business model causal feedback structure, thereby overcoming shortcomings of current business models when applied to complex systems.
It is widely recognized that sustainability is a new challenge for many manufacturing companies. In this paper, we tackle this issue by presenting an approach that deals with material and substance compliance within Product Lifecycle Management in a complex value chain. Our analysis explains why, how, and when sustainable manufacturing arises, and it identifies, quantifies, and evaluates the environmental impact of a new product. We propose (I) a Life Cycle Assessment (LCA) tool and (II) a model to validate this approach and evaluate the risk of non-compliance in the supply chain. Our LCA approach provides comprehensive information on the environmental impacts of a product.
Product and material cycles run in parallel and intersect, making it challenging to integrate the material selection process across Product Lifecycle Management and to integrate LCA with PLM. We provide only a foundation; further research in systems engineering is necessary. LCA is sensitive to data quality: outsourcing production or problems in supplier cooperation can result in material mismatches (such as mismatched properties or composition) in the production process, which may lead to misleading LCA results. This paper also describes research challenges in risk-based due diligence.
Despite the importance of Social Life Cycle Sustainability Assessment (S-LCSA), little research has addressed its integration into Product Lifecycle Management (PLM) systems. This paper presents a structured review of relevant research and practice. Also, to address practical aspects in more detail, it focuses on challenges and potential for adoption of such an integrated system at an electronics company.
We began by reviewing literature on implementations of Social-LCSA and identifying research needs. Then we investigated the status of Social-LCSA within the electronics industry, both by reviewing literature and by interviewing decision makers, to identify challenges and the potential for adopting S-LCSA at an electronics company. We found a low maturity of Social-LCSA, particularly difficulty in quantifying social sustainability. Adoption of Social-LCSA was less common among electronics industry suppliers, especially mining and smelting plants. Our results could provide a basis for case studies that further clarify the issues involved in integrating Social-LCSA into PLM systems.
Nowadays, inexpensive memory space promotes an accelerating growth of stored image data. To exploit the data using supervised machine or deep learning, it needs to be labeled. Manually labeling this vast amount of data is time-consuming and expensive, especially if human experts with specific domain knowledge are indispensable. Active learning addresses this shortcoming by querying the user for the labels of the most informative images first. One way to obtain this 'informativeness' is uncertainty sampling as a query strategy, where the system queries those images it is most uncertain about how to classify. In this paper, we present a web-based active learning framework that helps to accelerate the labeling process. After the user manually labels some images, the framework recommends further candidates that could potentially be labeled identically (bulk image folder shift). We aim to find the most efficient 'uncertainty' measure to improve the quality of the recommendations such that all images are sorted with a minimum number of user interactions (clicks). We conducted experiments using a manually labeled reference dataset to evaluate different combinations of classifiers and uncertainty measures. The results clearly show the effectiveness of uncertainty sampling with bulk image shift recommendations (our novel method), which can reduce the number of required clicks to only around 20% compared to manual labeling.
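Common uncertainty measures for such a query strategy can be sketched as follows; the function names are illustrative, and these are the standard textbook measures rather than the specific combinations evaluated in the paper:

```python
import numpy as np

def least_confidence(probs):
    """1 - max class probability per sample (higher = more uncertain)."""
    return 1.0 - probs.max(axis=1)

def margin(probs):
    """Gap between the top-2 class probabilities (smaller = more uncertain)."""
    part = np.sort(probs, axis=1)
    return part[:, -1] - part[:, -2]

def entropy(probs, eps=1e-12):
    """Shannon entropy of the predicted class distribution."""
    return -(probs * np.log(probs + eps)).sum(axis=1)

def query_order(probs, measure=entropy):
    """Indices of samples sorted most-uncertain first."""
    u = measure(probs)
    if measure is margin:  # for margin, smaller values mean more uncertain
        return np.argsort(u)
    return np.argsort(-u)
```

Given the class probabilities from any classifier, `query_order` yields the order in which images would be presented to the user for labeling.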
Sleep is an important part of our life that significantly influences our health and well-being. Monitoring sleep can provide data based on which sleep quality could be improved. This paper presents a system for heart rate detection during sleep. The data is collected from sensors underneath the test subjects. Because the data contains noise, it must be filtered, and due to the low strength of the signals, they need to be amplified after filtering. At some points in the signal, individual heartbeats may not be captured by the sensors due to sensor failure or other reasons, which must be taken into account. The heart rate is detected in intervals of 15 s. A tool was implemented that detects the heart rate and visualizes it. The data is preprocessed with several filters: a highpass filter, a band-reject filter, a lowpass filter, and a motion detector. After preprocessing, the quality of the signal is significantly increased, and detection is possible.
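A filter chain of the kind described can be sketched as below; the cut-off frequencies, notch frequency, and filter orders are assumptions for illustration, not the values used in the paper, and the motion-detector stage is omitted:

```python
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch

def preprocess(signal, fs, hp_cut=0.5, lp_cut=20.0, notch_freq=50.0):
    """Illustrative preprocessing chain for a raw bed-sensor signal.

    High-pass removes baseline drift, a notch (band-reject) filter
    suppresses mains interference, and a low-pass removes
    high-frequency noise. fs is the sampling rate in Hz.
    """
    b, a = butter(2, hp_cut / (fs / 2), btype="highpass")
    out = filtfilt(b, a, signal)
    b, a = iirnotch(notch_freq, Q=30.0, fs=fs)
    out = filtfilt(b, a, out)
    b, a = butter(4, lp_cut / (fs / 2), btype="lowpass")
    return filtfilt(b, a, out)
```

After such a chain, heartbeat peaks can be detected in the cleaned signal and counted over 15 s windows to estimate the heart rate.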
Recognition of sleep and wake states is one of the relevant parts of sleep analysis. Performing this measurement in a contactless way increases comfort for the users. We present an approach that evaluates only movement and respiratory signals, which can be measured non-obtrusively, to achieve this recognition. The algorithm is based on multinomial logistic regression and analyses features extracted from the aforementioned signals. These features were identified and developed after performing fundamental research on the characteristics of vital signals during sleep. The achieved accuracy of 87% with a Cohen’s kappa of 0.40 demonstrates the appropriateness of the chosen method and encourages continuing research on this topic.
This paper examines the corporate organisational aspects of the implementation of Industry 4.0. Industry 4.0 builds on new technologies and appears as a disruptive innovation to manufacturing firms. Although we do have a good understanding of the technical components, the implementation of the management and organisational aspects of Industry 4.0 is under-researched. It is challenging to find qualitative empirical evidence which provides comprehensive insights into real implementation cases. Based on a case study in a German high-value manufacturing firm, we explore the corporate organisation and implementation of Industry 4.0. By using the framework of Complex Adaptive Systems (CAS), we have identified three key factors which facilitate the implementation of Industry 4.0, namely 1.) organisational structure changes such as the foundation of a central department for digital transformation, 2.) the appointment of a Chief Digital Officer as a personnel change, and 3.) a corporate opening-up towards cooperating with partners as a cultural change. We have furthermore found that Lean Management is an important enabler that ensures readiness for the adoption of Industry 4.0.
Deep Learning-based EEG Detection of Mental Alertness States from Drivers under Ethical Aspects
(2022)
One of the most critical factors for a successful road trip is a high degree of alertness while driving. Even a split second of inattention or sleepiness in a crucial moment can make the difference between life and death. Several prestigious car manufacturers are currently pursuing the aim of automated drowsiness identification to resolve this problem. The path between neuro-scientific research in connection with artificial intelligence and the preservation of the dignity and inviolability of the human individual is very narrow. The key contribution of this work is a system of data analysis for EEGs during a driving session, which draws on previous studies analyzing heart rate (ECG), brain waves (EEG), and eye function (EOG). The gathered data is hereby treated as sensitively as possible, taking ethical regulations into consideration. Obtaining evaluable signs of evolving exhaustion involves techniques that extract sleep-stage frequencies; the correlated interferences in the signal are problematic here. This research focuses on a processing chain for EEG band splitting that involves band-pass filtering, principal component analysis (PCA), independent component analysis (ICA) with automatic artefact removal, and the fast Fourier transform (FFT). The classification is based on a step-by-step adaptive deep learning analysis that detects theta rhythms as a drowsiness predictor in the pre-processed data. It was possible to obtain an offline detection rate of 89% and an online detection rate of 73%. The method is tied to the simulated driving scenario for which it was developed. This leaves space for further optimization of laboratory methods and data collection during wakefulness-dependent operations.
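The final band-splitting step can be illustrated with a plain FFT-based computation of relative EEG band powers; the band limits are the conventional ones, and the filtering, PCA, and ICA stages of the processing chain are omitted here:

```python
import numpy as np

EEG_BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(x, fs):
    """Relative spectral power per EEG band, computed from the FFT of one epoch."""
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2
    total = psd[(freqs >= 0.5) & (freqs < 30)].sum()
    return {name: psd[(freqs >= lo) & (freqs < hi)].sum() / total
            for name, (lo, hi) in EEG_BANDS.items()}

fs = 256
t = np.arange(0, 4, 1 / fs)
# synthetic 4 s epoch dominated by a 6 Hz theta rhythm (the drowsiness predictor)
epoch = np.sin(2 * np.pi * 6 * t) + 0.2 * np.sin(2 * np.pi * 10 * t)
p = band_powers(epoch, fs)
```

A classifier would then inspect the relative theta power of each epoch as the drowsiness indicator.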
Since its first edition in 2008, the Workshop on Metallization and Interconnection for Crystalline Silicon Solar Cells has been a key event where knowledge in the critical fields of crystalline silicon solar cell metallization and interconnection is shared between experts from academia and industry. It has become a highly recognized event for the quality of the contributions, the lively Q&A sessions, and the exceptional networking opportunity. The situation with the Covid-19 pandemic made organizing the 9th edition as an in-person event impossible and forced us to reconsider the event format. The event took place virtually on October 5th and 6th 2020. We used an innovative online platform that enabled not only presentations followed by Q&A but also more informal interactions, where participants could see and talk directly to other participants. 120 experts from 22 countries took part and attended 21 contributions presented live. In spite of a few technical glitches, the workshop was successful and the goals of exchanging on the state-of-the-art in research/industry and connecting experts in the field were achieved. All presentations are available on www.miworkshop.info as .pdf documents. These proceedings contain a summary of the 9th edition (MIW2020) and peer-reviewed papers based on the workshop contributions. The organizers wish to thank the members of the Scientific Committee for the time spent reviewing the MIW2020 abstracts and proceedings. The organizers also wish to thank again the sponsors and supporters for their financial contributions which made the 9th Workshop on Metallization and Interconnection for Crystalline Silicon Solar Cells possible.
Summary of the 9th workshop on metallization and interconnection for crystalline silicon solar cells
(2021)
The 9th edition of the Workshop on Metallization and Interconnection for Crystalline Silicon Solar Cells was held as an online event but nevertheless reached the workshop goals of knowledge sharing and networking. The technology of screen-printed contacts of high temperature pastes continues its fast progress enabled by better understanding of the phenomena taking place during printing and firing, and progress in materials. Great improvements were also achieved in low temperature paste printing and plated metallization. In the field of interconnection, progress was reported on multiwire approaches, electrically conductive adhesives and on foil-based approaches. Common to many contributions at the workshop was the use of advanced laser processes to improve performance or throughput.
Continuous range queries are a common means to handle mobile clients in high-density areas. Most existing approaches focus on settings in which the range queries for location-based services are mostly static whereas the mobile clients in the ranges move. We focus on a category called Dynamic Real-Time Range Queries (DRRQ), assuming that both the clients requested by the query and the inquirers are mobile. In consequence, the query parameters and results change continuously. This leads to two requirements: the ability to deal with an arbitrarily high number of mobile nodes (scalability) and the real-time delivery of range query results. In this paper we present Adaptive Quad Streaming (AQS), a highly decentralized solution for the requirements of DRRQs. AQS approximates the query results in favor of a controlled real-time delivery and guaranteed scalability. While prior work commonly optimizes data structures on servers, AQS relies on a highly distributed cell structure, without server-side data structures, that automatically adapts to changing client distributions. Instead of the commonly used request-response approach, we apply a lightweight streaming method in which no bidirectional communication and no storage or maintenance of queries are required at all.
This paper describes the development of a control system for an industrial heating application. In this process a moving substrate passes through a heating zone with variable speed. Heat is applied to the substrate by hot air, with the air flow rate being the manipulated variable. The aim is to control the substrate’s temperature at a specific location after it passes the heating zone. First, a model is derived for a point attached to the moving substrate. This is modified to reflect the temperature of the moving substrate at the specified location. In order to regulate the temperature, a nonlinear model predictive control approach is applied using an implicit Euler scheme to integrate the model and an augmented gradient-based optimization approach. The performance of the controller has been validated both by simulations and by experiments on the physical plant. The respective results are presented in this paper.
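The implicit Euler integration step used inside such a predictive controller can be sketched on a scalar stand-in; the first-order heating model and all its parameters are assumptions for illustration, not the paper's plant model:

```python
def implicit_euler_step(f, x, u, h, iters=20, tol=1e-10):
    """One implicit Euler step solving x_next = x + h * f(x_next, u)
    by fixed-point iteration (a Newton iteration would also work)."""
    x_next = x
    for _ in range(iters):
        x_new = x + h * f(x_next, u)
        if abs(x_new - x_next) < tol:
            break
        x_next = x_new
    return x_next

# illustrative first-order heating model (parameters are assumptions):
# dT/dt = k * u * (T_hot - T): air flow u drives the substrate temperature
# toward the hot-air temperature T_hot
k, T_hot = 0.05, 200.0
f = lambda T, u: k * u * (T_hot - T)

T = 20.0
for _ in range(100):                    # simulate 100 steps of h = 0.1 s at u = 1
    T = implicit_euler_step(f, T, 1.0, 0.1)
```

Because the step solves for the state at the end of the interval, the scheme stays stable even for stiff thermal dynamics, which is why it suits model integration inside an NMPC loop.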
Trajectory Tracking of a Fully-actuated Surface Vessel using Nonlinear Model Predictive Control
(2021)
The trajectory tracking problem for a fully-actuated real-scaled surface vessel is addressed in this paper. The unknown hydrodynamic and propulsion parameters of the vessel’s dynamic model were identified using an experimental maneuver-based identification process. Then, a nonlinear model predictive control (NMPC) scheme is designed and the controller’s performance is assessed through the variation of NMPC parameters and constraints tightening for tracking a curved trajectory.
In this paper, a systematic comparison of three different advanced control strategies for the automated docking of a vessel is presented. The controllers are automatically tuned offline by applying an optimization process using simulations of the whole system, including the trajectory planner and the state and disturbance observer. Then investigations are conducted with respect to performance and robustness using Monte Carlo simulations with varying model parameters and disturbances. The control strategies have also been tested in full-scale experiments using the solar research vessel Solgenia. The investigated control strategies all demonstrated very good performance in both simulation and real-world experiments. Videos are available under https://www.htwg-konstanz.de/forschung-und-transfer/institute-und-labore/isd/regelungstechnik/videos/
In multi-extended object tracking, parameters (e.g., extent) and trajectory are often determined independently. In this paper, we propose a joint parameter and trajectory (JPT) state and its integration into the Bayesian framework. This allows processing measurements that contain information about parameters and states. Examples of such measurements are bounding boxes given from an image processing algorithm. It is shown that this approach can consider correlations between states and parameters. In this paper, we present the JPT Bernoulli filter. Since parameters and state elements are considered in the weighting of the measurement data assignment hypotheses, the performance is higher than with the conventional Bernoulli filter. The JPT approach can also be used for other Bayes filters.
We analyse the results of a finite element simulation of a macroscopic model which describes the movement of a crowd that is considered as a continuum. A new formulation based on the macroscopic model from Hughes [2] is given. We present a stable numerical algorithm by approximating with a viscosity solution. The fundamental setting is given by an arbitrary domain that can contain several obstacles, may have several entries, and must have at least one exit. All pedestrians have the goal to leave the room as quickly as possible. Nobody prefers a particular exit.
List decoding for concatenated codes based on the Plotkin construction with BCH component codes
(2021)
Reed-Muller codes are a popular code family based on the Plotkin construction. Recently, these codes have regained some interest due to their close relation to polar codes and their low-complexity decoding. We consider a similar code family, i.e., the Plotkin concatenation with binary BCH component codes. This construction is more flexible regarding the attainable code parameters. In this work, we consider a list-based decoding algorithm for the Plotkin concatenation with BCH component codes. The proposed list decoding leads to a significant coding gain with only a small increase in computational complexity. Simulation results demonstrate that the Plotkin concatenation with the proposed decoding achieves near maximum likelihood decoding performance. This coding scheme can outperform polar codes for moderate code lengths.
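The underlying Plotkin (u | u+v) construction can be sketched as follows; the BCH component encoding and decoding are left out, so the vectors simply stand in for component codewords:

```python
import numpy as np

def plotkin_encode(u, v):
    """Plotkin (u | u+v) construction: concatenate u with u XOR v,
    where u and v are codewords of the two binary component codes."""
    u, v = np.asarray(u), np.asarray(v)
    return np.concatenate([u, (u + v) % 2])

def plotkin_decode_v(r):
    """First decoding stage: XOR-ing both halves of the received word yields
    a (noisy) codeword of the v-component code, which a BCH decoder would
    then clean up before the u-component is decoded."""
    n = len(r) // 2
    return (r[:n] + r[n:]) % 2

u = np.array([1, 0, 1, 1])   # codeword of the first component code
v = np.array([0, 1, 1, 0])   # codeword of the second component code
c = plotkin_encode(u, v)     # length-8 codeword [u | u+v]
v_hat = plotkin_decode_v(c)  # noiseless case: recovers v exactly
```

List decoding, as in the paper, keeps several candidate decisions for v alive instead of committing to a single one in this first stage.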
The encoding of antenna patterns with generalized spatial modulation, as well as with other index modulation techniques, requires w-out-of-n encoding, where all binary vectors of length n have the same weight w. This constant-weight property cannot be obtained by conventional linear coding schemes. In this work, we propose a new class of constant-weight codes that result from the concatenation of convolutional codes with constant-weight block codes. These constant-weight convolutional codes are nonlinear binary trellis codes that can be decoded with the Viterbi algorithm. Some constructed constant-weight convolutional codes are optimum free distance codes. Simulation results demonstrate that the decoding performance with Viterbi decoding is close to the performance of the best-known linear codes. Similarly, simulation results for spatial modulation with a simple on-off keying show a significant coding gain with the proposed coded index modulation scheme.
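The w-out-of-n encoding step can be realized with enumerative (combinadic) coding, sketched below; this is one standard mapping from integers to constant-weight vectors and not necessarily the one used in the paper:

```python
from math import comb

def index_to_weight_w(idx, n, w):
    """Enumerative (combinadic) mapping: return the idx-th binary vector of
    length n and Hamming weight w. Each position is decided by counting how
    many weight-w vectors start with a 0 at that position."""
    bits = []
    for pos in range(n, 0, -1):
        if w == 0:
            bits.append(0)
            continue
        c = comb(pos - 1, w)      # vectors with a 0 in this position
        if idx < c:
            bits.append(0)
        else:
            bits.append(1)
            idx -= c
            w -= 1
    return bits

# enumerate all C(4, 2) = 6 weight-2 vectors of length 4
vectors = [tuple(index_to_weight_w(i, 4, 2)) for i in range(comb(4, 2))]
```

The mapping is a bijection, so every index selects a distinct antenna-activation pattern and the decoder can invert it.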
Acoustic Echo Cancellation (AEC) plays a crucial role in speech communication devices to enable full-duplex communication. AEC algorithms have been studied extensively in the literature. However, device specific details like microphone or loudspeaker configurations are often neglected, despite their impact on the echo attenuation or near-end speech quality. In this work, we propose a method to investigate different loudspeaker-microphone configurations with respect to their contribution to the overall AEC performance. A generic AEC system consisting of an adaptive filter and a Wiener post filter is used for a fair comparison between different setups. We propose the near-end-to-residual-echo ratio (NRER) and the attenuation-of-near-end (AON) as quality measures for the full-duplex AEC performance.
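A sketch of the proposed NRER measure, read plainly as a power ratio in dB; the exact definition in the paper may differ, so treat the formula and the synthetic signals as assumptions:

```python
import numpy as np

def nrer_db(near_end, residual_echo):
    """Near-end-to-residual-echo ratio in dB: power of the undistorted
    near-end speech relative to the residual echo left after AEC.
    NOTE: a plain power-ratio reading of the measure's name; the paper's
    exact definition may differ."""
    return 10 * np.log10(np.mean(near_end ** 2) / np.mean(residual_echo ** 2))

rng = np.random.default_rng(0)
near = rng.standard_normal(16000)             # 1 s of near-end speech at 16 kHz
residual = 0.1 * rng.standard_normal(16000)   # echo attenuated by about 20 dB
ratio = nrer_db(near, residual)
```

A higher NRER indicates that the AEC removed more echo while leaving the near-end speech intact, which is the trade-off the paper uses to compare loudspeaker-microphone configurations.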
Large-scale quantum computers threaten today's public-key cryptosystems. The code-based McEliece and Niederreiter cryptosystems are among the most promising candidates for post-quantum cryptography. Recently, a new class of q-ary product codes over Gaussian integers together with an efficient decoding algorithm were proposed for the McEliece cryptosystems. It was shown that these codes achieve a higher work factor for information-set decoding attacks than maximum distance separable (MDS) codes with comparable length and dimension. In this work, we adapt this q-ary product code construction to codes over Eisenstein integers. We propose a new syndrome decoding method which is applicable for Niederreiter cryptosystems. The code parameters and work factors for information-set decoding are comparable to codes over Gaussian integers. Hence, the new construction is not favorable for the McEliece system. Nevertheless, it is beneficial for the Niederreiter system, where it achieves larger message lengths. While the Niederreiter and McEliece systems have the same level of security, the Niederreiter system can be advantageous for some applications, e.g., it enables digital signatures. The proposed coding scheme is interesting for lightweight Niederreiter cryptosystems and embedded security due to the short code lengths and low decoding complexity.
Code-based cryptography is a promising candidate for post-quantum public-key encryption. The classic McEliece system uses binary Goppa codes, which are known for their good error correction capability. However, the key generation and decoding procedures of the classic McEliece system have a high computation complexity. Recently, q-ary concatenated codes over Gaussian integers were proposed for the McEliece cryptosystem together with the one-Mannheim error channel, where the error values are limited to Mannheim weight one. For this channel, concatenated codes over Gaussian integers achieve a higher error correction capability than maximum distance separable (MDS) codes with bounded minimum distance decoding. This improves the work factor regarding decoding attacks based on information-set decoding. This work proposes an improved construction for codes over Gaussian integers. These generalized concatenated codes extend the rate region where the work factor is beneficial compared to MDS codes. They allow for shorter public keys for the same level of security as the classic Goppa codes. Such codes are beneficial for lightweight code-based cryptosystems.
Cultural Mapping 4.0
(2021)
The Lake Constance region is one of the oldest cultural landscapes in Europe. Its regional cultural identity contributes to the image of the region and to the population's identification with it. Nevertheless, there is a lack of a holistic examination, covering the entire Lake Constance region, of what constitutes its cultural identity. The research project «CultMap4.0» therefore aims to investigate, from a spatial perspective, the interplay between regional identity, culture, and mobility. In addition to the guiding question of what cultural identity can be and achieve in a cross-border and diverse region such as Lake Constance, the following research questions are examined within four thematic focus areas: How do the local population, companies, and tourists perceive the regional cultural identity (self-image) and the image (external image) of the Lake Constance region? How can the approach of "cultural mapping" be digitally transformed through participatory mapping, and how can cultural identity and mobility be visualized with digital storytelling in story maps? What contribution can "Cultural Mapping 4.0" make as a participatory tool for regional planning and for communication with stakeholders in the Lake Constance region and elsewhere? The resulting story maps (interactive web content composed of texts, maps, and other media) on the cultural identity of the Lake Constance region are to be published on the platform "Cultural Mapping Project Lake Constance" so that stakeholders can use them as a planning and decision-making tool and for location marketing.
Cultural Mapping 4.0
(2021)
Cultural mapping aims to capture and visualize tangible and intangible cultural assets. This extended abstract proposes the consequent extension of analogue forms of cultural mapping using digital technologies, and its contribution is two-fold. First, the necessary theoretical basis is provided by a literature review of the still-young field of cultural mapping and the complementary disciplines of participatory mapping and digital story-mapping. Second, we propose a digitally enhanced Cultural Mapping 4.0 vision based on a case study from an ongoing research project in the Lake Constance region. Digital participatory mapping approaches are applied to capture data, and story-mapping, a spatial form of digital storytelling, is used to validate and disseminate the results.
Text produced by entrepreneurs represents a data source in entrepreneurship research on venture performance and fund-raising success. Manual text coding of single variables is increasingly assisted or replaced by computer-aided text analysis. Yet, for the development of prediction models with several variables, such dictionary-based text analysis methods are less suitable. Natural language processing techniques are an alternative; however, the implementation is more complex and requires substantial programming skills. More work is required to understand how text analytics can advance entrepreneurship research. This study hence experiments with different artificial intelligence methods rooted in Natural Language Processing and deep learning. It uses 766 business plans to train a model for the automated measurement of transaction relations, a construct which is an indicator for new technology-based firm survival. Empirical findings show that the accuracy of construct measurement can be significantly increased with automated methods and improves with larger amounts of training data. Language complexity sets limits to the precision of automated construct measurement though. We therefore recommend a hybrid approach: making use of the inherent advantages of combining automated with human coding until the amount of training data is sufficiently large to substitute the human coding completely. The study provides insights into the applicability of different text analytics methods in entrepreneurship research and points at future research potential.
Innovation Labs
(2021)
Today's increasing pace of change and intense competition places demands on organizations to use a different approach to innovation, going beyond the incremental innovation that is typically developed within the core of the organization. As an option to escape the existing beliefs of the core organization, innovation labs are used to develop more discontinuous innovation. Despite the abundance of these so-called innovation labs in practice, researchers have devoted little effort to scrutinizing the concept and to providing managers with a framework for exploiting this form of innovation. In this paper, we aim to perform an empirical investigation and to create a consensus around the concept of innovation labs. To do so, we conducted a multiple case study in large international organizations with a total of 31 interviews of an average length of 70 minutes. We offer a framework by identifying four innovation lab types and consider when each is most appropriate. Furthermore, we highlight the importance for managers and their organizations to align the strategic intent with the innovation lab type as well as the interface between the innovation lab and the core business.
Guiding through the Fog
(2021)
Corporate Entrepreneurship (CE) programs are formalized efforts to realize entrepreneurial activities in established companies. Despite the growing and evolving landscape of CE programs, effectively managing them remains a challenging endeavor which results in disappointing outcomes and oftentimes leads to the early termination of such programs. We unmask the differences in goal setting of CE programs and highlight that setting appropriate goals is imperative for their desired outcomes. In practice, companies seem to struggle with the goal setting, and scholars have not yet fully solved the puzzle of goal setting in the context of CE programs either. Therefore, we set out to explore the current state of goal setting in the context of CE programs building upon 61 semi-structured interviews with CE program executives from cross-industry companies of different sizes. Our study contributes to a better understanding of goal setting in the context of CE programs by (1) characterizing the goal setting of CE programs based on goal attributes and goal types and (2) identifying differences among the goal setting of CE programs. We provide implications to practice for a more effective management of CE programs and conclude with a discussion for future research on the impact of the different goal settings.
Female Entrepreneurship has gained interest over the last 20 years. Therefore, this paper analyses 7,320 articles of the research field ‘women in entrepreneurial context’ published in 885 journals. The sample is analyzed by using a machine learning and text mining based methodological approach. Aiming to provide a broad overview over the research literature, 41 clusters and 11 superordinate topics were identified. Major developments of research attention are outlined by analyzing bibliometric data of the period from 2000 to 2020. Overall growth in terms of research attention measured by the development of yearly citations per article is best noticeable in clusters ‘corporate social responsibility’, ‘brand’, and ‘corporate (-governance)’, and in superordinate topics ‘performance’, ‘education’, and ‘corporate (board/ management)’. There are also indicators for an overall increase of research attention and cluster variety. The synthesis provides an insight into most trending superordinate topics. Therefore, this literature review gives a comprehensive and descriptive overview as well as an insight into thematic trend developments of the research field.
Deep transformation models
(2021)
We present a deep transformation model for probabilistic regression. Deep learning is known for outstandingly accurate predictions on complex data but in regression tasks it is predominantly used to just predict a single number. This ignores the non-deterministic character of most tasks. Especially if crucial decisions are based on the predictions, like in medical applications, it is essential to quantify the prediction uncertainty. The presented deep learning transformation model estimates the whole conditional probability distribution, which is the most thorough way to capture uncertainty about the outcome. We combine ideas from a statistical transformation model (most likely transformation) with recent transformation models from deep learning (normalizing flows) to predict complex outcome distributions. The core of the method is a parameterized transformation function which can be trained with the usual maximum likelihood framework using gradient descent. The method can be combined with existing deep learning architectures. For small machine learning benchmark datasets, we report state-of-the-art performance on most datasets and even surpass it on some. Our method works for complex input data, which we demonstrate by employing a CNN architecture on image data.
This paper presents a generic method to enhance performance and incorporate temporal information for cardiorespiratory-based sleep stage classification with a limited feature set and limited data. The classification algorithm relies on random forests and a feature set extracted from long-time home monitoring for sleep analysis. Employing temporal feature stacking, the system could be significantly improved in terms of Cohen’s κ and accuracy. The detection performance could be improved for three classes of sleep stages (Wake, REM, Non-REM sleep), four classes (Wake, Non-REM-Light sleep, Non-REM Deep sleep, REM sleep), and five classes (Wake, N1, N2, N3/4, REM sleep) from a κ of 0.44 to 0.58, 0.33 to 0.51, and 0.28 to 0.44 respectively by stacking features before and after the epoch to be classified. Further analysis was done for the optimal length and combination method for this stacking approach. Overall, three methods and a variable duration between 30 s and 30 min have been analyzed. Overnight recordings of 36 healthy subjects from the Interdisciplinary Center for Sleep Medicine at Charité-Universitätsmedizin Berlin and Leave-One-Out-Cross-Validation on a patient-level have been used to validate the method.
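The temporal feature stacking can be sketched as follows, with border epochs repeated as padding; the padding strategy and the toy feature matrix are assumptions for illustration:

```python
import numpy as np

def stack_epochs(features, k):
    """Stack the feature vectors of the k epochs before and after each epoch
    onto its own features, so the classifier sees temporal context.
    Edges are padded by repeating the first/last epoch."""
    padded = np.concatenate([np.repeat(features[:1], k, axis=0),
                             features,
                             np.repeat(features[-1:], k, axis=0)])
    return np.hstack([padded[i:i + len(features)] for i in range(2 * k + 1)])

X = np.arange(12.0).reshape(6, 2)   # 6 epochs, 2 cardiorespiratory features each
X_stacked = stack_epochs(X, k=1)    # each row now holds 3 epochs x 2 features
```

The stacked matrix is then fed to the random forest in place of the per-epoch features; the paper tunes k (i.e., the context duration) between 30 s and 30 min.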
Probabilistic Short-Term Low-Voltage Load Forecasting using Bernstein-Polynomial Normalizing Flows
(2021)
The transition to a fully renewable energy grid requires better forecasting of demand at the low-voltage level. However, high fluctuations and increasing electrification cause huge forecast errors with traditional point estimates. Probabilistic load forecasts take future uncertainties into account and thus enable various applications in low-carbon energy systems. We propose an approach for flexible conditional density forecasting of short-term load based on Bernstein-Polynomial Normalizing Flows, where a neural network controls the parameters of the flow. In an empirical study with 363 smart meter customers, our density predictions compare favorably against Gaussian and Gaussian mixture densities and also outperform a non-parametric approach based on the pinball loss for 24h-ahead load forecasting for two different neural network architectures.
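The core building block, a Bernstein polynomial whose non-decreasing coefficients make it monotone (and hence invertible as a flow transformation), can be sketched as follows; the coefficient values are illustrative, whereas in the paper a neural network produces them:

```python
import numpy as np
from math import comb

def bernstein(theta, z):
    """Evaluate a Bernstein polynomial with coefficients theta at z in [0, 1].
    Non-decreasing theta guarantees a monotone polynomial, which is exactly
    the property the normalizing flow needs to be invertible."""
    M = len(theta) - 1
    basis = np.array([comb(M, k) * z**k * (1 - z)**(M - k) for k in range(M + 1)])
    return theta @ basis

theta = np.cumsum(np.ones(5))     # increasing coefficients -> monotone transform
z = np.linspace(0, 1, 11)
y = bernstein(theta, z)           # strictly increasing on [0, 1]
```

In the full model, a conditioning network maps the input features to theta (with a positivity transform on the increments), so each customer and hour gets its own flexible load density.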
In this paper, a novel measurement model based on spherical double Fourier series (DFS) for estimating the 3D shape of a target concurrently with its kinematic state is introduced. Here, the shape is represented as a star-convex radial function, decomposed as spherical DFS. In comparison to ordinary DFS, spherical DFS do not suffer from ambiguities at the poles. Details will be given in the paper. The shape representation is integrated into a Bayesian state estimator framework via a measurement equation. As range sensors only generate measurements from the target side facing the sensor, the shape representation is modified to enable application of shape symmetries during the estimation process. The model is analyzed in simulations and compared to a shape estimation procedure using spherical harmonics. Finally, shape estimation using spherical and ordinary DFS is compared to analyze the effect of the pole problem in extended object tracking (EOT) scenarios.
At the KONTEC Congress 2021 in Dresden, both a poster and a paper from the research project EKont were published. In addition to a description of the experimental setup, novel cutting processes and material removal principles are presented. Subsequently, four prototypes (co-rotating stepped milling cutter, counter-rotating stepped milling cutter, centrally counter-rotating stepped milling cutter with gearbox, and oscillating tool attachment) are described.
Identification of sleep and wake states by evaluating respiratory and movement signals
(2021)
The last decades have shown that the volume of tourism is, in general, constantly increasing (with some justified exceptions). To offer the possibility of travel to all groups of people, it is necessary to pay attention to accessibility. One possibility for increasing accessibility is digital technologies, which can assist in planning, carrying out, and completing trips. To make a selection of technologies, a study of barriers was first conducted and analyzed, and finally some technologies were made available in a test setup. The focus was placed on two technologies: 360° tours and a mobile app with travel information. The two technologies were implemented and presented to the test subjects.
The evaluation results showed that both technologies could increase accessibility if some essential aspects (such as usability, completeness, relevance, etc.) are considered during the implementation.
The development of home health systems can provide continuous and user-friendly monitoring of key health parameters. This project aims to create a concept for such a system, implement it on a test basis, and evaluate it. Three health areas were selected for this purpose: Sleep, Stress, and Rehabilitation. Appropriate devices were installed in the homes of test subjects and used by them for two weeks. In addition, relevant questionnaires were completed to obtain a complete picture. Finally, the implemented system was evaluated, and the results of the conducted study showed that home health systems have great potential. However, some points need to be considered to increase the usability of the system and the motivation of the users. Among others, ease of use of the equipment is of extreme importance.
Health monitoring in a home environment can have broader use since it may provide continuous control of health parameters with relatively minor intrusiveness into regular life. This work aims to verify whether the subjective questioning that is typical in some areas of sleep medicine can be replaced by an objective measurement using electronic devices. For this purpose, a study was conducted with ten subjects, in which objective and subjective measurements of relevant sleep parameters took place. The results of both measurement methods were evaluated and analyzed. They showed that while for some measures, such as Total Time in Bed, there is a high agreement between objective and subjective measurements, for others, such as sleep quality, there are significant differences. For this reason, a combination of both measurement methods may currently be most beneficial and provide the most detailed results, while a partial replacement by electronic devices can already reduce the number of questions in the subjective assessment.
Preliminary results of homomorphic deconvolution application to surface EMG signals during walking
(2021)
Homomorphic deconvolution is applied to sEMG signals recorded during walking. Gastrocnemius lateralis and tibialis anterior signals were acquired according to the SENIAM recommendations. MUAP parameters such as amplitude and scale were estimated, while the MUAP shape parameter was kept fixed. This yields a useful time-frequency representation of the sEMG signal. The estimation of the scale MUAP parameter was verified by extracting the mean frequency of the filtered EMG signal, derived from the scale parameter estimated with two different MUAP shape values.
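The cepstral step behind homomorphic deconvolution can be sketched in a few lines. This is a generic illustration (the function names and the toy Gaussian MUAP shape are our own, not from the paper), assuming the sEMG is modeled as a MUAP shape convolved with a firing impulse train:

```python
import numpy as np

def real_cepstrum(x):
    # log-magnitude spectrum -> inverse FFT gives the real cepstrum
    spectrum = np.fft.fft(x)
    log_mag = np.log(np.abs(spectrum) + 1e-12)
    return np.fft.ifft(log_mag).real

def homomorphic_smooth(x, cutoff):
    # keep only low-quefrency components (the MUAP shape),
    # discard the high-quefrency part (the firing impulse train)
    c = real_cepstrum(x)
    lifter = np.zeros_like(c)
    lifter[:cutoff] = 1.0
    lifter[-cutoff + 1:] = 1.0          # the real cepstrum is symmetric
    smoothed_log_mag = np.fft.fft(c * lifter).real
    return np.exp(smoothed_log_mag)     # smoothed magnitude spectrum

# toy sEMG-like signal: a Gaussian MUAP convolved with a random impulse train
rng = np.random.default_rng(0)
muap = np.exp(-0.5 * ((np.arange(64) - 32) / 4.0) ** 2)
train = np.zeros(1024)
train[rng.integers(0, 1024, 20)] = 1.0
semg = np.convolve(train, muap, mode="same")
envelope = homomorphic_smooth(semg, cutoff=30)
```

Liftering the cepstrum like this separates the slowly varying spectral envelope (related to MUAP scale) from the fine spectral structure of the firing pattern.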
Normal breathing during sleep is essential for people’s health and well-being. Therefore, it is crucial to diagnose apnoea events at an early stage and apply appropriate therapy. Detection of sleep apnoea is a central goal of the system design described in this article. To develop a correctly functioning system, it is first necessary to clearly define the requirements, which are outlined in this manuscript. Furthermore, the selection of an appropriate technology for the measurement of respiration is of great importance. Therefore, after performing initial literature research, we analysed three different methods in detail and selected a suitable one according to the determined requirements. After considering all the advantages and disadvantages of the three approaches, we decided to use the one based on impedance measurement. As a next step, an initial conceptual design of the algorithm for detecting apnoea events was created. As a result, we developed an activity diagram in which the main system components and data flows are visually represented.
Respiratory diseases are leading causes of death and disability in the world. The recent COVID-19 pandemic also affects the respiratory system. Detecting and diagnosing respiratory diseases requires both medical professionals and a clinical environment. Most of the techniques used to date have also been invasive or expensive.
Some research groups are developing hardware devices and techniques to make possible a non-invasive or even remote respiratory sound acquisition. These sounds are then processed and analysed for clinical, scientific, or educational purposes.
We present the literature review of non-invasive sound acquisition devices and techniques.
The results cover a large number of digital tools, such as microphones, wearables, and Internet of Things devices, that can be used in this scope.
Some interesting applications have been found. Some devices make sound acquisition easier in a clinical environment, while others enable daily monitoring outside that setting. We aim to use some of these devices and include the non-invasively recorded respiratory sounds in a Digital Twin system for personalized health.
Twenty-first century infrastructure needs to respond to changing demographics and become climate neutral, resilient and economically affordable, while remaining a driver for development and shared prosperity. However, the infrastructure sector remains one of the least innovative and digitalised, plagued by delays, cost overruns and benefit shortfalls (Cantarelli et al., 2008; Flyvbjerg, 2007; Flyvbjerg et al., 2003; Flyvbjerg et al., 2004). The root cause is the prevailing fragmentation of the infrastructure sector (Fellows and Liu, 2012). To help overcome these challenges, integration of the value chain is needed. This could be achieved through a use-case-based creation of federated ecosystems connecting open and trusted data spaces and advanced services applied to infrastructure projects. Such digital platforms enable full-lifecycle participation and responsible governance guided by a shared infrastructure vision. Digital federation enables secure and sovereign data exchange and thus collaboration across the silos within the infrastructure sector and between industries, as well as within and between countries. Such an approach to infrastructure technology policy would not rely on technological solutionism but proposes the development of open and trusted data alliances. Federated data spaces provide access to the emerging data economy, especially for SMEs, and can foster the innovation of new digital services. Such responsible digital governance can help make the infrastructure sector more resilient, efficient and aligned with the realisation of ambitious decarbonisation and environmental protection targets. The European Union and the United States have already developed architectures for sovereign and secure data exchange.
Location-aware mobile devices are becoming increasingly popular and GPS sensors are built into nearly every portable unit with computational capabilities. At the same time, the emergence of location-aware virtual services and ideas calls for new efficient spatial real-time queries. Communication latency in mobile environments, combined with high decentralization and the need for scalability in high-density systems with immense client counts, leads to major challenges. In this paper we describe a decentralized architecture for continuous range queries in settings in which both the requested and the requesting clients are mobile. While prior works commonly use a request-response approach, we provide a stream-based adaptive grid solution that deals with arbitrarily high client counts and improves communication latency to meet given hard real-time constraints.
A residual neural network was adapted and applied to the Physionet/Computing in Cardiology Challenge 2020 data to detect 24 different classes of cardiac abnormalities from 12-lead ECGs. Additive Gaussian noise, signal shifting, and the classification of signal sections of different lengths were applied to prevent the network from overfitting and to facilitate generalization. Due to the use of a global pooling layer after the feature extractor, the network is independent of the signal’s length. On the hidden test set of the challenge, the model achieved a validation score of 0.656 and a full test score of 0.27, placing us 15th out of 41 officially ranked teams (team name: UC_Lab_Kn). These results show the potential of deep neural networks for application to raw data and a complex multi-class multi-label classification problem, even if the training data is from diverse datasets and of differing lengths.
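The length independence gained from global pooling can be illustrated in a few lines. This sketch uses arbitrary feature-map sizes, not the challenge model: averaging the convolutional feature map over the time axis yields a fixed-size vector regardless of how long the input signal was.

```python
import numpy as np

def global_avg_pool(feature_map):
    # feature_map: (channels, time) output of the convolutional extractor;
    # averaging over the time axis gives a fixed-size vector for any length
    return feature_map.mean(axis=1)

rng = np.random.default_rng(1)
short = rng.normal(size=(64, 250))    # features of a short ECG segment
long_ = rng.normal(size=(64, 3000))   # features of a much longer recording
pooled_short = global_avg_pool(short)
pooled_long = global_avg_pool(long_)
```

Both pooled vectors have shape `(64,)`, so the same classification head can follow, whatever the recording length.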
Ballistocardiography (BCG) can be used to monitor heart rate activity. The accelerometer should have high sensitivity and minimal internal noise; at the same time, a low-cost approach was taken into consideration. Several measurements were executed to determine the optimal positioning of a sensor under the mattress to obtain a signal strong enough for further analysis. A prototype for an unobtrusive accelerometer-based measurement system was developed and tested in a conventional bed without any specific extras. The influence of the human sleep position on the output accelerometer data was tested. The obtained results indicate the potential to capture BCG signals using accelerometers. The measurement system can detect heart rate unobtrusively in the home environment.
The main aim of the research presented in this manuscript is to compare the results of objective and subjective measurement of sleep quality for older adults (65+) in the home environment. A total of 73 nights was evaluated in this study. A device placed under the mattress was used to obtain objective measurement data, and a common question on perceived sleep quality was asked to collect the subjective sleep quality level. The achieved results confirm the correlation between objective and subjective measurement of sleep quality, with an average standard deviation of 2 out of 10 possible quality points.
Uncertainty about the future requires companies to create discontinuous innovations. Established companies, however, struggle to do so, whereas independent startups seem to cope better with this. Consequently, established companies set up entrepreneurial initiatives to make use of startups' benefits. Amongst others, this has led to great interest in so-called corporate entrepreneurship (CE) programs and to the development and characterization of several different forms. Yet their processes to achieve certain objectives are still rather ineffective. Considering the actions performed in preparation for and during CE programs could be one approach to improve this, but such considerations are still absent today. Furthermore, the increasing use of several CE programs in parallel seems to bear the potential for synergies and, thus, more efficient use of resources. Aiming to provide insights into both issues, this study analyzes the actions of CE programs by looking at interviews with managers of seven corporate incubators and accelerator programs of five established German tech companies.
In today's volatile world, established companies must be capable of optimizing their core business with incremental innovations while simultaneously developing discontinuous innovations to maintain their long-term competitiveness. Balancing both is a major challenge for companies, since different types of innovation require different organizational structures, operational modes and management styles. Established companies tend to excel in improving their current business through incremental innovations which are closely related to their current knowledge base and competencies. However, this often goes hand in hand with challenges in the exploration of knowledge that is new to the company and that is essential for the development of discontinuous innovations. In this respect, the concept of corporate entrepreneurship is recognized as a way to strengthen the exploration of new knowledge and to support the development of discontinuous innovation. For managing corporate entrepreneurship more effectively, it is crucial to understand which types of knowledge can be created through corporate entrepreneurship and which organizational designs are more suited to gain certain types of knowledge. To answer these questions, this study analyzed 23 semi-structured interviews conducted with established companies that are running such entrepreneurial activities. The results show (1) that three general types of knowledge can be explored through corporate entrepreneurship and (2) that some organizational designs are more suited to explore certain knowledge types than others are.
We have analyzed a pool of 37,839 articles published in 4,404 business-related journals in the entrepreneurship research field using a novel machine-learning-based literature review (MLR) approach built on text data mining. Most papers have been published in the journals ‘Small Business Economics’, ‘International Journal of Entrepreneurship and Small Business’, and ‘Sustainability’ (Switzerland), while the sum of citations is highest in the ‘Journal of Business Venturing’, ‘Entrepreneurship Theory and Practice’, and ‘Small Business Economics’. We derived 29 overarching themes based on 52 identified clusters. The social entrepreneurship, development, innovation, capital, and economy clusters represent the largest ones among those with high thematic clarity. The most discussed clusters, measured by the average number of citations per assigned paper, are research, orientation, capital, gender, and growth. Clusters with the highest average growth in publications per year are social entrepreneurship, innovation, development, entrepreneurship education, and (business) models. Measured by the average yearly citation rate per paper, the thematic cluster ‘research’, mostly containing literature studies, received the most attention. The MLR allows for the inclusion of a significantly higher number of publications compared to traditional reviews, thus providing a comprehensive, descriptive overview of the whole research field.
This paper proposes a novel transmission scheme for generalized multistream spatial modulation. This new approach uses one-Mannheim-error-correcting codes over Gaussian or Eisenstein integers as multidimensional signal constellations. These codes enable a suboptimal decoding strategy with near maximum likelihood performance for transmission over the additive white Gaussian noise channel. In this contribution, this decoding algorithm is generalized to the detection for generalized multistream spatial modulation. The proposed method can outperform conventional generalized multistream spatial modulation with respect to decoding performance, detection complexity, and spectral efficiency.
Soft-input decoding of concatenated codes based on the Plotkin construction and BCH component codes
(2020)
Low latency communication requires soft-input decoding of binary block codes with small to medium block lengths.
In this work, we consider generalized multiple concatenated (GMC) codes based on the Plotkin construction. These codes are similar to Reed-Muller (RM) codes. In contrast to RM codes, BCH codes are employed as component codes. This leads to improved code parameters. Moreover, a decoding algorithm is proposed that exploits the recursive structure of the concatenation. This algorithm enables efficient soft-input decoding of binary block codes with small to medium lengths. The proposed codes and their decoding achieve significant performance gains compared with RM codes and recursive GMC decoding.
The reliability of flash memories suffers from various error causes. Program/erase cycles, read disturb, and cell-to-cell interference impact the threshold voltages and cause bit errors during the read process. Hence, error correction is required to ensure reliable data storage. In this work, we investigate the bit-labeling of triple level cell (TLC) memories. This labeling determines the page capacities and the latency of the read process. The page capacity defines the redundancy that is required for error correction coding. Typically, Gray codes are used to encode the cell state such that the codes of adjacent states differ in a single digit. These Gray codes minimize the latency for random access reads but cannot balance the page capacities. Based on measured voltage distributions, we investigate the page capacities and propose a labeling that provides better rate balancing than Gray labeling.
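The Gray labeling mentioned above can be generated with the standard reflected binary code; a minimal sketch (the generic construction, not the balanced labeling the paper proposes):

```python
def gray(state):
    # reflected binary Gray code: adjacent cell states differ in one bit
    return state ^ (state >> 1)

# a TLC cell stores 3 bits, i.e. 8 threshold-voltage states
labels = [gray(s) for s in range(8)]
# labels: one 3-bit codeword per adjacent voltage state
```

Because neighbouring states differ in exactly one bit, a read error between adjacent threshold-voltage states flips only a single stored bit, which is what minimizes random-read latency but, as the abstract notes, does not balance the page capacities.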
Side Channel Attack Resistance of the Elliptic Curve Point Multiplication using Eisenstein Integers
(2020)
Asymmetric cryptography empowers secure key exchange and digital signatures for message authentication. Nevertheless, consumer electronics and embedded systems often rely on symmetric cryptosystems because asymmetric cryptosystems are computationally intensive. Besides, implementations of cryptosystems are prone to side-channel attacks (SCA). Consequently, the secure and efficient implementation of asymmetric cryptography on resource-constrained systems is demanding. In this work, elliptic curve cryptography is considered. A new concept for an SCA resistant calculation of the elliptic curve point multiplication over Eisenstein integers is presented and an efficient arithmetic over Eisenstein integers is proposed. Representing the key by Eisenstein integer expansions is beneficial to reduce the computational complexity and the memory requirements of an SCA protected implementation.
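The arithmetic over Eisenstein integers rests on the identity ω² = −1 − ω for ω = exp(2πi/3). A minimal sketch of the resulting integer-only multiplication (our own illustration, not the paper's SCA-protected implementation), checked against ordinary complex arithmetic:

```python
import cmath

def eis_mul(a, b):
    # (a0 + a1*w) * (b0 + b1*w) with w**2 = -1 - w, where w = exp(2*pi*1j/3);
    # the product stays in the form (c0 + c1*w) with integer coordinates
    a0, a1 = a
    b0, b1 = b
    return (a0 * b0 - a1 * b1, a0 * b1 + a1 * b0 - a1 * b1)

def to_complex(x):
    w = cmath.exp(2j * cmath.pi / 3)
    return x[0] + x[1] * w

p, q = (3, -2), (1, 4)
r = eis_mul(p, q)   # (3 - 2w)(1 + 4w) = 11 + 18w
```

Only four integer multiplications and three additions are needed per product, which hints at why Eisenstein-integer expansions of the key can reduce computational complexity.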
Deep neural networks (DNNs) are known for their high prediction performance, especially in perceptual tasks such as object recognition or autonomous driving. Still, DNNs are prone to yield unreliable predictions when encountering completely new situations without indicating their uncertainty. Bayesian variants of DNNs (BDNNs), such as MC dropout BDNNs, do provide uncertainty measures. However, BDNNs are slow during test time because they rely on a sampling approach. Here we present a single shot MC dropout approximation that preserves the advantages of BDNNs without being slower than a DNN. Our approach is to analytically approximate for each layer in a fully connected network the expected value and the variance of the MC dropout signal. We evaluate our approach on different benchmark datasets and a simulated toy example. We demonstrate that our single shot MC dropout approximation resembles the point estimate and the uncertainty estimate of the predictive distribution that is achieved with an MC approach, while being fast enough for real-time deployments of BDNNs.
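The layer-wise moment propagation can be sketched for a single dropout-plus-linear step. The formulas below follow from the standard moments of an inverted-dropout mask and an independence assumption between units; this is a simplified illustration of the idea, not the paper's exact derivation.

```python
import numpy as np

def dropout_moments(mean, var, p):
    # moments of inverted dropout y = x * m / (1 - p), m ~ Bernoulli(1 - p):
    # E[y] = E[x],  Var[y] = Var[x] + (Var[x] + E[x]**2) * p / (1 - p)
    return mean, var + (var + mean ** 2) * p / (1.0 - p)

def linear_moments(W, mean, var):
    # propagate through a linear layer, assuming independent units
    return W @ mean, (W ** 2) @ var

rng = np.random.default_rng(0)
W = rng.normal(size=(3, 5))
x = rng.normal(size=5)                 # a fixed input activation vector
m_mean, m_var = dropout_moments(x, np.zeros(5), p=0.2)
a_mean, a_var = linear_moments(W, m_mean, m_var)

# Monte Carlo reference: sample many dropout masks explicitly
masks = rng.random((200_000, 5)) < 0.8
samples = (x * masks / 0.8) @ W.T
```

The analytic moments match the sampled MC dropout statistics, but require a single forward pass instead of many.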
We compared vulnerable and fixed versions of the source code of 50 different PHP open source projects based on CVE reports for SQL injection vulnerabilities. We scanned the source code with commercial and open source tools for static code analysis. Our results show that five current state-of-the-art tools have issues correctly marking vulnerable and safe code. We identify 25 code patterns that are not detected as a vulnerability by at least one of the tools and 6 code patterns that are mistakenly reported as a vulnerability that cannot be confirmed by manual code inspection. Knowledge of the patterns could help vendors of static code analysis tools, and software developers could be instructed to avoid patterns that confuse automated tools.
Three-dimensional ship localization with only one camera is a challenging task due to the loss of depth information caused by perspective projection. In this paper, we propose a method to measure distances based on the assumption that ships lie on a flat surface. This assumption makes it possible to recover depth from a single image using the principle of inverse perspective. For the 3D ship detection task, we use a hybrid approach that combines image detection with a convolutional neural network, camera geometry and inverse perspective. Furthermore, a novel calculation of object height is introduced. Experiments show that the monocular distance computation works well in comparison to a Velodyne lidar. Due to its robustness, this could be an easy-to-use baseline method for detection tasks in navigation systems.
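Under the flat-surface assumption, each pixel row below the horizon maps to a closed-form ground distance. A minimal sketch of this inverse-perspective relation (all camera parameters here are hypothetical, not from the paper):

```python
def ground_distance(v, v_horizon, focal_px, cam_height_m):
    # flat-surface (inverse perspective) depth: a pixel row v below the
    # horizon row v_horizon maps to distance d = f * h / (v - v_horizon)
    if v <= v_horizon:
        raise ValueError("pixel lies at or above the horizon")
    return focal_px * cam_height_m / (v - v_horizon)

# hypothetical setup: focal length 1000 px, camera 10 m above the water;
# a waterline point imaged 50 px below the horizon is then 200 m away
d = ground_distance(v=550, v_horizon=500, focal_px=1000.0, cam_height_m=10.0)
```

The formula also shows why monocular depth degrades near the horizon: the distance grows as 1/(v − v_horizon), so a one-pixel detection error translates into a large range error for far-away ships.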
This paper presents the goals, service design approach, and results of the project “Accessible Tourism around Lake Constance”, which is currently run by different universities, industrial partners and selected hotels in Switzerland, Germany and Austria. In the 1st phase, interviews with different persons with disabilities and elderly persons were conducted to identify the barriers and pains faced by tourists who want to spend their holidays in the region of Lake Constance, as well as possible assistive technologies that help to overcome these barriers. The analysis of the interviews shows that one third of the pains and barriers are due to missing, insufficient, wrong or inaccessible information about the accessibility of the accommodation, surroundings, and points of interest during the planning phase of the holidays. Digital assistive technologies hence play a major role in bridging this information gap. In the 2nd phase, so-called Hotel-Living-Labs (HLL) were established where the identified assistive technologies can be evaluated. Based on these HLLs, an overall service for accessible holidays has been designed and developed. In the last phase, this service has been implemented based on the HLLs as well as the identified assistive technologies and is currently being field-tested with tourists with disabilities from the three participating countries.
The ageing infrastructure in ports requires regular inspection. This inspection is currently carried out manually by divers who sense the entire underwater infrastructure by hand. This process is cost-intensive as it involves a lot of time and human resources. To overcome these difficulties, we propose to scan the above- and underwater port structure with a multi-sensor system and, by a fully automated process, to classify the obtained point cloud into damaged and undamaged zones. We make use of simulated training data to test our approach, since not enough training data with corresponding class labels are available yet. To that aim, we build a rasterised heightfield of a point cloud of a sheet pile wall by cutting it into vertical slices. The distance from each slice to the corresponding line generates the heightfield. The latter is propagated through a convolutional neural network which detects anomalies. We use the VGG19 deep neural network model pretrained on natural images. This neural network has 19 layers and is often used for image recognition tasks. We showed that our approach can achieve fully automated, reproducible, quality-controlled damage detection which is able to analyse the whole structure instead of the sample-wise manual method with divers. The mean true positive rate is 0.98, which means that we detected 98 % of the damages in the simulated environment.
Modeling a suitable birth density is a challenge when using Bernoulli filters such as the Labeled Multi-Bernoulli (LMB) filter. The birth density of newborn targets is unknown in most applications, but must be given as a prior to the filter. Usually the birth density stays unchanged or is designed based on the measurements from previous time steps.
In this paper, we assume that the true initial state of new objects is normally distributed. The expected value and covariance of the underlying density are unknown parameters. Using the estimated multi-object state of the LMB and the Rauch-Tung-Striebel (RTS) recursion, these parameters are recursively estimated and adapted after a target is detected.
The main contribution of this paper is an algorithm to estimate the parameters of the birth density and its integration into the LMB framework. Monte Carlo simulations are used to evaluate the detection driven adaptive birth density in two scenarios. The approach can also be applied to filters that are able to estimate trajectories.
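The recursive estimation of the birth density's parameters can be sketched as a running (Welford-style) update over the smoothed initial states of newly detected targets. This is a simplified illustration of the idea, not the paper's exact integration into the LMB framework:

```python
import numpy as np

def update_birth_prior(mean, cov, n, x_new):
    # running update of the birth density parameters with a newly
    # detected target's (RTS-smoothed) initial state x_new;
    # cov is the population covariance over the n states seen so far
    n_new = n + 1
    delta = x_new - mean
    mean_new = mean + delta / n_new
    cov_new = (n * cov + np.outer(delta, x_new - mean_new)) / n_new
    return mean_new, cov_new, n_new

rng = np.random.default_rng(2)
states = rng.normal(size=(100, 2))   # toy "initial states" of detected targets
mean, cov, n = states[0], np.zeros((2, 2)), 1
for x in states[1:]:
    mean, cov, n = update_birth_prior(mean, cov, n, x)
```

After processing all detections, the recursion reproduces the batch sample mean and covariance exactly, while only the current estimate has to be stored between time steps.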
The expansion of a given multivariate polynomial into Bernstein polynomials is considered. Matrix methods for the calculation of the Bernstein expansion of the product of two polynomials and of the Bernstein expansion of a polynomial from the expansion of one of its partial derivatives are provided which allow also a symbolic computation.
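For the univariate case, the Bernstein expansion of a product follows from the identity B_{i,m} B_{j,n} = [C(m,i) C(n,j) / C(m+n,i+j)] B_{i+j,m+n}. A minimal sketch of the resulting coefficient formula (the paper's matrix formulation for multivariate polynomials is more general):

```python
from math import comb
import numpy as np

def bernstein_product(a, b):
    # Bernstein coefficients of p*q from coefficients a (degree m), b (degree n):
    # c_k = sum_i C(m,i) C(n,k-i) a_i b_{k-i} / C(m+n,k)
    m, n = len(a) - 1, len(b) - 1
    c = np.zeros(m + n + 1)
    for k in range(m + n + 1):
        s = 0.0
        for i in range(max(0, k - n), min(m, k) + 1):
            s += comb(m, i) * comb(n, k - i) * a[i] * b[k - i]
        c[k] = s / comb(m + n, k)
    return c

def bernstein_eval(coeffs, t):
    # evaluate a polynomial given in Bernstein form on [0, 1]
    d = len(coeffs) - 1
    return sum(c * comb(d, i) * t**i * (1 - t)**(d - i)
               for i, c in enumerate(coeffs))

a = np.array([1.0, 2.0, 0.5])   # degree-2 polynomial in Bernstein form
b = np.array([0.0, 1.0])        # degree-1 polynomial
c = bernstein_product(a, b)     # degree-3 Bernstein coefficients of the product
```

Since all weights are ratios of binomial coefficients, the same computation can be carried out symbolically, in line with the abstract's remark on symbolic computation.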
We propose and apply a requirements engineering approach that focuses on security and privacy properties and takes into account various stakeholder interests. The proposed methodology facilitates the integration of security and privacy by design into the requirements engineering process. Thus, specific, detailed security and privacy requirements can be implemented from the very beginning of a software project. The method is applied to an exemplary application scenario in the logistics industry. The approach includes the application of threat and risk rating methodologies, a technique to derive technical requirements from legal texts, as well as a matching process to avoid duplication and accumulate all essential requirements.
We present source code patterns that are difficult for modern static code analysis tools. Our study comprises 50 different open source projects in both a vulnerable and a fixed version for XSS vulnerabilities reported with CVE IDs over a period of seven years. We used three commercial and two open source static code analysis tools. Based on the reported vulnerabilities we discovered code patterns that appear to be difficult to classify by static analysis. The results show that code analysis tools are helpful, but still have problems with specific source code patterns. These patterns should be a focus in training for developers.
The evaluation of the effectiveness of different machine learning algorithms on a publicly available database of signals derived from wearable devices is presented, with the goal of optimizing human activity recognition and classification. Among the wide range of body signals, we chose two signals, namely photoplethysmographic (optically detected subcutaneous blood volume) and tri-axis acceleration signals, that can easily be acquired simultaneously using widespread commercial devices (e.g. smartwatches) as well as custom wearable wireless devices designed for sport, healthcare, or clinical purposes. To this end, two widely used algorithms (decision tree and k-nearest neighbor) were tested, and their performance was compared to two recent algorithms (particle Bernstein and a Monte Carlo-based regression), both in terms of accuracy and processing time. A data preprocessing phase was also considered to improve the performance of the machine learning procedures and to reduce the problem size; a detailed analysis of the compression strategy and its results is also presented.
Good sleep is crucial for a healthy life of every person. Unfortunately, its quality often decreases with aging. A common approach to measuring sleep characteristics is based on interviews with the subjects or letting them fill in a daily questionnaire and afterward evaluating the obtained data. However, this method carries time and personnel costs for the interviewer and the evaluator of responses. Therefore, it would be important to collect and evaluate sleep characteristics automatically. To do that, it is necessary to investigate the level of agreement between measurements performed in the traditional way using questionnaires and measurements obtained using electronic monitoring devices. The study presented in this manuscript performs this investigation, comparing sleep characteristics such as "time going to bed", "total time in bed", "total sleep time" and "sleep efficiency". A total of 106 night records of elderly persons (aged 65+) were analyzed. The results achieved so far reveal that the degree of agreement between the two measurement methods varies substantially for different characteristics, from 31 minutes of mean difference for "time going to bed" to 77 minutes for "total sleep time". For this reason, a direct exchange of the objective and subjective measuring methods is currently not possible.
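The degree of agreement between two measurement methods is often summarized in Bland-Altman style, i.e. as the mean difference (bias) and 95 % limits of agreement. A sketch with made-up example values, not the study data:

```python
import numpy as np

def agreement(method_a, method_b):
    # Bland-Altman style summary: mean difference (bias) and
    # 95 % limits of agreement between two measurement methods
    diff = np.asarray(method_a, float) - np.asarray(method_b, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# made-up "total sleep time" values in minutes for five nights
objective = [420, 390, 450, 400, 430]
subjective = [400, 380, 470, 390, 420]
bias, (low, high) = agreement(objective, subjective)
```

A small bias with wide limits of agreement, as reported above for "total sleep time", is exactly the case where the two methods cannot simply be exchanged.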
Polysomnography is the gold standard for a sleep study, and it provides very accurate results, but its costs (both personnel and material) are quite high. Therefore, the development of a low-cost system for overnight breathing and heartbeat monitoring, which provides more comfort while recording the data, is a well-motivated challenge. The system proposed in this manuscript is based on resistive pressure sensors installed under the mattress. These sensors can measure the slight pressure changes provoked by breathing and heartbeat. The captured signal requires advanced processing, such as filtering and amplification, before the analog signal is ready for the next step. Then, the output signal is digitalized and further processed by an algorithm that performs custom filtering before it can recognize breathing and heart rate in real time. The result can be directly visualized. Furthermore, a CSV file is created containing the raw data, timestamps, and unique IDs to facilitate further processing. The achieved results are promising, and the average deviation from a reference device is about 4 bpm.
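The real-time heart rate recognition step can be approximated by detrending and peak picking. This is a deliberately simplified sketch on a synthetic signal (the actual system uses custom analog filtering and its own algorithm); the signal model and all thresholds are our own assumptions:

```python
import numpy as np

def heart_rate_bpm(signal, fs, min_gap_s=0.4):
    # detrend with a 1-second moving average, then pick local maxima
    # above a threshold, enforcing a refractory gap between beats
    kernel = np.ones(int(fs)) / int(fs)
    x = signal - np.convolve(signal, kernel, mode="same")
    thresh = 0.5 * x.max()
    peaks, last = [], -int(min_gap_s * fs)
    for i in range(1, len(x) - 1):
        if (x[i] > thresh and x[i] >= x[i - 1] and x[i] > x[i + 1]
                and i - last >= min_gap_s * fs):
            peaks.append(i)
            last = i
    return 60.0 * fs / np.diff(peaks).mean()

# synthetic pressure signal: slow 0.25 Hz "breathing" baseline plus
# narrow 1 Hz (60 bpm) "heartbeat" pulses
fs = 100
t = np.arange(0, 30, 1 / fs)
sig = 0.5 * np.sin(2 * np.pi * 0.25 * t) + np.maximum(np.sin(2 * np.pi * t), 0) ** 20
rate = heart_rate_bpm(sig, fs)
```

The moving-average detrend removes the large, slow breathing component so that the much smaller heartbeat pulses become detectable, mirroring why the real system needs filtering before rate estimation.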