Many countries offer state credit guarantees to support credit-constrained exporters. The policy instrument is commonly justified by governments as a means of mitigating adverse outcomes of financial market frictions for exporting firms. Accumulated returns to the German state credit guarantee scheme deriving from risk-compensating premia have outweighed accumulated losses over the past 60 years. Why do private financial agents not step in and provide insurance, given that the state-run program yields positive returns? We argue that costs of risk diversification, liquidity management, and coordination among creditors limit the ability of private financial agents to offer comparable insurance products. Moreover, we suggest that the government’s greater effectiveness in recovering claims in foreign countries endows the state with a cost advantage in dealing with the risks involved in large export projects. We test these hypotheses using monthly firm-level data combined with official transaction-level data on covered exports of German firms and find suggestive evidence that positive effects on trade are due to mitigated financial constraints: State credit guarantees benefit firms that are dependent on external finance, if the value at risk which they seek to cover is large, and at times when refinancing conditions on the private financial market are tight.
In this paper, a systematic comparison of three different advanced control strategies for automated docking of a vessel is presented. The controllers are automatically tuned offline by an optimization process using simulations of the whole system, including the trajectory planner and the state and disturbance observer. Performance and robustness are then investigated using Monte Carlo simulations with varying model parameters and disturbances. The control strategies have also been tested in full-scale experiments using the solar research vessel Solgenia. All investigated control strategies demonstrated very good performance in both simulation and real-world experiments. Videos are available at https://www.htwg-konstanz.de/forschung-und-transfer/institute-und-labore/isd/regelungstechnik/videos/
This paper describes the development of a control system for an industrial heating application. In this process, a moving substrate passes through a heating zone with variable speed. Heat is applied to the substrate by hot air, with the air flow rate being the manipulated variable. The aim is to control the substrate’s temperature at a specific location after it passes the heating zone. First, a model is derived for a point attached to the moving substrate. This is then modified to reflect the temperature of the moving substrate at the specified location. To regulate the temperature, a nonlinear model predictive control approach is applied, using an implicit Euler scheme to integrate the model and an augmented gradient-based optimization approach. The performance of the controller has been validated both by simulations and by experiments on the physical plant. The respective results are presented in this paper.
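The implicit Euler scheme used for model integration can be sketched as follows. The first-order thermal model, its parameters, and the fixed-point solver below are hypothetical stand-ins for illustration, not the plant model from the paper:

```python
def implicit_euler_step(f, t, x, u, dt, tol=1e-10, max_iter=100):
    """One implicit Euler step, x_next = x + dt * f(t + dt, x_next, u),
    solved by fixed-point iteration (sufficient for this mildly stiff
    scalar model; a Newton iteration would be used in general)."""
    x_next = x  # initial guess: previous state
    for _ in range(max_iter):
        x_new = x + dt * f(t + dt, x_next, u)
        if abs(x_new - x_next) < tol:
            break
        x_next = x_new
    return x_next

# Hypothetical first-order heating model: the substrate temperature T
# is driven toward the hot-air temperature at a rate that grows with
# the air flow rate u, while losing heat to the ambient.
T_HOT, T_AMB = 200.0, 20.0
def substrate_model(t, T, u):
    return u * (T_HOT - T) - 0.05 * (T - T_AMB)

# Simulate 50 s with a constant flow rate of u = 0.2.
T, dt = 20.0, 0.5
for k in range(100):
    T = implicit_euler_step(substrate_model, k * dt, T, 0.2, dt)
```

With these made-up parameters the temperature settles at the steady state where heating and losses balance, T = 164 °C.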
This paper presents a generic method to enhance performance and incorporate temporal information for cardiorespiratory-based sleep stage classification with a limited feature set and limited data. The classification algorithm relies on random forests and a feature set extracted from long-time home monitoring for sleep analysis. Employing temporal feature stacking, the system could be significantly improved in terms of Cohen’s κ and accuracy. By stacking features before and after the epoch to be classified, the detection performance could be improved for three classes of sleep stages (Wake, REM, Non-REM sleep), four classes (Wake, Non-REM light sleep, Non-REM deep sleep, REM sleep), and five classes (Wake, N1, N2, N3/4, REM sleep) from a κ of 0.44 to 0.58, 0.33 to 0.51, and 0.28 to 0.44, respectively. Further analysis addressed the optimal length and combination method for this stacking approach. Overall, three methods and a variable duration between 30 s and 30 min were analyzed. Overnight recordings of 36 healthy subjects from the Interdisciplinary Center for Sleep Medicine at Charité-Universitätsmedizin Berlin and leave-one-out cross-validation at the patient level were used to validate the method.
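Temporal feature stacking, i.e. concatenating each epoch's feature vector with those of its neighbors, can be sketched as follows. The window sizes and edge-padding strategy are illustrative assumptions, not the combination methods evaluated in the paper:

```python
def stack_features(epochs, before=2, after=2):
    """Concatenate each epoch's feature vector with those of its
    `before` preceding and `after` following epochs; the edges of the
    recording are padded by repeating the first/last epoch."""
    n = len(epochs)
    stacked = []
    for i in range(n):
        window = []
        for j in range(i - before, i + after + 1):
            j = min(max(j, 0), n - 1)  # clamp at the recording edges
            window.extend(epochs[j])
        stacked.append(window)
    return stacked

# Four epochs with 2 features each -> each stacked vector has
# (before + 1 + after) * 2 = 10 features.
feats = [[0.1, 1.0], [0.2, 0.9], [0.3, 0.8], [0.4, 0.7]]
stacked = stack_features(feats, before=2, after=2)
```

The stacked vectors can then be fed to any per-epoch classifier, such as the random forests used in the study.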
A nonlinear mathematical model for the dynamics of permanent magnet synchronous machines with interior magnets is discussed. The model of the current dynamics captures saturation and dependency on the rotor angle. Based on the model, a flatness-based field-oriented closed-loop controller and a feed-forward compensation of torque ripples are derived. Effectiveness and robustness of the proposed algorithms are demonstrated by simulation results.
Innovation Labs
(2021)
Today's increasing pace of change and intense competition places demands on organizations to use a different approach to innovation, going beyond the incremental innovation that is typically developed within the core of the organization. As an option to escape the existing beliefs of the core organization, innovation labs are used to develop more discontinuous innovation. Despite the abundance of these so-called innovation labs in practice, researchers have devoted little effort to scrutinizing the concept and to providing managers with a framework for exploiting this form of innovation. In this paper, we aim to perform an empirical investigation and to create a consensus around the concept of innovation labs. To do so, we conducted a multiple case study in large international organizations with a total of 31 interviews of an average length of 70 minutes. We offer a framework by identifying four innovation lab types and consider when each is most appropriate. Furthermore, we highlight the importance for managers and their organizations of aligning the strategic intent with the innovation lab type as well as the interface between the innovation lab and the core business.
The main objective of this paper is to revisit the Euro method in a critical and constructive way. We have analysed some arguments against the Euro method published recently in the literature, as well as some other relevant aspects of the SUT-Euro and SUT-RAS methods not covered before. Although the Euro method is not perfect, we believe that there is still room for its use in updating/regionalizing Supply and Use tables.
Despite the increased attention dedicated to research on the antecedents and determinants of new venture survival in entrepreneurship, defining and capturing survival as an outcome represents a challenge in quantitative studies. This paper creates awareness of ventures being inactive while still classified as surviving based on the available data. We describe this as the ‘living dead’ phenomenon, arguing that it potentially affects the empirical results of survival studies. Based on a systematic literature review, we find that this issue of inactivity has not been sufficiently considered in previous new venture survival studies. Based on a sample of 501 New Technology-Based Firms, we empirically illustrate that the classification of living dead ventures into either survived or failed can impact the factors determining survival. On this basis, we contribute to an understanding of the issue by defining the ‘living dead’ phenomenon and by proposing recommendations for research practice to solve this issue in survival studies, taking the data source, the period under investigation and the sample size into account.
Text produced by entrepreneurs represents a data source in entrepreneurship research on venture performance and fund-raising success. Manual text coding of single variables is increasingly assisted or replaced by computer-aided text analysis. Yet, for the development of prediction models with several variables, such dictionary-based text analysis methods are less suitable. Natural language processing techniques are an alternative; however, the implementation is more complex and requires substantial programming skills. More work is required to understand how text analytics can advance entrepreneurship research. This study hence experiments with different artificial intelligence methods rooted in Natural Language Processing and deep learning. It uses 766 business plans to train a model for the automated measurement of transaction relations, a construct which is an indicator for new technology-based firm survival. Empirical findings show that the accuracy of construct measurement can be significantly increased with automated methods and improves with larger amounts of training data. Language complexity sets limits to the precision of automated construct measurement though. We therefore recommend a hybrid approach: making use of the inherent advantages of combining automated with human coding until the amount of training data is sufficiently large to substitute the human coding completely. The study provides insights into the applicability of different text analytics methods in entrepreneurship research and points at future research potential.
Respiratory diseases are leading causes of death and disability worldwide. The recent COVID-19 pandemic also affects the respiratory system. Detecting and diagnosing respiratory diseases requires both medical professionals and a clinical environment, and most of the techniques used to date have been invasive or expensive.
Some research groups are developing hardware devices and techniques that make non-invasive or even remote respiratory sound acquisition possible. These sounds are then processed and analysed for clinical, scientific, or educational purposes.
We present a literature review of non-invasive sound acquisition devices and techniques.
The results cover a large number of digital tools, such as microphones, wearables, and Internet of Things devices, that can be used in this scope.
Several interesting applications were found: some devices make sound acquisition easier in a clinical environment, while others make daily monitoring outside that setting possible. We aim to use some of these devices and to include the non-invasively recorded respiratory sounds in a Digital Twin system for personalized health.
Matrix methods for the computation of bounds for the range of a complex polynomial and its modulus over a rectangular region in the complex plane are presented. The approach relies on the expansion of the given polynomial into Bernstein polynomials. The results are extended to multivariate complex polynomials and rational functions.
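The Bernstein enclosure property that underlies this approach can be illustrated in its simplest univariate real form: the range of a polynomial on [0, 1] is contained between the minimum and maximum of its Bernstein coefficients. This is a sketch of the principle only, not the matrix methods or the complex rectangular regions treated in the paper:

```python
from math import comb

def bernstein_enclosure(coeffs):
    """Enclose the range of p(x) = sum a_k x^k on [0, 1] by the min and
    max of its Bernstein coefficients b_i = sum_{k<=i} C(i,k)/C(n,k) a_k."""
    n = len(coeffs) - 1
    b = [sum(comb(i, k) / comb(n, k) * coeffs[k] for k in range(i + 1))
         for i in range(n + 1)]
    return min(b), max(b)

# p(x) = x^2 - x has the exact range [-0.25, 0] on [0, 1];
# the Bernstein coefficients give the enclosure [-0.5, 0].
lo, hi = bernstein_enclosure([0.0, -1.0, 1.0])
```

The enclosure is generally conservative, but it converges to the exact range under degree elevation or subdivision, which is what makes the expansion useful for certified bounds.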
This paper examines the interdependencies of tourism, Buddhism and sustainability, combining in-depth interviews with Buddhism experts and non-participant observation in a mixed-method approach. The area under investigation is the Alpine region of Austria, Germany and Switzerland, since it is home to Asian and Western forms of Buddhism tourism alike. Results show that Buddhism tourism, as a value-based activity, is not commercial per se; however, since demand is rising, tendencies towards more commercial forms can be observed. As a modest form of activity, Buddhism tourism does not shape the landscape of the Alpine area, and by its nature it incorporates sustainability.
Generalized Concatenated Codes over Gaussian and Eisenstein Integers for Code-Based Cryptography
(2021)
The code-based McEliece and Niederreiter cryptosystems are promising candidates for post-quantum public-key encryption. Recently, q-ary concatenated codes over Gaussian integers were proposed for the McEliece cryptosystem together with the one-Mannheim error channel, where the error values are limited to Mannheim weight one. Due to the limited error values, the codes over Gaussian integers achieve a higher error correction capability than maximum distance separable (MDS) codes with bounded minimum distance decoding. This higher error correction capability improves the work factor regarding decoding attacks based on information-set decoding. The codes also enable a low complexity decoding algorithm for decoding beyond the guaranteed error correction capability. In this work, we extend this coding scheme to codes over Eisenstein integers. These codes have advantages for the Niederreiter system. Additionally, we propose an improved code construction based on generalized concatenated codes. These codes extend the rate region where the work factor is beneficial compared to MDS codes. Moreover, generalized concatenated codes are more robust against structural attacks than ordinary concatenated codes.
The performance and reliability of non-volatile NAND flash memories deteriorate as the number of program/erase cycles grows. The reliability also suffers from cell-to-cell interference, long data retention times, and read disturb. These processes affect the read threshold voltages. The aging of the cells causes voltage shifts, which lead to high bit error rates (BER) with fixed pre-defined read thresholds. This work proposes two methods that aim at minimizing the BER by adjusting the read thresholds. Both methods utilize the number of errors detected in the codeword of an error correction code. It is demonstrated that the observed number of errors is a good measure for the voltage shifts and can be used for the initial calibration of the read thresholds. The second approach is a gradual channel estimation method that exploits the asymmetrical probabilities of the one-to-zero and zero-to-one errors caused by threshold calibration errors. Both methods are investigated using the mutual information between the optimal read voltage and the measured error values.
Numerical results obtained from flash measurements show that these methods reduce the BER of NAND flash memories significantly.
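The first calibration idea, treating the detected error count as a proxy for the voltage shift and sweeping the read threshold to minimize it, can be illustrated with a toy channel model. The Gaussian voltage distributions and all parameters below are assumptions for illustration, not measured flash data or the paper's algorithm:

```python
import random

random.seed(1)

# Hypothetical aged-cell model: erased cells (bit 1) and programmed
# cells (bit 0) have Gaussian threshold-voltage distributions whose
# means have drifted away from the factory read threshold.
bits = [random.randint(0, 1) for _ in range(5000)]
volt = [random.gauss(1.0, 0.35) if b == 0 else random.gauss(-0.4, 0.35)
        for b in bits]

def read(threshold):
    """Hard read: cells above the threshold are read as 0."""
    return [0 if v > threshold else 1 for v in volt]

def count_errors(threshold):
    """Stand-in for the error count reported by the ECC decoder."""
    return sum(r != b for r, b in zip(read(threshold), bits))

# Calibration: sweep candidate read thresholds and keep the one with
# the fewest detected errors (here -0.5 V ... 1.0 V in 50 mV steps).
candidates = [i / 20 for i in range(-10, 21)]
best = min(candidates, key=count_errors)
```

For two equal-variance Gaussians the minimizing threshold sits near the midpoint of the two means (0.3 V in this toy model), which is what the error count reveals without ever measuring the voltages directly.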
This paper describes the rationale and the development of a structured digital approach for measuring corporate environmental sustainability using performance metrics.
Preserving the environment has become indispensable, including in the corporate world. Currently, however, sustainability is usually considered only rudimentarily in companies, often in the form of a few written statements on the website. This will no longer be sufficient in the future, which is why companies should record sustainability on a numerical basis. Based on the development of a workable concept for companies, a small empirical study was carried out, which can be used to numerically measure the sustainability performance of companies. Two utility analyses were completed.
One of them was supplemented by expert interviews. Well-known practitioners from the business world were interviewed and asked for their assessment of ecological performance indicators. The result of the research is an indicator-based concept that can be applied in corporate practice to determine ecological sustainability performance.
Deep transformation models
(2021)
We present a deep transformation model for probabilistic regression. Deep learning is known for outstandingly accurate predictions on complex data, but in regression tasks it is predominantly used to just predict a single number. This ignores the non-deterministic character of most tasks. Especially if crucial decisions are based on the predictions, as in medical applications, it is essential to quantify the prediction uncertainty. The presented deep learning transformation model estimates the whole conditional probability distribution, which is the most thorough way to capture uncertainty about the outcome. We combine ideas from a statistical transformation model (most likely transformation) with recent transformation models from deep learning (normalizing flows) to predict complex outcome distributions. The core of the method is a parameterized transformation function which can be trained with the usual maximum likelihood framework using gradient descent. The method can be combined with existing deep learning architectures. For small machine learning benchmark datasets, we report state-of-the-art performance for most datasets and in some cases even exceed it. Our method works for complex input data, which we demonstrate by employing a CNN architecture on image data.
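The core idea, a parameterized transformation trained by maximum likelihood via the change-of-variables formula, can be shown in a deliberately minimal form: a linear transformation onto a standard normal base distribution, fitted by plain gradient descent. The paper uses far more flexible transformations and neural networks; everything below is a simplified sketch with made-up data:

```python
import math, random

# Toy data: a Gaussian sample with unknown mu = 5 and sigma = 2.
random.seed(0)
ys = [random.gauss(5.0, 2.0) for _ in range(500)]
n = len(ys)

# Linear transformation h(y) = a + exp(c) * y with a standard normal
# base distribution. By the change-of-variables formula the per-sample
# negative log-likelihood is 0.5 * h(y)^2 + const - c.
def fit(iterations=2000, lr=0.05):
    a, c = 0.0, 0.0
    for _ in range(iterations):
        b = math.exp(c)  # exp keeps the slope positive (monotone h)
        grad_a = sum(a + b * y for y in ys) / n                   # dNLL/da
        grad_c = sum((a + b * y) * y * b - 1.0 for y in ys) / n   # dNLL/dc
        a, c = a - lr * grad_a, c - lr * grad_c
    return a, math.exp(c)

a, b = fit()
# For Gaussian data, maximum likelihood makes h standardize the sample:
# b approaches 1/sigma = 0.5 and a approaches -mu/sigma = -2.5.
```

The same training principle carries over when h is a deep, monotone transformation: the log-likelihood is the base density of h(y) plus the log-derivative of h, optimized by gradient descent.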
Ballistocardiography (BCG) can be used to monitor heart rate activity. The accelerometer used should have high sensitivity and minimal internal noise; in addition, a low-cost approach was taken into consideration. Several measurements were executed to determine the optimal positioning of a sensor under the mattress to obtain a signal strong enough for further analysis. A prototype of an unobtrusive accelerometer-based measurement system was developed and tested in a conventional bed without any specific extras. The influence of the human sleep position on the accelerometer output data was tested. The obtained results indicate the potential to capture BCG signals using accelerometers. The measurement system can detect heart rate unobtrusively in the home environment.
Introduction. Despite its high accuracy, polysomnography (PSG) has several drawbacks for diagnosing obstructive sleep apnea (OSA). Consequently, multiple portable monitors (PMs) have been proposed. Objective. This systematic review aims to investigate the current literature to analyze the sets of physiological parameters captured by a PM to select the minimum number of such physiological signals while maintaining accurate results in OSA detection. Methods. Inclusion and exclusion criteria for the selection of publications were established prior to the search. The evaluation of the publications was made based on one central question and several specific questions. Results. The abilities to detect hypopneas, sleep time, or awakenings were some of the features studied to investigate the full functionality of the PMs to select the most relevant set of physiological signals. Based on the physiological parameters collected (one to six), the PMs were classified into sets according to the level of evidence. The advantages and the disadvantages of each possible set of signals were explained by answering the research questions proposed in the methods. Conclusions. The minimum number of physiological signals detected by PMs for the detection of OSA depends mainly on the purpose and context of the sleep study. The set of three physiological signals showed the best results in the detection of OSA.
In today's volatile market environments, companies must be able to continuously innovate. In this context, innovation does not only refer to the development of new products or business models but often also affects the entire organization, which has to transform its structures, processes, and ways of working. Corporate entrepreneurship (CE) programs are often used by established companies to address these innovation and transformation challenges. In general, they are understood as formalized entrepreneurial activities to (1) support internal corporate ventures or (2) work with external startups. The organizational design and value creation of CE programs exhibit a high degree of heterogeneity. On the one hand, this heterogeneity makes CE programs a valuable management tool that can be used for many purposes. On the other hand, it can be seen as a reason for the current challenges that companies experience in effectively using and managing CE programs. By systematically analyzing 54 different cases in established companies in Germany, Switzerland, and Austria, this study contributes to a better understanding of the heterogeneity of CE programs. The taxonomic approach provides clearly defined types of CE programs that are distinguished according to their organizational design and the outputs they generate.
The development of home health systems can provide continuous and user-friendly monitoring of key health parameters. This project aims to create a concept for such a system, implement it on a test basis, and evaluate it. Three health areas were selected for this purpose:
Sleep, Stress, and Rehabilitation. Appropriate devices were installed in the homes of test subjects and used by them for two weeks. In addition, relevant questionnaires were completed to obtain a complete picture. Finally, the implemented system was evaluated, and the results of the conducted study showed that home health systems have great potential. However, it is necessary to consider some points to increase the usability of the system and the motivation of the users. Among others, ease of use of the equipment is of utmost importance.
Ferromagnetism is of increasing importance in the growing field of electromobility and data storage. In stable austenitic steels, the occurrence of ferromagnetism is not expected and would also interfere with many applications. However, ferromagnetism in austenitic stainless steels after low-temperature nitriding has already been shown in the past. Herein, the presence of ferromagnetism in austenitic steels is discovered after low-temperature carburization (Kolsterizing), which represents a novel and unique finding. A zone of expanded austenite is established on various austenitic stainless steels by low-temperature carburization and the respective ferromagnetism is investigated in relation to the alloy composition. The ferromagnetism occurring is determined by means of a commercial magnetoinductive sensor (Feritscope). Ferromagnetic domains are visualized by magnetic force microscopy and a ferrofluid. X-ray diffraction measurements indicate a clear difference in the lattice expansion of the different alloys. Furthermore, a different appearance of the magnetizable microstructure regions (magnetic domain structure) is detected depending on the grain orientation determined by electron backscatter diffraction (EBSD). Strongly pronounced magnetic domains show no linear lattice defects, whereas in small magnetizable areas linear lattice defects are detected by electron channeling contrast imaging and EBSD.
We compared vulnerable and fixed versions of the source code of 50 different PHP open source projects based on CVE reports for SQL injection vulnerabilities. We scanned the source code with commercial and open source tools for static code analysis. Our results show that five current state-of-the-art tools have issues correctly marking vulnerable and safe code. We identify 25 code patterns that are not detected as a vulnerability by at least one of the tools and 6 code patterns that are mistakenly reported as a vulnerability that cannot be confirmed by manual code inspection. Knowledge of the patterns could help vendors of static code analysis tools, and software developers could be instructed to avoid patterns that confuse automated tools.
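The study analyzes PHP code, but the root cause and its fix are language-independent. As a hedged illustration in Python (this is not one of the 25 patterns from the paper), the following contrasts the vulnerable string-concatenation pattern with a parameterized query, using the standard-library sqlite3 module:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

user_input = "x' OR '1'='1"   # classic injection payload

# Vulnerable pattern: string concatenation places the payload into the
# SQL text itself, so the injected OR clause matches every row.
vulnerable = conn.execute(
    "SELECT name FROM users WHERE name = '" + user_input + "'").fetchall()

# Safe pattern: a bound parameter is treated as data, never as SQL,
# so the lookup for this literal (and nonexistent) name returns nothing.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)).fetchall()
```

Static analysis tools essentially try to distinguish these two shapes in code; the 25 undetected patterns reported in the paper are variations that obscure the first shape from the tools.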
With the increasing challenges of the 21st century, such as a rapidly growing population, increasing hunger and the destruction of the environment, the demand for sustainable and future-oriented ways of living is growing. To meet this demand, a residential district named Maun Science Park is being built in Botswana to develop a resilient society. In addition to the application of modern technology to optimise the use of resources, the environmentally friendly construction of the buildings is another goal of the project. This thesis investigates the prefabrication of rammed earth in terms of implementation and profitability for the Maun Science Park.
For this purpose, the specific properties, handling, as well as the application of the building material in prefabrication are first discussed.
This is followed by an investigation of how the work processes of prefabrication can be implemented in the Maun Science Park. Based on this, a profitability test is carried out using a break-even and sensitivity analysis.
The analyses showed that the investment in prefabrication is not profitable within the assumed production volume, which is due to the high fixed costs. These are primarily generated by the two main cost drivers, consisting of the new construction of the production hall and the rental of heavy construction equipment.
Lastly, recommendations for action were formulated that target cost reductions both in the two main cost drivers and in other decisive factors.
Nowadays, the inexpensive memory space promotes an accelerating growth of stored image data. To exploit the data using supervised Machine or Deep Learning, it needs to be labeled. Manually labeling the vast amount of data is time-consuming and expensive, especially if human experts with specific domain knowledge are indispensable. Active learning addresses this shortcoming by querying the user for the labels of the most informative images first. One way to obtain the ‘informativeness’ is by using uncertainty sampling as a query strategy, where the system queries those images it is most uncertain about how to classify. In this paper, we present a web-based active learning framework that helps to accelerate the labeling process. After manually labeling some images, the user gets recommendations of further candidates that could potentially be labeled equally (bulk image folder shift). We aim to explore the most efficient ‘uncertainty’ measure to improve the quality of the recommendations such that all images are sorted with a minimum number of user interactions (clicks). We conducted experiments using a manually labeled reference dataset to evaluate different combinations of classifiers and uncertainty measures. The results clearly show the effectiveness of an uncertainty sampling with bulk image shift recommendations (our novel method), which can reduce the number of required clicks to only around 20% compared to manual labeling.
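Uncertainty sampling as described above can be sketched with a least-confidence measure (one of several possible 'uncertainty' measures of the kind the paper compares); the classifier outputs below are made-up numbers:

```python
def least_confidence_rank(probabilities):
    """Rank unlabeled samples by uncertainty: samples whose top class
    probability is lowest come first and should be labeled first."""
    uncertainty = [1.0 - max(p) for p in probabilities]
    return sorted(range(len(probabilities)),
                  key=lambda i: uncertainty[i], reverse=True)

# Hypothetical classifier outputs for four unlabeled images
# (rows sum to 1 over three classes).
probs = [
    [0.98, 0.01, 0.01],   # confident -> queried last
    [0.40, 0.35, 0.25],   # most uncertain -> queried first
    [0.70, 0.20, 0.10],
    [0.55, 0.30, 0.15],
]
order = least_confidence_rank(probs)
```

In the labeling loop, the user is asked about the images at the front of `order` first; alternatives such as margin or entropy sampling only change how `uncertainty` is computed.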
This paper examines how varying antidumping methodologies applied within the World Trade Organization differ in the extent to which they reduce targeted exports. We show that antidumping duties, on average, hit Chinese exporters harder than those of other targeted countries. This difference can be traced back in part to China's non-market economy status, which affects the way antidumping duties are calculated. Furthermore, we show that the type of imposed duty matters, as ad-valorem duties affect exports differently compared to specific duties or duties conditional on the export price. Overall, however, antidumping duties remain effective in reducing imports independent of market economy status.
Modular arithmetic over integers is required for many cryptography systems. Montgomery reduction is an efficient algorithm for the modulo reduction after a multiplication. Typically, Montgomery reduction is used for rings of ordinary integers. In contrast, we investigate the modular reduction over rings of Gaussian integers. Gaussian integers are complex numbers where the real and imaginary parts are integers. Rings over Gaussian integers are isomorphic to ordinary integer rings. In this work, we show that Montgomery reduction can be applied to Gaussian integer rings. Two algorithms for the precision reduction are presented. We demonstrate that the proposed Montgomery reduction enables an efficient Gaussian integer arithmetic that is suitable for elliptic curve cryptography. In particular, we consider the elliptic curve point multiplication according to the randomized initial point method, which is protected against side-channel attacks. The implementation of this protected point multiplication is significantly faster than comparable algorithms over ordinary prime fields.
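For readers unfamiliar with the underlying technique, the classical Montgomery reduction (REDC) over ordinary integers, which is the starting point the paper extends to Gaussian integers, works as follows. The modulus and radix are small illustrative values; the Gaussian-integer variant itself is not reproduced here:

```python
def montgomery_reduce(t, n, r, n_prime):
    """REDC: for t < n*r, return t * r^(-1) mod n without dividing by n.
    Requires n*n_prime = -1 (mod r); r a power of two makes % r and
    // r cheap bit operations in a real implementation."""
    m = (t * n_prime) % r
    u = (t + m * n) // r          # exact division: the low bits cancel
    return u - n if u >= n else u

n = 97                            # odd modulus
r = 128                           # radix r = 2^7 > n, gcd(r, n) = 1
n_prime = (-pow(n, -1, r)) % r    # n * n_prime = -1 (mod r)

def mont_mul(a_bar, b_bar):
    """Multiply values in Montgomery form: (a*r)(b*r)*r^(-1) = a*b*r mod n."""
    return montgomery_reduce(a_bar * b_bar, n, r, n_prime)

# Convert to Montgomery form, multiply, convert back.
a, b = 55, 73
a_bar, b_bar = (a * r) % n, (b * r) % n
product = montgomery_reduce(mont_mul(a_bar, b_bar), n, r, n_prime)
```

Because every reduction replaces a division by n with shifts and one extra multiplication, chains of modular multiplications (as in elliptic curve point multiplication) stay in Montgomery form and convert back only once at the end.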
Algorithms and Architectures for Cryptography and Source Coding in Non-Volatile Flash Memories
(2021)
In this work, algorithms and architectures for cryptography and source coding are developed, which are suitable for many resource-constrained embedded systems such as non-volatile flash memories. A new concept for elliptic curve cryptography is presented, which uses an arithmetic over Gaussian integers. Gaussian integers are a subset of the complex numbers with integers as real and imaginary parts. Ordinary modular arithmetic over Gaussian integers is computationally expensive. To reduce the complexity, a new arithmetic based on the Montgomery reduction is presented. For the elliptic curve point multiplication, this arithmetic over Gaussian integers improves the computational efficiency and the resistance against side channel attacks, and reduces the memory requirements. Furthermore, an efficient variant of the Lempel-Ziv-Welch (LZW) algorithm for universal lossless data compression is investigated. Instead of one LZW dictionary, this algorithm applies several dictionaries to speed up the encoding process. Two dictionary partitioning techniques are introduced that improve the compression rate and reduce the memory size of this parallel dictionary LZW algorithm.
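The LZW baseline that the parallel-dictionary variant accelerates can be sketched in its textbook single-dictionary form (the dictionary partitioning from the thesis is not reproduced here):

```python
def lzw_compress(data):
    """Textbook LZW: grow a dictionary of seen substrings and emit
    dictionary indices; the thesis replaces the single dictionary
    with several parallel ones to speed up encoding."""
    dictionary = {chr(i): i for i in range(256)}
    w, out = "", []
    for c in data:
        if w + c in dictionary:
            w += c
        else:
            out.append(dictionary[w])
            dictionary[w + c] = len(dictionary)
            w = c
    if w:
        out.append(dictionary[w])
    return out

def lzw_decompress(codes):
    """Inverse: rebuild the dictionary on the fly; the `w + w[0]` case
    handles codes that refer to the entry currently being created."""
    dictionary = {i: chr(i) for i in range(256)}
    w = chr(codes[0])
    out = [w]
    for k in codes[1:]:
        entry = dictionary[k] if k in dictionary else w + w[0]
        out.append(entry)
        dictionary[len(dictionary)] = w + entry[0]
        w = entry
    return "".join(out)

codes = lzw_compress("TOBEORNOTTOBEORTOBEORNOT")
```

The 24-character input compresses to fewer codes than characters because repeated substrings are replaced by single dictionary indices.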
Electricity generation from renewable energies often fluctuates due to weather and other natural effects. The instrument of control energy (balancing energy) can compensate for these fluctuations and thus guarantee the system and supply security of the electricity grid. Luxury hotels on tourist islands could react to fluctuations in electricity generation and provide balancing energy. The purpose of this paper is to investigate the electricity consumption of luxury hotels to assess their potential as a source for providing control energy.
Four-Dimensional Hurwitz Signal Constellations, Set Partitioning, Detection, and Multilevel Coding
(2021)
The Hurwitz lattice provides the densest four-dimensional packing. This fact has motivated research on four-dimensional Hurwitz signal constellations for optical and wireless communications. This work presents a new algebraic construction of finite sets of Hurwitz integers that is inherently accompanied by a respective modulo operation. These signal constellations are investigated for transmission over the additive white Gaussian noise (AWGN) channel. It is shown that these signal constellations have a better constellation figure of merit and hence a better asymptotic performance over an AWGN channel when compared with conventional signal constellations with algebraic structure, e.g., two-dimensional Gaussian-integer constellations or four-dimensional Lipschitz-integer constellations. We introduce two concepts for set partitioning of the Hurwitz integers. The first method is useful to reduce the computational complexity of the symbol detection. This suboptimum detection approach achieves near-maximum-likelihood performance. In the second case, the partitioning exploits the algebraic structure of the Hurwitz signal constellations. We partition the Hurwitz integers into additive subgroups in a manner that the minimum Euclidean distance of each subgroup is larger than in the original set. This enables multilevel code constructions for the new signal constellations.
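Hurwitz and Lipschitz integers are quaternions, and their arithmetic rests on the Hamilton product, under which the norm is multiplicative. The following is a minimal sketch of that arithmetic only; the constellation construction, modulo operation, and set partitioning from the paper are not reproduced:

```python
def qmul(a, b):
    """Hamilton product of quaternions given as (w, x, y, z) tuples."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def qnorm(a):
    """Quaternion norm: sum of squared components."""
    return sum(c * c for c in a)

# Lipschitz integers have four integer components; Hurwitz integers
# additionally allow all four components to be half-integers.
p = (1, 1, 0, 0)                 # a Lipschitz integer, norm 2
q = (0.5, 0.5, 0.5, 0.5)         # a Hurwitz unit, norm 1
product_norm = qnorm(qmul(p, q))
```

The multiplicativity of the norm (here 2 × 1 = 2) is what makes the additive subgroups used for set partitioning compatible with Euclidean distance.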
Market research institutes forecast a growing relevance of Low-Code Development Platforms (LCDPs) for organizations. Moreover, the rising number of scientific publications in recent years shows the increasing interest of the academic community. However, an overview of current research focuses and fruitful future research topics is missing. This paper conducts a first scientific literature review on LCDPs to close this gap. The socio-technical system (STS) model, which categorizes information systems into a social and a technical system, serves to analyze the identified 32 publications. Most of current research focuses on the technical system (technology or task). In contrast, only three publications explicitly target the social system (structure or people). Hence, this paper enables future research to address the identified research gaps. Additionally, practitioners gain awareness of technical and social aspects involved in the development, implementation, and application of LCDPs.
The present work proposes the use of modern ICT, such as smartphones, NFC, internet, and web technologies, to help patients carry out their therapies. The implemented system provides a calendar with medication intake reminders, ensures drug identification through NFC, and allows remote assistance by healthcare staff and family members to check and manage the therapy in real time. The system also provides centralized information on the patient's therapeutic situation, which is helpful when choosing new compatible therapies.
Preliminary results of homomorphic deconvolution application to surface EMG signals during walking
(2021)
Homomorphic deconvolution is applied to sEMG signals recorded during walking. Gastrocnemius lateralis and tibialis anterior signals were acquired according to the SENIAM recommendations. MUAP parameters such as amplitude and scale were estimated, whilst the MUAP shape parameter was fixed. This yields a useful time-frequency representation of the sEMG signal. The estimation of the scale parameter was verified by extracting the mean frequency of the filtered EMG signal from the scale parameter estimated with two different MUAP shape values.
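A minimal sketch of the homomorphic (cepstral) processing involved, with synthetic stand-ins for the MUAP waveform and firing train (the study's MUAP model, filters, and parameter values are not reproduced here):

```python
import numpy as np

n = 256
pulse = np.exp(-np.arange(n) / 8.0)      # stand-in for a MUAP-like waveform
train = np.zeros(n)
train[::32] = 1.0                        # stand-in for the firing train
x = np.convolve(train, pulse)[:n]        # sEMG-like signal = train * pulse

# Convolution in time becomes addition in the cepstral domain, so the
# smooth MUAP shape concentrates at low quefrency.
cepstrum = np.fft.irfft(np.log(np.abs(np.fft.rfft(x)) + 1e-12), n)

# Low-quefrency liftering isolates the log-magnitude of the MUAP part.
lifter = np.zeros(n)
lifter[:16] = 1.0
lifter[-15:] = 1.0
muap_log_mag = np.fft.rfft(cepstrum * lifter).real
```

Separating MUAP shape from firing statistics in this way is what enables the parameter estimates described in the abstract.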
Female entrepreneurship has gained interest over the last 20 years. Therefore, this paper analyses 7,320 articles of the research field ‘women in entrepreneurial context’ published in 885 journals. The sample is analyzed using a machine-learning and text-mining-based methodological approach. Aiming to provide a broad overview of the research literature, 41 clusters and 11 superordinate topics were identified. Major developments in research attention are outlined by analyzing bibliometric data for the period from 2000 to 2020. Overall growth in research attention, measured by the development of yearly citations per article, is most noticeable in the clusters ‘corporate social responsibility’, ‘brand’, and ‘corporate (-governance)’, and in the superordinate topics ‘performance’, ‘education’, and ‘corporate (board/ management)’. There are also indicators of an overall increase in research attention and cluster variety. The synthesis provides insight into the most trending superordinate topics. This literature review thus gives a comprehensive and descriptive overview as well as insight into thematic trend developments of the research field.
Twenty-first-century infrastructure needs to respond to changing demographics and become climate-neutral, resilient, and economically affordable, while remaining a driver for development and shared prosperity. However, the infrastructure sector remains one of the least innovative and digitalized, plagued by delays, cost overruns, and benefit shortfalls. The authors assessed trends and barriers in the planning and delivery of infrastructure based on secondary research, qualitative interviews with internationally leading experts, and expert workshops. The analysis concludes that the root cause of the industry's problems is the prevailing fragmentation of the infrastructure value chain and the lack of a long-term vision for infrastructure. To help overcome these challenges, an integration of the value chain is needed. The authors propose that this could be achieved through a use-case-based as well as vision- and governance-driven creation of federated digital platforms applied to infrastructure projects, and they outline a concept. Digital platforms enable full-lifecycle participation and responsible governance guided by a shared infrastructure vision. This paper has contributed as a policy recommendation to the Group of Twenty (G20) in 2021.
Cultural Mapping 4.0
(2021)
Cultural mapping aims to capture and visualize tangible and intangible cultural assets. This extended abstract proposes the consequent extension of analogue forms of cultural mapping using digital technologies, and its contribution is two-fold. First, the necessary theoretical basis is provided by a literature review of the still-young field of cultural mapping and the complementary disciplines of participatory mapping and digital story-mapping. Second, we propose a digitally enhanced Cultural Mapping 4.0 vision based on a case study from an ongoing research project in the Lake Constance region. Digital participatory mapping approaches are applied to capture data, and story-mapping, a spatial form of digital storytelling, is used to validate and disseminate the results.
Urban car-free mobility
(2021)
Across the globe, urban areas experience the phenomena of rising road congestion, air pollution and car accidents. These are just a few popular quantified effects that arise due to rapid, uncoordinated urbanization on a car-centric city layout. There is an urgent need to consider new concepts of urban mobility development to combat these negative effects. Car-free mobility is one notion adopted in diverse formats by numerous cities to create a more inclusive, just, healthy and sustainable urban life. The focus of this thesis is to examine whether a car-free mobility concept is applicable to the Maun Science Park, Botswana. Therefore, the idea of car-free mobility, its positive aspects as well as its constraints, are described first. This illustrates the complexity of urban transport planning as it is intertwined with urban land-use, political vision and people’s perceptions and behaviors. Secondly, examples and strategies on how to change existing structures are presented. Following this, the smart developments in the field of sustainable urban mobility are considered to provide an insight into their assets and drawbacks. Then the local mobility conditions are examined before the car-free concept is exemplarily applied to the Maun Science Park via scenario construction. These scenarios give a first vision of how a car-free concept can be applied to the MSP and additionally provide a starting point for future strategic planning as well as inspiration for other cities to follow along.
Guiding through the Fog
(2021)
Corporate Entrepreneurship (CE) programs are formalized efforts to realize entrepreneurial activities in established companies. Despite the growing and evolving landscape of CE programs, effectively managing them remains a challenging endeavor, which results in disappointing outcomes and oftentimes leads to the early termination of such programs. We unmask the differences in goal setting of CE programs and highlight that setting appropriate goals is imperative for their desired outcomes. In practice, companies seem to struggle with goal setting, and scholars have not yet fully solved the puzzle of goal setting in the context of CE programs either. Therefore, we set out to explore the current state of goal setting in the context of CE programs, building upon 61 semi-structured interviews with CE program executives from cross-industry companies of different sizes. Our study contributes to a better understanding of goal setting in the context of CE programs by (1) characterizing the goal setting of CE programs based on goal attributes and goal types and (2) identifying differences among the goal settings of CE programs. We provide implications for practice for more effective management of CE programs and conclude with a discussion of future research on the impact of the different goal settings.
InnoCrowd, a Product Classification System for Design Decision in a Crowdsourced Product Innovation
(2021)
System engineering focuses on how to design and manage complex systems. Meanwhile, in the era of Industry 4.0 and the Internet of Things (IoT), systems are getting more complex. Contributors to higher complexity include the usage of modern components (e.g. mechatronics), new manufacturing technologies (e.g. 3D printing) and new engineering product development processes, e.g. open innovation. Open innovation is enabled by IoT, where people and devices are easily connected, and it supports the development of more innovative products through ideas gained from predecessors and collaborators worldwide. Some researchers suggest this approach is up to three times faster and five times cheaper than conventional approaches [Gassmann, 2012], [Howe, 2008], [Kusumah, 2018]. Because open innovation is relatively new, many managers do not know how to employ it effectively in some phases of product development [Schenk, 2009], [Afuah, 2017], including requirements definition, design and engineering processes (task assignment) through quality assurance. Also, they have trouble estimating and controlling development time and cost [Nevo, 2020], [Thanh, 2015]. As a consequence, the acceptance of this new approach in the industry is limited. Research activities addressing this new approach mainly concern high-level and qualitative issues. Few effective methods are available to estimate project risk and to decide whether to initiate a project.
We propose InnoCrowd, a decision support system that uses an improved method to support these tasks and make decisions about crowdsourced engineering product development.
InnoCrowd uses natural language processing and machine learning to build a knowledge base of crowdsourced product developments. InnoCrowd presents a manager with the results of similar projects to show which practices led to good results. A manager of a new project can use this guidance to employ best practices for product requirements definition, project scheduling, and other aspects, thereby reducing risk and increasing the chances of success.
Interpretability and uncertainty modeling are important key factors for medical applications. Moreover, data in medicine are often available as a combination of unstructured data, like images, and structured predictors, like a patient's metadata. While deep learning (DL) models are state-of-the-art for image classification, they are often referred to as ‘black boxes’ owing to their lack of interpretability. Moreover, DL models often yield point predictions and are overconfident about parameter estimates and outcome predictions.
Statistical regression models, on the other hand, make it possible to obtain interpretable predictor effects and to capture parameter and model uncertainty based on the Bayesian approach. In this thesis, a publicly available melanoma dataset, consisting of skin lesions and patients’ age, is used to predict the melanoma types with a semi-structured model, while interpretable components are preserved and model uncertainty is quantified. For the Bayesian models, the transformation model-based variational inference (TM-VI) method is used to determine the posterior distributions of the parameters. Several model constellations consisting of patient’s age and/or skin lesion were implemented and evaluated. Predictive performance was best for a combined model of image and patient’s age, while the interpretable posterior distribution of the regression coefficient remains available. In addition, integrating uncertainty in the image and tabular parts results in larger variability of the outputs, corresponding to high uncertainty of the individual model components.
Background:
One of the most promising health care development areas is introducing telemedicine services and creating solutions based on blockchain technology. The study of systems combining both these domains indicates the ongoing expansion of digital technologies in this market segment.
Objective:
This paper aims to review the feasibility of blockchain technology for telemedicine.
Methods:
The authors identified relevant studies via systematic searches of databases including PubMed, Scopus, Web of Science, IEEE Xplore, and Google Scholar. The suitability of each for inclusion in this review was assessed independently. Owing to the lack of publications, available blockchain-based tokens were discovered via conventional web search engines (Google, Yahoo, and Yandex).
Results:
Of the 40 discovered projects, only 18 met the selection criteria. The 5 most prevalent features of the available solutions (N=18) were medical data access (14/18, 78%), medical service processing (14/18, 78%), diagnostic support (10/18, 56%), payment transactions (10/18, 56%), and fundraising for telemedical instrument development (5/18, 28%).
Conclusions:
These different features (eg, medical data access, medical service processing, epidemiology reporting, diagnostic support, and treatment support) allow us to discuss the possibilities for integration of blockchain technology into telemedicine and health care on different levels. In this area, a wide range of tasks can be identified that could be accomplished based on digital technologies using blockchains.
Discourse Ethics
(2021)
The Global Sanctions Data Base (GSDB): an update that includes the years of the Trump presidency
(2021)
Trajectory Tracking of a Fully-actuated Surface Vessel using Nonlinear Model Predictive Control
(2021)
The trajectory tracking problem for a fully-actuated real-scaled surface vessel is addressed in this paper. The unknown hydrodynamic and propulsion parameters of the vessel’s dynamic model were identified using an experimental maneuver-based identification process. A nonlinear model predictive control (NMPC) scheme was then designed, and the controller’s performance was assessed through the variation of NMPC parameters and constraint tightening while tracking a curved trajectory.
Continuous range queries are a common means to handle mobile clients in high-density areas. Most existing approaches focus on settings in which the range queries for location-based services are mostly static, whereas the mobile clients in the ranges move. We focus on a category called Dynamic Real-Time Range Queries (DRRQ), assuming that both the clients requested by the query and the inquirers are mobile. Consequently, the query parameters and the query results change continuously. This leads to two requirements: the ability to deal with an arbitrarily high number of mobile nodes (scalability) and the real-time delivery of range query results. In this paper we present Adaptive Quad Streaming (AQS), a highly decentralized solution for the requirements of DRRQs. AQS approximates the query results in favor of a controlled real-time delivery and guaranteed scalability. While prior work commonly optimizes data structures on servers, AQS relies on a highly distributed cell structure, without server-side data structures, that automatically adapts to changing client distributions. Instead of the commonly used request-response approach, we apply a lightweight streaming method in which no bidirectional communication and no storage or maintenance of queries are required at all.
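A sketch of the adaptive quad-cell idea (the function name, the unit-square bounds and the quad-key encoding below are illustrative assumptions, not the paper's actual protocol):

```python
def cell_path(x, y, depth, bounds=(0.0, 0.0, 1.0, 1.0)):
    """Return the quad-key of the cell containing (x, y) at the given depth.

    Clients and inquirers can then be streamed per cell; finer depths are
    used where client density is high, coarser ones elsewhere, so the cell
    structure adapts to the client distribution without server-side state.
    """
    x0, y0, x1, y1 = bounds
    path = ""
    for _ in range(depth):
        mx, my = (x0 + x1) / 2, (y0 + y1) / 2
        quad = (1 if x >= mx else 0) + (2 if y >= my else 0)
        path += str(quad)
        x0, x1 = (mx, x1) if x >= mx else (x0, mx)
        y0, y1 = (my, y1) if y >= my else (y0, my)
    return path
```

Because the key is derived purely from the position, every node can compute its cell locally, which matches the storage-free streaming approach described above.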
Botswana is a country in southern Africa with rich mineral resources, which has built its economy on mining. Due to challenges in the upcoming years caused by climate and demographic change, it aims to move away from a resource-based economy to a knowledge-based economy in the long term. In order to support this process, the Maun Science Park, a centre for research and development, is planned to be created in Maun, a town on the edge of the Okavango Delta. The project is initiated by the “International Resilience and Sustainability Partnership” (inRES), a non-governmental organization, and is currently in the initiation phase.
The purpose of this thesis is to determine a cost framework, with an exemplary developer calculation and a sensitivity analysis, for the Maun Science Park project in Botswana. To this end, a review of sources was performed in a first step. Based on this, interviews were conducted with members of the inRES. Based on the data obtained and further assumptions, a cost framework for the different project phases of the MSP project was established. Subsequently, a developer calculation was carried out exemplarily on the basis of project phase 2, and a sensitivity analysis was performed.
During the interviews, data was collected on the different project phases. It became clear that the interview partners had partly inconsistent perceptions of the different project phases. The calculation can be used as a basis for further calculations at the time the planning data is concretized.
The main challenge in Bayesian models is to determine the posterior for the model parameters. Even in models with only one or a few parameters, the analytical posterior can be determined only in special settings. In Bayesian neural networks, variational inference is widely used to approximate difficult-to-compute posteriors by variational distributions. Usually, Gaussians are used as variational distributions (Gaussian-VI), which limits the quality of the approximation due to their limited flexibility. Transformation models, on the other hand, are flexible enough to fit any distribution. Here we present transformation model-based variational inference (TM-VI) and demonstrate that it accurately approximates complex posteriors in models with one parameter and also works in a mean-field fashion for multi-parameter models like neural networks.
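A toy sketch of the transformation-model idea (the piecewise-linear map, its knot placement and its coefficients are assumptions for illustration; the paper's parametrisation differs): a flexible variational density is obtained by pushing a standard Gaussian through a monotone transform, which remains trivially easy to sample.

```python
import numpy as np

def monotone_map(z, deltas, bias):
    """Monotone piecewise-linear transform: softplus makes every segment
    slope positive, so the map is a valid change of variables on its range."""
    slopes = np.log1p(np.exp(deltas))            # softplus -> positive slopes
    knots = np.linspace(-3.0, 3.0, len(deltas) + 1)
    width = knots[1] - knots[0]
    out = np.full_like(z, bias, dtype=float)
    for s, k in zip(slopes, knots[:-1]):
        out += s * np.clip(z - k, 0.0, width)
    return out

rng = np.random.default_rng(1)
z = rng.standard_normal(10_000)                  # base samples ~ N(0, 1)
theta = monotone_map(z, deltas=np.array([2.0, -1.0, 0.5, 3.0]), bias=-1.0)
# theta follows a non-Gaussian variational distribution; fitting the
# deltas and bias to maximize the ELBO yields the flexibility that
# Gaussian-VI lacks.
```

In TM-VI the transform's parameters take the role that mean and variance play in Gaussian-VI.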
In multi-extended object tracking, parameters (e.g., extent) and trajectory are often determined independently. In this paper, we propose a joint parameter and trajectory (JPT) state and its integration into the Bayesian framework. This allows processing measurements that contain information about parameters and states. Examples of such measurements are bounding boxes given from an image processing algorithm. It is shown that this approach can consider correlations between states and parameters. In this paper, we present the JPT Bernoulli filter. Since parameters and state elements are considered in the weighting of the measurement data assignment hypotheses, the performance is higher than with the conventional Bernoulli filter. The JPT approach can be also used for other Bayes filters.
This paper presents the current state of development and selected technological challenges in the application of ecologically and economically sustainable nets for aquaculture based on ongoing development projects. These aim at the development of a new material system of high-strength stainless steel wires as net material with environmentally compatible antifouling properties for nearshore and offshore aquacultures. Current plastic netting materials will be replaced with high-strength stainless steel to provide a more environmentally friendly system that can withstand more severe mechanical stresses (waves, storms, tides and predators). A new antifouling strategy is expected to solve current challenges, such as ecological damage (e.g., due to pollution from copper-containing antifouling substances or microplastics), high maintenance costs (e.g., cleaning and repairs), and shorter service life. Approaches for the next development steps are presented based on previous experience as well as calculation models based on this experience.
Acoustic Echo Cancellation (AEC) plays a crucial role in speech communication devices to enable full-duplex communication. AEC algorithms have been studied extensively in the literature. However, device-specific details like microphone or loudspeaker configurations are often neglected, despite their impact on echo attenuation or near-end speech quality. In this work, we propose a method to investigate different loudspeaker-microphone configurations with respect to their contribution to the overall AEC performance. A generic AEC system consisting of an adaptive filter and a Wiener post-filter is used for a fair comparison between different setups. We propose the near-end-to-residual-echo ratio (NRER) and the attenuation-of-near-end (AON) as quality measures for full-duplex AEC performance.
Bittamo
(2021)
The Ethiopian state increasingly seeks to enlist putative ‘traditional authorities’ to lend legitimacy to policies and interventions in the southwestern peripheries of the country. The underlying assumptions do not accord with the perceptions of the local populations: among the Kara in the South Omo region, legitimacy is predicated upon duty and accountability, and higher degrees of public legitimacy are disconnected from authority and direct command over other people’s conduct. The office of the Kara bitti, the highest spiritual leader, thus proves intractable to such attempts at enlistment and has been little affected by the radical transformation of the Kara’s lives through increasing integration into the Ethiopian state over recent decades. But even as the office has changed little, the lives of those expected to assume the role of bitti have, and the duties of a bitti strongly constrain the office holder and limit their personal ambitions and participation in politics at the local, regional and national level.
Anthropologists’ arrival stories have long served to justify, naturalize, and domesticate—often through humor—the fraught moment of entering unasked into other people's lives. This textual convention has been thoroughly critiqued, but no comparable attention has been paid to the analogous moment of departure from the field. The digital age enables both sides to maintain contact, a shift that negates the finality of earlier departures. This article engages the changes wrought by digital media that allow us to remain connected to the field. While this seems a humane affordance, it also means that it is no longer feasible to cleanly sever ties established ‘there’. When anthropologists leave the field, the field will likely follow them—on Facebook or Instagram.
In this article, the collection of classes of matrices presented in [J. Garloff, M. Adm, and J. Titi, A survey of classes of matrices possessing the interval property and related properties, Reliab. Comput. 22:1-14, 2016] is continued. That is, given an interval of matrices with respect to a certain partial order, it is desired to know whether a special property of the entire matrix interval can be inferred from some of its element matrices lying on the vertices of the matrix interval. The interval property of some matrix classes found in the literature is presented, and the interval property of further matrix classes, including the ultrametric, the conditionally positive semidefinite, and the infinitely divisible matrices, is given for the first time. For the inverse M-matrices, the cardinality of the required set of vertex matrices known so far is significantly reduced.
Normal breathing during sleep is essential for people’s health and well-being. Therefore, it is crucial to diagnose apnoea events at an early stage and apply appropriate therapy. The detection of sleep apnoea is the central goal of the system design described in this article. To develop a correctly functioning system, it is first necessary to clearly define the requirements, as outlined in this manuscript. Furthermore, the selection of an appropriate technology for the measurement of respiration is of great importance. Therefore, after initial literature research, we analysed three different methods in detail and selected a suitable one according to the determined requirements. After considering all the advantages and disadvantages of the three approaches, we decided to use the impedance-measurement-based one. As a next step, an initial conceptual design of the algorithm for detecting apnoea events was created. As a result, we developed an activity diagram in which the main system components and data flows are visually represented.
The last decades have shown that the volume of tourism, in general, is constantly increasing (with some justified exceptions). To offer the possibility of travel to all groups of people, it is necessary to pay attention to accessibility. One way of increasing accessibility is digital technologies, which can assist in the planning, implementation and completion of trips. To make a selection of technologies, a study of barriers was first conducted and analyzed, and then some technologies were made available in a test setup. The focus was placed on two technologies: 360° tours and a mobile app with travel information. The two technologies were implemented and presented to the test subjects.
The evaluation results showed that both technologies can increase accessibility if some essential aspects (such as usability, completeness, relevance, etc.) are considered during the implementation.
The main aim of the research presented in this manuscript is to compare the results of objective and subjective measurements of sleep quality for older adults (65+) in the home environment. A total of 73 nights were evaluated in this study. A device placed under the mattress was used to obtain the objective measurement data, and a common question on perceived sleep quality was asked to collect the subjective sleep quality level. The achieved results confirm the correlation between objective and subjective measurements of sleep quality, with an average standard deviation equal to 2 of 10 possible quality points.
Sustainable technologies are being increasingly used in various areas of human life. While they have a multitude of benefits, they are particularly useful in health monitoring, especially for certain groups of people, such as the elderly. However, there are still several issues that need to be addressed before their use becomes widespread. This work aims to clarify the aspects that are of great importance for increasing the acceptance of this type of technology among the elderly. In addition, we aim to clarify whether the technologies that are already available are able to ensure acceptable accuracy and whether they could replace some of the manual approaches that are currently being used. A two-week study with people 65 years of age and over was conducted to address the questions posed here, and the results were evaluated. It was demonstrated that simplicity of use and automatic functioning play a crucial role. It was also concluded that technology cannot yet completely replace traditional methods such as questionnaires in some areas. Although the technologies that were tested were classified as being “easy to use”, the elderly participants in the current study indicated that they were not sure that they would use these technologies regularly in the long term because the added value is not always clear, among other issues. Therefore, awareness-raising must take place in parallel with the development of technologies and services.
Health monitoring in a home environment can have broad use, since it may provide continuous control of health parameters with relatively minor intrusiveness into regular life. This work aims to verify whether it is possible to replace the subjective questioning typical in some areas of sleep medicine with an objective measurement using electronic devices. For this purpose, a study was conducted with ten subjects, in which objective and subjective measurements of relevant sleep parameters took place. The results of both measurement methods were evaluated and analyzed. They showed that while for some measures, such as Total Time in Bed, there is high agreement between objective and subjective measurements, for others, such as sleep quality, there are significant differences. For this reason, a combination of both measurement methods may currently be beneficial and provide the most detailed results, while a partial replacement by measurement through electronic devices can already reduce the number of questions in the subjective measurement.
The McEliece cryptosystem is a promising candidate for post-quantum public-key encryption. In this work, we propose q-ary codes over Gaussian integers for the McEliece system, together with a new channel model, the one Mannheim error channel, in which errors are limited to Mannheim weight one. We investigate the capacity of this channel and discuss its relation to the McEliece system. The proposed codes are based on a simple product code construction and have a low-complexity decoding algorithm. For the one Mannheim error channel, these codes achieve a higher error correction capability than maximum distance separable codes with bounded minimum distance decoding. This improves the work factor with regard to decoding attacks based on information-set decoding.
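A brief sketch of the Mannheim arithmetic underlying this channel model (the reduction routine and the modulus 2+i used below are illustrative; the paper's code construction is not reproduced):

```python
def gauss_mod(z: complex, p: complex) -> complex:
    """Reduce the Gaussian integer z modulo p by rounding the quotient
    to the nearest Gaussian integer, giving the representative closest
    to the origin."""
    q = z / p
    q_round = complex(round(q.real), round(q.imag))
    return z - q_round * p

# In the one Mannheim error channel every error pattern has Mannheim
# weight one, i.e. the error is one of these four units:
errors = [1, -1, 1j, -1j]

# Example with the modulus p = 2 + i (norm 5): 3 is congruent to i.
print(gauss_mod(3, 2 + 1j))  # 1j
```

Restricting errors to these four unit values is what gives the channel its structure and enables the simple product-code decoding mentioned above.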
Error correction coding for optical communication and storage requires high-rate codes that enable high data throughput and low residual errors. Recently, different concatenated coding schemes were proposed that are based on binary BCH codes with low error correcting capabilities. In this work, low-complexity hard- and soft-input decoding methods for such codes are investigated. We propose three concepts to reduce the complexity of the decoder. For the algebraic decoding we demonstrate that Peterson's algorithm can be more efficient than the Berlekamp-Massey algorithm for single, double, and triple error correcting BCH codes. We propose an inversion-less version of Peterson's algorithm and a corresponding decoding architecture. Furthermore, we propose a decoding approach that combines algebraic hard-input decoding with soft-input bit-flipping decoding. An acceptance criterion is utilized to determine the reliability of the estimated codewords. For many received codewords this criterion indicates that the hard-decoding result is sufficiently reliable, and the costly soft-input decoding can be omitted. To reduce the memory size for the soft values, we propose a bit-flipping decoder that stores only the positions and soft values of a small number of code symbols. This method significantly reduces the memory requirements and has little adverse effect on the decoding performance.
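As a toy illustration of low-complexity algebraic hard-input decoding (a Hamming(7,4) single-error decoder, not the paper's BCH/Peterson architecture): for a single error, the syndrome directly encodes the error position, so no iterative algorithm is needed at all.

```python
import numpy as np

# Parity-check matrix whose i-th column is the binary representation of i,
# so a single-bit error at position p yields syndrome value p.
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def decode(r):
    s = H @ r % 2                           # 3-bit syndrome
    pos = int(s[0] + 2 * s[1] + 4 * s[2])   # 1-based error position (0 = none)
    r = r.copy()
    if pos:
        r[pos - 1] ^= 1                     # flip the single erroneous bit
    return r
```

Direct syndrome evaluation of this flavour is what makes Peterson-style decoding attractive for the low-error-correcting BCH codes discussed above.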
This paper introduces the concept of Universal Memory Automata (UMA) and the automated compilation of Verilog Hardware Description Language (HDL) code at Register Transfer Level (RTL) from UMA graphs for digital designs. The idea is based on the observation that Push Down Automata (PDA) are able to process the Dyck language - commonly known as the balanced-bracket problem - with a finite set of states, while Finite State Machines (FSM) would require an infinite set of states. Since infinite sets of states are not applicable to real designs, PDAs appear promising for types of problems similar to the Dyck language. PDAs suffer from the problem that complex memory operations need to be emulated by a specific stack management. The presented UMA therefore extend the PDA by other types of memory, e.g. Queue, RAM or CAM. Memories that are eligible for UMAs are supposed to have at least one read and one write port and a one-cycle read/write latency. With their modified state-transition and output functions, UMAs are able to operate user-defined numbers, configurations and types of memories. Proof of concept is given by an implementation of a cache coherency protocol, i.e. a practical problem in microprocessor design.
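The motivating observation can be made concrete in a few lines (a counter sketch standing in for the PDA stack; the UMA formalism itself is not reproduced here):

```python
def is_dyck(word: str) -> bool:
    """Accept the balanced-bracket (Dyck) language. A single counter
    (the PDA stack height) suffices, whereas an FSM would need one
    state per reachable nesting depth, i.e. infinitely many."""
    depth = 0
    for ch in word:
        if ch == '(':
            depth += 1            # push
        elif ch == ')':
            if depth == 0:        # pop from empty stack: reject
                return False
            depth -= 1            # pop
        else:
            return False          # symbol outside the alphabet
    return depth == 0             # accept iff the stack is empty
```

UMAs generalize exactly this advantage: problems that are unbounded for an FSM become finite-state once the right memory primitive (stack, queue, RAM, CAM) is part of the automaton.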
The main goal of this work was to experimentally characterize the hot-air drying process of agricultural products (potato, carrot, tomato) and to verify it against numerical solutions for a single-layer and an industrial-scale dryer using Comsol Multiphysics® 5.3.
The effects of the input parameters of the single-layer dryer on quality attributes were examined. Two drying strategies were applied in the batch dryer to examine the effects of the inputs on quality attributes. The constant-input-parameter strategy was designed using a central composite design formulation and optimized by Response Surface Methodology (RSM). The second strategy was applied for further optimization of the selected region by using a square-wave profile of the air temperature and relative humidity. For the numerical treatment of the single-layer dryer, unsteady-state partial differential equations were solved by means of the Finite Element Method coupled to the Arbitrary Lagrangian-Eulerian (ALE) formulation. For the batch dryer, mechanistic mathematical models of coupled heat and mass transfer were developed and solved, treating the product as a solid porous moist material.
With this work, the process of convective drying of agricultural products could be optimized. Furthermore, important knowledge about the basic mechanisms of the drying process was gained and implemented in the numerical models.
In Maun, Botswana, a self-sufficient, sustainable and future-oriented district, the Maun Science Park, is to be created. Within this project, several 5-8 storey smart homes are to be built using sustainable construction methods. The aim of this thesis is to develop a sustainable structural concept for these homes. In a first step, the general basics of tall building structures and sustainable construction were established. Based on these fundamentals, criteria for the structural requirements as well as the ecological and social sustainability of a structural design could be defined. Subsequently, four structural systems were drafted: a concrete core structure, a steel shear frame structure, a rammed earth shear wall structure and a wooden diagrid structure. In addition to the pre-dimensioning of the systems, a life cycle assessment was set up to evaluate the ecological sustainability of the designs. With the help of a utility value analysis, the wooden diagrid structure was determined as the preferred variant. The comparison of the designs also allows general conclusions to be drawn for the development of sustainable tall building structures. The results of the life cycle assessment show the advantage of wood as an ecological building material over industrially manufactured building materials such as steel and concrete, whereas rammed earth, a likewise ecological building material, is not convincing due to its low strength. In general, the life cycle assessment reveals a trade-off between ecological and industrially manufactured products with regard to strength and environmental impact. In terms of social sustainability, the design of the structural system can significantly influence flexibility and the use of local resources. However, due to the diversity of sustainable construction, the development of a structural system should be linked to an overarching sustainability concept that takes architecture and stakeholders into account.
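A utility value analysis of the kind used to rank the four variants is, at its core, a weighted scoring of alternatives against criteria. A minimal sketch; the criteria, weights and scores below are invented for illustration and are not the thesis' actual figures:

```python
# Weights per criterion (must sum to 1)
criteria = {"structure": 0.3, "ecology": 0.4, "social": 0.3}

# Hypothetical scores, 1 (poor) to 10 (excellent), per variant and criterion
scores = {
    "concrete core":  {"structure": 9, "ecology": 3, "social": 5},
    "steel frame":    {"structure": 8, "ecology": 4, "social": 5},
    "rammed earth":   {"structure": 3, "ecology": 8, "social": 7},
    "wooden diagrid": {"structure": 7, "ecology": 9, "social": 8},
}

# Utility value = weighted sum of scores; the highest utility wins
utility = {v: sum(criteria[c] * s[c] for c in criteria) for v, s in scores.items()}
best = max(utility, key=utility.get)
print(best, round(utility[best], 2))
```

With these invented numbers the wooden diagrid comes out on top, mirroring the structure of the argument in the thesis, though not its data.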
Specific climate adaptation and resilience measures can be efficiently designed and implemented at regional and local levels. Climate and environmental databases are critical for achieving the sustainable development goals (SDGs) and for efficiently planning and implementing appropriate adaptation measures. Available federated and distributed databases can serve as necessary starting points for municipalities to identify needs, prioritize resources, and allocate investments, taking into account often tight budget constraints. High-quality geospatial, climate, and environmental data are now broadly available and remote sensing data, e.g., Copernicus services, will be critical. There are forward-looking approaches to use these datasets to derive forecasts for optimizing urban planning processes for local governments. On the municipal level, however, the existing data have only been used to a limited extent. There are no adequate tools for urban planning with which remote sensing data can be merged and meaningfully combined with local data and further processed and applied in municipal planning and decision-making. Therefore, our project CoKLIMAx aims at the development of new digital products, advanced urban services, and procedures, such as the development of practical technical tools that capture different remote sensing and in-situ data sets for validation and further processing. CoKLIMAx will be used to develop a scalable toolbox for urban planning to increase climate resilience. Focus areas of the project will be water (e.g., soil sealing, stormwater drainage, retention, and flood protection), urban (micro)climate (e.g., heat islands and air flows), and vegetation (e.g., greening strategy, vegetation monitoring/vitality). To this end, new digital process structures will be embedded in local government to enable better policy decisions for the future.
Twenty-first century infrastructure needs to respond to changing demographics and become climate neutral, resilient and economically affordable, while remaining a driver for development and shared prosperity. However, the infrastructure sector remains one of the least innovative and digitalised, plagued by delays, cost overruns and benefit shortfalls (Cantarelli et al., 2008; Flyvbjerg, 2007; Flyvbjerg et al., 2003; Flyvbjerg et al., 2004). The root cause is the prevailing fragmentation of the infrastructure sector (Fellows and Liu, 2012). To help overcome these challenges, integration of the value chain is needed. This could be achieved through a use-case-based creation of federated ecosystems connecting open and trusted data spaces and advanced services applied to infrastructure projects. Such digital platforms enable full-lifecycle participation and responsible governance guided by a shared infrastructure vision. Digital federation enables secure and sovereign data exchange and thus collaboration across the silos within the infrastructure sector and between industries, as well as within and between countries. Such an approach to infrastructure technology policy would not rely on technological solutionism but proposes the development of open and trusted data alliances. Federated data spaces provide access to the emerging data economy, especially for SMEs, and can foster the innovation of new digital services. Such responsible digital governance can help make the infrastructure sector more resilient, efficient and aligned with the realisation of ambitious decarbonisation and environmental protection targets. The European Union and the United States have already developed architectures for sovereign and secure data exchange.
The State of Custom
(2021)
In our article, we engage with the anthropologist Gerd Spittler's pathbreaking article "Dispute settlement in the shadow of Leviathan" (1980), in which he strives to integrate the existence of state courts (the eponymous Leviathan's shadow) in (post-)colonial Africa into the analysis of non-state legal practices. According to Spittler, it is because of undesirable characteristics inherent in state courts that disputing parties tended to involve mediators rather than pursue a state court judgment. The less people liked state courts, the more likely they were to (re-)turn to such dispute settlement procedures. How has this situation changed in the four decades since the article's publication? We relate Spittler's findings to contemporary debates in legal anthropology that investigate the relationship between disputing, law and the state. We also show through our own work in Africa and Asia, particularly in Southern Ethiopia and Kyrgyzstan, in what ways Spittler's by now classic contribution to the field of legal anthropology can be made fruitful for a contemporary anthropology of the state, at a time when not only (legal) anthropology has changed, but especially the way states deal with putatively "customary" forms of dispute settlement.
The main objective of this paper is to revisit Temursho's (2020) article "On the Euro method" in a critical and constructive way. We praise part of his work while at the same time analysing some of his arguments against the Euro method and against the work published by Valderas-Jaramillo et al. (2019). Moreover, we analyse some other relevant aspects of the SUT-Euro and SUT-RAS methods not covered in Temursho (2020). Temursho (2020) seems to conclude that no one should use the Euro method again because of its limitations and drawbacks. However, although the Euro method is not perfect, we believe there is still room for its use in updating and regionalizing supply and use tables.
For decades now, exports and imports have grown more rapidly than domestic production. This is a strong indication that, besides the rapid growth of foreign trade in final goods, trade in intermediates is becoming increasingly important. For this reason, an input-output approach is more appropriate for any analysis of diversification than a traditional approach based purely on macroeconomic data.
This article analyses economic diversification in Gulf Cooperation Council (GCC) countries using data from input-output tables, which are an integral part of the national accounts. We compare the performance of the GCC economies with that of a reference case, Norway, which is considered to have successfully diversified its economy despite having a large oil resource base. We also assess these countries' relative progress on sustainable development using a World Bank measure, adjusted net savings, which evaluates the true rate of savings in an economy after accounting for investments in physical and human capital, depletion of natural resources, and damage from environmental pollution.
The article concludes that GCC countries have, contrary to expectation, collectively performed relatively well on diversification, but their performance on sustainable development varies.
Summary of the 9th workshop on metallization and interconnection for crystalline silicon solar cells
(2021)
The 9th edition of the Workshop on Metallization and Interconnection for Crystalline Silicon Solar Cells was held as an online event but nevertheless reached the workshop goals of knowledge sharing and networking. The technology of screen-printed contacts of high temperature pastes continues its fast progress enabled by better understanding of the phenomena taking place during printing and firing, and progress in materials. Great improvements were also achieved in low temperature paste printing and plated metallization. In the field of interconnection, progress was reported on multiwire approaches, electrically conductive adhesives and on foil-based approaches. Common to many contributions at the workshop was the use of advanced laser processes to improve performance or throughput.
In this paper, a novel measurement model based on spherical double Fourier series (DFS) for estimating the 3D shape of a target concurrently with its kinematic state is introduced. Here, the shape is represented as a star-convex radial function, decomposed as spherical DFS. In comparison to ordinary DFS, spherical DFS do not suffer from ambiguities at the poles. Details will be given in the paper. The shape representation is integrated into a Bayesian state estimator framework via a measurement equation. As range sensors only generate measurements from the target side facing the sensor, the shape representation is modified to enable application of shape symmetries during the estimation process. The model is analyzed in simulations and compared to a shape estimation procedure using spherical harmonics. Finally, shape estimation using spherical and ordinary DFS is compared to analyze the effect of the pole problem in extended object tracking (EOT) scenarios.
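The star-convex radial-function idea can be illustrated in a simplified planar analogue: a contour described by a radial Fourier series r(theta). This is an assumption for illustration only; the paper uses spherical double Fourier series in 3D, which additionally avoid the pole ambiguities discussed above.

```python
import math

def radius(theta, a0, a, b):
    """Radial function r(theta) = a0 + sum_k a_k cos(k theta) + b_k sin(k theta),
    with harmonics k = 1..len(a); describes a star-convex contour if r > 0."""
    return (a0
            + sum(ak * math.cos((k + 1) * theta) for k, ak in enumerate(a))
            + sum(bk * math.sin((k + 1) * theta) for k, bk in enumerate(b)))

# Base radius 2 with a small 2nd harmonic, giving an ellipse-like shape
r0 = radius(0.0, 2.0, [0.0, 0.3], [0.0, 0.0])
print(r0)  # 2.3 at theta = 0
```

In the Bayesian filter of the paper, the coefficients of such an expansion become part of the estimated state, so each range measurement constrains the shape through a measurement equation of this form.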
The encoding of antenna patterns with generalized spatial modulation as well as other index modulation techniques require w-out-of-n encoding where all binary vectors of length n have the same weight w. This constant-weight property cannot be obtained by conventional linear coding schemes. In this work, we propose a new class of constant-weight codes that result from the concatenation of convolutional codes with constant-weight block codes. These constant-weight convolutional codes are nonlinear binary trellis codes that can be decoded with the Viterbi algorithm. Some constructed constant-weight convolutional codes are optimum free distance codes. Simulation results demonstrate that the decoding performance with Viterbi decoding is close to the performance of the best-known linear codes. Similarly, simulation results for spatial modulation with a simple on-off keying show a significant coding gain with the proposed coded index modulation scheme.
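The w-out-of-n constraint described above can be illustrated by brute-force enumeration of all constant-weight vectors. This is an illustrative sketch, not the paper's convolutional construction or its Viterbi decoding:

```python
from itertools import combinations

def constant_weight_codebook(n: int, w: int):
    """All length-n binary vectors of Hamming weight exactly w."""
    book = []
    for ones in combinations(range(n), w):
        v = [0] * n
        for i in ones:
            v[i] = 1
        book.append(tuple(v))
    return book

book = constant_weight_codebook(4, 2)
print(len(book))                       # C(4, 2) = 6 codewords
assert all(sum(v) == 2 for v in book)  # the constant-weight property
```

Because the all-zero word and most linear combinations violate the weight constraint, no nontrivial linear code can be constant weight, which is why the paper resorts to nonlinear trellis codes.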
List decoding for concatenated codes based on the Plotkin construction with BCH component codes
(2021)
Reed-Muller codes are a popular code family based on the Plotkin construction. Recently, these codes have regained some interest due to their close relation to polar codes and their low-complexity decoding. We consider a similar code family, i.e., the Plotkin concatenation with binary BCH component codes. This construction is more flexible regarding the attainable code parameters. In this work, we consider a list-based decoding algorithm for the Plotkin concatenation with BCH component codes. The proposed list decoding leads to a significant coding gain with only a small increase in computational complexity. Simulation results demonstrate that the Plotkin concatenation with the proposed decoding achieves near maximum likelihood decoding performance. This coding scheme can outperform polar codes for moderate code lengths.
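The Plotkin construction underlying this code family combines two equal-length component codes U and V into the set {(u, u + v)}. A minimal sketch with toy binary component codes standing in for the paper's BCH codes:

```python
def plotkin(u_code, v_code):
    """Plotkin (u | u + v) construction: component codes of length n
    yield a code of length 2n with |U| * |V| codewords."""
    return [tuple(u) + tuple(ui ^ vi for ui, vi in zip(u, v))
            for u in u_code for v in v_code]

U = [(0, 0), (1, 1)]                   # length-2 repetition code
V = [(0, 0), (0, 1), (1, 0), (1, 1)]   # the full space F_2^2
C = plotkin(U, V)
print(len(C))  # 2 * 4 = 8 codewords of length 4
```

Decoding exploits this structure: an estimate of v can be formed from the XOR of the two halves, after which u is decoded from both halves, and the list-based variant in the paper keeps several candidates at each of these steps.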
Probabilistic Short-Term Low-Voltage Load Forecasting using Bernstein-Polynomial Normalizing Flows
(2021)
The transition to a fully renewable energy grid requires better forecasting of demand at the low-voltage level. However, high fluctuations and increasing electrification cause large forecast errors with traditional point estimates. Probabilistic load forecasts take future uncertainties into account and thus enable various applications in low-carbon energy systems. We propose an approach for flexible conditional density forecasting of short-term load based on Bernstein-polynomial normalizing flows, where a neural network controls the parameters of the flow. In an empirical study with 363 smart meter customers, our density predictions compare favorably against Gaussian and Gaussian mixture densities and also outperform a non-parametric approach based on the pinball loss for 24h-ahead load forecasting with two different neural network architectures.
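The core building block, a monotone Bernstein-polynomial transform, can be sketched in a few lines. This is a generic illustration, not the authors' implementation: here the coefficients are fixed by hand, whereas in the paper a neural network outputs them conditional on the input features.

```python
from math import comb

def bernstein_flow(z, theta):
    """h(z) = sum_i theta_i * B_{i,M}(z) for z in [0, 1], with Bernstein basis
    B_{i,M}(z) = C(M, i) z^i (1-z)^(M-i); monotone (hence invertible, as a
    normalizing flow requires) when theta is increasing."""
    M = len(theta) - 1
    return sum(t * comb(M, i) * z**i * (1 - z)**(M - i)
               for i, t in enumerate(theta))

theta = [0.0, 0.5, 2.0, 3.0]  # increasing coefficients -> invertible map
ys = [bernstein_flow(z / 10, theta) for z in range(11)]
assert all(a < b for a, b in zip(ys, ys[1:]))  # strictly monotone on [0, 1]
print(ys[0], ys[-1])  # endpoints equal theta[0] and theta[-1]
```

The density forecast then follows by the change-of-variables formula, pushing a simple base density through this learned transform.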
Positive systems play an important role in systems and control theory and have found applications in multiagent systems, neural networks, systems biology, and more. Positive systems map the nonnegative orthant to itself (and also the non-positive orthant to itself). In other words, they map the set of vectors with zero sign variation to itself. In this article, discrete-time linear systems that map the set of vectors with up to k-1 sign variations to itself are introduced. For the special case k = 1 these reduce to discrete-time positive linear systems. Properties of these systems are analyzed using tools from the theory of sign-regular matrices. In particular, it is shown that almost every solution of such systems converges to the set of vectors with up to k-1 sign variations. It is also shown that these systems induce a positive dynamics of k-dimensional parallelotopes.
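The central notion of the sign variation of a vector can be computed directly. A sketch using one common convention, in which zero entries are discarded before counting sign changes (illustrative, not the paper's analysis):

```python
def sign_variations(x):
    """Number of sign changes in x, ignoring zero entries."""
    signs = [1 if v > 0 else -1 for v in x if v != 0]
    return sum(1 for a, b in zip(signs, signs[1:]) if a * b < 0)

print(sign_variations([1, -2, 3]))     # 2
print(sign_variations([1, 0, 2, -1]))  # 1
```

A positive system maps the set of vectors with zero sign variations to itself; the k-positive systems of the article map the larger set of vectors with up to k-1 sign variations to itself.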
In this paper, rectangular matrices whose minors of a given order have the same strict sign are considered and sufficient conditions for their recognition are presented. The results are extended to matrices whose minors of a given order have the same sign or are allowed to vanish. A matrix A is called oscillatory if all its minors are nonnegative and there exists a positive integer k such that A^k has all its minors positive. As a generalization, a new type of matrices, called oscillatory of a specific order, is introduced and some of their properties are investigated.
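The defining property, all minors of a given order sharing the same strict sign, can be checked by brute force for small matrices. This is an illustrative test only, not the sufficient recognition conditions derived in the paper, whose point is precisely to avoid enumerating all minors:

```python
from itertools import combinations

def det(M):
    """Determinant by Laplace expansion along the first row (fine for small minors)."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j+1:] for row in M[1:]])
               for j in range(len(M)))

def minors_same_strict_sign(A, k):
    """True iff every k x k minor of A is positive, or every one is negative."""
    m, n = len(A), len(A[0])
    minors = [det([[A[i][j] for j in cols] for i in rows])
              for rows in combinations(range(m), k)
              for cols in combinations(range(n), k)]
    return all(d > 0 for d in minors) or all(d < 0 for d in minors)

A = [[1, 2], [3, 7], [4, 10]]
print(minors_same_strict_sign(A, 2))  # order-2 minors are 1, 2, 2 -> True
```

The number of k x k minors grows combinatorially in the matrix dimensions, which is why recognition tests based on a small subset of minors, as in the paper, matter in practice.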
The class of square matrices of order n having a negative determinant and all their minors up to order n-1 nonnegative is considered. A characterization of these matrices is presented which provides an easy test based on the Cauchon algorithm for their recognition. Furthermore, the maximum allowable perturbation of the entry in position (2,2) such that the perturbed matrix remains in this class is given. Finally, it is shown that all matrices lying between two matrices of this class with respect to the checkerboard ordering are contained in this class, too.
SyNumSeS is a Python package for numerical simulation of semiconductor devices. It uses the Scharfetter-Gummel discretization for solving the one dimensional Van Roosbroeck system which describes the free electron and hole transport by the drift-diffusion model. As boundary conditions voltages can be applied to Ohmic contacts. It is suited for the simulation of pn-diodes, MOS-diodes, LEDs (hetero junction), solar cells, and (hetero) bipolar transistors.
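The Scharfetter-Gummel discretization named above hinges on the Bernoulli function B(x) = x / (exp(x) - 1). A generic textbook sketch, not SyNumSeS' actual API, and note that sign conventions for the flux vary in the literature:

```python
import math

def bernoulli(x: float) -> float:
    """B(x) = x / (exp(x) - 1), with a series limit to avoid 0/0 near x = 0."""
    if abs(x) < 1e-10:
        return 1.0 - x / 2.0
    return x / math.expm1(x)

def sg_flux(n_i, n_ip1, psi_i, psi_ip1, h=1.0, D=1.0):
    """Scharfetter-Gummel carrier flux between two grid nodes a distance h
    apart, for potentials psi normalized to the thermal voltage kT/q."""
    dpsi = psi_ip1 - psi_i
    return (D / h) * (bernoulli(-dpsi) * n_ip1 - bernoulli(dpsi) * n_i)

print(bernoulli(0.0))  # 1.0: zero field reduces the flux to pure diffusion
```

At zero potential difference both Bernoulli factors equal one and the flux collapses to the central-difference diffusion flux, which is exactly the property that makes this scheme stable for the drift-diffusion model at large potential drops.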
Since its first edition in 2008, the Workshop on Metallization and Interconnection for Crystalline Silicon Solar Cells has been a key event where knowledge in the critical fields of crystalline silicon solar cell metallization and interconnection is shared between experts from academia and industry. It has become a highly recognized event for the quality of the contributions, the lively Q&A sessions, and the exceptional networking opportunity. The situation with the Covid-19 pandemic made organizing the 9th edition as an in-person event impossible and forced us to reconsider the event format. The event took place virtually on October 5th and 6th 2020. We used an innovative online platform that enabled not only presentations followed by Q&A but also more informal interactions, where participants could see and talk directly to other participants. 120 experts from 22 countries took part and attended 21 contributions presented live. In spite of a few technical glitches, the workshop was successful and the goals of exchanging on the state of the art in research/industry and connecting experts in the field were achieved. All presentations are available on www.miworkshop.info as .pdf documents. These proceedings contain a summary of the 9th edition (MIW2020) and peer-reviewed papers based on the workshop contributions. The organizers wish to thank the members of the Scientific Committee for the time spent reviewing the MIW2020 abstracts and proceedings. The organizers also wish to thank again the sponsors and supporters for their financial contributions, which made the 9th Workshop on Metallization and Interconnection for Crystalline Silicon Solar Cells possible.