This thesis deals with the tracking of multiple extended objects. This tracking problem occurs, for instance, when a car equipped with sensors drives on the road and detects multiple other cars in front of it. When the setup between the sensor and the other cars is such that each single car gives rise to multiple measurements, the cars are called extended objects. This can occur in real-world scenarios, mainly with the use of high-resolution sensors in near-field applications. In such a near-field scenario, a single object occupies several resolution cells of the sensor, so that multiple measurements are generated per scan. The measurements are additionally superimposed by sensor noise. Besides the object-generated measurements, false alarms occur that are not caused by any object, and in some sensor scans single objects may be missed, so that they do not generate any measurements.
To handle these scenarios, object tracking filters are needed that process the sensor measurements in order to obtain a stable and accurate estimate of the objects in each sensor scan. The scope of this thesis is to implement such a tracking filter for extended objects, i.e. a filter that estimates their positions and extents. In this context, the topic of measurement partitioning arises, a pre-processing step applied to the measurement data. Partitioning puts measurements that were likely generated by one object into one cluster, also called a cell. The obtained cells are then processed by the tracking filter in the estimation step. The partitioning of the measurement data is crucial for the performance of the tracking filter, because insufficient partitioning leads to poor tracking performance, i.e. inaccurate object estimates.
In this thesis, a Gaussian inverse Wishart Probability Hypothesis Density (GIW-PHD) filter was implemented to handle the multiple extended object tracking problem. Within this filter framework, the number of objects is modelled by Random Finite Sets (RFSs) and the objects' extents as random matrices (RM). The partitioning methods used to cluster the measurement data comprise existing approaches as well as a new approach based on likelihood sampling. The applied classical heuristic methods are Distance Partitioning (DP) and Sub-Partitioning (SP), whereas the proposed likelihood-based approach is called Stochastic Partitioning (StP). The latter was developed in this thesis based on the Stochastic Optimisation approach by Granström et al. An implementation of the StP method and its integration into the filter framework is provided within this thesis.
The implementations using the different partitioning methods were tested on simulated random multi-object scenarios and in a fixed parallel tracking scenario using Monte Carlo methods. Furthermore, a runtime analysis provides insight into the computational effort of the different partitioning methods. The results show that the StP method outperforms the classical partitioning methods in scenarios where the objects move spatially close to each other: the filter using StP runs more stably and produces more accurate estimates. However, this advantage comes at a higher computational cost compared with the classical heuristic partitioning methods.
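The Distance Partitioning (DP) method referred to above can be viewed as connected-components clustering under a distance threshold: two measurements end up in the same cell if they are linked by a chain of pairwise distances below the threshold. The following is a minimal sketch with 2-D point measurements and a single threshold; the thesis's implementation (and its handling of multiple thresholds) may differ, and all names here are illustrative.

```python
# A minimal sketch of Distance Partitioning: cells are the connected
# components of the graph whose edges join measurements closer than d.

import math

def distance_partition(measurements, d):
    """Group 2-D measurements into cells via connected components of the
    threshold graph (pairwise Euclidean distance <= d)."""
    unvisited = set(range(len(measurements)))
    cells = []
    while unvisited:
        seed = unvisited.pop()
        cell, frontier = [seed], [seed]
        while frontier:
            i = frontier.pop()
            linked = [j for j in unvisited
                      if math.dist(measurements[i], measurements[j]) <= d]
            for j in linked:
                unvisited.remove(j)
            cell.extend(linked)
            frontier.extend(linked)
        cells.append(sorted(cell))
    cells.sort()
    return cells

# two well-separated groups of measurements
points = [(0.0, 0.0), (0.5, 0.1), (5.0, 5.0), (5.3, 4.8)]
print(distance_partition(points, 1.0))  # → [[0, 1], [2, 3]]
```

In practice the threshold is swept over a set of values and every distinct partition obtained is passed to the PHD filter update, which weighs the partitions against each other.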
Error correction coding (ECC) for optical communication and persistent storage systems requires high-rate codes that enable high data throughput and low residual errors. Recently, different concatenated coding schemes were proposed that are based on binary Bose-Chaudhuri-Hocquenghem (BCH) codes with low error-correcting capabilities. Commonly, hardware implementations of BCH decoding are based on the Berlekamp-Massey algorithm (BMA). However, for single-, double-, and triple-error-correcting BCH codes, Peterson's algorithm can be more efficient than the BMA. The known hardware architectures of Peterson's algorithm require Galois field inversion, which dominates the hardware complexity and limits the decoding speed. This work proposes an inversion-less version of Peterson's algorithm. Moreover, a decoding architecture is presented that is faster than decoders that employ inversion or the fully parallel BMA at a comparable circuit size.
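The inversion-less idea can be illustrated on a toy double-error-correcting BCH code over GF(2^4): the roots of the error locator polynomial are unchanged when it is scaled by a nonzero constant, so multiplying Peterson's solution by S1 removes the division. This is a self-contained sketch, not the proposed hardware architecture.

```python
# Inversion-less Peterson decoding for a double-error-correcting BCH code
# over GF(2^4) with primitive polynomial x^4 + x + 1 -- a toy sketch.

EXP = [0] * 30
LOG = [0] * 16
x = 1
for i in range(15):           # build exp/log tables for GF(2^4)
    EXP[i] = x
    LOG[x] = i
    x <<= 1
    if x & 0x10:
        x ^= 0x13
for i in range(15, 30):
    EXP[i] = EXP[i - 15]

def gf_mul(a, b):
    if a == 0 or b == 0:
        return 0
    return EXP[(LOG[a] + LOG[b]) % 15]

def gf_pow(a, n):
    return EXP[(LOG[a] * n) % 15] if a else 0

def syndrome(received, k):
    """S_k = r(alpha^k) for a binary received vector."""
    s = 0
    for i, bit in enumerate(received):
        if bit:
            s ^= gf_pow(EXP[1], k * i)
    return s

def decode_two_errors(received):
    """Peterson for t = 2 gives sigma2 = (S1^3 + S3) / S1. Scaling the
    locator by S1 yields S1 + S1^2 x + (S1^3 + S3) x^2: same roots, no
    division."""
    s1 = syndrome(received, 1)
    s3 = syndrome(received, 3)
    c0, c1, c2 = s1, gf_mul(s1, s1), gf_pow(s1, 3) ^ s3
    errors = []
    for pos in range(15):     # Chien search: test x = alpha^{-pos}
        xinv = EXP[(15 - pos) % 15]
        val = c0 ^ gf_mul(c1, xinv) ^ gf_mul(c2, gf_mul(xinv, xinv))
        if val == 0:
            errors.append(pos)
    return errors

# corrupt the all-zero codeword in positions 3 and 10
r = [0] * 15
r[3] = r[10] = 1
print(sorted(decode_two_errors(r)))  # → [3, 10]
```

Because the Chien search only tests whether the polynomial evaluates to zero, the scaling by S1 is harmless, and the inversion that dominates hardware complexity disappears.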
A new thermal shock application-oriented testing method for ceramic components and refractories
(2019)
Ceramics and refractories are often used in high-temperature applications such as industrial furnaces. Therefore, the thermomechanical and heat resistance of ceramic and refractory materials are important. The material behaviour is described by the thermal stress resistance. Established material tests to determine thermal shock behaviour are complex and do not yield key figures. The potential of application-related material testing in combination with simulations, with transfer from ceramics to refractories, is described below. The combination of model-based simulation with applied material testing offers numerous advantages. On the one hand, the design of the test setup is supported by the simulation, which results in a goal- and application-oriented test setup. On the other hand, the iterative approach allows model verification with the help of the applied material testing. The simulation shows that the transfer from ceramics to refractory material is possible and yields results consistent with the literature. The design reliability of the components is thereby improved, since different loads can initially be simulated in the model in combination with a variety of materials and geometries, thereby substituting complex and expensive preliminary tests. As a result, verified models offer a great savings potential in terms of time to market, development expenses and use of raw materials. Importantly, the method is suitable for both technical ceramics and refractory materials.
If the process contains a delay (dead time), the Nyquist criterion is well suited to derive a PI or PID tuning rule, because the delay is taken into account without approximation. The desired speed of the closed loop enters naturally through the crossover frequency, and the goals of robustness and performance are translated into the phase margin.
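As a small numerical illustration of this idea (not a tuning rule from the publication), the sketch below places the crossover frequency and phase margin for a first-order plant with dead time, G(s) = K e^{-sL} / (1 + sT), and solves the two Nyquist conditions, unit loop gain and phase -(180° - φm) at crossover, for the PI parameters. All plant and design values are assumptions.

```python
# PI tuning from crossover frequency wc and phase margin pm for
# G(s) = K * exp(-s*L) / (1 + s*T); the dead time enters exactly as
# a phase contribution -wc*L, without approximation.

import cmath, math

def pi_from_phase_margin(K, T, L, wc, pm):
    """Return (Kp, Ti) so that the open loop has gain 1 and phase
    -(pi - pm) at the chosen crossover frequency wc."""
    # phase lag the PI part must contribute at wc
    lag = math.pi - pm - wc * L - math.atan(wc * T)
    if not 0 < lag < math.pi / 2:
        raise ValueError("wc/pm choice infeasible for this plant")
    Ti = 1.0 / (wc * math.tan(lag))
    gain_G = K / math.sqrt(1.0 + (wc * T) ** 2)
    gain_C = math.sqrt(1.0 + 1.0 / (wc * Ti) ** 2)   # for Kp = 1
    Kp = 1.0 / (gain_G * gain_C)
    return Kp, Ti

Kp, Ti = pi_from_phase_margin(K=1.0, T=1.0, L=0.5, wc=1.0,
                              pm=math.radians(60))

# verify: open loop at wc has magnitude 1 and phase -120 degrees
s = 1j * 1.0
Lp = Kp * (1 + 1 / (Ti * s)) * cmath.exp(-s * 0.5) / (1 + s)
print(round(abs(Lp), 6), round(math.degrees(cmath.phase(Lp)), 3))  # → 1.0 -120.0
```

Raising wc speeds up the loop but shrinks the feasible region; the `ValueError` branch signals when the requested speed and margin cannot both be met for the given dead time.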
The investigation of stress requires distinguishing between stress caused by physical activity and stress caused by psychosocial factors. The behaviour of the heart in response to stress and to physical activity is very similar when the set of monitored parameters is reduced to one. Currently, the differentiation remains difficult, and methods that only use the heart rate are not able to differentiate between stress and physical activity without additional sensor data input. The approach focuses on methods that generate signals providing characteristics useful for detecting stress, physical activity, inactivity and relaxation.
The Lempel-Ziv-Welch (LZW) algorithm is an important dictionary-based data compression approach that is used in many communication and storage systems. The parallel dictionary LZW (PDLZW) algorithm speeds up the LZW encoding by using multiple dictionaries. The PDLZW algorithm applies different dictionaries to store strings of different lengths, where each dictionary stores only strings of the same length. This simplifies the parallel search in the dictionaries for hardware implementations. The compression gain of the PDLZW depends on the partitioning of the address space, i.e. on the sizes of the parallel dictionaries. However, there is no universal partitioning that is optimal for all data sources. This work proposes an address space partitioning technique that optimizes the compression rate of the PDLZW using a Markov model for the data. Numerical results for address spaces with 512, 1024, and 2048 entries demonstrate that the proposed partitioning improves the performance of the PDLZW compared with the original proposal.
While existing resource extraction debates have contributed to a better understanding of national economic and political dilemmas and institutional responses, little is understood about the specific relevance of the various types of mining schemes for rural households in dealing with the problems they are confronted with. Our paper examines the perceptions of gold mining effects on households in Northern Burkina Faso. The findings of our survey across six districts representing different mining schemes (industrial, artisanal, no mining) highlight the fact that artisanal gold mining can generate job opportunities and cash income for local households, whereas industrial gold mining widely fails to do so. However, the general economic and environmental settings exert a much stronger influence on the state of households. Gold mining effects are perceived as less advantageous in districts where people suffer from a lack of education, a higher vulnerability to drought and poor market access. Our findings provide empirical support for those who back the enhanced formalization of artisanal and small-scale mining (ASM) and policies that entail more rigorous state monitoring of mining concessions, especially in economically and environmentally disadvantaged contexts. Effectively addressing communal and pro-poor development requires greater attention to the political economy of ASM and corporate mining. It also calls for a greater inclusion of local mining stakeholders and a more effective alignment of international regulatory and advocacy efforts.
Ceramics are often used in high-temperature applications. Therefore, thermomechanical and heat resistance of ceramic and refractory materials are important. The material behaviour is described by thermal stress resistance. Established material tests to determine thermal shock behaviour are complex. The potential of application-related material testing in combination with simulations is described below.
This paper analyses international cooperation in research and development on alternative energy production. To this end, patents in this technological domain registered at the European Patent Office from 1997 to 2016 are analysed. International cooperation is considered to occur when patents involve co-assignment or co-inventorship spanning two or more countries. Generally, international R&D cooperation in alternative energy production tends to increase over time. In total, 2234 co-patents from 87 countries are identified. The cooperative relationships between countries are examined through social network analysis. The most significant states in the network are the United States of America and Germany. Innovative clusters and strong partnerships are identified. The alternative energy technologies that involve international cooperation most extensively are harnessing energy from man-made waste, solar energy and bio-fuels. The paper clarifies which countries cooperate with each other and for what purpose. The findings can be used for establishing R&D strategies in the domain of alternative energy production.
The first part of this work shows the development and application of a new material system using high strength duplex stainless steel wires as net material with environmentally compatible antifouling properties for off-shore fish farm cages. Current net materials made of textiles (polyamide) shall be partially replaced by high strength duplex stainless steel in order to obtain a more environmentally compatible system that withstands the more severe mechanical loads (waves, storms, predators such as sharks and seals). A new antifouling strategy is intended to resolve current issues such as ecological damage (e.g. due to copper disposal), high maintenance costs (e.g. for cleaning) and reduced durability.
High strength steel wires are also widely used in geological protection systems, for example for rockfall protection or slope stabilisation. Normally, hot-dip galvanised carbon steel is used in this case. However, in highly corrosive environments such as coastal areas, volcanic areas or mines, other solutions with high corrosion resistance and sufficient mechanical properties are necessary. Protection systems made of high strength duplex stainless steel wires enable a significantly longer service life and therefore a higher level of security.
Today, many resource-constrained systems, such as embedded systems, still rely on symmetric cryptography for authentication and digital signatures. Asymmetric cryptography provides a higher security level, but software implementations of public-key algorithms on small embedded systems are extremely slow. Hence, such embedded systems require hardware assistance, i.e. crypto coprocessors optimized for public-key operations. Many such coprocessor designs aim at high computational performance. In this work, an area-efficient elliptic curve cryptography (ECC) coprocessor is presented for applications in small embedded systems where high-performance coprocessors are too costly. We propose a simple control unit with a small instruction set that supports different ECC point multiplication (PM) algorithms. The control unit reduces the logic and the number of registers compared with other implementations of ECC point multiplication.
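The core operation such a coprocessor accelerates, point multiplication, can be sketched with a plain double-and-add loop over a small textbook curve. The curve, base point and field below are toy values, not the paper's parameters, and a real coprocessor would typically work in projective coordinates to avoid the per-step field inversions used here.

```python
# Toy elliptic-curve point multiplication: double-and-add on the
# textbook curve y^2 = x^3 + x + 1 over GF(23).

P_MOD, A, B = 23, 1, 1
O = None                        # point at infinity

def point_add(p, q):
    if p is O:
        return q
    if q is O:
        return p
    (x1, y1), (x2, y2) = p, q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return O                # p == -q
    if p == q:                  # tangent slope for doubling
        lam = (3 * x1 * x1 + A) * pow(2 * y1, P_MOD - 2, P_MOD)
    else:                       # chord slope for addition
        lam = (y2 - y1) * pow(x2 - x1, P_MOD - 2, P_MOD)
    lam %= P_MOD
    x3 = (lam * lam - x1 - x2) % P_MOD
    y3 = (lam * (x1 - x3) - y1) % P_MOD
    return (x3, y3)

def point_mul(k, p):
    """Left-to-right double-and-add: one doubling per key bit,
    one addition per set bit."""
    result = O
    for bit in bin(k)[2:]:
        result = point_add(result, result)
        if bit == "1":
            result = point_add(result, p)
    return result

G = (3, 10)                     # a point on the curve
print(point_mul(12, G))
assert point_add(point_mul(5, G), point_mul(7, G)) == point_mul(12, G)
```

A small instruction set like the one proposed only needs field multiply, add and a conditional point add to express this loop as well as alternative PM algorithms such as the Montgomery ladder.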
We present an alternative approach to grid management in low-voltage grids based on artificial intelligence. The developed decision support system is based on an artificial neural network (ANN). Due to the fast reaction time of our system, real-time grid management becomes possible. Remote-controllable switches and tap changers in transformer stations are used to actively manage the grid infrastructure. The algorithm can support distribution system operators in keeping the grid in a safe state at all times. Its functionality is demonstrated by a case study using a virtual test grid. The ANN achieves a prediction rate of around 90% for the different grid management strategies. By considering the four most likely solutions proposed by the ANN, the prediction rate increases to 98.8%, at the cost of a 0.1-second increase in the running time of the model.
Martensitic stainless steels are widely used, for example for blades, knives or cutters. These materials achieve their best corrosion resistance in the hardened condition. For better mechanical properties, tempering is normally applied to increase the durability. However, tempering also reduces the hardness and ultimately the corrosion resistance. Austempering is mainly used for low-alloy steels and offers a good compromise between durability, hardness and corrosion resistance. For martensitic stainless steels, austempering is normally not considered because of the very long tempering times.
This work shows first results of austempering several standard martensitic stainless steels and its influence on corrosion resistance. For reference, hardened as well as hardened and tempered specimens were investigated. The corrosion resistance was investigated by electrochemical methods.
It is well known that signal constellations which are based on a hexagonal grid, so-called Eisenstein constellations, exhibit a performance gain over conventional QAM ones. This benefit is realized by a packing and shaping gain of the Eisenstein (hexagonal) integers in comparison to the Gaussian (complex) integers. Such constellations are especially relevant in transmission schemes that utilize lattice structures, e.g., in MIMO communications. However, for coded modulation, the straightforward approach is to combine Eisenstein constellations with ternary channel codes. In this paper, a multilevel-coding approach is proposed where encoding and multistage decoding can directly be performed with state-of-the-art binary channel codes. An associated mapping and a binary set partitioning are derived. The performance of the proposed approach is contrasted to classical multilevel coding over QAM constellations. To this end, both the single-user AWGN scenario and the (multiuser) MIMO broadcast scenario using lattice-reduction-aided preequalization are considered. Results obtained from numerical simulations with LDPC codes complement the theoretical aspects.
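The packing and shaping gain mentioned above can be made concrete with a small calculation (illustrative, not taken from the paper): comparing the average symbol energy of the 16 lowest-norm Eisenstein integers against regular 16-QAM, both scaled to minimum distance 1. Ties among equal-norm hexagonal points are broken arbitrarily here.

```python
# Average symbol energy of a 16-point Eisenstein (hexagonal) constellation
# versus 16-QAM, both with minimum distance 1.

import math

# Eisenstein integers a + b*omega with omega = exp(2*pi*i/3);
# the squared norm is |a + b*omega|^2 = a^2 - a*b + b^2
candidates = [(a, b) for a in range(-4, 5) for b in range(-4, 5)]
candidates.sort(key=lambda ab: ab[0] ** 2 - ab[0] * ab[1] + ab[1] ** 2)
eisenstein16 = candidates[:16]
e_eis = sum(a * a - a * b + b * b for a, b in eisenstein16) / 16

# 16-QAM on the grid {-1.5, -0.5, 0.5, 1.5}^2, spacing 1
qam = [(x, y) for x in (-1.5, -0.5, 0.5, 1.5)
              for y in (-1.5, -0.5, 0.5, 1.5)]
e_qam = sum(x * x + y * y for x, y in qam) / 16

gain_db = 10 * math.log10(e_qam / e_eis)
print(e_eis, e_qam, round(gain_db, 2))  # → 2.25 2.5 0.46
```

The hexagonal points spend roughly 0.46 dB less energy for the same minimum distance at this small size; the multilevel-coding construction in the paper is what makes such constellations usable with binary codes.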
Rheumatoid arthritis is an autoimmune disease that causes chronic inflammation of synovial joints, often resulting in irreversible structural damage. The activity of the disease is evaluated by clinical examinations, laboratory tests, and patient self-assessment. The long-term course of the disease is assessed with radiographs of hands and feet. The evaluation of the X-ray images performed by trained medical staff requires several minutes per patient. We demonstrate that deep convolutional neural networks can be leveraged for a fully automated, fast, and reproducible scoring of X-ray images of patients with rheumatoid arthritis. A comparison of the predictions of different human experts and our deep learning system shows that there is no significant difference in the performance of human experts and our deep learning model.
Flash memories are non-volatile memory devices. The rapid development of flash technologies leads to higher storage density, but also to higher error rates. This dissertation considers this reliability problem of flash memories and investigates suitable error correction codes, e.g. BCH codes and concatenated codes. First, the flash cells, their functionality and their error characteristics are explained. Next, the mathematics of the employed algebraic codes are discussed. Subsequently, generalized concatenated codes (GCC) are presented. Compared to the commonly used BCH codes, concatenated codes promise higher code rates and lower implementation complexity. This complexity reduction is achieved by dividing a long code into smaller components, which require smaller Galois field sizes. The algebraic decoding algorithms enable an analytical determination of the block error rate, so that very low residual error rates can be guaranteed for flash memories. Besides the complexity reduction, generalized concatenated codes can exploit soft information. Such soft decoding is not practicable for long BCH codes. In this dissertation, two soft decoding methods for GCC are presented and analyzed. These methods are based on Chase decoding and the stack algorithm. The latter method explicitly uses the generalized concatenated code structure, where the component codes are nested subcodes; this property supports the complexity reduction. Moreover, the two-dimensional structure of GCC enables the correction of error patterns with statistical dependencies. One chapter of the thesis demonstrates how the concatenated codes can be used to correct two-dimensional cluster errors. To this end, a two-dimensional interleaver is designed with the help of Gaussian integers. This design achieves the correction of cluster errors with the best possible radius. Large parts of this work are dedicated to the question of how the decoding algorithms can be implemented in hardware.
These hardware architectures, their throughput and logic size are presented for long BCH-codes and generalized concatenated codes. The results show that generalized concatenated codes are suitable for error correction in flash memories, especially for three-dimensional NAND memory systems used in industrial applications, where low residual errors must be guaranteed.
The aim of this paper is to portray the risks of climate change for low mountain range tourism and to develop sustainable business models as an adaptation strategy. A mixed-method approach is applied, combining secondary analysis, a quantitative survey, and qualitative in-depth interviews in a transdisciplinary setting. Results show that, until now, climate change impacts on the snow situation in the Black Forest, at least above 1,000 m, have been mild and compensated by artificial snowmaking, and so far have not had measurable effects on tourism demand. In general, the Black Forest appears to be an attractive destination for more reasons than just snow. The climate issue seems to be regarded as a rather incidental occurrence with little importance to current business decisions. However, the authors present adaptation strategies as alternatives to snow tourism, e.g. the implementation of hiking hostels, since climate change will make winter tourism in the Black Forest impossible in the long run.
Border issues continue to be of interest in tourism literature, most significantly that which focusses on cross-border shopping (e.g., currency values, taxation, security). Borders as destinations are recognized in this area, but the notion of shopping as a destination is perhaps less acknowledged. Following a review of the relevant literature, including a table summarizing key areas of cross-border tourism research around the world, this paper presents a unique example of a border region with two-way traffic for cross-border shopping tourism: the border between Germany and Switzerland.
The particular case is where two cities meet at the border: Konstanz, Germany, and Kreuzlingen, Switzerland. An intercept survey and key informant interviews were conducted in both communities in the spring of 2015. The results indicate high levels of traffic for various products and services. And while residents are generally satisfied with cross-border shopping in their communities, there are emerging issues related to shopper volume: too many visitors in Konstanz and not enough in Kreuzlingen.
The paper concludes with a discussion that includes the development of a model of cross-border shopping tourism that recognizes the multiple layers in space and destination.
It also proposes further investigation of the particular issues related to volume on both sides of borders where cross-border shopping is the destination.
This paper presents a framework to assess the cultural sustainability of Aboriginal tourism in British Columbia, which must take into account the protection of human rights, good self-governance, identity, control of land, the authenticity of the tourism product, and a market-ready tourism product. Each of these criteria is specified by two indicators. The cultural sustainability framework was generated by triangulating qualitative research methods: expert interviews, secondary research, and participant and non-participant observations. This paper is thus conceptual in nature and inductive in its approach. It partly leverages a collaborative approach, as it includes interviewees in an iterative research loop. Furthermore, the paper shows why cultural sustainability is a determinant of the success of Aboriginal tourism.
A significant proportion of road traffic accidents are due to inattentiveness or fatigue at the wheel. Approaches to monitoring the driver's condition range from eye tracking and driving behavior analysis to yawn and blink detection and ECG measurement. This work describes the development of a mobile system for the measurement and processing of ECG data. The aim of the signal processing is to quantify the driver's fatigue via heart rate variability (HRV). The work includes the hardware and software design of the sensor. First, the development of the low-noise electronics, including AD conversion, is described. Then the software signal processing, with QRS complex detection and a plotting front end, is explained. The resulting sensor is compact, low-cost and provides a good signal for HRV extraction.
The business model canvas (BMC) and the lean start-up manifesto (LSM) have been changing both the entrepreneurial education and, on the practical side, the mindset in setting up innovative ventures since the burst of the dot-com bubble. However, few empirical insights on the business model implementation patterns that distinguish between digital and non-digital innovative ventures exist. Connecting practical management tools to network theory as well as to the theory of organizational learning, this paper investigates evolution patterns of digital and non-digital business models out of the deal flow of an innovation intermediary. For this purpose, a multi-dimensional quantitative content analysis research design is applied to 242 ventures' business plans. The measured strength of transaction relations to customers, suppliers, people, and financiers has been combined with performance indicators of the sampled ventures. The results indicate that in order to succeed, digital ventures iterate their business on the market early and search for investment afterwards. Contrariwise, non-digital ventures already need financial investments in the early stages to set up a product ready to be tested on the market. In both groups we found strong evidence that specific evolutionary patterns relate to higher rates of success.
Talk and abstract
In this paper, the problem of controlling the dissolved oxygen (DO) level during an aerobic fermentation is considered. The proposed approach deals with three major difficulties: the nonlinear dynamics of the DO, the poor accuracy of empirical models for the oxygen consumption rate, and the fact that only sampled measurements are available online. A nonlinear integral high-gain control law including a continuous-discrete time observer is designed to keep the DO in the neighborhood of a set-point value without any knowledge of the dissolved oxygen consumption rate. The local stability of the control algorithm is proved using Lyapunov tools. The performance of the control scheme is first analyzed in simulation and then experimentally evaluated during a successful fermentation of the bacterium Pseudomonas putida mt-2 over a period of three days.
Why does a teacher use drama elements in her language classes? This article critically reflects on the author's experiences with employing drama elements in a compulsory class of German as a Foreign Language at college level. Motives for using drama are discussed with regard to possible positive effects for the teacher, such as rapport with learners and personal confidence, and potential benefits for the language learners, such as promoting vocabulary and grammar learning, fostering oral skills and increasing motivation. After that, the drama activities used in class are briefly described. However, two years of experimentation have also revealed a number of problems with implementing drama elements in class and still leave questions open, requiring further iterations of the drama-based approach in this context. The paper was developed from a talk given at the Drama in Education Days in Konstanz in 2018.
Technical talk at the Swiss Tribology Symposium, 12 November 2019, Hightechzentrum Aargau, Brugg, Switzerland
Fatigue and drowsiness are responsible for a significant percentage of road traffic accidents. There are several approaches to monitoring the driver's drowsiness, ranging from the driver's steering behavior to analysis of the driver, e.g. eye tracking, blinking, yawning or the electrocardiogram (ECG). This paper describes the development of a low-cost ECG sensor to derive heart rate variability (HRV) data for drowsiness detection. The work includes the hardware and the software design. The hardware has been implemented on a printed circuit board (PCB) designed so that the board can be used as an extension shield for an Arduino. The PCB contains a double, inverted ECG channel including low-pass filtering and provides two analog outputs to the Arduino, which combines them and performs the analog-to-digital conversion. The digital ECG signal is transferred to an NVidia embedded PC where the processing takes place, including QRS complex, heart rate and HRV detection as well as visualization features. The resulting compact sensor provides good results in the extraction of the main ECG parameters. The sensor is being used in a larger framework, in which facial-recognition-based drowsiness detection is combined with ECG-based detection to improve the recognition rate under unfavorable light or occlusion conditions.
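The processing chain described above (QRS detection, heart rate, HRV) can be illustrated with a minimal example. The threshold detector, sampling rate and synthetic spike train below are assumptions for illustration, not the sensor's actual algorithm.

```python
# A minimal sketch of the ECG processing chain: R-peak detection by
# thresholding, then RR intervals and an HRV measure (RMSSD).

import math

FS = 250  # sampling rate in Hz (assumed)

def detect_r_peaks(signal, threshold, refractory=0.3):
    """Indices of local maxima above `threshold`, at least
    `refractory` seconds apart."""
    peaks = []
    for i in range(1, len(signal) - 1):
        if (signal[i] > threshold and signal[i] >= signal[i - 1]
                and signal[i] > signal[i + 1]):
            if not peaks or (i - peaks[-1]) / FS > refractory:
                peaks.append(i)
    return peaks

def hrv_rmssd(rr_ms):
    """Root mean square of successive RR-interval differences (ms)."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# synthetic ECG-like trace: unit spikes at 0.8 s, 1.6 s, 2.5 s, 3.3 s
signal = [0.0] * 1000
for b in (200, 400, 625, 825):
    signal[b] = 1.0

peaks = detect_r_peaks(signal, threshold=0.5)
rr_ms = [(b - a) * 1000 / FS for a, b in zip(peaks, peaks[1:])]
print(peaks, rr_ms, hrv_rmssd(rr_ms))
# → [200, 400, 625, 825] [800.0, 900.0, 800.0] 100.0
```

A real QRS detector (e.g. of the Pan-Tompkins family) adds band-pass filtering, differentiation and adaptive thresholds, but the downstream RR-interval and RMSSD computation is the same.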
Low temperature carburizing of a series of austenitic stainless steels with various combinations of chromium and nickel equivalents was performed. The investigation of the response to low temperature carburizing for three stainless steels with various Cr and Ni equivalents showed that the carbon uptake depends significantly on the chemical composition of the base material. The higher carbon content in the expanded austenite layer of specimen 6 (1.4565) and specimen 4 (1.4539/AISI 904L) compared to specimen 2 (1.4404/AISI 316L) is assumed to be mainly related to the difference in the specimens' chromium content: more chromium leads to more lattice expansion. Along with the higher carbon content, higher hardness values and higher compressive residual stresses are introduced in the expanded austenite zone than for low temperature carburized AISI 316L. The residual stresses obtained from X-ray diffraction lattice strain investigation depend strongly on the chosen X-ray elastic constants. Presently, no values are known for carbon (or nitrogen) stabilized expanded austenite. Nevertheless, first-principles elastic constants for γ′-Fe4C appear to provide realistic residual stress values. Magnetic force microscopy and measurements with an eddy current probe indicate that austenitic stainless steels can become ferromagnetic upon carburizing, similar to low temperature nitriding. The apparent transition from para- to ferromagnetism cannot be attributed entirely to the interstitially dissolved carbon content in the formed expanded austenite layer but appears to depend also on the metallic composition of the alloy, in particular the Ni content.
The organizational capability to adapt to the fast and radical changes of market parameters becomes a prerequisite for companies’ long-term survival. In this context, organizational ambidexterity has gained much attention in research and practice. It is the capability to develop new businesses (exploration) while simultaneously optimizing the existing core businesses (exploitation). Established companies face several challenges in achieving this capability, as the underlying learning modes of exploration and exploitation are mutually incompatible. One way to solve these challenges is to separate the exploration-oriented part from the core organization. Corporate venturing has been widely recognized as one tool to create these dual structures to develop new businesses, based on discontinuous innovation. In recent times, new corporate venturing forms emerge in practice. This growing number of different forms has led to new applications of corporate venturing which go beyond the pure development of new businesses, toward supporting the entrepreneurial transformation of companies. This study aims at answering how different corporate venturing forms contribute to the strategic renewal of established companies. For this purpose, qualitative research methods are used to analyze data from 17 interviews conducted in two German high-tech companies. The study at hand provides empirical evidence in the field of corporate venturing by uncovering new insights about the different transformational effects of corporate venturing initiatives on the core organization. It further reveals that corporate venturing forms can be classified into two categories according to their respective level of entrepreneurship and frequency of execution. Both categories exhibit different transformational effects and can be understood as being complementary to each other.
The Lempel–Ziv–Welch (LZW) algorithm is an important dictionary-based data compression approach that is used in many communication and storage systems. The parallel dictionary LZW (PDLZW) algorithm speeds up the LZW encoding by using multiple dictionaries. This simplifies the parallel search in the dictionaries. However, the compression gain of the PDLZW depends on the partitioning of the address space, i.e. on the sizes of the parallel dictionaries. This work proposes an address space partitioning technique that optimises the compression rate of the PDLZW. Numerical results for address spaces with 512, 1024, and 2048 entries demonstrate that the proposed address partitioning improves the performance of the PDLZW compared with the original proposal. These address space sizes are suitable for flash storage systems. Moreover, the PDLZW has relatively high memory requirements which dominate the costs of a hardware implementation. This work proposes a recursive dictionary structure and a word partitioning technique that significantly reduce the memory size of the parallel dictionaries.
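The abstract itself contains no code; as a rough illustration of the dictionary-based encoding that the PDLZW parallelises, a minimal single-dictionary LZW sketch (not the authors' parallel design, whose address space partitioning is the paper's contribution) might look like:

```python
def lzw_encode(data: bytes) -> list[int]:
    """Minimal LZW encoder: grows a dictionary of byte sequences and
    emits the index of the longest known prefix at each step."""
    dictionary = {bytes([i]): i for i in range(256)}  # seed with single-byte entries
    next_code = 256
    output = []
    current = b""
    for byte in data:
        candidate = current + bytes([byte])
        if candidate in dictionary:
            current = candidate                  # extend the current match
        else:
            output.append(dictionary[current])   # emit longest known prefix
            dictionary[candidate] = next_code    # learn the new sequence
            next_code += 1
            current = bytes([byte])
    if current:
        output.append(dictionary[current])
    return output
```

A PDLZW implementation would split this single address space into several smaller dictionaries (e.g. one per word length) that can be searched in parallel; how many entries each of those dictionaries receives is exactly the partitioning choice the paper optimises.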
This paper shows how a bed system with embedded sensors is able to analyze a person’s movement and breathing and to recognize the positions in which the subject lies on the bed during the night, without any additional physical contact. The measurements are performed with sensors placed between the mattress and the frame. An Intel Edison board was used as an endpoint that served as a communication node from the mesh network to external services. Two nodes and the Intel Edison are attached to the bottom of the bed frame and connected to the sensors.
The reliable supply of energy is an essential prerequisite for the economic success of a country. Questions of sustainability and the replacement of import dependencies pose new tasks that require new approaches. This contribution provides an overview of these dependencies using the example of German electrical power grids integrating renewable energies. Aspects of energy trading and grid stability are brought into connection; stock exchange trading, grid codes and the volatility of the primary energies used are discussed.
Engineering and management
(2019)
Entrepreneurial employees
(2019)
Volatile markets and accelerating innovation cycles progressively force established companies to adopt alternative innovation strategies such as entrepreneurship. Due to the key role entrepreneurial employees play in strengthening the company's abilities for innovation and change, various concepts have emerged, such as corporate entrepreneurship or intrapreneurship. While the extant literature has increasingly examined specific issues of entrepreneurial employees, an overall view is still lacking. Therefore, the purpose of this paper is to structurally present current research on entrepreneurial employees by conducting a broad systematic literature review. The resulting research streams contribute to a clearer justification for future research and are a first step towards a comprehensive research view related to intrapreneurship.
Fast and reliable acquisition of truth data for document analysis using cyclic suggest algorithms
(2019)
In document analysis the availability of ground truth data plays a crucial role for the success of a project. This is even more true with the rise of new deep learning methods which heavily rely on the availability of training data. But even for traditional, hand-crafted algorithms that are not trained on data, reliable test data is important for the improvement and evaluation of the methods. Because ground truth acquisition is expensive and time consuming, semi-automatic methods are introduced which make use of suggestions coming from document analysis systems. The interaction between the human operator and the automatic analysis algorithms is the key to speeding up the process while improving the quality of the data. The final confirmation of the data is always left to the human operator. This paper demonstrates a use case for the acquisition of truth data in a mail processing system. It shows why a new, extended view on truth data is necessary in the development and engineering of such systems. An overview of the tool and the data handling is given, the advantages in the workflow are shown, and consequences for the construction of analysis algorithms are discussed. It can be shown that the interplay between suggest algorithms and the human operator leads to very fast truth data capturing. The surprising finding is that suggest algorithms which circularly depend on each other's data are especially effective in terms of speed and accuracy.
Flatness-based feed-forward control of solenoid actuators is considered. For precise motion planning and accurate steering of conventional solenoids, eddy currents cannot be neglected. The system of ordinary differential equations including eddy currents that describes the nonlinear dynamics of such actuators is not differentially flat. Thus, a distributed parameter approach based on a diffusion equation is considered, which enables the parametrization of the eddy current by the armature position and its time derivatives. In order to design the feed-forward control, the distributed parameter model of the eddy current subsystem is combined with a typical nonlinear lumped parameter model for the electrical and mechanical subsystems of the solenoid. The control design and its application are illustrated by numerical and practical results for an industrial solenoid actuator.
Flooded Edge Gateways
(2019)
Increasing numbers of internet-compatible devices, in particular in the context of IoT, generate ever-increasing amounts of data. The processing and analysis of a continuously growing amount of data in real time by means of cloud platforms cannot be guaranteed anymore. Approaches of Edge Computing decentralize parts of the data analysis logic towards the data sources in order to control the data transfer rate to the cloud through pre-processing with predefined quality-of-service parameters. In this paper, we present a solution for preventing overloaded gateways by optimizing the transfer of IoT data through a combination of Complex Event Processing and Machine Learning. The presented solution is completely based on open-source technologies and can therefore also be used in smaller companies.
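The paper does not publish its implementation; a minimal, purely illustrative sketch of the underlying idea (a gateway that forwards raw readings while a per-window event budget holds, and pre-aggregates when it would otherwise be overloaded — the class and method names are hypothetical) could be:

```python
class EdgeGateway:
    """Illustrative edge gateway: forwards raw sensor readings while the
    per-window event count stays under a budget; when overloaded, it
    pre-aggregates the window to a single average before uploading."""

    def __init__(self, max_events_per_window: int):
        self.budget = max_events_per_window

    def process_window(self, readings: list) -> list:
        if len(readings) <= self.budget:
            return list(readings)               # under budget: forward raw data
        return [sum(readings) / len(readings)]  # overloaded: aggregate first
```

The actual system described in the paper replaces this fixed threshold with Complex Event Processing rules and a learned model deciding when and how to pre-process.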
Generative Design Software - How does digitalization change the professional profile of architects
(2019)
Abstract, poster and presentation
Extracting suitable features from acquired data to accurately depict the current health state of a system is crucial in data-driven condition monitoring and prediction. Usually, analogue sensor data is sampled at rates far exceeding the Nyquist rate, containing substantial amounts of redundancies and noise and imposing high computational loads due to the subsequent and necessary feature processing chain (generation, dimensionality reduction, rating and selection). To overcome these problems, Compressed Sensing can be used to sample directly to a compressed space, provided the signal at hand and the employed compression/measurement system meet certain criteria. Theory states that during this compression step enough information is conserved, such that a reconstruction of the original signal is possible with high probability. The proposed approach, however, does not rely on reconstructed data for condition monitoring purposes, but uses the compressed signal representation directly as feature vector. It is hence assumed that enough information is conveyed by the compression for condition monitoring purposes. To fuse the compressed coefficients into one health index that can be used as input for remaining useful life prediction algorithms and is limited to a reasonable range between 0 and 1, a logistic regression approach is used. Run-to-failure data of three translational electromagnetic actuators is used to demonstrate the health index generation procedure. A comparison to the time domain ground truth signals obtained from Nyquist sampled coil current measurements shows reasonable agreement. That is, the underlying wear-out phenomena can be reproduced by the proposed approach, enabling further investigation of the application of prognostic methods.
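The fusion step described above — mapping a vector of compressed coefficients to a scalar health index in (0, 1) via a logistic model — can be sketched as follows; the function name is hypothetical, and the weights and bias stand in for parameters that would be fitted on the run-to-failure data:

```python
import math

def health_index(coeffs, weights, bias):
    """Fuse compressed-sensing coefficients into a scalar health index
    in (0, 1) via a logistic model. weights/bias are illustrative
    placeholders for parameters fitted on run-to-failure data."""
    z = bias + sum(w * c for w, c in zip(weights, coeffs))
    return 1.0 / (1.0 + math.exp(-z))
```

With suitably fitted parameters, a healthy system would map close to 1 and a worn-out one close to 0, giving remaining-useful-life algorithms a bounded, monotone input.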
The project aims for the development of a new material system from high tensile stainless steel wires as net material with environmentally compatible antifouling properties for off-shore fish farm cages. Therefore, current net materials from textiles (polyamide) shall be partially replaced by high strength stainless steel in order to have a more environmentally compatible system which meets the more severe mechanical loads (waves, storms, predators such as sharks). With a new antifouling strategy, current issues such as ecological damage (e.g. due to copper disposal), high maintenance costs (e.g. for cleaning) and reduced durability shall be resolved.
The goal of the presented project is to develop the concept of home ehealth centers for barrier-free and cross-border telemedicine. AAL technologies are already present on the market but there is still a gap to close until they can be used for ordinary patient needs. The general idea needs to be accompanied by new services, which should be brought together in order to provide a full coverage of service for the users. Sleep and stress were chosen as predominant conditions for a detailed study within this project because of their widespread influence in the population. The executed scientific study of available home devices for sleep analysis has provided the information necessary to select appropriate devices. The first choice for the project implementation is the EMFIT QS+ device. This equipment forms part of a complete system that a home telemedical hospital can provide, with a suitable level of precision and communication with internal and/or external health services.
One important skill for engineers is the ability to optimize their experiments. On the job, they will often spend far more time designing and improving an experimental setup than running the actual experiment itself. Is it possible to teach this complex task in physics labs? A method for reaching this goal is proposed, and an example is given and discussed.
The Industrial Internet of Things (IIoT) will leverage wireless network technologies to integrate Cyber-Physical Systems in a seamless manner into existing information systems. In this context, the 6TiSCH architecture, proposed by the IETF, represents the current leading standardization effort to enable timed and reliable data communication within IPv6 networks for industrial applications. In wireless networks, Link Quality Estimation (LQE) is a crucial task to select the best routes for data forwarding, regardless of unpredictable time-varying conditions. Although many solutions for LQE have been proposed in the literature, the majority of them are not designed specifically for 6TiSCH networks. In this paper, we analyze the performance of existing LQE strategies on 6TiSCH networks.
First, we run a set of simulations to measure the performance of one existing LQE strategy in 6TiSCH. Our simulations show that such a strategy can result in measurements with low accuracy due to the 6TiSCH default timeslot allocation strategy. Consequently, we propose an extension of the 6TiSCH Minimal Configuration that allocates specific timeslots for the transmission of probing messages to mitigate the problem. The proposed methodology is demonstrated to effectively reduce the LQE error.
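The abstract does not specify which LQE strategy was evaluated; a common baseline in low-power wireless networks (offered here only as an illustrative sketch, not necessarily the strategy from the paper) is an exponentially weighted moving average of the packet reception ratio:

```python
class EwmaLinkEstimator:
    """Baseline link-quality estimator: exponentially weighted moving
    average (EWMA) of packet reception outcomes (1 = ACKed, 0 = lost)."""

    def __init__(self, alpha: float = 0.1, initial: float = 1.0):
        self.alpha = alpha      # weight of the newest observation
        self.prr = initial      # smoothed packet reception ratio

    def update(self, received: bool) -> float:
        sample = 1.0 if received else 0.0
        self.prr = (1 - self.alpha) * self.prr + self.alpha * sample
        return self.prr
```

Such an estimator is only as good as its input samples; the paper's point is that the 6TiSCH default timeslot allocation yields too few (or poorly timed) transmission opportunities per link, which is why dedicated probing timeslots improve the estimate.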
Corporate Entrepreneurship (CE) became the new paradigm for organizations to cope with the accelerated development of innovations. Therefore, especially established organizations increasingly implement CE activities, even in combination. Scholars point out that a coordinated portfolio of CE activities could yield synergies and thus higher value for the organization, and further call for more scientific examinations. This literature review aims to improve the understanding of the combined and coordinated use of CE activities as well as of the resulting synergies. Results show that only very few studies have addressed a combination and/or coordination of CE activities with respect to the creation of additional value, albeit without empirical analyses. Yet, five categories of direct and indirect synergies could be derived. Discussing the results as well as the heterogeneous use of terminology and concepts, this paper concludes with a research agenda for future analyses.
This PhD investigation lies at the intersection of Architecture, Textile Design and Interaction Design and speculates about sustainable forms of future living, focussing on bionic principles to create alternative lightweight building structures with textiles and digital fabrication techniques. In an interdisciplinary, practice-based design approach — informed by radical case studies on soft architectures from the 1960s to 80s such as Archigram, Buckminster Fuller, Cedric Price, or Yona Friedman, by critical theory on new materialism (D. Haraway 1997, K. Barad 1998, J. Bennett 2007), and by sociological and philosophical (B. Latour 2005, G. Deleuze & F. Guattari 1987) as well as phenomenological thinkers (L. Malafouris 2005, J. Rancière 2004, B. Massumi 2002, N. Bourriaud 2002, M. Merleau-Ponty 1963) — this research investigates the cultural and social rootedness (Verortung) of novel materials and technologies, exploring the in-between prosthetic relations between the body and the environment.