This paper describes an early lumping approach for generating a mathematical model of the heating process of a moving dual-layer substrate. The heat is supplied by convection and nonlinearly distributed over the whole considered spatial extent of the substrate. Using CFD simulations as a reference, two different modelling approaches have been investigated in order to achieve the most suitable model type. It is shown that due to the possibility of using the transition matrix for time discretization, an equivalent circuit model achieves superior results when compared to the Crank-Nicolson method. In order to maintain a constant sampling time for the envisioned control strategies, the effect of variable speed is transformed into a system description where the state vector has constant length but a variable number of non-zero entries. The handling of the variable transport speed during the heating process is considered the main contribution of this work. The result is a model suitable for use in future control strategies.
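The variable-speed handling can be illustrated with a toy sketch (the cell discretisation, function name and inflow handling below are illustrative assumptions, not the authors' model): at every constant-length sampling step the substrate advances by a variable number of cells, so the state vector keeps a fixed length while freshly entered cells are initialised with the inflow temperature.

```python
def transport_step(state, cells_moved, inflow_temp):
    """Shift the spatially discretised temperature state by a variable
    number of cells per (constant) sampling interval; cells entering the
    heated zone start at the inflow temperature."""
    return [inflow_temp] * cells_moved + state[:len(state) - cells_moved]
```

A substrate standing still corresponds to `cells_moved = 0`, in which case the state is returned unchanged.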
Online-based business models, such as shopping platforms, have added new possibilities for consumers over the last two decades. Aside from basic differences to other distribution channels, customer reviews on such platforms have become a powerful tool, which provides consumers with an additional source of transparency. Related research has, for the most part, been labelled under the term electronic word-of-mouth (eWOM). An approach providing a theoretical basis for this phenomenon is presented here. The approach is mainly based on work in the field of consumer culture theory (CCT) and on the concept of co-creation. The work of several authors in these streams of research is used to construct a culturally informed resource-based theory, as advocated by Arnould & Thompson and Algesheimer & Gurâu.
A constructive method for the design of nonlinear observers is discussed. To formulate conditions for the construction of the observer gains, stability results for nonlinear singularly perturbed systems are utilised. The nonlinear observer is designed directly in the given coordinates, where the error dynamics between the plant and the observer become singularly perturbed by a high-gain part of the observer injection, and the information of the slow manifold is exploited to construct the observer gains of the reduced-order dynamics. This is in contrast to typical high-gain observer approaches, where the observer gains are chosen such that the nonlinearities are dominated by a linear system. It will be demonstrated that the considered approach is particularly suited for self-sensing electromechanical systems. Two variants of the proposed observer design are illustrated for a nonlinear electromagnetic actuator, where the mechanical quantities, i.e. the position and the velocity, are not measured.
Generalized concatenated (GC) codes with soft-input decoding were recently proposed for error correction in flash memories. This work proposes a soft-input decoder for GC codes that is based on a low-complexity bit-flipping procedure. This bit-flipping decoder uses a fixed number of test patterns and an algebraic decoder for soft-input decoding. An acceptance criterion for the final candidate codeword is proposed. Combined with error and erasure decoding of the outer Reed-Solomon codes, this bit-flipping decoder can improve the decoding performance and reduce the decoding complexity compared to the previously proposed sequential decoding. The bit-flipping decoder achieves a decoding performance similar to a maximum likelihood decoder for the inner codes.
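The test-pattern principle can be sketched in a Chase-like form. The (7,4) Hamming code and syndrome decoder below are stand-ins for the actual GC inner codes (which the abstract does not specify), and all function names are illustrative: the decoder flips combinations of the least reliable bits, decodes each trial algebraically, and accepts the candidate with the smallest analog weight.

```python
import itertools
import numpy as np

# Illustrative inner code: the (7,4) Hamming code, whose parity-check
# columns are the numbers 1..7 in binary (column j encodes j + 1).
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])

def algebraic_decode(hard):
    """Single-error syndrome decoding: flip the bit whose H-column
    equals the syndrome (zero syndrome means a valid codeword)."""
    out = hard.copy()
    syn = H @ hard % 2
    if syn.any():
        out[int(''.join(map(str, syn)), 2) - 1] ^= 1
    return out

def bit_flip_decode(llr, num_flip=3):
    """Soft-input decoding with a fixed set of test patterns: flip
    combinations of the num_flip least reliable bits, decode each trial
    algebraically, and accept the valid candidate with the smallest
    analog weight (sum of |LLR| over the positions it changes)."""
    hard = (llr < 0).astype(int)                 # hard decisions
    weak = np.argsort(np.abs(llr))[:num_flip]    # least reliable bits
    best, best_metric = None, np.inf
    for pattern in itertools.product([0, 1], repeat=num_flip):
        trial = hard.copy()
        trial[weak] = trial[weak] ^ np.array(pattern)
        cand = algebraic_decode(trial)
        if (H @ cand % 2).any():                 # acceptance check
            continue
        metric = np.abs(llr)[cand != hard].sum()
        if metric < best_metric:
            best, best_metric = cand, metric
    return best
```

With 2^num_flip fixed test patterns, the complexity is a small constant multiple of one algebraic decoding, which is the attraction of this decoder class.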
The introduction of multiple-level cell (MLC) and triple-level cell (TLC) technologies reduced the reliability of flash memories significantly compared with single-level cell flash. With MLC and TLC flash cells, the error probability varies for the different states. Hence, asymmetric models are required to characterize the flash channel, e.g., the binary asymmetric channel (BAC). This contribution presents a combined channel and source coding approach improving the reliability of MLC and TLC flash memories. With flash memories, data compression has to be performed at block level on short data blocks. We present a coding scheme suitable for blocks of 1 kB of data. The objective of the data compression algorithm is to reduce the amount of user data such that the redundancy of the error correction coding can be increased in order to improve the reliability of the data storage system. Moreover, data compression can be utilized to exploit the asymmetry of the channel to reduce the error probability. With redundant data, the proposed combined coding scheme results in a significant improvement of the program/erase cycling endurance and the data retention time of flash memories.
When designing drying processes for sensitive biological foodstuffs like fruit or vegetables, energy and time efficiency as well as product quality are gaining more and more importance. All of these are greatly influenced by the different drying parameters (e.g. air temperature, air velocity and dew point temperature) in the process. In sterilization of food products the cooking value is widely used as a cross-link between these parameters. In a similar way, the so-called cumulated thermal load (CTL) was introduced for drying processes. This was possible because most quality changes mainly depend on drying air temperature and drying time. In a first approach, the CTL was therefore defined as the time integral of the surface temperature of agricultural products. When conducting experiments with mangoes and pineapples, however, it was found that the CTL as it was used had to be adjusted to a more practical form. The definition of the CTL was therefore improved, and the behaviour of the adjusted CTL (CTLad) was investigated in the drying of pineapples and mangoes. On the basis of these experiments and the work that had been done on the cooking value, it was found that the CTLad had to be optimized further to enable comparison of a great variety of different products as well as different quality parameters.
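The first-approach definition of the CTL as a time integral of surface temperature can be sketched numerically (the function name and the trapezoidal discretisation are illustrative; the measured temperature traces of the experiments are not reproduced):

```python
def cumulated_thermal_load(times, surface_temps):
    """Trapezoidal approximation of CTL = integral of T_surface(t) dt
    over the drying time, from sampled surface temperatures."""
    ctl = 0.0
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        ctl += 0.5 * (surface_temps[i] + surface_temps[i - 1]) * dt
    return ctl
```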
Advanced approaches for analysis and form finding of membrane structures with finite elements
(2018)
Part I deals with material modelling of woven fabric membranes. Due to their structure of crossed yarns embedded in coating, woven fabric membranes are characterised by a highly nonlinear stress-strain behaviour. In order to determine an accurate structural response of membrane structures, a suitable description of the material behaviour is required. A linear elastic orthotropic model approach, which is current practice, only allows a relatively coarse approximation of the material behaviour. The present work focuses on two different material approaches: A first approach becomes evident by focusing on the meso-scale. The inhomogeneous yet periodic structure of woven fabrics motivates microstructural modelling. An established microstructural model is considered and enhanced with regard to the coating stiffness. Secondly, an anisotropic hyperelastic material model for woven fabric membranes is considered. By performing inverse processes of parameter identification, fits of the two different material models w.r.t. measured data from a common biaxial test are shown. The results of the inversely parametrised material models are compared and discussed.
Part II presents an extended approach for a simultaneous form finding and cutting patterning computation of membrane structures. The approach is formulated as an optimisation problem in which both the geometries of the equilibrium and cutting patterning configuration are initially unknown. The design objectives are minimum deviations from prescribed stresses in warp and fill direction along with minimum shear deformation. The equilibrium equations are introduced into the optimisation problem as constraints. Additional design criteria can be formulated (for the geometry of seam lines etc.). Similar to the motivation for the Updated Reference Strategy [4], the described problem is singular in the tangent plane. In both the equilibrium and the cutting patterning configuration, finite element nodes can move without changing stresses. Therefore, several approaches are presented to stabilise the algorithm. The overall result of the computation is a stressed equilibrium and an unstressed cutting patterning geometry. The interaction of both configurations is described in a Total Lagrangian formulation.
The microstructural model on which Part I focuses is applied. Based on this approach, information about fibre orientation as well as the ending of fibres at cutting edges is available. This allows more accurate results to be computed than with the simpler approaches commonly used in practice.
The paper investigates an innovative actuator combination based on the magnetic shape memory technology. The actuator is composed of an electromagnet, which is activated to produce motion, and a magnetic shape memory element, which is used passively to yield multistability, i.e. the possibility of holding a position without input power. Based on the experimental open-loop frequency characterization of the actuator, a position controller is developed and tested in several experiments.
This paper describes the effectiveness and efficiency of Virtual Reality training for an order-picking process. To this end, 500 picking orders comprising more than 2000 part-picking operations were conducted with 30 test subjects and analyzed in the Modellfabrik Bodensee. The study points out the advantages and disadvantages of virtual training in comparison to a real execution of a picking process with and without any training.
Input–Output modellers are often faced with the task of estimating missing Use tables at basic prices and also valuation matrices of the individual countries. This paper examines a selection of estimation methods applied to the European context where the analysts are not in possession of superior data. The estimation methods are restricted to the use of automated methods that would require more than just the row and column sums of the tables (as in projections) but less than a combination of various conflicting information (as in compilation). The results are assessed against the official Supply, Use and Input–Output tables of Belgium, Germany, Italy, Netherlands, Finland, Austria and Slovakia by using matrix difference metrics. The main conclusion is that using the structures of previous years usually performs better than any other approach.
Objective: This paper presents an algorithm for non-invasive sleep stage identification using respiratory, heart rate and movement signals. The algorithm is part of a system suitable for long-term monitoring in a home environment, which should support experts analysing sleep. Approach: As there is a strong correlation between bio-vital signals and sleep stages, multinomial logistic regression was chosen for categorical distribution of sleep stages. Several derived parameters of three signals (respiratory, heart rate and movement) are input for the proposed method. Sleep recordings of five subjects were used for the training of a machine learning model, and 30 overnight recordings collected from 30 individuals, with about 27 000 epochs of 30 s intervals each, were evaluated. Main results: The achieved rate of accuracy is 72% for Wake, NREM, REM (with Cohen's kappa value 0.67) and 58% for Wake, Light (N1 and N2), Deep (N3) and REM stages (Cohen's kappa is 0.50). Our approach has confirmed the potential of this method and revealed several ways for its improvement. Significance: The results indicate that respiratory, heart rate and movement signals can be used for sleep studies with a reasonable level of accuracy. These inputs can be obtained non-invasively, which makes the approach applicable in a home environment. The proposed system introduces a convenient approach for a long-term monitoring system which could support sleep laboratories. The developed algorithm allows for easy adjustment of the input parameters depending on the available signals and could therefore also be used with various hardware systems.
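The classifier at the core of the approach, multinomial logistic regression, can be sketched in a few lines (the synthetic features and the gradient-descent training below are illustrative; the paper's actual feature extraction from respiratory, heart-rate and movement signals is not reproduced):

```python
import numpy as np

def train_softmax(X, y, n_classes, lr=0.1, epochs=2000):
    """Minimal multinomial logistic regression trained by gradient
    descent; X holds one feature vector per 30 s epoch, y the integer
    stage labels (e.g. 0 = Wake, 1 = NREM, 2 = REM)."""
    n, d = X.shape
    W, b = np.zeros((d, n_classes)), np.zeros(n_classes)
    Y = np.eye(n_classes)[y]                          # one-hot targets
    for _ in range(epochs):
        logits = X @ W + b
        logits -= logits.max(axis=1, keepdims=True)   # stabilise exp
        P = np.exp(logits)
        P /= P.sum(axis=1, keepdims=True)             # softmax probabilities
        W -= lr * X.T @ (P - Y) / n                   # cross-entropy gradient
        b -= lr * (P - Y).mean(axis=0)
    return W, b

def predict(X, W, b):
    """Most probable stage per epoch."""
    return np.argmax(X @ W + b, axis=1)
```

Cross-entropy for softmax regression is convex, so plain gradient descent suffices for a sketch of this size.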
This thesis considers bounding functions for multivariate polynomials and rational functions over boxes and simplices. It also considers the synthesis of polynomial Lyapunov functions for establishing the stability of control systems. Bounding the range of functions is an important issue in many areas of mathematics and its applications, such as global optimization, computer-aided geometric design, and robust control.
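The core range-bounding tool can be sketched in the simplest setting, a univariate polynomial over [0, 1] (the thesis treats the multivariate case over boxes and simplices; the function name is illustrative): the Bernstein coefficients of the polynomial bracket its range, so their minimum and maximum give a guaranteed enclosure.

```python
from math import comb

def bernstein_enclosure(a):
    """Range enclosure of p(x) = sum(a[i] * x**i) over [0, 1]:
    the Bernstein coefficients b_j = sum_{i<=j} C(j,i)/C(n,i) * a[i]
    satisfy min(b) <= p(x) <= max(b) on [0, 1]."""
    n = len(a) - 1
    b = [sum(comb(j, i) / comb(n, i) * a[i] for i in range(j + 1))
         for j in range(n + 1)]
    return min(b), max(b)
```

For p(x) = x(1 - x), coefficients [0, 1, -1], the Bernstein coefficients are [0, 0.5, 0], so the enclosure (0, 0.5) contains the true range [0, 0.25]; the first and last coefficients always equal the values at the endpoints.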
Research on Shadow IT is facing a conceptual dilemma in cases where previously "covert" systems developed by business entities (individual users, business workgroups, or business units) are integrated in the organizational IT management. These systems become visible, are therefore not "in the shadows" anymore, and subsequently do not fit to existing definitions of Shadow IT. Practice shows that some information systems share characteristics of Shadow IT, but are created openly in alignment with the IT department. This paper therefore proposes the term "Business-managed IT" to describe "overt" information systems developed or managed by business entities. We distinguish Business-managed IT from Shadow IT by illustrating case vignettes. Accordingly, our contribution is to suggest a concept and its delineation against other concepts. In this way, IS researchers interested in IT originated from or maintained by business entities can construct theories with a wider scope of application that are at the same time more specific to practical problems. In addition, value-laden terminology is complemented by a vocabulary that values potentially innovative developments by business entities more adequately. From a practical point of view, the distinction can be used to discuss the distribution of task responsibilities for information systems.
Deep neural networks have become a veritable alternative to classic speaker recognition and clustering methods in recent years. However, while the speech signal clearly is a time series, and despite the body of literature on the benefits of prosodic (suprasegmental) features, identifying voices has usually not been approached with sequence learning methods. Only recently has a recurrent neural network (RNN) been successfully applied to this task, while the use of convolutional neural networks (CNNs) (which, unlike RNNs, are not able to capture arbitrary time dependencies) still prevails. In this paper, we show the effectiveness of RNNs for speaker recognition by improving state-of-the-art speaker clustering performance and robustness on the classic TIMIT benchmark. We provide arguments why RNNs are superior by experimentally showing a “sweet spot” of the segment length for successfully capturing prosodic information that has been theoretically predicted in previous work.
Comparison and Identifiability Analysis of Friction Models for the Dither Motion of a Solenoid
(2018)
In this paper, the mechanical subsystem of a proportional solenoid excited by a dither signal is considered. The objective is to find a suitable friction model that reflects the characteristic mechanical properties of the dynamic system. Several different friction models from the literature are compared. The friction models are evaluated with respect to their accuracy as well as their practical identifiability, the latter being quantified based on the Fisher information matrix.
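Quantifying practical identifiability via the Fisher information matrix can be sketched generically (the model, noise level and finite-difference step below are illustrative assumptions, not the paper's friction models): for output measurements with i.i.d. Gaussian noise, the FIM is built from the output sensitivities with respect to the parameters, and a near-singular FIM flags parameters that cannot be identified from the data.

```python
import numpy as np

def fisher_information(model, theta, t, sigma=1.0, eps=1e-6):
    """Fisher information matrix for parameters theta of model(theta, t)
    under i.i.d. Gaussian measurement noise: F = S.T @ S / sigma**2,
    where S[i, j] = d model(t_i) / d theta_j (forward differences).
    A small smallest eigenvalue of F indicates poor practical
    identifiability of some parameter combination."""
    theta = np.asarray(theta, dtype=float)
    y0 = model(theta, t)
    S = np.empty((len(t), len(theta)))
    for j in range(len(theta)):
        d = theta.copy()
        d[j] += eps
        S[:, j] = (model(d, t) - y0) / eps
    return S.T @ S / sigma**2
```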
Industrial growth and a rapidly growing world population have large impacts on the global environment and allocation of material resources. Most changes in the environment are brought about by human activities and these activities result in a flow of materials. The flows of resources from the natural environment to the economy are a prerequisite of production while flows of residuals from the economy to the environment are the consequence of production and consumption. A full understanding of these processes requires a complete description of the physical dimension of the economy and its interaction with the environment.
Business units are increasingly able to fuel the transformation that digitalization demands of organizations. Thereby, they can implement Shadow IT (SIT) without involving a central IT department to create flexible and innovative solutions. Self-reinforcing effects lead to an intertwinement of SIT with the organization. As a result, high complexities, redundancies, and sometimes even lock-ins occur. IT integration suggests itself as a way to meet these challenges. However, it can also eliminate the benefits that SIT presents. To help organizations in this area of conflict, we conduct a literature review including a systematic search and an analysis from a systemic viewpoint using path dependency and switching costs. Our resulting conceptual framework for SIT integration drawbacks classifies the drawbacks into three dimensions. The first dimension consists of switching costs that account for the financial, procedural, and emotional drawbacks and the drawbacks from a loss of SIT benefits. The second dimension includes organizational, technical, and level-spanning criteria. The third dimension classifies the drawbacks into the global level, the local level, and the interaction between them. We contribute to the scientific discussion by introducing a systemic viewpoint to the research on Shadow IT. Practitioners can use the presented criteria to collect evidence to reach an IT integration decision.
Long-term sleep monitoring can be done primarily in the home environment. Good patient acceptance requires low user and installation barriers. The selection of parameters in this approach is significantly limited compared to a PSG session. The aim is a qualified selection of parameters, which on the one hand allow a sufficiently good classification of sleep phases and on the other hand can be detected by non-invasive methods.
Corporate venturing is one way for corporations to introduce strategic renewal into their business portfolios, which is imperative for ongoing success in innovation-driven industries. Prior research finds that corporate ventures should be separated from the mainstream business in loosely coupled sub-units, but scholars continue to discuss how loose or tight the ventures should be to balance exploration and exploitation. Hence, the antecedents for successful venture management are yet to be fully explored, and our study contributes to this effort. The study shows that corporate venture success is enhanced when corporate management grants job and strategic autonomy to the venture managers. This is further amplified when corporate management simultaneously imposes an exploitative policy that forces venture managers to prioritize extensions to and improvements of existing competences and product-market offerings.
In this paper we present a method using deep learning to compute parametrizations for B-spline curve approximation. Existing methods consider the computation of parametric values and a knot vector as separate problems. We propose to train interdependent deep neural networks to predict parametric values and knots. We show that it is possible to include B-spline curve approximation directly into the neural network architecture. The resulting parametrizations yield tight approximations and are able to outperform state-of-the-art methods.
When integrating task-based learning into LSP courses, assessment procedures have to reflect the communicative goals of such tasks. However, performances on authentic and complex tasks are often difficult to score, because they involve not only specific language ability but also field specific knowledge. Moreover, the emphasis of tasks on meaning suggests that the communicative outcome should be regarded as the main criterion of success. This paper focuses on task design and scoring. It is argued that tasks have to be adapted to assessment purposes. However, to avoid a misalignment between teaching/learning and assessment, they should comprise the main features of LSP tasks in the sense of task-based language learning.
The article opens with brief examples of the varied contexts in which professionals with expert knowledge of intercultural communication are commissioned by organisations to help improve organisational and individual performance.
The development needs that are evident in these contexts entail a broader repertoire of competencies than those generally reflected in the term intercultural communication skills, and so the next section elaborates on the concept of intercultural interaction competence (ICIC), reporting on the numerous sub-competencies which go to make up the ability to perform joint and purposeful activity effectively and appropriately across cultures.
The third section reports on approaches to developing the ICIC of members of organisations, discussing desirable framework conditions for the development intervention, its possible goals, the nature of the cognitive, affective and behavioural development outcomes which can be achieved, and the content, methods and tools available to the intercultural developer.
The article finishes with a brief consideration of the qualification profile of interculturalists engaged in this kind of work, pointing out that a multi-disciplinary background will enable interculturalists to meet a broad range of organisational and human resource development needs in the area of intercultural communication which go beyond the ‘mere’ development of communicative foreign-language skills.
Simon Grimm examines new multi-microphone signal processing strategies that aim to achieve noise reduction and dereverberation. To this end, narrow-band signal enhancement approaches are combined with broad-band processing in terms of directivity-based beamforming. Previously introduced formulations of the multichannel Wiener filter rely on the second-order statistics of the speech and noise signals. The author analyses how additional knowledge about the location of a speaker as well as the microphone arrangement can be used to achieve further noise reduction and dereverberation.
E-mobility in Tourism
(2018)
This article examines chances for and obstacles to e-mobility in tourism at the cross-border region of Lake Constance, Germany. Using secondary internet research, a database of key e-mobility supply factors was generated and visualized utilizing a geographical information system. The results show that fragmentation in infrastructure and information due to the cross-border situation of the four-country region is the main obstacle for e-mobility in tourism in the Lake Constance region. Cooperation and coordination of the supply side of e-mobility in the Lake Constance region turned out to be weak. To improve the chances of e-mobility in cross-border tourism a more client-oriented approach regarding information, accessibility, and conditions of use is necessary.
Due to their structure of crossed yarns embedded in coating, woven fabric membranes are characterised by a highly nonlinear stress-strain behaviour. In order to determine an accurate structural response of membrane structures, a suitable description of the material behaviour is required. Typical phenomenological material models like linear-elastic orthotropic models only allow a limited determination of the real material behaviour. A more accurate approach becomes evident by focusing on the meso-scale, which reveals an inhomogeneous yet periodic structure of woven fabrics. The present work focuses on an established meso-scale model. The novelty of this work is an enhancement of this model with regard to the coating stiffness. By performing an inverse process of parameter identification using a state-of-the-art Levenberg-Marquardt algorithm, a close fit w.r.t. measured data from a common biaxial test is shown and compared to results applying established models. Subsequently, the enhanced meso-scale model is processed into a multi-scale model and is implemented as a material law into a finite element program. Within finite element analyses of an exemplary full-scale membrane structure using the implemented material model as well as established material models, the results are compared and discussed.
With the increased deployment of biometric authentication systems, some security concerns have also arisen. In particular, presentation attacks directed to the capture device pose a severe threat. In order to prevent them, liveness features such as the blood flow can be utilised to develop presentation attack detection (PAD) mechanisms. In this context, laser speckle contrast imaging (LSCI) is a technology widely used in biomedical applications in order to visualise blood flow. We therefore propose a fingerprint PAD method based on textural information extracted from pre-processed LSCI images. Subsequently, a support vector machine is used for classification. In the experiments conducted on a database comprising 32 different artefacts, the results show that the proposed approach classifies correctly all bona fides. However, the LSCI technology experiences difficulties with thin and transparent overlay attacks.
Today’s markets are characterized by fast and radical changes, posing an essential challenge to established companies. Startups, by contrast, seem to be more capable of developing radical innovations to succeed in those volatile markets. Thus, established companies have started to experiment with various approaches to implement startup-like structures in their organization. Internal corporate accelerators (ICAs) are a novel form of corporate venturing, aiming to foster bottom-up innovations through intrapreneurship. However, ICAs still lack empirical investigation. This work contributes to a deeper understanding of the interface between the ICA and the core organization and the respective support activities (resource access and support services) that create an innovation-supportive work environment for the intrapreneurial team. The results of this qualitative study, comprising 12 interviews with ICA teams from two German high-tech companies, show that the resources provided by ICAs differ from the support activities of external accelerators. Further, the study shows that some resources have both supportive and obstructive potential for the intrapreneurial teams within the ICA.
Further applications of the Cauchon algorithm to rank determination and bidiagonal factorization
(2018)
For a class of matrices connected with Cauchon diagrams, Cauchon matrices, and the Cauchon algorithm, a method for determining the rank, and for checking a set of consecutive row (or column) vectors for linear independence is presented. Cauchon diagrams are also linked to the elementary bidiagonal factorization of a matrix and to certain types of rank conditions associated with submatrices called descending rank conditions.
This letter proposes two contributions to improve the performance of transmission with generalized multistream spatial modulation (SM). In particular, a modified suboptimal detection algorithm based on the Gaussian approximation method is proposed. The proposed modifications reduce the complexity of the Gaussian approximation method and improve the performance for high signal-to-noise ratios. Furthermore, this letter introduces signal constellations based on Hurwitz integers, i.e., a 4-D lattice. Simulation results demonstrate that these signal constellations are beneficial for generalized SM with two active antennas.
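The underlying 4-D lattice can be sketched by enumeration (the function name and norm bound are illustrative; the letter's constellation selection and mapping to active antennas are not reproduced): a Hurwitz integer is a quaternion whose four components are either all integers or all half-integers.

```python
import itertools

def hurwitz_integers(max_norm):
    """Enumerate Hurwitz integers (a, b, c, d) with squared norm
    a^2 + b^2 + c^2 + d^2 <= max_norm: all four components in Z,
    or all four in Z + 1/2."""
    bound = int(max_norm**0.5) + 1
    ints = list(range(-bound, bound + 1))
    halves = [x + 0.5 for x in range(-bound - 1, bound + 1)]
    pts = []
    for grid in (ints, halves):
        for q in itertools.product(grid, repeat=4):
            if sum(c * c for c in q) <= max_norm:
                pts.append(q)
    return pts
```

The 24 Hurwitz integers of squared norm 1 (the units) already form a symmetric 4-D point set: the 8 signed unit vectors plus the 16 points with all components equal to ±1/2.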
Corporate venturing has gained much attention due to challenges and changes that occur because of discontinuous innovations, which seem to be promoted by digitalization. In this context, open innovation has become a promising tool for established companies to strengthen their innovation capabilities. While the external opening of the innovation process has gained much attention, the internal opening lacks investigation. Especially new organizational forms, such as Internal Corporate Accelerators, have not been investigated sufficiently. This study, which is based on 13 interviews from two German tech companies, contributes to a better understanding of this new form of corporate venturing and the resulting effects on organizational renewal.
Deep neural networks have been successfully applied to problems such as image segmentation, image super-resolution, coloration and image inpainting. In this work we propose the use of convolutional neural networks (CNN) for image inpainting of large regions in high-resolution textures. Due to limited computational resources, processing high-resolution images with neural networks is still an open problem. Existing methods separate inpainting of global structure and the transfer of details, which leads to blurry results and loss of global coherence in the detail transfer step. Based on advances in texture synthesis using CNNs, we propose patch-based image inpainting by a single network topology that is able to optimize for global as well as detail texture statistics. Our method is capable of filling large inpainting regions, often exceeding the quality of comparable methods for images of high resolution (2048x2048 px). For reference patch look-up we propose to use the same summary statistics that are used in the inpainting process.
Influence of Temperature on the Corrosion behaviour of Stainless Steels under Tribological Stress
(2018)
Poster presentation.