This work studies a wind noise reduction approach for communication applications in a car environment. An endfire array consisting of two microphones is considered as a substitute for an ordinary cardioid microphone capsule of the same size. Using the decomposition of the multichannel Wiener filter (MWF), a suitable beamformer and a single-channel post filter are derived. Owing to the known array geometry and the location of the speech source, assumptions about the signal properties can be made to simplify the MWF beamformer and to estimate the speech and noise power spectral densities required for the post filter. Even for closely spaced microphones, the different signal properties at the microphones can be exploited to achieve a significant reduction of wind noise. The proposed beamformer improves the signal-to-noise ratio of the speech signal while keeping linear speech distortion low. The derived post filter shows performance equal to known approaches but reduces the effort for noise estimation.
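The decomposition described in this abstract ends in a single-channel Wiener gain built from the estimated speech and noise power spectral densities. A minimal sketch of that post-filter stage, assuming the beamformer output and the PSD estimates are already available per time-frequency bin (all names, shapes, and the regularization constant are illustrative, not taken from the paper):

```python
import numpy as np

def mwf_post_filter(x_beamformed, phi_ss, phi_nn):
    """Single-channel Wiener post filter applied to a beamformer output.

    Sketch of the MWF decomposition: the multichannel Wiener filter
    factors into a beamformer followed by a single-channel Wiener gain
    built from the speech PSD estimate (phi_ss) and the noise PSD
    estimate (phi_nn). All arrays have shape (frames, frequency_bins).
    """
    # Wiener gain per time-frequency bin; small constant avoids 0/0.
    gain = phi_ss / (phi_ss + phi_nn + 1e-12)
    return gain * x_beamformed
```

With phi_nn near zero the gain approaches one and the beamformer output passes through unchanged; where the noise PSD dominates, the gain suppresses the bin.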
Visualization-Assisted Development of Deep Learning Models in Offline Handwriting Recognition
(2018)
Deep learning is a field of machine learning that has been the focus of active research and successful applications in recent years. Offline handwriting recognition is one of the research fields and applications where deep neural networks have shown high accuracy. Deep learning models and their training pipeline involve a large number of hyper-parameters in their data selection, transformation, network topology and training process that are sometimes interdependent. This increases the overall difficulty and time necessary for building and training a model for a specific data set and task at hand. This work proposes a novel visualization-assisted workflow that guides the model developer through the hyper-parameter search in order to identify relevant parameters and modify them in a meaningful way. This decreases the overall time necessary for building and training a model. The contributions of this work are a workflow for hyper-parameter search in offline handwriting recognition and a heat-map-based visualization technique for deep neural networks in multi-line offline handwriting recognition. This work applies to offline handwriting recognition, but the general workflow can possibly be adapted to other tasks as well.
Although the Hospice Foundation in Constance knew they had a personnel problem, they were unsure how to begin to fix it. In addition to difficulties in finding and keeping employees, the Hospice Foundation's employees were often on sick leave, adding pressure on the remaining staff. Twelve communication design students in the master's program at the University of Applied Sciences in Constance (HTWG Konstanz) conducted a study aimed at identifying the causes of these problems and, more generally, at understanding how the employees work and feel. Even though the methods in this study are well known, it presents an important prototype for designers and design researchers because of its success in finding useful insights. It also serves as a pre-design project briefing for both management and designers. It demonstrates the usefulness of qualitative methods in providing a deeper understanding of a complex situation, as a strategic tool, and for defining a project's focus and scope. Ideally, it also provides insights into health care for the elderly.
Business units are increasingly able to fuel the transformation that digitalization demands of organizations. To do so, they implement Shadow IT (SIT) to create flexible and innovative solutions. However, the individual implementation of SIT leads to high complexity and redundancy. Integration suggests itself to meet these challenges but can also eliminate the described benefits. In this emergent research, we develop propositions for a conceptual decision framework that balances the benefits and drawbacks of integrating SIT, using a literature review as well as a multiple-case study. We thereby integrate the perspectives of both the overall organization and the specific business unit. We then pose six propositions regarding SIT integration that will serve to evaluate our conceptual framework in future research.
New Technology-Based Firms (NTBFs) learn their business in the early stages of their life cycle. As a central element of the entrepreneurial learning process, the business model describes the value-creation functions that are conceptualized in different stages of the NTBF's life cycle. Transaction relations connect the model with the business reality and ideally mature in strength over time into a functioning value network. This chapter describes the development of a research design that determines, extracts, and evaluates semantic constructs of this entrepreneurial learning from a convenience sample of three cohorts of business plans submitted to a business plan award between 2008 and 2010. The analysis shows empirical evidence for the survival and growth of those NTBFs that exhibit a balanced status of entrepreneurial learning in the maturity of the value network, which can be characterized as the early startup stage. The empirical findings of the network-theory-based business plan analysis allow for a better explanation of performance in the entrepreneurial process that is discussed for NTBFs based on the theory of organizational learning.
The Role of Support-Activities for the successful Implementation of Internal Corporate Accelerators
(2018)
To get a better understanding of Cross-Site Scripting vulnerabilities, we investigated 50 randomly selected CVE reports related to open source projects. The vulnerable and patched source code was manually reviewed to find out which kinds of source code patterns were used. Source code pattern categories were found for sources, concatenations, sinks, HTML contexts and fixes. Our resulting categories are compared to categories from CWE. A source code sample which might have led developers to believe that the data was already sanitized is described in detail. For the different HTML context categories, the necessary Cross-Site Scripting prevention mechanisms are described.
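The context-dependent prevention mechanisms mentioned in this abstract boil down to escaping untrusted data differently for each output context. A minimal sketch using Python's standard library (the function names are illustrative and the paper's category names are not reproduced here):

```python
import html
import urllib.parse

def escape_for_html_body(value: str) -> str:
    # HTML element content context: escape <, >, & and quotes.
    return html.escape(value, quote=True)

def escape_for_attribute(value: str) -> str:
    # Quoted-attribute context: same character escaping, but the
    # surrounding attribute value must also be quoted in the template.
    return html.escape(value, quote=True)

def escape_for_url_parameter(value: str) -> str:
    # URL parameter context: percent-encode everything non-alphanumeric.
    return urllib.parse.quote(value, safe="")
```

The point of the per-context categories is that one escaping routine is not enough: HTML entity encoding neutralizes a payload in element content, but data placed into a URL or an unquoted attribute needs its own encoding rules.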
We investigated 50 randomly selected buffer overflow vulnerabilities in Firefox. The source code of these vulnerabilities and the corresponding patches were manually reviewed and patterns were identified. Our main contributions are taxonomies of errors, sinks and fixes seen from a developer's point of view. The results are compared to the CWE taxonomy with an emphasis on vulnerability details. Additionally, some ideas are presented on how the taxonomy could be used to improve software security education.
Error correction coding based on soft-input decoding can significantly improve the reliability of non-volatile flash memories. This work proposes a soft-input decoder for generalized concatenated (GC) codes. GC codes are well suited for error correction in flash memories for high reliability data storage. We propose GC codes constructed from inner extended binary Bose-Chaudhuri-Hocquenghem (BCH) codes and outer Reed-Solomon codes. The extended BCH codes enable an efficient hard-input decoding. Furthermore, a low-complexity soft-input decoding method is proposed. This bit-flipping decoder uses a fixed number of test patterns and an algebraic decoder for soft-decoding. An acceptance criterion for the final candidate codeword is proposed. Combined with error and erasure decoding of the outer Reed-Solomon codes, this acceptance criterion can improve the decoding performance and reduce the decoding complexity. The presented simulation results show that the proposed bit-flipping decoder in combination with outer error and erasure decoding can outperform maximum likelihood decoding of the inner codes.
Generalised concatenated (GC) codes are well suited for error correction in flash memories for high-reliability data storage. The GC codes are constructed from inner extended binary Bose–Chaudhuri–Hocquenghem (BCH) codes and outer Reed–Solomon codes. The extended BCH codes enable high-rate GC codes and low-complexity soft input decoding. This work proposes a decoder architecture for high-rate GC codes. For such codes, outer error and erasure decoding are mandatory. A pipelined decoder architecture is proposed that achieves a high data throughput with hard input decoding. In addition, a low-complexity soft input decoder is proposed. This soft decoding approach combines a bit-flipping strategy with algebraic decoding. The decoder components for the hard input decoding can be reused, which reduces the overhead for the soft input decoding. Nevertheless, the soft input decoding achieves a significant coding gain compared with hard input decoding.