Beyond the SDG compass
(2015)
CSR und Compliance
(2015)
To evaluate the quality of a person's sleep it is essential to identify the sleep stages and their durations. Currently, the gold standard of sleep analysis is overnight polysomnography (PSG), during which several signals such as the EEG (electroencephalogram), EOG (electrooculogram), EMG (electromyogram), ECG (electrocardiogram), SpO2 (blood oxygen saturation), respiratory airflow and respiratory effort are recorded. These expensive and complex procedures, applied in sleep laboratories, are invasive and unfamiliar to the subjects, which may itself have an impact on the recorded data. These are the main reasons why low-cost home diagnostic systems are likely to be advantageous. Their aim is to reach a larger population by reducing the number of recorded parameters. Nowadays, many wearable devices promise to measure sleep quality using only the ECG and body-movement signals. This work presents an Android application developed to verify the accuracy of an algorithm published in the sleep literature. The algorithm uses ECG and body-movement recordings to estimate sleep stages. The pre-recorded signals fed into the algorithm were taken from the PhysioNet online database. The obtained results were compared with those of the standard method used in PSG. The mean agreement ratios for the sleep stages REM, Wake, NREM-1, NREM-2 and NREM-3 were 38.1%, 14%, 16%, 75% and 54.3%, respectively.
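As an illustration of the evaluation step, the following sketch computes per-stage agreement ratios between an estimated hypnogram and a PSG reference. The epoch-wise comparison, the 30-second epochs and the stage labels are assumptions for the example, not the app's actual code.

```python
# Hypothetical sketch: per-stage agreement between an estimated hypnogram
# and a PSG reference, both given as one stage label per 30-second epoch.
STAGES = ["REM", "Wake", "NREM-1", "NREM-2", "NREM-3"]

def agreement_ratios(reference, estimated):
    ratios = {}
    for stage in STAGES:
        ref_epochs = [i for i, s in enumerate(reference) if s == stage]
        if not ref_epochs:
            continue  # stage never occurs in the reference recording
        hits = sum(1 for i in ref_epochs if estimated[i] == stage)
        ratios[stage] = hits / len(ref_epochs)  # fraction of matched epochs
    return ratios

# Example with two short, made-up hypnograms:
ref = ["Wake", "NREM-1", "NREM-2", "NREM-2", "NREM-3", "REM"]
est = ["Wake", "NREM-2", "NREM-2", "NREM-2", "NREM-3", "NREM-2"]
print(agreement_ratios(ref, est))
```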
This paper compares the surface morphology of differently finished austenitic stainless steel AISI 316L, also in combination with low temperature carburization. Milled and tumbled surfaces were analyzed with respect to corrosion resistance and surface morphology. The results of potentiodynamic measurements show that professional grinding operations with SiC and Al2O3 always lead to a better corrosion resistance of low temperature carburized surfaces compared to the untreated reference in the acidified chloride solution used. The amount of plastic deformation during machining has a strong influence on the corrosion resistance of vibratory-ground or tumbled surfaces and has to be kept low for austenitic stainless steels. Due to the high ductility, plastic deformation can lead to the formation of metastable pits that can become initiation points of corrosion. The formation of metastable pits can be aggravated by low temperature diffusion processes.
Energiequelle Seewasser
(2015)
RELOAD
(2015)
Talk given at the doctoral colloquium of the Kooperatives Promotionskolleg of the HTWG, 9 July 2015
The problem of vessel collisions or near-collision situations at sea, often caused by human error due to incomplete or overwhelming information, is becoming more and more important with rising maritime traffic. Approaches that supply navigators and Vessel Traffic Services with expert knowledge and suggest trajectories for all vessels to avoid collisions are often aimed at situations where a single planner guides all vessels with perfect information. In contrast, we suggest a two-part procedure which plans trajectories using a specialised A* and negotiates these trajectories until a solution is found that is acceptable for all vessels. The solution obeys collision avoidance rules, includes a dynamic model of all vessels and negotiates trajectories to optimise globally without a global planner and without extensive information disclosure. The procedure combines all components necessary to solve a multi-vessel encounter and is currently being tested in simulation and on several test beds. First results show a fast-converging optimisation process which produces feasible, collision-free trajectories after only a few negotiation rounds.
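A minimal sketch of the plan-and-negotiate loop described above might look as follows. The planner (`plan_trajectory`, standing in for the specialised A*), the acceptance test and the round limit are hypothetical placeholders, not the paper's actual algorithms.

```python
# Hypothetical sketch of the two-part procedure: each vessel plans against
# the trajectories published by the others, then all vessels vote on the
# proposals until everyone accepts. Planner and acceptance test are stubs.
def negotiate(vessels, plan_trajectory, acceptable, max_rounds=20):
    trajectories = {v.id: plan_trajectory(v, {}) for v in vessels}
    for _ in range(max_rounds):
        proposals = {
            v.id: plan_trajectory(
                v, {k: t for k, t in trajectories.items() if k != v.id})
            for v in vessels
        }
        if all(acceptable(v, proposals[v.id], proposals) for v in vessels):
            return proposals  # every vessel accepts: collision-free solution
        trajectories = proposals  # another negotiation round
    return None  # no agreement within the round limit
```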
Post harvest technology
(2015)
The proposed approach applies current unsupervised clustering approaches in a different, dynamic manner. Instead of taking all the data as input and finding clusters among them, the given approach clusters Holter ECG data (long-term electrocardiography data from a Holter monitor) on a given interval, which enables a dynamic clustering approach (DCA). For this purpose, advanced clustering techniques based on the well-known Dynamic Time Warping algorithm are used. Given clusters, e.g., on a daily basis, the clusters can be compared by defining cluster shape properties. This yields a measure for variation in unsupervised cluster shapes and may reveal unknown changes in health status. Embedding this approach into wearable devices offers advantages over current techniques. On the one hand, users receive feedback if the characteristics of their ECG data change unexpectedly over time, which makes early detection possible. On the other hand, cluster properties such as the biggest or smallest cluster may help a doctor in making diagnoses or in observing several patients. Furthermore, known processing techniques such as stress detection or arrhythmia classification may be applied to the clusters found.
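The Dynamic Time Warping distance underlying the clustering can be computed with the classic dynamic program; the following sketch uses plain Python lists and absolute sample differences as the local cost, which is one common choice rather than the paper's specific configuration.

```python
# Classic DTW distance between two 1-d sequences (e.g., ECG beat templates).
def dtw(a, b):
    n, m = len(a), len(b)
    INF = float("inf")
    d = [[INF] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # insertion
                                 d[i][j - 1],      # deletion
                                 d[i - 1][j - 1])  # match
    return d[n][m]

print(dtw([0, 1, 2, 1, 0], [0, 0, 1, 2, 0]))
```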
This chapter contains three advanced topics in model order reduction (MOR): nonlinear MOR, MOR for multi-terminals (or multi-ports) and finally an application in deriving a nonlinear macromodel covering phase shift when coupling oscillators. The sections are offered in a preferred order for reading, but can be read independently.
Talk given at the doctoral colloquium of the Kooperatives Promotionskolleg of the HTWG, 9 July 2015
Arbeitsrecht für Dummies
(2015)
Land schafft Landschaft
(2015)
Green-Innovation
(2015)
Biokratie
(2015)
From a systemic-evolutionary perspective, the question of biocracy is approached via the intersystemic competition between the biosphere and the anthroposphere. With increasing depth of intervention into the biosphere, the danger of a disturbed coevolution arises. The author addresses the central question: How is nature-adapted economic activity possible in the context of co-evolving bio- and anthropospheres? The basic premise is that knowledge of the functional principles of the biosphere is required not only to limit the intensity of intervention, but also provides valuable information for designing the anthropogenic metabolism and value creation systems following natural models. At the same time, these potential fields of innovation have to be reflected against the background of insights from evolutionary economics.
Multistakeholder-Analyse zur Evaluation der KiCG Anforderungen an Compliance-Management-Systeme
(2015)
In an expert discussion, Michael Kayser, managing director of digital spirit GmbH, and Prof. Dr. Stephan Grüninger, director of the Center for Business Compliance & Integrity (CBCI) at the Hochschule Konstanz, exchange views on the new ISO 19600 "Compliance Management Systems". The discussion was moderated by Sabrina Quintus, academic staff member at the CBCI.
The detection of differences between images of a printed reference and a reprinted wood decor often requires an initial image registration step. Depending on the digitization method, the reprint will be displaced and rotated with respect to the reference. The aim of registration is to match the images as precisely as possible. In our approach, images are first matched globally by extracting feature points from both images and finding corresponding point pairs using the RANSAC algorithm. From these correspondences, we compute a global projective transformation between both images. In order to obtain a pixel-wise registration, we train a learning machine on the point correspondences found by RANSAC. The learning algorithm (in our case Gaussian process regression) is used to nonlinearly interpolate between the feature points, which results in a high-precision image registration method for wood decors.
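A hedged sketch of this two-stage registration with OpenCV and scikit-learn is shown below: a global projective transform from RANSAC correspondences, followed by Gaussian process regression on the residual displacements. The feature detector (ORB), kernel and all parameter values are illustrative assumptions, not the paper's exact setup.

```python
# Sketch: global homography via RANSAC, then GP regression on the residuals
# to interpolate a pixel-wise correction. Parameters are illustrative only.
import cv2
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def register(reference, reprint):
    orb = cv2.ORB_create(2000)
    k1, d1 = orb.detectAndCompute(reference, None)
    k2, d2 = orb.detectAndCompute(reprint, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
    src = np.float32([k1[m.queryIdx].pt for m in matches])
    dst = np.float32([k2[m.trainIdx].pt for m in matches])
    H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    # Nonlinear refinement: learn the displacement left after the global map.
    src_in = src[inliers.ravel() == 1]
    dst_in = dst[inliers.ravel() == 1]
    proj = cv2.perspectiveTransform(src_in.reshape(-1, 1, 2), H).reshape(-1, 2)
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=50.0))
    gp.fit(proj, dst_in - proj)  # residuals at the inlier feature points
    return H, gp  # warp with H, then correct each pixel by gp.predict()
```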
Technology commercialization is described as the most dreadful challenge for technology-based entrepreneurs. The scarcity of resources and limited managerial experience make it a daunting task, putting the whole firm's emergence in danger. Prior research has often built upon the resource-based view to propose that new firms' performance depends on their initial resource endowments and configurations. Nevertheless, little is known about how the early-stage decisions of the entrepreneur might influence the growth of the firm. Scholars have suggested that both technology and market orientation actions could influence the performance and growth of firms in this context; nevertheless, there is limited empirical evidence of the influence of these different orientations in the context of new technology-based firms (NTBFs). In this study we explore the influence of technology and demand creation actions adopting a demand-side view. We use a longitudinal study on a panel dataset (2004-2007) of 249 U.S. new high-technology firms to test our hypotheses. The results point towards a rather limited influence of initial resource configurations, as well as an unexpected influence of market and technology orientation on the growth dimensions of an NTBF. The research holds implications for the management of new technology-based firms and for those interested in supporting the development of technology entrepreneurship.
Domain-specific modelling is increasingly adopted in the software development industry. While open-source metamodels like Ecore have a wide impact, they still have some problems. The independent storage of nodes (classes) and edges (references) is currently only possible with complex, specific solutions. Furthermore, the developed models are stored in the Extensible Markup Language (XML) data format, which leads to scaling problems with large models. In this paper we describe an approach that solves the problem of independent classes and references in metamodels, and we store the models in the JavaScript Object Notation (JSON) data format to support high scalability. First results of our tests show that the developed approach works and that classes and references can be defined independently. In addition, our approach reduces the number of characters per model by a factor of approximately two compared to Ecore. The entire project is made available as open source under the name MoDiGen. This paper focuses on the description of the metamodel definition with respect to scaling.
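To make the idea of independently stored classes and references concrete, a minimal JSON structure could keep them as separate top-level collections linked only by ids, as in the sketch below. The key names are invented for the example and are not MoDiGen's actual schema.

```python
# Invented example: classes and references as independent collections,
# linked by ids only. The schema is illustrative, not MoDiGen's format.
import json

metamodel = {
    "classes": [
        {"id": "c1", "name": "State",
         "attributes": [{"name": "label", "type": "String"}]},
        {"id": "c2", "name": "FinalState", "superTypes": ["c1"]},
    ],
    "references": [
        # A reference is its own object; deleting or editing it never
        # requires touching the class definitions it connects.
        {"id": "r1", "name": "transition", "source": "c1", "target": "c1"},
    ],
}
print(json.dumps(metamodel, indent=2))
```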
The system described here, realised on a Spartan-3A DSP FPGA, serves to compute, in a particularly efficient way, the frequency distribution of undetected erroneous messages for different CRC polynomials. To keep the computation time as short as possible, the system was built from 64 parallel instances of CRC finders in a multi-stage pipeline organisation. In the configuration described here, the system achieves an overall performance of 6.4 · 10^9 operations per second.
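In software terms, what each hardware CRC finder instance evaluates can be sketched as follows: apply a candidate polynomial to corrupted messages and count the cases where the check bits nevertheless match. The bit-list representation and the restriction of errors to the data bits are simplifications for the example, many orders of magnitude slower than the FPGA.

```python
# Software sketch of one CRC finder: count error patterns that a given
# CRC polynomial fails to detect (errors confined to the data bits here).
def crc(data_bits, poly_bits):
    reg = data_bits + [0] * (len(poly_bits) - 1)  # append zero check bits
    for i in range(len(data_bits)):               # polynomial long division
        if reg[i]:
            for j, p in enumerate(poly_bits):
                reg[i + j] ^= p
    return reg[len(data_bits):]                   # remainder = check bits

def undetected(message, error_patterns, poly_bits):
    good = crc(message, poly_bits)
    count = 0
    for err in error_patterns:
        corrupted = [b ^ e for b, e in zip(message, err)]
        if corrupted != message and crc(corrupted, poly_bits) == good:
            count += 1  # error pattern slips through this polynomial
    return count

# Example: x^3 + x + 1 as [1, 0, 1, 1]
print(undetected([1, 0, 1, 1, 0, 0, 1],
                 [[1, 0, 1, 1, 0, 0, 0], [1, 0, 0, 0, 0, 0, 0]],
                 [1, 0, 1, 1]))
```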
This paper considers intervals of real matrices with respect to partial orders and the problem to infer from some exposed matrices lying on the boundary of such an interval that all real matrices taken from the interval possess a certain property. In many cases such a property requires that the chosen matrices have an identically signed inverse. We also briefly survey related problems, e.g., the invariance of matrix properties under entry-wise perturbations.
Nowadays, there is a continuous need for many corporations to renew their business portfolio strategically in anticipation of changes in the business environment (e.g., technological change). The ongoing boom in the founding of international start-ups suggests that small entrepreneurial teams are an effective means to develop new businesses. Corporations should be able to benefit from this form of self-organized innovation when entering novel business domains for strategic renewal. However, corporations that establish small entrepreneurial teams (corporate ventures) face two obstacles. First, corporate ventures often fail for reasons that are not well explored. Second, it remains unclear how partial successes may be turned into large successes. Although the key success factors remain ambiguous, there is little hope that corporate ventures will be successful without effective management. Since an empirical model for corporate venture management does not exist so far, the thesis formulates and answers the following problem statement: How can corporate management effectively manage corporate ventures? Building on qualitative and quantitative research methodologies, a model for effective corporate venture management is developed and tested statistically in the German IT consulting industry. The research results reveal some of the essential management principles through which corporate management can increase corporate venture success systematically.
Nichts als Worte
(2015)
Das Abenteuer der Kritik
(2015)
Zitate sind kein Selbstzweck
(2015)
Rhetorik-Wörterbuch
(2015)
This work proposes an efficient hardware implementation of sequential stack decoding of binary block codes. The decoder can be applied for soft-input decoding of generalized concatenated (GC) codes. The GC codes are constructed from inner nested binary Bose-Chaudhuri-Hocquenghem (BCH) codes and outer Reed-Solomon (RS) codes. In order to enable soft-input decoding of the inner BCH block codes, a sequential stack decoding algorithm is used.
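Sequential stack decoding explores a code tree with a priority queue (the "stack") ordered by a path metric, always extending the currently best partial path. The minimal software sketch below shows this control flow with a generic metric function; it illustrates the algorithm's principle, not the paper's hardware architecture.

```python
# Minimal stack-decoder sketch: repeatedly pop the best partial path and
# extend it until a full path of length n is reached. The path metric is
# a placeholder for the code- and channel-specific soft-input metric.
import heapq

def stack_decode(n, metric, extensions=(0, 1)):
    heap = [(-0.0, ())]  # min-heap, so store negated metrics
    while heap:
        neg_m, path = heapq.heappop(heap)
        if len(path) == n:
            return path  # best complete path found
        for bit in extensions:
            new_path = path + (bit,)
            heapq.heappush(heap, (-metric(new_path), new_path))
    return None

# Toy metric preferring paths close to a received soft-value sequence:
received = [0.9, 0.2, 0.8]
decoded = stack_decode(3, lambda p: -sum((b - r) ** 2
                                         for b, r in zip(p, received)))
print(decoded)  # (1, 0, 1)
```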
Codes over quotient rings of Lipschitz integers have recently attracted some attention. This work investigates the performance of Lipschitz integer constellations for transmission over the AWGN channel by means of the constellation figure of merit. A construction of sets of Lipschitz integers that leads to a better constellation figure of merit compared to ordinary Lipschitz integer constellations is presented. In particular, it is demonstrated that the concept of set partitioning can be applied to quotient rings of Lipschitz integers where the number of elements is not a prime number. It is shown that it is always possible to partition such quotient rings into additive subgroups in a manner that the minimum Euclidean distance of each subgroup is strictly larger than in the original set. The resulting signal constellations have a better performance for transmission over an additive white Gaussian noise channel compared to Gaussian integer constellations and to ordinary Lipschitz integer constellations. In addition, we present multilevel code constructions for the new signal constellations.
Codes over quotient rings of Lipschitz integers have recently attracted some attention. This work investigates the performance of Lipschitz integer constellations for transmission over the AWGN channel by means of the constellation figure of merit. A construction of sets of Lipschitz integers is presented that leads to a better constellation figure of merit compared to ordinary Lipschitz integer constellations. In particular, it is demonstrated that the concept of set partitioning can be applied to quotient rings of Lipschitz integers where the number of elements is not a prime number. It is shown that it is always possible to partition such quotient rings into additive subgroups in a manner that the minimum Euclidean distance of each subgroup is strictly larger than in the original set. The resulting signal constellations have a better performance for transmission over an additive white Gaussian noise channel compared to Gaussian integer constellations and to ordinary Lipschitz integer constellations.
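The constellation figure of merit used for these comparisons can be evaluated numerically; one common convention relates the squared minimum Euclidean distance to the average symbol energy. The sketch below uses complex constellation points for brevity; Lipschitz integers are quaternion-valued, but the same computation applies to four-dimensional points under the Euclidean norm. The normalisation shown is one of several in the literature.

```python
# Sketch: minimum squared Euclidean distance and average energy of a
# signal constellation; CFM here is d_min^2 / E_avg (one convention).
import itertools

def figure_of_merit(points):
    d2_min = min(abs(a - b) ** 2
                 for a, b in itertools.combinations(points, 2))
    e_avg = sum(abs(p) ** 2 for p in points) / len(points)
    return d2_min / e_avg

# Example: 4-QAM has d_min^2 = 4 and E_avg = 2, so the CFM is 2.0.
qam4 = [1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]
print(figure_of_merit(qam4))
```

Set partitioning in the sense described above would then be verified by evaluating `figure_of_merit` on each additive subgroup and checking that its minimum distance strictly exceeds that of the original set.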
This contribution presents a data compression scheme for applications in non-volatile flash memories. The objective of the data compression algorithm is to reduce the amount of user data such that the redundancy of the error correction coding can be increased in order to improve the reliability of the data storage system. The data compression is performed on block level considering data blocks of 1 kilobyte. We present an encoder architecture that has low memory requirements and provides a fast data encoding.
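A software analogue of the idea, compressing each 1-kilobyte block and treating the saved bytes as a budget for additional ECC parity, can be sketched as follows. zlib is a stand-in codec here; the paper proposes its own low-memory encoder architecture.

```python
# Sketch: bytes saved by compressing a 1 KiB user-data block become
# available as extra redundancy for the error correction code.
# zlib is a placeholder, not the encoder architecture of the paper.
import zlib

BLOCK = 1024

def parity_budget(block: bytes) -> int:
    compressed = zlib.compress(block, level=6)
    if len(compressed) >= BLOCK:
        return 0  # incompressible block: store raw, no extra redundancy
    return BLOCK - len(compressed)  # bytes freed for stronger ECC

data = (b"sensor log entry 42; " * 60)[:BLOCK]
print(parity_budget(data))
```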
Schatten-IT
(2015)
Social Entrepreneurship
(2015)
Entwicklung eines grafischen Editors zur Metamodellierung sowie Validierung von Modell-Instanzen
(2015)
As part of this thesis, a graphical editor for modelling metamodels was developed. The editor runs in any web browser and can therefore be used platform-independently. It implements the MoDiGen metamodel developed at the HTWG Konstanz in the project Progress in Graphical Modeling Frameworks, and allows the modelling of metamodels that conform to this meta-metamodel.
As its output format the editor uses a JSON structure, which enables data storage in JSON-based non-relational databases and simplified the implementation of the editor in JavaScript.
In addition, a tool was developed with which the instances of the modelled metamodel, i.e. the models, can be checked against the metamodel. This program is relevant both on the client side, in the web browser, for checking a model, and on the server side for checking the model data before it is persisted, which is why the validation was developed in JavaScript and CoffeeScript. In the web browser this implementation can be executed directly; on the server side, Rhino, the JavaScript implementation written in Java by the Mozilla Foundation, was used to invoke the JavaScript program.
The theoretical part of the thesis deals with the meta-metamodels that form the basis of the MoDiGen metamodel, as well as, in detail, with the MoDiGen metamodel itself. The practical part explains the development and structure of the editor and the validator.
This contribution discusses the role of written error correction in German courses accompanying and preparing for university study. It gives an overview of the state of research and presents a study on error correction with advanced learners, in which the effect of error correction applied directly to the learner's text was examined. For the experimental group, the error correction led to a reduction of errors in the revision, but, compared to the control group, not to a significant improvement in a new text. To find out how the learners dealt with the error correction, the study also included a qualitative analysis of individual papers. The contribution closes with the question of what role a (not particularly effective) error correction should play in instruction oriented towards academic language.
Wettbewerb und Wagnis
(2015)
Karl Bernhard
(2015)
Wettbewerb und Wagnis
(2015)
Reconstruction of hand-held laser scanner data is used in industry primarily for reverse engineering. Traditionally, scanning and reconstruction are separate steps, so the operator of the laser scanner has no feedback from the reconstruction results. On-line reconstruction of the CAD geometry allows for such immediate feedback.
We propose a method for on-line segmentation and reconstruction of CAD geometry from a stream of point data based on means that are updated on-line. These means are combined to define complex local geometric properties, e.g., radii and center points of spherical regions. Using means of local scores, planar, cylindrical, and spherical segments are detected and extended robustly with region growing. For the on-line computation of the means we use so-called accumulated means. They allow for on-line insertion and removal of values and merging of means. Our results show that this approach can be performed on-line and is robust to noise. We demonstrate that our method reconstructs spherical, cylindrical, and planar segments on real scan data containing typical errors caused by hand-held laser scanners.
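An accumulated mean can be kept as a (count, sum) pair, which makes on-line insertion, removal and merging constant-time operations. The following is a minimal sketch of that idea for scalar values; the paper combines such means into richer local geometric properties.

```python
# Minimal accumulated-mean sketch: O(1) insertion, removal and merging,
# as needed for on-line region growing over a stream of scan points.
class AccumulatedMean:
    def __init__(self):
        self.n = 0
        self.total = 0.0

    def insert(self, x):
        self.n += 1
        self.total += x

    def remove(self, x):        # x must have been inserted previously
        self.n -= 1
        self.total -= x

    def merge(self, other):     # combine the statistics of two regions
        self.n += other.n
        self.total += other.total

    @property
    def mean(self):
        return self.total / self.n if self.n else 0.0

m = AccumulatedMean()
for v in (1.0, 2.0, 3.0):
    m.insert(v)
m.remove(2.0)
print(m.mean)  # 2.0
```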
We present a 3d-laser-scan simulation in virtual reality for creating synthetic scans of CAD models. Consisting of the virtual reality head-mounted display Oculus Rift and the motion controller Razer Hydra, our system can be used like common hand-held 3d laser scanners. It supports scanning of triangular meshes as well as B-spline tensor product surfaces, based on high-performance ray-casting algorithms. While point clouds produced by known scanning simulations lack the man-made structure, our approach overcomes this problem by imitating real scanning scenarios. Calculation speed, interactivity and the resulting realistic point clouds are the benefits of this system.
This Chapter introduces parameterized, or parametric, Model Order Reduction (pMOR). The Sections are offered in a preferred order for reading, but can be read independently. Section 5.1, written by Jorge Fernández Villena, L. Miguel Silveira, Wil H.A. Schilders, Gabriela Ciuprina, Daniel Ioan and Sebastian Kula, overviews the basic principles of pMOR. Due to higher integration and increasing frequency-based effects, large, full Electromagnetic Models (EM) are needed for accurate prediction of the real behavior of integrated passives and interconnects. Furthermore, these structures are subject to parametric effects due to small variations of the geometric and physical properties of the inherent materials and manufacturing process. Accuracy requirements lead to huge models, which are expensive to simulate, and this cost is increased when parameters and their effects are taken into account. This Section introduces the framework of pMOR, which aims at generating reduced models for systems depending on a set of parameters.
Classification of point clouds by different types of geometric primitives is an essential part of the reconstruction process for CAD geometry. We use support vector machines (SVM) to label patches in point clouds with the class labels tori, ellipsoids, spheres, cones, cylinders or planes. For the classification, features based on different geometric properties like point normals, angles, and principal curvatures are used. These geometric features are estimated in the local neighborhood of a point of the point cloud. Computing these geometric features for a random subset of the point cloud yields a feature distribution. Different features are combined to achieve the best classification results. To minimize the time-consuming training phase of SVMs, the geometric features are first evaluated using linear discriminant analysis (LDA).
LDA and SVM are machine learning approaches that require an initial training phase to allow for a subsequent automatic classification of a new data set. For the training phase, point clouds are generated using a simulation of a laser scanning device. Additional noise based on a laser scanner error model is added to the point clouds. The resulting LDA and SVM classifiers are then used to classify geometric primitives in simulated and real laser-scanned point clouds.
Compared to other approaches, where all known features are used for classification, we explicitly compare novel against known geometric features to prove their effectiveness.
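With scikit-learn, the described two-step procedure (screen feature combinations cheaply with LDA, then train the more expensive SVM on the winning set) could be sketched as follows. The feature matrix layout, the candidate column sets and the SVM parameters are assumptions for the example.

```python
# Sketch: rank feature subsets with fast LDA cross-validation, then train
# the SVM on the best subset. X: per-patch geometric features (numpy
# array), y: primitive class labels (e.g., "sphere", "cylinder", "plane").
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def best_feature_set(X, y, candidate_column_sets):
    scores = {
        cols: cross_val_score(LinearDiscriminantAnalysis(),
                              X[:, list(cols)], y, cv=5).mean()
        for cols in candidate_column_sets
    }
    return max(scores, key=scores.get)  # columns with best LDA accuracy

def train_classifier(X, y, cols):
    # Train the SVM only once, on the LDA-selected feature combination.
    return SVC(kernel="rbf", C=10.0, gamma="scale").fit(X[:, list(cols)], y)
```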
More than 40 percent of companies do not use operational process data to effectively monitor the throughput times or costs of their processes. Nevertheless, more than 60 percent of companies state that they want to increase their efficiency through process management. This is shown by the study "Business Process Management 2015" of the ZHAW School of Management and Law (SML). The results were presented today at the BPM Symposium in Winterthur and discussed with a broad professional audience from industry and academia.
A foundation for digital transformation: The study examines how, and to what extent, companies extend the standard repertoire of business process management towards process intelligence. Process intelligence closes the gap to the operational business and provides a new perspective on the management of business processes. It concentrates on the information that is created and used in the operational processes and is thus an essential foundation for the currently much-discussed digital transformation of companies. In order to better understand, control and optimise processes, methods and tools of business process management (BPM) and business intelligence (BI) are combined.
Valuable experience from practice: For the study, more than 80 companies were surveyed online on the status quo of their process intelligence. A practitioner workshop served as the setting for identifying success patterns from five case studies at Roche, AXA Winterthur, the St. Galler Kantonalbank, and the cities of Lausanne and Konstanz. The software vendor Axon Ivy and SBB Immobilien contributed valuable practical experience within a study partnership with the Institut für Wirtschaftsinformatik of the SML and the Institut für Prozesssteuerung of the HTWG Konstanz. The result is a snapshot of the strategic, analytical and practical capabilities, methods and tools with which organisations design, execute, monitor and continuously develop their business processes.
A semilinear distributed parameter approach for solenoid valve control including saturation effects
(2015)
In this paper a semilinear parabolic PDE for the control of solenoid valves is presented. The distributed parameter model of the cylinder becomes nonlinear through the inclusion of saturation effects due to the material's B/H-curve. A flatness-based solution of the semilinear PDE is shown, as well as a convergence proof for its series solution. Numerical simulation results demonstrate the adaptability of the approach, and differences between the linear and the nonlinear case are discussed. The major contribution of this paper is the inclusion of saturation effects into the linear diffusion equation governing the magnetic field, and the development of a flatness-based solution for the resulting semilinear PDE as an extension of the previous works [1] and [2].
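Schematically, including the saturation effects turns the linear diffusion model into a semilinear parabolic PDE, i.e., one whose nonlinearity enters only through lower-order terms. The notation below is illustrative of this structural step, not the paper's exact model.

```latex
% Illustrative form only: a linear diffusion equation for a magnetic
% quantity u, extended by a saturation-dependent nonlinearity f(u).
\begin{align*}
  \text{linear:}     \quad & \partial_t u(x,t) = \alpha\, \partial_x^2 u(x,t),\\
  \text{semilinear:} \quad & \partial_t u(x,t) = \alpha\, \partial_x^2 u(x,t)
                             + f\bigl(u(x,t)\bigr).
\end{align*}
```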
Knowing the position of the spool in a solenoid valve without using costly position sensors is of considerable interest in many industrial applications. In this paper, the problem of position estimation based on state observers for fast-switching solenoids, using only simple voltage and current measurements, is investigated. Due to the short spool traveling time in fast-switching valves, convergence of the observer errors has to be achieved very quickly. Moreover, the observer has to be robust against modeling uncertainties and parameter variations. Therefore, different state observer approaches are investigated and compared to each other with regard to possible uncertainties. The investigation covers a High-Gain-Observer approach, a combined High-Gain Sliding-Mode-Observer approach, both based on extended linearization, and a nonlinear Sliding-Mode-Observer based on equivalent output injection. The results are discussed by means of numerical simulations for all approaches, and finally physical experiments on a valve mock-up are thoroughly discussed for the nonlinear Sliding-Mode-Observer.
Motion safety for vessels
(2015)
The improvement of collision avoidance for vessels in close-range encounter situations is an important topic for maritime traffic safety. Typical approaches generate evasive trajectories or optimise the trajectories of all involved vessels. The idea of this work is to validate these trajectories with respect to guaranteed motion safety, which means that it is not sufficient for a trajectory to be collision-free; it must additionally ensure that an evasive manoeuvre is performable at any time. An approach using the distance and the evolution of the distance to the other vessels is proposed. The concept of Inevitable Collision States (ICS) is adopted to identify the states for which no evasive manoeuvre exists. Furthermore, it is implemented into a collision avoidance system for recreational craft to demonstrate the performance.
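The ICS criterion translates directly into code: a state is an Inevitable Collision State exactly when every candidate evasive manoeuvre from it still leads to a collision, and a trajectory guarantees motion safety only if none of its states is an ICS. The dynamics simulation, collision test and manoeuvre set in this sketch are hypothetical placeholders.

```python
# A state is an Inevitable Collision State (ICS) iff *every* evasive
# manoeuvre from it leads to a collision. simulate(), collides() and
# the manoeuvre set are placeholders for the vessel dynamics model.
def is_ics(state, manoeuvres, simulate, collides, horizon):
    return all(
        any(collides(s) for s in simulate(state, m, horizon))
        for m in manoeuvres
    )

def motion_safe(trajectory, manoeuvres, simulate, collides, horizon):
    # Collision-free is not enough: no state on the trajectory may be
    # an ICS, so an evasive manoeuvre stays performable at any time.
    return not any(is_ics(s, manoeuvres, simulate, collides, horizon)
                   for s in trajectory)
```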
The improvement of collision avoidance for vessels in close-range encounter situations is an important topic for maritime traffic safety. Typical approaches generate evasive trajectories or optimise the trajectories of all involved vessels. Such a collision avoidance system has to produce evasive manoeuvres that do not confuse other navigators. To achieve this behaviour, a probabilistic obstacle handling based on information from a radar sensor with target tracking, which considers measurement and tracking uncertainties, is proposed. A grid-based path search algorithm that takes the information from the probabilistic obstacle handling into account is then used to generate evasive trajectories. The proposed algorithms have been tested and verified in a simulated environment for inland waters.
Digitale Transformation
(2015)
Strategie der digitalen Ära
(2015)