Refine
Year of publication
- 2018 (77)
Document Type
- Conference Proceeding (77)
Has Fulltext
- no (77)
Keywords
- Agenda 2030 (1)
- Anisotropic hyperelasticity (1)
- Anstand (1)
- Automotive Industry (1)
- Bernstein polynomial (1)
- Bio-vital data (1)
- Bundeskongress Compliance Management (1)
- Cauchon algorithm (2)
- Checkerboard partial ordering (1)
- Cloud (1)
Institute
- Fakultät Informatik (12)
- Fakultät Maschinenbau (3)
- Fakultät Wirtschafts-, Kultur- und Rechtswissenschaften (2)
- Institut für Angewandte Forschung - IAF (2)
- Institut für Optische Systeme - IOS (5)
- Institut für Strategische Innovation und Technologiemanagement - IST (4)
- Institut für Systemdynamik - ISD (6)
- Institut für professionelles Schreiben - IPS (2)
- Konstanz Institut für Corporate Governance - KICG (2)
- Konstanzer Institut für Prozesssteuerung - KIPS (2)
SDG Voyager - A practical guide to align business excellence with Sustainable Development Goals
(2018)
By now, an inflated number of international publications on the topic "Agenda 2030" exists. Yet the question of how a company's CSR management can make a concrete contribution to the SDGs appears to remain unanswered to this day. Instead of unilaterally demanding that companies report their sustainability activities, the SDG Voyager starts earlier in the process, with the intention of encouraging companies of all sizes to become familiar with the fields of action for corporate responsibility and to attend to these issues without feeling overwhelmed. Many companies will find that they are already making a big contribution to sustainable development in a number of fields. In other areas, however, there will still be an urgent need for action. The SDG Voyager aims to acquaint companies with these topics and to support them in fulfilling their responsibilities towards their stakeholders and society.
The DEX Deutscher Ethik Index® provides information on whether a company's success has been achieved in a decent manner. The published DEX is addressed to all stakeholders and to society. The DEX reports whether a company creates added value for all of its stakeholders (= stakeholder value). It supports the public's decision on whether an organization is granted the "license to operate" on the basis of its contribution to society (= shared value).
In this paper we present a method using deep learning to compute parametrizations for B-spline curve approximation. Existing methods consider the computation of parametric values and a knot vector as separate problems. We propose to train interdependent deep neural networks to predict parametric values and knots. We show that it is possible to include B-spline curve approximation directly into the neural network architecture. The resulting parametrizations yield tight approximations and are able to outperform state-of-the-art methods.
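For context on the parametrization problem: the classical chord-length baseline, against which learned parametrizations are typically compared, assigns each data point a parameter proportional to accumulated chord length. A minimal sketch of that baseline (not the paper's neural network):

```python
import math

def chord_length_params(points):
    """Normalized cumulative chord-length parametrization for a point sequence.

    Returns one parameter in [0, 1] per point; these values would then be
    paired with a knot vector to set up B-spline least-squares fitting.
    """
    # Accumulate Euclidean distances between consecutive points.
    dists = [math.dist(p, q) for p, q in zip(points, points[1:])]
    total = sum(dists)
    params = [0.0]
    acc = 0.0
    for d in dists:
        acc += d
        params.append(acc / total)
    return params
```

The paper's contribution is precisely to replace fixed rules like this one with parameters and knots predicted by interdependent networks.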
Deep neural networks have been successfully applied to problems such as image segmentation, image super-resolution, coloration and image inpainting. In this work we propose the use of convolutional neural networks (CNN) for image inpainting of large regions in high-resolution textures. Due to limited computational resources processing high-resolution images with neural networks is still an open problem. Existing methods separate inpainting of global structure and the transfer of details, which leads to blurry results and loss of global coherence in the detail transfer step. Based on advances in texture synthesis using CNNs we propose patch-based image inpainting by a single network topology that is able to optimize for global as well as detail texture statistics. Our method is capable of filling large inpainting regions, oftentimes exceeding quality of comparable methods for images of high-resolution (2048x2048px). For reference patch look-up we propose to use the same summary statistics that are used in the inpainting process.
It is widely recognized that sustainability is a new challenge for many manufacturing companies. In this paper, we tackle this issue by presenting an approach that deals with material and substance compliance within Product Lifecycle Management in a complex value chain. Our analysis explains why, how and when sustainable manufacturing arises, and it identifies, quantifies and evaluates the environmental impact of a new product. We propose (I) a Life Cycle Assessment (LCA) tool and (II) a model to validate this approach and evaluate the risk of noncompliance in the supply chain. Our LCA approach provides comprehensive information on the environmental impacts of a product.
Product and material cycles run in parallel and intersect, which makes it challenging to integrate the material selection process across Product Lifecycle Management and to integrate LCA with PLM. We provide only a foundation; further research in systems engineering is necessary. LCA is sensitive to data quality: outsourced production and problems in supplier cooperation can lead to material mismatches in the production process (e.g., mismatched properties or composition), which in turn may render LCA results misleading.
This paper also describes research challenges using risk-based due diligence.
Product development and product manufacturing are entering a new era, one in which engineering tasks are executed in collaboration among all involved parties. Engineers and potential customers work together, mainly in a virtual world, on the design and realization of the product. We address this so-called "crowdsourcing" trend in the automotive industry, which lowers cost and accelerates the production of new cars. Current practice and prior studies fail to handle data management and collaboration aspects in sufficient detail. We propose a PLM-based crowdsourcing platform that applies best practices to the established approach and adapts it with new methods for handling specific requirements. Our work provides a basis for establishing an improved collaboration platform to support a gig economy in the automotive industry.
This paper presents a bed system able to analyze a person's movement and breathing and to recognize the positions in which the subject lies on the bed during the night, without any additional physical contact. The measurements are performed with sensors placed between the mattress and the bed frame. An Intel Edison board was used as an endpoint serving as a communication node from the mesh network to an external service. Two nodes and the Intel Edison board are attached to the bottom of the bed frame and connected to the sensors. First test results have indicated the potential of the proposed approach for the recognition of sleep positions, with 83% of positions correctly recognized.
Influence of Temperature on the Corrosion Behaviour of Stainless Steels under Tribological Stress
(2018)
Poster presentation
We consider the problem of increasing the informative value of electrocardiographic (ECG) surveys using data from multichannel electrocardiographic leads, including both the recorded electrocardiosignals and the coordinates of the electrodes placed on the surface of the human torso. In this area, we are interested in reconstructing the surface distribution of the equivalent sources during the cardiac cycle at relatively low hardware cost. We propose to reconstruct the equivalent electrical sources by numerical methods, based on the integral relation between the density of electrical sources and the potential in a conductive medium. We consider maps of the distributions of equivalent electric sources on the heart surface (HSSM), presenting the source distributions in the form of a simple or double electrical layer. We represent the dynamics of the heart's electrical activity by the space-time mapping of equivalent electrical sources in the HSSM.
Sleep studies can be used to assess sleep quality and bed behavior in general. These results can be helpful for regulating sleep and recognizing various human sleep disorders. In comparison to the leading standard measuring system, polysomnography (PSG), the system proposed in this work is a non-invasive sleep monitoring device. For continuous analysis or home use, PSG or wearable actigraphy devices tend to be uncomfortable. Besides, these methods not only reduce practicality because of the process of putting them on, they are also very expensive. The system proposed in this paper classifies respiration and body movement with only one type of sensor, and in a non-invasive way. The sensor used is a pressure sensor, which is low cost and can be used for commercial purposes. The system was tested in an experiment that recorded the sleep process of a subject. The recordings showed excellent results in the classification of breathing rate and body movements.
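To illustrate the kind of signal processing such a system involves, a breathing rate can be estimated from a pressure trace by counting upward zero crossings of the mean-removed signal. This is a hypothetical sketch, not the authors' classifier, and it assumes breathing dominates the pressure variation:

```python
import math

def breathing_rate(samples, fs):
    """Estimate breaths per minute from a raw pressure trace.

    Counts upward zero crossings of the mean-removed signal;
    fs is the sampling rate in Hz.
    """
    mean = sum(samples) / len(samples)
    x = [s - mean for s in samples]
    # An upward zero crossing marks the start of one breathing cycle.
    crossings = sum(1 for a, b in zip(x, x[1:]) if a <= 0.0 < b)
    duration_min = len(samples) / fs / 60.0
    return crossings / duration_min

# Synthetic 0.25 Hz "breathing" signal sampled at 10 Hz for 60 s.
fs = 10.0
sig = [math.sin(2 * math.pi * 0.25 * (n / fs) - 0.1) for n in range(600)]
```

A real implementation would add band-pass filtering to separate respiration from body-movement artifacts before counting crossings.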
Investigation of magnetic effects on austenitic stainless steels after low temperature carburization
(2018)
Ingenieure auf die Bühne
(2018)
Einfluss der Oberfläche
(2018)
Long-term sleep monitoring can be done primarily in the home environment. Good patient acceptance requires low user and installation barriers. The selection of parameters in this approach is significantly limited compared to a PSG session. The aim is a qualified selection of parameters, which on the one hand allow a sufficiently good classification of sleep phases and on the other hand can be detected by non-invasive methods.
The process of restoring our body and brain from fatigue depends directly on the quality of sleep, which can be determined from the report of the sleep study results. Classification of sleep stages is the first step of such a study and includes the measurement of bio-vital data and its further processing.
In this work, the sleep analysis system is based on a hardware sensor net, namely a grid of 24 pressure sensors, supporting sleep phase recognition. In comparison to the leading standard, polysomnography, the proposed approach is a non-invasive system. It recognises respiration and body movement with only one type of low-cost pressure sensor forming a mesh architecture. The nodes are implemented as a series of pressure sensors connected to a low-power yet performant microcontroller. All nodes are connected via a system-wide bus with address arbitration. The embedded processor is the mesh network endpoint that enables network configuration, storage and pre-processing of the data, external data access, and visualization.
The system was tested in experiments recording the sleep of several healthy young subjects. The results obtained indicate the potential to detect breathing rate and body movement. A major difference of this system in comparison to other approaches is the innovative placement of the sensors under the mattress. This characteristic facilitates continuous use of the system without any influence on the normal sleep process.
The paper investigates an innovative actuator combination based on the magnetic shape memory technology. The actuator is composed of an electromagnet, which is activated to produce motion, and a magnetic shape memory element, which is used passively to yield multistability, i.e. the possibility of holding a position without input power. Based on the experimental open-loop frequency characterization of the actuator, a position controller is developed and tested in several experiments.
Business units are increasingly able to fuel the transformation that digitalization demands of organizations. To do so, they implement Shadow IT (SIT) to create flexible and innovative solutions. However, the individual implementation of SIT leads to high complexity and redundancy. Integration suggests itself to meet these challenges but can also eliminate the described benefits. In this emergent research, we develop propositions for a conceptual decision framework that balances the benefits and drawbacks of integrating SIT, using a literature review as well as a multiple-case study. We thereby integrate the perspective of the overall organization as well as that of the specific business unit. We then pose six propositions regarding SIT integration that will serve to evaluate our conceptual framework in future research.
"Bare facts are not enough to keep the reader engaged to the very end. [Also the] author of [technical and] non-fiction texts has the right, if not the duty, to tap into the wonders of narrative craft in order to turn his texts into interesting and enjoyable reading." The talk examines whether this demand by the widely read writing-guide author Sol Stein is justified and to what extent it can be met. To this end, stylistic and compositional devices from the areas of lexis, grammar, syntax, stylistics, narrative strategy, and text organization are presented through example formulations and examined with regard to their function in narrative literature or journalism. In a second step, the transferability of each of these linguistic devices to technical texts is discussed. Using parallel examples, a possible shifted function in the academic context is sketched. The hoped-for insight of the talk concerns both the writing of one's own publications and the teaching of writing skills to future technical authors, which is why the example formulations are drawn from a wide range of domains, including technical and scientific ones.
(Reference: Stein, Sol: Über das Schreiben [1995], German translation by Waltraud Götting, 10th ed., Zweitausendeins, Frankfurt a. M., 2006, quoted from p. 49)
The use of plagiarism-detection software should be tailored to its target group. Since interpreting the digital inspection report requires expert knowledge, unaccompanied access does not appear sensible. Since plagiarism usually results not from intent to deceive but from ignorance of the rules, restricting efforts to blanket detection is no solution. Comprehensive knowledge of the rules, an understanding of the fundamental importance of intertextual references, and self-responsible conduct are the teaching goals of the HTWG writing advisory service. To this end, the HTWG model makes attendance of a writing course a prerequisite for admission to the voluntary plagiarism check and reserves the right to interpret the inspection report.
When integrating task-based learning into LSP courses, assessment procedures have to reflect the communicative goals of such tasks. However, performances on authentic and complex tasks are often difficult to score, because they involve not only specific language ability but also field specific knowledge. Moreover, the emphasis of tasks on meaning suggests that the communicative outcome should be regarded as the main criterion of success. This paper focuses on task design and scoring. It is argued that tasks have to be adapted to assessment purposes. However, to avoid a misalignment between teaching/learning and assessment, they should comprise the main features of LSP tasks in the sense of task-based language learning.
Generalized concatenated (GC) codes with soft-input decoding were recently proposed for error correction in flash memories. This work proposes a soft-input decoder for GC codes that is based on a low-complexity bit-flipping procedure. This bit-flipping decoder uses a fixed number of test patterns and an algebraic decoder for soft-input decoding. An acceptance criterion for the final candidate codeword is proposed. Combined with error and erasure decoding of the outer Reed-Solomon codes, this bit-flipping decoder can improve the decoding performance and reduce the decoding complexity compared to the previously proposed sequential decoding. The bit-flipping decoder achieves a decoding performance similar to a maximum likelihood decoder for the inner codes.
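The general test-pattern idea can be illustrated with a Chase-style sketch: flip combinations of the least reliable received bits, run a hard-decision algebraic decoder on each test pattern, and keep the candidate closest to the received soft values. This is a heavily simplified stand-in, not the paper's GC construction or acceptance criterion; a (7,4) Hamming decoder plays the role of the inner algebraic decoder:

```python
from itertools import combinations

# Parity-check matrix of the (7,4) Hamming code (toy inner code).
H = [
    [1, 1, 1, 0, 1, 0, 0],
    [1, 1, 0, 1, 0, 1, 0],
    [1, 0, 1, 1, 0, 0, 1],
]

def hamming_decode(bits):
    """Hard-decision decoder: correct at most one bit via the syndrome."""
    syndrome = [sum(h * b for h, b in zip(row, bits)) % 2 for row in H]
    if any(syndrome):
        # A single-bit error yields a syndrome equal to a column of H.
        for j in range(7):
            if [row[j] for row in H] == syndrome:
                bits = bits[:j] + [bits[j] ^ 1] + bits[j + 1:]
                break
    return bits

def chase_decode(llr, num_flips=2):
    """Try test patterns over the least reliable positions and pick the
    candidate codeword with the smallest soft discrepancy."""
    hard = [1 if l < 0 else 0 for l in llr]
    weak = sorted(range(len(llr)), key=lambda i: abs(llr[i]))[:num_flips]
    best, best_metric = None, float("inf")
    for r in range(num_flips + 1):
        for pos in combinations(weak, r):
            pattern = hard[:]
            for p in pos:
                pattern[p] ^= 1
            cand = hamming_decode(pattern)
            # Penalize positions where the candidate disagrees with
            # the sign of the received LLR.
            metric = sum(abs(l) for l, b in zip(llr, cand)
                         if (1 if l < 0 else 0) != b)
            if metric < best_metric:
                best, best_metric = cand, metric
    return best
```

With two low-reliability errors, the test patterns let the single-error Hamming decoder recover the transmitted word, which is the effect the fixed-pattern bit-flipping decoder exploits at scale.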
This work proposes a construction for low-density parity-check (LDPC) codes over finite Gaussian integer fields. Furthermore, a new channel model for codes over Gaussian integers is introduced and its channel capacity is derived. This channel can be considered as a first order approximation of the additive white Gaussian noise channel with hard decision detection where only errors to nearest neighbors in the signal constellation are considered. For this channel, the proposed LDPC codes can be decoded with a simple non-probabilistic iterative decoding algorithm similar to Gallager's decoding algorithm A.
Digital signatures for verifying the integrity of data, for example of software updates, are becoming increasingly important. In embedded systems, symmetric encryption schemes are currently still predominantly used to compute an authentication code because of their low complexity. Asymmetric cryptosystems are computationally more expensive but offer greater security, because the key used for authentication does not have to be kept secret. Asymmetric signature schemes are typically computed in two stages: the key is not applied to the data directly but to their hash value, which is computed beforehand with a hash function. To use such schemes in embedded systems, the hash function must deliver a sufficiently high data throughput. This paper presents an efficient hardware implementation of the SHA-256 hash function.
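The two-stage hash-then-sign structure described above can be sketched as follows. `toy_sign` is a hypothetical placeholder for a real asymmetric primitive (RSA, ECDSA), included only to show where the SHA-256 digest enters the scheme:

```python
import hashlib

def toy_sign(digest: bytes, private_key: int) -> int:
    """Hypothetical stand-in for an asymmetric signing primitive.
    A real system would use RSA or ECDSA here; the point is that
    only the fixed-size digest is signed, never the raw data."""
    return int.from_bytes(digest, "big") ^ private_key

def sign_update(firmware: bytes, private_key: int) -> int:
    # Stage 1: compress the (potentially large) update with SHA-256.
    digest = hashlib.sha256(firmware).digest()
    # Stage 2: apply the private key to the 32-byte digest only.
    return toy_sign(digest, private_key)
```

Because stage 1 processes the entire payload while stage 2 touches only 32 bytes, the hash function's throughput dominates, which is why the paper targets a fast SHA-256 hardware core.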
Optical surface inspection: A novelty detection approach based on CNN-encoded texture features
(2018)
In inspection systems for textured surfaces, a reference texture is typically known before novel examples are inspected. Mostly, the reference is only available in a digital format. As a consequence, there is no dataset of defective examples available that could be used to train a classifier. We propose a texture model approach to novelty detection. The texture model uses features encoded by a convolutional neural network (CNN) trained on natural image data. The CNN activations represent the specific characteristics of the digital reference texture which are learned by a one-class classifier. We evaluate our novelty detector in a digital print inspection scenario. The inspection unit is based on a camera array and a flashing light illumination which allows for inline capturing of multichannel images at a high rate. In order to compare our results to manual inspection, we integrated our inspection unit into an industrial single-pass printing system.
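The one-class idea can be illustrated in a heavily simplified, hypothetical form (the paper uses CNN-encoded texture statistics and a trained one-class classifier): model the reference features with per-dimension mean and deviation, and flag samples that fall far outside that model.

```python
import math

def fit_reference(features):
    """Per-dimension mean and standard deviation of reference feature vectors."""
    n = len(features)
    dims = len(features[0])
    means = [sum(f[d] for f in features) / n for d in range(dims)]
    stds = [max(1e-9, math.sqrt(sum((f[d] - means[d]) ** 2 for f in features) / n))
            for d in range(dims)]
    return means, stds

def is_novel(x, means, stds, threshold=3.0):
    """Flag a sample whose maximum per-dimension z-score exceeds the threshold."""
    return max(abs(v - m) / s for v, m, s in zip(x, means, stds)) > threshold
```

The key property mirrored here is that only defect-free reference data is needed for training, which matches the inspection scenario where no dataset of defective examples exists.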
To get a better understanding of Cross-Site Scripting vulnerabilities, we investigated 50 randomly selected CVE reports related to open source projects. The vulnerable and patched source code was manually reviewed to find out what kinds of source code patterns were used. Source code pattern categories were found for sources, concatenations, sinks, HTML contexts, and fixes. The resulting categories are compared to categories from CWE. A source code sample that might have led developers to believe that the data was already sanitized is described in detail. For the different HTML context categories, the necessary Cross-Site Scripting prevention mechanisms are described.
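As a concrete instance of such context-dependent prevention, HTML element content can be neutralized with standard entity escaping. Note that attribute, URL, and JavaScript contexts each require their own encoder; this sketch covers only the element-content case:

```python
import html

def render_comment(user_input: str) -> str:
    # Escaping <, >, &, and quotes keeps the payload from being
    # parsed as markup when placed inside HTML element content.
    return "<p>" + html.escape(user_input) + "</p>"
```

Applying the wrong encoder for the context (e.g., entity escaping inside a script block) is exactly the kind of mismatch the categorized fixes distinguish.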
We investigated 50 randomly selected buffer overflow vulnerabilities in Firefox. The source code of these vulnerabilities and the corresponding patches were manually reviewed, and patterns were identified. Our main contributions are taxonomies of errors, sinks, and fixes seen from a developer's point of view. The results are compared to the CWE taxonomy with an emphasis on vulnerability details. Additionally, some ideas are presented on how the taxonomy could be used to improve software security education.