This paper describes an early lumping approach for generating a mathematical model of the heating process of a moving dual-layer substrate. The heat is supplied by convection and nonlinearly distributed over the whole considered spatial extent of the substrate. Using CFD simulations as a reference, two different modelling approaches have been investigated in order to determine the most suitable model type. It is shown that, owing to the possibility of using the transition matrix for time discretization, an equivalent circuit model achieves superior results compared to the Crank-Nicolson method. In order to maintain a constant sampling time for the envisioned control strategies, the effect of variable speed is transformed into a system description in which the state vector has constant length but a variable number of non-zero entries. The handling of the variable transport speed during the heating process is considered the main contribution of this work. The result is a model suitable for use in future control strategies.
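A minimal sketch of one sampling step under such a description is given below; the matrices, names and the zero initialisation of freshly entering cells are illustrative assumptions, not the authors' model.

import numpy as np

def advance(x, A_d, B_d, u, cells_entering):
    """One sampling step of a hypothetical discretized heating model.

    x: state vector of fixed length (cell temperatures along the substrate)
    A_d, B_d: discrete-time matrices, e.g. obtained via the transition matrix
    u: convective heat input over the sampling interval
    cells_entering: speed-dependent number of cells transported into the
                    considered zone during this sample
    """
    # Transport: shift by a speed-dependent number of cells. Entering cells
    # start at the (zero) reference temperature, so the state keeps constant
    # length but a variable number of non-zero entries.
    x = np.roll(x, cells_entering)
    x[:cells_entering] = 0.0
    # Thermal dynamics evaluated at the fixed sampling time.
    return A_d @ x + B_d @ u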
Online-based business models, such as shopping platforms, have added new possibilities for consumers over the last two decades. Aside from basic differences to other distribution channels, customer reviews on such platforms have become a powerful tool that gives consumers an additional source of transparency. Related research has, for the most part, been labelled under the term electronic word-of-mouth (eWOM). Here, an approach is presented that provides a theoretical basis for this phenomenon. The approach is mainly based on work in the field of consumer culture theory (CCT) and on the concept of co-creation. The work of several authors in these streams of research is used to construct a culturally informed resource-based theory, as advocated by Arnould & Thompson and Algesheimer & Gurâu.
Generalized concatenated (GC) codes with soft-input decoding were recently proposed for error correction in flash memories. This work proposes a soft-input decoder for GC codes that is based on a low-complexity bit-flipping procedure. This bit-flipping decoder uses a fixed number of test patterns and an algebraic decoder for soft-input decoding. An acceptance criterion for the final candidate codeword is proposed. Combined with error and erasure decoding of the outer Reed-Solomon codes, this bit-flipping decoder can improve the decoding performance and reduce the decoding complexity compared to the previously proposed sequential decoding. The bit-flipping decoder achieves a decoding performance similar to a maximum likelihood decoder for the inner codes.
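The general idea of soft-input decoding with a fixed number of test patterns and an algebraic hard-input decoder can be sketched as follows; this is a generic Chase-style procedure with an assumed correlation-based acceptance metric, not the authors' exact decoder or acceptance criterion.

import itertools
import numpy as np

def bit_flipping_decode(llr, algebraic_decode, num_flips=3):
    """llr: per-bit log-likelihood ratios of the received word.
    algebraic_decode: hard-input decoder returning a codeword or None."""
    hard = (np.asarray(llr, dtype=float) < 0).astype(int)  # hard decisions
    weak = np.argsort(np.abs(llr))[:num_flips]             # least reliable bits
    best, best_metric = None, -np.inf
    # Fixed number (2**num_flips) of test patterns on the weak positions.
    for pattern in itertools.product([0, 1], repeat=num_flips):
        trial = hard.copy()
        trial[weak] ^= np.array(pattern)
        candidate = algebraic_decode(trial)
        if candidate is None:
            continue
        # Keep the candidate that agrees best with the soft input.
        metric = np.sum((1 - 2 * candidate) * llr)
        if metric > best_metric:
            best, best_metric = candidate, metric
    return best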
When designing drying processes for sensitive biological foodstuffs like fruit or vegetables, energy and time efficiency as well as product quality are gaining more and more importance. All of these are greatly influenced by the different drying parameters in the process (e.g. air temperature, air velocity and dew point temperature). In the sterilization of food products, the cooking value is widely used as a cross-link between such parameters. In a similar way, the so-called cumulated thermal load (CTL) was introduced for drying processes. This was possible because most quality changes mainly depend on drying air temperature and drying time. In a first approach, the CTL was therefore defined as the time integral of the surface temperature of agricultural products. When conducting experiments with mangoes and pineapples, however, it was found that the CTL in this form had to be adjusted to a more practical one. The definition of the CTL was therefore improved, and the behaviour of the adjusted CTL (CTLad) was investigated in the drying of pineapples and mangoes. On the basis of these experiments and the earlier work on the cooking value, it was found that the CTLad requires further optimization before a great variety of different products as well as different quality parameters can be compared.
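In its first form described above, the CTL is thus the time integral of the product surface temperature; in the notation assumed here (drying time t_d, surface temperature theta_s, symbols ours, not necessarily the authors'):

\mathrm{CTL} \;=\; \int_{0}^{t_\mathrm{d}} \vartheta_\mathrm{s}(t)\,\mathrm{d}t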
Advanced approaches for analysis and form finding of membrane structures with finite elements
(2018)
Part I deals with material modelling of woven fabric membranes. Due to their structure of crossed yarns embedded in a coating, woven fabric membranes are characterised by highly nonlinear stress-strain behaviour. In order to determine an accurate structural response of membrane structures, a suitable description of the material behaviour is required. A linear elastic orthotropic model approach, which is current practice, only allows a relatively coarse approximation of the material behaviour. The present work focuses on two different material approaches. A first approach becomes evident by focusing on the meso-scale: the inhomogeneous yet periodic structure of woven fabrics motivates microstructural modelling. An established microstructural model is considered and enhanced with regard to the coating stiffness. Secondly, an anisotropic hyperelastic material model for woven fabric membranes is considered. By performing inverse parameter identification, fits of the two different material models w.r.t. measured data from a common biaxial test are shown. The results of the inversely parametrised material models are compared and discussed.
Part II presents an extended approach for a simultaneous form finding and cutting patterning computation of membrane structures. The approach is formulated as an optimisation problem in which the geometries of both the equilibrium and the cutting patterning configuration are initially unknown. The design objectives are minimum deviations from prescribed stresses in warp and fill direction along with minimum shear deformation. The equilibrium equations are introduced into the optimisation problem as constraints. Additional design criteria can be formulated (for the geometry of seam lines etc.). Similar to the motivation for the Updated Reference Strategy [4], the described problem is singular in the tangent plane: in both the equilibrium and the cutting patterning configuration, finite element nodes can move without changing the stresses. Therefore, several approaches to stabilise the algorithm are presented. The overall result of the computation is a stressed equilibrium geometry and an unstressed cutting patterning geometry. The interaction of both configurations is described in a Total Lagrangian formulation.
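One plausible reading of this optimisation problem, in notation assumed here (prescribed warp and fill stresses with bars, shear angle gamma, weights w_i, discrete equilibrium residual r), is

\min \; J = \int_{A} \Big[ w_\mathrm{w}\,(\sigma_\mathrm{w}-\bar{\sigma}_\mathrm{w})^{2} + w_\mathrm{f}\,(\sigma_\mathrm{f}-\bar{\sigma}_\mathrm{f})^{2} + w_{\gamma}\,\gamma^{2} \Big]\,\mathrm{d}A \quad \text{s.t.} \quad \mathbf{r} = \mathbf{0}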
The microstructural model that is the focus of Part I is applied. Based on this approach, information about fibre orientation as well as the termination of fibres at cutting edges is available. As a result, more accurate results can be computed than with the simpler approaches commonly used in practice.
The paper investigates an innovative actuator combination based on the magnetic shape memory technology. The actuator is composed of an electromagnet, which is activated to produce motion, and a magnetic shape memory element, which is used passively to yield multistability, i.e. the possibility of holding a position without input power. Based on the experimental open-loop frequency characterization of the actuator, a position controller is developed and tested in several experiments.
This paper describes the effectiveness and efficiency of Virtual Reality training during a commissioning process. To this end, 500 picking orders with more than 2000 part-picking operations were conducted with 30 test persons and analyzed in the Modellfabrik Bodensee. The study points out the advantages and disadvantages of virtual training in comparison to a real execution of a picking process with and without any training.
Research on Shadow IT is facing a conceptual dilemma in cases where previously "covert" systems developed by business entities (individual users, business workgroups, or business units) are integrated in the organizational IT management. These systems become visible, are therefore not "in the shadows" anymore, and subsequently do not fit existing definitions of Shadow IT. Practice shows that some information systems share characteristics of Shadow IT but are created openly in alignment with the IT department. This paper therefore proposes the term "Business-managed IT" to describe "overt" information systems developed or managed by business entities. We distinguish Business-managed IT from Shadow IT by illustrating case vignettes. Accordingly, our contribution is to suggest a concept and its delineation against other concepts. In this way, IS researchers interested in IT originating from or maintained by business entities can construct theories with a wider scope of application that are at the same time more specific to practical problems. In addition, value-laden terminology is complemented by a vocabulary that values potentially innovative developments by business entities more adequately. From a practical point of view, the distinction can be used to discuss the distribution of task responsibilities for information systems.
Deep neural networks have become a veritable alternative to classic speaker recognition and clustering methods in recent years. However, while the speech signal clearly is a time series, and despite the body of literature on the benefits of prosodic (suprasegmental) features, identifying voices has usually not been approached with sequence learning methods. Only recently has a recurrent neural network (RNN) been successfully applied to this task, while the use of convolutional neural networks (CNNs), which, unlike RNNs, are not able to capture arbitrary time dependencies, still prevails. In this paper, we show the effectiveness of RNNs for speaker recognition by improving state-of-the-art speaker clustering performance and robustness on the classic TIMIT benchmark. We provide arguments why RNNs are superior by experimentally showing a "sweet spot" of the segment length for successfully capturing prosodic information that has been theoretically predicted in previous work.
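A compact sketch of an RNN-based speaker model of this kind is given below (PyTorch); the layer sizes and the classification head are assumptions for illustration, with 630 outputs matching the number of TIMIT speakers.

import torch
import torch.nn as nn

class SpeakerRNN(nn.Module):
    """Maps a segment of spectral features to a fixed-size speaker embedding
    that can subsequently be clustered."""
    def __init__(self, n_features=40, hidden=256, n_speakers=630):
        super().__init__()
        self.rnn = nn.LSTM(n_features, hidden, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, n_speakers)   # training-time classifier

    def forward(self, x):                # x: (batch, time, n_features)
        out, _ = self.rnn(x)
        emb = out[:, -1, :]              # last hidden state as embedding
        return emb, self.head(emb)

# The segment length (time dimension) is the "sweet spot" parameter: it must
# be long enough to capture prosodic cues.
emb, logits = SpeakerRNN()(torch.randn(8, 150, 40))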
Comparison and Identifiability Analysis of Friction Models for the Dither Motion of a Solenoid
(2018)
In this paper, the mechanical subsystem of a proportional solenoid excited by a dither signal is considered. The objective is to find a suitable friction model that reflects the characteristic mechanical properties of the dynamic system. Several different friction models from the literature are compared. The friction models are evaluated with respect to their accuracy as well as their practical identifiability, the latter being quantified based on the Fisher information matrix.
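One common sensitivity-based form of the Fisher information matrix, assuming additive Gaussian measurement noise with variance sigma^2 and output sensitivities s_k = d y_hat(t_k, theta) / d theta (notation assumed here, not taken from the paper), is

\mathbf{F}(\boldsymbol{\theta}) = \frac{1}{\sigma^{2}} \sum_{k=1}^{N} \mathbf{s}_{k}^{\mathsf{T}} \mathbf{s}_{k}

Near-singularity of this matrix indicates poorly identifiable friction parameters.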
Long-term sleep monitoring is primarily carried out in the home environment. Good patient acceptance requires low user and installation barriers. The selection of parameters in this approach is significantly limited compared to a PSG session. The aim is a qualified selection of parameters which, on the one hand, allow a sufficiently good classification of sleep phases and, on the other hand, can be detected by non-invasive methods.
Corporate venturing is one way for corporations to introduce strategic renewal into their business portfolios, which is imperative for ongoing success in innovation-driven industries. Prior research finds that corporate ventures should be separated from the mainstream business in loosely coupled sub-units, but scholars continue to discuss how loose or tight the ventures should be to balance exploration and exploitation. Hence, the antecedents for successful venture management are yet to be fully explored, and our study contributes to this effort. The study shows that corporate venture success is enhanced when corporate management grants job and strategic autonomy to the venture managers. This is further amplified when corporate management simultaneously imposes an exploitative policy that forces venture managers to prioritize extensions to and improvements of existing competences and product-market offerings.
In this paper we present a method using deep learning to compute parametrizations for B-spline curve approximation. Existing methods consider the computation of parametric values and a knot vector as separate problems. We propose to train interdependent deep neural networks to predict parametric values and knots. We show that it is possible to include B-spline curve approximation directly into the neural network architecture. The resulting parametrizations yield tight approximations and are able to outperform state-of-the-art methods.
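For reference, the classic baseline that such learned parametrizations compete with combines chord-length parameter values with linear least-squares fitting; a sketch using SciPy, with uniform interior knots assumed for simplicity:

import numpy as np
from scipy.interpolate import make_lsq_spline

def approximate_curve(points, n_ctrl=10, degree=3):
    d = np.linalg.norm(np.diff(points, axis=0), axis=1)
    t = np.concatenate([[0.0], np.cumsum(d)]) / d.sum()   # chord length
    inner = np.linspace(0.0, 1.0, n_ctrl - degree + 1)[1:-1]
    knots = np.concatenate([[0.0] * (degree + 1), inner,
                            [1.0] * (degree + 1)])
    return t, make_lsq_spline(t, points, knots, k=degree)

pts = np.column_stack([np.linspace(0, 1, 50),
                       np.sin(np.linspace(0, np.pi, 50))])
t, spline = approximate_curve(pts)
print(np.abs(spline(t) - pts).max())   # tightness at the parameter values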
When integrating task-based learning into LSP courses, assessment procedures have to reflect the communicative goals of such tasks. However, performances on authentic and complex tasks are often difficult to score, because they involve not only specific language ability but also field specific knowledge. Moreover, the emphasis of tasks on meaning suggests that the communicative outcome should be regarded as the main criterion of success. This paper focuses on task design and scoring. It is argued that tasks have to be adapted to assessment purposes. However, to avoid a misalignment between teaching/learning and assessment, they should comprise the main features of LSP tasks in the sense of task-based language learning.
Einfluss der Oberfläche
(2018)
The DEX Deutscher Ethik Index® provides information on whether a company's success has been achieved in a decent manner. The target group of the published DEX comprises all stakeholder groups and society. The DEX informs whether the company creates added value for all of its stakeholders (= stakeholder value). The DEX supports decisions by the public on whether an organization is granted the "license to operate" on the basis of its contribution to society (= shared value).
"Bare facts are not enough to keep the reader engaged to the very end. [Also the] author of [specialist and] non-fiction texts has the right, if not indeed the duty, to tap into the wonders of the art of storytelling in order to turn his texts into interesting and enjoyable reading." The talk pursues the question of whether this demand by the widely read writing-guide author Sol Stein is justified and to what extent it can be met. For this purpose, stylistic and compositional devices from the areas of lexis, grammar, syntax, stylistics, narrative strategy and text organization are presented on the basis of example formulations and examined with regard to their function in narrative literature or journalism. In a second step, the transferability of each of these linguistic devices to specialist texts is discussed. Using examples with a parallel orientation, a possible shifted function in the academic context is sketched. The hoped-for insights of the talk concern both the writing of one's own publications and the teaching of writing skills to future specialist authors, which is why the example formulations are taken from a wide range of domains, including technical and scientific ones.
(Reference: Stein, Sol: Über das Schreiben [1995], German translation by Waltraud Götting, 10th ed., Zweitausendeins, Frankfurt a. M., 2006, quotation p. 49)
With the increased deployment of biometric authentication systems, some security concerns have also arisen. In particular, presentation attacks directed to the capture device pose a severe threat. In order to prevent them, liveness features such as the blood flow can be utilised to develop presentation attack detection (PAD) mechanisms. In this context, laser speckle contrast imaging (LSCI) is a technology widely used in biomedical applications in order to visualise blood flow. We therefore propose a fingerprint PAD method based on textural information extracted from pre-processed LSCI images. Subsequently, a support vector machine is used for classification. In the experiments conducted on a database comprising 32 different artefacts, the results show that the proposed approach classifies correctly all bona fides. However, the LSCI technology experiences difficulties with thin and transparent overlay attacks.
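A minimal sketch of the texture-plus-SVM pipeline follows; local binary patterns stand in here for whichever textural features are actually extracted, and the training data is synthetic.

import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC

def lbp_histogram(image, radius=3, n_points=24):
    """Uniform LBP histogram of one pre-processed LSCI image."""
    lbp = local_binary_pattern(image, n_points, radius, method="uniform")
    hist, _ = np.histogram(lbp, bins=n_points + 2,
                           range=(0, n_points + 2), density=True)
    return hist

rng = np.random.default_rng(0)
train_images = rng.random((20, 64, 64))   # stand-in LSCI crops
train_labels = np.tile([0, 1], 10)        # 0 = attack, 1 = bona fide
X = np.vstack([lbp_histogram(img) for img in train_images])
clf = SVC(kernel="rbf").fit(X, train_labels)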
Today’s markets are characterized by fast and radical changes, posing an essential challenge to established companies. Startups, however, seem to be more capable of developing the radical innovations needed to succeed in those volatile markets. Thus, established companies have started to experiment with various approaches to implement startup-like structures in their organizations. Internal corporate accelerators (ICAs) are a novel form of corporate venturing, aiming to foster bottom-up innovations through intrapreneurship. However, ICAs still lack empirical investigation. This work contributes to a deeper understanding of the interface between the ICA and the core organization and the respective support activities (resource access and support services) that create an innovation-supportive work environment for the intrapreneurial team. The results of this qualitative study, comprising 12 interviews with ICA teams from two German high-tech companies, show that the resources provided by ICAs differ from the support activities of external accelerators. Further, the study shows that some resources have both supportive and obstructive potential for the intrapreneurial teams within the ICA.
Corporate venturing has gained much attention due to challenges and changes that occur because of discontinuous innovations, which seem to be promoted by digitalization. In this context, open innovation has become a promising tool for established companies to strengthen their innovation capabilities. While the external opening of the innovation process has gained much attention, the internal opening lacks investigation. Especially new organizational forms, such as Internal Corporate Accelerators, have not been investigated sufficiently. This study, which is based on 13 interviews from two German tech companies, contributes to a better understanding of this new form of corporate venturing and the resulting effects on organizational renewal.
Deep neural networks have been successfully applied to problems such as image segmentation, image super-resolution, coloration and image inpainting. In this work we propose the use of convolutional neural networks (CNN) for image inpainting of large regions in high-resolution textures. Due to limited computational resources, processing high-resolution images with neural networks is still an open problem. Existing methods separate the inpainting of global structure from the transfer of details, which leads to blurry results and a loss of global coherence in the detail transfer step. Based on advances in texture synthesis using CNNs, we propose patch-based image inpainting by a single network topology that is able to optimize for global as well as detail texture statistics. Our method is capable of filling large inpainting regions, oftentimes exceeding the quality of comparable methods for high-resolution images (2048x2048px). For reference patch look-up we propose to use the same summary statistics that are used in the inpainting process.
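The summary statistics in question can be illustrated with the Gram matrices familiar from CNN texture synthesis; a sketch assuming VGG-style feature maps, not necessarily the exact statistics used in this work:

import torch

def gram_matrix(features):
    """Channel-wise feature correlations of one network layer,
    returned with shape (batch, channels, channels)."""
    b, c, h, w = features.shape
    f = features.reshape(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def texture_distance(feat_a, feat_b):
    """Distance usable both as an optimization target and, per the
    proposal above, for reference patch look-up."""
    return torch.sum((gram_matrix(feat_a) - gram_matrix(feat_b)) ** 2)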
Digital signatures for verifying the integrity of data, for example of software updates, are becoming increasingly important. In the field of embedded systems, symmetric encryption schemes are currently still predominantly used to compute an authentication code because of their low complexity. Asymmetric cryptosystems are computationally more expensive but offer more security, because the key used for authentication does not have to be kept secret. Asymmetric signature schemes are typically computed in two stages: the key is not applied directly to the data but to their hash value, which is computed beforehand using a hash function. To use these schemes in embedded systems, the hash function must provide a sufficiently high data throughput. This contribution presents an efficient hardware implementation of the SHA-256 hash function.
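The two-stage scheme can be sketched in a few lines of Python; the hash step is what the presented hardware accelerates, while ECDSA is merely one example of an asymmetric signature scheme, chosen here for illustration.

import hashlib
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec, utils

firmware = b"..."                                  # stand-in update image
digest = hashlib.sha256(firmware).digest()         # stage 1: hash the data

private_key = ec.generate_private_key(ec.SECP256R1())
signature = private_key.sign(                      # stage 2: sign the hash
    digest, ec.ECDSA(utils.Prehashed(hashes.SHA256())))
private_key.public_key().verify(                   # raises if tampered with
    signature, digest, ec.ECDSA(utils.Prehashed(hashes.SHA256())))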
Influence of Temperature on the Corrosion Behaviour of Stainless Steels under Tribological Stress
(2018)
Poster presentation
Ingenieure auf die Bühne
(2018)
Investigation of magnetic effects on austenitic stainless steels after low temperature carburization
(2018)
We propose a novel end-to-end neural network architecture that, once trained, directly outputs a probabilistic clustering of a batch of input examples in one pass. It estimates a distribution over the number of clusters k, and for each 1 ≤ k ≤ k_max, a distribution over the individual cluster assignment for each data point. The network is trained in advance in a supervised fashion on separate data to learn grouping by any perceptual similarity criterion based on pairwise labels (same/different group). It can then be applied to different data containing different groups. We demonstrate promising performance on high-dimensional data like images (COIL-100) and speech (TIMIT). We call this "learning to cluster" and show its conceptual difference to deep metric learning, semi-supervised clustering and other related approaches, while having the advantage of performing learnable clustering fully end-to-end.
This work proposes a construction for low-density parity-check (LDPC) codes over finite Gaussian integer fields. Furthermore, a new channel model for codes over Gaussian integers is introduced and its channel capacity is derived. This channel can be considered a first-order approximation of the additive white Gaussian noise channel with hard-decision detection, where only errors to nearest neighbors in the signal constellation are considered. For this channel, the proposed LDPC codes can be decoded with a simple non-probabilistic iterative decoding algorithm similar to Gallager's decoding algorithm A.
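For intuition, a hard-decision bit-flipping decoder over GF(2), a simple relative of Gallager's algorithm A, looks as follows; the paper's codes are defined over Gaussian integer fields, so this binary toy is only an analogy.

import numpy as np

def bit_flip_decode(H, y, max_iter=50):
    x = y.copy()
    for _ in range(max_iter):
        syndrome = H @ x % 2
        if not syndrome.any():
            return x                  # all parity checks satisfied
        unsat = syndrome @ H          # unsatisfied checks per bit
        x = np.where(unsat == unsat.max(), x ^ 1, x)
    return x

# Toy example: (7,4) Hamming code, single bit error at position 2.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
received = np.zeros(7, dtype=int); received[2] = 1
print(bit_flip_decode(H, received))   # recovers the all-zero codeword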
Algorithms for calculating the string edit distance are used, e.g., in information retrieval and document analysis systems or for the evaluation of text recognizers. Text recognition based on CTC-trained LSTM networks includes a decoding step to produce a string, possibly using a language model, and evaluation using the string edit distance. The decoded string can further be used as a query for database search, e.g. in document retrieval. We propose to closely integrate dictionary search with text recognition to train both combined in a continuous fashion. This work shows that LSTM networks are capable of calculating the string edit distance while allowing for an exchangeable dictionary to separate the learned algorithm from the data. This could be a step towards integrating text recognition and dictionary search in one deep network.
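The underlying quantity that the network learns to compute is the classic dynamic-programming edit distance, shown here for reference:

def edit_distance(a, b):
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            curr.append(min(prev[j] + 1,                # deletion
                            curr[j - 1] + 1,            # insertion
                            prev[j - 1] + (ca != cb)))  # substitution
        prev = curr
    return prev[-1]

assert edit_distance("kitten", "sitting") == 3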
It is widely recognized that sustainability is a new challenge for many manufacturing companies. In this paper, we tackle this issue by presenting an approach that deals with material and substance compliance within Product Lifecycle Management in a complex value chain. Our analysis explains why, how and when sustainable manufacturing arises, and it identifies, quantifies and evaluates the environmental impact of a new product. We propose (I) a Life Cycle Assessment tool (LCA) and (II) a model to validate this approach and evaluate the risk of non-compliance in supply chain. Our LCA approach provides comprehensive information on environmental impacts of a product.
Product and materials cycles are parallel and intersecting, which makes it challenging to integrate the material selection process across Product Lifecycle Management and to integrate LCA with PLM. We provide only a foundation; further research in systems engineering is necessary. LCA is sensitive to data quality. Outsourced production and problems in supplier cooperation can result in material mismatches (e.g. mismatched properties or composition) in the production process, which may lead to misleading LCA results. This paper also describes research challenges concerning risk-based due diligence.
Offline handwriting recognition systems often use LSTM networks trained with line or word images. Multi-line text makes it necessary to use segmentation to explicitly obtain these images. Skewed, curved, overlapping or incorrectly written text, as well as noise, can lead to errors during the segmentation of multi-line text and reduce the overall recognition capacity of the system. The last year has seen the introduction of deep learning methods capable of segmentation-free recognition of whole paragraphs. Our method uses Conditional Random Fields to represent text and align it with the network output in order to calculate a loss function for training. Experiments are promising and show that the technique is capable of training an LSTM multi-line text recognition system.
This paper focuses on the multivariable control of a drawing tower process. The nature of the process, together with the differences in the measurement noise levels that affect the variables to be controlled, motivated the development of a new MPC algorithm. An extension of a multivariable predictive control algorithm with separated prediction horizons is proposed. The obtained experimental results show the usefulness of the proposed algorithm.
Product development and product manufacturing are entering a new era, namely an era in which engineering tasks are executed in collaboration of all involved parties. Engineers and potential customers work together, mainly in a virtual world, on the design and realization of the product. We address this so-called "crowdsourcing" trend in the automotive industry, which lowers cost and accelerates the production of new cars. Current practice and prior studies fail to handle data management and collaboration aspects in sufficient detail. We propose a PLM-based crowdsourcing platform that applies best practices to the established approach and adapts it with new methods for handling specific requirements. Our work provides a basis for establishing an improved collaboration platform to support a gig economy in the automotive industry.
Sleep studies can be used for the detection of sleep quality and general bed behaviour. The results can be helpful for regulating sleep and recognizing different human sleep disorders. In comparison to the leading standard measuring system, polysomnography (PSG), the system proposed in this work is a non-invasive sleep monitoring device. For continuous analysis or home use, PSG or wearable actigraphy devices tend to be uncomfortable. Besides, these methods not only decrease practicality due to the process of having to put them on, but they are also very expensive. The system proposed in this paper classifies respiration and body movement with only one type of sensor and in a non-invasive way. The sensor used is a low-cost pressure sensor that can also be used for commercial purposes. The system was tested in an experiment that recorded the sleep process of a subject. These recordings showed excellent results in the classification of breathing rate and body movements.
Optical surface inspection: A novelty detection approach based on CNN-encoded texture features
(2018)
In inspection systems for textured surfaces, a reference texture is typically known before novel examples are inspected. Mostly, the reference is only available in a digital format. As a consequence, there is no dataset of defective examples available that could be used to train a classifier. We propose a texture model approach to novelty detection. The texture model uses features encoded by a convolutional neural network (CNN) trained on natural image data. The CNN activations represent the specific characteristics of the digital reference texture which are learned by a one-class classifier. We evaluate our novelty detector in a digital print inspection scenario. The inspection unit is based on a camera array and a flashing light illumination which allows for inline capturing of multichannel images at a high rate. In order to compare our results to manual inspection, we integrated our inspection unit into an industrial single-pass printing system.
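A condensed sketch of the detector's structure is shown below; VGG-16 stands in for whichever pre-trained CNN is used, and pooled per-channel activations for the exact encoding, both assumptions for illustration.

import torch
import torchvision.models as models
from sklearn.svm import OneClassSVM

cnn = models.vgg16(weights="IMAGENET1K_V1").features.eval()

def encode(patches):                       # (n, 3, 224, 224) image patches
    with torch.no_grad():
        acts = cnn(patches)                # (n, 512, 7, 7) activations
    return acts.mean(dim=(2, 3)).numpy()   # pooled texture statistics

reference = torch.rand(16, 3, 224, 224)    # stand-in digital reference crops
detector = OneClassSVM(nu=0.05).fit(encode(reference))
# At inspection time, detector.predict(encode(...)) flags novel textures.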
The use of plagiarism-detection software should be tailored to the target group. Since assessing the digital test report requires expert knowledge, unaccompanied access does not appear sensible. Since plagiarism usually occurs not with fraudulent intent but out of ignorance of the rules, restricting efforts to blanket detection is not a solution. Comprehensive knowledge of the rules, an understanding of the fundamental importance of intertextual references, and self-responsible conduct are the teaching goals of the HTWG writing advisory service. To this end, the HTWG model makes attendance of a writing course a prerequisite for admission to the voluntary plagiarism check and reserves interpretive authority over the test report.
In tourism, energy demands are particularly high. Tourism facilities such as hotels require large amounts of electric and heating and cooling energy. Their supply, however, is usually still based on fossil energies. This research approach analyses the potential of promoting renewable energies in Black Forest tourism. It focuses on a combined and hence highly efficient production of both electric and thermal energy by biogas plants on the one hand and its provision to local tourism facilities via short distance networks on the other. Based on surveys and qualitative empiricism and considering regional resource availability as well as socio-economic aspects, it thus examines strengths, weaknesses, opportunities and threats that can arise from such a cooperation.
In tourism, energy demands are particularly high. Tourism facilities such as hotels require large amounts of electric and heating / cooling energy while their supply is usually still based on fossil energies.
This research approach analyses the potential of promoting renewable energies in tourism. It focuses on a combined and hence highly efficient production of both electric and thermal energy by biogas plants on the one hand and its provision to local tourism facilities via short distance networks on the other. Considering regional resource availability as well as socio-economic aspects, it thus examines strengths, weaknesses, opportunities and threats that can arise from such a micro-cooperation. The research aim is to provide an actor-based, spatially transferable feasibility analysis.
We consider the problem of increasing the informative value of electrocardiographic (ECG) surveys using data from multichannel electrocardiographic leads, which include both the recorded electrocardiosignals and the coordinates of the electrodes placed on the surface of the human torso. In this area, we are interested in reconstructing the surface distribution of the equivalent sources during the cardiac cycle at relatively low hardware cost. In our work, we propose to reconstruct the equivalent electrical sources by numerical methods based on the integral connection between the density of electrical sources and the potential in a conductive medium. We consider maps of the distributions of equivalent electric sources on the heart surface (HSSM), presenting the source distributions in the form of a simple or double electrical layer. The dynamics of the heart's electrical activity are indicated by the space-time mapping of the equivalent electrical sources in the HSSM.
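For a simple (single) electrical layer in a homogeneous conductor, the integral connection between source density and potential takes the familiar form (assumed notation: conductivity sigma, heart surface S_H, source density q):

\varphi(\mathbf{r}) = \frac{1}{4\pi\sigma} \oint_{S_\mathrm{H}} \frac{q(\mathbf{r}')}{\lvert \mathbf{r} - \mathbf{r}' \rvert}\, \mathrm{d}S'

Reconstructing q from the electrode potentials is the inverse problem addressed in the work.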
SDG Voyager - A practical guide to align business excellence with Sustainable Development Goals
(2018)
By now, an almost inflationary number of international publications on the topic "Agenda 2030" exists. Yet the question of how a company's CSR management can make a concrete contribution to the SDGs seems to remain unanswered to this day. Instead of unilaterally demanding that companies report their sustainability activities, the SDG Voyager starts earlier in the process, with the intention of encouraging companies of all sizes to become familiar with the fields of action for corporate responsibility and to attend to these issues without feeling overwhelmed. Many companies will find that they are already making a big contribution to sustainable development in a number of fields. In other areas, however, there will still be an urgent need for action. The SDG Voyager aims to acquaint companies with these topics and support them in fulfilling their responsibilities towards their stakeholders and society.
The process of restoring our body and brain from fatigue depends directly on the quality of sleep, which can be determined from the results of a sleep study. The classification of sleep stages is the first step of such a study and includes the measurement of biovital data and its further processing.
In this work, the sleep analysis system is based on a hardware sensor net, namely a grid of 24 pressure sensors, supporting sleep phase recognition. In comparison to the leading standard, polysomnography, the proposed approach is a non-invasive system. It recognises respiration and body movement with only one type of low-cost pressure sensor forming a mesh architecture. The nodes are implemented as a series of pressure sensors connected to a low-power yet performant microcontroller. All nodes are connected via a system-wide bus with address arbitration. The embedded processor is the mesh network endpoint that enables network configuration, storing and pre-processing of the data, external data access and visualization.
The system was tested in experiments recording the sleep of different healthy young subjects. The results obtained indicate the potential to detect breathing rate and body movement. A major difference of this system in comparison to other approaches is the innovative placement of the sensors under the mattress. This characteristic facilitates continuous use of the system without any influence on the normal sleep process.
Simulationsmethoden bei der Entwicklung von Extremleichtbaukomponenten in Faserverbundbauweise
(2018)
This paper presents a bed system able to analyze a person's movement and breathing and to recognize the positions in which the subject lies on the bed during the night, without any additional physical contact. The measurements are performed with sensors placed between the mattress and the bed frame. An Intel Edison board was used as an endpoint that serves as a communication node from the mesh network to an external service. Two nodes and the Intel Edison board are attached to the bottom of the bed frame and are connected to the sensors. First test results have indicated the potential of the proposed approach for the recognition of sleep positions, with 83% of positions correctly recognized.
Smart-Future-Living-Bodensee
(2018)
We investigated 50 randomly selected buffer overflow vulnerabilities in Firefox. The source code of these vulnerabilities and the corresponding patches were manually reviewed, and patterns were identified. Our main contributions are taxonomies of errors, sinks and fixes as seen from a developer's point of view. The results are compared to the CWE taxonomy with an emphasis on vulnerability details. Additionally, some ideas are presented on how the taxonomies could be used to improve software security education.
To get a better understanding of Cross-Site Scripting vulnerabilities, we investigated 50 randomly selected CVE reports related to open source projects. The vulnerable and patched source code was manually reviewed to find out what kinds of source code patterns were used. Source code pattern categories were found for sources, concatenations, sinks, HTML contexts and fixes. Our resulting categories are compared to categories from CWE. A source code sample that might have led developers to believe that the data was already sanitized is described in detail. For the different HTML context categories, the necessary Cross-Site Scripting prevention mechanisms are described.
The Role of Support-Activities for the successful Implementation of Internal Corporate Accelerators
(2018)
Business units are increasingly able to fuel the transformation that digitalization demands of organizations. To do so, they implement Shadow IT (SIT) to create flexible and innovative solutions. However, the individual implementation of SIT leads to high complexity and redundancy. Integration suggests itself to meet these challenges but can also eliminate the described benefits. In this emergent research, we develop propositions for a conceptual decision framework that balances the benefits and drawbacks of an integration of SIT, using a literature review as well as a multiple-case study. We thereby integrate the perspective of the overall organization as well as that of the specific business unit. We then pose six propositions regarding SIT integration that will serve to evaluate our conceptual framework in future research.
Visualization-Assisted Development of Deep Learning Models in Offline Handwriting Recognition
(2018)
Deep learning is a field of machine learning that has been the focus of active research and successful applications in recent years. Offline handwriting recognition is one of the research fields and applications where deep neural networks have shown high accuracy. Deep learning models and their training pipeline expose a large number of hyper-parameters in their data selection, transformation, network topology and training process that are sometimes interdependent. This increases the overall difficulty and time necessary for building and training a model for a specific data set and task at hand. This work proposes a novel visualization-assisted workflow that guides the model developer through the hyper-parameter search in order to identify relevant parameters and modify them in a meaningful way. This decreases the overall time necessary for building and training a model. The contributions of this work are a workflow for hyper-parameter search in offline handwriting recognition and a heat-map-based visualization technique for deep neural networks in multi-line offline handwriting recognition. This work applies to offline handwriting recognition, but the general workflow can possibly be adapted to other tasks as well.
Ziegelsplittbetone der Nachkriegsjahre und moderne RC-Betone - Nachhaltigkeit an Objektbeispielen
(2018)
The construction sector is among the largest consumers of natural resources and energy in the German economy. In many cases this is nevertheless ecologically and economically justifiable, because building components and structures have a considerably longer technical service life than other "products" (in the case of concrete between roughly 25 and 100 years) and because high recycling rates can be achieved after demolition. With regard to saving potentials, the mass construction material concrete plays a central role. Besides saving the very energy-intensive cement and developing substitute binders, the aggregates, which make up the largest share of concrete, are also in focus. Based on our own results from a DBU-funded research project on RC concretes, the article presents the sustainability of concrete using example structures and reports on the current use of R-concretes in Germany. The focus is on the value of mineral building materials made from comparatively few, predominantly natural components, as is the case with concrete, on its repair options, and on its better preconditions for later recycling compared to many modern, polymer-containing composite building materials.