Untersuchung und Darstellung der Qualitätsveränderung von Agrarprodukten während der Trocknung
(2019)
The aim of this work was to identify optimal drying processes for various agricultural products. To this end, the quality criteria of fresh and dried agricultural products were analyzed, and the changes caused by the different drying parameters, such as air velocity, dew-point temperature, drying temperature, and drying time, were presented. A literature review examined the factors behind post-harvest losses and their extent in industrialized countries as well as in emerging and developing countries. In addition, the agricultural products and their quality-determining constituents are introduced. The extraction and analysis methods are also presented and explained: high-performance liquid chromatography and ion-exclusion chromatography, as well as UV/Vis spectroscopy and polarimetry. Furthermore, during the drying processes, images were taken at defined time intervals with the dryer's integrated camera and examined with specially developed software with regard to color change and shrinkage of the agricultural products. The experimental results were compiled and verified using statistical software. New diagrams, so-called damage diagrams, were introduced; these diagrams make it possible to identify optimal drying processes. A drying temperature of ~60 °C proved optimal for chilies, ~64 °C to 74 °C for potatoes, ~43 °C for pineapples, and ~60 °C for mangoes. Dew-point temperatures of <12 °C / >27 °C for chilies, ~30 °C for potatoes, ~14 °C for pineapples, and ~20 °C for mangoes were likewise optimal. An air velocity of about 1.2 m/s was found to be optimal (potatoes: ~1.2 m/s; pineapples: ~1.2 m/s; mangoes: ~0.9 m/s). The results showed that, for each of the four agricultural products, the drying temperature had the largest effect on the degradation of the quality-determining properties.
For example, ascorbic acid, total sugar content, and organic acids were degraded more strongly with increasing drying temperature. In the future, in addition to the optimal drying conditions, it should be taken into account that the size, shape, and texture of the samples have a decisive influence on stationary drying processes. It is also conceivable to employ non-stationary drying processes, in which the quality-reducing enzymes are first inactivated at high temperatures and drying then continues at lower temperatures and thus lower thermal stress. Care should also be taken that products are not over-dried, so that in the future drying is carried out only to just below the maximum residual moisture content rather than to constant weight, as in this work.
In the field of autonomous driving, environment perception that covers dynamic objects such as other road users is essential. In particular, detecting other vehicles in road traffic from sensor data is of utmost importance. As the sensor data and the applied system model for the objects of interest are corrupted by noise, a filter algorithm must be used to track moving objects. With LIDAR sensors, one object gives rise to more than one measurement per time step and is therefore called an extended object. This allows the object's position as well as its orientation, extension, and shape to be estimated jointly. Estimating an arbitrarily shaped object comes with a higher computational effort than estimating the shape of an object that can be approximated by a basic geometric shape such as an ellipse or a rectangle. In the case of a vehicle, a rectangular shape is an accurate assumption.
A recently developed approach models the contour of a vehicle as a periodic B-spline function. This representation is an easy-to-use tool, as the contour can be specified by a few basis points in Cartesian coordinates; rotating, scaling, and moving the contour are also easy to handle with a spline contour. This contour model can be used to develop a measurement model for extended objects that can be integrated into a tracking filter. Another approach to modeling the shape of a vehicle is the so-called bounding box, which represents the shape as a rectangle.
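The convenience of the spline contour representation can be sketched in a few lines. The following is a hedged illustration of the general idea, not the thesis code: a closed uniform quadratic B-spline is sampled from a handful of basis points, and rotating, scaling, and moving the contour reduces to transforming the basis points. All numeric values are hypothetical.

```python
# Sketch: closed uniform quadratic B-spline contour from 2-D basis points.
# Our toy illustration of the spline contour idea, not the thesis implementation.

import math

def eval_closed_quadratic_bspline(points, samples_per_segment=10):
    """Sample a closed uniform quadratic B-spline defined by 2-D basis points."""
    n = len(points)
    contour = []
    for i in range(n):  # one polynomial segment per basis point
        p0, p1, p2 = points[i], points[(i + 1) % n], points[(i + 2) % n]
        for k in range(samples_per_segment):
            t = k / samples_per_segment
            b0 = 0.5 * (1 - t) ** 2      # uniform quadratic B-spline basis
            b1 = -t * t + t + 0.5        # functions; they sum to 1, so the
            b2 = 0.5 * t * t             # curve transforms with its basis points
            contour.append((b0 * p0[0] + b1 * p1[0] + b2 * p2[0],
                            b0 * p0[1] + b1 * p1[1] + b2 * p2[1]))
    return contour

def transform(points, angle, scale, shift):
    """Rotate, scale, and translate the basis points (and hence the contour)."""
    c, s = math.cos(angle), math.sin(angle)
    return [(scale * (c * x - s * y) + shift[0],
             scale * (s * x + c * y) + shift[1]) for x, y in points]

# Hypothetical rectangle-like vehicle contour from four basis points:
basis = [(-2.0, -1.0), (2.0, -1.0), (2.0, 1.0), (-2.0, 1.0)]
contour = eval_closed_quadratic_bspline(transform(basis, math.pi / 4, 1.0, (10.0, 5.0)))
```

Because the basis functions form a partition of unity, applying the rigid transform to the basis points is equivalent to transforming every contour point, which is what makes the representation convenient inside a tracking filter.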
This thesis covers the basics of single-, multi-, and extended-object tracking as well as the basics of B-spline functions. Afterwards, the spline measurement model is derived in detail and integrated into an extended Kalman filter to track a single extended object. An implementation of the resulting algorithm is compared with the rectangular shape estimator, whose implementation is provided. The comparison is based on long-term considerations with Monte Carlo simulations and on the analysis of single runs. To this end, both algorithms are applied to the same measurements, which are generated by an artificial LIDAR sensor in a simulation environment.
In a real-world tracking scenario, several extended objects may be present, along with measurements that do not originate from a real object, called clutter measurements. The sudden appearance and disappearance of an object are also possible. A filter framework investigated in recent years that can handle tracking multiple objects in a cluttered environment is the random-finite-set-based approach. The idea of random finite sets and their use in a tracking filter is recapped in this thesis. Afterwards, the spline measurement model is included in a multi extended object tracking framework. An implementation of the resulting filter is investigated in a long-term consideration using Monte Carlo simulations and by analyzing the results of a single run. The multi extended object filter is also applied to artificial LIDAR measurements generated in a simulation environment.
The comparison of the spline-based and rectangle-based extended object trackers shows a more stable performance of the spline extended object tracker. Some problems that have to be addressed in future work are also discussed. The investigation of the resulting multi extended object tracker shows a successful integration of the spline measurement model into a multi extended object tracker; here, too, some problems remain to be solved in future work.
This paper presents the integration of a spline-based extension model into a probability hypothesis density (PHD) filter for extended targets. With this filter, the position and extension of each object as well as the number of present objects can be estimated jointly. To this end, the spline extension model and the PHD filter are described and merged in a Gaussian mixture (GM) implementation. Simulation results based on artificial laser measurements are used to evaluate the performance of the presented filter. Finally, the results are illustrated and discussed.
Summary of the 8th Workshop on Metallization and Interconnection for Crystalline Silicon Solar Cells
(2019)
This article gives a summary of the 8th Metallization and Interconnection Workshop and attempts to place each contribution in the appropriate context. The field of metallization and interconnection continues to progress at a very fast pace. Several printing techniques can now achieve linewidths below 20 μm. Screen printing is more than ever the dominant metallization technology in the industry, with finger widths of 45 μm in routine mass production and values below 20 μm in the lab. Plating technology is also being improved, particularly through the development of lower-cost patterning techniques. Interconnection technology is changing fast, with the introduction of multiwire and shingled-cell technologies in mass production. New models and characterization techniques are being introduced to study and understand these new interconnection technologies in detail.
The aim of this master's thesis was to characterize the moisture properties of screeds under different climates with the help of sorption isotherms. The few literature references on sorption isotherms of mineral screeds essentially concern calcium sulfate screeds and standardized cement screeds (without distinguishing the cement type: Portland cements, blast-furnace cements, i.e. CEM I, CEM II, CEM III, etc.) and, as a rule, only one air temperature (20 °C). The intention of this work was additionally to investigate the ternary rapid cements that have been common on the market for about 20 years, and to include the temperatures of 15 °C and 25 °C, which are of practical interest in construction. The effects of the climatic conditions on the construction site (season, humidity, temperature) on the hydration process of the screeds were also examined. In each case, not just one representative of the various binder systems but at least two different screeds from different manufacturers were included. In combination with the results of the microstructure investigations (including mercury intrusion porosimetry), it is demonstrated why the cement-bound and rapid-cement-bound screeds behave completely differently from the calcium-sulfate-bound screeds. This different behavior is also one of the reasons why the moisture content of screeds cannot be assessed with the KRL method. For this reason, a comparison of the "KRL method" of material moisture measurement with the "CM method", which is customary in the trade and has proven itself in practice for decades, follows.
Flatness-based feedforward control of solenoid actuators is considered. For precise motion planning and accurate steering of conventional solenoids, eddy currents cannot be neglected. The system of ordinary differential equations including eddy currents that describes the nonlinear dynamics of such actuators is not differentially flat. Thus, a distributed-parameter approach based on a diffusion equation is considered, which enables the parametrization of the eddy current by the armature position and its time derivatives. To design the feedforward control, the distributed-parameter model of the eddy current subsystem is combined with a typical nonlinear lumped-parameter model for the electrical and mechanical subsystems of the solenoid. The control design and its application are illustrated by numerical and practical results for an industrial solenoid actuator.
At the University of Applied Sciences Konstanz, Germany, a modern electronically controlled dynamometer and several cars are available for tests. Numerous studies have been carried out, and the latest results are presented. The paper explains different tests under load. One focus is the driving cycle WLTC (Worldwide harmonized Light vehicles Test Cycle) and the requirements for properly conducting investigations with this driving cycle. Two- and three-wheelers are of great importance for mobility in various Asian countries, but this segment is also very important in other countries for so-called first- or last-mile vehicles. For this reason, a short explanation of the driving cycle WMTC (Worldwide harmonized Motorcycle Emissions Certification/Test Procedure) is given. The various possibilities for operating the dynamometer and for carrying out various experiments are shown.
Other important figures that can be determined on a dynamometer are the wheel power, the power losses and, ultimately, the engine performance. With the brake-specific torque, the traction force at the propelled wheels and the maximum acceleration or maximum gradeability of a car can be determined.
The load-dependent slippage can also be measured on the dynamometer. The dynamic wheel radius of the driven wheels has a significant influence on the slippage. Because the tires heat up during the tests, the tire pressure increases. A rise in tire temperature, tire pressure, and wheel speed results in an increase of the dynamic wheel radius and the slippage. Equations for the determination of the dynamic wheel radius are presented.
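The relationship between dynamic wheel radius and slip can be illustrated with a minimal sketch. This is our own formalization under a common textbook definition of longitudinal slip, not the equations presented in the paper; all numbers are hypothetical.

```python
# Sketch (assumed definitions, not the paper's equations): dynamic wheel
# radius and longitudinal slip of a driven wheel on a dynamometer roller.

def dynamic_wheel_radius(v_roller_mps: float, omega_wheel_radps: float) -> float:
    """Estimate the dynamic wheel radius from a slip-free reference run:
    roller surface speed divided by wheel angular velocity."""
    return v_roller_mps / omega_wheel_radps

def slip_ratio(omega_wheel_radps: float, r_dyn_m: float, v_roller_mps: float) -> float:
    """Longitudinal slip: (circumferential speed - roller speed) / circumferential speed."""
    v_circ = omega_wheel_radps * r_dyn_m
    return (v_circ - v_roller_mps) / v_circ

# Hypothetical unloaded reference run: roller at 20 m/s, wheel at 64.5 rad/s.
r_dyn = dynamic_wheel_radius(20.0, 64.5)   # roughly 0.31 m
# Under load the wheel spins slightly faster for the same roller speed:
s = slip_ratio(66.0, r_dyn, 20.0)          # positive slip of a few percent
```

The sketch also shows why a growing dynamic radius matters: for a fixed roller speed, any increase in r_dyn raises the circumferential speed and therefore the computed slip.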
This thesis deals with the problem of tracking multiple extended objects. This tracking problem occurs, for instance, when a car equipped with sensors drives on the road and detects multiple other cars in front of it. When the setup between the sensor and the other cars is such that each single car creates multiple measurements, the cars are called extended objects. This can occur in real-world scenarios, mainly with the use of high-resolution sensors in near-field applications. In such a near-field scenario, a single object occupies several resolution cells of the sensor, so that multiple measurements are generated per scan. The measurements are additionally superimposed by the sensor's noise. Besides the object-generated measurements, false alarms occur that are not caused by any object, and in some sensor scans single objects may be missed, so that they do not generate any measurements.
To handle these scenarios, object tracking filters are needed that process the sensor measurements in order to obtain a stable and accurate estimate of the objects in each sensor scan. The scope of this thesis is to implement such a tracking filter for extended objects, i.e. a filter that estimates their positions and extents. In this context, the topic of measurement partitioning arises, a pre-processing of the measurement data: the measurements that were likely generated by one object are put into one cluster, also called a cell, and the obtained cells are then processed by the tracking filter in the estimation process. The partitioning of the measurement data is crucial for the performance of the tracking filter, because insufficient partitioning leads to poor tracking performance, i.e. inaccurate object estimates.
In this thesis, a Gaussian inverse Wishart Probability Hypothesis Density (GIW-PHD) filter was implemented to handle the multiple extended object tracking problem. Within this filter framework, the set of objects is modelled as a Random Finite Set (RFS) and the objects' extents as random matrices (RM). The partitioning methods used to cluster the measurement data are existing ones as well as a new approach based on likelihood sampling methods. The applied classical heuristic methods are Distance Partitioning (DP) and Sub-Partitioning (SP), whereas the proposed likelihood-based approach is called Stochastic Partitioning (StP). The latter was developed in this thesis on the basis of the Stochastic Optimisation approach by Granström et al. An implementation, including the StP method and its integration into the filter framework, is provided within this thesis.
The implementations, using the different partitioning methods, were tested on simulated random multi-object scenarios and on a fixed parallel tracking scenario using Monte Carlo methods. Furthermore, a runtime analysis provides insight into the computational effort of the different partitioning methods. The results show that the StP method outperforms the classical partitioning methods in scenarios where the objects move spatially close to each other: the filter using StP performs more stably and yields more accurate estimates. However, this advantage comes with a higher computational effort compared to the classical heuristic partitioning methods.
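The heuristic Distance Partitioning baseline mentioned above can be sketched compactly. The following is a hedged illustration in the spirit of the method described by Granström et al., not the thesis implementation: two measurements end up in the same cell whenever a chain of pairwise distances below a threshold connects them (single-linkage clustering via union-find).

```python
# Sketch of Distance Partitioning: cluster measurements whose pairwise
# distances fall below a threshold, with transitive closure.
# Illustrative only; not the thesis code.

import math

def distance_partition(measurements, threshold):
    """Partition 2-D measurements into cells by single-linkage with a fixed threshold."""
    n = len(measurements)
    parent = list(range(n))             # union-find forest over measurement indices

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    def union(i, j):
        parent[find(i)] = find(j)

    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(measurements[i], measurements[j]) < threshold:
                union(i, j)

    cells = {}
    for i in range(n):
        cells.setdefault(find(i), []).append(measurements[i])
    return list(cells.values())

# Two well-separated pairs of measurements (hypothetical values):
z = [(0.0, 0.0), (0.4, 0.1), (5.0, 5.0), (5.2, 4.9)]
cells = distance_partition(z, 1.0)      # yields two cells of two measurements each
```

In practice the method is run for several thresholds derived from the measurement noise, producing a set of alternative partitions for the filter to weigh; the sketch shows only a single threshold. The failure mode motivating StP is visible here too: if two objects approach within the threshold, their measurements merge into one cell.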
This paper presents a new likelihood-based partitioning method of the measurement set for the extended object probability hypothesis density (PHD) filter framework. Recent work has mostly relied on heuristic partitioning methods that cluster the measurement data based on a distance measure between the single measurements. This can lead to poor filter performance if the tracked extended objects are closely spaced. The proposed method, called Stochastic Partitioning (StP), is based on sampling methods and was inspired by earlier work of Granström et al. In this work, the StP method is applied to a Gaussian inverse Wishart (GIW) PHD filter and compared to a second filter implementation that uses the heuristic Distance Partitioning (DP) method. The performance is evaluated in Monte Carlo simulations of a scenario in which two objects approach each other. It is shown that the sampling-based StP method leads to an improved filter performance compared to DP.
Flooded Edge Gateways
(2019)
Increasing numbers of internet-compatible devices, in particular in the context of the IoT, usually cause increasing amounts of data. Processing and analyzing a continuously growing amount of data in real time by means of cloud platforms can no longer be guaranteed. Edge computing approaches decentralize parts of the data analysis logic towards the data sources in order to control the data transfer rate to the cloud through pre-processing with predefined quality-of-service parameters. In this paper, we present a solution for preventing overloaded gateways by optimizing the transfer of IoT data through a combination of Complex Event Processing and Machine Learning. The presented solution is based entirely on open-source technologies and can therefore also be used by smaller companies.
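The pre-processing idea can be made concrete with a minimal sketch. This is our own illustration of windowed pre-aggregation at an edge gateway, not the paper's CEP/ML system; class name, window size, and threshold are all hypothetical.

```python
# Sketch: an edge gateway pre-aggregates a sensor stream over a sliding
# window and forwards a value to the cloud only when a simple event
# condition fires, reducing the upstream transfer rate.
# Illustrative only; the paper's actual solution combines CEP and ML.

from collections import deque

class EdgePreaggregator:
    def __init__(self, window_size: int, alert_threshold: float):
        self.window = deque(maxlen=window_size)   # sliding window of recent readings
        self.alert_threshold = alert_threshold

    def ingest(self, value: float):
        """Return the window mean if it exceeds the threshold, else None (suppressed)."""
        self.window.append(value)
        mean = sum(self.window) / len(self.window)
        return mean if mean > self.alert_threshold else None

# Hypothetical temperature stream; only the last two windows trigger forwarding.
gw = EdgePreaggregator(window_size=5, alert_threshold=30.0)
forwarded = [v for v in (gw.ingest(t) for t in [21, 22, 24, 45, 80, 90]) if v is not None]
```

Six raw readings shrink to two forwarded aggregates, which is the essence of controlling the gateway-to-cloud transfer rate through local pre-processing.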
In light of the current case law of the highest courts, an aged or otherwise infirm injured party suing for monetary compensation is forced to factor his approaching death into any decision on appeals as a considerable litigation risk. This leads to a considerable diminution of the non-material components of his personality right. It is therefore urgently necessary to establish, by way of judicial development of the law, criteria for groups of exceptional cases in order to reliably exclude intolerable hardship cases in the future.
For some time now, the Maasai have been taking on internationally operating fashion houses, accusing them of illegitimate cultural appropriation. The ethnic group now wants to beat the companies at their own game and plans to commodify its cultural signs. Whoever intends to do so is, upon entering the international market, necessarily subject to the same rules as every other market participant. This means that without solid exclusive rights, the Maasai will, predictably, not be taken seriously as potential licensors by prospective negotiating partners. The African ethnic group also runs the risk that the cultural-historical significance of its signs will be overwritten by commercial use or, in the worst case, even reinterpreted.
The Industrial Internet of Things (IIoT) will leverage wireless network technologies to seamlessly integrate cyber-physical systems into existing information systems. In this context, the 6TiSCH architecture proposed by the IETF represents the current leading standardization effort to enable timed and reliable data communication within IPv6 networks for industrial applications. In wireless networks, Link Quality Estimation (LQE) is a crucial task for selecting the best routes for data forwarding despite unpredictable, time-varying conditions. Although many solutions for LQE have been proposed in the literature, the majority of them are not designed specifically for 6TiSCH networks. In this paper, we analyze the performance of existing LQE strategies in 6TiSCH networks.
First, we run a set of simulations to measure the performance of one existing LQE strategy in 6TiSCH. Our simulations show that this strategy can result in measurements with low accuracy due to the 6TiSCH default timeslot allocation strategy. Consequently, we propose an extension of the 6TiSCH Minimal Configuration that allocates specific timeslots for the transmission of probing messages to mitigate the problem. The proposed methodology is demonstrated to effectively reduce the LQE error.
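As background on what an LQE strategy computes, the following sketch shows one widely used family of estimators: an exponentially weighted moving average (EWMA) over frame delivery outcomes, approximating the packet reception ratio. This is a generic illustration of PRR-based LQE as commonly used in low-power wireless stacks, not the specific strategy evaluated in the paper; the smoothing factor is a hypothetical choice.

```python
# Sketch: EWMA-based link quality estimation from delivery outcomes.
# Generic PRR-style estimator for illustration, not the paper's strategy.

class EwmaLqe:
    def __init__(self, alpha: float = 0.2):
        self.alpha = alpha   # smoothing factor: weight given to the newest sample
        self.prr = None      # estimated packet reception ratio in [0, 1]

    def update(self, delivered: bool) -> float:
        """Fold one transmission outcome into the running PRR estimate."""
        sample = 1.0 if delivered else 0.0
        self.prr = sample if self.prr is None else (
            self.alpha * sample + (1 - self.alpha) * self.prr)
        return self.prr

lqe = EwmaLqe(alpha=0.5)
for outcome in [True, True, False, True]:
    estimate = lqe.update(outcome)
```

The sketch also makes the paper's accuracy concern tangible: if the schedule offers a link too few transmission opportunities, the estimator folds in very few samples and the PRR estimate converges slowly, which is what dedicated probing timeslots address.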
When a country grants preferential tariffs to another, either reciprocally in a free trade agreement (FTA) or unilaterally, rules of origin (RoOs) are defined to determine whether a product is eligible for preferential treatment. RoOs exist to prevent exports from third countries from entering through the member with the lowest tariff (trade deflection). However, RoOs distort exporters' sourcing decisions and burden them with red tape. Using a global data set, we show that, for 86% of all bilateral product-level comparisons within FTAs, trade deflection is not profitable because external tariffs are rather similar and transportation costs are non-negligible; in the case of unilateral trade preferences extended by rich countries to poor ones, that ratio is a striking 98%. The pervasive and unconditional use of RoOs is therefore hard to rationalize.
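The profitability condition behind these percentages can be stated in one line. The following is our own simplified formalization for illustration, not the authors' exact empirical specification: deflection pays off only if the saving from the tariff differential exceeds the extra transportation cost of rerouting.

```python
# Toy version of the trade-deflection profitability check (our simplification,
# not the paper's specification). All arguments are ad-valorem fractions
# of the product price; the example numbers are hypothetical.

def deflection_profitable(tariff_a: float, tariff_b: float,
                          extra_transport_cost: float) -> bool:
    """True if rerouting through the lower-tariff FTA member saves more
    than the additional transport cost it incurs."""
    return abs(tariff_a - tariff_b) > extra_transport_cost

# External tariffs of 5% vs. 7%, with rerouting adding 3% of the product
# value in transport cost: the 2-point tariff saving does not cover it.
profitable = deflection_profitable(0.05, 0.07, 0.03)   # False
```

With similar external tariffs and non-negligible transport costs, the condition fails for most product pairs, which is the paper's core finding.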
Error correction coding (ECC) for optical communication and persistent storage systems requires high-rate codes that enable high data throughput and low residual errors. Recently, different concatenated coding schemes were proposed that are based on binary Bose-Chaudhuri-Hocquenghem (BCH) codes with low error-correcting capabilities. Commonly, hardware implementations for BCH decoding are based on the Berlekamp-Massey algorithm (BMA). However, for single-, double-, and triple-error-correcting BCH codes, Peterson's algorithm can be more efficient than the BMA. The known hardware architectures of Peterson's algorithm require Galois field inversion, which dominates the hardware complexity and limits the decoding speed. This work proposes an inversion-less version of Peterson's algorithm. Moreover, a decoding architecture is presented that is faster than decoders that employ inversion or the fully parallel BMA, at a comparable circuit size.
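To see why decoding BCH codes with low error-correcting capability can be lightweight, consider the extreme case t = 1. The sketch below is a toy illustration over GF(2^4) (a single-error-correcting, Hamming-equivalent BCH code), not the paper's proposed architecture: for t = 1 the error locator is the syndrome itself, so neither field inversion nor a BMA iteration is needed.

```python
# Toy single-error BCH (Hamming(15,11)) syndrome decoder over GF(2^4).
# Illustration of low-t BCH decoding only; not the paper's inversion-less
# Peterson architecture, which also covers t = 2 and t = 3.

# GF(16) log/antilog tables generated by the primitive polynomial x^4 + x + 1.
EXP = [0] * 30
LOG = [0] * 16
x = 1
for i in range(15):
    EXP[i] = EXP[i + 15] = x
    LOG[x] = i
    x <<= 1
    if x & 0x10:
        x ^= 0x13   # reduce modulo x^4 + x + 1 (0b10011)

def syndrome(received_bits):
    """S1 = r(alpha): XOR of alpha^i over all set bit positions i."""
    s = 0
    for i, bit in enumerate(received_bits):
        if bit:
            s ^= EXP[i]
    return s

def correct_single_error(received_bits):
    """For t = 1 the error position is simply log(S1); flip that bit."""
    s = syndrome(received_bits)
    if s:
        received_bits[LOG[s]] ^= 1
    return received_bits

# Corrupt the all-zero codeword at position 7, then decode:
word = [0] * 15
word[7] = 1
correct_single_error(word)   # restores the all-zero codeword
```

For t = 2 and t = 3, Peterson's algorithm expresses the error locator coefficients as small closed-form expressions in the syndromes; the paper's contribution is reformulating those expressions so that the Galois field inversion they normally contain disappears.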
The computational complexity of the optimal maximum likelihood (ML) detector for spatial modulation increases rapidly as more transmit antennas or larger modulation orders are employed. Hence, ML detection may be infeasible for higher bit rates. This work proposes an improved suboptimal detection algorithm based on the Gaussian approximation method. It is demonstrated that the new method is closely related to the previously published signal vector based detection and the modified maximum ratio combiner, but can improve the detection performance compared to these methods. Furthermore, the performance of different signal constellations with suboptimal detection is investigated. Simulation results indicate that the performance loss compared to ML detection depends heavily on the signal constellation, where the recently proposed Eisenstein integer constellations are beneficial compared to classical QAM or PSK constellations.
Generative Design Software - How does digitalization change the professional profile of architects
(2019)
Abstract, poster, and talk
Mechanical properties after stretching tests were calculated and experimentally determined, via the Tempcore method, for the bar core, the bar surface, and the whole bar cross-section. Experiments and simulations showed that the deformations in the core and surface areas of a bar are equal, and that the structural parameters of the core area are therefore principally decisive for the onset of necking in the surface area. The results showed that the resistance to failure of the martensite surface layer has rather less effect on the bar properties in general than previous investigations suggested. It is concluded that improving the quality of the core structure can help to lower the brittleness of the whole bar. It was also shown that the techniques used provide good agreement between the obtained results and experimental data. Therefore, the rule of additivity for structural components can be used successfully to determine whole-bar parameters, taking into account the thickness of the surface layer, which can be measured easily using a hardness sensor. This will simplify the quality control of products in practice.
The recovery of our body and brain from fatigue depends directly on the quality of sleep, which can be determined from the results of a sleep study. The classification of sleep stages is the first step of such a study and involves measuring bio-vital data and processing them further. The non-invasive sleep analysis system is based on a hardware sensor network of 24 pressure sensors that enables sleep phase detection. The pressure sensors are connected to an energy-efficient microcontroller via a system-wide bus with address arbitration. A key difference between this system and other approaches is the innovative way the sensors are placed under the mattress. This property facilitates continuous use of the system without any perceptible effect on the familiar bed. The system was tested in experiments recording the sleep of several healthy young subjects. The first results indicate the potential to capture not only respiration rate and body movement but also heart rate.
This document presents an algorithm for non-obtrusive recognition of sleep/wake states using signals derived from ECG, respiration, and body movement captured while lying in bed. Multinomial logistic regression was chosen as the mathematical core of the system's data analytics. Derived parameters of the three signals serve as the input of the proposed method. The overall achieved accuracy is 84% for wake/sleep stages, with a Cohen's kappa of 0.46. The presented algorithm should support experts in analyzing sleep quality in more detail. The results confirm the potential of this method and disclose several ways to improve it.
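For readers unfamiliar with the reported metric, the sketch below shows how Cohen's kappa is computed from a wake/sleep confusion matrix: it compares the observed agreement with the agreement expected by chance. This is a generic illustration with hypothetical epoch counts, not the paper's data.

```python
# Sketch: Cohen's kappa from a confusion matrix (standard definition;
# the epoch counts below are hypothetical, not the paper's results).

def cohens_kappa(confusion):
    """confusion[i][j] = number of epochs with true class i, predicted class j."""
    total = sum(sum(row) for row in confusion)
    observed = sum(confusion[i][i] for i in range(len(confusion))) / total
    expected = sum(
        (sum(confusion[i]) / total) * (sum(row[i] for row in confusion) / total)
        for i in range(len(confusion)))
    return (observed - expected) / (1 - expected)

# Rows = true (Wake, Sleep), columns = predicted (Wake, Sleep):
kappa = cohens_kappa([[30, 20], [10, 140]])
```

The example shows why a high accuracy can coexist with a moderate kappa, as in the paper: with imbalanced wake/sleep classes, a large share of the raw agreement is already expected by chance.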
In this paper, the problem of controlling the dissolved oxygen (DO) level during an aerobic fermentation is considered. The proposed approach deals with three major difficulties: the nonlinear dynamics of the DO, the poor accuracy of empirical models for the oxygen consumption rate, and the fact that only sampled measurements are available online. A nonlinear integral high-gain control law including a continuous-discrete time observer is designed to keep the DO in the neighborhood of a set-point value without any knowledge of the dissolved oxygen consumption rate. The local stability of the control algorithm is proved using Lyapunov tools. The performance of the control scheme is first analyzed in simulation and then experimentally evaluated during a successful fermentation of the bacterium Pseudomonas putida mt-2 over a period of three days.
Why does a teacher use drama elements in her language classes? This article critically reflects on the author's experiences with employing drama elements in a compulsory class of German as a Foreign Language at college level. Motives for using drama are discussed with regard to possible positive effects for the teacher (such as rapport with learners and personal confidence) and potential benefits for the language learners (such as promoting vocabulary and grammar learning, fostering oral skills, and increasing motivation). After that, the drama activities used in class are briefly described. However, two years of experimentation have also revealed a number of problems with implementing drama elements in class and still leave questions open, requiring further iterations of the drama-based approach in this context. The paper was developed from a talk given at the Drama in Education Days in Konstanz in 2018.
Fatigue and drowsiness are responsible for a significant percentage of road traffic accidents. There are several approaches to monitoring a driver's drowsiness, ranging from the driver's steering behavior to analysis of the driver, e.g. eye tracking, blinking, yawning, or electrocardiogram (ECG). This paper describes the development of a low-cost ECG sensor that derives heart rate variability (HRV) data for drowsiness detection. The work includes the hardware and the software design. The hardware has been implemented on a printed circuit board (PCB) designed so that the board can be used as an extension shield for an Arduino. The PCB contains a double, inverted ECG channel including low-pass filtering and provides two analog outputs to the Arduino, which combines them and performs the analog-to-digital conversion. The digital ECG signal is transferred to an NVidia embedded PC where the processing takes place, including QRS-complex, heart rate, and HRV detection as well as visualization features. The resulting compact sensor provides good results in the extraction of the main ECG parameters. The sensor is being used in a larger framework, where facial-recognition-based drowsiness detection is combined with ECG-based detection to improve the recognition rate under unfavorable light or occlusion conditions.
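Once the QRS complexes are detected, HRV parameters follow from the series of RR intervals. The sketch below shows two standard time-domain HRV measures; it is our generic illustration of the textbook definitions, not the paper's processing pipeline, and the RR values are hypothetical.

```python
# Sketch: standard time-domain HRV parameters from RR intervals (ms).
# Textbook definitions for illustration; not the paper's pipeline.

import math

def sdnn(rr_ms):
    """Standard deviation of all RR (NN) intervals."""
    mean = sum(rr_ms) / len(rr_ms)
    return math.sqrt(sum((r - mean) ** 2 for r in rr_ms) / len(rr_ms))

def rmssd(rr_ms):
    """Root mean square of successive RR interval differences."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rr = [812, 798, 820, 805, 790]           # hypothetical RR intervals in ms
hr_bpm = 60000 / (sum(rr) / len(rr))     # mean heart rate from mean RR
hrv_sdnn = sdnn(rr)
hrv_rmssd = rmssd(rr)
```

Falling RMSSD and SDNN over time are among the HRV trends commonly associated with drowsiness, which is why such parameters are useful inputs for the detection stage.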
Compliance management has long since become a topic in medium-sized companies. But what is the right approach for an entrepreneur or company leader to organize adherence to rules, integrity, and responsible business conduct effectively and efficiently? What role do managers play? Starting from the question of what a medium-sized company can and should do in the area of compliance, and from the corresponding compliance management system, this article briefly outlines what matters most when implementing measures in everyday business. Training managers by means of high-quality dilemma training is given just as much importance as the ethical corporate culture that is thereby to be developed step by step.
The project aims at the development of a new material system made of high-tensile stainless steel wires as net material with environmentally compatible antifouling properties for offshore fish farm cages. To this end, current net materials made of textiles (polyamide) shall be partially replaced by high-strength stainless steel in order to obtain a more environmentally compatible system that withstands the more severe mechanical loads (waves, storms, predators such as sharks). With a new antifouling strategy, current issues such as ecological damage (e.g. due to copper disposal), high maintenance costs (e.g. for cleaning), and reduced durability shall be resolved.
Arbeitsrecht für Dummies
(2019)
Compliance im Personalwesen
(2019)
A company's success depends not only on qualified employees but also, to a large extent, on motivated, reliable, and upright ones. Many compliance risks are, in fact, rooted in misconduct by a company's own employees. Such risks can be minimized very simply by not hiring or promoting, in the first place, anyone who has committed criminal offences in the past or whose reliability and integrity can be doubted. But the facts are not always so obvious. It is therefore important for companies to take compliance into account and to integrate it into personnel management and HR processes.
Corporate Entrepreneurship (CE) has become the new paradigm by which organizations cope with the accelerated development of innovations. Established organizations in particular therefore increasingly implement CE activities, even in combination. Scholars point out that a coordinated portfolio of CE activities could yield synergies and thus higher value for the organization, and they call for more scientific examination. This literature review aims to improve the understanding of the combined and coordinated use of CE activities and of the resulting synergies. The results show that only very few studies have addressed a combination and/or coordination of CE activities with respect to the creation of additional value, and these without empirical analyses. Nevertheless, five categories of direct and indirect synergies could be derived. Discussing the results as well as the heterogeneous use of terminology and concepts, this paper concludes with a research agenda for future analyses.