Die Relevanzanalyse
(2023)
To capture company-specific risks, a risk analysis is indispensable; it must in turn be preceded by a relevance analysis. After the fundamentals, objectives, requirements, and approaches of the relevance analysis were presented in issue 10 of Compliance Berater 2023, pp. 400-404, the following article is devoted to carrying out the relevance analysis and provides guidance on its procedure and content.
Die Relevanzanalyse
(2023)
Proper corporate governance without adequate risk and compliance management is hardly conceivable any more. Case law, literature, politics, and society place (more or less) clear demands on proper corporate conduct and sanction actual (and alleged) violations of the rules. To capture company-specific risks, carrying out a risk analysis (Compliance Risk Assessment, CRA) is indispensable. The actual risk analysis must be preceded by a relevance analysis in order to approach the almost unmanageable set of potential risks that naturally accompanies business activities and to capture it in a workable form. If this relevance analysis is carried out and documented professionally and in a structured manner, it can make a valuable contribution to protecting against compliance violations and their sanctioning. The following article presents the fundamentals, objectives, requirements, and approaches of the relevance analysis. In a further article (to appear in CB 11/2023), the authors will address carrying out the relevance analysis and provide guidance on its procedure and content.
Sanktionen gegen Russland
(2023)
In response to the war of aggression against Ukraine in violation of international law, the EU has imposed extensive sanctions against Russia. The sanctions packages comprise, in particular, economic sanctions in the form of import and export restrictions that are relevant for German companies with direct or indirect business relations with Russia. At the forefront of the legal issue is the question of whether and when German companies violate EU sanctions. But German companies with subsidiaries in third countries also face the major challenge of scrutinizing the regulatory mechanism of the various sanctions packages so as not to expose themselves to the risk of being accused of circumventing the sanctions.
A key objective of this research is to take a more detailed look at a central aspect of resilience in small and medium-sized enterprises (SMEs). A literature review and expert interviews were used to investigate which factors have an impact on the innovative capacity of start-ups and whether these can also be adopted by SMEs. First, there are considerable structural and process-related differences between start-ups and SMEs, which can considerably inhibit cooperation between the two forms of enterprise. In the same context, however, success factors and issues in the start-up sector were also identified that can improve cooperation with SMEs. These and other findings are then discussed in both an economic and an academic context. This article was written as part of the research activities of the Smart Services Competence Centre (proper name: Kompetenzzentrum Smart Services), a central contact point for all questions in the area of smart service digitalization in Baden-Wuerttemberg. Here, companies can obtain information about various digital technologies and take advantage of various measures for the development of new ideas and innovative services (Kompetenzzentrum Smart Services BW: Über das Kompetenzzentrum, 2021).
Measuring cardiorespiratory parameters during sleep using non-contact sensors and the ballistocardiography technique has received much attention because the method is low-cost, unobtrusive, and non-invasive. Designing a user-friendly, simple-to-use, easily deployable, and less error-prone system remains open and challenging due to the complex morphology of the signal. In this work, using four force-sensitive resistor sensors, we conducted a study with four sensor distributions in order to simplify the complexity of the system by identifying the region of interest for heartbeat and respiration measurement. The sensors are deployed under the mattress and attached to the bed frame without any interference with the subjects. The four distributions comprise two linear horizontal, one linear vertical, and one square arrangement, covering the region that influences cardiorespiratory activities. We recruited 4 subjects and acquired data in four regular sleeping positions, each for a duration of 80 seconds. The signal processing was performed using the discrete wavelet transform (bior3.9) with a smoothing level of 4 as well as bandpass filtering. The results indicate a mean absolute error of 2.35 for respiration and 4.34 for heartbeat. The results recommend a triangle-shaped structure of three sensors for measuring heartbeat and respiration parameters in all four regular sleeping positions.
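As a rough illustration of the extraction step, the sketch below recovers a respiration rate from a simulated force-sensor trace using a plain moving-average low-pass filter and zero-crossing counting. The sampling rate, signal mix, and filter length are assumptions made for this example; the study's actual DWT-based pipeline is not reproduced here.

```python
# Sketch: respiration-rate estimation from a force-sensor signal.
# All parameters are illustrative assumptions, not the study's values.
import math

FS = 50  # sampling rate in Hz (assumed)

def moving_average(signal, window):
    """Simple low-pass filter: mean over a sliding window."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        seg = signal[max(0, i - half):i + half + 1]
        out.append(sum(seg) / len(seg))
    return out

def respiration_rate(signal, fs):
    """Count positive-going zero crossings of the mean-removed,
    low-pass-filtered signal and convert to breaths per minute."""
    smooth = moving_average(signal, window=fs)  # ~1 s window removes cardiac ripple
    mean = sum(smooth) / len(smooth)
    centered = [s - mean for s in smooth]
    crossings = sum(1 for a, b in zip(centered, centered[1:]) if a < 0 <= b)
    return crossings * 60 * fs / len(signal)

# Synthetic 80 s recording: 15 breaths/min plus a small cardiac component.
t = [i / FS for i in range(80 * FS)]
raw = [math.sin(2 * math.pi * 0.25 * x) + 0.1 * math.sin(2 * math.pi * 1.2 * x)
       for x in t]
print(respiration_rate(raw, FS))  # close to the true rate of 15 breaths/min
```

The 1-second averaging window strongly attenuates the ~1.2 Hz cardiac component while barely touching the ~0.25 Hz respiratory component, so the zero-crossing count reflects breathing alone.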
Monitoring heart rate and breathing is essential in understanding the physiological processes for sleep analysis. Polysomnography (PSG) systems have traditionally been used for sleep monitoring, but alternative methods can help to make sleep monitoring more portable in someone's home. This study conducted a series of experiments to investigate the use of pressure sensors placed under the bed as an alternative to PSG for monitoring heart rate and breathing during sleep. Further sets of experiments involved the addition of small rubber domes (transparent and black) that were glued to the pressure sensor. The resulting data were compared with the PSG system to determine the accuracy of the pressure sensor readings. The study found that the pressure sensor provided reliable data for extracting heart rate and respiration rate, with mean absolute errors (MAE) of 2.32 and 3.24 for respiration and heart rate, respectively. However, the addition of small rubber hemispheres did not significantly improve the accuracy of the readings, with MAEs of 2.3 breaths per minute for respiration rate and 7.56 beats per minute for heart rate, respectively. The findings of this study suggest that pressure sensors placed under the bed may serve as a viable alternative to traditional PSG systems for monitoring heart rate and breathing during sleep. These sensors provide a more comfortable and non-invasive method of sleep monitoring. However, the addition of small rubber domes did not significantly enhance the accuracy of the readings, indicating that it may not be a worthwhile addition to the pressure sensor system.
Sleep is an essential part of human existence, as we spend approximately a third of our lives in this state. Sleep disorders are common conditions that can affect many aspects of life. They are diagnosed in special laboratories with a polysomnography system, a costly procedure requiring much effort from the patient. Several systems have been proposed to address this situation by performing the examination and analysis at the patient's home, using sensors to record physiological signals that are automatically analysed by algorithms. This work aims to evaluate the use of a contactless respiratory recording system based on an accelerometer sensor for sleep apnea detection. For this purpose, an installation mounted under the bed mattress records the oscillations caused by chest movements during breathing. The presented processing algorithm filters the obtained signals and determines the presence of apnea events. The performance of the developed system and apnea detection algorithm (average accuracy, specificity, and sensitivity of 94.6%, 95.3%, and 93.7%, respectively) confirms the suitability of the proposed method and system for further ambulatory and in-home use.
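The thresholding idea behind such an apnea detector can be sketched as follows: flag intervals where the breathing-effort envelope collapses for at least ten seconds. The thresholds, sampling rate, and synthetic signal below are illustrative; the paper's actual filtering and detection logic is not reproduced here.

```python
# Sketch: rule-based apnea detection on a respiratory effort signal.
# All thresholds and durations are illustrative assumptions.
import math

FS = 10  # Hz, assumed sampling rate of the effort signal

def envelope(signal, fs, win_s=5):
    """Sliding-window peak-to-peak amplitude as a breathing-effort envelope."""
    w = int(win_s * fs)
    env = []
    for i in range(len(signal)):
        seg = signal[max(0, i - w // 2): i + w // 2 + 1]
        env.append(max(seg) - min(seg))
    return env

def detect_apnea(signal, fs, min_dur_s=10, frac=0.2):
    """Flag intervals where the envelope stays below `frac` of its median
    for at least `min_dur_s` seconds; return (start_s, end_s) events."""
    env = envelope(signal, fs)
    thresh = frac * sorted(env)[len(env) // 2]
    events, start = [], None
    for i, e in enumerate(env + [thresh + 1]):  # sentinel closes a trailing event
        low = e < thresh
        if low and start is None:
            start = i
        elif not low and start is not None:
            if (i - start) / fs >= min_dur_s:
                events.append((start / fs, i / fs))
            start = None
    return events

# Synthetic 60 s signal: normal breathing with a pause from t=25 s to t=45 s.
t = [i / FS for i in range(60 * FS)]
sig = [0.0 if 25 <= x < 45 else math.sin(2 * math.pi * 0.25 * x) for x in t]
print(detect_apnea(sig, FS))  # one event inside the 25-45 s pause
```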
Healthy sleep is one of the prerequisites for a good condition of the human body and brain, including general well-being. Unfortunately, there are several sleep disorders that can negatively affect this. One of the most common is sleep apnoea, in which breathing is impaired. Studies have shown that this disorder often remains undiagnosed. To avoid this, it is essential to develop a system that can be widely used in a home environment to detect apnoea and to monitor the changes once therapy has been initiated. The conceptualisation of such a system is the main aim of this research. After a thorough analysis of the available literature and the state of the art in this area, a concept of the system was created comprising the following main components: data acquisition (consisting of two parts), data storage, an apnoea detection algorithm, user and device management, and data visualisation. The modules are interchangeable, and interfaces have been defined for data transfer, most of which operate using the MQTT protocol. System diagrams and detailed component descriptions, including signal requirements and visualisation mockups, have also been developed. The system's design includes the concepts necessary for implementation and can be realised as a prototype in the next phase.
The influence of sleep on human health is enormous. Accordingly, sleep disorders can have a negative impact on it. To avoid this, they should be identified and treated in time. For this purpose, objective (using an appropriate device) or subjective (based on perceived values) measurement methods are used in sleep analysis to understand the problem. The aim of this work is to find out whether the two methods are interchangeable and can provide reliable results. To this end, a study was conducted with people aged over 65 (a total of 154 night-time recordings) in which both measurement methods were compared. Sleep questionnaires and electronic devices for sleep assessment placed under the mattress were applied to achieve the study aims. The obtained results indicated a correlation between both measurement methods for sleep characteristics such as total sleep time, total time in bed, and sleep efficiency. However, there are also significant differences in the absolute values of the two measurement approaches for some subjects and nights, which leads us to conclude that substitution is more likely to be an option for long-term monitoring, where the trends matter more than the absolute values of individual nights.
The principal objective of this study is to investigate the impact of perceived stress on traffic and road safety. Therefore, we designed a study that allows the generation and collection of stress-relevant data. Drivers often experience stress due to their perception of a lack of control during the driving process. This can lead to an increased likelihood of traffic accidents, driver errors, and traffic violations. To explore this phenomenon, we used the Perceived Stress Questionnaire (PSQ) to evaluate perceived stress levels during driving simulations and the EPQR questionnaire to determine the personality of the driver. With the presented study, participants can be categorised based on their emotional stability and personality traits. Wearable devices were utilised to monitor each participant's instantaneous heart rate (HR) due to their non-intrusive and portable nature. The findings of this study deliver an overview of the link between stress and traffic and road safety. They can be utilised for future research and for implementing strategies to reduce road accidents and promote traffic safety.
Development of an expert system to overpass citizens technological barriers on smart home and living
(2023)
Adopting new technologies can be overwhelming, even for people with experience in the field. For the general public, keeping up with new implementations, releases, brands, and enhancements can cause them to lose interest. There is a clear need to create central sources and platforms that provide helpful information about novel and smart technologies, assisting users, technicians, and providers with products and technologies. The purpose of these platforms is twofold, as they can gather and share information on interests common to manufacturers and vendors. This paper presents the "Finde-Dein-SmartHome" tool, developed in association with the Smart Home & Living competence center [5], to help users learn about, understand, and purchase available technologies that meet their home automation needs. This tool aims to lower the usability barrier and to guide potential customers in clearing their doubts about privacy and pricing. Communities can use the information provided by this tool to identify market trends that could eventually lower costs for providers and incentivize access to innovative home technologies and devices supporting long-term care.
The development of automatic solutions for the detection of physiological events of interest is booming. Improvements in the collection and storage of large amounts of healthcare data allow faster and more efficient access to these data. As a result, the development of artificial intelligence models for the detection and monitoring of a large number of pathologies is becoming increasingly common in the medical field. In particular, the development of deep learning models for detecting obstructive sleep apnea (OSA) events is at the forefront. Numerous scientific studies focus on the architecture of the models and the results these models can provide in terms of OSA classification and Apnea-Hypopnea Index (AHI) calculation. However, little attention is paid to other highly relevant aspects that are crucial for the training and performance of the models, among them the set of physiological signals used and the preprocessing tasks prior to model training. This paper covers the essential requirements that must be considered before training a deep learning model for obstructive sleep apnea detection and reviews solutions that currently exist in the scientific literature by analyzing the preprocessing tasks prior to training.
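Two preprocessing steps that appear in virtually every such pipeline are amplitude normalisation and segmentation into fixed-length windows. The sketch below shows both on a synthetic signal; the window length and sampling rate are typical choices assumed for illustration, not requirements from the paper.

```python
# Sketch: z-score normalisation and fixed-length windowing, two common
# preprocessing steps before training an OSA detector. Parameters are
# illustrative assumptions.
import math

def zscore(signal):
    """Normalise a signal to zero mean and unit variance."""
    n = len(signal)
    mean = sum(signal) / n
    var = sum((x - mean) ** 2 for x in signal) / n
    std = math.sqrt(var) or 1.0  # guard against a constant signal
    return [(x - mean) / std for x in signal]

def segment(signal, fs, win_s=30):
    """Split into non-overlapping windows of `win_s` seconds; drop the remainder."""
    w = int(win_s * fs)
    return [signal[i:i + w] for i in range(0, len(signal) - w + 1, w)]

fs = 4  # Hz, an assumed rate for a slow channel such as airflow
raw = [math.sin(0.1 * i) for i in range(10 * 60 * fs)]  # 10 minutes of data
windows = segment(zscore(raw), fs)
print(len(windows), len(windows[0]))  # 20 windows of 120 samples each
```

Thirty-second windows mirror the standard scoring epochs used in sleep studies, which makes window-level labels easy to align with clinical annotations.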
Small-scale hydropower has recently come under increasing public criticism because of its ecological impact and its comparatively low electricity generation. This article describes the assessment by small hydropower plant operators of the potential for increasing the efficiency of their existing plants through intelligent information networking along the course of the Radolfzeller Aach river in southern Baden-Württemberg, in order to increase the electricity generation of the individual plants.
Part-time models are popular and, for many employees, have a serious background: their life plans simply do not permit any other form of work. On the one hand, employers should accommodate the flexibility wishes of those concerned in order to foster a good working atmosphere and build loyalty to the company. On the other hand, they are (with few exceptions) obliged to make part-time work possible. One variant is "Brückenteilzeit" (bridge part-time), i.e. a previously agreed reduction of working hours that is (only) limited in time.
Recently published nonlinear model-based control approaches achieve impressive performance in complex real-world applications. However, due to model-plant mismatches and unforeseen disturbances, the model-based controller's performance is limited in full-scale applications. In most applications, low-level control loops mitigate the model-plant mismatch and the sensitivity to disturbances. But what is the influence of these low-level control loops? In this paper, we present the model predictive path integral (MPPI) control of a self-balancing vehicle and investigate the influence of subordinate control loops on closed-loop performance. Therefore, simulation and full-scale experiments are performed and analyzed. Subordinate control loops empower the MPPI controller because they dampen the influence of disturbances and thus improve the model's accuracy. This is the basis for the successful application of model-based control approaches in real-world systems. All in all, a model is used to design a low-level controller, then its closed-loop behavior is determined, and this model is used within the superimposed MPPI control loop – modeling for control and vice versa.
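The sampling-and-weighting idea behind MPPI can be shown on a toy plant. The sketch below steers a 1-D double integrator toward the origin by perturbing a nominal control sequence, rolling out each candidate, and averaging the candidates with weights exp(-cost/lambda). The plant, horizon, and tuning are illustrative and unrelated to the self-balancing vehicle in the paper.

```python
# Sketch: one MPPI controller for a 1-D double integrator.
# All parameters are illustrative assumptions.
import math, random

random.seed(0)
DT, HORIZON, SAMPLES, LAMBDA, SIGMA = 0.05, 30, 128, 1.0, 2.0

def rollout_cost(x, v, controls):
    """Simulate the double integrator and accumulate a quadratic cost."""
    cost = 0.0
    for u in controls:
        v += u * DT
        x += v * DT
        cost += x * x + 0.1 * v * v + 0.01 * u * u
    return cost

def mppi_step(x, v, nominal):
    """One MPPI update: perturb the nominal sequence, weight rollouts by
    exp(-cost/lambda), and return the weighted-average control sequence."""
    perturbed = [[u + random.gauss(0.0, SIGMA) for u in nominal]
                 for _ in range(SAMPLES)]
    costs = [rollout_cost(x, v, seq) for seq in perturbed]
    base = min(costs)  # subtract the best cost for numerical stability
    weights = [math.exp(-(c - base) / LAMBDA) for c in costs]
    total = sum(weights)
    return [sum(w * seq[t] for w, seq in zip(weights, perturbed)) / total
            for t in range(HORIZON)]

# Closed loop: apply the first control, then shift the sequence (warm start).
x, v, nominal = 5.0, 0.0, [0.0] * HORIZON
for _ in range(100):
    nominal = mppi_step(x, v, nominal)
    v += nominal[0] * DT
    x += v * DT
    nominal = nominal[1:] + [0.0]
print(round(x, 2))  # the state approaches the origin
```

In the full-scale setting described above, the dynamics inside `rollout_cost` would be the identified closed-loop model of the plant together with its subordinate controllers, which is exactly why those low-level loops improve the model's accuracy.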
IT-Governance
(2023)
The digital transformation considerably increases the influence of information technology on corporate success. This also raises the demands placed on the management system for IT in companies. A simple truth applies here: an unsuitable management system usually leads to poorer decisions.
Christopher Rentrop shows you clearly how to determine, in a goal-oriented way, who in the company should influence IT-relevant decisions and how:
- Fundamental objectives and success factors of IT governance
- Design elements of IT governance: structures and processes, decision rights, relational mechanisms, and more
- COBIT as a framework for IT governance
- Specific decision domains, fields of action, and responsibilities
- Management, further development, and performance measurement of IT governance
A concise guide that leads you step by step to an organizationally appropriate design of the IT management system.
In 3D extended object tracking (EOT), well-established models exist for tracking the object extent using various shape priors. With these models, however, a single update has to be performed for every measurement, leading to high computational runtime for high-resolution sensors. In this paper, we address this problem by using various model-independent downsampling schemes based on distance heuristics and random sampling as pre-processing before the update. We investigate the methods in a simulated and a real-world tracking scenario using two different measurement models with measurements gathered from a LiDAR sensor. We found that there is huge potential for speeding up 3D EOT: up to 95% of the measurements could be dropped in our investigated scenarios when using random sampling. Since random sampling, however, can also yield a subset that does not represent the total set well, leading to poor tracking performance, there is still a high demand for further research.
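The random-sampling variant of such a pre-processing step is a one-liner around the standard library, shown here on a fake point cloud; the point count and keep fraction are assumptions for illustration.

```python
# Sketch: model-independent random downsampling of a LiDAR point set
# before the measurement update, keeping only 5% of the points.
import random

random.seed(42)

def random_downsample(points, keep_fraction):
    """Return a uniformly random subset containing `keep_fraction` of the points."""
    k = max(1, int(len(points) * keep_fraction))
    return random.sample(points, k)

# A fake 10,000-point scan; each point is (x, y, z).
scan = [(random.random(), random.random(), random.random()) for _ in range(10_000)]
subset = random_downsample(scan, keep_fraction=0.05)
print(len(subset))  # 500 points passed on to the extended-object update
```

A distance-heuristic scheme would instead rank points, e.g. by spacing on the surface, before discarding; random sampling is cheaper but, as noted above, can occasionally produce an unrepresentative subset.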
Public-key cryptographic algorithms are an essential part of today's cyber security, since they are required for key exchange protocols, digital signatures, and authentication. But large-scale quantum computers threaten the security of the most widely used public-key cryptosystems. Hence, the National Institute of Standards and Technology (NIST) is currently running a standardization process for post-quantum secure public-key cryptography. One type of such system is based on the NP-complete problem of decoding random linear codes and is therefore called code-based cryptography. The best-known code-based cryptographic system is the McEliece system, proposed in 1978 by Robert McEliece. It uses a scrambled generator matrix as the public key and the original generator matrix as well as the scrambling as the private key. To encrypt a message, it is encoded in the public code and a random but correctable error vector is added. Only the legitimate receiver can correct the errors and decrypt the message using the knowledge of the private-key generator matrix. The original proposal of the McEliece system was based on binary Goppa codes, which are also considered for standardization. While those codes seem to be a secure choice, the public keys are extremely large, limiting the practicality of those systems. Many different code families have been proposed for the McEliece system, but many of them are considered insecure since attacks exist that use the known code structure to recover the private key. The security of code-based cryptosystems mainly depends on the number of errors added by the sender, which is limited by the error correction capability of the code. Hence, in order to obtain high security for relatively short codes, one needs a high error correction capability. Therefore, maximum distance separable (MDS) codes were proposed for those systems, since they are optimal for the Hamming distance.
In order to increase the error correction capability, we propose q-ary codes over different metrics. Many code families have a higher minimum distance in some metric other than the Hamming metric, leading to increased error correction capability over this metric. To make use of this, one needs to restrict not only the number of errors but also their values. In this work, we propose the weight-one error channel, which restricts the error values to weight one and can be applied for different metrics. In addition, we propose some concatenated code constructions that make use of this restriction of error values. For each of these constructions, we discuss the usability in code-based cryptography and compare them to other state-of-the-art code-based cryptosystems. The proposed code constructions show that restricting the error values allows for significantly lower public key sizes for code-based cryptographic systems. Furthermore, the use of concatenated code constructions allows for low-complexity decoding and therefore an efficient cryptosystem.
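The encryption step described above, codeword plus a deliberately bounded error, can be illustrated with a toy binary example. The tiny [7,4] Hamming code below merely stands in for the public code; a real McEliece key additionally hides the code structure through scrambling and permutation, which this sketch omits, and the code offers no security.

```python
# Toy sketch of McEliece-style encryption over GF(2): encode the message
# with the public generator matrix and add a random error vector of small
# Hamming weight. Purely illustrative; not a secure construction.
import random

random.seed(1)

# Generator matrix of the [7,4] Hamming code (stands in for the public key).
G = [
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def encode(msg, G):
    """Multiply the message row vector by G over GF(2)."""
    n = len(G[0])
    return [sum(m * G[i][j] for i, m in enumerate(msg)) % 2 for j in range(n)]

def encrypt(msg, G, t=1):
    """Ciphertext = codeword + random error vector of Hamming weight t."""
    c = encode(msg, G)
    for pos in random.sample(range(len(c)), t):
        c[pos] ^= 1
    return c

msg = [1, 0, 1, 1]
cipher = encrypt(msg, G, t=1)
codeword = encode(msg, G)
# The ciphertext differs from the codeword in exactly t positions; only a
# receiver able to correct t errors in this code recovers the message.
print(sum(a != b for a, b in zip(cipher, codeword)))  # 1
```

The thesis's weight-one error channel constrains the *values* of the error symbols (relevant for q-ary codes over other metrics); in this binary toy, every nonzero error symbol trivially has weight one.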
This thesis presents the development of two different state-feedback controllers to solve the trajectory tracking problem, in which the vessel needs to reach and follow a time-varying reference trajectory. This motion problem was addressed for a full-scale, fully actuated surface vessel whose dynamic model had unknown hydrodynamic and propulsion parameters; these were identified by applying an experimental maneuver-based identification process. The resulting dynamic model was then used to develop the controllers. The first was a backstepping controller, designed with a local exponential stability proof. The second, a nonlinear model predictive controller (NMPC), was developed to minimize the tracking error while considering the thrusters' constraints. Moreover, both controllers considered the thruster allocation problem and counteracted environmental disturbance forces such as current, waves, and wind. The effectiveness of these approaches was verified in simulation using Matlab/Simulink and GRAMPC (in the case of the NMPC) and in experimental scenarios, where they were applied to the vessel performing docking maneuvers on the Rhine River in Constance (Germany).
Despite the number of new tools and methodologies adopted in the road infrastructure sector, the performance of road infrastructure projects is not improving consistently. Considering that the volume of projects undertaken is forecast to increase every year, this is a substantial issue for the sector. This work therefore focuses on the principles of Blockchain Technology, the road infrastructure sector, and information exchange, with the aim of using the advantages of Blockchain Technology to help overcome the various challenges along the life cycle of road infrastructure projects.
Within the scope of this paper, two studies were conducted. First, focus groups were used to explore where the road infrastructure sector stands in terms of Industry 4.0 and to get a better understanding of if and where the principles of Blockchain Technology can be used when managing projects in the sector. Second, semi-structured interviews were administered with experts from the road infrastructure sector and experts in Blockchain Technology to better understand the interrelation between these two areas. Based on the outcome of the two studies, technology barriers and enablers were explored for the purpose of improved information exchange within the road infrastructure sector.
The two studies revealed that there are significant and strong interrelations between the principles of Blockchain Technology, project management within the road infrastructure sector, and information exchange. These interrelations are complex and diverse, but overall it can be concluded that adopting the principles of Blockchain Technology in the field of information exchange improves the management of road infrastructure projects. Based on the two studies, a theoretical framework was developed.
In summary, this research showed that trust is an important factor and forms the foundation for communication and proper information exchange. Within the scope of this thesis, it was demonstrated that the principles of Blockchain Technology can be used to increase transparency, traceability, and immutability in the area of information exchange during the life cycle of road infrastructure projects.
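The tamper-evidence property that underpins these transparency and immutability claims can be demonstrated with a minimal hash chain, the core data structure of any blockchain. The record contents below are made up for illustration.

```python
# Sketch: a minimal hash chain illustrating why blockchain-based
# information exchange makes project records tamper-evident.
import hashlib, json

def add_record(chain, payload):
    """Append a record whose hash covers the payload and the previous hash."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    body = json.dumps({"prev": prev, "payload": payload}, sort_keys=True)
    chain.append({"prev": prev, "payload": payload,
                  "hash": hashlib.sha256(body.encode()).hexdigest()})

def verify(chain):
    """Recompute every hash; any modified record breaks the chain."""
    prev = "0" * 64
    for rec in chain:
        body = json.dumps({"prev": prev, "payload": rec["payload"]}, sort_keys=True)
        if rec["prev"] != prev or rec["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        prev = rec["hash"]
    return True

chain = []
add_record(chain, {"doc": "soil report v1", "owner": "contractor A"})
add_record(chain, {"doc": "design change 17", "owner": "planner B"})
print(verify(chain))   # True
chain[0]["payload"]["doc"] = "soil report v2 (tampered)"
print(verify(chain))   # False: the altered record no longer matches its hash
```

Because each record's hash covers its predecessor, silently editing one document invalidates every later entry, which is the traceability guarantee the thesis attributes to blockchain-based information exchange.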
Foil-air bearings (FABs) are predominantly used for high-speed, oil-free applications. While they offer many advantages, such as low friction losses at high speeds, stability, and low cost, they lack load capacity as well as resistance to friction wear during start-up and coast-down.
The friction losses of FABs have been studied experimentally by many authors. In order to predict the friction and, consequently, the lifespan of a FAB, the start-up and coast-down regimes are modelled in a way that allows accurate, efficient simulation and later optimisation of lift-off speed and wear characteristics. The proposed simulation method applies the Kirchhoff-Love plate theory to the top foil mapping [20]. This system of differential equations is coupled with the underlying compliant foil to simulate the displacement due to the pressure build-up. Consequently, this coupled system allows simulation from almost zero revolutions per minute (rpm) up to full speed. The underlying simulation model uses the finite difference method for spatial discretisation and an explicit Runge-Kutta method in time.
Difficulties to overcome include the smooth combination of various friction regimes across the sliding surfaces as well as the synchronous coupling of the Reynolds, deformation, and kinematic equations with highly non-linear terms. Introducing an exponential pressure component based on Greenwood and Tripp's theory avoids impingement between the rotor and the foil.
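The role of such an exponential pressure component can be sketched as a soft penalty that is negligible at normal film heights and grows steeply as the rotor-foil gap approaches zero. The functional form and both constants below are illustrative assumptions in the spirit of an asperity-contact model, not the formulation used in the paper.

```python
# Sketch: an exponential asperity-contact pressure term used as a soft
# penalty against rotor-foil impingement. p0 and h_ref are assumed values.
import math

P0 = 1.0e5      # reference contact pressure in Pa (assumed)
H_REF = 5.0e-6  # roughness-scale reference gap in m (assumed)

def contact_pressure(h):
    """Contact pressure that decays exponentially with film thickness h."""
    return P0 * math.exp(-h / H_REF)

# Negligible at normal film heights, dominant near contact:
for h in (20e-6, 5e-6, 1e-6):
    print(f"h = {h:.0e} m -> p_c = {contact_pressure(h):.3g} Pa")
```

Added to the aerodynamic pressure in the coupled system, a term of this shape keeps the gap strictly positive without introducing a hard, non-smooth contact condition.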
Under certain contact conditions between wheel and rail, self-excited vibrations can be induced that lead to anti-phase rotational movements of the wheel discs and high torsional moments in the wheelset axle. Until now, elaborate test runs have been required to determine the maximum torsional moment, since no methods were known that allow a conservative calculation of the torsional moment [1]. In recent years, the following three calculation methods for the maximum dynamic torsional moment have been investigated in depth:
- Simulations of complex multibody systems (MBS)
- Systems of differential equations solved numerically
- Analytical calculation by reduction to a minimal model
This publication presents these calculation methods in more detail and, by comparing the calculated and measured results in each case, shows their possibilities as well as their limitations.
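The minimal-model idea can be sketched as two wheel discs coupled by a torsionally elastic shaft and excited in anti-phase, integrated with a simple semi-implicit scheme. All parameter values below are illustrative assumptions, not values from the publication.

```python
# Sketch: minimal torsional wheelset model - two discs (inertia J) coupled
# by a shaft of stiffness K and damping D, driven by an anti-phase torque.
# All parameters are assumed, illustrative values.

J = 60.0        # disc inertia in kg*m^2 (assumed)
K = 1.2e6       # shaft torsional stiffness in N*m/rad (assumed)
D = 50.0        # torsional damping in N*m*s/rad (assumed)
T_EXC = 5000.0  # anti-phase excitation torque in N*m (assumed)
DT = 1e-4       # time step in s

def simulate(steps):
    """Semi-implicit Euler integration; returns the peak shaft torque K*(phi1-phi2)."""
    phi1 = phi2 = w1 = w2 = 0.0
    peak = 0.0
    for _ in range(steps):
        twist = phi1 - phi2
        shaft = K * twist + D * (w1 - w2)
        a1 = (T_EXC - shaft) / J
        a2 = (-T_EXC + shaft) / J
        w1 += a1 * DT          # update velocities first (symplectic Euler),
        w2 += a2 * DT          # which keeps the lightly damped oscillation stable
        phi1 += w1 * DT
        phi2 += w2 * DT
        peak = max(peak, abs(K * twist))
    return peak

print(f"peak shaft torque: {simulate(50_000):.0f} N*m")  # roughly 2*T_EXC for light damping
```

Such a minimal model reproduces the key qualitative result of the anti-phase mode: the dynamic peak torque of a lightly damped system overshoots to nearly twice the static value, which is why a conservative analytical bound is of practical interest.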
The aim of the research project "Ekont" is to develop a hand-guided device for concrete removal at inner edges and flaws in nuclear power plants (NPPs). To reduce the reaction forces, the novel approach of a counter-rotating milling process is investigated. The result is a gear solution in which a middle milling disc rotates at approximately the same circumferential speed in the opposite direction to the other milling discs.
In past years, algorithms for 3D shape tracking using radial functions in spherical coordinates, represented with different methods, have been proposed. However, in many dynamic scenarios mainly measurements from the lateral surface of the target can be expected and only a few from the top and bottom parts, leading to an error-prone shape estimate in the top and bottom regions when a representation in spherical coordinates is used. We therefore propose to represent the shape of the target using a radial function in cylindrical coordinates, as this only represents regions of the lateral surface and no information from the top or bottom parts is needed. In this paper, we use a Fourier-Chebyshev double series for 3D shape representation, since a mixture of Fourier and Chebyshev series is a suitable basis for expanding a radial function in cylindrical coordinates. We investigate the method in a simulated and a real-world maritime scenario with a CAD model of the target boat as a reference. We found that shape representation in cylindrical coordinates has decisive advantages over a shape representation in spherical coordinates and should preferably be used if no prior knowledge of the measurement distribution on the surface of the target is available.
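Evaluating such a representation is cheap: the radius at azimuth phi and normalised height z is a double sum of cosine terms times Chebyshev polynomials. The sketch below uses a tiny, made-up coefficient set (a slightly tapered elliptical hull) and only the cosine half of the Fourier basis; a real estimator would also carry sine terms and estimate the coefficients from measurements.

```python
# Sketch: evaluating a radial function r(phi, z) expanded as a
# Fourier (azimuth) x Chebyshev (height) double series.
# The coefficients are made up for illustration.
import math

def chebyshev(n, z):
    """Chebyshev polynomial of the first kind T_n(z) via the recurrence."""
    t_prev, t = 1.0, z
    if n == 0:
        return t_prev
    for _ in range(n - 1):
        t_prev, t = t, 2.0 * z * t - t_prev
    return t

# coeffs[(m, n)] multiplies cos(m*phi) * T_n(z); a sin() table is analogous.
coeffs = {(0, 0): 2.0, (2, 0): 0.3, (0, 1): -0.2}

def radius(phi, z):
    """Radial extent of the lateral surface at azimuth phi, height z in [-1, 1]."""
    return sum(c * math.cos(m * phi) * chebyshev(n, z)
               for (m, n), c in coeffs.items())

print(round(radius(0.0, 0.0), 2))          # 2.3: widest at phi=0, mid-height
print(round(radius(math.pi / 2, 1.0), 2))  # 1.5: narrower side, tapered top
```

Because z only parameterises the lateral surface, no basis function ever has to describe the top or bottom caps, which is exactly the advantage over a spherical-coordinate expansion when those regions are unobserved.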
In the capital goods industry, service is nowadays usually still performed manually and on site at the customer's premises. This requires qualified service technicians with the necessary product and process knowledge. For small and medium-sized enterprises (SMEs) in the capital goods industry, internationalization in particular poses a challenge, since qualified service technicians are a scarce resource that must be deployed as effectively and efficiently as possible. To this end, a solution was developed within the SerWiss project that enables SMEs to efficiently generate and structure service-relevant knowledge, to provide it at the point of service, and to market it within suitable business models. The article explains how this captured knowledge can be used as a customer-oriented value proposition and monetized in corresponding business models.
The transformation to Industry 4.0, which is generally seen as a solution to increasing market challenges, is forcing companies to radically change their way of thinking and to be open to new forms of cooperation. In this context, the opening-up of the innovation process is widely seen as a necessity to meet these challenges, especially for small and medium-sized enterprises (SMEs). The aim of the study therefore is to analyze how cooperation can be characterized today, how this character has changed since the establishment of the term Industry 4.0 at the Hanover Fair in 2011, and which cooperation strategies have proven successful. The analysis consists of a quantitative secondary data analysis that includes country-specific data from 35 European countries for 2010 and 2016 collected by the European Commission and the OECD. The research, focusing on the secondary sector, shows that multinational enterprises (MNEs) still tend to cooperate more than SMEs, with a slight overall trend towards protectionism. Nevertheless, there is a clear tendency towards the opening-up of SMEs. In this regard, universities, competitors, and suppliers in particular have become increasingly attractive as cooperation partners for SMEs.
Jahresbericht 2023
(2023)
Scheinselbständigkeit
(2023)
Cities around the world are facing the implications of a changing climate as an increasingly pressing issue. The negative effects of climate change are already being felt today. Therefore, adaptation to these changes is a mission that every city must master. Leading practices worldwide demonstrate various urban efforts on climate change adaptation (CCA) which are already underway. Above all, the integration of climate data, remote sensing, and in situ data is key to a successful and measurable adaptation strategy. Furthermore, these data can act as a timely decision support tool for municipalities to develop an adaptation strategy, decide which actions to prioritize, and gain the necessary buy-in from local policymakers. The implementation of agile data workflows can facilitate the integration of climate data into climate-resilient urban planning. Due to local specificities, (supra)national, regional, and municipal policies and (by-)laws, as well as geographic and related climatic differences worldwide, there is no single path to climate-resilient urban planning. Agile data workflows can support interdepartmental collaboration and, therefore, need to be integrated into existing management processes and government structures. Agile management, which has its origins in software development, can be a way to break down traditional management practices, such as static waterfall models and sluggish stage-gate processes, and enable the increased level of flexibility and agility required in urgent situations. This paper presents the findings of an empirical case study conducted in cooperation with the City of Constance in southern Germany, which is pursuing a transdisciplinary and trans-sectoral co-development approach to make management processes more agile in the context of climate change adaptation.
The aim is to present a possible way of integrating climate data into CCA planning by changing the management approach and implementing a toolbox for low-threshold access to climate data. The city administration, in collaboration with the University of Applied Sciences Constance, the Climate Service Center Germany (GERICS), and the University of Stuttgart, developed a co-creative and participatory project, CoKLIMAx, with the objective of integrating climate data into administrative processes in the form of a toolbox. One key element of CoKLIMAx is the involvement of the population, the city administration, and political decision-makers through targeted communication and regular feedback loops among all involved departments and stakeholder groups. Based on the results of a survey of 72 administrative staff members and a literature review on agile management in municipalities and city administrations, recommendations on a workflow and communication structure for cross-departmental strategies for resilient urban planning in the City of Constance were developed.
Development of a process for preliminary software function development on the vehicle using Matlab/Simulink, in order to ensure a proof of feasibility for new functions before the start of series software development. The process comprises requirements elicitation, system development in the hydraulics and electrics domains, software function development in Matlab/Simulink, and function testing on the vehicle.
The eFlow project, in which HTWG Konstanz among others has been conducting research since 2012, uses a mathematical simulation to model how crowds behave when they are to leave a given area. The simulation is based on a finite element approach in which several coupled differential equations must be solved. These computations prove to be very expensive, especially for complex scenarios with large areas and many people. The goal of this bachelor's thesis is to build a surrogate model that predicts simulation results based on machine learning approaches, specifically regression methods. To this end, data sets must be generated: the simulation is run repeatedly, the input parameters that are to enter the regression model are varied, and each parameter set is linked with the corresponding simulation result. The regression approaches become more complex with each iteration, as additional input parameters are included in the data generation. The aim is to verify whether the simulation can be reproduced with machine learning approaches. Based on these surrogate models, it should become possible to assess situations in real time without resorting to the computationally expensive simulation. The results confirm that the mathematical simulation can be reproduced by regression. However, collecting enough data to include a sufficient number of input parameters in the regression method proves to be very computationally expensive. This thesis therefore constitutes a preliminary study for the implementation of a mature surrogate model that can take all input parameters of the simulation into account.
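The surrogate workflow described in the abstract (sample simulation runs with varied inputs, fit a regression model, predict in real time) can be sketched as follows. This is a minimal illustration assuming NumPy; the `expensive_simulation` stand-in and all parameter names are hypothetical and do not reflect the actual eFlow model.

```python
import numpy as np

def expensive_simulation(density, exit_width):
    # Stand-in for the FEM crowd simulation: evacuation time grows with
    # crowd density and shrinks with exit width (purely illustrative formula).
    return 10.0 * density / exit_width + 0.5 * density ** 2

# 1) Generate training data by repeated "simulation" runs with varied inputs.
rng = np.random.default_rng(0)
densities = rng.uniform(0.5, 4.0, 200)
widths = rng.uniform(1.0, 3.0, 200)
y = expensive_simulation(densities, widths)

# 2) Fit a simple polynomial regression surrogate via least squares.
X = np.column_stack([np.ones_like(densities), densities, widths,
                     densities ** 2, densities / widths])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# 3) Predict in "real time" without running the simulation again.
def surrogate(density, exit_width):
    x = np.array([1.0, density, exit_width, density ** 2, density / exit_width])
    return float(x @ coef)

print(round(surrogate(2.0, 1.5), 2))  # → 15.33, matching the expensive run
```

Because the toy basis happens to contain the true terms, the surrogate reproduces the stand-in exactly; for the real FEM simulation, accuracy would depend on the chosen regression method and the amount of generated data, which is exactly the trade-off the thesis investigates.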
Der Kundenservice von morgen
(2023)
Digital self-service in retail and other service sectors is changing the world of consumption. Self-services are increasingly used by consumers of all age groups. Retailers must rethink their service channels and rely more on self-service as a customer touchpoint; other industries have already implemented solutions in this regard. Against this background, the article analyzes the use of self-service solutions as a function of generational affiliation and provides recommendations for action for retail SMEs.
This thesis addresses problems that reports generated by vulnerability scanners impose on the process of vulnerability management, namely (a) an overwhelming amount of data and (b) an insufficient prioritization of the scan results.
To assist in developing means to counteract those problems and to allow for a quantitative evaluation of their solutions, two metrics are proposed for their effectiveness and efficiency. These metrics imply a focus on higher-severity vulnerabilities and can be applied to any simplification process of vulnerability scan results, given that it relies on a severity score and a remediation time estimate for each vulnerability.
A priority score is introduced that aims to improve the widely used Common Vulnerability Scoring System (CVSS) base score of each vulnerability, based on the vulnerability's ease of exploitation, its estimated probability of exploitation, and the probability of its existence.
Patterns between vulnerabilities are discovered within the reports generated by the Open Vulnerability Assessment System (OpenVAS) scanner, identifying criteria by which they can be categorized from a remediating actor's standpoint. These categories lay the groundwork of a final simplified report and consist of updates that need to be installed on a host, severe vulnerabilities, vulnerabilities that occur on multiple hosts, and vulnerabilities whose remediation will take a lot of time. The highest potential time savings are found in frequently occurring vulnerabilities and in minor and major suggested updates.
Processing the results provided by the vulnerability scanner and creating the report is realized in the form of a Python script. The resulting reports are short and to the point and provide a top-down remediation process which should, in theory, minimize the institution's attack surface as fast as possible. An evaluation of its practicality must follow, as the reports are yet to be introduced into the information security management lifecycle.
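The categorization and top-down prioritization described above might be sketched roughly as follows. All field names, thresholds, and sample findings are hypothetical illustrations, not the thesis's actual schema or the OpenVAS report format.

```python
from collections import defaultdict

# Hypothetical scan findings; fields mimic a CVSS-like severity score,
# a remediation time estimate in hours, and the affected host.
findings = [
    {"name": "OpenSSL update", "host": "web1", "severity": 7.5, "hours": 0.5},
    {"name": "OpenSSL update", "host": "web2", "severity": 7.5, "hours": 0.5},
    {"name": "SMB signing off", "host": "dc1", "severity": 5.3, "hours": 2.0},
    {"name": "Legacy RDP", "host": "ts1", "severity": 9.8, "hours": 8.0},
]

def categorize(finding):
    # Assumed thresholds, chosen for illustration only.
    if "update" in finding["name"].lower():
        return "updates"
    if finding["severity"] >= 7.0:
        return "severe"
    if finding["hours"] >= 4.0:
        return "time-consuming"
    return "other"

# Group findings into report sections ...
report = defaultdict(list)
for finding in findings:
    report[categorize(finding)].append(finding)

# ... and sort each section for a top-down remediation order:
# highest severity first.
for items in report.values():
    items.sort(key=lambda f: -f["severity"])
```

In this toy input, the two identical update findings collapse into one section, which is where the thesis locates the largest time savings.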
Die Nibelungenbrücke Worms
(2020)
The capital goods industry, which is dominated by small and medium-sized enterprises, faces major challenges due to increasing internationalization in the service business. Staff shortages, high process costs, and inadequate knowledge management make service a potential brake on growth, both for individual companies and for the economy as a whole. At the same time, digitalization opens up considerable benefit potential in the service business. The goal of the integrated approach developed in the SerWiss project is to enable small and medium-sized providers of capital goods to efficiently generate, structure, and internationally provide and market service knowledge on the basis of a digital solution while ensuring humane work design.
In capital goods service, knowledge has long since become a central lever for success, both for increasing process effectiveness and efficiency and as the foundation for valuable business models. For small and medium-sized enterprises in the capital goods industry, however, managing service-relevant knowledge is often a considerable challenge that extends far beyond IT aspects. Against this background, a comprehensive solution approach was developed in the project "SerWiss", (co-)funded by the BMBF and the ESF, and prototypically implemented at two project partners from the capital goods industry.
The capital goods industry, which is dominated by SMEs, faces major challenges due to increasing internationalization in the service business, staff shortages, high process costs, and a lack of knowledge management. At the same time, digitalization opens up considerable benefit potential in the service business. Against this background, an integrated approach based on the methods Intelligent Swarming and Knowledge Centered Service was developed that enables SMEs from the capital goods industry to efficiently generate, structure, and internationally market service knowledge.
Bauen nach dem Bauhaus
(2018)
Random matrices are used to filter the center of gravity (CoG) and the covariance matrix of measurements. However, these quantities do not always correspond directly to the position and the extent of the object, e.g. when a lidar sensor is used. In this paper, we propose a Gaussian process regression model (GPRM) to predict the position and extension of the object from the filtered CoG and covariance matrix of the measurements. Training data for the GPRM are generated by a sampling method and a virtual measurement model (VMM). The VMM is a function that generates artificial measurements using ray tracing and allows us to obtain the CoG and covariance matrix that any object would cause. This enables the GPRM to be trained without real data but still be applied to real data due to the precise modeling in the VMM. The results show an accurate extension estimate as long as reality behaves like the model and, e.g., lidar measurements only occur on the side facing the sensor.
Equity fund management strives for efficient mixes of stocks. Once these have been determined by optimization methods, they often have to be adjusted for economic or legal reasons, with the consequence that the solutions are no longer efficient. For a publicly offered equity fund, one legal reason can be Article 52(2) of EU Directive 2009/65/EC or § 206 of the German Investment Code (KAGB). Part of the directive states, for example, that no more than 10% of the budget may ever be invested in a single stock. Taken together, these rules are also known as the 5-10-40 rule. To integrate such risk constraints into portfolio optimization, two optimization models were developed: a quadratic one and a linear one. The models were tested on historical return data of the HDAX. The linear model shows that the requirements of the EU directive achieve the intended volatility reduction. This risk restriction has a price, however, which can be expressed in the currencies of "return loss" and "volatility increase". At the same volatility, the portfolio not restricted by the 5-10-40 rule achieved an annual return roughly 10% higher. The "volatility increase" is small in the vicinity of the minimum volatility point (MVP), but can amount to up to 25% when portfolios determined under the 5-10-40 rule are compared with unrestricted optimized portfolios at the same return. The quadratic model builds on the approach of H. Markowitz and shows a more flexible way of limiting risk that leads to comparable results.
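The 5-10-40 rule mentioned above (no single position above 10% of the budget, and positions above 5% together capped at 40%) can be checked for a given weight vector in a few lines. A minimal sketch in Python; the function name and tolerance handling are illustrative assumptions, not part of the paper's optimization models.

```python
def complies_5_10_40(weights, tol=1e-9):
    """Check the UCITS 5-10-40 rule: no single position above 10% of the
    portfolio, and all positions above 5% may together not exceed 40%."""
    # Single-issuer cap: 10% per position.
    if any(w > 0.10 + tol for w in weights):
        return False
    # Concentration cap: positions above 5% sum to at most 40%.
    large = sum(w for w in weights if w > 0.05 + tol)
    return large <= 0.40 + tol

print(complies_5_10_40([0.10] * 4 + [0.05] * 12))  # → True (4 x 10% = 40%)
print(complies_5_10_40([0.12] + [0.08] * 11))      # → False (12% > 10% cap)
```

In the paper's setting, such constraints enter the linear and quadratic optimization models as linear restrictions on the weight vector rather than as an after-the-fact check.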
The digital transformation of business processes and the stronger integration of IT systems create both opportunities and risks for small and medium-sized enterprises (SMEs): risks that can lead to a lack of IT governance, risk, and compliance (GRC). The aim of this article is to present the design and evaluation phases of creating an artifact, following Hevner's design science research approach. The artifact is developed for the selection of standards by adapting SME-relevant characteristics and existing frameworks to the defined criteria.
Contemporary empirical applications frequently require flexible regression models for complex response types and large tabular or non-tabular data, including images or text. Classical regression models either break down under the computational load of processing such data or require additional manual feature extraction to make these problems tractable. Here, we present deeptrafo, a package for fitting flexible regression models for conditional distributions using a tensorflow backend with numerous additional processors, such as neural networks, penalties, and smoothing splines. Package deeptrafo implements deep conditional transformation models (DCTMs) for binary, ordinal, count, survival, continuous, and time series responses, potentially with uninformative censoring. Unlike other available methods, DCTMs do not assume a parametric family of distributions for the response. Further, the data analyst may trade off interpretability and flexibility by supplying custom neural network architectures and smoothers for each term in an intuitive formula interface. We demonstrate how to set up, fit, and work with DCTMs for several response types. We further showcase how to construct ensembles of these models, evaluate models using inbuilt cross-validation, and use other convenience functions for DCTMs in several applications. Lastly, we discuss DCTMs in light of other approaches to regression with non-tabular data.
This paper aims to apply the basics of the Service-Dominant Logic, especially the concept of creating benefits through serving, to the stationary retail industry. In the industrial context, the shift from a product-driven point of view to a service-driven perspective has been discussed widely. However, there are only a few accounts of how this can be applied to the retail sector at the B2C level and of how retailers can use smart services to enable customer engagement, loyalty, and retention. Customers' expectations towards future stationary retail are developing significantly as consumers have become accustomed to the comfort of online shopping. Especially the younger generation, Generation Z, seems to have shifted its priorities from the bare purchase of products to an experience- and service-driven approach when shopping over the counter. To stay successful in the long term, companies from this sector need to adapt to the expectations of their future main customer group. Therefore, this paper analyses the specific needs of Generation Z and explains how smart services contribute to creating benefit for this customer group and how this affects the economic sustainability of these firms.
As one of the most important branches of industry in Germany and the European Union, the mechanical and plant engineering sector is confronted with fundamental changes due to ever shorter innovation cycles and increased competitive pressure. This makes it all the more important to increase the share of service components in business models with a low service level, which are still frequently found in SMEs. This paper is dedicated to the changes that the individual components of a business model have experienced and will experience. Special attention is paid to economic sustainability, since service business models can also positively influence the longevity of a business. Seven interviews conducted with relevant companies serve as the empirical basis of this paper. The analysed effects of smart services and active customer integration are structured and summarized within the three pillars of every business model: the value proposition, the value creation architecture, and the revenue mechanics.
Sanctions encompass a wide set of policy instruments restricting cross‐border economic activities. In this paper, we study how different types of sanctions affect the export behavior of firms to the targeted countries. We combine Danish register data, including information on firm‐destination‐specific exports, with information on sanctions imposed by Denmark from the Global Sanctions Database. Our data allow us to study firms' export behavior in 62 sanctioned countries, amounting to a total of 453 country‐years with sanctions over the period 2000–2015. Methodologically, we apply a two‐stage estimation strategy to properly account for multilateral resistance terms. We find that, on average, sanctions lead to a significant reduction in firms' destination‐specific exports and a significant increase in firms' probability to exit the destination. Next, we study heterogeneity in the effects of sanctions across (i) sanction types and sanction packages, (ii) the objectives of sanctions, and (iii) countries subject to sanctions. Results confirm that the effects of sanctions on firms' export behavior vary considerably across these three dimensions.
The consideration of ecological and social aspects in the conception, planning, and construction of buildings has gained great influence on the marketability of real estate in recent years. Regulatory frameworks such as the European Union's Taxonomy Regulation clearly require the construction industry to give greater weight to the protection of people and nature. Only with a substantial contribution to the European Union's climate targets will the industry be able to secure unrestricted access to the investor market in the long term.
This thesis examines the criteria catalogue of the German Sustainable Building Council (Deutsche Gesellschaft für Nachhaltiges Bauen e.V., DGNB) and identifies overlaps with the technical screening criteria of the EU Taxonomy Regulation. The criteria catalogue published in spring 2023 comprises a large number of criteria against which buildings are assessed for sustainability. Compared to the previous version from 2018, considerable changes have been incorporated. Particularly noteworthy are new technical assessment criteria in the areas of climate protection, resource extraction, biodiversity, and the circular economy. The alignment of the calculation method for life cycle assessments with the federal "Qualitätssiegel Nachhaltiges Gebäude" (Sustainable Building Quality Seal), the minimum requirement for increased use of sustainably grown timber, the verification of specific target quotas for the use of recycled concrete, and requirements for circularity are only some of the innovations. For the additional requirements, project developers must expect extra costs in the high six-figure range compared to the previous version. An advantage of the new edition of the criteria catalogue is its increased alignment with the sustainability requirements of the European Union. However, not all requirements are met: evidence of primary energy demand and of the pollutant load of components and materials, as well as an environmental impact assessment, must be provided in addition to the DGNB criteria catalogue. Overall, however, the DGNB criteria pave the way towards EU conformity and help project developers position real estate successfully on the market.
Efficient energy use is an ongoing issue that affects not only private households but also institutions and companies. This bachelor's thesis deals with the intelligent control of heating energy for non-residential buildings, with the goal of saving energy and the resulting costs. First, existing concepts are surveyed through theoretical research. Then, an own concept for an intelligent, predictive control is built and simulated in MATLAB Simulink: the indoor air temperature of a room in a non-residential building is to be regulated to a desired setpoint using a model predictive controller (MPC). Finally, this controller is compared with a conventional PID controller. The results show that the predictive control yields a considerably better temperature profile than the conventional control. The room temperature stays within the desired setpoint range, but the results show no significant energy savings; with future extensions of the MPC, however, this should definitely be possible. For this reason, and because of the more accurate temperature control, the use of MPC controllers in non-residential buildings is recommended.
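As a rough illustration of the conventional baseline the thesis compares against, a discrete PID loop on a toy first-order room model can be sketched as follows. The plant parameters and controller gains are invented for this sketch and have nothing to do with the thesis's MATLAB Simulink model.

```python
# Toy room model: C * dT/dt = heat_input - loss * (T - T_out),
# regulated by a discrete PID controller (derivative gain set to zero here).
def simulate_pid(setpoint=21.0, t_out=5.0, steps=2000, dt=10.0):
    kp, ki, kd = 2000.0, 1.0, 0.0       # illustrative controller gains
    capacity, loss = 5.0e5, 50.0        # assumed J/K and W/K values
    temp = 15.0                         # initial room temperature in °C
    integral, prev_err = 0.0, setpoint - temp
    for _ in range(steps):
        err = setpoint - temp
        integral += err * dt
        deriv = (err - prev_err) / dt
        heat = max(0.0, kp * err + ki * integral + kd * deriv)  # heater only heats
        temp += (heat - loss * (temp - t_out)) * dt / capacity  # Euler step
        prev_err = err
    return temp

print(round(simulate_pid(), 1))  # → 21.0, the room settles at the setpoint
```

The integral term removes the steady-state offset a pure proportional controller would leave; an MPC, by contrast, would optimize the heat input over a prediction horizon of the same model, which is what allows it to act in advance rather than react to the current error.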
Despite the negative trend in the traditional camera market that has persisted for more than ten years, exponentially more images will be produced and published with technical aids in the future, only in a fundamentally different way: with other, supposedly more convenient devices, supported in the background by so-called "smart" technologies. The lightning-fast merging of short image series into a single image, aided by powerful algorithms from the field of machine vision, simulates a craftsmanlike perfection that would not be possible by optical and chemical means. Even though analog photography, partly drawing on centuries-old practices of the fine arts, once established the models and standards on which AI models are currently trained, analog imaging processes hardly play a role in quantitative terms today. In qualitative terms, however, analog photography is receiving surprisingly strong attention, both as a decelerating countermovement and because of its supposedly higher authenticity and its haptic and material qualities. This attention is directed at engaging with photographic modes of perception, exploring our real environment, and, not least, encountering ourselves. Analog cameras, darkroom work, and historical processes such as the wet collodion process or cyanotype have been enjoying a worldwide revival among photo enthusiasts and students for several years. This does not, however, keep the protagonists of a supposedly nostalgic retro trend from quite naturally making contact, as fleetingly as possible, with others (and with themselves) in their everyday lives: in "real time", with mobile phones, in the form of digital or digitized images, and frequently under strategically diversified, more or less private user profiles and identities on so-called social networks.
An inter- and transdisciplinary concept has been developed, focusing on the scaling of industrial circular construction using innovative compacted mineral mixtures (CMM) derived from various soil types (sand, silt, clay) and recycled mineral waste. The concept aims to accelerate the systemic transformation of the construction industry towards carbon neutrality by promoting the large-scale adoption and automation of CMM-based construction materials, which incorporate natural mineral components and recycled aggregates or industrial by-products. In close collaboration with international and domestic stakeholders in the construction sector, the concept explores the integration of various CMM-based construction methods for producing wall elements in conventional building construction. Leveraging a digital urban mining platform, the concept aims to standardize the production process and enable mass-scale production. The ultimate goal is to fully harness the potential of automated CMM-based wall elements as a fast, competitive, emission-free, and recyclable alternative to traditional masonry and concrete construction techniques. To achieve this objective, the concept draws upon the latest advances in soil mechanics, rheology, and automation and incorporates open-source digital platform technologies to enhance data accessibility, processing, and knowledge acquisition. This will bolster confidence in CMM-based technologies and facilitate their widespread adoption. The extraordinary transfer potential of this approach necessitates both basic and applied research. As such, the proposed transformative, inter- and transdisciplinary concept will be conducted and synthesized using a comprehensive, holistic, and transfer-oriented methodology.
Digitization and sustainability are the two big topics of our time. As the usage of digital products like IoT devices continues to grow, it affects the energy consumption caused by the Internet. At the same time, more and more companies feel the need to become carbon neutral and sustainable. Determining the environmental impact of an IoT device is challenging, as both the production of the hardware components and the electricity consumption of the Internet must be considered, since the Internet is the primary communication medium of an IoT device. Estimating the electricity consumption of the Internet itself is a complex task. We performed a life cycle assessment (LCA) to determine the environmental impact of an intelligent smoke detector sold in Germany, taking its whole life cycle from cradle to grave into account. We applied the impact assessment method ReCiPe 2016 Midpoint and compared its results with ILCD 2011 Midpoint+ to check the robustness of our results. The LCA results showed that electricity consumption during the use phase is the main contributor to environmental impacts. This contribution is caused by the mining of coal, which is part of the German electricity mix. Consequently, the smoke detector mainly contributes to the impact categories of freshwater and marine ecotoxicity, but only marginally to global warming.
The research and continuing education activities focused on the application of virtual reality in teaching. Of particular interest was where in mechanical engineering this technology can be used meaningfully; to this end, the various areas of mechanical engineering were examined.
Furthermore, the question of how the technology can be used sensibly was to be answered. For this purpose, the workstations of the model factory were used: the model factory offered the possibility of conducting both the real training and the VR training in an industry-like environment.
In addition to the main activities in the area of virtual reality, the research semester also allowed me to delve deeper into other fields of digitalization and Industry 4.0. Explicit mention should be made of the concrete connection of the assembly lines in the model factory to the digitalization tool from Forcam. The digital connection of the assembly line involves interesting problems, which will not be discussed further here, but which have led to a much deeper understanding of concrete implementation problems in industry.
The work presented here provides an overview of the differences between the seismic design standard DIN 4149, currently valid under building supervision law in Baden-Württemberg, and DIN EN 1998-1/NA 2021-07, which is to replace it in the future and already corresponds to the current state of the art. It discusses the circumstances behind the switch to the European standard and the reissue of the National Annex, and a factor is derived with which both standards can be compared directly. In addition, common calculation methods for determining seismic loads are presented, and the differences between a simplified 2D calculation and a 3D finite element calculation are examined using the software Minea Design. From this, a statement is derived as to which of the calculation methods is on the "safe side" and how the results can be verified. These findings are applied in the form of a seismic design for a real project.
MiniKueWeE-Abwärmenutzung
(2023)
The energy transition is currently more topical than ever. In addition to the switch from fossil to renewable energies, energy efficiency is becoming increasingly important at all levels. This applies particularly to many parts of the building sector, where a considerable amount of energy is needed today not only for heat generation but also for space cooling (Umweltbundesamt 2020). In view of climate change, the cooling demand will continue to rise over the coming decades. For this reason, there is a great need for innovative solutions that ensure efficient space cooling with the lowest possible energy input. This project work examines one part of such a solution; the background, the technical boundary conditions, and the goals of the project are explained in the following sections.
"KI first" braucht Verlierer
(2023)
At present, hardly a week goes by without another company joining the battle for supremacy in the field of artificial intelligence (AI). Tech corporations expect substantial profits from AI-driven image generators as well. These imitate style-defining artists with synthetic composite images, with reference to a legal situation that supposedly permits the reproduction of their artworks for training purposes without consent or remuneration. Yet resistance by artists against this is socially urgent and would, moreover, also be legally covered.
Carroll, A.B.
(2023)
Bribery
(2023)
This paper presents the integration of a spline-based extension model into a probability hypothesis density (PHD) filter for extended targets. Using this filter, the position and extension of each object as well as the number of present objects can be estimated jointly. To this end, the spline extension model and the PHD filter are addressed and merged in a Gaussian mixture (GM) implementation. Simulation results using artificial laser measurements are used to evaluate the performance of the presented filter. Finally, the results are illustrated and discussed.
A growing share of modern trade policy instruments is shaped by non-tariff barriers (NTBs). Based on a structural gravity equation and the recently updated Global Trade Alert database, we empirically investigate the effect of NTBs on imports. Our analysis reveals that the implementation of NTBs reduces imports of affected products by up to 12%. Their trade dampening effect is thus comparable to that of trade defence instruments such as anti-dumping duties. It is smaller for exporters that have a free trade agreement with the importing country. Different types of NTBs affect trade to a different extent. Finally, we investigate the effect of behind-the-border measures, showing that they significantly lower the importer’s market access.
CSR Pyramid
(2023)
The new DIN ISO 26000 is the guidance standard on social responsibility. The standard helps any organization determine which form of social responsibility it can assume. These insights can be used for a successful public image and to improve reputation and standing. This Beuth-Praxis volume explains the core ideas and the structure of the standard in plain language. Overview charts and guiding questions are intended to help all readers assess the significance of the standard in the context of their own organization. The focus is on assistance in identifying stakeholders, delimiting one's own sphere of influence, and determining the relevance and significance of the individual core subjects and fields of action. Note: In April 2021, the guidance on social responsibility was adopted into the European body of standards and republished as DIN EN ISO 26000:2021-04. Its statements remained unchanged in content, so the contents of this publication can also be applied without restriction to DIN EN ISO 26000.
With this Beuth-Praxis volume, small and medium-sized enterprises receive guidance for successfully implementing DIN ISO 26000 in their own company. After a brief introduction to the subject of CSR, the authors give an overview of current developments and trends. Eight companies of different sizes then report on their own experiences in engaging strategically with CSR. They cover not only advantages and positive outcomes but also the difficulties and challenges that small and micro-enterprises in particular may encounter. The company reports follow the structure of Chapter 7 of ISO 26000.
Motion estimation is an essential element for autonomous vessels. It is used, e.g., for lidar motion compensation as well as mapping and detection tasks in a maritime environment. Since gyroscope measurements alone are not reliable and a high-performance inertial measurement unit is quite expensive, we present an approach for visual pitch and roll estimation that utilizes a convolutional neural network for water segmentation, a stereo system for reconstruction, and simple geometry to estimate pitch and roll. The algorithm is validated on a novel, publicly available dataset recorded at Lake Constance. Our experiments show that the pitch and roll estimator provides accurate results in comparison to an Xsens IMU sensor. We can further improve the pitch and roll estimation by sensor fusion with a gyroscope. The algorithm is available in its implementation as a ROS node.
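The final geometric step can be sketched as follows: given stereo-reconstructed 3D points that the segmentation network classified as water, fit a plane and read pitch and roll off its normal. This is an illustrative reconstruction of the idea under assumed axis conventions, not the paper's exact implementation:

```python
import numpy as np

def pitch_roll_from_water_points(points: np.ndarray) -> tuple:
    """Fit a plane z = a*x + b*y + c to reconstructed water-surface
    points (N x 3) by least squares and derive pitch and roll in
    degrees from the plane normal. Axis conventions are illustrative."""
    A = np.c_[points[:, 0], points[:, 1], np.ones(len(points))]
    (a, b, _), *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    normal = np.array([-a, -b, 1.0])
    normal /= np.linalg.norm(normal)
    # Tilt of the water plane around the two horizontal axes.
    pitch = np.degrees(np.arcsin(normal[0]))
    roll = np.degrees(np.arcsin(normal[1]))
    return pitch, roll
```

In practice the fit would be made robust (e.g. RANSAC) against segmentation outliers before the angles are fused with a gyroscope.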
Background: Polysomnography (PSG) is the gold standard for detecting obstructive sleep apnea (OSA). However, this technique has many disadvantages when used outside the hospital or for daily use. Portable monitors (PMs) aim to streamline the OSA detection process through deep learning (DL).
Materials and methods: We studied how to detect OSA events and calculate the apnea-hypopnea index (AHI) by using deep learning models that aim to be implemented on PMs. Several deep learning models are presented after being trained on polysomnography data from the National Sleep Research Resource (NSRR) repository. The best hyperparameters for the DL architecture are presented. In addition, emphasis is placed on model explainability techniques, concretely on Gradient-weighted Class Activation Mapping (Grad-CAM).
Results: The results for the best DL model are presented and analyzed. The interpretability of the DL model is also analyzed by studying the regions of the signals that are most relevant for the model to make the decision. The model that yields the best result is a one-dimensional convolutional neural network (1D-CNN) with 84.3% accuracy.
Conclusion: The use of PMs using machine learning techniques for detecting OSA events still has a long way to go. However, our method for developing explainable DL models demonstrates that PMs appear to be a promising alternative to PSG in the future for the detection of obstructive apnea events and the automatic calculation of AHI.
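For 1D physiological signals, the Grad-CAM step mentioned above reduces to weighting the last convolutional layer's activation maps by their time-averaged gradients and rectifying the sum. A minimal, framework-agnostic sketch of that computation (activations and gradients are assumed to have been extracted from the trained 1D-CNN beforehand):

```python
import numpy as np

def grad_cam_1d(activations: np.ndarray, gradients: np.ndarray) -> np.ndarray:
    """Grad-CAM for 1D signals. Inputs are the last conv layer's
    activations and the gradients of the class score w.r.t. them,
    both of shape (channels, time). Channel weights are the
    time-averaged gradients; the map is the ReLU of the weighted sum,
    normalised to [0, 1]."""
    weights = gradients.mean(axis=1)                              # alpha_k
    cam = np.maximum((weights[:, None] * activations).sum(axis=0), 0.0)
    if cam.max() > 0:
        cam /= cam.max()
    return cam
```

The resulting map can be upsampled to the input length to highlight which parts of the respiratory signal drove an apnea/no-apnea decision.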
Increasing demand for sustainable, resilient, and low-carbon construction materials has highlighted the potential of Compacted Mineral Mixtures (CMMs), which are formulated from various soil types (sand, silt, clay) and recycled mineral waste. This paper presents a comprehensive inter- and transdisciplinary research concept that aims to industrialise and scale up the adoption of CMM-based construction materials and methods, thereby accelerating the construction industry’s systemic transition towards carbon neutrality. By drawing upon the latest advances in soil mechanics, rheology, and automation, we propose the development of a robust material properties database to inform the design and application of CMM-based materials, taking into account their complex, time-dependent behaviour. Advanced soil mechanical tests would be utilised to ensure optimal performance under various loading and ageing conditions. This research has also recognised the importance of context-specific strategies for CMM adoption. We have explored the implications and limitations of implementing the proposed framework in developing countries, particularly where resources may be constrained. We aim to shed light on socio-economic and regulatory aspects that could influence the adoption of these sustainable construction methods. The proposed concept explores how the automated production of CMM-based wall elements can become a fast, competitive, emission-free, and recyclable alternative to traditional masonry and concrete construction techniques. We advocate for the integration of open-source digital platform technologies to enhance data accessibility, processing, and knowledge acquisition; to boost confidence in CMM-based technologies; and to catalyse their widespread adoption. We believe that the transformative potential of this research necessitates a blend of basic and applied investigation using a comprehensive, holistic, and transfer-oriented methodology. 
Thus, this paper serves to highlight the viability and multiple benefits of CMMs in construction, emphasising their pivotal role in advancing sustainable development and resilience in the built environment.
100 Jahre Türkische Republik
(2023)
Sanctions are coercive measures that have long played a recurring role in managing political tensions between nations. They are imposed both unilaterally and by coalitions of states, and have been used with increasing frequency since the Second World War, in particular. While in the last century, especially before the Second World War, trade restrictions and comprehensive economic blockades were the predominant sanction instruments, sanctions in today's more integrated and globalised world are imposed in a variety of additional forms, including international financial restrictions, travel bans, trade restrictions on specific groups of goods, the withdrawal of military aid, and specific measures such as flight bans and port closures.
This paper introduces the third update/release of the Global Sanctions Data Base (GSDB-R3). The GSDB-R3 extends the period of coverage from 1950–2019 to 1950–2022, which includes two special periods—COVID-19 and the new sanctions against Russia. This update of the GSDB contains a total of 1325 cases. In response to multiple inquiries and requests, the GSDB-R3 has been amended with a new variable that distinguishes between unilateral and multilateral sanctions. As before, the GSDB comes in two versions, case-specific and dyadic, which are freely available upon request at GSDB@drexel.edu. To highlight one of the new features of the GSDB, we estimate the heterogeneous effects of unilateral and multilateral sanctions on trade. We also obtain estimates of the effects on trade of the 2014 sanctions on Russia.
The scoring of sleep stages is one of the essential tasks in sleep analysis. Since a manual procedure requires considerable human and financial resources, and incorporates some subjectivity, an automated approach could result in several advantages. There have been many developments in this area, and in order to provide a comprehensive overview, it is essential to review relevant recent works and summarise the characteristics of the approaches, which is the main aim of this article. To achieve it, we examined articles published between 2018 and 2022 that dealt with the automated scoring of sleep stages. In the final selection for in-depth analysis, 125 articles were included after reviewing a total of 515 publications. The results revealed that automatic scoring demonstrates good quality (with Cohen's kappa up to over 0.80 and accuracy up to over 90%) in analysing EEG/EEG + EOG + EMG signals. At the same time, it should be noted that there has been no breakthrough in the quality of results using these signals in recent years. Systems involving other signals that could potentially be acquired more conveniently for the user (e.g. respiratory, cardiac or movement signals) remain more challenging in the implementation with a high level of reliability but have considerable innovation capability. In general, automatic sleep stage scoring has excellent potential to assist medical professionals while providing an objective assessment.
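Cohen's kappa, the agreement measure quoted above, corrects raw accuracy for chance agreement: κ = (p_o − p_e)/(1 − p_e), where p_o is observed and p_e expected agreement. A minimal implementation for comparing manual and automatic stage labels:

```python
def cohens_kappa(y_true, y_pred):
    """Cohen's kappa between two label sequences (e.g. manual vs.
    automatic sleep stages): agreement corrected for chance."""
    n = len(y_true)
    labels = sorted(set(y_true) | set(y_pred))
    # Observed agreement: fraction of epochs scored identically.
    p_o = sum(t == p for t, p in zip(y_true, y_pred)) / n
    # Expected agreement under independent marginal label frequencies.
    p_e = sum((sum(t == c for t in y_true) / n) *
              (sum(p == c for p in y_pred) / n) for c in labels)
    return (p_o - p_e) / (1 - p_e)
```

Values above 0.80, as reported for the best EEG-based systems, are conventionally read as almost perfect agreement.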
The architecture, engineering and construction (AEC) industry is currently in transformation. Within this transformation, digitalization has a leading function, through which a higher level of efficiency is pursued. In order to ensure a coordinated information exchange, the concept of the Common Data Environment (CDE) has been developed in recent years. A CDE is a cloud-based collaborative platform used to exchange project information and data between the different stakeholders of a construction project. The main objective of this thesis is the implementation of a suitable CDE solution for a construction management company.
For this purpose, an evaluation of different CDE software products on the market, based on their functions and usability, was compiled to identify the most suitable solution. Secondly, a concept for the setup and implementation of a CDE solution was developed, together with advice for successful change management to ensure a smooth introduction of the CDE.
The first part of the thesis comprises a literature review analysing the current state of information management in the AEC industry and the advantages a CDE offers for information management. On this basis, a synthesis of the functions of and requirements for a CDE was compiled.
The second part of the thesis links the concept of Building Information Modelling (BIM) with the CDE and establishes the advantages of a CDE in the BIM process.
The third part of the thesis covers the evaluation of CDE software, which was carried out with a scoring method. As the basis of the evaluation, a requirement list was developed in which all required functions of a CDE are listed.
In the fourth part of the thesis, a concept for establishing a CDE was developed as a practical application of the standards' specifications for CDEs. The concept was developed with a CDE expert and CDE power users and covers the structuring, standardization, and implementation of a CDE.
The last part addresses the change process engendered by the implementation of a CDE.
In summary, this thesis provides a structure for the implementation of CDE software and serves as a framework for companies in the AEC industry to select and establish a CDE.
Mit Eis erneuerbar Heizen
(2023)
In order to ensure sufficient recovery of the human body and brain, healthy sleep is indispensable. For this purpose, appropriate therapy should be initiated at an early stage in the case of sleep disorders. For some sleep disorders (e.g., insomnia), a sleep diary is essential for diagnosis and therapy monitoring. However, subjective measurement with a sleep diary has several disadvantages, requiring regular action from the user and leading to decreased comfort and potential data loss. To automate sleep monitoring and increase user comfort, one could consider replacing a sleep diary with an automatic measurement, such as a smartwatch, which would not disturb sleep. To obtain accurate results on the evaluation of the possibility of such a replacement, a field study was conducted with a total of 166 overnight recordings, followed by an analysis of the results. In this evaluation, objective sleep measurement with a Samsung Galaxy Watch 4 was compared to a subjective approach with a sleep diary, which is a standard method in sleep medicine. The focus was on comparing four relevant sleep characteristics: falling asleep time, waking up time, total sleep time (TST), and sleep efficiency (SE). After evaluating the results, it was concluded that a smartwatch could replace subjective measurement to determine falling asleep and waking up time, considering some level of inaccuracy. In the case of SE, substitution was also proved to be possible. However, some individual recordings showed a higher discrepancy in results between the two approaches. For its part, the evaluation of the TST measurement currently does not allow us to recommend substituting the measurement method for this sleep parameter. The appropriateness of replacing sleep diary measurement with a smartwatch depends on the acceptable levels of discrepancy. We propose four levels of similarity of results, defining ranges of absolute differences between objective and subjective measurements. 
By considering the values in the provided table and knowing the required accuracy, it is possible to determine the suitability of substitution in each individual case. The introduction of a “similarity level” parameter increases the adaptability and reusability of study findings in individual practical cases.
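The proposed "similarity level" parameter can be sketched as a simple classification of the absolute difference between diary and smartwatch values. The minute thresholds below are hypothetical placeholders for illustration, not the values from the study's table:

```python
def similarity_level(diary_minutes: float, watch_minutes: float,
                     thresholds=(5, 15, 30)) -> int:
    """Classify the absolute difference between a subjective (sleep
    diary) and an objective (smartwatch) measurement into four
    similarity levels, 1 = closest agreement. The minute thresholds
    are illustrative placeholders, not the study's published values."""
    diff = abs(diary_minutes - watch_minutes)
    for level, limit in enumerate(thresholds, start=1):
        if diff <= limit:
            return level
    return len(thresholds) + 1
```

Given a required accuracy for a parameter such as falling-asleep time, a practitioner would accept the substitution only if the observed level stays at or below their tolerated level.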
Characterization of NiTi Shape Memory Damping Elements designed for Automotive Safety Systems
(2014)
Actuator elements made of NiTi shape memory material are increasingly known in industry because of their unique properties. Due to the martensitic phase change, they can revert to their original shape by heating when subjected to an appropriate treatment. This thermal shape memory effect (SME) can show a significant shape change combined with a considerable force. Such elements can therefore be used to solve many technical tasks in the field of actuating elements and mechatronics, and will play an increasing role in the coming years, especially in automotive technology, energy management, power and mechanical engineering, and medical technology. Beside this thermal SME, these materials also show a mechanical SME, characterized by a superelastic plateau with reversible elongations in the range of 8%. This behavior is based on the formation of stress-induced martensite from loaded austenite material at constant temperature and facilitates many applications, especially in the medical field. Both SMEs are accompanied by energy dissipation during the martensitic phase change. This paper describes the first results obtained on different actuator and superelastic NiTi wires concerning their use as damping elements in automotive safety systems. In a first step, the damping behavior of small NiTi wires up to 0.5 mm diameter was examined at testing speeds varying between 0.1 and 50 mm/s on an adapted tensile testing machine. In order to realize higher testing speeds, a drop impact testing machine was designed, which allows testing speeds up to 4000 mm/s. After introducing this new type of testing machine, the first results of vertical-shock tests of superelastic and electrically activated actuator wires are presented. The characterization of these highly dynamic phase change parameters represents the basis for new applications of shape memory damping elements, especially in automotive safety systems.
In modern passenger cars, many electromagnetically, pyrotechnically, or mechanically driven actuators are integrated to run comfort systems and to control safety systems. Using shape memory alloys (SMA), the existing systems could be simplified, performing the same function through new mechanisms with reduced size, weight, and costs. A drawback for the use of SMA in safety systems is the lack of materials knowledge concerning the durability of the switching function (long-time stability of the shape memory effect). Pedestrian safety systems play a significant role in reducing injuries and fatal casualties caused by accidents. One automotive safety system for pedestrian protection is the bonnet lifting system. Based on such an application, this article gives an introduction to existing bonnet lifting systems for pedestrian protection, describes the use of quick-changing shape memory actuators, and presents the results of the study concerning the long-time stability of the tested NiTi wires. These wires were trained, exposed for up to 4 years at elevated temperatures (up to 140°C), and tested regarding their phase change temperatures, times, and strokes. For example, it was found that the A_P temperature is shifted toward higher temperatures with longer exposure periods and higher temperatures. However, in the functional testing plant, a delay in the switching time could not be detected. This article gives some answers concerning the long-time stability of NiTi wires that were missing until now. With this knowledge, the number of future automotive applications using SMA can be increased. It can be concluded that the use of quick-changing shape memory actuators in safety systems could simplify the mechanism, reduce maintenance and manufacturing costs, and should also be applicable to other automotive applications.
Network effects, economies of scale, and lock-in-effects increasingly lead to a concentration of digital resources and capabilities, hindering the free and equitable development of digital entrepreneurship, new skills, and jobs, especially in small communities and their small and medium-sized enterprises (“SMEs”). To ensure the affordability and accessibility of technologies, promote digital entrepreneurship and community well-being, and protect digital rights, we propose data cooperatives as a vehicle for secure, trusted, and sovereign data exchange. In post-pandemic times, community/SME-led cooperatives can play a vital role by ensuring that supply chains to support digital commons are uninterrupted, resilient, and decentralized. Digital commons and data sovereignty provide communities with affordable and easy access to information and the ability to collectively negotiate data-related decisions. Moreover, cooperative commons (a) provide access to the infrastructure that underpins the modern economy, (b) preserve property rights, and (c) ensure that privatization and monopolization do not further erode self-determination, especially in a world increasingly mediated by AI. Thus, governance plays a significant role in accelerating communities’/SMEs’ digital transformation and addressing their challenges. Cooperatives thrive on digital governance and standards such as open trusted application programming interfaces (“APIs”) that increase the efficiency, technological capabilities, and capacities of participants and, most importantly, integrate, enable, and accelerate the digital transformation of SMEs in the overall process. This review article analyses an array of transformative use cases that underline the potential of cooperative data governance. 
These case studies exemplify how data and platform cooperatives, through their innovative value creation mechanisms, can elevate digital commons and value chains to a new dimension of collaboration, thereby addressing pressing societal issues. Guided by our research aim, we propose a policy framework that supports the practical implementation of digital federation platforms and data cooperatives. This policy blueprint intends to facilitate sustainable development in both the Global South and North, fostering equitable and inclusive data governance strategies.
Entwicklung und Analyse eines Turmes aus Buchenfurnierschichtholz für eine Kleinwindkraftanlage
(2023)
Energy prices in Germany are rising, and many property owners want to become less dependent on electricity suppliers, which is why they invest in generating their own power. A small wind turbine is an ideal complement to a photovoltaic system and causes less environmental impact than large turbines. However, building a small wind turbine entails high costs, which is why this concept is not widespread.
Wind turbine towers are normally made of steel, and producing this building material releases large amounts of greenhouse gases. A more environmentally friendly material is the renewable resource wood, yet towers in timber construction have hardly been realised so far. This thesis therefore investigates whether a timber tower can be an alternative to a steel tower and whether the investment costs of a small wind turbine can thereby be reduced.
To answer the research question, a concept for a timber tower made of beech laminated veneer lumber was developed. First, the significance of beech wood in construction was examined. Then the fundamentals of building small wind turbines were researched and the dimensions of a fictitious turbine were defined. The relevant loads were determined and the structure was analysed by means of a finite element calculation. Finally, the verifications of the load-bearing capacity and fatigue resistance of the cross-section and the fasteners were carried out.
The examination of the boundary conditions revealed no arguments against the use of wood. In some respects, such as the manufacturing and erection phases, there are even advantages over steel. In addition, a cost estimate showed that a timber tower is competitive in price and that the manufacturing costs can even be reduced. The total costs of a small wind turbine can nevertheless be reduced only marginally. The investment therefore only makes sense if sufficient wind is available at the planned site and if most of the electricity produced can be used by the owner.
The changed shopping and consumption behaviour of many customers poses major challenges for the retail sector. In particular, the young Generation Z, which has grown up with the internet, social media, and digital applications, no longer seems to follow traditional consumption patterns and expects retailers to adapt to its needs. To find out which requirements young consumers place on retail and how these differ between generations, more than 300 people of all age groups were surveyed. The focal points of the study included shopping behaviour, the personalisation of communication and offers, and the use of digital services and technologies in retail.
This study aims to investigate the utilization of Bayesian techniques for the calibration of micro-electro-mechanical system (MEMS) accelerometers. These devices have garnered substantial interest in various practical applications and typically require calibration through error-correcting functions. The parameters of these error-correcting functions are determined during a calibration process. However, due to various sources of noise, these parameters cannot be determined with precision, making it desirable to incorporate uncertainty in the calibration models. Bayesian modeling offers a natural and complete way of reflecting uncertainty by treating the model parameters as variables rather than fixed values. In addition, Bayesian modeling enables the incorporation of prior knowledge, making it an ideal choice for calibration. Nevertheless, it is infrequently used in sensor calibration. This study introduces Bayesian methods for the calibration of MEMS accelerometer data in a straightforward manner using recent advances in probabilistic programming.
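The central idea — representing calibration parameters as distributions rather than point estimates — can be illustrated with a conjugate Bayesian linear model for a single accelerometer axis (ref = bias + scale · raw). This closed-form sketch stands in for the probabilistic-programming model described in the study; the noise and prior variances are assumed values:

```python
import numpy as np

def bayesian_linear_calibration(raw, ref, noise_var=1e-4, prior_var=10.0):
    """Posterior over calibration parameters (bias b, scale s) in the
    model ref = b + s * raw + noise, with a zero-mean Gaussian prior.
    Returns the posterior mean and covariance in closed form (conjugate
    Bayesian linear regression); variances here are assumed values."""
    X = np.c_[np.ones(len(raw)), np.asarray(raw, dtype=float)]
    y = np.asarray(ref, dtype=float)
    prior_precision = np.eye(2) / prior_var
    post_cov = np.linalg.inv(prior_precision + X.T @ X / noise_var)
    post_mean = post_cov @ (X.T @ y / noise_var)
    return post_mean, post_cov
```

The posterior covariance directly quantifies the calibration uncertainty that a point-estimate (least-squares) calibration would discard, and the prior term is where domain knowledge about plausible bias and scale values enters.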
Non-volatile NAND flash memories store information as an electrical charge. Different read reference voltages are applied to read the data. However, the threshold voltage distributions vary due to aging effects like program erase cycling and data retention time. It is necessary to adapt the read reference voltages for different life-cycle conditions to minimize the error probability during readout. In the past, methods based on pilot data or high-resolution threshold voltage histograms were proposed to estimate the changes in voltage distributions. In this work, we propose a machine learning approach with neural networks to estimate the read reference voltages. The proposed method utilizes sparse histogram data for the threshold voltage distributions. For reading the information from triple-level cell (TLC) memories, several read reference voltages are applied in sequence. We consider two histogram resolutions. The simplest histogram consists of the zero-and-one ratios for the hard decision read operation, whereas a higher resolution is obtained by considering the quantization levels for soft-input decoding. This approach does not require pilot data for the voltage adaptation. Furthermore, only a few measurements of extreme points of the threshold voltage distributions are required as training data. Measurements under different conditions verify the proposed approach, and the resulting neural networks perform well under other life-cycle conditions.
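The objective the network approximates can be illustrated in closed form: for two Gaussian threshold-voltage distributions, the error-minimising read reference voltage lies where the two densities intersect (the midpoint in the equal-variance case). A sketch of that baseline computation, shown here only to make the target quantity concrete; the paper's method instead estimates the voltages from sparse histograms with a neural network:

```python
import math

def optimal_read_voltage(mu0, sigma0, mu1, sigma1):
    """Read reference voltage minimizing the raw bit-error probability
    for two Gaussian threshold-voltage distributions: the point where
    the densities intersect (midpoint in the equal-variance case)."""
    if abs(sigma0 - sigma1) < 1e-12:
        return 0.5 * (mu0 + mu1)
    # Setting the two Gaussian densities equal yields a quadratic in v.
    a = 1.0 / (2 * sigma1**2) - 1.0 / (2 * sigma0**2)
    b = mu0 / sigma0**2 - mu1 / sigma1**2
    c = (mu1**2 / (2 * sigma1**2) - mu0**2 / (2 * sigma0**2)
         - math.log(sigma0 / sigma1))
    disc = math.sqrt(b * b - 4 * a * c)
    roots = [(-b + disc) / (2 * a), (-b - disc) / (2 * a)]
    # Return the intersection lying between the two cell means.
    return min(roots, key=lambda v: abs(v - 0.5 * (mu0 + mu1)))
```

As the distributions shift and widen with program/erase cycling and retention time, this intersection moves, which is exactly why the read references must be re-estimated over the device's life cycle.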
Creative Coding - void draw
(2023)
Creative coding is one of the many trendy buzzwords that have recently appeared in the design industry. As is often the case, creative coding is not fundamentally new, but rather a coinage describing something designers have long been doing — one that is now, however, discussed more widely and recognised as an important concept for designers.