We examine to what extent a transaction relation-based value network maturity status of New Technology-Based Firms (NTBFs) is related to their survival. A specific challenge of NTBFs is their lack of market orientation, which is why the maturity of the ties they form towards the market, in terms of customers, financiers, personnel and partners, is expected to be a strong indicator of survival. We analyze a sample of 170 NTBFs by capturing their value network status from business plans and determining their survival status using secondary research. Simple statistical tests and regressions suggest that the official registration of the business is a precondition for survival that requires industry-specific value network dimension strengths. A sub-sample survival analysis shows that for all NTBFs that have reached registration, regardless of their industry, a stronger customer value network maturity dimension protects against failure and is thus a significant predictor of survival. Moreover, the analyses partly support the idea that NTBFs from the IT sector are less dependent on a strong value network in the financier dimension to survive. The results are relevant for both practitioners and researchers in the innovation system: a better understanding of the factors influencing NTBF survival can help to provide more tailored support services for young firms, increase the effectiveness of resource allocation, and provide a basis for further research.
In 3D extended object tracking (EOT), well-established models exist for tracking the object extent using various shape priors. With these models, however, a single update has to be performed for every measurement, leading to a high computational runtime for high-resolution sensors. In this paper, we address this problem by using various model-independent downsampling schemes, based on distance heuristics and random sampling, as pre-processing before the update. We investigate the methods in a simulated and a real-world tracking scenario using two different measurement models with measurements gathered from a LiDAR sensor. We found that there is great potential for speeding up 3D EOT: up to 95% of the measurements could be dropped in our investigated scenarios when using random sampling. Since random sampling, however, can also result in a subset that does not represent the total set well, leading to poor tracking performance, there is still a high demand for further research.
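The random-sampling scheme described in the abstract can be sketched in a few lines (an illustrative sketch, not the authors' implementation; the function name and the keep ratio are assumptions):

```python
import numpy as np

def random_downsample(points, keep_ratio=0.05, rng=None):
    """Randomly keep a fraction of the measurements (hypothetical sketch).

    points: (N, 3) array of LiDAR measurements.
    keep_ratio: fraction of points retained, e.g. 0.05 keeps 5%.
    """
    rng = np.random.default_rng(rng)
    n_keep = max(1, int(len(points) * keep_ratio))
    idx = rng.choice(len(points), size=n_keep, replace=False)
    return points[idx]

# Example: drop 95% of 10,000 simulated measurements before the update step
pts = np.random.default_rng(0).normal(size=(10000, 3))
subset = random_downsample(pts, keep_ratio=0.05, rng=1)
print(subset.shape)  # (500, 3)
```

Because the subset is drawn uniformly, a single unlucky draw can miss parts of the object contour, which is the representativeness problem the abstract points out.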
Analysing observability is an important step in the process of designing state feedback controllers. While for linear systems observability has been widely studied and easy-to-check necessary and sufficient conditions are available, for nonlinear systems such a general recipe does not exist and different classes of systems require different techniques. In this paper, we analyse observability for an industrial heating process in which a stripe-shaped plastic workpiece moves through a heating zone, where it is heated to a specific temperature by applying hot air to its surface through a nozzle. A modeling approach for this process is briefly presented, yielding a nonlinear ordinary differential equation model. Sensitivity-based observability analysis is used to identify unobservable states and to suggest additional sensor locations. In practice, however, it is not possible to place additional sensors, so the available measurements are used to implement a simple open-loop state estimator with offset compensation, and numerical and experimental results are presented.
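A sensitivity-based observability check of the kind described can be illustrated with finite differences: perturb each initial state, simulate the measured output, and flag states whose perturbation leaves the output unchanged. The functions and the toy system below are hypothetical, not the paper's heating-process model:

```python
import numpy as np

def output_sensitivity(f, h, x0, t_steps=50, dt=0.01, eps=1e-6):
    """Finite-difference sensitivity of the output trajectory with
    respect to each initial state (illustrative, not the authors' code).

    f: state derivative, f(x) -> dx/dt
    h: measurement function, h(x) -> y
    """
    def simulate(x):
        x = np.array(x, dtype=float)
        ys = []
        for _ in range(t_steps):
            ys.append(h(x))
            x = x + dt * f(x)  # explicit Euler integration
        return np.asarray(ys)

    y_nom = simulate(x0)
    sens = []
    for i in range(len(x0)):
        xp = np.array(x0, dtype=float)
        xp[i] += eps
        sens.append(np.linalg.norm(simulate(xp) - y_nom) / eps)
    return np.asarray(sens)  # near-zero entries hint at unobservable states

# Toy system: x[1] never influences the output y = x[0], so it is unobservable
f = lambda x: np.array([-x[0], -x[1]])
h = lambda x: np.array([x[0]])
s = output_sensitivity(f, h, [1.0, 1.0])
```

A near-zero sensitivity for a state suggests that no information about it reaches the available measurements, which is exactly the argument for proposing additional sensor locations.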
In this paper, a gain-scheduled nonlinear control structure is proposed for a surface vessel, which takes advantage of extended linearisation techniques. This guarantees accurate tracking of desired trajectories, which contributes to safe and reliable water transport. The PI state feedback control is extended by a feedforward control based on an inverse system model. To achieve accurate trajectory tracking, however, an observer-based disturbance compensation is necessary: external disturbances from cross currents or wind forces in the lateral direction and wave-induced measurement disturbances are estimated by a nonlinear observer and used for compensation. The efficiency and the achieved tracking performance are shown by simulation results using a validated model of the ship Korona at the HTWG Konstanz, Germany. Both tracking behaviour and the rejection of disturbance forces in the lateral direction are considered.
Polysomnography is the gold standard for sleep studies and provides very accurate results, but its costs (both personnel and material) are quite high. Therefore, the development of a low-cost system for overnight breathing and heartbeat monitoring, which provides more comfort while recording the data, is a well-motivated challenge. The system proposed in this manuscript is based on resistive pressure sensors installed under the mattress. These sensors can measure the slight pressure changes caused by breathing and heartbeat. The captured signal requires advanced processing, such as filtering and amplification, before the analog signal is ready for the next step. The output signal is then digitalized and further processed by an algorithm that performs custom filtering before recognizing breathing and heart rate in real time. The result can be directly visualized. Furthermore, a CSV file is created containing the raw data, timestamps, and unique IDs to facilitate further processing. The achieved results are promising, and the average deviation from a reference device is about 4 bpm.
Sleep studies can be used to assess sleep quality and general in-bed behavior. These results can be helpful for regulating sleep and recognizing various human sleep disorders. In comparison to the leading standard measuring system, polysomnography (PSG), the system proposed in this work is a non-invasive sleep monitoring device. For continuous analysis or home use, PSG or wearable actigraphy devices tend to be uncomfortable. Besides, these methods not only lose practicality because they have to be put on, but they are also very expensive. The system proposed in this paper classifies respiration and body movement with only one type of sensor and in a non-invasive way. The sensor used is a pressure sensor, which is low-cost and can be used for commercial purposes. The system was tested in an experiment that recorded the sleep of a subject. These recordings showed excellent results in the classification of breathing rate and body movements.
Respiratory diseases are leading causes of death and disability in the world. The recent COVID-19 pandemic also affects the respiratory system. Detecting and diagnosing respiratory diseases requires both medical professionals and a clinical environment, and most of the techniques used to date have been invasive or expensive.
Some research groups are developing hardware devices and techniques that make non-invasive or even remote respiratory sound acquisition possible. These sounds are then processed and analysed for clinical, scientific, or educational purposes.
We present a literature review of non-invasive sound acquisition devices and techniques.
The results cover a large number of digital tools, such as microphones, wearables, and Internet of Things devices, that can be used in this scope.
Several interesting applications were found. Some devices simplify sound acquisition in a clinical environment, while others make daily monitoring outside that environment possible. We aim to use some of these devices and to include the non-invasively recorded respiratory sounds in a Digital Twin system for personalized health.
Sleep analysis using a polysomnography system is difficult and expensive, which is why we suggest a non-invasive and unobtrusive measurement. Very few people want cables or devices attached to their bodies during sleep. The proposed approach is to implement a monitoring system that does not bother the subject: a non-invasive monitoring system based on detecting the pressure distribution. This system should be able to measure, through the mattress, the pressure differences that occur during a single heartbeat and during breathing. The system consists of two blocks: signal acquisition and signal processing. The whole technology should be economical enough to be affordable for every user. As a result, preprocessed data is obtained for further detailed analysis, using different filters for heartbeat and respiration detection. In the initial filtering stage, Butterworth filters are used.
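The Butterworth-based separation of heartbeat and respiration can be sketched as follows; the sampling rate, filter order, and band edges are assumed values for illustration, not the authors' exact design:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 100.0  # assumed sampling rate in Hz

def bandpass(x, low, high, fs, order=4):
    # Zero-phase Butterworth band-pass in second-order sections
    sos = butter(order, [low, high], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

# Synthetic mattress signal: a 0.25 Hz "respiration" component
# plus a weaker 1.2 Hz "heartbeat" component
t = np.arange(0, 30, 1 / fs)
raw = np.sin(2 * np.pi * 0.25 * t) + 0.1 * np.sin(2 * np.pi * 1.2 * t)

resp = bandpass(raw, 0.1, 0.5, fs)    # respiration band (assumed edges)
heart = bandpass(raw, 0.8, 2.5, fs)   # heartbeat band (assumed edges)
```

Each band-pass isolates one physiological component, after which rate estimation can proceed, for example, by peak counting on the filtered signal.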
Codes over quotient rings of Lipschitz integers have recently attracted some attention. This work investigates the performance of Lipschitz integer constellations for transmission over the AWGN channel by means of the constellation figure of merit. A construction of sets of Lipschitz integers is presented that leads to a better constellation figure of merit compared to ordinary Lipschitz integer constellations. In particular, it is demonstrated that the concept of set partitioning can be applied to quotient rings of Lipschitz integers where the number of elements is not a prime number. It is shown that it is always possible to partition such quotient rings into additive subgroups such that the minimum Euclidean distance of each subgroup is strictly larger than in the original set. The resulting signal constellations have a better performance for transmission over an additive white Gaussian noise channel compared to Gaussian integer constellations and to ordinary Lipschitz integer constellations.
Product development and product manufacturing are entering a new era in which engineering tasks are executed in collaboration with all involved parties. Engineers and potential customers work together, mainly in a virtual world, on the design and realization of the product. We address this so-called “crowdsourcing” trend in the automotive industry, which lowers cost and accelerates the production of new cars. Current practice and prior studies fail to handle data management and collaboration aspects in sufficient detail. We propose a PLM-based crowdsourcing platform that applies best practices to the established approach and adapts it with new methods for handling specific requirements. Our work provides a basis for establishing an improved collaboration platform to support a gig economy in the automotive industry.
Large persistent memory is crucial for many applications in embedded systems and automotive computing like AI databases, ADAS, and cutting-edge infotainment systems. Such applications require reliable NAND flash memories made for harsh automotive conditions. However, due to high memory densities and production tolerances, the error probability of NAND flash memories has risen. As the number of program/erase cycles and the data retention times increase, non-volatile NAND flash memories' performance and dependability suffer. The read reference voltages of the flash cells vary due to these aging processes. In this work, we consider the issue of reference voltage adaption. The considered estimation procedure uses shallow neural networks to estimate the read reference voltages for different life-cycle conditions with the help of histogram measurements. We demonstrate that the training data for the neural networks can be enhanced by using shifted histograms, i.e., a training of the neural networks is possible based on a few measurements of some extreme points used as training data. The trained neural networks generalize well for other life-cycle conditions.
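The shifted-histogram idea for enhancing the training data can be illustrated as follows (a minimal sketch; the zero-filling at the edges and the bin values are assumptions, not the paper's measurement data):

```python
import numpy as np

def shift_histogram(hist, s):
    """Shift a bin-count histogram by s bins along the voltage axis,
    zero-filling the edges (illustrative version of the shifted-histogram
    training-data augmentation)."""
    h = np.zeros_like(hist)
    if s >= 0:
        h[s:] = hist[:len(hist) - s or None]
    else:
        h[:s] = hist[-s:]
    return h

# One hypothetical measured histogram, augmented into several shifted copies
base = np.array([0, 1, 4, 6, 4, 1, 0])
augmented = [shift_histogram(base, s) for s in (-1, 0, 1)]
print(augmented[2])  # [0 0 1 4 6 4 1]
```

In this way, a few histogram measurements taken at extreme life-cycle points can be expanded into a larger training set that covers intermediate voltage offsets.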
The network optimization tool presented here can propose, in real time, a solution for controlling operating equipment to the distribution grid operator in the event of a grid fault. This allows the existing grid to be used optimally and reduces, or even avoids, cost-intensive grid expansion in the medium- and low-voltage grid. The network optimizer is based on an artificial neural network (ANN). To train the ANN, fault cases were generated based on real generation and load profiles from the CoSSMic project [1]. For each fault case, an optimized network topology was determined from all feasible and reasonable network configurations using load-flow calculations. By varying the tap changers of the transformers and the positions of all installed switches in the grid, it was calculated how the current flow must be routed so that none of the operating equipment exceeds its permissible load limits. For a virtual test grid, a trained ANN identified the optimal solution for a given fault case in 90 percent of cases. Applying the N-best method increased the prediction probability to nearly 99 percent.
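The N-best step can be sketched as follows: instead of committing to the single most probable network configuration, the N most probable ANN outputs are kept for subsequent checking, e.g. by load-flow calculation (the probability values below are hypothetical):

```python
import numpy as np

def n_best(probabilities, n=3):
    """Indices of the n most probable network configurations,
    highest first (sketch of the N-best selection step)."""
    return np.argsort(probabilities)[::-1][:n]

# Hypothetical class probabilities from the trained ANN
p = np.array([0.05, 0.62, 0.08, 0.20, 0.05])
print(n_best(p, n=3))  # [1 3 2]
```

If the top-ranked configuration fails a subsequent feasibility check, the next candidates are tried, which explains how the hit rate can rise from roughly 90 to nearly 99 percent.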
As part of the course "Nachhaltigkeit im industriellen Umfeld" (Sustainability in an Industrial Context) in the Master's program in Environmental and Process Engineering at the universities of Konstanz and Ravensburg-Weingarten, a student conference was held in 2015.
Working individually or in pairs, the students developed conference contributions on the following topics:
- innovations and highlights in energy generation and conversion
- aspects of closing material cycles and avoiding the release of pollutants into the environment
- opportunities and challenges of renewable raw materials in various applications, as well as topics of sustainability in agriculture
- different perspectives on the topic of water (from wastewater treatment to consumer water use)
- examinations of specific industries and companies and their tools for implementing sustainability
The results of the student conference on sustainability in an industrial context are presented in this publication.
As part of the course "Nachhaltigkeit im industriellen Umfeld" (Sustainability in an Industrial Context) in the Master's program in Environmental and Process Engineering at the universities of Konstanz and Ravensburg-Weingarten, a student conference took place in December 2016. Working individually or in pairs, the students developed conference contributions on the following topics:
- highlights from the fields of energy generation and embodied energy
- aspects of the circular economy
- ecosystems, the stresses they face, and their preservation
- specific economic sectors and sustainability
The results of the student conference on sustainability in an industrial context are presented in this publication.
This paper focuses on the multivariable control of a drawing tower process. The nature of the process, together with the differences in the measurement noise levels that affect the variables to be controlled, motivated the development of a new MPC algorithm. An extension of a multivariable predictive control algorithm with separate prediction horizons is proposed. The obtained experimental results show the usefulness of the proposed algorithm.
This work introduces new signal constellations based on Eisenstein integers, i.e., the hexagonal lattice. These sets of Eisenstein integers have a cardinality which is an integer power of three. They are proposed as signal constellations for representation in the equivalent complex baseband model, especially for applications like physical-layer network coding or MIMO transmission where the constellation is required to be a subset of a lattice. It is shown that these constellations form additive groups where the addition over the complex plane corresponds to the addition with carry over ternary Galois fields. A ternary set partitioning is derived that enables multilevel coding based on ternary error-correcting codes. In the subsets, this partitioning achieves a gain of 4.77 dB, which results from an increased minimum squared Euclidean distance of the signal points. Furthermore, the constellation-constrained capacities over the AWGN channel and the related level capacities in case of ternary multilevel coding are investigated. Simulation results for multilevel coding based on ternary LDPC codes are presented which show that a performance close to the constellation-constrained capacities can be achieved.
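The 4.77 dB figure corresponds to a factor-of-three increase in minimum squared Euclidean distance, since 10·log10(3) ≈ 4.77 dB. This can be checked numerically on a patch of the Eisenstein lattice using the sublattice generated by 1 − ω, for which |1 − ω|² = 3 (an illustrative check, not the paper's partitioning construction):

```python
import numpy as np

omega = np.exp(2j * np.pi / 3)  # primitive cube root of unity

# Small patch of the Eisenstein lattice Z[omega] (the hexagonal lattice)
pts = np.array([a + b * omega for a in range(-3, 4) for b in range(-3, 4)])

def min_sq_dist(points):
    d = np.abs(points[:, None] - points[None, :]) ** 2
    return d[d > 1e-12].min()

# Full lattice: minimum squared distance 1.
# Sublattice (1 - omega) * Z[omega]: distances scale by |1 - omega|^2 = 3.
sub = (1 - omega) * pts
gain_db = 10 * np.log10(min_sq_dist(sub) / min_sq_dist(pts))
print(round(gain_db, 2))  # 4.77
```

Each subset of the ternary partition is a coset of such a scaled copy of the lattice, which is why the intra-subset distance, and hence the gain, grows by exactly this factor.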
Offline handwriting recognition systems often use LSTM networks trained with line or word images. Multi-line text makes it necessary to use segmentation to explicitly obtain these images. Skewed, curved, overlapping, or incorrectly written text, as well as noise, can lead to errors during segmentation of multi-line text and reduce the overall recognition capacity of the system. The last year has seen the introduction of deep learning methods capable of segmentation-free recognition of whole paragraphs. Our method uses Conditional Random Fields to represent text and align it with the network output to calculate a loss function for training. Experiments are promising and show that the technique is capable of training an LSTM multi-line text recognition system.
Recently published nonlinear model-based control approaches achieve impressive performance in complex real-world applications. However, due to model-plant mismatches and unforeseen disturbances, the model-based controller's performance is limited in full-scale applications. In most applications, low-level control loops mitigate the model-plant mismatch and the sensitivity to disturbances. But what is the influence of these low-level control loops? In this paper, we present the model predictive path integral (MPPI) control of a self-balancing vehicle and investigate the influence of subordinate control loops on closed-loop performance. To this end, simulation and full-scale experiments are performed and analyzed. Subordinate control loops empower the MPPI controller because they dampen the influence of disturbances and thus improve the model's accuracy. This is the basis for the successful application of model-based control approaches in real-world systems. All in all, a model is used to design a low-level controller, then its closed-loop behavior is determined, and this model is used within the superimposed MPPI control loop: modeling for control and vice versa.
Motion safety for vessels (2015)
The improvement of collision avoidance for vessels in close-range encounter situations is an important topic for maritime traffic safety. Typical approaches generate evasive trajectories or optimise the trajectories of all involved vessels. The idea of this work is to validate these trajectories with respect to guaranteed motion safety: it is not sufficient for a trajectory to be collision-free; it must additionally ensure that an evasive manoeuvre can be performed at any time. An approach using the distance and the evolution of the distance to the other vessels is proposed. The concept of Inevitable Collision States (ICS) is adopted to identify the states for which no evasive manoeuvre exists. Furthermore, it is implemented in a collision avoidance system for recreational craft to demonstrate the performance.
The Montgomery multiplication is an efficient method for modular arithmetic. Typically, it is used for modular arithmetic over integer rings to avoid the expensive inversion for the modulo reduction. In this work, we consider modular arithmetic over rings of Gaussian integers. Gaussian integers are a subset of the complex numbers whose real and imaginary parts are integers. In many cases, Gaussian integer rings are isomorphic to ordinary integer rings. We demonstrate that the concept of the Montgomery multiplication can be extended to Gaussian integers. Due to the independent calculation of the real and imaginary parts, the computational complexity of the multiplication is reduced compared with ordinary integer modular arithmetic. This concept is suitable for coding applications as well as for asymmetric cryptographic systems, such as elliptic curve cryptography or the Rivest-Shamir-Adleman system.
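For reference, the ordinary integer Montgomery multiplication that the paper extends can be sketched as follows (a textbook version; the Gaussian-integer variant described in the abstract applies the reduction to real and imaginary parts independently, which is not shown here):

```python
def montgomery_multiply(a, b, n, r_bits):
    """Classic integer Montgomery multiplication: returns a*b*R^{-1} mod n,
    with R = 2**r_bits, avoiding any division by n."""
    R = 1 << r_bits
    assert n % 2 == 1 and n < R  # n must be odd and smaller than R
    n_prime = -pow(n, -1, R) % R  # n * n' == -1 (mod R)
    t = a * b
    m = (t * n_prime) % R         # cheap: reduction mod a power of two
    u = (t + m * n) >> r_bits     # exact division by R
    return u - n if u >= n else u

# Verify against direct modular arithmetic
n, r_bits = 97, 8
R_inv = pow(1 << r_bits, -1, n)
for a, b in [(5, 7), (42, 88), (96, 96)]:
    assert montgomery_multiply(a, b, n, r_bits) == (a * b * R_inv) % n
print("ok")
```

The key point is that only shifts and reductions modulo a power of two occur; the extra factor R⁻¹ is handled by keeping operands in the Montgomery domain throughout a longer computation.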
Measuring the natural frequency of buildings and bridges is one way to obtain information about the stiffness of a structure. Decreasing stiffness can be detected by repeated measurements. Damaged structural parts or excessive wood moisture can be reasons for decreasing stiffness. The earlier a failure is detected, the better the chance to repair it at low cost. The method of monitoring by repeatedly measuring the natural frequency is applied to timber bridges, especially footbridges. As damage due to high wood moisture is not easily visible, measuring the natural frequency is a good way to detect it and then repair it. Equations to calculate the natural frequencies, accounting for the damaged parts, are presented and applied to a simply supported beam.
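For a simply supported Euler-Bernoulli beam, the bending natural frequencies have a closed form, so a stiffness loss maps directly to a frequency drop. A sketch with assumed illustrative values (not the paper's data):

```python
import math

def natural_frequency(E, I, mu, L, n=1):
    """n-th bending natural frequency (Hz) of a simply supported
    Euler-Bernoulli beam: f_n = (n^2 * pi / (2 L^2)) * sqrt(E I / mu).
    E: Young's modulus [N/m^2], I: second moment of area [m^4],
    mu: mass per unit length [kg/m], L: span [m]."""
    return (n ** 2 * math.pi / (2 * L ** 2)) * math.sqrt(E * I / mu)

# Assumed (illustrative) values for a small timber footbridge:
# E = 11 GPa, I = 0.005 m^4, mu = 150 kg/m, span L = 12 m
f1 = natural_frequency(E=11e9, I=0.005, mu=150.0, L=12.0)

# A stiffness loss (e.g. moisture damage reducing the effective EI)
# lowers f1, which repeated frequency measurements can reveal
f1_damaged = natural_frequency(E=11e9, I=0.004, mu=150.0, L=12.0)
```

Since f scales with the square root of EI, a 20 % stiffness loss produces roughly a 10 % frequency drop, which is measurable with simple vibration monitoring.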
Sleep is extremely important for physical and mental health. Although polysomnography is an established approach in sleep analysis, it is quite intrusive and expensive. Consequently, developing a non-invasive and non-intrusive home sleep monitoring system with minimal influence on patients, that can reliably and accurately measure cardiorespiratory parameters, is of great interest. The aim of this study is to validate a non-invasive and unobtrusive cardiorespiratory parameter monitoring system based on an accelerometer sensor. This system includes a special holder to install the system under the bed mattress. The additional aim is to determine the optimum relative system position (in relation to the subject) at which the most accurate and precise values of measured parameters could be achieved. The data were collected from 23 subjects (13 males and 10 females). The obtained ballistocardiogram signal was sequentially processed using a sixth-order Butterworth bandpass filter and a moving average filter. As a result, an average error (compared to reference values) of 2.24 beats per minute for heart rate and 1.52 breaths per minute for respiratory rate was achieved, regardless of the subject’s sleep position. For males and females, the errors were 2.28 bpm and 2.19 bpm for heart rate and 1.41 rpm and 1.30 rpm for respiratory rate. We determined that placing the sensor and system at chest level is the preferred configuration for cardiorespiratory measurement. Further studies of the system’s performance in larger groups of subjects are required, despite the promising results of the current tests in healthy subjects.
This paper describes the development of a control system for an industrial heating application. In this process, a moving substrate passes through a heating zone at variable speed. Heat is applied to the substrate by hot air, with the air flow rate being the manipulated variable. The aim is to control the substrate's temperature at a specific location after it passes the heating zone. First, a model is derived for a point attached to the moving substrate. This is then modified to reflect the temperature of the moving substrate at the specified location. To regulate the temperature, a nonlinear model predictive control approach is applied, using an implicit Euler scheme to integrate the model and an augmented gradient-based optimization approach. The performance of the controller has been validated both by simulations and by experiments on the physical plant. The respective results are presented in this paper.
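The implicit Euler scheme mentioned above solves x_{k+1} = x_k + Δt·f(x_{k+1}) for the next state, which makes it stable for stiff thermal dynamics even at large step sizes. A minimal sketch using fixed-point iteration on a cooling-type dynamic (all values assumed for illustration, not the plant model):

```python
def implicit_euler_step(f, x, dt, tol=1e-10, max_iter=50):
    """One implicit Euler step x_next = x + dt * f(x_next),
    solved here by fixed-point iteration (illustrative sketch;
    a Newton solve is common for stiffer problems)."""
    x_next = x  # initial guess: previous state
    for _ in range(max_iter):
        x_new = x + dt * f(x_next)
        if abs(x_new - x_next) < tol:
            break
        x_next = x_new
    return x_next

# Heating/cooling-type dynamics dT/dt = -k (T - T_air): the temperature
# relaxes toward the air temperature, here T_air = 20.0
f = lambda T: -0.5 * (T - 20.0)
T = 80.0
for _ in range(100):
    T = implicit_euler_step(f, T, dt=0.5)
print(round(T, 3))  # 20.0
```

Inside an NMPC loop, one such step per prediction interval propagates the model over the horizon before the gradient-based optimizer adjusts the air flow rate.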
Misbehave like Nobody’s Watching? Investor Attention to Corporate Misconduct and its Implications (2023)
First-year students in technical degree programs at universities of applied sciences enter with very heterogeneous prior knowledge, not only in mathematics but also in physics. Although these subjects are of great importance for a fundamental understanding of technical processes, education in these areas cannot start from zero, given the limited time available for them during the course of study. For mathematics, the cosh working group therefore compiled a catalogue of minimum requirements, published in 2014. It describes the knowledge and skills that first-year students need to successfully begin a WiMINT degree program (economics, mathematics, computer science, natural sciences, technology) at a university of applied sciences. A working group of physicists at universities of applied sciences in Baden-Württemberg has since formed with the goal of creating an analogous catalogue of minimum requirements for physics. The current state of this work is presented here.
Wheelset torsional vibrations cause torsional amplitudes that can exceed the quasi-static torsional moment many times over. This leads to enormous stresses on the wheelset shaft, the press fits, and the wheelset gearboxes. By evaluating and analyzing numerous measurement runs of different vehicles, several characteristics of the distribution of the torsional load within a motor bogie were identified. Since the measurement results depend strongly on the individual traction control and the prevailing wheel-rail adhesion conditions, vehicles with different drive configurations can only be compared to a limited extent. While the occurrence of wheelset torsional vibrations is to be avoided in regular rail vehicle operation, operating states in which maximum torsional vibrations can be generated with active wheel-slide and wheel-slip protection software should be targeted, in particular during homologation runs. Based on the analyses carried out and the experience gathered, the following conditions have proven effective for test runs: (1) establishing adhesion conditions that are as identical as possible for all wheelsets electrically coupled in a group drive, by jointly watering the rails in both directions of travel; (2) ensuring maximum utilization of tractive effort through high running resistance, a suitable route with constant gradient, or the use of a braking locomotive.
Scenography is a universal discipline that uses space, light, graphics, sound, and digital media to create experiential spaces in which knowledge is conveyed and hidden relationships are made intuitively accessible to visitors. Media-based communication strategies play a central role here. One of these strategies is information on demand: contextualizing information is not offered permanently, but only when the visitor retrieves it. This allows for an auratic presentation of objects as well as contextualization and the conveying of information. Another strategy is personalized devices, which also enable visitors to be addressed in a target-group-specific way. In addition, VR and AR technologies offer the possibility of presenting complex scientific relationships in a way that appeals to audiences.
This paper broadens the resource-based approach to explaining the survival of new technology-based firms (NTBFs) by focusing on the entrepreneur's ability to transform resources in response to triggers resulting from market interactions. Network theory is used to define a construct that allows determining the status of venture emergence (VE). The operationalization of the VE construct is built on the firm's value network maturity in the four market dimensions: customer, investor, partner, and human resources. Business plans of NTBFs represent the artifact that contains this data in the form of transaction relation descriptions. Using content analysis, a multi-step combined human and computer coding process has been developed to empirically determine NTBFs' status of VE. The results of the business plan analysis suggest that the level of transaction relations allows conclusions to be drawn on the status of VE. Moreover, applying the developed process, a business plan coding test shows that the transaction relation-based VE status significantly relates to NTBFs' survival capabilities.
In recent years, there has been a noticeable trend towards a general contractor strategy among plant engineering companies. Multiple disciplines and departments must be administered in a joint project. In the process, different work results are often managed in various systems without any associative relationship. A possible way to address this complexity is to implement a specifically tailored PLM strategy to gain a competitive advantage. Maturity models, as well as methods to evaluate possible benefits, are increasingly applied tools on this journey. Both methods have been theoretically described in previous publications; this paper, however, provides insights into their practical application within the machinery industry. A medium-sized German plant engineering company serves as an example for determining the scope and value of a multi-national, overarching Product Lifecycle Management architecture as the central piece of a future digitalization strategy. The company's current maturity levels for several digitalization capabilities are evaluated, prioritized, and benchmarked against a set of similar companies. This allows suitable target states in terms of maturity levels, as well as the technical specification of digitalization use cases, to be derived. To provide sound data for cost justification, the resulting benefits are quantified.
It is widely recognized that sustainability is a new challenge for many manufacturing companies. In this paper, we tackle this issue by presenting an approach that deals with material and substance compliance within Product Lifecycle Management (PLM) in a complex value chain. Our analysis explains why, how and when sustainable manufacturing arises, and it identifies, quantifies and evaluates the environmental impact of a new product. We propose (I) a Life Cycle Assessment (LCA) tool and (II) a model to validate this approach and evaluate the risk of non-compliance in the supply chain. Our LCA approach provides comprehensive information on the environmental impacts of a product.
Product and material cycles run in parallel and intersect, which makes it challenging to integrate the material selection process, and LCA in general, into Product Lifecycle Management. We provide only a foundation; further research in systems engineering is necessary. LCA is also sensitive to data quality: outsourcing production and problems in supplier cooperation can result in material mismatches (e.g., in properties or composition) during production, which may render LCA results misleading.
This paper also describes research challenges in risk-based due diligence.
We have analyzed a pool of 37,839 articles published in 4,404 business-related journals in the entrepreneurship research field using a novel literature review approach based on machine learning and text data mining. Most papers have been published in the journals ‘Small Business Economics’, ‘International Journal of Entrepreneurship and Small Business’, and ‘Sustainability’ (Switzerland), while the sum of citations is highest in the ‘Journal of Business Venturing’, ‘Entrepreneurship Theory and Practice’, and ‘Small Business Economics’. We derived 29 overarching themes based on 52 identified clusters. The social entrepreneurship, development, innovation, capital, and economy clusters are the largest among those with high thematic clarity. The most discussed clusters, measured by the average number of citations per assigned paper, are research, orientation, capital, gender, and growth. The clusters with the highest average growth in publications per year are social entrepreneurship, innovation, development, entrepreneurship education, and (business) models. Measured by the average yearly citation rate per paper, the thematic cluster ‘research’, mostly containing literature studies, received the most attention. The MLR allows for the inclusion of a significantly higher number of publications than traditional reviews, thus providing a comprehensive, descriptive overview of the whole research field.