Conventional wheel loaders powered by diesel engines impair the quality of life of people in their immediate surroundings through noise and pollutant emissions. The BMBF-funded research project "Emissionsarmer Elektroradlader" (low-emission electric wheel loader) aims to significantly reduce the local emissions of wheel loaders and to increase the efficiency of the vehicle. Within the project, a conventional wheel loader was converted to electric drives. A LiFeYPO4 battery designed for an operating time of four hours serves as the energy storage. In first practical investigations, the energy balance of the low-emission electric wheel loader was compared with that of the conventional production vehicle. For this purpose, a modified Y-cycle was designed, which is oriented toward the typical working tasks of the wheel loader and is characterized by high reproducibility. For a complete assessment, the entire energy conversion chain is considered, starting with the energy contained in the fuel or drawn from the power grid and ending with the mechanical work performed by the machine. From this, conclusions can be drawn about the differing CO2 emissions of the two vehicles.
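The final step of such an energy-chain comparison is plain arithmetic: energy input per test cycle multiplied by an emission factor. A minimal Python sketch with assumed, illustrative numbers (neither the consumption figures nor the emission factors are taken from the study):

```python
# Illustrative CO2 comparison along the two energy conversion chains.
# All numbers below are assumptions for this sketch, not study results.
diesel_litres = 8.0              # diesel burned per test cycle [l]
co2_per_litre = 2.65             # typical emission factor [kg CO2 / l diesel]
grid_energy_kwh = 12.0           # grid energy drawn per cycle [kWh]
co2_per_kwh = 0.4                # grid-mix dependent factor [kg CO2 / kWh]

co2_diesel = diesel_litres * co2_per_litre      # emissions of the diesel loader
co2_electric = grid_energy_kwh * co2_per_kwh    # emissions of the electric loader
```

The grid emission factor dominates the comparison, which is why the study traces the chain back to the energy drawn from the grid rather than to the battery terminals.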
The use of mobile devices (e.g., tablets, smartphones) opens up more and more possibilities for supporting the execution of business processes. For example, business process activities (e.g., approving an offer) can be processed regardless of location, which significantly reduces throughput times. However, the use of mobile apps is usually limited to supporting efficient and flexible interaction between the various executing roles. This article describes how mobile apps can support not only the execution but also the optimization of business processes. To this end, predefined quality criteria are captured context-dependently during the execution of activities. The data collected by traditional methods (e.g., measurement of key figures) are thus complemented by user feedback gathered in real time. The approach is demonstrated and evaluated using a mobile app developed for this purpose.
In biomechanics laboratories, the ground reaction force time histories of a person's footfall are usually measured using a force plate. The accelerations of the floor in which the force plate is embedded have to be limited, as they may influence the accuracy of the force measurements. For the numerical simulation of human-induced vibrations in biomechanics laboratories, loading scenarios are defined. They include continuous motions of persons (walking, running) as well as jumps typical of biomechanical investigations on athletes. In the case of portable force plates, the modeling of floors has to take the influence of the floor screed into account. Criteria for assessing the measuring error caused by floor vibrations are given. As an example, a floor designed to accommodate a force platform in a biomechanics laboratory of the University Hospital in Tübingen, Germany, has been investigated for footfall-induced vibrations. The numerical simulation by finite element analysis has been validated by field measurements. As a result, the measuring error of the force plate installed in the laboratory is obtained for diverse scenarios.
Tourist tracking
(2015)
Renders and joint mortars, especially in exposed masonry, fulfill essential technical, building-physical, and aesthetic functions for the facade and contribute substantially to the preservation of historical building fabric. Monuments frequently preserve evidence of historical renders and joint mortars as well as working methods and craft techniques that are no longer common or mastered today. Their preservation is therefore of particular importance. As the outermost "point of attack" of the building envelope, renders and joint mortars are freely exposed to the weather and, in part, to chemical, biological, and mechanical loads; in splash-water zones they are also stressed by de-icing salts. The damage resulting from these influences, as well as from aging and fatigue, is described, and options for the preservation and repair of the renders are presented.
This work investigates soft input decoding for generalized concatenated (GC) codes. The GC codes are constructed from inner nested binary Bose-Chaudhuri-Hocquenghem (BCH) codes and outer Reed-Solomon (RS) codes. In order to enable soft input decoding of the inner BCH block codes, a sequential stack decoding algorithm is used. Ordinary stack decoding of binary block codes requires the complete trellis of the code.
In this work, a representation of the block codes based on the trellises of supercodes is proposed in order to reduce the memory requirements for representing the BCH codes. Results for the decoding performance of the overall GC code are presented.
Furthermore, an efficient hardware implementation of the GC decoder is proposed.
Hot isostatic pressing (HIP) allows the production of components with complex geometries. Generally, a high quality of the components is achieved due to the well-managed composition of the metal powder and the non-isotropic properties. If a duplex stainless steel is produced, a heat treatment after the HIP process is necessary to remove precipitates such as carbides, nitrides, and intermetallic phases. In a new process, the sintering step is to be combined with the heat treatment. In this case, a high cooling rate is necessary to avoid precipitates in duplex stainless steels. In this work, the influence of the HIP temperature and the wall thickness on corrosion resistance, microstructure, and impact strength was investigated. The results should help to optimize process parameters such as temperature and cooling rate. For the investigation, two HIP temperatures were tested in a classical HIP process step with a defined cooling rate. No additional heat treatment was conducted. The specimens were cut from different sectors of the HIP block. To investigate the corrosion resistance, the critical pitting temperature was determined with an electrochemical method according to EN ISO 17864. An impact test was used to determine the impact transition temperature. Metallographic investigations show the microstructure in the different sectors of the HIP block.
Differences in the pitting resistance between cold worked CrNi and CrNiMnN metastable austenites
(2015)
Technical presentation at the CORROSION 2015 conference, 15-19 March 2015, Dallas, Texas, USA. NACE International.
Effect of cold working on the localized corrosion behavior of CrNi and CrNiMnN metastable austenites
(2015)
Probabilistic data association for tracking extended targets under clutter using random matrices
(2015)
The use of random matrices for tracking extended objects has received much attention in recent years. It is an efficient approach for tracking objects that give rise to more than one measurement per time step. In this paper, the concept of random matrices is used to track surface vessels using high-resolution automotive radar sensors. Since the radar also receives a large number of clutter measurements from the water, a generalized probabilistic data association filter is applied to the data association problem. Additionally, a modification of the filter update step is proposed to incorporate the Doppler velocity measurements. The presented tracking algorithm is validated using Monte Carlo simulations, and some performance results with real radar data are shown as well.
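A random-matrix filter maintains a Gaussian kinematic state alongside a matrix-valued extent estimate that absorbs the spread of each scan's measurements. The following numpy sketch shows one such update in strongly simplified form: a position-only measurement model and illustrative shapes are assumed, and the exact Bayesian recursion from the extended-target literature involves additional scaling terms omitted here.

```python
import numpy as np

def random_matrix_update(x, P, X, nu, Z):
    """One simplified random-matrix measurement update: the kinematic
    state x (with covariance P) is corrected with the centroid of the
    scan's measurements Z (k x d), while the extent estimate X (d x d,
    with nu degrees of freedom) accumulates the measurement spread."""
    k, d = Z.shape
    z_bar = Z.mean(axis=0)                      # scan centroid
    dZ = Z - z_bar
    Z_spread = dZ.T @ dZ                        # measurement scatter matrix
    H = np.hstack([np.eye(d), np.zeros((d, x.size - d))])  # position model
    S = H @ P @ H.T + X / k                     # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)              # Kalman gain
    innov = z_bar - H @ x
    x_new = x + K @ innov                       # kinematic update
    P_new = P - K @ S @ K.T
    X_new = X + np.outer(innov, innov) + Z_spread  # extent update
    nu_new = nu + k                             # confidence in extent grows
    return x_new, P_new, X_new, nu_new

# toy scan: three detections from one object (illustrative values)
x0 = np.array([0.0, 0.0, 1.0, 1.0])            # position + velocity
P0, X0, nu0 = np.eye(4), np.eye(2), 10
Z = np.array([[1.0, 0.0], [1.2, 0.1], [0.8, -0.1]])
x1, P1, X1, nu1 = random_matrix_update(x0, P0, X0, nu0, Z)
```

The clutter handling described in the abstract would wrap this update in a probabilistic data association step; that layer is omitted here.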
Small vessels or unmanned surface vehicles have only a limited amount of space and energy available. If these vessels require an active sensing collision avoidance system, it is often not possible to mount large sensor systems such as X-band radars. Thus, in this paper an energy-efficient automotive radar and a laser range sensor are evaluated for tracking surrounding vessels. For these targets, such sensors typically generate more than one detection per scan. Therefore, an extended target tracking problem has to be solved to estimate the state and extension of the vessels. In this paper, an extended version of the probabilistic data association filter that uses random matrices is applied. The performance of the tracking system using either radar or laser range data is demonstrated in real experiments.
Stress is recognized as a predominant factor in disease, and the costs for treatment will increase in the future. The presented approach tries to detect stress in a very basic and easy-to-implement way, so that the cost of the device and the effort of wearing it remain low. The user benefits from the fact that the system offers an easy interface reporting the status of his body in real time. In parallel, the system provides interfaces to pass the obtained data on for further processing and (professional) analysis, in case the user agrees. The system is designed to be used in everyday activities and is not restricted to laboratory use or environments. The implementation of the enhanced prototype shows that the detection of stress and the reporting can be managed using correlation plots and automatic pattern recognition even on a very lightweight microcontroller platform.
Digital cameras are subject to physical, electronic, and optical effects that result in errors and noise in the image. These effects include, for example, a temperature-dependent dark current, read noise, optical vignetting, and different sensitivities of individual pixels. The task of a radiometric calibration is to reduce these errors in the image and thus improve the quality of the overall application. In this work we present an algorithm for radiometric calibration based on Gaussian processes. Gaussian processes are a regression method widely used in machine learning that is particularly useful in our context. Gaussian process regression is then used to learn a temperature- and exposure-time-dependent mapping from observed gray-scale values to true light intensities for each pixel. Regression models based on the characteristics of single pixels suffer from excessively high runtime and thus are unsuitable for many practical applications. In contrast, a single regression model for an entire image with high spatial resolution leads to a low-quality radiometric calibration, which also limits its practical use. The proposed algorithm is predicated on a partitioning of the pixels such that each pixel partition can be represented by one single regression model without quality loss. Partitioning is done by extracting features from the characteristic of each pixel and using them for lexicographic sorting. Splitting the sorted data into partitions of equal size yields the final partitions, each of which is represented by the partition centers. An individual Gaussian process regression and model selection is done for each partition. Calibration is performed by interpolating the gray-scale value of each pixel with the regression model of the respective partition. The experimental comparison of the proposed approach to classical flat field calibration shows a consistently higher reconstruction quality for the same overall number of calibration frames.
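The partition-then-regress idea can be sketched with a plain numpy Gaussian process (posterior mean only). All data below are synthetic stand-ins, and the kernel parameters, feature dimensions, and the linear toy mapping are arbitrary assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(a, b, length_scale=50.0, var=1e4):
    """Squared-exponential kernel on 1-D inputs (parameters assumed)."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length_scale) ** 2)

# Synthetic per-pixel features (e.g. dark level, gain); purely illustrative.
n_pixels, n_partitions = 1000, 10
features = rng.normal(size=(n_pixels, 2))

# Partition the pixels: lexicographic sort on the feature vectors,
# then equal-size splits, as described in the abstract.
order = np.lexsort((features[:, 1], features[:, 0]))
partitions = np.array_split(order, n_partitions)

# One GP regression per partition; only the posterior-mean weights are
# precomputed here. Training pairs (gray value -> intensity) are toy data.
noise_var = 4.0
models = []
for part in partitions:
    gray = rng.uniform(0.0, 255.0, size=60)
    intensity = 0.9 * gray + rng.normal(scale=2.0, size=60)
    y_mean = intensity.mean()
    alpha = np.linalg.solve(rbf(gray, gray) + noise_var * np.eye(60),
                            intensity - y_mean)
    models.append((gray, alpha, y_mean))

def calibrate(gray_value, model):
    """GP posterior mean: maps an observed gray value to light intensity."""
    gray, alpha, y_mean = model
    return y_mean + rbf(np.array([gray_value]), gray) @ alpha

pred = float(calibrate(128.0, models[0])[0])
```

Fitting one model per partition instead of one per pixel is what keeps the runtime tractable: the expensive kernel solve is done `n_partitions` times rather than once per pixel.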
The term shadow IT describes systems that are developed and operated autonomously by business departments outside the responsibility of the IT department. Although this phenomenon is widespread, shadow IT long went unexamined in research and practice. For some time now, increased engagement with the phenomenon has been noticeable. However, there are still only few empirically grounded insights into shadow IT. This paper addresses that gap. Based on four case studies, the prevalence, use, and quality of shadow IT are determined. It turns out that shadow IT can be associated with considerable risks, while its quality lags noticeably behind. It is also apparent that shadow IT does not appear within the scope of companies' control systems. Measures such as raising awareness of shadow IT within the company and developing shadow IT toward a governed business-unit IT can effectively reduce the risks. In addition, companies' control systems must be adapted so that shadow IT becomes visible within them.
Shadow IT risk
(2015)
Nowadays, the number of flexible and fast interactions between humans and application systems is increasing dramatically. For instance, citizens use the internet to organize surveys or meetings spontaneously (in real time). These interactions are supported by technologies and application systems such as free wireless networks and web or mobile apps. Smart Cities aim at enabling their citizens to use these digital services, e.g., by providing enhanced network and application infrastructures maintained by the public administration. However, looking beyond technology, there is still a significant lack of interaction and support between "normal" citizens and the public administration. For instance, democratic decision processes (e.g., how to allocate public disposable budgets) are often discussed by the public administration without citizen involvement. This paper introduces an approach that describes the design of enhanced interactive web applications for Smart Cities based on dialogical logic process patterns. We demonstrate the approach with the help of a budgeting scenario and close with a summary and an outlook on further research.
Besides energy efficiency, product quality is gaining importance in the design of drying processes for sensitive biological foodstuffs. The influence of drying parameters on the drying kinetics of apples has been extensively investigated; the information about effects on product quality available in the literature, however, is often contradictory. Furthermore, quality changes obtained with different drying parameters are usually hard to compare. As most quality changes can be expressed as zero-, first-, or second-order reactions and mainly depend on drying air temperature and drying time, it would be desirable to cross-check the results as a function thereof. This paper introduces a method of quality determination using a new reference value, the cumulated thermal load. It is defined as the time integral of the product surface temperature and improves the comparability of quality changes obtained with different experimental settings in the drying of apples and tomatoes. It could be shown that quality parameters such as color changes and shrinkage during apple drying, and the content of temperature-sensitive acids in tomatoes, vary linearly with the integral of product temperature over time.
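Computing the cumulated thermal load from a sampled temperature record reduces to a numerical quadrature of temperature over time. A minimal sketch with assumed, illustrative drying data (not measurements from the study):

```python
import numpy as np

# Assumed drying record: sampling times and product surface temperatures.
t = np.array([0.0, 30.0, 60.0, 120.0, 240.0, 360.0])   # drying time [min]
T_surf = np.array([25.0, 48.0, 55.0, 58.0, 60.0, 60.0])  # surface temp [degC]

# Cumulated thermal load: trapezoidal time integral of surface temperature.
ctl = float(np.sum(0.5 * (T_surf[1:] + T_surf[:-1]) * np.diff(t)))  # [degC min]
```

Because the reference value integrates temperature over time, two drying runs with different temperature profiles but equal cumulated thermal load become directly comparable in terms of the reaction-kinetic quality changes the paper describes.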
Conducting a surveillance impact assessment is the first step toward solving the "Who monitors the monitor?" problem. Since the impacts of surveillance on different dimensions of privacy and society are constantly changing, measuring compliance and impact through metrics can ensure that negative consequences are minimized to acceptable levels. To develop metrics for surveillance impact assessment systematically, we follow the top-down process of the Goal/Question/Metric paradigm: 1) establish goals through the social impact model, 2) generate questions through the dimensions of surveillance activities, and 3) develop metrics through the scales of measure. With respect to the three factors of impact magnitude (the strength, the immediacy, and the number of sources), we generate questions concerning surveillance activities (by whom, for whom, why, when, where, of what, and how) and develop metrics on the nominal, ordinal, interval, and ratio scales of measure. In addition to compliance and impact assessment, the developed metrics have the potential to address the power imbalance problem through sousveillance, which employs surveillance to control and redirect impact exposures.
Technology-based ventures provide an important route for successful technology transfer [1], [2]. Their founders are supported in successful technology commercialization by innovation intermediaries [3]. Accordingly, the performance of an innovation system depends, at least to some extent, on the efficiency of these intermediaries in terms of the impact of their scarce resources on the survival and growth of technology-based ventures. To increase their efficiency, intermediaries typically optimize their "intake" by requesting a formal business plan, on which they base their selection as a hygiene factor [4]-[7]. Some scholars therefore argue that written business plans show significant distortion, being produced only to attract support from innovation intermediaries [6], [8]. Accordingly, they rarely serve these addressees as a source of information for analyzing the strengths and weaknesses of ventures in order to derive actionable conclusions and support ventures more effectively [9], [10]. Addressees search for different indicators in business plans for their evaluation [11]. The descriptions of these indicators show little empirical proof of the performance of technology-based ventures [8], [12]. This gap is addressed herein because, despite the lacking empirical insight, a written business plan is at the same time the most frequently produced artifact of early-stage technology ventures [10], [13]. This paper addresses the gap by conceptualizing the transaction relations described in a written business plan as a means of working around the inevitable inaccuracies and uncertainties that limit the explanatory abilities [14] of the snapshot model [10] presented by a business plan. Using qualitative content analysis, we derive from the descriptions of transaction relations in a written business plan valid indicators of the maturity of the venture's value network in different dimensions [15].
To this end, this paper presents the findings from a pre-study conducted on a sample of forty business plans drawn from an overall population of 800 business plans in a longitudinal sample from one of Europe's most active innovation systems, the State of Baden-Württemberg. Such findings may be used by innovation intermediaries to enhance their efficiency by enabling them not only to derive individual support strategies for business acceleration but also to analyze the impact of support measures by reliably monitoring the maturity progress of venture activities.
Assessment Literacy
(2015)
The dynamic load from people jumping is one of the governing human-induced actions in the serviceability verification of floors carrying vibration-sensitive equipment. It occurs, for example, in gait laboratories and other research facilities during biomechanical investigations in jump force diagnostics. The jump types typical in this context, namely the countermovement jump (a two-legged vertical jump with a preparatory downward movement) and the drop jump, were analyzed on the basis of more than 200 force-time measurements, and the corresponding idealized load models were derived from them. On this basis, a simplified verification procedure for estimating the vibration response of a floor was developed. As a result, the acceleration of the floor can be given as a function of the modal mass and the mode shape values at the load application point and at the output point.
In this paper, an approach to data-based fault diagnosis of linear electromagnetic actuators is presented. Time-domain and time-frequency-domain methods were applied to extract fault-related features from current and voltage measurements. The resulting features were transformed to enhance class separability using either Principal Component Analysis (PCA) or an optimal transformation. Feature selection and dimensionality reduction were performed employing a modified Fisher ratio. Fault detection was carried out using a support vector machine classifier trained with randomly selected data subsets. Results showed that not only the feature sets used (time-domain/time-frequency-domain) are crucial for fault detection and classification, but also the feature pre-processing. PCA-transformed time-domain features allow fault detection and classification without misclassification, but rely on current and voltage measurements, making two sensors necessary to generate the data. Optimally transformed time-frequency-domain features allow a misclassification-free result as well, but as they are calculated from current measurements only, a dedicated voltage sensor is not necessary. Using those features is a promising alternative even for detecting purely supply-voltage-related faults.
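The processing chain described above (feature transformation, Fisher-ratio selection, classifier training on random subsets) can be sketched on synthetic data. The linear SVM below is a minimal sub-gradient stand-in for the kernel SVM a real implementation might use, and all data, dimensions, and hyperparameters are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for extracted time-domain features: class 0 = healthy,
# class 1 = faulty. All numbers are illustrative, not actuator data.
X = np.vstack([rng.normal(0.0, 1.0, size=(100, 8)),
               rng.normal(1.5, 1.0, size=(100, 8))])
y = np.array([0] * 100 + [1] * 100)

# 1) PCA transformation to enhance class separability.
Xc = X - X.mean(axis=0)
evals, evecs = np.linalg.eigh(Xc.T @ Xc / len(Xc))
X_pca = Xc @ evecs[:, ::-1][:, :4]          # top four principal components

# 2) Feature selection with a Fisher-type ratio:
#    between-class over within-class variance, per component.
mu0, mu1 = X_pca[y == 0].mean(axis=0), X_pca[y == 1].mean(axis=0)
v0, v1 = X_pca[y == 0].var(axis=0), X_pca[y == 1].var(axis=0)
fisher = (mu0 - mu1) ** 2 / (v0 + v1)
keep = np.argsort(fisher)[::-1][:2]         # two most separating components

# 3) Linear SVM trained on a randomly selected subset
#    (sub-gradient descent on the regularized hinge loss).
def train_linear_svm(X, y_pm, epochs=300, lam=0.01, lr=0.1):
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        mask = y_pm * (X @ w + b) < 1       # margin violations
        w -= lr * (lam * w - (y_pm[mask] @ X[mask]) / len(y_pm))
        b -= lr * (-y_pm[mask].sum() / len(y_pm))
    return w, b

idx = rng.permutation(len(y))
train, test = idx[:150], idx[150:]
y_pm = np.where(y == 1, 1.0, -1.0)          # labels as +/-1 for the hinge loss
w, b = train_linear_svm(X_pca[train][:, keep], y_pm[train])
acc = np.mean(np.sign(X_pca[test][:, keep] @ w + b) == y_pm[test])
```

The point of the sketch is the pipeline order: the Fisher ratio is computed on the transformed features, so the transformation step (PCA here) directly determines which components survive selection.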
Beyond the SDG compass
(2015)
CSR und Compliance
(2015)
To evaluate the quality of a person's sleep, it is essential to identify the sleep stages and their durations. Currently, the gold standard in sleep analysis is overnight polysomnography (PSG), during which several signals such as the EEG (electroencephalogram), EOG (electrooculogram), EMG (electromyogram), ECG (electrocardiogram), SpO2 (blood oxygen saturation), and, for example, respiratory airflow and respiratory effort are recorded. These expensive and complex procedures, applied in sleep laboratories, are invasive and unfamiliar to the subjects, which may have an impact on the recorded data. These are the main reasons why low-cost home diagnostic systems are likely to be advantageous. Their aim is to reach a larger population by reducing the number of recorded parameters. Nowadays, many wearable devices promise to measure sleep quality using only ECG and body-movement signals. This work presents an Android application developed to prove the accuracy of an algorithm published in the sleep literature. The algorithm uses ECG and body-movement recordings to estimate sleep stages. The pre-recorded signals fed into the algorithm were taken from the PhysioNet online database. The obtained results were compared with those of the standard method used in PSG. The mean agreement ratios for the sleep stages REM, Wake, NREM-1, NREM-2, and NREM-3 were 38.1%, 14%, 16%, 75%, and 54.3%, respectively.
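Per-stage agreement ratios of this kind can be computed by comparing the two epoch-wise hypnograms stage by stage. A small sketch with made-up labels (toy data, not the study's recordings):

```python
import numpy as np

# Two epoch-wise hypnograms (30-second epochs): the PSG reference and the
# algorithm's estimate. Both sequences below are illustrative toy data.
reference = np.array(["Wake", "NREM-1", "NREM-2", "NREM-2", "NREM-3",
                      "NREM-3", "REM", "REM", "NREM-2", "Wake"])
estimate = np.array(["Wake", "NREM-2", "NREM-2", "NREM-2", "NREM-3",
                     "NREM-2", "REM", "NREM-1", "NREM-2", "NREM-1"])

# Per-stage agreement: for each reference stage, the fraction of its
# epochs that the algorithm labeled identically.
stages = ["REM", "Wake", "NREM-1", "NREM-2", "NREM-3"]
agreement = {s: float(np.mean(estimate[reference == s] == s)) for s in stages}
```

Note that this per-stage ratio conditions on the reference label, which is why stages that the algorithm rarely predicts (here Wake and NREM-1 in the study's results) can score far below overall accuracy.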
This paper compares the surface morphology of differently finished austenitic stainless steel AISI 316L, also in combination with low-temperature carburization. Milled and tumbled surfaces were analyzed in terms of corrosion resistance and surface morphology. The results of potentiodynamic measurements show that professional grinding operations with SiC and Al2O3 always lead to a better corrosion resistance of low-temperature carburized surfaces compared to the untreated reference in the acidified chloride solution used. The amount of plastic deformation during machining has a big influence on the corrosion resistance of vibratory-ground or tumbled surfaces and has to be kept low for austenitic stainless steels. Due to the high ductility, plastic deformation can lead to the formation of metastable pits that can be initiation points of corrosion. The formation of metastable pits can be aggravated by low-temperature diffusion processes.