This contribution presents a data compression scheme for applications in non-volatile flash memories. The objective of the data compression algorithm is to reduce the amount of user data so that the redundancy of the error correction coding can be increased in order to improve the reliability of the data storage system. The data compression is performed on block level, considering data blocks of 1 kilobyte. We present an encoder architecture that has low memory requirements and provides fast data encoding.
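The gatekeeping idea in this abstract (compress a block only when the saved space can be reinvested in ECC redundancy) can be sketched as follows. This is an illustrative sketch only: zlib stands in for the paper's dedicated low-memory encoder, and the 16-byte minimum gain is an assumed parameter.

```python
import zlib

BLOCK_SIZE = 1024  # 1 KiB data block, as in the abstract

def compress_block(block: bytes, min_gain: int = 16):
    """Try to compress a block; keep the result only if it frees at
    least `min_gain` bytes that could be reassigned to extra ECC
    redundancy. Returns (stored_payload, compressed_flag)."""
    assert len(block) == BLOCK_SIZE
    candidate = zlib.compress(block, 6)  # stand-in for the paper's encoder
    if len(candidate) <= BLOCK_SIZE - min_gain:
        return candidate, True
    return block, False  # incompressible: store verbatim, no extra ECC

# Usage: a highly regular block compresses, random-looking data does not.
payload, ok = compress_block(bytes(BLOCK_SIZE))
```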
A semilinear distributed parameter approach for solenoid valve control including saturation effects
(2015)
In this paper a semilinear parabolic PDE for the control of solenoid valves is presented. The distributed parameter model of the cylinder becomes nonlinear through the inclusion of saturation effects due to the material's B/H curve. A flatness-based solution of the semilinear PDE is shown, as well as a convergence proof of its series solution. Numerical simulation results demonstrate the applicability of the approach, and differences between the linear and the nonlinear case are discussed. The major contribution of this paper is the inclusion of saturation effects into the linear diffusion equation governing the magnetic field, and the development of a flatness-based solution for the resulting semilinear PDE as an extension of previous works [1] and [2].
We present a 3D laser scan simulation in virtual reality for creating synthetic scans of CAD models. Combining the virtual reality head-mounted display Oculus Rift with the motion controller Razer Hydra, our system can be used like common hand-held 3D laser scanners. It supports scanning of triangular meshes as well as B-spline tensor product surfaces based on high-performance ray-casting algorithms. While the point clouds produced by existing scanning simulations lack man-made structure, our approach overcomes this problem by imitating real scanning scenarios. Calculation speed, interactivity and the resulting realistic point clouds are the benefits of this system.
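The high-performance ray casting mentioned above reduces to ray/surface intersection tests. As a minimal illustration for the triangular-mesh case, here is the standard Möller-Trumbore ray/triangle intersection; the paper does not specify its actual algorithms, so this is a generic stand-in.

```python
def ray_triangle(origin, direction, v0, v1, v2, eps=1e-9):
    """Moeller-Trumbore ray/triangle intersection.
    Returns the distance t along the ray, or None on a miss."""
    def sub(a, b): return [a[i] - b[i] for i in range(3)]
    def dot(a, b): return sum(a[i] * b[i] for i in range(3))
    def cross(a, b):
        return [a[1]*b[2] - a[2]*b[1],
                a[2]*b[0] - a[0]*b[2],
                a[0]*b[1] - a[1]*b[0]]

    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(direction, e2)
    det = dot(e1, p)
    if abs(det) < eps:                 # ray parallel to triangle plane
        return None
    inv = 1.0 / det
    s = sub(origin, v0)
    u = dot(s, p) * inv                # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = dot(direction, q) * inv        # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, q) * inv               # hit distance along the ray
    return t if t > eps else None
```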
This work proposes an efficient hardware implementation of sequential stack decoding of binary block codes. The decoder can be applied to soft-input decoding of generalized concatenated (GC) codes. The GC codes are constructed from inner nested binary Bose-Chaudhuri-Hocquenghem (BCH) codes and outer Reed-Solomon (RS) codes. In order to enable soft-input decoding of the inner BCH block codes, a sequential stack decoding algorithm is used.
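A sequential stack decoder can be sketched as a best-first search over the code tree. The sketch below uses a tiny [7,4] Hamming code instead of the paper's nested BCH codes, and a simplified flip-penalty metric in place of a full Fano metric; both are assumptions made for brevity.

```python
import heapq
from itertools import product

# [7,4] Hamming code as a small stand-in for the paper's inner BCH codes.
G = [[1, 0, 0, 0, 1, 1, 0],
     [0, 1, 0, 0, 0, 1, 1],
     [0, 0, 1, 0, 1, 1, 1],
     [0, 0, 0, 1, 1, 0, 1]]

def codewords():
    for info in product([0, 1], repeat=4):
        yield tuple(sum(info[k] * G[k][j] for k in range(4)) % 2
                    for j in range(7))

# All valid codeword prefixes, defining the code tree to search.
PREFIXES = {cw[:i] for cw in codewords() for i in range(8)}

def stack_decode(soft):
    """Best-first (stack) search over the code tree.
    `soft` holds received reals; positive means 'bit 0 more likely'.
    Path cost = sum of |soft[i]| over bits contradicting the hard
    decision, so the first full-length pop is the ML codeword."""
    heap = [(0.0, ())]
    while heap:
        cost, prefix = heapq.heappop(heap)   # best node on the stack
        if len(prefix) == 7:
            return prefix
        i = len(prefix)
        hard = 0 if soft[i] >= 0 else 1
        for bit in (0, 1):
            ext = prefix + (bit,)
            if ext in PREFIXES:              # stay inside the code tree
                penalty = 0.0 if bit == hard else abs(soft[i])
                heapq.heappush(heap, (cost + penalty, ext))
    return None
```

Because the cost increments are non-negative, the search behaves like uniform-cost search, which is what makes the first completed path optimal.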
Domain-specific modelling is increasingly adopted in the software development industry. While open source metamodels like Ecore are widely used, they still have some problems. The independent storage of nodes (classes) and edges (references) is currently only possible with complex, specific solutions. Furthermore, the developed models are stored in the Extensible Markup Language (XML) data format, which leads to scaling problems with large models. In this paper we describe an approach that solves the problem of independent classes and references in metamodels, and we store the models in the JavaScript Object Notation (JSON) data format to support high scalability. First results of our tests show that the developed approach works and that classes and references can be defined independently. In addition, our approach reduces the number of characters per model by a factor of approximately two compared to Ecore. The entire project is made available as open source under the name MoDiGen. This paper focuses on the description of the metamodel definition with respect to scaling.
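The core storage idea, keeping classes and references as independent top-level collections in JSON, can be illustrated with a hypothetical schema; the element names below are invented for illustration and are not the actual MoDiGen format.

```python
import json

# Hypothetical sketch: classes (nodes) and references (edges) live in
# separate top-level collections, so either can be edited without
# touching the other, unlike references nested inside owning classes.
model = {
    "classes": [
        {"id": "c1", "name": "Library"},
        {"id": "c2", "name": "Book"},
    ],
    "references": [
        {"id": "r1", "name": "books", "source": "c1", "target": "c2"},
    ],
}

# Compact JSON serialization (no whitespace) keeps character counts low.
serialized = json.dumps(model, separators=(",", ":"))
restored = json.loads(serialized)
```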
Assessment Literacy
(2015)
Beyond the SDG compass
(2015)
CSR und Compliance
(2015)
Nowadays, the number of flexible and fast interactions between humans and application systems is increasing dramatically. For instance, citizens use the internet to organize surveys or meetings spontaneously and in real time. These interactions are supported by technologies and application systems such as free wireless networks and web or mobile apps. Smart Cities aim at enabling their citizens to use these digital services, e.g., by providing enhanced networks and application infrastructures maintained by the public administration. However, looking beyond technology, there is still a significant lack of interaction and support between "normal" citizens and the public administration. For instance, democratic decision processes (e.g., how to allocate disposable public budgets) are often discussed by the public administration without citizen involvement. This paper introduces an approach that describes the design of enhanced interactional web applications for Smart Cities based on dialogical logic process patterns. We demonstrate the approach with the help of a budgeting scenario and close with a summary and an outlook on further research.
The proposed approach applies current unsupervised clustering approaches in a different, dynamic manner. Instead of taking all the data as input and finding clusters among them, the given approach clusters Holter ECG data (long-term electrocardiography data from a Holter monitor) on a given interval, which enables a dynamic clustering approach (DCA). To this end, advanced clustering techniques based on the well-known Dynamic Time Warping algorithm are used. Given clusters, e.g., on a daily basis, they can be compared by defining cluster shape properties. This gives a measure for variation in unsupervised cluster shapes and may reveal previously unknown changes in health. Embedding this approach into wearable devices offers advantages over current techniques. On the one hand, users get feedback if the characteristics of their ECG data change unexpectedly over time, which makes early detection possible. On the other hand, cluster properties like the biggest or smallest cluster may help a doctor in making diagnoses or observing several patients. Further, known processing techniques like stress detection or arrhythmia classification may be applied to the found clusters.
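The clustering builds on the Dynamic Time Warping algorithm mentioned above; a minimal textbook implementation of the DTW distance between two ECG-like sequences looks like this:

```python
def dtw_distance(a, b):
    """Classic O(len(a) * len(b)) Dynamic Time Warping distance
    between two 1-D sequences, with |x - y| as the local cost."""
    INF = float("inf")
    n, m = len(a), len(b)
    # d[i][j] = cost of best warping path aligning a[:i] with b[:j]
    d = [[INF] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend by insertion, deletion, or match/substitution
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]
```

Because DTW tolerates local time shifts, two heartbeats of slightly different duration can still end up in the same cluster, which is exactly why it suits beat-to-beat ECG comparison.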
Conducting a surveillance impact assessment is the first step toward solving the "Who monitors the monitor?" problem. Since the impacts of surveillance on different dimensions of privacy and society are constantly changing, measuring compliance and impact through metrics can ensure that negative consequences are minimized to acceptable levels. To develop metrics systematically for surveillance impact assessment, we follow the top-down process of the Goal/Question/Metric paradigm: 1) establish goals through the social impact model, 2) generate questions through the dimensions of surveillance activities, and 3) develop metrics through the scales of measure. With respect to the three factors of impact magnitude (the strength, the immediacy, and the number of sources), we generate questions concerning surveillance activities (by whom, for whom, why, when, where, of what, and how) and develop metrics with the scales of measure: the nominal, ordinal, interval, and ratio scales. In addition to compliance assessment and impact assessment, the developed metrics have the potential to address the power imbalance problem through sousveillance, which employs surveillance to control and redirect the impact exposures.
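The resulting Goal/Question/Metric hierarchy can be pictured as a simple nested structure. The concrete goal, question texts and metric names below are hypothetical examples; only the dimension and scale vocabulary comes from the abstract.

```python
# Hypothetical GQM instance: one goal, questions over the surveillance
# dimensions ("by whom", "where", ...), metrics tagged with their scale.
SCALES = {"nominal", "ordinal", "interval", "ratio"}

gqm = {
    "goal": "Minimize the negative privacy impact of a CCTV deployment",
    "questions": [
        {
            "dimension": "by whom",
            "text": "Which organisation operates the cameras?",
            "metrics": [{"name": "operator type", "scale": "nominal"}],
        },
        {
            "dimension": "where",
            "text": "How many cameras cover the assessed area?",
            "metrics": [{"name": "camera count", "scale": "ratio"}],
        },
    ],
}

def well_formed(model):
    """Check that every metric uses one of the four scales of measure."""
    return all(m["scale"] in SCALES
               for q in model["questions"] for m in q["metrics"])
```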
Differences in the pitting resistance between cold worked CrNi and CrNiMnN metastable austenites
(2015)
Technical presentation at the conference CORROSION 2015, 15-19 March, Dallas, Texas, USA. NACE International
Digitale Transformation
(2015)
Technology commercialization is described as the most dreadful challenge for technology-based entrepreneurs. The scarcity of resources and limited managerial experience make it a daunting task, putting the whole firm emergence in danger. Prior research has often built upon the resource-based view to propose that the new firms' performance depends on their initial resource endowments and configurations. Nevertheless, little is known about how the early-stage decisions of the entrepreneur might influence the growth of the firm. Scholars have suggested that both technology and market orientation actions could influence the performance and growth of firms in this context; however, there is limited empirical evidence of the influence of these different orientations in the context of new technology-based firms (NTBFs). In this study we propose to explore the influence of technology and demand creation actions adopting a demand-side view. We use a longitudinal study on a panel dataset (2004-2007) with 249 U.S. new high-technology firms to test our hypotheses. The results point towards a rather limited influence of initial resource configurations, as well as an unexpected influence of market and technology orientation on the growth dimensions of an NTBF. The research holds implications for the management of new technology-based firms and for those interested in supporting the development of technology entrepreneurship.
Effect of cold working on the localized corrosion behavior of CrNi and CrNiMnN metastable austenites
(2015)
In this paper an approach towards data-based fault diagnosis of linear electromagnetic actuators is presented. Time-domain and time-frequency-domain methods were applied to extract fault-related features from current and voltage measurements. The resulting features were transformed to enhance class separability using either Principal Component Analysis (PCA) or Optimal Transformation. Feature selection and dimensionality reduction were performed employing a modified Fisher ratio. Fault detection was carried out using a Support Vector Machine classifier trained with randomly selected data subsets. Results showed that not only the used feature sets (time-domain/time-frequency-domain) are crucial for fault detection and classification, but also the feature pre-processing. PCA-transformed time-domain features allow fault detection and classification without misclassification, but they rely on current and voltage measurements, making two sensors necessary to generate the data. Optimally transformed time-frequency-domain features allow a misclassification-free result as well, but as they are calculated from current measurements only, a dedicated voltage sensor is not necessary. Using those features is a promising alternative even for detecting purely supply-voltage-related faults.
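The modified Fisher ratio used above for feature selection is not spelled out in the abstract; as a hedged illustration, the classic per-feature Fisher ratio (squared mean separation over pooled variance) on which such criteria are based can be computed like this:

```python
def fisher_ratio(class_a, class_b):
    """Per-feature Fisher ratio for a two-class problem: squared mean
    separation divided by the pooled variance. A stand-in for the
    paper's modified criterion; higher values mean better separability."""
    def stats(rows, k):
        vals = [r[k] for r in rows]
        mean = sum(vals) / len(vals)
        var = sum((x - mean) ** 2 for x in vals) / len(vals)
        return mean, var

    n_features = len(class_a[0])
    ratios = []
    for k in range(n_features):
        ma, va = stats(class_a, k)
        mb, vb = stats(class_b, k)
        ratios.append((ma - mb) ** 2 / (va + vb + 1e-12))  # guard /0
    return ratios
```

Features would then be ranked by this ratio and only the top-scoring ones passed to the classifier, which is the dimensionality-reduction step the abstract describes.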
Stress is recognized as a predominant disease factor, and the costs for its treatment will increase in the future. The presented approach tries to detect stress in a very basic and easy-to-implement way, so that the cost of the device and the effort to wear it remain low. The user should benefit from the fact that the system offers an easy interface reporting the status of his body in real time. In parallel, the system provides interfaces to pass the obtained data on for further processing and (professional) analyses, in case the user agrees. The system is designed to be used in everyday activities and is not restricted to laboratory environments. The implementation of the enhanced prototype shows that the detection of stress and the reporting can be managed using correlation plots and automatic pattern recognition even on a very lightweight microcontroller platform.
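The correlation-based detection mentioned above can be illustrated with plain Pearson correlation between two sensor streams; the skin-conductance channel and the 0.8 threshold are assumptions made for this sketch, not the paper's actual rule.

```python
def pearson(x, y):
    """Pearson correlation coefficient, the kind of lightweight
    statistic feasible on a small microcontroller platform."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def stress_flag(heart_rate, skin_conductance, threshold=0.8):
    # Hypothetical rule: flag stress when both signals rise together.
    return pearson(heart_rate, skin_conductance) > threshold
```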
Hot isostatic pressing (HIP) allows the production of components with complex geometry. Generally, a high quality of the components is achieved due to the well-managed composition of the metal powder and the isotropic properties. If a duplex stainless steel is produced, a heat treatment after the HIP process is necessary to remove precipitations like carbides, nitrides and intermetallic phases. In a new process, the sintering step should be combined with the heat treatment. In this case a high cooling rate is necessary to avoid precipitations in duplex stainless steels. In this work, the influence of the HIP temperature and the wall thickness on corrosion resistance, microstructure and impact strength was investigated. The results should help to optimize process parameters like temperature and cooling rate. For the investigation, two HIP temperatures were tested in a classical HIP process step with a defined cooling rate. An additional heat treatment was not conducted. The specimens were cut from different sectors of the HIP block. To investigate the corrosion resistance, the critical pitting temperature was determined with an electrochemical method according to EN ISO 17864. An impact test was used to determine the impact transition temperature. Metallographic investigations show the microstructure in the different sectors of the HIP block.
This paper compares the surface morphology of differently finished austenitic stainless steel AISI 316L, also in combination with low-temperature carburization. Milled and tumbled surfaces were analyzed with respect to corrosion resistance and surface morphology. The results of potentiodynamic measurements show that professional grinding operations with SiC and Al2O3 always lead to a better corrosion resistance of low-temperature carburized surfaces compared to the untreated reference in the acidified chloride solution used. The amount of plastic deformation during machining has a big influence on the corrosion resistance of vibratory-ground or tumbled surfaces and has to be kept low for austenitic stainless steels. Due to the high ductility, plastic deformation can lead to the formation of metastable pits that can be initiation points of corrosion. The formation of metastable pits can be aggravated by low-temperature diffusion processes.
In biomechanics laboratories the ground reaction force time histories of the footfall of persons are usually measured using a force plate. The accelerations of the floor in which the force plate is embedded have to be limited, as they may influence the accuracy of the force measurements. For the numerical simulation of vibrations induced by humans in biomechanical laboratories, loading scenarios are defined. They include continuous motions of persons (walking, running) as well as jumps, typical for biomechanical investigations on athletes. The modeling of floors has to take into account the influence of floor screed in the case of portable force plates. Criteria for the assessment of the measuring error provoked by floor vibrations are given. As an example, a floor designed to accommodate a force platform in a biomechanical laboratory of the University Hospital in Tübingen, Germany, has been investigated for footfall-induced vibrations. The numerical simulation by a finite element analysis has been validated by field measurements. As a result, the measuring error of the force plate installed in the laboratory is obtained for diverse scenarios.
Besides energy efficiency, product quality is gaining importance in the design of drying processes for sensitive biological foodstuffs. The influence of drying parameters on the drying kinetics of apples has been extensively investigated; the information about effects on product quality available in the literature, however, is often contradictory. Furthermore, quality changes obtained with different drying parameters are usually hard to compare. As most quality changes can be expressed as zero-, first- or second-order reactions and mainly depend on drying air temperature and drying time, it would be desirable to cross-check the results as a function thereof. This paper introduces a method of quality determination using a new reference value, the cumulated thermal load. It is defined as the time integral of the product surface temperature and improves the comparability of quality changes obtained under different experimental settings in the drying of apples and tomatoes. It could be shown that quality parameters like color changes and shrinkage during apple drying and the content of temperature-sensitive acids in tomatoes vary linearly with the integral of product temperature over time.
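The cumulated thermal load defined above, the time integral of the product surface temperature, can be approximated from sampled measurements with the trapezoidal rule; the sampling grid below is an illustrative assumption.

```python
def cumulated_thermal_load(times, surface_temps):
    """Cumulated thermal load: the time integral of the product
    surface temperature, approximated with the trapezoidal rule
    over sampled (time, temperature) pairs of equal length."""
    total = 0.0
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        total += 0.5 * (surface_temps[i] + surface_temps[i - 1]) * dt
    return total

# Usage: a constant 60 degC surface held for 2 h gives 120 degC*h.
load = cumulated_thermal_load([0.0, 1.0, 2.0], [60.0, 60.0, 60.0])
```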