The term "integrity" focuses on the relationship between individual action and adherence to rules and values. Grüninger/Wanzek emphasize that acting with integrity requires not blind rule-following but fulfillment of the underlying values. We speak of integrity when ethical values are consistent in individual thinking and acting, both at the personal and at the organizational level.
How do medium-sized companies with international business activities deal with compliance risks? How do they manage the specific regulatory-compliance challenges of growth markets that are classified as high-risk countries from a compliance perspective? And what concerns compliance officers in medium-sized companies? An application-oriented research project at the Konstanz Institut für Corporate Governance addressed these questions.
At present, the majority of the proposed Deep Learning (DL) methods provide point predictions without quantifying the model's uncertainty. However, quantifying the reliability of automated image analysis is essential, in particular in medicine, where physicians rely on the results for making critical treatment decisions. In this work, we provide an entire framework to diagnose ischemic stroke patients that incorporates Bayesian uncertainty into the analysis procedure. We present a Bayesian Convolutional Neural Network (CNN) yielding a probability for a stroke lesion on 2D Magnetic Resonance (MR) images with corresponding uncertainty information about the reliability of the prediction. For patient-level diagnoses, different aggregation methods are proposed and evaluated, which combine the individual image-level predictions. These methods take advantage of the uncertainty in the image predictions and report model uncertainty at the patient level. In a cohort of 511 patients, our Bayesian CNN achieved an accuracy of 95.33% at the image level, a significant improvement of 2% over a non-Bayesian counterpart. The best patient aggregation method yielded an accuracy of 95.89%. Integrating uncertainty information about image predictions into the aggregation models resulted in higher uncertainty measures for false patient classifications, which made it possible to filter out critical patient diagnoses that should be examined more closely by a medical doctor. We therefore recommend using Bayesian approaches not only for improved image-level prediction and uncertainty estimation but also for the detection of uncertain aggregations at the patient level.
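The uncertainty-aware patient-level aggregation described above can be sketched as follows. This is a minimal illustration, assuming Monte Carlo sampling (e.g. MC dropout) of image-level probabilities; the weighting scheme and all numbers are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical: T stochastic forward passes for each of N images of one
# patient give a (T, N) array of lesion probabilities.
mc_probs = rng.uniform(0.6, 0.9, size=(20, 8))   # simulated confident images

image_mean = mc_probs.mean(axis=0)               # image-level prediction
image_std = mc_probs.std(axis=0)                 # image-level uncertainty

# Uncertainty-weighted aggregation: more certain images contribute more.
weights = 1.0 / (image_std + 1e-6)
patient_prob = float(np.sum(weights * image_mean) / np.sum(weights))

# Patient-level uncertainty as binary predictive entropy (in bits).
patient_entropy = float(-(patient_prob * np.log2(patient_prob)
                          + (1 - patient_prob) * np.log2(1 - patient_prob)))

print(patient_prob, patient_entropy)
```

A high `patient_entropy` would flag the case for closer examination by a physician, which is the filtering idea the abstract describes.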
Insecurity Refactoring is a change to the internal structure of software that injects a vulnerability without changing the observable behavior in a normal use-case scenario. An implementation of Insecurity Refactoring is formally explained, which injects vulnerabilities into source-code projects using static code analysis. It creates learning examples with source-code patterns from known vulnerabilities.
Insecurity Refactoring is achieved by creating an Adversary Controlled Input Dataflow tree based on a Code Property Graph. The tree is used to find possible injection paths. Transforming these injection paths makes it possible to inject vulnerabilities. Inserting data-flow patterns introduces different code patterns from related Common Vulnerabilities and Exposures (CVE) reports. The approach is evaluated on 307 open-source projects. Additionally, insecurity-refactored projects are deployed in virtual machines to be used as learning examples. Different static code analysis tools, dynamic tools, and manual inspections are applied to the modified projects to confirm the presence of the vulnerabilities.
The results show that vulnerabilities can be injected in 8.1% of the open-source projects. Different inspected code patterns from CVE reports can be inserted using corresponding data-flow patterns. Furthermore, the results reveal that the injected vulnerabilities are useful for a small sample of attendees (n=16). Insecurity Refactoring is useful for automatically generating learning examples to improve software security training. It uses real projects as a basis, while the injected vulnerabilities stem from real CVE reports. This makes the injected vulnerabilities unique and realistic.
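The core idea, preserving observable behavior for benign inputs while opening an adversary-controlled dataflow path, can be illustrated with a hypothetical before/after pair; the function names and the SQL-injection pattern are illustrative, not taken from the evaluated projects.

```python
# Hypothetical illustration of the transformation Insecurity Refactoring
# performs: a sanitized dataflow is rewritten into a tainted one without
# changing behavior for benign inputs.

def query_safe(user_id: str) -> str:
    # Original pattern: input is validated before reaching the SQL sink.
    if not user_id.isdigit():
        raise ValueError("invalid id")
    return "SELECT * FROM users WHERE id = " + user_id

def query_refactored(user_id: str) -> str:
    # Injected pattern: validation removed along the adversary-controlled
    # dataflow path; benign numeric inputs still behave identically.
    return "SELECT * FROM users WHERE id = " + user_id

# Benign input: observable behavior is unchanged.
assert query_safe("42") == query_refactored("42")

# Malicious input: only the refactored variant lets the payload through.
payload = "0 OR 1=1"
try:
    query_safe(payload)
    reached_sink_safely = True
except ValueError:
    reached_sink_safely = False

print(reached_sink_safely, payload in query_refactored(payload))
```

In a learning-example deployment, a trainee would be asked to locate exactly this kind of silently weakened dataflow.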
This contribution examines whether external interventions, in the form of research and/or science communication, can act as a mediator for innovation in the tourism industry in times of crisis. Based on three case studies, it discusses to what extent the Corona crisis could represent a window of opportunity for innovative business models in tourism. The project results indicate that crises in general, and science communication in particular, can act as push factors promoting innovation. Although the project partners did develop innovations during the project period, their implementation was frequently postponed to an indefinite future. With the associated return to the status quo, most of the initiated innovations remained at a conceptual level. This points to an attitude-behavior gap with regard to the creation and implementation of innovations in times of crisis.
The corrosion resistance of stainless steels is strongly influenced by the condition of their surface. Surface quality includes the topography of the surface, the structure and composition of the passive layer, and the near-surface structure of the base material. These factors are influenced by final physical/chemical surface treatments. The presented work shows significantly lower corrosion resistance for mechanically machined specimens than for etched specimens. It also turns out that the rougher the surface, the lower the corrosion resistance. However, there is no general finding as to whether blasted or ground surfaces are more appropriate; the corrosion behavior instead depends on the process parameters and on the characteristics of the corrosive exposure. The results show that not only the surface roughness Ra influences the corrosion behavior but also the shape of the peaks and valleys produced by the surface treatments. Imperfections in the base material, such as sulfidic inclusions, weaken the passive layer and thereby decrease the corrosion resistance. By using special passivating techniques, the corrosion resistance of stainless steels can be raised above the level achieved by common passivation.
The massive use of patient data for the training of artificial intelligence algorithms is common nowadays in medicine. In this scientific work, a statistical analysis is performed of one of the most used datasets for training artificial intelligence models for the detection of sleep disorders: the Sleep Heart Health Study 2. The study focuses on determining whether the gender and age of the patients have an influence relevant enough to justify working with datasets differentiated by these variables when training artificial intelligence models.
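A group comparison of the kind described can be sketched with Welch's t-test; the data below are synthetic stand-ins for a sleep metric in two hypothetical subgroups, not values from the SHHS-2 dataset.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for a sleep metric (e.g. an apnea index) in two
# hypothetical subgroups; the real analysis would use the SHHS-2 variables.
group_a = rng.normal(loc=10.0, scale=4.0, size=200)   # e.g. younger patients
group_b = rng.normal(loc=14.0, scale=5.0, size=180)   # e.g. older patients

def welch_t(x, y):
    """Welch's t statistic and degrees of freedom for unequal variances."""
    vx, vy = x.var(ddof=1) / len(x), y.var(ddof=1) / len(y)
    t = (x.mean() - y.mean()) / np.sqrt(vx + vy)
    df = (vx + vy) ** 2 / (vx ** 2 / (len(x) - 1) + vy ** 2 / (len(y) - 1))
    return t, df

t_stat, dof = welch_t(group_a, group_b)
print(round(t_stat, 2), round(dof, 1))
```

A clearly significant statistic for a subgroup split would support training differentiated models for that variable.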
Mechanical properties after tensile testing were calculated and experimentally determined for bars produced by the Tempcore method, for the bar core, the bar surface, and the whole bar cross-section. Experiments and simulation showed that the deformations in the core and surface areas of a bar are equal, and that the structural parameters in the core area are therefore chiefly decisive for the initiation of necking in the surface area. The results showed that the resistance to failure of the martensitic surface layer has a smaller effect on the overall bar properties than previous investigations suggested. It is concluded that improving the quality of the core structure can help to lower the brittleness of the whole bar. It was also shown that the techniques used provide good agreement between the obtained results and the experimental data. Therefore, the additivity rule for structural components can be used successfully to determine whole-bar parameters, taking into account the thickness of the surface layer, which can be measured easily with a hardness sensor. This will considerably simplify the quality control of products in practice.
For German and European tourism, India has for decades been a destination of interest for cultural tourism and, increasingly, for health tourism. Special forms such as spiritual tourism, sustainable tourism, and dance tourism (Bollywood dance) have established themselves in niches or are beginning to outgrow them. With its 2002 campaign "Incredible India", India itself launched an internationally noted, self-ironic initiative to position itself as an "incredible" destination in foreign markets. This contrasts with a reality in India that features mass poverty, corruption, security problems, bureaucracy, and deficient infrastructure, as well as an impressive cultural and natural landscape, well-educated English-speaking people, ethnic diversity, and mysticism and spirituality. India defines itself, also as a tourist destination, through extremes (cf. Freyer & Thimm 2011: 261).
Traditional Western philosophy, cognitive science and traditional HCI frameworks approach the term digital and its implications with an implicit dualism (nature/culture, theory/practice, body/mind, human/machine). What lies between is a feature of our postmodern times, in which different states, conditions or positions merge and co-exist in a new, hybrid reality, a "continuous beta" (Mühlenbeck & Skibicki, 2007) version of becoming. Post-digitality involves the physical dimensions of spatio-temporal engagements. This new ontological paradigm reconceptualizes digital technology through the experience of the human body and its senses, thus emphasizing form-taking, situational engagement and practice rather than symbolic, disembodied rationality. This raises two questions in particular: how to encourage curiosity, playfulness, serendipity, emergence, discourse and collectivity? How to construct working methods without foregrounding and dividing the subject into an individual that already takes position? This paper briefly outlines the rhizomatic framework that I developed within my PhD research. This attempts to overcome two prevailing tendencies: first, the one-sided view of scientific approaches to knowledge acquisition and the purely application-oriented handling of materials, technologies and machines; second, the distanced perception of the world. In contrast, my work involves project-driven alchemic curiosity and doing research through artistic design practice. This means thinking through materials, technologies and machinic interactions. Now, at the end of this PhD journey, 10 interdisciplinary projects have emerged from this ontological queer-paradigm that is post-digital–crafting 4.0. Below I illustrate this approach and its outcomes.
Cities around the world are facing the implications of a changing climate as an increasingly pressing issue. The negative effects of climate change are already being felt today. Therefore, adaptation to these changes is a mission that every city must master. Leading practices worldwide demonstrate various urban efforts on climate change adaptation (CCA) which are already underway. Above all, the integration of climate data, remote sensing, and in situ data is key to a successful and measurable adaptation strategy. Furthermore, these data can act as a timely decision support tool for municipalities to develop an adaptation strategy, decide which actions to prioritize, and gain the necessary buy-in from local policymakers. The implementation of agile data workflows can facilitate the integration of climate data into climate-resilient urban planning. Due to local specificities, (supra)national, regional, and municipal policies and (by)laws, as well as geographic and related climatic differences worldwide, there is no single path to climate-resilient urban planning. Agile data workflows can support interdepartmental collaboration and therefore need to be integrated into existing management processes and government structures. Agile management, which has its origins in software development, can be a way to break down traditional management practices, such as static waterfall models and sluggish stage-gate processes, and enable the increased flexibility and agility required when urgent action is needed. This paper presents the findings of an empirical case study conducted in cooperation with the City of Constance in southern Germany, which is pursuing a transdisciplinary and trans-sectoral co-development approach to make management processes more agile in the context of climate change adaptation.
The aim is to present a possible way of integrating climate data into CCA planning by changing the management approach and implementing a toolbox for low-threshold access to climate data. The city administration, in collaboration with the University of Applied Sciences Constance, the Climate Service Center Germany (GERICS), and the University of Stuttgart, developed a co-creative and participatory project, CoKLIMAx, with the objective of integrating climate data into administrative processes in the form of a toolbox. One key element of CoKLIMAx is the involvement of the population, the city administration, and political decision-makers through targeted communication and regular feedback loops among all involved departments and stakeholder groups. Based on the results of a survey of 72 administrative staff members and a literature review on agile management in municipalities and city administrations, recommendations on a workflow and communication structure for cross-departmental strategies for resilient urban planning in the City of Constance were developed.
Perception processes play a relevant role in every phase of the tourism value chain. Unless they are repeat visitors, potential tourists can initially draw only on the image of a destination, an external picture shaped not only by tourism marketing but also by other perception processes: whenever the destination appears in a film, a news broadcast, or a book, this contributes to forming its image. Thus only part of pre-trip tourist perception is steered by tourism marketing. The image that potential travelers have of a destination may therefore have been built up over years, combining all the impressions of it they have gathered over the course of their lives. These inner images can vary individually but in many cases show collective commonalities (e.g., Paris is predominantly associated with romance, the South Seas with paradise, and Berlin with partying). During and after the trip, tourists seek confirmation of these inner images, which prove persistent even in the face of reality. Hennig (1997) aptly describes this kind of tourist perception with the term "imaginary geography", which therefore serves as the central point of reference of this contribution. The concept is applied to the case study of Seville. The aims of this contribution are therefore:
- To show the explanatory potential of Christoph Hennig's concept of "imaginary geography" for tourist perception processes
- To present the role of tourism intermediaries in classical and new travel media, using the case study of Seville
- To explain how "imaginary geographies" (and related approaches) relate to practice (here, the case study of Seville)
Ignorantia doctorum
(2022)
Since the turn of the millennium, many writing centers have been established at universities in the German-speaking world, in order to support students in academic writing. This essay argues for offering subject-anchored directive guidance and using scientific texts as a basis for model learning. It states that the rhetorical tradition is hardly taken into account in the writing centers. Five arguments for this ignorance are discussed and, if possible, dispelled: the antiquity argument, which considers rhetoric outdated; the orality argument, which understands rhetoric as irrelevant to writing; the moral argument, which condemns rhetoric as a tool for demagogues; the positivist argument, which criticizes rhetoric as unempirical; and the didactic argument, which rejects rhetoric as a rigid doctrine. The discussion shows, however, that the rhetorical tradition, with its normative power and centuries of teaching practice, is a treasure trove of writing didactics that holds many resources, such as well-founded assessment criteria for the quality, appropriateness, and usefulness of texts.
The transformation to Industry 4.0, generally seen as a solution to increasing market challenges, is forcing companies to radically change their way of thinking and to be open to new forms of cooperation. In this context, opening up the innovation process is widely seen as a necessity for meeting these challenges, especially for small and medium enterprises (SMEs). The aim of the study therefore is to analyze how cooperation can be characterized today, how this character has changed since the establishment of the term Industry 4.0 at the Hanover Fair in 2011, and which cooperation strategies have proven successful. The analysis consists of a quantitative secondary data analysis that includes country-specific data from 35 European countries for 2010 and 2016, collected by the European Commission and the OECD. The research, focusing on the secondary sector, shows that multinational enterprises (MNEs) still tend to cooperate more than SMEs, with a slight overall trend towards protectionism. Nevertheless, there is a clear tendency towards the opening-up of SMEs. In this regard, universities, competitors, and suppliers in particular have become increasingly attractive as cooperation partners for SMEs.
The goal of the presented project is to develop the concept of home eHealth centers for barrier-free and cross-border telemedicine. AAL technologies are already present on the market, but a gap remains to be closed before they can be used for ordinary patient needs. The general idea needs to be accompanied by new services, which should be brought together in order to provide full service coverage for the users. Sleep and stress were chosen as the predominant conditions for a detailed study within this project because of their widespread influence in the population. A scientific study of the available home devices for sleep analysis provided the basis for selecting appropriate devices. The first choice for the project implementation is the EMFIT QS+ device. This equipment forms part of a complete system that a telemedical home hospital can provide, at an appropriate level of precision and with communication to internal and/or external health services.
The project aims to develop a new material system of high-tensile stainless steel wires as net material, with environmentally compatible antifouling properties, for offshore fish farm cages. Current net materials made of textiles (polyamide) shall be partially replaced by high-strength stainless steel in order to obtain a more environmentally compatible system that withstands the more severe mechanical loads (waves, storms, predators such as sharks). A new antifouling strategy shall address current issues such as ecological damage (e.g. due to copper release), high maintenance costs (e.g. for cleaning), and reduced durability.
In modern fruit processing technology, non-destructive quality measuring techniques are sought for determining and controlling changes in the optical, structural, and chemical properties of the products. In this context, changes inside the product can be measured during processing. Especially for industrial use, fast, precise, but robust methods are particularly important to obtain high-quality products. In this work, a newly developed multi-spectral imaging system was implemented and adapted for drying processes. Further, it was investigated whether the system could be used to link changes in the surface spectral reflectance during mango drying with changes in moisture content and contents of chemical components. This was achieved by recovering the spectral reflectance from multi-spectral image data and comparing the spectral changes with changes of the total soluble solids (TSS), pH value, and the relative moisture content x_wb of the products. In a first step, the camera was modified to be used in drying; then the changes in the spectra and quality criteria during mango drying were measured. For this, mango slices were dried at air temperatures of 40–80 °C and relative air humidities of 5%–30%. Samples were analyzed and pictures were taken with the multi-spectral imaging system. The quality criteria were then predicted from spectral data. It could be shown that the newly developed multi-spectral imaging system can be used for quality control in fruit drying. There are strong indications as well that it can be employed for the prediction of chemical quality criteria of mangoes during drying. This way, quality changes can be monitored inline during the process using only one single measuring device.
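Predicting a quality criterion from recovered spectral reflectance can be sketched as an ordinary least-squares fit; the band count, coefficients, and data below are synthetic assumptions for illustration, not the calibrated model of the study.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic sketch: predict relative moisture content from reflectance in a
# few spectral bands; a linear relation is assumed for illustration.
n_samples, n_bands = 60, 5
reflectance = rng.uniform(0.1, 0.9, size=(n_samples, n_bands))
true_coef = np.array([0.8, -0.5, 0.3, 0.1, -0.2])
moisture = reflectance @ true_coef + 0.4 + rng.normal(0, 0.01, n_samples)

# Fit with an intercept column via least squares.
X = np.hstack([reflectance, np.ones((n_samples, 1))])
coef, *_ = np.linalg.lstsq(X, moisture, rcond=None)

predicted = X @ coef
rmse = float(np.sqrt(np.mean((predicted - moisture) ** 2)))
print(rmse)
```

In practice, a calibration like this would be validated on held-out samples before being used for inline monitoring.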
To evaluate the quality of sleep, it is important to determine how much time was spent in each sleep stage during the night. The gold standard in this domain is overnight polysomnography (PSG). However, recording the necessary electrophysiological signals is extensive and complex, and the environment of the sleep laboratory, which is unfamiliar to the patient, might lead to distorted results. In this paper, a sleep stage detection algorithm is proposed that uses only the heart rate signal, derived from the electrocardiogram (ECG), as a discriminator. This would make it possible to perform sleep analysis at home, saving a lot of effort and money. From the heart rate, three parameters were calculated using the fast Fourier transformation (FFT) in order to distinguish between the different sleep stages. ECG data along with a hypnogram scored by professionals was taken from the PhysioNet database, making it easy to compare the results. With an agreement rate of 41.3%, this approach is a good foundation for future research.
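FFT-based features of the kind described can be sketched as band powers of a heart-rate signal; the abstract does not specify the three parameters, so the common LF/HF split below is an assumption, and the signal is simulated.

```python
import numpy as np

fs = 4.0                                    # resampled heart-rate signal, Hz
t = np.arange(0, 300, 1 / fs)               # 5 minutes
# Simulated heart rate: baseline + slow (LF) and respiratory (HF) oscillation
hr = 60 + 3 * np.sin(2 * np.pi * 0.08 * t) + 1 * np.sin(2 * np.pi * 0.25 * t)

# Power spectrum of the mean-removed signal.
spectrum = np.abs(np.fft.rfft(hr - hr.mean())) ** 2
freqs = np.fft.rfftfreq(len(hr), d=1 / fs)

# Band powers in the conventional LF and HF heart-rate-variability bands.
lf_power = spectrum[(freqs >= 0.04) & (freqs < 0.15)].sum()
hf_power = spectrum[(freqs >= 0.15) & (freqs < 0.40)].sum()
lf_hf_ratio = float(lf_power / hf_power)
print(lf_hf_ratio)
```

Features such as this ratio, computed per epoch, could then feed a per-epoch sleep-stage classifier.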
Accurate monitoring of a patient's heart rate is a key element in medical observation and health monitoring. In particular, its importance extends to the identification of sleep-related disorders. Various methods have been established that involve sensor-based recording of physiological signals followed by automated examination and analysis. This study evaluates the efficacy of a non-invasive heart-rate monitoring framework based on an accelerometer sensor, specifically during sleep. To achieve this goal, the motion induced by thoracic movements during cardiac contractions is captured by a device installed under the mattress. Signal filtering techniques and heart rate estimation using the symlets6 wavelet are part of the implemented computational framework described in this article. Subsequent analysis indicates the potential applicability of this system in the prognostic domain, with an average error margin of approximately 3 beats per minute. The results obtained represent a promising advancement in non-invasive heart rate monitoring during sleep, with potential implications for improved diagnosis and management of cardiovascular and sleep-related disorders.
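The beat-interval idea can be sketched on a simulated under-mattress motion signal; note that the framework's symlets6 wavelet filtering is replaced here by a simple moving-average detrend so the example stays dependency-free, and all signal parameters are assumptions.

```python
import numpy as np

fs = 100.0
t = np.arange(0, 30, 1 / fs)
true_bpm = 72.0
# Simulated thoracic-motion signal: one narrow pulse per heartbeat plus a
# slow respiratory-like drift.
phase = (t * true_bpm / 60.0) % 1.0
signal = np.exp(-((phase - 0.5) ** 2) / 0.001) + 0.2 * np.sin(2 * np.pi * 0.3 * t)

# Detrend with a moving average (stand-in for wavelet filtering), then pick
# local maxima above a threshold as heartbeat events.
kernel = np.ones(25) / 25
detrended = signal - np.convolve(signal, kernel, mode="same")
is_peak = ((detrended[1:-1] > detrended[:-2])
           & (detrended[1:-1] > detrended[2:])
           & (detrended[1:-1] > 0.5))
peak_times = t[1:-1][is_peak]

# Heart rate from the median inter-beat interval.
estimated_bpm = 60.0 / float(np.median(np.diff(peak_times)))
print(round(estimated_bpm, 1))
```

The median inter-beat interval makes the estimate robust against an occasional missed or spurious peak.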
The goal of the research project "Ekont" is to develop a hand-guided device for concrete removal at inner edges and interference points in nuclear power plants (NPPs). To reduce the reaction forces, the novel approach of a counter-rotating milling process is investigated. The result is a gear solution in which a middle milling disc rotates at approximately the same circumferential speed in the direction opposite to that of the other milling discs.
Flamenco and tango are among the first associations connected with Seville and Buenos Aires. Since the emergence of both art forms in the 19th century, a dance tourism has developed continuously, leading to differentiated business models that persist to this day. As intangible world cultural heritage, flamenco and tango have moreover gained considerable importance for the destination image of the two cities. The model of Gereffi et al. (2005) on the governance of global value chains is applied in adapted form to flamenco and tango dance tourism in order to work out dimensions of significance in destination management.
This letter proposes two contributions to improve the performance of transmission with generalized multistream spatial modulation (SM). In particular, a modified suboptimal detection algorithm based on the Gaussian approximation method is proposed. The proposed modifications reduce the complexity of the Gaussian approximation method and improve the performance for high signal-to-noise ratios. Furthermore, this letter introduces signal constellations based on Hurwitz integers, i.e., a 4-D lattice. Simulation results demonstrate that these signal constellations are beneficial for generalized SM with two active antennas.
In this letter, we present an approach to building a new generalized multistream spatial modulation system (GMSM), where the information is conveyed by the two active antennas with signal indices and using all possible active antenna combinations. The signal constellations associated with these antennas may have different sizes. In addition, four-dimensional hybrid frequency-phase modulated signals are utilized in GMSM. Examples of GMSM systems are given and computer simulation results are presented for transmission over Rayleigh and deep Nakagami-m flat-fading channels when maximum-likelihood detection is used. The presented results indicate a significant improvement of characteristics compared to the best-known similar systems.
Generalized Concatenated Codes over Gaussian and Eisenstein Integers for Code-Based Cryptography
(2021)
The code-based McEliece and Niederreiter cryptosystems are promising candidates for post-quantum public-key encryption. Recently, q-ary concatenated codes over Gaussian integers were proposed for the McEliece cryptosystem together with the one-Mannheim error channel, where the error values are limited to Mannheim weight one. Due to the limited error values, the codes over Gaussian integers achieve a higher error correction capability than maximum distance separable (MDS) codes with bounded minimum distance decoding. This higher error correction capability improves the work factor regarding decoding attacks based on information-set decoding. The codes also enable a low-complexity decoding algorithm for decoding beyond the guaranteed error correction capability. In this work, we extend this coding scheme to codes over Eisenstein integers. These codes have advantages for the Niederreiter system. Additionally, we propose an improved code construction based on generalized concatenated codes. These codes extend the rate region in which the work factor is beneficial compared to MDS codes. Moreover, generalized concatenated codes are more robust against structural attacks than ordinary concatenated codes.
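The arithmetic behind the one-Mannheim error channel can be illustrated with a small Gaussian-integer example; the prime π = 2 + 3j (norm 13) is an illustrative choice, not a parameter from the paper.

```python
# Gaussian-integer residue arithmetic: reduce modulo pi = 2+3j, whose norm
# pi * conj(pi) = 13 gives a field with 13 residue classes.
pi = 2 + 3j
p = int((pi * pi.conjugate()).real)          # 13

def mod_gauss(x: complex, pi: complex) -> complex:
    """Reduce a Gaussian integer modulo pi (nearest-rounding quotient)."""
    norm = (pi * pi.conjugate()).real
    q = x * pi.conjugate() / norm
    q_rounded = complex(round(q.real), round(q.imag))
    return x - q_rounded * pi

# The 13 ordinary integers 0..12 map onto 13 distinct small-norm residues.
residues = {mod_gauss(complex(k, 0), pi) for k in range(p)}

# Mannheim-weight-one error values are the four units +-1 and +-i; they form
# the only error values allowed on the one-Mannheim error channel.
units = [1 + 0j, -1 + 0j, 1j, -1j]
unit_residues = {mod_gauss(u, pi) for u in units}

print(len(residues), len(unit_residues))
```

Restricting errors to these four unit values is what allows correcting more errors than the bounded-minimum-distance bound of MDS codes would suggest.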
For more quality in construction
(2018)
Compliance works when it reaches people's minds, and when it adaptively takes up the topics of the day.
A survey of executives by the communications agency A&B One, supported by the Zentrum für Wirtschaftsethik (ZfW), paints a current picture of the important target group "management". The contribution outlines how leaders at all levels view the much-debated topic of corporate responsibility, the effectiveness of compliance measures, and compliance risks in the home office. It offers impulses for practice: from synergies between purpose and integrity to fostering rule awareness in (corona-induced) remote working.
In this article, the collection of classes of matrices presented in [J. Garloff, M. Adm, and J. Titi, A survey of classes of matrices possessing the interval property and related properties, Reliab. Comput. 22:1-14, 2016] is continued. That is, given an interval of matrices with respect to a certain partial order, it is desired to know whether a special property of the entire matrix interval can be inferred from some of its element matrices lying on the vertices of the matrix interval. The interval property of some matrix classes found in the literature is presented, and the interval property of further matrix classes, including the ultrametric, the conditionally positive semidefinite, and the infinitely divisible matrices, is given for the first time. For the inverse M-matrices, the cardinality of the required set of vertex matrices known so far is significantly reduced.
Further applications of the Cauchon algorithm to rank determination and bidiagonal factorization
(2018)
For a class of matrices connected with Cauchon diagrams, Cauchon matrices, and the Cauchon algorithm, a method for determining the rank, and for checking a set of consecutive row (or column) vectors for linear independence is presented. Cauchon diagrams are also linked to the elementary bidiagonal factorization of a matrix and to certain types of rank conditions associated with submatrices called descending rank conditions.
In this article, we give the construction of new four-dimensional signal constellations in Euclidean space, which represent a certain combination of binary frequency-shift keying (BFSK) and M-ary amplitude-phase-shift keying (MAPSK). A description of such signals and formulas for calculating the minimum squared Euclidean distance are presented. We have developed an analytical construction method for even and odd values of M; hence, no computer search and no heuristic methods are required. The new optimized BFSK-MAPSK (M = 5, 6, …, 16) signal constructions are built for modulation indices h = 0.1, 0.15, …, 0.5 and their parameters are given. The results of computer simulations are also provided. Based on the obtained results, we conclude that BFSK-MAPSK systems outperform similar four-dimensional systems both in terms of minimum squared Euclidean distance and simulated symbol error rate.
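The minimum squared Euclidean distance figure of merit can be computed generically, as below for a simple stand-in constellation (orthogonal BFSK, i.e. two orthogonal 2-D planes, combined with 4-PSK in each plane); the optimized BFSK-MAPSK constructions of the article are not reproduced here.

```python
import numpy as np
from itertools import product

def min_squared_distance(points: np.ndarray) -> float:
    """Minimum squared Euclidean distance over all distinct point pairs."""
    n = len(points)
    return min(float(np.sum((points[i] - points[j]) ** 2))
               for i in range(n) for j in range(i + 1, n))

# Illustrative 4-D constellation: frequency bit selects one of two orthogonal
# 2-D planes, phase symbol selects a 4-PSK point within that plane.
M = 4
points = []
for f, k in product(range(2), range(M)):
    s = np.zeros(4)
    s[2 * f] = np.cos(2 * np.pi * k / M)
    s[2 * f + 1] = np.sin(2 * np.pi * k / M)
    points.append(s)
points = np.array(points)           # 8 unit-energy signals

print(min_squared_distance(points))
```

For this toy set, both the in-plane 4-PSK pairs and the cross-plane pairs reach the same squared distance of 2 at unit symbol energy, which is the kind of quantity the optimized constructions maximize.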
Four-Dimensional Hurwitz Signal Constellations, Set Partitioning, Detection, and Multilevel Coding
(2021)
The Hurwitz lattice provides the densest four-dimensional packing. This fact has motivated research on four-dimensional Hurwitz signal constellations for optical and wireless communications. This work presents a new algebraic construction of finite sets of Hurwitz integers that is inherently accompanied by a respective modulo operation. These signal constellations are investigated for transmission over the additive white Gaussian noise (AWGN) channel. It is shown that these signal constellations have a better constellation figure of merit and hence a better asymptotic performance over an AWGN channel when compared with conventional signal constellations with algebraic structure, e.g., two-dimensional Gaussian-integer constellations or four-dimensional Lipschitz-integer constellations. We introduce two concepts for set partitioning of the Hurwitz integers. The first method is useful to reduce the computational complexity of the symbol detection. This suboptimum detection approach achieves near-maximum-likelihood performance. In the second case, the partitioning exploits the algebraic structure of the Hurwitz signal constellations. We partition the Hurwitz integers into additive subgroups in a manner that the minimum Euclidean distance of each subgroup is larger than in the original set. This enables multilevel code constructions for the new signal constellations.
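The set-partitioning principle described above, splitting a constellation into additive cosets whose minimum distance exceeds that of the full set, can be illustrated on a toy constellation; the 16 points of {0,1}^4 stand in for a Hurwitz constellation purely for simplicity.

```python
import numpy as np
from itertools import product

# Toy 4-D constellation: all 16 binary points of {0,1}^4.
points = np.array(list(product([0, 1], repeat=4)))

def min_sq_dist(pts):
    """Minimum squared Euclidean distance over all distinct point pairs."""
    return min(int(np.sum((a - b) ** 2))
               for i, a in enumerate(pts) for b in pts[i + 1:])

# Partition by parity of the coordinate sum (cosets of the checkerboard
# sublattice): each subset keeps 8 points but doubles the squared distance.
even = points[points.sum(axis=1) % 2 == 0]
odd = points[points.sum(axis=1) % 2 == 1]

print(min_sq_dist(points), min_sq_dist(even), min_sq_dist(odd))
```

The enlarged intra-subset distance is exactly what multilevel coding exploits: a weaker code protects the subset label while the points inside each subset are already further apart.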
FishNet
(2016)
This article analyses Finnish tango dance tourism on the interconnected levels of body, culture, and space, drawing on the concept of embodied space (Low 2003) and Lefebvre's (1991) understanding of space. Particular attention is paid to the Finnish "culture of silence" in this context. Methodologically, the study draws on expert interviews, participant observation, and the analysis of film material. The results reveal manifold interactions between body, culture, and space, which also point to potentials for Finnish tango dance tourism.
The few literature references on sorption isotherms of mineral screeds essentially concern calcium sulfate screeds and standardized cement screeds, and as a rule refer only to a single fixed air temperature (20 °C). The aim of the investigation described in this article was therefore to characterize the moisture properties of screeds under different climates by means of sorption isotherms. In addition, the ternary rapid cements that have been commercially available for about 20 years were included, as were the temperatures of 15 °C and 25 °C, which are of practical interest on site. The effects of the climatic conditions on the construction site (season, humidity, temperature) on the hydration process of the screeds were also examined. In combination with the results of microstructural investigations (including mercury intrusion porosimetry), it is shown why cement-based and rapid-cement-based screeds behave completely differently from calcium-sulfate-based screeds. This difference in behaviour is also one of the reasons why the moisture content of screeds cannot be assessed with the KRL method. The article therefore concludes with a comparison of the "KRL method" of material moisture measurement with the "CM method", which is common in the trade and has proven itself in practice for decades.
This article discusses the role of written error correction in German courses taken before and alongside degree programmes. It provides an overview of the state of research and presents a study on error correction with advanced learners, which examined the effect of correcting errors directly in the learner's text. While error correction led to a reduction of errors in the revised text for the experimental group, it did not lead to a significant improvement in a new text compared with the control group. To find out how the learners dealt with the error correction, the study also included a qualitative analysis of individual papers. The article closes with the question of what role a (not particularly effective) error correction should play in the teaching of academic language.
Twenty-first century infrastructure needs to respond to changing demographics, become climate neutral, resilient, and economically affordable, while remaining a driver for development and shared prosperity. However, the infrastructure sector remains one of the least innovative and least digitalized, plagued by delays, cost overruns, and benefit shortfalls. The authors assessed trends and barriers in the planning and delivery of infrastructure based on secondary research, qualitative interviews with internationally leading experts, and expert workshops. The analysis concludes that the root cause of the industry's problems is the prevailing fragmentation of the infrastructure value chain and the lack of a long-term vision for infrastructure. To help overcome these challenges, an integration of the value chain is needed. The authors propose that this could be achieved through a use-case-based as well as vision- and governance-driven creation of federated digital platforms applied to infrastructure projects, and outline such a concept. Digital platforms enable full-lifecycle participation and responsible governance guided by a shared infrastructure vision. This paper contributed as a policy recommendation to the Group of Twenty (G20) in 2021.
In this paper, a novel feature-based sampling strategy for nonlinear Model Predictive Path Integral (MPPI) control is presented. Using the MPPI approach, the optimal feedback control is calculated by solving a stochastic optimal control problem (OCP) online, evaluating the weighted inference of sampled stochastic trajectories. While the MPPI algorithm can be excellently parallelized, the closed-loop performance strongly depends on the information quality of the sampled trajectories. To draw samples, a proposal density is used. The solver's, and thus the controller's, performance is high if the sampled trajectories drawn from this proposal density are located in low-cost regions of the state-space. In classical MPPI control, the explored state-space is strongly constrained by assumptions that refer to the control value's covariance matrix, which are necessary for transforming the stochastic Hamilton-Jacobi-Bellman (HJB) equation into a linear second-order partial differential equation. To achieve excellent performance even with discontinuous cost functions, in this novel approach, knowledge-based features are introduced to constitute the proposal density and thus the low-cost region of state-space for exploration. This paper addresses the question of how the performance of the MPPI algorithm can be improved using a feature-based mixture of base densities. Furthermore, the developed algorithm is applied to an autonomous vessel that follows a track and concurrently avoids collisions using an emergency braking feature. Therefore, the presented feature-based MPPI algorithm is applied and analyzed in both simulation and full-scale experiments.
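The weighted-inference step of MPPI can be sketched for a toy scalar integrator with a quadratic tracking cost and Gaussian sampling around a nominal control sequence. These model details and all parameter values are assumptions of the sketch; the paper's feature-based proposal density is not reproduced here:

```python
import math
import random

def mppi_step(x0, u_nom, lam=1.0, sigma=0.5, n_samples=200, target=1.0):
    """One MPPI update for the toy system x_{t+1} = x_t + u_t: sample
    perturbed control sequences, weight each rollout by exp(-cost/lambda),
    and return the weighted-average control sequence."""
    random.seed(0)  # fixed seed so the sketch is reproducible
    horizon = len(u_nom)
    samples, costs = [], []
    for _ in range(n_samples):
        u = [un + random.gauss(0.0, sigma) for un in u_nom]
        x, cost = x0, 0.0
        for ut in u:
            x += ut
            cost += (x - target) ** 2 + 0.01 * ut ** 2  # tracking + control effort
        samples.append(u)
        costs.append(cost)
    beta = min(costs)  # shift exponent for numerical stability
    w = [math.exp(-(c - beta) / lam) for c in costs]
    z = sum(w)
    return [sum(wk * u[t] for wk, u in zip(w, samples)) / z
            for t in range(horizon)]

u_star = mppi_step(0.0, [0.0] * 5)
```

Rolling out `u_star` drives the state toward the target; in the full algorithm this update runs in closed loop with a receding horizon.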
Tests for speeding up the determination of the Bernstein enclosure of the range of a multivariate polynomial and a rational function over a box and a simplex are presented. In the polynomial case, this enclosure is the interval spanned by the minimum and the maximum of the Bernstein coefficients which are the coefficients of the polynomial with respect to the tensorial or simplicial Bernstein basis. The methods exploit monotonicity properties of the Bernstein coefficients of monomials as well as a recently developed matrix method for the computation of the Bernstein coefficients of a polynomial over a box.
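For the univariate case over [0, 1], the Bernstein coefficients follow directly from the power-basis coefficients via the standard conversion b_i = sum_{j<=i} C(i,j)/C(n,j) a_j, and the enclosure is their min/max. This is a simplification for illustration; the paper's tensorial/simplicial and matrix-based multivariate variants are not reproduced:

```python
from math import comb

def bernstein_enclosure(coeffs):
    """Enclosure of the range of p(x) = sum(a_j x^j) over [0, 1] via the
    minimum and maximum of its Bernstein coefficients (univariate case)."""
    n = len(coeffs) - 1
    b = [
        sum(comb(i, j) / comb(n, j) * coeffs[j] for j in range(i + 1))
        for i in range(n + 1)
    ]
    return min(b), max(b)

# p(x) = x^2 - x has range [-0.25, 0] on [0, 1]; the enclosure contains it:
print(bernstein_enclosure([0, -1, 1]))  # -> (-0.5, 0.0)
```

For affine polynomials the enclosure is exact; for higher degrees it converges to the true range under degree elevation or subdivision.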
Image novelty detection is a repeating task in computer vision and describes the detection of anomalous images based on a training dataset consisting solely of normal reference data. It has been found that, in particular, neural networks are well-suited for the task. Our approach first transforms the training and test images into ensembles of patches, which enables the assessment of mean-shifts between normal data and outliers. As mean-shifts are only detectable when the outlier ensemble and inlier distribution are spatially separate from each other, a rich feature space, such as a pre-trained neural network, needs to be chosen to represent the extracted patches. For mean-shift estimation, the Hotelling T2 test is used. The size of the patches turned out to be a crucial hyperparameter that needs additional domain knowledge about the spatial size of the expected anomalies (local vs. global). This also affects model selection and the chosen feature space, as commonly used Convolutional Neural Networks or Vision Image Transformers have very different receptive field sizes. To showcase the state-of-the-art capabilities of our approach, we compare results with classical and deep learning methods on the popular dataset CIFAR-10, and demonstrate its real-world applicability in a large-scale industrial inspection scenario using the MVTec dataset. Because of the inexpensive design, our method can be implemented by a single additional 2D-convolution and pooling layer and allows particularly fast prediction times while being very data-efficient.
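The mean-shift assessment can be sketched with the two-sample Hotelling T^2 statistic in two dimensions, a toy stand-in for the high-dimensional pre-trained features used in the paper; the explicit 2x2 inverse keeps the sketch dependency-free:

```python
def hotelling_t2_2d(X, Y):
    """Two-sample Hotelling T^2 statistic for 2-D feature vectors."""
    def mean(Z):
        n = len(Z)
        return [sum(z[0] for z in Z) / n, sum(z[1] for z in Z) / n]

    def scatter(Z, m):
        sxx = sum((z[0] - m[0]) ** 2 for z in Z)
        syy = sum((z[1] - m[1]) ** 2 for z in Z)
        sxy = sum((z[0] - m[0]) * (z[1] - m[1]) for z in Z)
        return sxx, syy, sxy

    nx, ny = len(X), len(Y)
    mx, my = mean(X), mean(Y)
    sxx1, syy1, sxy1 = scatter(X, mx)
    sxx2, syy2, sxy2 = scatter(Y, my)
    d = nx + ny - 2                       # pooled covariance denominator
    a, c, b = (sxx1 + sxx2) / d, (syy1 + syy2) / d, (sxy1 + sxy2) / d
    det = a * c - b * b
    dx, dy = mx[0] - my[0], mx[1] - my[1]
    # quadratic form d^T S^{-1} d using the explicit 2x2 inverse
    q = (c * dx * dx - 2 * b * dx * dy + a * dy * dy) / det
    return (nx * ny) / (nx + ny) * q
```

Identical ensembles yield a statistic of zero, while a spatially separated outlier ensemble produces a large value, which is the signal the detector thresholds.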
In modern automotive engineering, pedestrian protection systems play an increasingly important role in reducing injuries and fatalities in traffic accidents involving pedestrians. One of these safety systems is the active hood. By using shape memory alloys (SMAs) as actuators, existing systems can be simplified: the same function can be performed by new mechanisms at reduced size, weight, and cost. Following an introduction to existing hood-lifting systems, this article briefly presents SMA actuators and their potential applications in automotive engineering.
Extended Target Tracking With a Lidar Sensor Using Random Matrices and a Virtual Measurement Model
(2022)
Random matrices are widely used to estimate the extent of an elliptically contoured object. Usually, it is assumed that the measurements follow a normal distribution, with its standard deviation being proportional to the object’s extent. However, the random matrix approach can filter the center of gravity and the covariance matrix of measurements independently of the measurement model. This work considers the whole chain from data acquisition to the linear Kalman Filter with extension estimation as a reference plant. The input is the (unknown) ground truth (position and extent). The output is the filtered center of gravity and the filtered covariance matrix of the measurement distribution. A virtual measurement model emulates the behavior of the reference plant. The input of the virtual measurement model is adapted using the proposed algorithm until the output parameters of the virtual measurement model match the result of the reference plant. After the adaptation, the input to the virtual measurement model is considered an estimation for position and extent. The main contribution of this paper is the reference model concept and an adaptation algorithm to optimize the input of the virtual measurement model.
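The quantities the filter tracks can be illustrated with the empirical center of gravity and covariance matrix of a 2-D point cloud, a toy stand-in for lidar measurements; the Kalman filtering and the adaptation loop of the virtual measurement model are not reproduced:

```python
def centroid_and_covariance(points):
    """Center of gravity and sample covariance of 2-D measurements;
    these are the filtered outputs from which position and extent
    are recovered in the random-matrix setting."""
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    sxx = sum((p[0] - cx) ** 2 for p in points) / n
    syy = sum((p[1] - cy) ** 2 for p in points) / n
    sxy = sum((p[0] - cx) * (p[1] - cy) for p in points) / n
    return (cx, cy), ((sxx, sxy), (sxy, syy))
```

The adaptation algorithm in the paper iterates a candidate position and extent through the virtual measurement model until its outputs match these filtered quantities.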
Many countries offer state credit guarantees to support credit-constrained exporters. The policy instrument is commonly justified by governments as a means to mitigating adverse outcomes of financial market frictions for exporting firms. Accumulated returns to the German state credit guarantee scheme deriving from risk-compensating premia have outweighed accumulated losses over the past 60 years. Why do private financial agents not step in and provide insurance given that the state-run program yields positive returns? We argue that costs of risk diversification, liquidity management, and coordination among creditors limit the ability of private financial agents to offer comparable insurance products. Moreover, we suggest that the government’s greater effectiveness in recovering claims in foreign countries endows the state with a cost advantage in dealing with the risks involved in large export projects. We test these hypotheses using monthly firm-level data combined with official transaction-level data on covered exports of German firms and find suggestive evidence that positive effects on trade are due to mitigated financial constraints: State credit guarantees benefit firms that are dependent on external finance, if the value at risk which they seek to cover is large, and at times when refinancing conditions on the private financial market are tight.
Experimental Validation of Ellipsoidal Techniques for State Estimation in Marine Applications
(2022)
A reliable quantification of the worst-case influence of model uncertainty and external disturbances is crucial for the localization of vessels in marine applications. This is especially true if uncertain GPS-based position measurements are used to update predicted vessel locations that are obtained from the evaluation of a ship’s state equation. To reflect real-life working conditions, these state equations need to account for uncertainty in the system model, such as imperfect actuation and external disturbances due to effects such as wind and currents. As an application scenario, the GPS-based localization of autonomous DDboat robots is considered in this paper. Using experimental data, the efficiency of an ellipsoidal approach, which exploits a bounded-error representation of disturbances and uncertainties, is demonstrated.
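A standard building block in ellipsoidal state estimation is the trace-optimal outer ellipsoid of the Minkowski sum of two centered ellipsoids with shape matrices Q1 and Q2, namely Q = (1 + 1/p)Q1 + (1 + p)Q2 with p = sqrt(tr Q1 / tr Q2). This is a generic textbook bound, not code from the paper:

```python
import math

def minkowski_outer(Q1, Q2):
    """Trace-optimal outer ellipsoid of the Minkowski sum of two centered
    ellipsoids with 2x2 shape matrices Q1 and Q2 (sketch)."""
    tr1 = Q1[0][0] + Q1[1][1]
    tr2 = Q2[0][0] + Q2[1][1]
    p = math.sqrt(tr1 / tr2)
    return [[(1 + 1 / p) * Q1[i][j] + (1 + p) * Q2[i][j] for j in range(2)]
            for i in range(2)]
```

For two unit disks the bound returns the shape matrix 4I, i.e. the disk of radius 2, so it is tight in that case; such sums arise when bounded process noise is added to a predicted state ellipsoid.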
Excubation
(2015)
To learn from the past, we analyse 1,088 "computer as a target" judgements for evidential reasoning by extracting four case elements: decision, intent, fact, and evidence. Analysing the decision element is essential for studying the scale of sentence severity for cross-jurisdictional comparisons. Examining the intent element can facilitate future risk assessment. Analysing the fact element can enhance an organization's capability of analysing criminal activities for future offender profiling. Examining the evidence used against a defendant from previous judgements can facilitate the preparation of evidence for upcoming legal disclosure. Following the concepts of argumentation diagrams, we develop an automatic judgement summarizing system to enhance the accessibility of judgements and avoid repeating past mistakes. Inspired by the feasibility of extracting legal knowledge for argument construction and employing grounds of inadmissibility for probability assessment, we conduct evidential reasoning of kernel traces for forensic readiness. We integrate the narrative methods from attack graphs/languages for preventing confirmation bias, the argumentative methods from argumentation diagrams for constructing legal arguments, and the probabilistic methods from Bayesian networks for comparing hypotheses.
Technology-based startups make a substantial contribution to economic and social development. In the course of their founding, they need support in the form of venture capital, which in the seed and early stage is primarily provided by business angels (BAs). However, the procedures and evaluation criteria of the BA investment process have so far been insufficiently researched. This article uses expert interviews within a case study of the entrepreneurial ecosystem of Baden-Württemberg to identify how BAs evaluate and select technology-based startups. In addition, the criteria by which BAs distinguish promising from failing startups are derived. The article thus contributes to opening the "black box" of investment activities in the earliest founding phases.
Evaluation of tech ventures’ evolving business models: rules for performance-related classification
(2022)
At the early stage of a successful tech venture's life cycle, it is assumed that the business model will evolve to higher quality over time. However, there are few empirical insights into business model evolution patterns for the performance-related classification of early-stage tech ventures. We created relevant variables evaluating the evolution of the venture-centric network and the technological proposition of both digital and non-digital ventures' business models using the text of submissions to the official business plan award in the German State of Baden-Württemberg between 2006 and 2012. Applying a principal component analysis/rough set theory mixed methodology, we explore performance-related business model classification rules in the heterogeneous sample of business plans. We find that ventures need to demonstrate real interactions with their customers' needs to survive. The distinguishing success rules are related to patent applications, risk capital, and scaling of the organisation. The rules help practitioners to classify business models in a way that allows them to prioritise action for performance.
The scoring of sleep stages is an essential part of sleep studies. The main objective of this research is to provide an algorithm for the automatic classification of sleep stages using signals that may be obtained in a non-obtrusive way. After reviewing the relevant research, the authors selected multinomial logistic regression as the basis for their approach. Several parameters were derived from movement and breathing signals, and their combinations were investigated to develop an accurate and stable algorithm. The implemented algorithm produced successful results: the accuracy of the recognition of Wake/NREM/REM stages is 73%, with a Cohen's kappa of 0.44 for the analyzed 19,324 sleep epochs of 30 seconds each. This approach has the advantage of using only movement and breathing signals, which can be recorded with less effort than heart or brainwave signals, and of requiring only four derived parameters for the calculations. Therefore, the new system is a significant improvement for non-obtrusive sleep stage identification compared to existing approaches.
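The reported agreement measure can be reproduced from predicted and reference stage sequences; a minimal implementation of Cohen's kappa from its generic formula (not the authors' code):

```python
def cohens_kappa(y_true, y_pred):
    """Cohen's kappa: observed agreement between two label sequences,
    corrected for the agreement expected by chance."""
    labels = sorted(set(y_true) | set(y_pred))
    n = len(y_true)
    p_obs = sum(t == p for t, p in zip(y_true, y_pred)) / n
    p_exp = sum((y_true.count(l) / n) * (y_pred.count(l) / n) for l in labels)
    return (p_obs - p_exp) / (1 - p_exp)

# Perfect agreement on a tiny Wake/NREM/REM sequence:
print(cohens_kappa(["W", "N", "R", "W"], ["W", "N", "R", "W"]))  # -> 1.0
```

A kappa of 0.44, as reported at 73% accuracy, reflects that a three-class staging problem with imbalanced classes has substantial chance agreement.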
The estimation of the holding periods of financial products has to be done in a dynamic process in which the size of the observation time interval influences the result: small intervals will produce smaller average holding periods than larger ones. The approach developed in this paper offers the possibility of estimating this average independently of the size of the observation interval. The method is demonstrated on the example of two distributions, based on the exponential and the geometric probability functions. The estimate is found by maximizing the likelihood function.
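For the geometric family mentioned above, the maximum-likelihood step has a closed form: with observed holding periods k_i (in intervals, support 1, 2, ...), p_hat = n / sum(k_i), so the estimated mean holding period equals the sample mean. A minimal sketch of just this step; the paper's interval-size-independent correction is not reproduced here:

```python
def geometric_mle_mean(samples):
    """ML estimate of the mean holding period under a geometric model
    with support 1, 2, ...: p_hat = n / sum(k_i), mean = 1 / p_hat."""
    n = len(samples)
    p_hat = n / sum(samples)
    return 1.0 / p_hat  # equals the sample mean for this family

print(geometric_mle_mean([1, 2, 3, 2]))  # -> 2.0
```

The same reduction to the sample mean holds for the exponential family in continuous time, which is why the interval-size dependence has to be addressed separately.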
In view of current supreme court case law, an elderly or otherwise infirm injured party suing for monetary compensation is forced to factor their approaching death into appeal considerations as a considerable litigation risk. This leads to a substantial curtailment of the non-material components of their right of personality. It is therefore urgently necessary to establish, by way of judicial development of the law, criteria for groups of exceptional cases in order to reliably rule out intolerable hardship in the future.
The growing error rates of triple-level cell (TLC) and quadruple-level cell (QLC) NAND flash memories have led to the application of error correction coding with soft-input decoding techniques in flash-based storage systems. Typically, flash memory is organized in pages where the individual bits per cell are assigned to different pages and different codewords of the error-correcting code. This page-wise encoding minimizes the read latency with hard-input decoding. To increase the decoding capability, soft-input decoding is used eventually due to the aging of the cells. This soft-decoding requires multiple read operations. Hence, the soft-read operations reduce the achievable throughput, and increase the read latency and power consumption. In this work, we investigate a different encoding and decoding approach that improves the error correction performance without increasing the number of reference voltages. We consider TLC and QLC flashes where all bits are jointly encoded using a Gray labeling. This cell-wise encoding improves the achievable channel capacity compared with independent page-wise encoding. Errors with cell-wise read operations typically result in a single erroneous bit per cell. We present a coding approach based on generalized concatenated codes that utilizes this property.
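The cell-wise Gray labeling can be illustrated with the standard binary-reflected Gray code, a generic construction assumed here for the sketch; the paper's exact labeling and its generalized concatenated code construction are not reproduced:

```python
def gray(i):
    """Binary-reflected Gray code: adjacent integers map to codewords
    that differ in exactly one bit."""
    return i ^ (i >> 1)

# Cell-wise labeling of a TLC cell: 3 bits jointly mapped to the 8 voltage
# levels so that a read error landing in a neighbouring level typically
# flips only a single bit of the cell.
labels = [format(gray(level), "03b") for level in range(8)]
print(labels)  # -> ['000', '001', '011', '010', '110', '111', '101', '100']
```

This single-bit-error property is exactly what the coding approach exploits: cell-wise read errors mostly produce one erroneous bit per cell.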
According to the current state of the art, the decontamination of problem areas such as corners and inside edges largely relies on equipment from conventional refurbishment. Machines such as needle scalers and bush hammers expose workers to strong vibrations and high reaction forces. Correspondingly long break times are therefore required, further reducing the already low removal rate. Besides this additional effort, the equipment can, in the absence of extraction devices, lead to a spread of contamination: contaminated particles are distributed in areas that have already been decontaminated, partially undoing the progress achieved.
Because of the numerous drawbacks of the equipment used so far, the research project EKONT-1, for the "development of an innovative, semi-automated device for dry-mechanical decontamination of corners, edges and problem areas in nuclear facilities", was initiated and carried out. Within this project, many new insights were gained and several functional prototypes were developed, built, and tested both in the laboratory and in practical use. Since the trials revealed further potential for improvement, the follow-up project EKONT-2, which deals with the further development of the existing prototypes, was started on 1 July 2023.
The Romans already used inland navigation to transport their goods (Schröter 2005). Even today, inland navigation on the Rhine remains an important locational factor, which is why many companies that depend on the transport of bulk goods have settled on its banks in order to use inland shipping as an inexpensive mode of transport (Rothstein et al. 2009, Scholten et al. 2009).
The following article briefly addresses the economic importance of inland navigation on the Rhine as well as its recent and future development. The effects of low fairway depths on the transport capacity of inland shipping are also considered.
Purpose
The goal of this research survey was to propose an entrepreneurship education model for students in higher education institutions.
Methodology
A questionnaire was distributed to 246 randomly sampled students at the Universitas Negeri Jakarta. The data was analyzed through Structural Equation Modeling to study the variables of entrepreneurship education for higher education students and examine whether it can be predicted by the university leadership as a facilitator of entrepreneurial culture, university departments as promoters of entrepreneurial skills, and university research as an incubator of local business development.
Findings
The results show that university leadership as a facilitator of entrepreneurial culture is supported by the university leadership's fostering of a culture of entrepreneurial thinking. It was also evident that the university placed sufficient emphasis on entrepreneurial education and successfully motivated both lecturers and students to embrace entrepreneurship education. The results also indicated that university departments acted as promoters of entrepreneurial skills and stimulated students to attain sufficient entrepreneurial skills during their university education. Lastly, university research also proved to be an incubator of local business development, influenced by the university conducting research projects with local private sector businesses and supporting graduates planning to launch start-ups.
Implications to Research and Practice
The survey results will provide valuable policy insights to improve entrepreneurship education. The university faculty and students would have opportunities to gain practical experience in local private sector businesses. The model of entrepreneurship education proposed herein can be applied to higher education students.
Due to their structure of crossed yarns embedded in coating, woven fabric membranes are characterised by a highly nonlinear stress-strain behaviour. In order to determine an accurate structural response of membrane structures, a suitable description of the material behaviour is required. Typical phenomenological material models, such as linear-elastic orthotropic models, only allow a limited determination of the real material behaviour. A more accurate approach becomes evident by focusing on the meso-scale, which reveals an inhomogeneous but periodic structure of woven fabrics. The present work focuses on an established meso-scale model. The novelty of this work is an enhancement of this model with regard to the coating stiffness. By performing an inverse process of parameter identification using a state-of-the-art Levenberg-Marquardt algorithm, a close fit with respect to measured data from a common biaxial test is shown and compared to results applying established models. Subsequently, the enhanced meso-scale model is processed into a multi-scale model and is implemented as a material law into a finite element program. Within finite element analyses of an exemplary full-scale membrane structure using the implemented material model as well as established material models, the results are compared and discussed.
Energiequelle Seewasser
(2015)
What has evidently been state of the art for many years in municipalities in neighbouring countries, e.g. Switzerland, Austria or the Netherlands, is a "special construction method" on Germany's municipal roads: concrete pavements. The reasons why concrete construction is neglected in the municipal context apparently lie in more elaborate planning, higher construction costs, more complex installation (especially in connection with embedded installations), maintenance measures, and third-party works on supply and disposal infrastructure. It also remains an open question to what extent the higher planning and construction expenditure of the concrete method pays off in a life-cycle comparison with asphalt through lower maintenance needs, or may even prove cheaper. As a consequence of increasingly frequent typical damage such as rutting and deformation, heavily loaded municipal traffic areas such as bus stops, bus lanes and roundabouts are more and more often built in concrete instead of asphalt or paving. The topic "use of concrete surfaces in municipalities" is very extensive; reference is generally made to the specific FGSV guidance documents. This article therefore addresses the fundamentals of planning, construction and economic efficiency of municipal traffic areas built in concrete. Unfortunately, not all particularities and details, such as materials (glass fibre), can be considered. The aim is to show general possibilities for the use of concrete surfaces in the municipal sector. Special thanks go to the Straßenbauamt Böblingen and to Baudirektor Andreas Klein, whose personal experience has informed this article.
Einsatz von Bankettbeton bei schmalen und stark beanspruchten Ortsverbindungs- und Kreisstraßen
(2021)
Gait laboratories perform medical gait and running analyses. A force plate embedded in a floor slab enables precise measurement of the forces transmitted by a person to the ground. The accelerations of the slab in which the force plate is embedded must be limited so that the measurement accuracy of the force plate is not impaired.
For the scientific gait laboratory in the new building of the Medical Training and Rehabilitation Centre at the University Hospital Tübingen, it was investigated to what extent the measurement accuracy of a force plate is influenced by human-induced floor vibrations. As a result, the measurement error can be stated as a function of the subject's mass in the gait laboratory for different scenarios.
That materials contract or expand with changes in temperature is an age-old insight. Shape memory alloys stand out in this respect, as they have the astonishing property of contracting or expanding unusually rapidly when the temperature changes, a physical peculiarity that can be exploited in many ways.
The Masters' Houses of the Dessau Bauhaus, part of the UNESCO World Heritage, were built in 1925/1926 to plans by Bauhaus founder Walter Gropius. After alterations during the National Socialist era, two buildings of the ensemble were completely destroyed in the Second World War: the director's house and the semi-detached house of László Moholy-Nagy. In 1956, the Gropius house was rebuilt in altered form as the Emmer house. To find an adequate conservation approach for the ensemble after German reunification, competitions for an urban repair were held in 2008 and 2010 after lengthy preliminary deliberations. On this basis, the Berlin architectural office Bruno Fioretti Marquez was commissioned with the new construction of the director's house and the Moholy-Nagy house. Since the sources were insufficient for a reconstruction, an ambitious architectural concept was realized that traces the volume of the lost buildings. The detailing was freely designed, in forms strongly reduced compared with the surviving Masters' Houses, which had already been refurbished.
Ein Holzhaus als Botschaft. Die erste diplomatische Vertretung des Deutschen Reiches in Ankara 1924
(2015)
The German Reich responded promptly to the founding of the Turkish Republic and the relocation of the capital to Ankara by erecting a prefabricated embassy building in 1924. This prefabricated wooden house was supplied by Christoph & Unmack AG of Niesky/Oberlausitz, a company of great importance for modern timber construction. After a larger, solid embassy building had been erected in Ankara, the wooden house was relocated in 1934 to the grounds of the Atatürk Orman Çiftliği, where it still exists today. In 2010 it was subjected to a thorough building survey. The results of the investigations into this unusual and significant architectural monument are presented in three individual contributions.
Small hydropower has recently come under increasing public criticism because of its ecological impact and its comparatively low electricity generation. This article describes how operators of small hydropower plants assess the potential for increasing the efficiency of their existing plants through intelligent information networking along the course of the Radolfzeller Aach river in the south of Baden-Württemberg, in order to increase the electricity generation of the individual plants.
The Lempel-Ziv-Welch (LZW) algorithm is an important dictionary-based data compression approach that is used in many communication and storage systems. The parallel dictionary LZW (PDLZW) algorithm speeds up the LZW encoding by using multiple dictionaries. This simplifies the parallel search in the dictionaries. However, the compression gain of the PDLZW depends on the partitioning of the address space, i.e. on the sizes of the parallel dictionaries. This work proposes an address space partitioning technique that optimises the compression rate of the PDLZW. Numerical results for address spaces with 512, 1024, and 2048 entries demonstrate that the proposed address partitioning improves the performance of the PDLZW compared with the original proposal. These address space sizes are suitable for flash storage systems. Moreover, the PDLZW has relatively high memory requirements, which dominate the costs of a hardware implementation. This work proposes a recursive dictionary structure and a word partitioning technique that significantly reduce the memory size of the parallel dictionaries.
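The baseline that the PDLZW parallelizes can be sketched as a minimal single-dictionary LZW encoder over bytes; the PDLZW splits this one growing dictionary into several parallel, size-limited dictionaries, and that address-space partitioning is what the paper optimises (not reproduced here):

```python
def lzw_encode(data):
    """Minimal single-dictionary LZW encoder over a bytes input."""
    dictionary = {bytes([i]): i for i in range(256)}  # initial single-byte entries
    w, out = b"", []
    for byte in data:
        wc = w + bytes([byte])
        if wc in dictionary:
            w = wc                            # extend the current match
        else:
            out.append(dictionary[w])         # emit code for the longest match
            dictionary[wc] = len(dictionary)  # learn the new phrase
            w = bytes([byte])
    if w:
        out.append(dictionary[w])
    return out

print(lzw_encode(b"ababab"))  # -> [97, 98, 256, 256]
```

In the PDLZW, the lookup of `wc` is distributed over dictionaries indexed by phrase length, so all candidate matches can be searched in parallel in hardware.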
In this work, a storage study was conducted to find a suitable packaging material for tomato powder storage. Experiments were laid out in a single-factor completely randomized design (CRD) to study the effect of packaging materials on the lycopene, vitamin C, moisture content, and water activity of tomato powder. The factor (packaging material) had three levels (low-density polyethylene bag; polypropylene bottle; wrapped in aluminum foil and packed in a low-density polyethylene bag) and was replicated three times. Tomato slices of var. Galilea dried in a twin-layer solar tunnel dryer were used for the study. The dried tomato slices were then ground and packed (40 g each) in the packaging materials and stored at room temperature. Samples were drawn from the packages at 2-month intervals for quality analysis, and SAS (version 9.2) software was used for statistical analysis. Higher retention of lycopene (80.13%) and vitamin C (49.32%) and a nonsignificant increase in moisture content and water activity were observed for tomato powder packed in polypropylene bottles after 6 months of storage. For low-density polyethylene packed samples and samples wrapped in aluminum foil and packed in a low-density polyethylene bag, 57.06% and 60.45% lycopene retention and 42.9% and 49.23% vitamin C retention were observed, respectively, after 6 months of storage. Based on these results, it can be concluded that the lycopene and vitamin C content of twin-layer solar tunnel dried tomato powder can be preserved during ambient-temperature storage for 6 months by packing in a polypropylene bottle, with moisture content and water activity remaining within a safe range.
In tomato drying, degradation of final quality may occur depending on the drying method used and the predrying preparation. Hence, this research was conducted to evaluate the effect of different predrying treatments on the physicochemical quality and drying kinetics of twin-layer-solar-tunnel-dried tomato slices. During the experimental work, tomato slices of var. Galilea were used. As predrying treatments, 0.5% calcium chloride (CaCl2), 0.5% ascorbic acid (C6H8O6), 0.5% citric acid (C6H8O7), and 0.5% sodium chloride (NaCl) were used. The tomato samples were sliced to 5 mm thickness, soaked in the pretreatment solutions for ten minutes, and dried in a twin-layer solar tunnel dryer under the weather conditions of Jimma, Ethiopia. Untreated samples were used as a control. The moisture losses from the samples were monitored by weighing samples from each treatment at 2 h intervals. SAS statistical software version 9.2 was used for analyzing data on the physicochemical quality of tomato slices in a CRD with three replications. The experimental results showed that dried tomato slices pretreated with 0.5% ascorbic acid gave the best retention of vitamin C and total phenolic content with a high sugar/acid ratio. Better retention of lycopene and faster drying were observed in dried tomato slices pretreated with 0.5% sodium chloride, and pretreating tomatoes with 0.5% citric acid resulted in better color values than the other treatments. Compared to the control, pretreatment significantly preserved the overall quality of dried tomato slices and increased the moisture removal rate in the twin-layer solar tunnel dryer.
In an exemplary collaboration between industry (Geobrugg AG, Romanshorn, Switzerland) and science (WITg Institut für Werkstoffsystemtechnik Thurgau at Hochschule Konstanz, Tägerwilen, Switzerland), and with the support of the Swiss federal innovation agency (KTI, now Innosuisse), a new material and manufacturing concept was developed for building fish-farming nets from high-strength stainless steel wires.
In 2019, this development was honored with the Swiss innovation award Prix Inox by Swiss Inox.
E-mobility in Tourism
(2018)
This article examines chances for and obstacles to e-mobility in tourism in the cross-border region of Lake Constance, Germany. Using secondary internet research, a database of key e-mobility supply factors was generated and visualized using a geographical information system. The results show that fragmentation of infrastructure and information due to the cross-border situation of the four-country region is the main obstacle to e-mobility in tourism in the Lake Constance region. Cooperation and coordination on the supply side of e-mobility in the Lake Constance region turned out to be weak. To improve the chances of e-mobility in cross-border tourism, a more client-oriented approach regarding information, accessibility, and conditions of use is necessary.
In the last decade, both sustainability and business models for sustainability have increased in importance, and sustainability issues have become the focus of discussion. These issues are interlinked and often negatively impact each other. They are complex, include socio-ecological dilemmas, exist in almost every aspect of our society (economic, environmental, social), and are hard to formulate. They may have multiple, incompatible solutions, competing objectives, and open timeframes. Previous research has not developed satisfactory ways to comprehend and solve problems of this nature. Life Cycle Assessment (LCA), the widely used method for assessing sustainable development, has reached its limits in achieving sustainable social goals. System Dynamics (SD) is a valuable methodology that enhances understanding of the structure and internal dynamic behaviours of large, complex, and dynamic systems, leading to improved decision-making. It offers a philosophy and a set of tools for modelling, analysing, and simulating dynamic systems. This research applies system dynamics methods in conjunction with simulation software to assess the potential impact of a solution on the environmental, social, and economic aspects of a complex system, aiming to gain insights into the system's behaviour and to identify the potential consequences of interventions or policy changes across multiple dimensions. This paper responds to the urgent need for a new business model by presenting a concept for adapted dynamic business modelling for sustainability (aDBMfS) using system dynamics. Case studies from the smartphone industry are used.
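To make the stock-and-flow idea at the heart of system dynamics concrete, here is a deliberately tiny simulation of a single stock with one inflow and one outflow. The model, rates, and names are illustrative assumptions and are not taken from the paper or its smartphone case studies.

```python
def simulate_stock(years=10.0, dt=0.25):
    # One stock (e.g. devices in use) integrated with Euler steps:
    # inflow is constant sales, outflow is disposal proportional to the stock.
    stock = 100.0
    history = [stock]
    for _ in range(int(years / dt)):
        sales = 20.0               # units per year (assumed constant inflow)
        disposal = stock / 4.0     # outflow from a 4-year average lifetime
        stock += dt * (sales - disposal)
        history.append(stock)
    return history
```

With these assumed rates the stock relaxes from 100 toward its equilibrium of 80 (where inflow equals outflow), which is exactly the kind of feedback-driven behaviour over time that SD tools visualize for far larger models.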
In this paper, the problem of controlling the dissolved oxygen (DO) level during an aerobic fermentation is considered. The proposed approach deals with three major difficulties: the nonlinear dynamics of the DO, the poor accuracy of empirical models for the oxygen consumption rate, and the fact that only sampled measurements are available online. A nonlinear integral high-gain control law including a continuous-discrete-time observer is designed to keep the DO in the neighborhood of a set-point value without any knowledge of the dissolved oxygen consumption rate. The local stability of the control algorithm is proved using Lyapunov tools. The performance of the control scheme is first analyzed in simulation and then experimentally evaluated during a successful fermentation of the bacterium Pseudomonas putida mt-2 over a period of three days.
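The flavor of sampled-data regulation with an unknown consumption term can be illustrated with a toy simulation: a first-order DO model under a simple sampled PI law with integral action absorbing the unknown disturbance. This stand-in is not the paper's high-gain observer-based controller, and every constant below (model coefficients, gains, sampling period) is an assumed illustrative value.

```python
def simulate_do(T=300.0, dt=0.1, ts=1.0, setpoint=0.3):
    # Toy linear DO model: d(do)/dt = -a*do + b*u - c,
    # where c plays the role of the unknown oxygen consumption rate.
    a, b, c = 0.1, 0.1, 0.02      # assumed plant constants
    kp, ki = 0.5, 0.2             # assumed sampled PI gains
    do, integ, u = 0.1, 0.0, 0.0
    steps_per_sample = int(ts / dt)
    for k in range(int(T / dt)):
        if k % steps_per_sample == 0:
            # Measurement available only at sampling instants.
            e = setpoint - do
            integ += e * ts
            u = max(0.0, kp * e + ki * integ)   # aeration cannot be negative
        do += dt * (-a * do + b * u - c)        # Euler step between samples
    return do
```

The integral state converges to the value that exactly offsets the unknown term c, so the DO settles at the set point despite the controller never being told the consumption rate; the paper achieves the analogous effect with a continuous-discrete observer and a proven local stability result.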
We provide an overview of the ongoing discussions on the objectives of the energy transition in the form of a conceptual framework, intending to facilitate the search for the most viable options for a successful transformation of the energy system. For this purpose, we examine the development of energy policy goals in Germany in the past and present, whereby we give an overview of objectives and assessment approaches from politics, economics, and science. Moreover, we then merge the different views into a common framework and analyze the central conflict between the wholeness of a hypothetical target circle and the simplification in favor of a hypothetical target point in more detail.
Positive systems play an important role in systems and control theory and have found applications in multiagent systems, neural networks, systems biology, and more. Positive systems map the nonnegative orthant to itself (and also the non-positive orthant to itself). In other words, they map the set of vectors with zero sign variation to itself. In this article, discrete-time linear systems that map the set of vectors with up to k-1 sign variations to itself are introduced. For the special case k = 1 these reduce to discrete-time positive linear systems. Properties of these systems are analyzed using tools from the theory of sign-regular matrices. In particular, it is shown that almost every solution of such systems converges to the set of vectors with up to k-1 sign variations. It is also shown that these systems induce a positive dynamics of k-dimensional parallelotopes.
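A small helper makes the sign-variation count concrete. The zero-ignoring definition below is one standard convention for counting sign changes, and the matrix used in the check is an arbitrary entrywise-positive matrix chosen for illustration, not one from the article.

```python
import numpy as np

def sign_variations(v):
    # Number of sign changes along the vector, skipping zero entries.
    signs = [s for s in np.sign(v) if s != 0]
    return sum(1 for a, b in zip(signs, signs[1:]) if a != b)

# A matrix with positive entries maps the nonnegative orthant to itself,
# so a vector with zero sign variations keeps zero sign variations --
# the k = 1 special case described in the abstract.
A = np.array([[2.0, 1.0], [1.0, 3.0]])
x = np.array([1.0, 2.0])   # 0 sign variations
y = A @ x                  # stays in the positive orthant
```

The systems studied in the article generalize this invariance: instead of preserving only the zero-variation set, their transition matrices map the set of vectors with up to k-1 sign variations into itself.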
Digitalisierung im Bauwesen
(2015)
Sustainable technologies are being increasingly used in various areas of human life. While they have a multitude of benefits, they are particularly useful in health monitoring, especially for certain groups of people, such as the elderly. However, several issues still need to be addressed before their use becomes widespread. This work aims to clarify the aspects that are of great importance for increasing the acceptance of this type of technology among the elderly. In addition, we aim to clarify whether the technologies that are already available can ensure acceptable accuracy and whether they could replace some of the manual approaches currently in use. A two-week study with people 65 years of age and over was conducted to address the questions posed here, and the results were evaluated. It was demonstrated that simplicity of use and automatic functioning play a crucial role. It was also concluded that technology cannot yet completely replace traditional methods such as questionnaires in some areas. Although the technologies that were tested were classified as being “easy to use”, the elderly participants in the current study indicated that they were not sure they would use these technologies regularly in the long term because, among other issues, the added value is not always clear. Therefore, awareness-raising must take place in parallel with the development of technologies and services.
The business model canvas (BMC) and the lean start-up manifesto (LSM) have been changing both entrepreneurial education and, on the practical side, the mindset in setting up innovative ventures since the burst of the dot-com bubble. However, few empirical insights exist into the business model implementation patterns that distinguish digital from non-digital innovative ventures. Connecting practical management tools to network theory as well as to the theory of organizational learning, this paper investigates evolution patterns of digital and non-digital business models drawn from the deal flow of an innovation intermediary. For this purpose, a multi-dimensional quantitative content analysis research design is applied to the business plans of 242 ventures. The measured strength of transaction relations to customers, suppliers, people, and financiers has been combined with performance indicators of the sampled ventures. The results indicate that, in order to succeed, digital ventures iterate their business on the market early and search for investment afterwards. Conversely, non-digital ventures already need financial investments in the early stages to set up a product ready to be tested on the market. In both groups we found strong evidence that specific evolutionary patterns relate to higher rates of success.
This paper examines how varying antidumping methodologies applied within the World Trade Organization differ in the extent to which they reduce targeted exports. We show that antidumping duties, on average, hit Chinese exporters harder than those of other targeted countries. This difference can be traced back in part to China's non-market economy status, which affects the way antidumping duties are calculated. Furthermore, we show that the type of imposed duty matters, as ad-valorem duties affect exports differently compared to specific duties or duties conditional on the export price. Overall, however, antidumping duties remain effective in reducing imports independent of market economy status.