Tugend als social strings
(2006)
Low-Code Development Platforms (LCDPs) enable non-information technology (IT) personnel to develop applications and workflows independently of the IT department. Consequently, these digital platforms help to meet the growing demand for software development. However, science and practice warn of several barriers that slow down or hinder the usage of LCDPs. This publication scientifically identifies, analyzes, and discusses challenges during the implementation and application of LCDPs from both perspectives in a holistic manner. To this end, we conduct an exploratory study (with data from scientific literature, expert interviews, and practical studies) and assign the challenges to the socio-technical system model. The results show that the scientific and practical communities recognize common challenges (especially knowledge transfer) but also perceive differences related to technological (science) and social (practice) aspects. This paper proposes future research directions for academia, such as governance, culture change, and value evaluation of LCDPs. Additionally, practitioners can prepare for possible challenges when using LCDPs.
Low-code development platforms (LCDPs) promote the digital transformation of organizations by enabling application development by business-unit employees without deep programming skills, so-called citizen developers. Market research institutes predict that in the coming years more than half of all applications will be developed with LCDPs. Nevertheless, organizations face the challenge of choosing the right approaches for implementing and applying LCDPs. This article therefore provides a comprehensive picture of the practical understanding and current approaches in various organizations and derives recommendations for action from them. To this end, 16 expert interviews were conducted and scientifically analyzed. The results show that practitioners generally share a similar understanding of the term LCDP. The initiative for adoption usually comes from the business units, but the decision for or against LCDP implementation is mostly made by top management in cooperation with the IT department. Current application approaches differ: companies either use a self-service approach driven by the business units or integrate the decision about potential LCDP development by citizen developers into the IT department's existing demand management. Established and adaptive governance is an important prerequisite for both approaches. The findings contribute to the scientific discussion, as this article provides one of the first comprehensive and scientifically grounded qualitative analyses of current practical adoption approaches. Practitioners also learn how other companies deal with current challenges and which approaches are promising.
Market research institutes forecast a growing relevance of Low-Code Development Platforms (LCDPs) for organizations. Moreover, the rising number of scientific publications in recent years shows the increasing interest of the academic community. However, an overview of current research focuses and fruitful future research topics is missing. This paper conducts a first scientific literature review on LCDPs to close this gap. The socio-technical system (STS) model, which categorizes information systems into a social and a technical system, serves to analyze the 32 identified publications. Most current research focuses on the technical system (technology or task). In contrast, only three publications explicitly target the social system (structure or people). Hence, this paper enables future research to address the identified research gaps. Additionally, practitioners gain awareness of technical and social aspects involved in the development, implementation, and application of LCDPs.
This paper examines the effectiveness and efficiency of virtual reality training for an order-picking process. To this end, 500 picking orders with more than 2,000 part-picking operations were carried out by 30 test persons and analyzed in the Modellfabrik Bodensee. The study points out the advantages and disadvantages of virtual training in comparison to real execution of a picking process with and without prior training.
NAND flash memory is widely used for data storage due to low power consumption, high throughput, short random access latency, and high density. The storage density of the NAND flash memory devices increases from one generation to the next, albeit at the expense of storage reliability.
Our objective in this dissertation is to improve the reliability of NAND flash memory at low hardware implementation cost. We investigate the error characteristics, i.e., the various noise effects of the NAND flash memory. Based on the error behavior at different life-aging stages, we develop offset calibration techniques that minimize the bit error rate (BER).
Furthermore, we introduce data compression to reduce the write amplification effect and to support the error correction code (ECC) unit. In the first scenario, the numerical results show that data compression can reduce the wear-out by minimizing the amount of data that is written to the flash. In the ECC scenario, the compression gain is used to improve the ECC capability. In the first scenario, the write amplification effect can be halved for the considered target flash and data model. By combining ECC and data compression, the NAND flash memory lifetime improves threefold compared with uncompressed data for the same data model.
In order to improve the data reliability of the NAND flash memory, we investigate different ECC schemes based on concatenated codes like product codes, half-product codes, and generalized concatenated codes (GCC). We propose a construction for high-rate GCC for hard-input decoding. ECC based on soft-input decoding can significantly improve the reliability of NAND flash memories. Therefore, we propose a low-complexity soft-input decoding algorithm for high-rate GCC.
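The effect of compression on write amplification in the first scenario reduces to first-order arithmetic. The sketch below is a minimal illustrative model; the function names and the garbage-collection overhead figure are assumptions for illustration, not values from the dissertation:

```python
def write_amplification(host_bytes, physical_bytes):
    """Write amplification factor: physical bytes written per host byte."""
    return physical_bytes / host_bytes

def compressed_physical(physical_bytes, compression_ratio):
    """Physical writes when payloads are compressed.
    compression_ratio = compressed size / original size (0 < ratio <= 1)."""
    return physical_bytes * compression_ratio

# Illustration: a compression ratio of 0.5 halves the write amplification,
# which, to first order, doubles the number of usable P/E cycles.
host = 1_000
physical = 2_400   # assumed figure including garbage-collection overhead
wa_plain = write_amplification(host, physical)
wa_compressed = write_amplification(host, compressed_physical(physical, 0.5))
```

To first order, halving the written volume halves the wear-out, which matches the reported halving of the write amplification effect for the considered flash and data model.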
Computer analysis of electrocardiogram (ECG) signals is common today. Among the many real-time QRS recognition algorithms is the Pan-Tompkins algorithm, which detects the QRS complexes of ECG signals. The proposed algorithm analyzes the heartbeat data stream based on digital analysis of the amplitude, the bandwidth, and the slope. In addition, after detecting the ECG signals, the stress algorithm compares whether the current heartbeat is similar to or different from the previous heartbeat. The algorithm performs stress detection for the patient in real time. To implement the new algorithm with higher performance, the parallel programming language CUDA is used. The algorithm determines stress concurrently with determining the RR interval, using separate functions as beat detector and stress-oriented beat classifier.
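The classic Pan-Tompkins processing chain (band-pass filtering, differentiation, squaring, moving-window integration, thresholding) can be sketched as follows. This is a simplified single-threaded illustration, not the CUDA implementation described above; the FFT-based band-pass and the fixed threshold stand in for the original cascaded IIR filters and adaptive thresholds:

```python
import numpy as np

def pan_tompkins_qrs(ecg, fs):
    """Simplified Pan-Tompkins chain: band-pass -> derivative -> squaring
    -> moving-window integration -> threshold-based peak picking."""
    # Crude 5-15 Hz band-pass via FFT masking (the original uses cascaded
    # low-pass/high-pass IIR filters).
    spec = np.fft.rfft(ecg)
    freqs = np.fft.rfftfreq(len(ecg), d=1.0 / fs)
    spec[(freqs < 5.0) | (freqs > 15.0)] = 0
    filtered = np.fft.irfft(spec, n=len(ecg))
    deriv = np.gradient(filtered)        # emphasizes the steep QRS slope
    squared = deriv ** 2                 # make all samples positive
    window = int(0.15 * fs)              # ~150 ms integration window
    mwi = np.convolve(squared, np.ones(window) / window, mode="same")
    thresh = 0.5 * mwi.max()             # fixed threshold (original adapts it)
    refractory = int(0.2 * fs)           # 200 ms physiological refractory period
    peaks, last = [], -refractory
    for i in range(1, len(mwi) - 1):
        if mwi[i] > thresh and mwi[i] >= mwi[i - 1] and mwi[i] >= mwi[i + 1]:
            if i - last >= refractory:
                peaks.append(i)
                last = i
    return peaks
```

The RR interval then follows directly as the sample distance between consecutive detected peaks divided by the sampling rate.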
Error correction coding based on soft-input decoding can significantly improve the reliability of non-volatile flash memories. This work proposes a soft-input decoder for generalized concatenated (GC) codes. GC codes are well suited for error correction in flash memories for high reliability data storage. We propose GC codes constructed from inner extended binary Bose-Chaudhuri-Hocquenghem (BCH) codes and outer Reed-Solomon codes. The extended BCH codes enable an efficient hard-input decoding. Furthermore, a low-complexity soft-input decoding method is proposed. This bit-flipping decoder uses a fixed number of test patterns and an algebraic decoder for soft-decoding. An acceptance criterion for the final candidate codeword is proposed. Combined with error and erasure decoding of the outer Reed-Solomon codes, this acceptance criterion can improve the decoding performance and reduce the decoding complexity. The presented simulation results show that the proposed bit-flipping decoder in combination with outer error and erasure decoding can outperform maximum likelihood decoding of the inner codes.
The introduction of multi-level cell (MLC) and triple-level cell (TLC) technologies reduced the reliability of flash memories significantly compared with single-level cell (SLC) flash. The reliability of the flash memory suffers from various error causes. Program/erase cycles, read disturb, and cell-to-cell interference impact the threshold voltages. With predefined fixed read thresholds, a voltage shift increases the bit error rate (BER). This work proposes a read threshold calibration method that aims at minimizing the BER by adapting the read voltages. The adaptation of the read thresholds is based on the number of errors observed in the codeword protecting a small amount of meta-data. Simulations based on flash measurements demonstrate that this method can significantly reduce the BER of TLC memories.
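The idea of adapting read thresholds to minimize the BER can be illustrated with a toy single-level-cell model. All distributions and parameters below are illustrative assumptions; the actual method estimates errors from the codeword protecting the meta-data rather than from a known reference pattern:

```python
import numpy as np

def calibrate_threshold(voltages, written_bits, candidates):
    """Pick the read threshold minimizing the bit error rate on a known
    reference pattern (a stand-in for the protected meta-data)."""
    best_t, best_ber = None, 1.0
    for t in candidates:
        read_bits = (voltages < t).astype(int)   # low voltage encodes '1' here
        ber = np.mean(read_bits != written_bits)
        if ber < best_ber:
            best_t, best_ber = t, ber
    return best_t, best_ber
```

With retention-induced drift, the calibrated threshold tracks the shifted crossing point of the two voltage distributions, whereas a fixed factory threshold reads far more cells on the wrong side.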
As part of a large-scale project in the Lake Constance region, which borders Germany, Austria, and Switzerland, seamless learning prototypes are designed, implemented, and evaluated. A design-based research (DBR) approach is used for development, implementation, and evaluation. The project consists of one base project, which brings together instructional design and instructional technology specialists as well as (educational) software architects, supporting seven subprojects in developing seamless learning lighthouse implementations, mostly within higher and further education.
Industrial partners are involved in all subprojects. The authors are unaware of similar seamless learning projects using a DBR approach that include evaluation and redesign and allow for a comparison of experiences by applying DBR across subprojects. The project aims to generate novel seamless learning implementations, corresponding novel research, and novel software supporting different aspects of seamless learning. The poster will present the project design, delineate areas of research and development, and include initial empirical results.
Experimental Validation of Ellipsoidal Techniques for State Estimation in Marine Applications
(2022)
A reliable quantification of the worst-case influence of model uncertainty and external disturbances is crucial for the localization of vessels in marine applications. This is especially true if uncertain GPS-based position measurements are used to update predicted vessel locations that are obtained from the evaluation of a ship’s state equation. To reflect real-life working conditions, these state equations need to account for uncertainty in the system model, such as imperfect actuation and external disturbances due to effects such as wind and currents. As an application scenario, the GPS-based localization of autonomous DDboat robots is considered in this paper. Using experimental data, the efficiency of an ellipsoidal approach, which exploits a bounded-error representation of disturbances and uncertainties, is demonstrated.
Reliability Assessment of an Unscented Kalman Filter by Using Ellipsoidal Enclosure Techniques
(2022)
The Unscented Kalman Filter (UKF) is widely used for the state, disturbance, and parameter estimation of nonlinear dynamic systems, for which both process and measurement uncertainties are represented in a probabilistic form. Although the UKF can often be shown to be more reliable for nonlinear processes than the linearization-based Extended Kalman Filter (EKF) due to the enhanced approximation capabilities of its underlying probability distribution, it is not a priori obvious whether its strategy for selecting sigma points is sufficiently accurate to handle nonlinearities in the system dynamics and output equations. Such inaccuracies may arise for sufficiently strong nonlinearities in combination with large state, disturbance, and parameter covariances. Then, computationally more demanding approaches such as particle filters or the representation of (multi-modal) probability densities with the help of (Gaussian) mixture representations are possible ways to resolve this issue. To systematically detect cases that are not reliably handled by a standard EKF or UKF, this paper proposes the computation of outer bounds for state domains that are compatible with a certain percentage of confidence under the assumption of normally distributed states with the help of a set-based ellipsoidal calculus. The practical applicability of this approach is demonstrated for the estimation of state variables and parameters for the nonlinear dynamics of an unmanned surface vessel (USV).
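The sigma-point selection whose accuracy the paper scrutinizes is the standard unscented transform, sketched below in a textbook formulation with scaling parameter kappa; this is an illustration of the UKF building block, not the paper's ellipsoidal enclosure technique:

```python
import numpy as np

def sigma_points(mean, cov, kappa=0.0):
    """Standard (2n+1)-point sigma set of the unscented transform."""
    n = len(mean)
    S = np.linalg.cholesky((n + kappa) * cov)   # matrix square root
    pts = [mean]
    for i in range(n):
        pts.append(mean + S[:, i])
        pts.append(mean - S[:, i])
    w = np.full(2 * n + 1, 1.0 / (2.0 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    return np.array(pts), w

def unscented_transform(f, mean, cov, kappa=0.0):
    """Propagate a Gaussian (mean, cov) through a nonlinearity f."""
    pts, w = sigma_points(mean, cov, kappa)
    y = np.array([f(p) for p in pts])
    y_mean = w @ y
    d = y - y_mean
    y_cov = (w[:, None] * d).T @ d
    return y_mean, y_cov
```

For linear maps the transform is exact; the inaccuracies discussed above appear when f is strongly nonlinear relative to the spread of the sigma points.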
Schatten-BI - Was tun?
(2015)
Shadow IT risk
(2015)
Schatten-IT
(2015)
IT-Governance
(2023)
Digital transformation considerably increases the influence of information technology on corporate success. This also raises the demands on a company's IT management system. A simple truth applies here: an unsuitable management system usually leads to poorer decisions.
Christopher Rentrop shows you clearly how to determine, in a goal-oriented manner, who in the company should influence IT-relevant decisions and how:
- Fundamental goals and success factors of IT governance
- Design elements of IT governance: structures and processes, decision rights, relational mechanisms, and more
- COBIT as a framework for IT governance
- Specific decision domains, fields of action, and responsibilities
- Managing, evolving, and measuring the success of IT governance
A concise guide that leads you step by step to a design of the IT management system that fits your organization.
Schatten-IT
(2015)
The topic of shadow IT has gained considerable importance in research and practice only in recent years.[1] The term shadow IT refers to systems that business units procure or build on their own initiative and that are not integrated into the company's IT service management.[2] Shadow IT is neither new nor rare. Despite this long history, little precise information was available about its prevalence. In an article in this journal in 2011, we described the risks of shadow IT and thereby derived the need to engage with the topic.[3] In the shadow IT research project of the Konstanzer Institut für Prozesssteuerung, a method for managing shadow IT was subsequently developed and evaluated in several case studies. The goal of this article is to present the empirical results from these case studies.[4] The data basis consists of case studies in four companies, in which a total of 386 shadow IT systems were found. Two of the companies come from the finance and insurance sector, the other two from the manufacturing industry. One departmental area in each company was examined. Drawing on these case studies, the second chapter presents the manifestations of the uncovered shadow IT. The third chapter then describes the business relevance of shadow IT. In the fourth section, we address the quality of the shadow IT systems. A follow-up article will describe possible auditing approaches.
Schatten-IT (Teil 2)
(2017)
The term shadow IT describes systems that are developed and operated autonomously by business departments outside the responsibility of the IT organization. Although this phenomenon is widespread, shadow IT long went unnoticed in both research and practice. For some time now, increased engagement with the phenomenon has been evident, but there are still few empirically supported findings about shadow IT. This article addresses that gap. Based on four case studies, the prevalence, use, and quality of shadow IT are determined. The findings show that shadow IT can be associated with considerable risks, while its quality lags noticeably behind. It is also apparent that shadow IT does not appear within the scope of companies' control systems. Measures such as raising awareness of shadow IT within the company and developing shadow IT into managed business-unit IT can effectively reduce the risks. In addition, companies' control systems must be adapted so that shadow IT becomes visible within them.
According to the current state of the art, the decontamination of problem areas such as corners and inner edges largely relies on equipment from conventional remediation. Machines such as needle scalers and bush hammers expose workers to strong vibrations and high reaction forces. Correspondingly long break times are therefore required, further reducing the already low removal rate. In addition to this extra effort, the equipment can, due to missing extraction systems, lead to the spread of contamination: contaminated particles are distributed in areas that have already been decontaminated, partially undoing the progress achieved.
Because of the many disadvantages of the equipment used so far, the research project EKONT-1 was initiated and carried out to develop an innovative, semi-automated device for dry-mechanical decontamination of corners, edges, and problem areas in nuclear facilities. Within this project, many new insights were gained and several functional prototypes were developed, built, and tested both in the laboratory and in practical use. Since the trials revealed further potential for improvement, the follow-up project EKONT-2, which focuses on the further development of the existing prototypes, was launched on 1 July 2023.
Technology-based startups make a substantial contribution to economic and social development. In the course of their founding, they require support in the form of venture capital, which in the seed and early stage is primarily provided by business angels (BAs). However, the procedures and evaluation criteria of the BA investment process have so far been insufficiently researched. This article uses expert interviews within a case study of the entrepreneurial ecosystem of Baden-Württemberg to identify how BAs evaluate and select technology-based startups. In addition, the criteria by which BAs distinguish promising from failing startups are derived. The article thus contributes to opening the "black box" of investment activities in the earliest founding phases.
Digitalisierung im Bauwesen
(2015)
In the digital age, information technology (IT) is a strategic asset for organizations. As a result, IT costs are rising, and the cost-effective management of IT is crucial. Nevertheless, organizations still face major challenges, and previous studies lack comprehensiveness and depth. The goal of this paper is to generate a deep and holistic view of current challenges in managing IT costs. In 15 expert interviews, we identify 23 challenges divided into 7 categories. The main challenges are ensuring transparency of IT cost information, demonstrating the business impact of IT, and changing the mindset regarding the value of IT; overcoming them requires attention to their interactions. Hence, this paper leads to a better understanding of the issues that IT cost management (ITCM) faces in the digital age and builds a base for future research.
Nowadays, organizations must invest strategically in information technology (IT) and choose the right digital initiatives to maximize their benefit. Nevertheless, Chief Information Officers still struggle to communicate IT costs and demonstrate the business value of IT. The goal of this paper is to support their effective communication. In focus groups, we analyzed how different stakeholders perceive IT costs and the business value of IT as the basis of communication. We identified 16 success factors to establish effective communication. Hence, this paper enables a better understanding of the perception and the operationalization of effective communication.
Nowadays, information technology (IT) is a strategic asset for organizations. As a result, the IT costs are rising and there is a need for transparency about their root causes. Cost drivers as an instrument in IT cost management enable a better transparency and understanding of costs. However, there is a lack of IT cost driver research with a focus on the strategic position of IT within organizations. The goal of this paper is to develop a comprehensive overview of strategic drivers of IT costs. The Delphi study leads to the identification and validation of 17 strategic drivers. Hence, this paper builds a base for cost driver analysis and contributes to a better understanding of the causes of costs. It facilitates future research regarding cost behavior and the business value of IT. Additionally, practitioners gain awareness of levers to influence IT costs and consequences of managerial decisions on their IT spend.
We identify 74 generic, reusable technical requirements based on the GDPR that can be applied to software products which process personal data. The requirements can be traced to corresponding articles and recitals of the GDPR and fulfill the key principles of lawfulness and transparency. Therefore, we present an approach to requirements engineering with regard to developing legally compliant software that satisfies the principles of privacy by design, privacy by default as well as security by design.
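A traceability link between generic requirements and GDPR articles, as described above, could be represented with a structure like the following sketch. The requirement texts and helper names are hypothetical; the cited article and recital numbers are real GDPR references (Art. 6(1) on lawfulness, Art. 12(1) and Recital 58 on transparency):

```python
from dataclasses import dataclass, field

@dataclass
class Requirement:
    """One reusable technical requirement with its GDPR trace links."""
    rid: str
    text: str
    gdpr_articles: list = field(default_factory=list)   # e.g. ["Art. 6(1)"]
    gdpr_recitals: list = field(default_factory=list)   # e.g. ["Recital 58"]

def coverage(requirements):
    """Set of GDPR articles covered by a requirement catalogue."""
    return {a for r in requirements for a in r.gdpr_articles}
```

Such a structure lets a catalogue of requirements be queried for which articles are covered and, by inversion, which articles still lack a corresponding requirement.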
The overall goal of this work is to detect and analyze a person's movement, breathing, and heart rate during sleep in a regular bed overnight, without any additional physical contact. The measurement is performed with the help of sensors placed between the mattress and the frame. A two-stage pattern classification algorithm has been implemented that applies statistical analysis to recognize the position of patients. The system is implemented as a sensor network hosting several nodes and communication endpoints to support quick and efficient classification. The overall tests show convincing results for position recognition and a reasonable overlap in matching.
This paper shows how a bed system with embedded sensors is able to analyze a person's movement and breathing and to recognize the positions in which the subject lies on the bed during the night, without any additional physical contact. The measurements are performed with sensors placed between the mattress and the frame. An Intel Edison board was used as an endpoint that served as a communication node from the mesh network to an external service. Two nodes and the Intel Edison are attached to the bottom of the bed frame and connected to the sensors.
Research Report
(2024)
Requirements Engineering in Business Analytics for Innovation and Product Lifecycle Management
(2014)
Considering Requirements Engineering (RE) in business analytics, which involves market-oriented management, computer science, and statistics, may be valuable for managing innovation in Product Lifecycle Management (PLM). RE and business analytics can help maximize the value of corporate product information throughout the value chain, starting with innovation management. Innovation and PLM must address 1) big data, 2) the development of well-defined business goals and principles, 3) cost/benefit analysis, 4) continuous change management, and 5) statistics and reporting. This paper is a positioning note that addresses some business case considerations for analytics projects involving PLM data, patents, and innovations. We describe a number of research challenges in RE that arise when large volumes of PLM data are to be turned into a successful market-oriented innovation management strategy. We provide a draft of how to address these research challenges.
Deep Learning-based EEG Detection of Mental Alertness States from Drivers under Ethical Aspects
(2022)
One of the most critical factors for a successful road trip is a high degree of alertness while driving. Even a split second of inattention or sleepiness in a crucial moment can make the difference between life and death. Several prestigious car manufacturers are currently pursuing automated drowsiness identification to address this problem. The path between neuroscientific research in connection with artificial intelligence and the preservation of the dignity and inviolability of the human individual is very narrow. The key contribution of this work is a system for analyzing EEG data during a driving session, drawing on previous studies analyzing heart rate (ECG), brain waves (EEG), and eye function (EOG). The gathered data is treated as sensitively as possible, taking ethical regulations into consideration. Obtaining evaluable signs of evolving exhaustion involves techniques that extract sleep-stage frequencies; a challenge here is the correlated interference in the signal. This research focuses on a processing chain for EEG band splitting that involves band-pass filtering, principal component analysis (PCA), independent component analysis (ICA) with automatic artifact removal, and the fast Fourier transform (FFT). The classification is based on a step-by-step adaptive deep learning analysis that detects theta rhythms as a drowsiness predictor in the pre-processed data. An offline detection rate of 89% and an online detection rate of 73% were obtained. The method is tied to the simulated driving scenario for which it was developed, which leaves room for further optimization of laboratory methods and data collection during wakefulness-dependent operations.
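The theta-rhythm indicator at the end of the chain can be illustrated with a minimal FFT-based band-power estimate. This sketch omits the PCA/ICA artifact-removal stages and the deep learning classifier; the function names are illustrative, and the 4-8 Hz theta band is the conventional definition:

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Relative spectral power in [low, high] Hz via an FFT periodogram."""
    spec = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    band = spec[(freqs >= low) & (freqs <= high)].sum()
    total = spec[freqs > 0].sum()        # ignore the DC component
    return band / total

def theta_ratio(eeg, fs):
    """Theta (4-8 Hz) share of total power, a simple drowsiness indicator."""
    return band_power(eeg, fs, 4.0, 8.0)
```

A rising theta share over successive analysis windows would serve as the drowsiness predictor that the deep learning stage refines.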
Rheumatoid arthritis is an autoimmune disease that causes chronic inflammation of synovial joints, often resulting in irreversible structural damage. The activity of the disease is evaluated by clinical examinations, laboratory tests, and patient self-assessment. The long-term course of the disease is assessed with radiographs of hands and feet. The evaluation of the X-ray images performed by trained medical staff requires several minutes per patient. We demonstrate that deep convolutional neural networks can be leveraged for a fully automated, fast, and reproducible scoring of X-ray images of patients with rheumatoid arthritis. A comparison of the predictions of different human experts and our deep learning system shows that there is no significant difference in the performance of human experts and our deep learning model.
Nowadays, most digital modulation schemes are based on conventional signal constellations that have no algebraic group, ring, or field properties, e.g., square quadrature-amplitude modulation constellations. Signal constellations with algebraic structure can enhance the system performance. For instance, multidimensional signal constellations based on dense lattices can achieve performance gains due to the dense packing. The algebraic structure enables low-complexity decoding and detection schemes. In this work, signal constellations with algebraic properties and their application in spatial modulation transmission schemes are investigated. Several design approaches for two- and four-dimensional signal constellations based on Gaussian, Eisenstein, and Hurwitz integers are shown. Detection algorithms with reduced complexity are proposed. It is shown that the proposed Eisenstein and Hurwitz constellations combined with the proposed suboptimal detection can outperform conventional two-dimensional constellations with ML detection.
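A standard way to obtain a finite Gaussian-integer constellation with field structure is reduction modulo a Gaussian prime, as used for Mannheim-style codes. The sketch below is a generic illustration of this construction, not a reproduction of the specific designs in the work above; component-wise rounding picks the minimal-norm residue (no rounding ties occur for odd-norm primes):

```python
def g_mod(x, pi):
    """Reduce a Gaussian integer x modulo pi via component-wise rounding
    of the quotient x * conj(pi) / |pi|^2."""
    q = x * pi.conjugate() / (pi.real ** 2 + pi.imag ** 2)
    q = complex(round(q.real), round(q.imag))
    return x - q * pi

def gaussian_constellation(pi, span=6):
    """All distinct residues modulo pi; for a Gaussian prime of norm p
    this yields a constellation of exactly p points."""
    pts = set()
    for a in range(-span, span + 1):
        for b in range(-span, span + 1):
            pts.add(g_mod(complex(a, b), pi))
    return pts
```

Because the residues form a field of p elements, the constellation doubles as a code alphabet, which is exactly what makes such constellations attractive for low-complexity detection.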
This work proposes a construction for low-density parity-check (LDPC) codes over finite Gaussian integer fields. Furthermore, a new channel model for codes over Gaussian integers is introduced and its channel capacity is derived. This channel can be considered as a first order approximation of the additive white Gaussian noise channel with hard decision detection where only errors to nearest neighbors in the signal constellation are considered. For this channel, the proposed LDPC codes can be decoded with a simple non-probabilistic iterative decoding algorithm similar to Gallager's decoding algorithm A.
This paper proposes a novel transmission scheme for generalized multistream spatial modulation. This new approach uses Mannheim error-correcting codes over Gaussian or Eisenstein integers as multidimensional signal constellations. These codes enable a suboptimal decoding strategy with near maximum likelihood performance for transmission over the additive white Gaussian noise channel. In this contribution, this decoding algorithm is generalized to the detection for generalized multistream spatial modulation. The proposed method can outperform conventional generalized multistream spatial modulation with respect to decoding performance, detection complexity, and spectral efficiency.
Spatial modulation is a low-complexity multiple-input/multiple-output transmission technique. The recently proposed spatial permutation modulation (SPM) extends the concept of spatial modulation. It is a coding approach where the symbols are dispersed in space and time. In the original proposal of SPM, short repetition codes and permutation codes were used to construct a space-time code. In this paper, we propose a similar coding scheme that combines permutation codes with codes over Gaussian integers. Short codes over Gaussian integers have good distance properties. Furthermore, the code alphabet can directly be applied as a signal constellation, hence no mapping is required. Simulation results demonstrate that the proposed coding approach outperforms SPM with repetition codes.
Four-Dimensional Hurwitz Signal Constellations, Set Partitioning, Detection, and Multilevel Coding
(2021)
The Hurwitz lattice provides the densest four-dimensional packing. This fact has motivated research on four-dimensional Hurwitz signal constellations for optical and wireless communications. This work presents a new algebraic construction of finite sets of Hurwitz integers that is inherently accompanied by a respective modulo operation. These signal constellations are investigated for transmission over the additive white Gaussian noise (AWGN) channel. It is shown that these signal constellations have a better constellation figure of merit and hence a better asymptotic performance over an AWGN channel when compared with conventional signal constellations with algebraic structure, e.g., two-dimensional Gaussian-integer constellations or four-dimensional Lipschitz-integer constellations. We introduce two concepts for set partitioning of the Hurwitz integers. The first method is useful to reduce the computational complexity of the symbol detection. This suboptimum detection approach achieves near-maximum-likelihood performance. In the second case, the partitioning exploits the algebraic structure of the Hurwitz signal constellations. We partition the Hurwitz integers into additive subgroups in a manner that the minimum Euclidean distance of each subgroup is larger than in the original set. This enables multilevel code constructions for the new signal constellations.
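The Hurwitz integers referred to above are quaternions whose four components are either all integers or all half-integers. The sketch below enumerates a finite portion of the lattice and checks its minimum squared Euclidean distance; it illustrates the lattice itself and is not the paper's partitioning or modulo construction:

```python
import itertools

def hurwitz_points(r=1):
    """Hurwitz integers with components in [-r, r]: the integer points
    plus the half-integer cosets (all four components half-integral)."""
    pts = set()
    for q in itertools.product(range(-r, r + 1), repeat=4):
        pts.add(q)
    halves = [k + 0.5 for k in range(-r, r)]
    for q in itertools.product(halves, repeat=4):
        pts.add(q)
    return pts

def min_dist2(pts):
    """Minimum squared Euclidean distance between distinct points."""
    pts = list(pts)
    best = float("inf")
    for i in range(len(pts)):
        for j in range(i + 1, len(pts)):
            d = sum((a - b) ** 2 for a, b in zip(pts[i], pts[j]))
            best = min(best, d)
    return best
```

The half-integer points such as (0.5, 0.5, 0.5, 0.5) sit at squared distance 1 from the integer points, so adding them doubles the point density without reducing the minimum distance; this is the density advantage that motivates Hurwitz constellations.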