Photovoltaik
(2010)
Planspiel KLIMA 10.X
(1992)
PowerPoint-Rhetorik
(2017)
Interpretability and uncertainty modeling are key factors for medical applications. Moreover, data in medicine are often available as a combination of unstructured data like images and structured predictors like the patient's metadata. While deep learning models are state of the art for image classification, they are often referred to as 'black boxes' because they lack interpretability. Moreover, deep learning models often yield only point predictions and tend to be overconfident in their parameter estimates and outcome predictions.
Statistical regression models, on the other hand, provide interpretable predictor effects and can capture parameter and model uncertainty via the Bayesian approach. In this thesis, a publicly available melanoma dataset, consisting of skin lesion images and the patient's age, is used to predict melanoma types with a semi-structured model, while interpretable components are preserved and model uncertainty is quantified. For the Bayesian models, the transformation-model-based variational inference (TM-VI) method is used to determine the posterior distributions of the parameters. Several model configurations consisting of the patient's age and/or the skin lesion image were implemented and evaluated. Predictive performance was best for the combined model of image and patient's age, which additionally provides an interpretable posterior distribution of the regression coefficient. In addition, integrating uncertainty into both the image and the tabular part results in larger output variability, reflecting the high uncertainty of the individual model components.
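To make the model structure concrete, here is a minimal sketch of a semi-structured network in which a CNN handles the lesion image while a single regression coefficient for age stays interpretable. All names and shapes are hypothetical and PyTorch is assumed; this illustrates the idea rather than the thesis implementation (in the Bayesian variant, a variational posterior would replace the point parameter for the age coefficient).

```python
# Hypothetical sketch of a semi-structured model: CNN for the image,
# one interpretable linear coefficient for the patient's age.
import torch
import torch.nn as nn

class SemiStructuredModel(nn.Module):
    def __init__(self):
        super().__init__()
        # Unstructured part: small CNN mapping the image to a single score.
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
            nn.Flatten(), nn.Linear(32, 1),
        )
        # Structured part: interpretable regression coefficient for age.
        # In the Bayesian (TM-VI) setting this would carry a posterior.
        self.beta_age = nn.Parameter(torch.zeros(1))
        self.intercept = nn.Parameter(torch.zeros(1))

    def forward(self, image, age):
        # Additive predictor on the logit scale:
        # eta = b0 + f(image) + beta_age * age
        return self.intercept + self.cnn(image).squeeze(-1) + self.beta_age * age

model = SemiStructuredModel()
logits = model(torch.randn(8, 3, 64, 64), torch.randn(8))  # toy batch
```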
Forecasting is crucial for both system planning and operations in the energy sector. With the increasing penetration of renewable energy sources, growing fluctuations in power generation need to be taken into account. Probabilistic load forecasting is a young but emerging research topic focusing on the prediction of future uncertainties. However, the majority of publications so far focus on techniques like quantile regression, ensemble-based, or scenario-based methods, which generate discrete quantiles or sets of possible load curves. The conditional probability distribution remains unknown and can only be estimated by post-processing the output with a statistical method like kernel density estimation.
Instead, the proposed probabilistic deep learning model uses a cascade of transformation functions, known as a normalizing flow, to model the conditional density function on a smart meter dataset containing electricity demand information for over 4,000 buildings in Ireland. Since the whole probability density function is tractable, the parameters of the model can be obtained by minimizing the negative log-likelihood with standard gradient descent. This yields the model that best represents the data distribution.
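As a rough illustration of why the tractable density makes training straightforward, the following sketch evaluates the change-of-variables log-likelihood for a toy one-layer flow; the transformation and its parameters are hypothetical stand-ins for the conditioned flow described above, not the paper's model.

```python
# Change-of-variables objective behind a normalizing flow (toy example):
# the flow z = f(y) maps a load value to a latent with a simple base
# density, and log p(y) = log p_Z(f(y)) + log |f'(y)| is exact.
import torch

def flow_nll(y, a, b):
    """NLL under a toy one-layer flow z = a*y + b with standard normal base.
    In the actual model, a and b would come from a conditioning network."""
    z = a * y + b
    base = torch.distributions.Normal(0.0, 1.0)
    log_det = torch.log(torch.abs(a))      # log |dz/dy|
    log_lik = base.log_prob(z) + log_det   # change of variables
    return -log_lik.mean()                 # minimized by gradient descent

y = torch.randn(48)                        # e.g. one day of half-hourly loads
a = torch.tensor(1.5, requires_grad=True)
b = torch.tensor(0.0, requires_grad=True)
loss = flow_nll(y, a, b)
loss.backward()                            # gradients for the flow parameters
```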
Two different deep learning models were compared: a simple three-layer fully connected neural network and a more advanced convolutional neural network for sequential data processing, inspired by the WaveNet architecture. These models were used to parametrize three different probabilistic models: a simple normal distribution, a Gaussian mixture model, and the normalizing flow model. The prediction horizon is one day with a resolution of 30 minutes, so the models predict 48 conditional probability distributions.
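For the mixture variant, a sketch of how a network head could parametrize 48 per-step Gaussian mixtures is shown below; the layer sizes and number of components are assumptions for illustration, not the architectures from the study.

```python
# Illustrative head that outputs one Gaussian mixture per forecast step.
import torch
import torch.nn as nn
import torch.nn.functional as F

HORIZON, COMPONENTS = 48, 3   # 48 half-hour steps, 3 mixture components

class GMMHead(nn.Module):
    def __init__(self, hidden=64):
        super().__init__()
        # Three outputs per step and component: weight logit, mean, log-scale.
        self.out = nn.Linear(hidden, HORIZON * COMPONENTS * 3)

    def forward(self, h):
        p = self.out(h).view(-1, HORIZON, COMPONENTS, 3)
        mix = torch.distributions.Categorical(logits=p[..., 0])
        comp = torch.distributions.Normal(p[..., 1], F.softplus(p[..., 2]) + 1e-3)
        # One mixture distribution for each of the 48 steps.
        return torch.distributions.MixtureSameFamily(mix, comp)

head = GMMHead()
dist = head(torch.randn(5, 64))                      # 5 samples, 48 mixtures
nll = -dist.log_prob(torch.randn(5, 48)).mean()      # training objective
```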
The normalizing flow model outperforms the other two variants for both architectures and proves its ability to capture the complex structures and dependencies causing the variations in the data. Understanding the stochastic nature of the task in such detail makes the methodology applicable to use cases beyond forecasting: it is shown how the model can be used to detect anomalies in the power grid or to generate synthetic scenarios for grid planning.
Probabilistic Short-Term Low-Voltage Load Forecasting using Bernstein-Polynomial Normalizing Flows
(2021)
The transition to a fully renewable energy grid requires better forecasting of demand at the low-voltage level. However, high fluctuations and increasing electrification cause huge forecast errors with traditional point estimates. Probabilistic load forecasts take future uncertainties into account and thus enable various applications in low-carbon energy systems. We propose an approach for flexible conditional density forecasting of short-term load based on Bernstein-Polynomial Normalizing Flows, where a neural network controls the parameters of the flow. In an empirical study with 363 smart meter customers, our density predictions compare favorably against Gaussian and Gaussian mixture densities and also outperform a non-parametric approach based on the pinball loss for 24h-ahead load forecasting, for two different neural network architectures.
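The core building block here is a monotone Bernstein polynomial. The following numpy sketch shows one common way to construct such a transformation, using a cumulative softplus to keep the coefficients increasing; this follows the usual construction in the Bernstein-flow literature and is not necessarily the paper's exact parametrization.

```python
# Monotone Bernstein-polynomial transformation (illustrative sketch).
import numpy as np
from math import comb

def bernstein_transform(y, theta_raw):
    """Map y in [0, 1] through a strictly increasing Bernstein polynomial.
    theta_raw: unconstrained coefficients, e.g. produced by a neural net."""
    M = len(theta_raw) - 1
    # Cumulative softplus enforces increasing coefficients, which makes
    # the polynomial monotone and hence invertible.
    theta = np.cumsum(np.concatenate(([theta_raw[0]],
                                      np.log1p(np.exp(theta_raw[1:])))))
    basis = np.array([comb(M, k) * y**k * (1 - y)**(M - k)
                      for k in range(M + 1)])
    return theta @ basis

y = np.linspace(0.01, 0.99, 5)
z = bernstein_transform(y, np.random.randn(9))   # degree-8 polynomial
```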
The target of this thesis is the introduction of a client management system (CMS) at Haaland Internet Productions (HiP), a web design and hosting company in Burbank, California, USA. The company needs a system to track orders and improve its workflow. HiP needs a system which not only tracks orders but also stores all client information in a database. This client information can be used for a variety of marketing and contact purposes and is an important and integral part of HiP's client relationship management (CRM). The lack of a cohesive CMS at HiP has caused many fundamental business problems, such as lost orders, missed billing statements, and over- or under-billing. The research done during the investigation and analysis of the company and its needs should lead to an overall system which fully meets the needs of HiP. This system could take the form of an off-the-shelf product with some customizations, or of a completely new in-house system. Each solution has its own pros and cons; the goal is to reach the decision that best fits HiP's needs and situation. The following is a concise version of the project. Particular emphasis is placed on the individual steps that made up the decision process, as well as the techniques and methods applied.
In tourism, energy demands are particularly high. Tourism facilities such as hotels require large amounts of electric and heating or cooling energy. Their supply, however, is usually still based on fossil fuels. This research approach analyses the potential of promoting renewable energies in Black Forest tourism. It focuses on a combined and hence highly efficient production of both electric and thermal energy by biogas plants on the one hand and its provision to local tourism facilities via short-distance networks on the other. Based on surveys and qualitative empiricism and considering regional resource availability as well as socio-economic aspects, it thus examines the strengths, weaknesses, opportunities and threats that can arise from such a cooperation.
In tourism, energy demands are particularly high. Tourism facilities such as hotels require large amounts of electric and heating/cooling energy, while their supply is usually still based on fossil fuels.
This research approach analyses the potential of promoting renewable energies in tourism. It focuses on a combined and hence highly efficient production of both electric and thermal energy by biogas plants on the one hand and its provision to local tourism facilities via short distance networks on the other. Considering regional resource availability as well as socio-economic aspects, it thus examines strengths, weaknesses, opportunities and threats that can arise from such a micro-cooperation. The research aim is to provide an actor-based, spatially transferable feasibility analysis.
The reasons for monitoring and reviewing a CMS can be manifold:
- The company and its management intend to reduce their liability risk (cf. e.g. the USSG, under which proof of an "Effective Program to Detect and Prevent Violations of Law" can be grounds for a reduced sentence)
- After first implementing a CMS, the management wants assurance of its effectiveness
- The supervisory board/audit committee fulfils its duty under §§ 111 Abs. 1, 107 Abs. 3 AktG and the recommendations of the German Corporate Governance Code (cf. clause 5.3.2 DCGK) to monitor the effectiveness of the internal control system in place; this also includes monitoring the adequacy and proper functioning of the CMS
- Examination as part of the statutory or voluntary audit of the financial statements
- Evidence of a CMS audited by an independent body in the context of M&A transactions
- Review of the CMS after a significant compliance violation
- Appointment of a so-called "compliance monitor" under US legal practice who, on the basis of an agreement with US authorities, monitors the company for several years to verify that the compliance measures put in place are actually effective
The RALV project (rasche Analyse der Lichtstärkeverteilung, rapid analysis of luminous intensity distributions) aimed to investigate to what extent luminous intensity distributions can be measured quickly using CCD camera images of scatter patterns on a screen. A sample is illuminated by a directed light source (laser). The scattered light hits a screen, where it produces a brightness distribution, which is photographed from the rear. From the geometry of the setup and the measured brightness distribution, the luminous intensity distribution, and thus the scattering characteristic, can be computed. Through a suitable choice of screen materials, a good camera design, and software developed for this purpose, the RALV project succeeded in demonstrating the feasibility of this method. The main tasks in the project were:
- finding suitable components: light sources, screen material, camera
- writing the required software for control, evaluation, and visualization
- comparing the RALV method with standard methods (planar goniometer)
At the end of the project, the data for building an industrially usable measuring device are available. The RALV measuring method proved to have its strengths with materials that have a specular preferred direction. As side results of the project, a planar goniometer with a 3-axis sample mount was built, and the weathering chamber at FH Konstanz was extended for photometric investigations.
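For the geometric step, converting measured screen illuminance into luminous intensity, a hedged sketch is given below; it assumes an ideal flat screen perpendicular to the laser axis and uses the textbook inverse-square and cosine laws of illuminance, not necessarily the project's exact calibration.

```python
# Illustrative conversion of screen illuminance to luminous intensity
# (assumption: point-like scattering sample on the screen normal).
import numpy as np

def intensity_from_screen(E, x, y, d):
    """E: illuminance measured at screen position (x, y); d: sample-screen
    distance. For a flat screen, E = I(theta) * cos^3(theta) / d^2, hence
    I(theta) = E * d^2 / cos^3(theta), with theta the scattering angle."""
    r = np.sqrt(x**2 + y**2)
    theta = np.arctan2(r, d)
    return E * d**2 / np.cos(theta)**3, np.degrees(theta)

I, theta_deg = intensity_from_screen(E=120.0, x=0.1, y=0.05, d=0.5)
```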
The performance and reliability of non-volatile NAND flash memories deteriorate as the number of program/erase cycles grows. The reliability also suffers from cell-to-cell interference, long data retention times, and read disturb. These processes affect the read threshold voltages: the aging of the cells causes voltage shifts, which lead to high bit error rates (BER) when fixed, pre-defined read thresholds are used. This work proposes two methods that aim at minimizing the BER by adjusting the read thresholds. Both methods utilize the number of errors detected in the codeword of an error-correcting code. It is demonstrated that the observed number of errors is a good measure of the voltage shifts, and it is utilized for the initial calibration of the read thresholds. The second approach is a gradual channel estimation method that utilizes the asymmetric error probabilities for one-to-zero and zero-to-one errors caused by threshold calibration errors. Both methods are investigated using the mutual information between the optimal read voltage and the measured error values.
Numerical results obtained from flash measurements show that these methods reduce the BER of NAND flash memories significantly.
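The gradual adjustment driven by asymmetric error counts can be pictured as follows; the step size, sign convention, and function name are hypothetical and depend on the level-to-bit mapping, so this is only a sketch of the idea, not the paper's algorithm.

```python
# Sketch: shift the read voltage towards balanced ECC error counts.
def adjust_threshold(v_read, n_10, n_01, step=0.01):
    """n_10: one-to-zero errors, n_01: zero-to-one errors reported by the ECC.
    A surplus of one error type indicates the read voltage sits off-center
    between the two threshold-voltage distributions, so shift towards balance.
    The shift direction depends on the level-to-bit mapping (assumed here)."""
    if n_10 > n_01:
        return v_read - step   # too many 1->0 errors: move threshold down
    if n_01 > n_10:
        return v_read + step   # too many 0->1 errors: move threshold up
    return v_read              # balanced error counts: near the optimum
```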
Non-volatile NAND flash memories store information as an electrical charge. Different read reference voltages are applied to read the data. However, the threshold voltage distributions vary due to aging effects like program/erase cycling and long data retention times. It is necessary to adapt the read reference voltages to different life-cycle conditions to minimize the error probability during readout. In the past, methods based on pilot data or high-resolution threshold voltage histograms were proposed to estimate the changes in the voltage distributions. In this work, we propose a machine learning approach that uses neural networks to estimate the read reference voltages. The proposed method utilizes sparse histogram data for the threshold voltage distributions. For reading the information from triple-level cell (TLC) memories, several read reference voltages are applied in sequence. We consider two histogram resolutions: the simplest histogram consists of the zero-and-one ratios for the hard-decision read operation, whereas a higher resolution is obtained by considering the quantization levels used for soft-input decoding. This approach does not require pilot data for the voltage adaptation. Furthermore, only a few measurements at extreme points of the threshold voltage distributions are required as training data. Measurements under different conditions verify the proposed approach, and the resulting neural networks perform well under other life-cycle conditions.
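To illustrate the setup, a minimal regression-network sketch is given below; feature and output dimensions follow from TLC cells needing seven read reference voltages, but the layer sizes and feature definition are assumptions, not the paper's model.

```python
# Hypothetical sketch: map sparse histogram features (e.g. zero/one ratios
# from hard-decision reads) to estimated read reference voltages.
import torch
import torch.nn as nn

N_FEATURES = 7   # e.g. one zero-ratio per read reference voltage
N_VOLTAGES = 7   # TLC cells (3 bits, 8 levels) need 7 read voltages

model = nn.Sequential(
    nn.Linear(N_FEATURES, 32), nn.ReLU(),
    nn.Linear(32, 32), nn.ReLU(),
    nn.Linear(32, N_VOLTAGES),      # regression: estimated read voltages
)

# Training would minimize the error against reference voltages measured at
# a few extreme life-cycle points (the sparse training data mentioned above).
x = torch.rand(16, N_FEATURES)      # toy batch of histogram features
loss = nn.functional.mse_loss(model(x), torch.rand(16, N_VOLTAGES))
```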