IT-Compliance in KMU
(2023)
This thesis highlights two problems that reports generated by vulnerability scanners impose on the vulnerability management process: (a) an overwhelming amount of data and (b) insufficient prioritization of the scan results.
To assist in developing countermeasures to these problems and to allow for a quantitative evaluation of the resulting solutions, two metrics are proposed for their effectiveness and efficiency. These metrics imply a focus on higher-severity vulnerabilities and can be applied to any simplification process for vulnerability scan results, provided that it relies on a severity score and an estimated remediation time for each vulnerability.
A priority score is introduced that aims to improve on the widely used Common Vulnerability Scoring System (CVSS) base score of each vulnerability, taking into account the vulnerability's ease of exploitation, its estimated probability of exploitation, and the probability of its existence.
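As a rough illustration, such a score could scale the CVSS base score by the three factors named above. The function name, the equal weighting, and the value ranges below are assumptions for the sake of the sketch, not the formula from the thesis.

```python
def priority_score(cvss_base, ease_of_exploit, p_exploitation, p_existence):
    """Hypothetical priority score: scale the CVSS base score (0-10) by the
    mean of three factors in [0, 1]; the thesis's actual weighting may differ."""
    if not 0.0 <= cvss_base <= 10.0:
        raise ValueError("CVSS base score must be in [0, 10]")
    factor = (ease_of_exploit + p_exploitation + p_existence) / 3.0
    return round(cvss_base * factor, 2)
```

With equal weighting, a critical finding that is easy to exploit and very likely present keeps almost its full base score, while uncertain findings are pushed down the priority list.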
Patterns between vulnerabilities are discovered within the reports generated by the Open Vulnerability Assessment System (OpenVAS) vulnerability scanner; these patterns identify criteria by which vulnerabilities can be categorized from a remediation actor's standpoint. The categories lay the groundwork for a final simplified report and consist of updates that need to be installed on a host, severe vulnerabilities, vulnerabilities that occur on multiple hosts, and vulnerabilities whose remediation will take a lot of time. The highest potential time savings are found among frequently occurring vulnerabilities and minor and major suggested updates.
Processing the results provided by the vulnerability scanner and creating the report is realized as a Python script. The resulting reports are short and to the point and provide a top-down remediation process that should, in theory, minimize the institution's attack surface as quickly as possible. An evaluation of their practicality must follow, as the reports have yet to be introduced into the Information Security Management Lifecycle.
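The four report categories described above can be sketched as a small categorization step. The field names (`name`, `host`, `cvss`, `fix_hours`) and the thresholds are illustrative assumptions, not taken from the thesis's actual script.

```python
from collections import defaultdict

def categorize(findings, severe_cvss=7.0, long_fix_hours=4.0, multi_host_min=3):
    """Sort scanner findings into the four report categories described above.
    Field names and thresholds are illustrative assumptions."""
    hosts_per_vuln = defaultdict(set)
    for f in findings:
        hosts_per_vuln[f["name"]].add(f["host"])
    report = {"updates": defaultdict(list), "severe": [],
              "widespread": [], "time_consuming": []}
    for f in findings:
        if "update" in f["name"].lower():          # updates to install per host
            report["updates"][f["host"]].append(f["name"])
        if f["cvss"] >= severe_cvss:               # severe vulnerabilities
            report["severe"].append(f)
        if len(hosts_per_vuln[f["name"]]) >= multi_host_min:  # multi-host
            report["widespread"].append(f)
        if f["fix_hours"] >= long_fix_hours:       # long remediation time
            report["time_consuming"].append(f)
    return report
```

A finding may appear in several categories at once; a real report would additionally deduplicate and order the entries top-down by priority.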
Systematic Generation of XSS and SQLi Vulnerabilities in PHP as Test Cases for Static Code Analysis
(2022)
Synthetic static code analysis test suites are important to test the basic functionality of tools. We present a framework that uses different source code patterns to generate Cross Site Scripting and SQL injection test cases. A decision tree is used to determine if the test cases are vulnerable. The test cases are split into two test suites. The first test suite contains 258,432 test cases that have influence on the decision trees. The second test suite contains 20 vulnerable test cases with different data flow patterns. The test cases are scanned with two commercial static code analysis tools to show that they can be used to benchmark and identify problems of static code analysis tools. Expert interviews confirm that the decision tree is a solid way to determine the vulnerable test cases and that the test suites are relevant.
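A minimal sketch of such combinatorial test-case generation, assuming PHP snippets are assembled from source, data-flow, sanitizer, and sink fragments. The fragments and the one-rule stand-in for the decision tree are invented for illustration; the actual framework uses far more patterns and a real decision tree.

```python
from itertools import product

# Illustrative PHP fragments; the real framework uses many more patterns.
SOURCES = ['$a = $_GET["p"];', '$a = $_COOKIE["p"];']
FLOWS = ['$b = $a;', '$b = trim($a);']
SANITIZERS = ['', '$b = htmlspecialchars($b);']  # '' means: no sanitization
SINKS = ['echo $b;']

def is_vulnerable(sanitizer):
    # Stand-in for the decision tree: here a case counts as vulnerable
    # exactly when no sanitizing call appears on the data-flow path.
    return sanitizer == ''

def generate_cases():
    """Build one PHP test case per combination of pattern fragments."""
    cases = []
    for src, flow, san, sink in product(SOURCES, FLOWS, SANITIZERS, SINKS):
        lines = [line for line in (src, flow, san, sink) if line]
        cases.append({"php": "<?php\n" + "\n".join(lines) + "\n?>",
                      "vulnerable": is_vulnerable(san)})
    return cases
```

The Cartesian product explains the large test-suite size reported above: test-case counts grow multiplicatively with each added pattern dimension.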
The influence of sleep on human life, including its physiological, psychological, and mental aspects, is remarkable. It is therefore essential to apply appropriate therapy in the case of sleep disorders. For this, however, the irregularities must first be recognised, preferably in a way that is convenient for the person concerned. This dissertation, structured as a composition of research articles, presents the development of mathematically based algorithmic principles for a sleep analysis system. The particular focus is on the classification of sleep stages with a minimal set of physiological parameters. In addition, aspects of using the sleep analysis system as part of more complex healthcare systems are explored. The design of hardware for the non-obtrusive measurement of relevant physiological parameters and the use of such systems to detect other sleep disorders, such as sleep apnoea, are also addressed. Based on the investigations carried out, multinomial logistic regression was selected as the basis for development. By following a methodical procedure, the number of physiological parameters necessary for the classification of sleep stages was successively reduced to two: respiratory and movement signals. These signals can be measured in a contactless way. A prototype implementation of the developed algorithms was produced to validate the proposed method, and an evaluation of 19,324 sleep epochs was carried out. The results, with an achieved accuracy of 73% in the classification of Wake/NREM/REM stages and a Cohen's kappa of 0.44, outperform the state of the art and demonstrate the appropriateness of the selected approach. In the future, this method could enable convenient, cost-effective, and accurate sleep analysis, leading to the detection of sleep disorders at an early stage so that therapy can be initiated as soon as possible, thus improving the general population's health status and quality of life.
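The reported agreement measure can be reproduced with a small helper. This is the standard Cohen's kappa computation (observed agreement corrected for chance agreement), not code from the dissertation.

```python
def cohens_kappa(y_true, y_pred):
    """Cohen's kappa: observed agreement corrected for chance agreement.
    Inputs are equal-length label sequences, e.g. Wake/NREM/REM per epoch."""
    assert len(y_true) == len(y_pred) and y_true
    n = len(y_true)
    labels = set(y_true) | set(y_pred)
    observed = sum(t == p for t, p in zip(y_true, y_pred)) / n
    expected = sum((y_true.count(l) / n) * (y_pred.count(l) / n)
                   for l in labels)
    return (observed - expected) / (1.0 - expected)
```

A kappa of 0.44 at 73% accuracy indicates moderate agreement beyond chance, which matters here because sleep-stage labels are heavily imbalanced (NREM dominates a night's epochs).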
The subject of this bachelor's thesis is the automated extraction of polygonal chains from a floor-plan image. These polygonal chains are intended to represent the rooms. To this end, an algorithm for floor-plan image processing was developed and implemented in Python. First, the floor-plan image is cleaned, i.e. unwanted image structures are blurred out. The edges are then detected using the Canny edge detector, after which the corners in the floor-plan image are located via the Harris corner detector. To connect the corners in a meaningful way, a modified form of Dijkstra's algorithm is used. The resulting data serve to create the polygonal chains, which are required for the pFlow simulation. The developed algorithm is particularly suitable for clear and simple floor-plan images.
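To illustrate the corner-connection step, a plain Dijkstra over a weighted corner graph is sketched below. The thesis uses a modified variant whose exact modification is not described in the abstract, so this is the textbook algorithm, with detected corners as nodes and, e.g., edge-pixel distances as weights.

```python
import heapq

def dijkstra(adjacency, start):
    """Shortest path distances from `start` over a weighted corner graph.
    `adjacency` maps a corner id to a list of (neighbour, edge_weight)."""
    dist = {start: 0.0}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry, a shorter path was already found
        for neighbour, weight in adjacency.get(node, []):
            nd = d + weight
            if nd < dist.get(neighbour, float("inf")):
                dist[neighbour] = nd
                heapq.heappush(heap, (nd, neighbour))
    return dist
```

Connecting each corner to its cheapest-reachable neighbours along detected edges yields the polygonal chains that trace the room outlines.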
The digital transformation of business processes and the stronger integration of IT systems create both opportunities and risks for small and medium-sized enterprises (SMEs), risks that can in particular result in a lack of IT compliance. As studies show, however, SMEs lag behind capital-market-oriented companies with respect to IT compliance measures [1]. Using expert interviews and a qualitative data analysis, this paper investigates which measures are currently implemented and how the perceived compliance maturity level is assessed. Furthermore, the reasons and motives that led to this state are discussed. Finally, drivers are identified that could lead to greater awareness in the future. The work presents interesting findings from practice, as the expert interviews provide insights into the current status quo with regard to IT compliance.
Dynamic Real-Time Range Queries (DRRQ) are a common means of handling mobile clients in high-density areas where both the clients requested by the query and the inquirers are mobile. In contrast to the very well-known continuous range queries, only a few approaches, such as Adaptive Quad Streaming (AQS), address the mandatory scalability and real-time requirements of these so-called ad-hoc mobility challenges. In this paper we present the highly decentralized solution Adaptive Quad Streaming Flexible (AQSflex), an extension of the existing, more theoretical AQS approach. Besides a highly distributed cell structure without data structures and a lightweight streaming communication, we use a multi-cell assignment on limited pool resources instead of an idealistic unlimited cell-per-server assignment. The described experimental results show the potential of our local capacity balancing scheme for cell handover in a strongly decentralized setting. The leaves of a cell hierarchy define a kind of self-optimizing fuzzy edge for the processing resources in high-density systems without any centralized controlling or cloud component.
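A minimal quad-cell hierarchy illustrates the core idea of handing an overloaded cell over to four sub-cells. The capacity threshold and the splitting rule below are simplifications for illustration, not the AQS/AQSflex assignment scheme itself.

```python
class Cell:
    """Minimal quad cell: splits into four sub-cells when overloaded.
    A simplification of the adaptive cell hierarchy, not AQSflex itself."""

    def __init__(self, x, y, size, capacity=4):
        self.x, self.y, self.size, self.capacity = x, y, size, capacity
        self.clients = []     # (px, py) positions handled by this leaf
        self.children = None  # four sub-cells once the cell has split

    def insert(self, px, py):
        if self.children is not None:  # inner cell: route to a sub-cell
            half = self.size / 2
            idx = (px >= self.x + half) + 2 * (py >= self.y + half)
            self.children[idx].insert(px, py)
            return
        self.clients.append((px, py))
        if len(self.clients) > self.capacity:  # overload: hand over downwards
            half = self.size / 2
            self.children = [Cell(self.x + dx * half, self.y + dy * half,
                                  half, self.capacity)
                             for dy in (0, 1) for dx in (0, 1)]
            pending, self.clients = self.clients, []
            for p in pending:
                self.insert(*p)

    def leaves(self):
        if self.children is None:
            return [self]
        return [leaf for c in self.children for leaf in c.leaves()]
```

The leaves of this hierarchy correspond to the "fuzzy edge" mentioned above: dense areas end up with many small cells, sparse areas with few large ones, without any central controller deciding the layout.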
Ballistocardiography (BCG) can be used to monitor heart rate activity. Since the accelerometer should have high sensitivity and minimal internal noise, a low-cost approach was taken into consideration. Several measurements were carried out to determine the optimal position of a sensor under the mattress so that the obtained signal is strong enough for further analysis. A prototype of an unobtrusive accelerometer-based measurement system was developed and tested in a conventional bed without any specific extras. The influence of the human sleep position on the accelerometer output data was tested. The obtained results indicate the potential to capture BCG signals using accelerometers. The measurement system can detect heart rate unobtrusively in the home environment.
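As a toy illustration of deriving a heart rate from a BCG-like signal, a simple threshold-based peak counter with a refractory period is sketched below. The abstract does not describe the actual signal processing, so the thresholds and the method are assumptions.

```python
def heart_rate_bpm(signal, fs, threshold=0.5, min_gap_s=0.4):
    """Estimate heart rate from a BCG-like signal by counting local maxima
    above `threshold`, at least `min_gap_s` seconds apart (refractory period).
    A deliberately simple stand-in for real BCG beat detection."""
    min_gap = int(min_gap_s * fs)
    peaks, last = [], -min_gap
    for i in range(1, len(signal) - 1):
        if (signal[i] > threshold and signal[i] >= signal[i - 1]
                and signal[i] > signal[i + 1] and i - last >= min_gap):
            peaks.append(i)
            last = i
    if len(peaks) < 2:
        return 0.0  # not enough beats to estimate an interval
    mean_interval = (peaks[-1] - peaks[0]) / (len(peaks) - 1) / fs
    return 60.0 / mean_interval
```

In practice, mattress-damped BCG signals require band-pass filtering and more robust beat detection, but the interval-to-BPM conversion works the same way.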
We compared vulnerable and fixed versions of the source code of 50 different PHP open source projects based on CVE reports for SQL injection vulnerabilities. We scanned the source code with commercial and open source tools for static code analysis. Our results show that five current state-of-the-art tools have issues correctly marking vulnerable and safe code. We identify 25 code patterns that are not detected as a vulnerability by at least one of the tools and 6 code patterns that are mistakenly reported as a vulnerability that cannot be confirmed by manual code inspection. Knowledge of the patterns could help vendors of static code analysis tools, and software developers could be instructed to avoid patterns that confuse automated tools.