lnu.se Publications
1–50 of 9114 publications
• 1.
Univ Tecnol Eindhoven, Netherlands.
Univ Tecnol Eindhoven, Netherlands. Linnaeus University, Faculty of Health and Life Sciences, Department of Biology and Environmental Science.
Desafíos en la gestión de residuos sólidos para las ciudades de países en desarrollo [Solid waste management challenges for cities in developing countries]. 2015. In: Tecnología en Marcha, ISSN 0379-3982, Vol. 28, no 2, p. 141-168. Article in journal (Refereed)

Solid waste management is a challenge for city authorities in developing countries, mainly due to the increasing generation of waste, the burden it places on municipal budgets through high management costs, the lack of understanding of the diverse factors that affect the different stages of waste management, and the linkages needed for the entire handling system to function. An analysis of the literature on waste management in developing countries, reported mainly in publications from 2005 to 2011, showed that few articles give quantitative information. The analysis covered two major scientific journals, Waste Management and Waste Management & Research. The objective of this research was to determine the actions and behavior of the stakeholders who have a role in the waste management process and to analyze factors that influence the system, in more than thirty urban areas in 22 developing countries on 4 continents. A combination of methods was used in this study to assess the stakeholders and the factors influencing the performance of waste management in the cities. Data were collected from scientific literature, existing databases, observations made during visits to urban areas, structured interviews with relevant professionals, exercises given to workshop participants, and a questionnaire administered to stakeholders. Descriptive and inferential statistical methods were used to draw conclusions. The outcomes of the research are a comprehensive list of stakeholders relevant to waste management systems and a set of factors that reveal the most important causes of system failure. This information is very useful when planning, changing, or implementing waste management systems in cities.

• 2.
Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM).

Modern software systems are increasingly connected, pervasive, and dynamic; as such, they are subject to more runtime variations than legacy systems. Runtime variations affect system properties, such as performance and availability. These variations are difficult to anticipate and thus to mitigate in the system design.

Self-adaptive software systems were proposed as a solution to monitor and adapt systems in response to runtime variations. Research has established a vast body of knowledge on engineering self-adaptive systems. However, there is a lack of systematic process support that leverages this engineering knowledge and enables systematic reuse in self-adaptive systems development.

This thesis proposes the Autonomic Software Product Lines (ASPL), which is a strategy for developing self-adaptive software systems with systematic reuse. The strategy exploits the separation of a managed and a managing subsystem and describes three steps that transform and integrate a domain-independent managing system platform into a domain-specific software product line for self-adaptive software systems.

Applying the ASPL strategy is, however, not straightforward, as it involves challenges related to variability and uncertainty. We analyzed variability and uncertainty to understand their causes and effects. Based on the results, we developed the Autonomic Software Product Lines engineering (ASPLe) methodology, which provides process support for the ASPL strategy. The ASPLe comprises three processes: 1) ASPL Domain Engineering, 2) Specialization, and 3) Integration. Each process maps to one of the steps in the ASPL strategy and defines roles, work products, activities, and workflows for requirements, design, implementation, and testing. The focus of this thesis is on requirements and design.

We validate the ASPLe through demonstration and evaluation. We developed three demonstrator product lines using the ASPLe. We also conducted an extensive case study to evaluate key design activities in the ASPLe with experiments, questionnaires, and interviews. The results show a statistically significant increase in quality and reuse levels for self-adaptive software systems designed using the ASPLe compared to current engineering practices.
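The separation of a managed and a managing subsystem, which the ASPL strategy exploits, can be sketched minimally as follows. This is a hypothetical illustration with names of our own choosing; a MAPE-style monitor/analyze/plan/execute loop is one common realization, not necessarily the thesis's design:

```python
# Minimal sketch of the managed/managing-subsystem separation.
# Hypothetical names; the scaling rule is illustrative only.

class ManagedSystem:
    """Application logic, exposed through a probe and an effector."""
    def __init__(self):
        self.workers = 1
        self.load = 0.0

    def probe(self):
        return {"load": self.load, "workers": self.workers}

    def effect(self, workers):
        self.workers = workers


class ManagingSystem:
    """Adaptation logic: monitor, analyze, plan, execute."""
    def __init__(self, managed, high=0.8, low=0.2):
        self.managed, self.high, self.low = managed, high, low

    def step(self):
        m = self.managed.probe()                    # monitor
        per_worker = m["load"] / m["workers"]       # analyze
        if per_worker > self.high:                  # plan
            self.managed.effect(m["workers"] + 1)   # execute: scale up
        elif per_worker < self.low and m["workers"] > 1:
            self.managed.effect(m["workers"] - 1)   # execute: scale down


app = ManagedSystem()
app.load = 1.8
ManagingSystem(app).step()
print(app.workers)  # 2: the managing subsystem scaled up under high load
```

Keeping the adaptation logic entirely in `ManagingSystem` is what allows a domain-independent managing platform to be reused across product lines, as the ASPL strategy proposes.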

Doctoral Thesis (Comprehensive Summary)
• 3.
Umeå universitet.
Properties of "Good" Java Examples. 2010. Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis

Example programs are well known as an important tool for learning computer programming. Recognizing the significance of example programs, this study was conducted with the goal of measuring and evaluating the quality of examples used in academia. We make a distinction between good and bad examples, as badly designed examples may prove harmful for novice learners. In general, students differ from expert programmers in their approach to reading and comprehending a program. How students understand example programs is explored in the light of classical theories and models of program comprehension. Key factors that impact program quality and comprehension are identified. To evaluate as well as improve the quality of examples, a set of quality attributes is proposed. The relationship between program complexity and quality is examined. We rate readability as a prime quality attribute and hypothesize that example programs with low readability are difficult to understand. Software Reading Ease Score (SRES), a program readability metric proposed by Börstler et al., is implemented to provide a readability measurement tool. SRES is based on lexical tokens and is easy to compute using static code analysis techniques. To validate the SRES metric, the results are statistically analyzed in correlation with earlier, well-acknowledged software metrics.
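A lexical-token-based readability score of the kind the abstract describes can be sketched as below. This is not the actual SRES formula by Börstler et al.; the weights are borrowed from the Flesch reading ease analogy and are hypothetical, chosen only to demonstrate how such a metric is computed from static code analysis:

```python
# Illustrative token-based readability score in the spirit of SRES.
# The real SRES formula differs; these weights are hypothetical.
import re

def readability(code: str) -> float:
    lines = [ln for ln in code.splitlines() if ln.strip()]
    # Lexical tokens: identifiers/keywords, or single punctuation chars.
    tokens = re.findall(r"[A-Za-z_]\w*|\S", code)
    words = [t for t in tokens if t[0].isalpha() or t[0] == "_"]
    if not lines or not words:
        return 0.0
    tokens_per_line = len(tokens) / len(lines)           # "sentence" length
    chars_per_word = sum(map(len, words)) / len(words)   # "word" length
    # Flesch-like: longer lines and longer identifiers reduce ease.
    return 206.835 - 1.015 * tokens_per_line - 84.6 * (chars_per_word / 10)

simple = "x = 1\ny = x + 2\n"
dense = (
    "def process(configuration_manager):\n"
    "    return configuration_manager.resolve(configuration_manager.defaults)\n"
)
print(readability(simple) > readability(dense))  # the simpler snippet scores higher
```

The point of such a metric for example programs is exactly this ordering property: densely packed lines and long identifiers push the score down, flagging examples that may overload novice readers.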

• 4.
Linnaeus University, Faculty of Science and Engineering, School of Computer Science, Physics and Mathematics.
Towards autonomic software product lines. 2011. In: SPLC '11: Proceedings of the 15th International Software Product Line Conference, Volume 2, ACM Press, 2011, p. 44:1-44:8. Conference paper (Refereed)

We envision an Autonomic Software Product Line (ASPL). The ASPL is a dynamic software product line that supports self-adaptable products. We plan to use a reflective architecture to model and develop the ASPL. To evaluate the approach, we have implemented three autonomic product lines, which show promising results. The ASPL approach is at an initial stage and requires additional work. We plan to exploit online learning to realize more dynamic software product lines that cope with the problem of product line evolution, and we propose online knowledge sharing among products in a product line to achieve continuous quality improvement of product line products.

• 5.
Linnaeus University, Faculty of Technology, Department of Computer Science.
Linnaeus University, Faculty of Technology, Department of Computer Science.
Architectural reasoning for dynamic software product lines. 2013. In: Proceedings of the 17th International Software Product Line Conference co-located workshops, ACM Press, 2013, p. 117-124. Conference paper (Refereed)

Software quality is critical in today's software systems. A challenge is the trade-off situation architects face in the design process. Designers often have two or more alternatives, which must be compared and put into context before a decision is made. The challenge becomes even more complex for dynamic software product lines, where domain designers must take runtime variations into consideration as well. To address the problem, we propose extensions to an architectural reasoning framework with constructs/artifacts to define and model a domain's scope and dynamic variability. The extended reasoning framework encapsulates knowledge to understand and reason about domain quality behavior and self-adaptation as a primary variability mechanism. The framework is demonstrated for a self-configuration property, self-upgradability, on an educational product line.

• 6.
Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM), Department of Computer Science.
Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM), Department of Computer Science.
Architectural Reasoning Support for Product-Lines of Self-adaptive Software Systems: A Case Study. 2015. In: Software Architecture: 9th European Conference, ECSA 2015, Dubrovnik/Cavtat, Croatia, September 7-11, 2015 / [ed] Danny Weyns, Raffaela Mirandola, Ivica Crnkovic, Springer, 2015, p. 20-36. Conference paper (Refereed)

Software architecture serves as a foundation for the design and development of software systems. Designing an architecture requires extensive analysis and reasoning. The study presented herein focuses on architectural analysis and reasoning in support of engineering self-adaptive software systems with systematic reuse. Designing self-adaptive software systems with systematic reuse introduces variability along three dimensions, adding more complexity to the architectural analysis and reasoning process. To this end, the study presents an extended Architectural Reasoning Framework with dedicated reasoning support for self-adaptive systems and reuse. To evaluate the proposed framework, we conducted an initial feasibility case study, which concludes that the proposed framework helps domain architects increase reusability, reduce fault density, and reduce the effect of differences in skills and experience among architects; these were our research goals and are decisive factors for a system's overall quality.

• 7.
Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM), Department of Computer Science.
Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM), Department of Computer Science.
ASPLe: a methodology to develop self-adaptive software systems with reuse. 2017. Report (Other academic)

Advances in computing technologies are pushing software systems and their operating environments to become more dynamic and complex. The growing complexity of software systems, coupled with uncertainties induced by runtime variations, leads to challenges in software analysis and design. Self-Adaptive Software Systems (SASS) have been proposed as a solution that addresses design-time complexity and uncertainty by adapting software systems at runtime. A vast body of knowledge on engineering self-adaptive software systems has been established. However, to the best of our knowledge, little or no work has considered systematic reuse of this knowledge. To that end, this study contributes the Autonomic Software Product Lines engineering (ASPLe) methodology. The ASPLe is based on a multi-product-lines strategy that leverages systematic reuse through the separation of application and adaptation logic. It provides developers with repeatable process support to design and develop self-adaptive software systems with reuse across several application domains. The methodology is composed of three core processes, each organized around requirements, design, implementation, and testing activities. To exemplify and demonstrate the use of the ASPLe methodology, three application domains are used as running examples throughout the report.

• 8.
Linnaeus University, Faculty of Technology, Department of Computer Science.
Linnaeus University, Faculty of Technology, Department of Computer Science.
Harnessing Variability in Product-lines of Self-adaptive Software Systems. 2015. In: Proceedings of the 19th International Conference on Software Product Line: SPLC '15, ACM Press, 2015, p. 191-200. Conference paper (Refereed)

This work studies systematic reuse in the context of self-adaptive software systems. In our work, we realized that managing variability for such platforms differs from doing so for traditional platforms, primarily due to runtime variability and system uncertainties. Motivated by recent trends indicating that self-adaptation will be used more often in future system generations, and by the fact that neither the state of practice nor research in software reuse provides sufficient support, we investigated the problems and possible resolutions in this context. We analyzed variability for these systems through a systematic-reuse prism and identified a research gap in variability management. The analysis divides variability handling into four activities: (1) identify variability, (2) constrain variability, (3) implement variability, and (4) manage variability. Based on the findings, we envision a reuse framework for this specific domain and present an example framework that addresses some of the identified challenges. We argue that it provides basic support for engineering self-adaptive software systems with systematic reuse, and we discuss some important avenues of research for achieving the vision.
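The four variability-handling activities named in the abstract can be sketched as a small variation-point abstraction. This is a hypothetical illustration (class, method, and variant names are ours, not the paper's); each activity maps to one step in the object's lifecycle:

```python
# Hypothetical sketch of the four variability-handling activities:
# (1) identify, (2) constrain, (3) implement, (4) manage variability.

class VariationPoint:
    def __init__(self, name, allowed):
        self.name = name                 # (1) identify the variation point
        self.allowed = set(allowed)      # (2) constrain the legal variants
        self.variants = {}
        self.bound = None

    def implement(self, variant, impl):  # (3) implement a variant
        if variant not in self.allowed:
            raise ValueError(f"{variant!r} violates the constraints")
        self.variants[variant] = impl

    def bind(self, variant):             # (4) manage: bind at runtime
        self.bound = self.variants[variant]

    def __call__(self, *args):
        return self.bound(*args)


cache = VariationPoint("cache-policy", allowed={"lru", "fifo"})
cache.implement("lru", lambda key: f"lru({key})")
cache.implement("fifo", lambda key: f"fifo({key})")
cache.bind("lru")
print(cache("k1"))  # lru(k1)
```

For a self-adaptive system, step (4) is the interesting one: `bind` may be re-invoked at runtime, which is precisely the runtime variability the paper argues traditional reuse practice does not support well.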

• 9.
Linnaeus University, Faculty of Technology, Department of Computer Science.
Linnaeus University, Faculty of Technology, Department of Computer Science. Linnaeus University, Faculty of Technology, Department of Computer Science. Linnaeus University, Faculty of Technology, Department of Computer Science.
Rigorous architectural reasoning for self-adaptive software systems. 2016. In: Proceedings: First Workshop on Qualitative Reasoning about Software Architectures, QRASA 2016 / [ed] Lisa O'Conner, IEEE, 2016, p. 11-18. Conference paper (Refereed)

Designing a software architecture requires architectural reasoning, i.e., activities that translate requirements into an architecture solution. Architectural reasoning is particularly challenging in the design of product lines of self-adaptive systems, which involve variability both at development time and at runtime. In previous work we developed an extended Architectural Reasoning Framework (eARF) to address this challenge. However, evaluation of the eARF showed that the framework lacked support for rigorous reasoning, i.e., for ensuring that the design complies with the requirements. In this paper, we introduce an analytical framework that enhances the eARF with such support. The framework defines a set of artifacts and a series of activities. Artifacts include templates to specify domain quality attribute scenarios, concrete models, and properties. The activities support architects in transforming requirement scenarios into architecture models that comply with required properties. Our focus in this paper is on architectural reasoning support for a single product instance. We illustrate the benefits of the approach by applying it to an example client-server system, and outline challenges for future work.

• 10.
Linnaeus University, Faculty of Science and Engineering, School of Computer Science, Physics and Mathematics.
Linnaeus University, Faculty of Science and Engineering, School of Computer Science, Physics and Mathematics. Linnaeus University, Faculty of Science and Engineering, School of Computer Science, Physics and Mathematics.
Autonomic Software Product Lines (ASPL). 2010. In: ECSA '10: Proceedings of the Fourth European Conference on Software Architecture: Companion Volume / [ed] Carlos E. Cuesta, ACM Press, 2010, p. 324-331. Conference paper (Refereed)

We describe ongoing work on a variability mechanism for Autonomic Software Product Lines (ASPL). Autonomic software product lines have self-management characteristics that make product line instances more resilient to context changes and to some aspects of product line evolution. Instances sense the context and select and bind the best component variants to variation points at runtime. The variability mechanism we describe is composed of profile-guided dispatch based on offline and online training processes. Together they form a simple yet powerful variability mechanism that continuously learns which variants to bind given the current context and system goals.

• 11.
Linnaeus University, Faculty of Science and Engineering, School of Computer Science, Physics and Mathematics.
Linnaeus University, Faculty of Science and Engineering, School of Computer Science, Physics and Mathematics. Linnaeus University, Faculty of Science and Engineering, School of Computer Science, Physics and Mathematics.
Towards Autonomic Software Product Lines (ASPL): A Technical Report. 2011. Report (Other academic)

This report describes work in progress to develop Autonomic Software Product Lines (ASPL). The ASPL is a dynamic software product line approach with a novel variability handling mechanism that enables traditional software product lines to adapt themselves at runtime in response to changes in their context, requirements, and business goals. The ASPL variability mechanism is composed of three key activities: 1) context profiling, 2) context-aware composition, and 3) online learning. Context profiling is an offline activity that prepares a knowledge base for context-aware composition. Context-aware composition uses the knowledge base to derive a new product or adapt an existing product based on a product line's context attributes and goals. Online learning optimizes the knowledge base to remove errors and suboptimal information and to incorporate new knowledge. The three activities together form a simple yet powerful variability handling mechanism that learns and adapts a system at runtime in response to changes in system context and goals. We evaluated the ASPL variability mechanism on three small-scale software product lines, with promising results. The ASPL approach is, however, still at an initial stage and requires improved development support and more rigorous evaluation.

• 12.
Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM).
Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM). Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM). KU Leuven, Belgium.
ASPLe: a methodology to develop self-adaptive software systems with systematic reuse. 2020. In: Journal of Systems and Software, ISSN 0164-1212, E-ISSN 1873-1228, Vol. 167, article id 110626. Article in journal (Refereed)

More than two decades of research have demonstrated an increasing need for software systems to be self-adaptive. Self-adaptation is required to deal with runtime dynamics that are difficult to predict before deployment. A vast body of knowledge for developing Self-Adaptive Software Systems (SASS) has been established. We, however, discovered a lack of process support for developing self-adaptive systems with reuse. To that end, we propose a domain-engineering-based methodology, Autonomic Software Product Lines engineering (ASPLe), which provides step-by-step guidelines for developing families of SASS with systematic reuse. The evaluation results from a case study show positive effects on quality and reuse for self-adaptive systems designed using the ASPLe compared to state-of-the-art engineering practices.

• 13.
Linnaeus University, Faculty of Science and Engineering, School of Computer Science, Physics and Mathematics.
Linnaeus University, Faculty of Science and Engineering, School of Computer Science, Physics and Mathematics. Linnaeus University, Faculty of Science and Engineering, School of Computer Science, Physics and Mathematics.
Knowledge evolution in autonomic software product lines. 2011. In: SPLC '11: Proceedings of the 15th International Software Product Line Conference, Volume 2, New York, NY, USA: ACM Press, 2011, p. 36:1-36:8. Conference paper (Refereed)

We describe ongoing work on knowledge evolution management for autonomic software product lines. We explore how an autonomic product line may benefit from new knowledge originating from different source activities and artifacts at runtime. The motivation for sharing runtime knowledge is that products may self-optimize at runtime and thus improve quality faster than under traditional software product line evolution. We propose two mechanisms that support knowledge evolution in product lines, online learning and knowledge sharing, and describe two basic scenarios for runtime knowledge evolution that involve these mechanisms. We evaluated online learning and knowledge sharing in a small product line setting, with promising results.

• 14.
Linnaeus University, Faculty of Science and Engineering, School of Computer Science, Physics and Mathematics.
Linnaeus University, Faculty of Science and Engineering, School of Computer Science, Physics and Mathematics. Linnaeus University, Faculty of Science and Engineering, School of Computer Science, Physics and Mathematics.
Modeling Variability in Product Lines Using Domain Quality Attribute Scenarios. 2012. In: Proceedings of the WICSA/ECSA 2012 Companion Volume, ACM Press, 2012, p. 135-142. Conference paper (Refereed)

The concept of variability is fundamental in software product lines, and a successful implementation of a product line largely depends on how well domain requirements and their variability are specified, managed, and realized. While developing an educational software product line, we identified a lack of support for specifying variability in quality concerns. To address this problem, we propose an approach to model variability in quality concerns that extends quality attribute scenarios. In particular, we propose domain quality attribute scenarios, which extend standard quality attribute scenarios with additional information to support the specification of variability and the derivation of product-specific scenarios. We demonstrate the approach with scenarios for robustness and upgradability requirements in the educational software product line.
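A domain quality attribute scenario of the kind the abstract describes can be sketched as a data structure: the standard scenario parts, with the response measure opened up as a range from which product-specific scenarios are derived. Field names and the upgradability example values below are hypothetical, not taken from the paper:

```python
# Hypothetical sketch of a domain quality attribute scenario: standard
# scenario parts plus a variability range on the response measure,
# from which product-specific scenarios are derived.
from dataclasses import dataclass

@dataclass
class DomainQAScenario:
    source: str
    stimulus: str
    artifact: str
    environment: str
    response: str
    measure_range: tuple   # variability: (best, worst) acceptable measure

    def derive(self, measure):
        """Bind the variability to a concrete product-specific measure."""
        lo, hi = self.measure_range
        if not lo <= measure <= hi:
            raise ValueError("measure outside the domain's allowed range")
        return {**self.__dict__, "measure": measure}


upgrade = DomainQAScenario(
    source="operator", stimulus="new component version",
    artifact="running product", environment="normal operation",
    response="upgrade without restart",
    measure_range=(0.0, 30.0))  # seconds of allowed downtime, illustrative

product_scenario = upgrade.derive(measure=10.0)
print(product_scenario["measure"])  # 10.0
```

The range check in `derive` is what makes the domain scenario a contract: every product-specific scenario is guaranteed to stay within the variability the domain allows.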

• 15. Abbasi, R.
Linnaeus University, Faculty of Science and Engineering, School of Computer Science, Physics and Mathematics.
Limits on a muon flux from Kaluza-Klein dark matter annihilations in the Sun from the IceCube 22-string detector. 2010. In: Physical Review D, ISSN 1550-7998, E-ISSN 1550-2368, Vol. 81, no 5, article id 057101. Article in journal (Refereed)

A search for muon neutrinos from Kaluza-Klein dark matter annihilations in the Sun has been performed with the 22-string configuration of the IceCube neutrino detector using data collected in 104.3 days of live time in 2007. No excess over the expected atmospheric background has been observed. Upper limits have been obtained on the annihilation rate of captured lightest Kaluza-Klein particle (LKP) WIMPs in the Sun and converted to limits on the LKP-proton cross sections for LKP masses in the range 250-3000 GeV. These results are the most stringent limits to date on LKP annihilation in the Sun.

• 16. Abbasi, R.
Linnaeus University, Faculty of Science and Engineering, School of Computer Science, Physics and Mathematics.
Search for muon neutrinos from gamma-ray bursts with the IceCube neutrino telescope. 2010. In: Astrophysical Journal, ISSN 0004-637X, E-ISSN 1538-4357, Vol. 710, no 1, p. 346-359. Article in journal (Refereed)

We present the results of searches for high-energy muon neutrinos from 41 gamma-ray bursts (GRBs) in the northern sky with the IceCube detector in its 22-string configuration active in 2007/2008. The searches cover both the prompt and a possible precursor emission as well as a model-independent, wide time window of −1 hr to +3 hr around each GRB. In contrast to previous searches with a large GRB population, we do not utilize a standard Waxman-Bahcall GRB flux for the prompt emission but calculate individual neutrino spectra for all 41 GRBs from the burst parameters measured by satellites. For all of the three time windows, the best estimate for the number of signal events is zero. Therefore, we place 90% CL upper limits on the fluence from the prompt phase of 3.7 × 10⁻³ erg cm⁻² (72 TeV-6.5 PeV) and on the fluence from the precursor phase of 2.3 × 10⁻³ erg cm⁻² (2.2-55 TeV), where the quoted energy ranges contain 90% of the expected signal events in the detector. The 90% CL upper limit for the wide time window is 2.7 × 10⁻³ erg cm⁻² (3 TeV-2.8 PeV) assuming an E⁻² flux.

• 17. Abbasi, R.
University of Kalmar, School of Pure and Applied Natural Sciences.
Search for high-energy muon neutrinos from the "naked-eye" GRB 080319B with the IceCube neutrino telescope. 2009. In: Astrophysical Journal, ISSN 0004-637X, E-ISSN 1538-4357, Vol. 701, no 2, p. 1721-1731. Article in journal (Refereed)

We report on a search with the IceCube detector for high-energy muon neutrinos from GRB 080319B, one of the brightest gamma-ray bursts (GRBs) ever observed. The fireball model predicts that a mean of 0.1 events should be detected by IceCube for a bulk Lorentz boost of the jet of 300. In both the direct on-time window of 66 s and an extended window of about 300 s around the GRB, no excess was found above background. The 90% CL upper limit on the number of track-like events from the GRB is 2.7, corresponding to a muon neutrino fluence limit of 9.5 × 10⁻³ erg cm⁻² in the energy range between 120 TeV and 2.2 PeV, which contains 90% of the expected events.

• 18. Abbasi, R.
University of Kalmar, School of Pure and Applied Natural Sciences.
Extending the Search for Neutrino Point Sources with IceCube above the Horizon. 2009. In: Physical Review Letters, ISSN 0031-9007, E-ISSN 1079-7114, Vol. 103, no 22, article id 221102. Article in journal (Refereed)

Point source searches with the IceCube neutrino telescope have been restricted to one hemisphere, due to the exclusive selection of upward-going events as a way of rejecting the atmospheric muon background. We show that the region above the horizon can be included by suppressing the background through energy-sensitive cuts. This improves the sensitivity above PeV energies, previously not accessible for declinations of more than a few degrees below the horizon due to the absorption of neutrinos in the Earth. We present results based on data collected with 22 strings of IceCube, extending its field of view and energy reach for point source searches. No significant excess above the atmospheric background is observed in a sky scan and in tests of source candidates. Upper limits are reported, which for the first time cover point sources in the southern sky up to EeV energies.

• 19. Abbasi, R.
Linnaeus University, Faculty of Science and Engineering, School of Computer Science, Physics and Mathematics.
Calibration and characterization of the IceCube photomultiplier tube. 2010. In: Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, ISSN 0168-9002, E-ISSN 1872-9576, Vol. 618, no 1-3, p. 139-152. Article in journal (Refereed)

Over 5000 PMTs are being deployed at the South Pole to compose the IceCube neutrino observatory. Many are placed deep in the ice to detect Cherenkov light emitted by the products of high-energy neutrino interactions, and others are frozen into tanks on the surface to detect particles from atmospheric cosmic ray showers. IceCube is using the 10-in. diameter R7081-02 made by Hamamatsu Photonics. This paper describes the laboratory characterization and calibration of these PMTs before deployment. PMTs were illuminated with pulses ranging from single photons to saturation level. Parameterizations are given for the single photoelectron charge spectrum and the saturation behavior. Time resolution, late pulses and afterpulses are characterized. Because the PMTs are relatively large, the cathode sensitivity uniformity was measured. The absolute photon detection efficiency was calibrated using Rayleigh-scattered photons from a nitrogen laser. Measured characteristics are discussed in the context of their relevance to IceCube event reconstruction and simulation efforts.

• 20. Abbasi, R.
Linnaeus University, Faculty of Science and Engineering, School of Computer Science, Physics and Mathematics.
Measurement of sound speed vs. depth in South Pole ice for neutrino astronomy. 2010. In: Astroparticle Physics, ISSN 0927-6505, E-ISSN 1873-2852, Vol. 33, no 5-6, p. 277-286. Article in journal (Refereed)

We have measured the speed of both pressure waves and shear waves as a function of depth between 80 and 500 m in South Pole ice with better than 1% precision. The measurements were made using the South Pole Acoustic Test Setup (SPATS), an array of transmitters and sensors deployed in the ice at the South Pole in order to measure the acoustic properties relevant to acoustic detection of astrophysical neutrinos. The transmitters and sensors use piezoceramics operating at approximately 5-25 kHz. Between 200 m and 500 m depth, the measured profile is consistent with zero variation of the sound speed with depth, resulting in zero refraction, for both pressure and shear waves. We also performed a complementary study featuring an explosive signal propagating vertically from 50 to 2250 m depth, from which we determined a value for the pressure wave speed consistent with that determined for shallower depths, higher frequencies, and horizontal propagation with the SPATS sensors. The sound speed profile presented here can be used to achieve good acoustic source position and emission time reconstruction in general, and neutrino direction and energy reconstruction in particular. The reconstructed quantities could also help separate neutrino signals from background.

• 21. Abbasi, R.
University of Kalmar, School of Pure and Applied Natural Sciences.
First Neutrino Point-Source Results from the 22-String IceCube Detector. 2009. In: The Astrophysical Journal Letters, ISSN 2041-8205, Vol. 701, no 1, p. L47-L51. Article in journal (Refereed)

We present new results of searches for neutrino point sources in the northern sky, using data recorded in 2007-2008 with 22 strings of the IceCube detector (approximately one-fourth of the planned total) and 275.7 days of live time. The final sample of 5114 neutrino candidate events agrees well with the expected background of atmospheric muon neutrinos and a small component of atmospheric muons. No evidence of a point source is found, with the most significant excess of events in the sky at 2.2σ after accounting for all trials. The average upper limit over the northern sky for point sources of muon neutrinos with an E⁻² spectrum is E²Φ(ν_μ) < 1.4 × 10⁻¹¹ TeV cm⁻² s⁻¹, in the energy range from 3 TeV to 3 PeV, improving the previous best average upper limit, set by the AMANDA-II detector, by a factor of 2.

• 22. Abbasi, R.
University of Kalmar, School of Pure and Applied Natural Sciences.
Limits on a Muon Flux from Neutralino Annihilations in the Sun with the IceCube 22-String Detector, 2009. In: Physical Review Letters, ISSN 0031-9007, E-ISSN 1079-7114, Vol. 102, no 20, Article ID 201302. Article in journal (Refereed)

A search for muon neutrinos from neutralino annihilations in the Sun has been performed with the IceCube 22-string neutrino detector using data collected in 104.3 days of live time in 2007. No excess over the expected atmospheric background has been observed. Upper limits have been obtained on the annihilation rate of captured neutralinos in the Sun and converted to limits on the weakly interacting massive particle (WIMP)-proton cross sections for WIMP masses in the range 250-5000 GeV. These results are the most stringent limits to date on neutralino annihilation in the Sun.

• 23. Abbasi, R.
University of Kalmar, School of Pure and Applied Natural Sciences.
Determination of the atmospheric neutrino flux and searches for new physics with AMANDA-II, 2009. In: Physical Review D, ISSN 1550-7998, E-ISSN 1550-2368, Vol. 79, no 10, Article ID 102005. Article in journal (Refereed)

The AMANDA-II detector, operating since 2000 in the deep ice at the geographic South Pole, has accumulated a large sample of atmospheric muon neutrinos in the 100 GeV to 10 TeV energy range. The zenith angle and energy distribution of these events can be used to search for various phenomenological signatures of quantum gravity in the neutrino sector, such as violation of Lorentz invariance or quantum decoherence. Analyzing a set of 5511 candidate neutrino events collected during 1387 days of livetime from 2000 to 2006, we find no evidence for such effects and set upper limits on violation of Lorentz invariance and quantum decoherence parameters using a maximum likelihood method. Given the absence of evidence for new flavor-changing physics, we use the same methodology to determine the conventional atmospheric muon neutrino flux above 100 GeV.

• 24. Abbasi, R.
University of Kalmar, School of Pure and Applied Natural Sciences.
Search for point sources of high energy neutrinos with final data from AMANDA-II, 2009. In: Physical Review D, ISSN 1550-7998, E-ISSN 1550-2368, Vol. 79, no 6, Article ID 062001. Article in journal (Refereed)

We present a search for point sources of high energy neutrinos using 3.8 yr of data recorded by AMANDA-II during 2000-2006. After reconstructing muon tracks and applying selection criteria designed to optimally retain neutrino-induced events originating in the northern sky, we arrive at a sample of 6595 candidate events, predominantly from atmospheric neutrinos with primary energy 100 GeV to 8 TeV. Our search of this sample reveals no indications of a neutrino point source. We place the most stringent limits to date on E^-2 neutrino fluxes from points in the northern sky, with an average upper limit of E^2 Φ(ν_μ + ν_τ) ≤ 5.2 × 10^-11 TeV cm^-2 s^-1 on the sum of ν_μ and ν_τ fluxes, assumed equal, over the energy range from 1.9 TeV to 2.5 PeV.

• 25. Abbasi, R.
University of Kalmar, School of Pure and Applied Natural Sciences.
Solar Energetic Particle Spectrum on 2006 December 13 Determined by IceTop, 2008. In: The Astrophysical Journal Letters, Vol. 689, no 1, p. L65-L68. Article in journal (Refereed)

On 2006 December 13 the IceTop air shower array at the South Pole detected a major solar particle event. By numerically simulating the response of the IceTop tanks, which are thick Cherenkov detectors with multiple thresholds deployed at high altitude with no geomagnetic cutoff, we determined the particle energy spectrum in the energy range 0.6-7.6 GeV. This is the first such spectral measurement using a single instrument with a well-defined viewing direction. We compare the IceTop spectrum and its time evolution with previously published results and outline plans for improved resolution of future solar particle spectra.

• 26. Abbasi, R.
University of Kalmar, School of Pure and Applied Natural Sciences.
The IceCube data acquisition system: Signal capture, digitization, and timestamping, 2009. In: Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, ISSN 0168-9002, E-ISSN 1872-9576, Vol. 601, no 3, p. 294-316. Article in journal (Refereed)

IceCube is a km-scale neutrino observatory under construction at the South Pole with sensors both in the deep ice (InIce) and on the surface (IceTop). The sensors, called Digital Optical Modules (DOMs), detect, digitize and timestamp the signals from optical Cherenkov-radiation photons. The DOM Main Board (MB) data acquisition subsystem is connected to the central DAQ in the IceCube Laboratory (ICL) by a single twisted copper wire-pair and transmits packetized data on demand. Time calibration is maintained throughout the array by regular transmission to the DOMs of precisely timed analog signals, synchronized to a central GPS-disciplined clock. The design goals and consequent features, functional capabilities, and initial performance of the DOM MB, and the operation of a combined array of DOMs as a system, are described here. Experience with the first InIce strings and the IceTop stations indicates that the system design and performance goals have been achieved. (c) 2009 Elsevier B.V. All rights reserved.

• 27.
North-West University, South Africa.
A search for new supernova remnant shells in the Galactic plane with HESS, 2018. In: Astronomy and Astrophysics, ISSN 0004-6361, E-ISSN 1432-0746, Vol. 612, article id A8. Article in journal (Refereed)

A search for new supernova remnants (SNRs) has been conducted using TeV gamma-ray data from the H.E.S.S. Galactic plane survey. As an identification criterion, shell morphologies that are characteristic of known resolved TeV SNRs have been used. Three new SNR candidates were identified in the H.E.S.S. data set with this method. Extensive multiwavelength searches for counterparts were conducted. A radio SNR candidate has been identified as a counterpart to HESS J1534-571. The TeV source is therefore classified as an SNR. For the other two sources, HESS J1614-518 and HESS J1912+101, no identifying counterparts have been found, thus they remain SNR candidates for the time being. TeV-emitting SNRs are key objects in the context of identifying the accelerators of Galactic cosmic rays. The TeV emission of the relativistic particles in the new sources is examined in view of possible leptonic and hadronic emission scenarios, taking the current multiwavelength knowledge into account.

• 28.
North-West University, South Africa.
First limits on the very-high energy gamma-ray afterglow emission of a fast radio burst: HESS observations of FRB 150418, 2017. In: Astronomy and Astrophysics, ISSN 0004-6361, E-ISSN 1432-0746, Vol. 597, article id A115. Article in journal (Refereed)

Aims. Following the detection of the fast radio burst FRB 150418 by the SUPERB project at the Parkes radio telescope, we aim to search for very-high energy gamma-ray afterglow emission. Methods. Follow-up observations in the very-high energy gamma-ray domain were obtained with the H.E.S.S. imaging atmospheric Cherenkov telescope system within 14.5 h of the radio burst. Results. The obtained 1.4 h of gamma-ray observations are presented and discussed. At the 99% C.L. we obtained an integral upper limit on the gamma-ray flux of Φ_γ(E > 350 GeV) < 1.33 × 10^-8 m^-2 s^-1. Differential flux upper limits as a function of the photon energy were derived and used to constrain the intrinsic high-energy afterglow emission of FRB 150418. Conclusions. No hints of high-energy afterglow emission of FRB 150418 were found. Taking absorption on the extragalactic background light into account and assuming a distance of z = 0.492, based on radio and optical counterpart studies and consistent with the FRB dispersion, we constrain the gamma-ray luminosity at 1 TeV to L < 5.1 × 10^47 erg/s at 99% C.L.

• 29.
North West Univ, South Africa.
HESS Limits on Linelike Dark Matter Signatures in the 100 GeV to 2 TeV Energy Range Close to the Galactic Center, 2016. In: Physical Review Letters, ISSN 0031-9007, E-ISSN 1079-7114, Vol. 117, no 15, article id 151302. Article in journal (Refereed)

A search for dark matter linelike signals is performed in the vicinity of the Galactic Center by the H.E.S.S. experiment on observational data taken in 2014. An unbinned likelihood analysis is developed to improve the sensitivity to linelike signals. The upgraded analysis along with newer data extend the energy coverage of the previous measurement down to 100 GeV. The 18 h of data collected with the H.E.S.S. array allow one to rule out at 95% C.L. the presence of a 130 GeV line (at l = -1.5 degrees, b = 0 degrees and for a dark matter profile centered at this location) previously reported in Fermi-LAT data. This new analysis overlaps significantly in energy with previous Fermi-LAT and H.E.S.S. results. No significant excess associated with dark matter annihilations was found in the energy range of 100 GeV to 2 TeV, and upper limits on the gamma-ray flux and the velocity weighted annihilation cross section are derived adopting an Einasto dark matter halo profile. Expected limits for present and future large statistics H.E.S.S. observations are also given.

• 30.
North-West University, South Africa.
Gamma-ray blazar spectra with HESS II mono analysis: The case of PKS 2155-304 and PG 1553+113, 2017. In: Astronomy and Astrophysics, ISSN 0004-6361, E-ISSN 1432-0746, Vol. 600, article id A89. Article in journal (Refereed)

Context. The addition of a 28 m Cherenkov telescope (CT5) to the H.E.S.S. array extended the experiment's sensitivity to lower energies. The lowest energy threshold is obtained using monoscopic analysis of data taken with CT5, providing access to gamma-ray energies below 100 GeV for small zenith angle observations. Such an extension of the instrument's energy range is particularly beneficial for studies of active galactic nuclei with soft spectra, as expected for those at a redshift ≥ 0.5. The high-frequency peaked BL Lac objects PKS 2155-304 (z = 0.116) and PG 1553+113 (0.43 < z < 0.58) are among the brightest objects in the gamma-ray sky, both showing clear signatures of gamma-ray absorption at E > 100 GeV interpreted as being due to interactions with the extragalactic background light (EBL). Aims. The aims of this work are twofold: to demonstrate the monoscopic analysis of CT5 data with a low energy threshold, and to obtain accurate measurements of the spectral energy distributions (SED) of PKS 2155-304 and PG 1553+113 near their SED peaks at energies of approximately 100 GeV. Methods. Multiple observational campaigns of PKS 2155-304 and PG 1553+113 were conducted during 2013 and 2014 using the full H.E.S.S. II instrument (CT1-5). A monoscopic analysis of the data taken with the new CT5 telescope was developed, along with an investigation into the systematic uncertainties on the spectral parameters derived from this analysis. Results. Using the data from CT5, the energy spectra of PKS 2155-304 and PG 1553+113 were reconstructed down to conservative threshold energies of 80 GeV for PKS 2155-304, which transits near zenith, and 110 GeV for the more northern PG 1553+113. The measured spectra, well fitted in both cases by a log-parabola spectral model (with a 5.0σ statistical preference for non-zero curvature for PKS 2155-304 and 4.5σ for PG 1553+113), were found consistent with spectra derived from contemporaneous Fermi-LAT data, indicating a sharp break in the observed spectra of both sources at E ≈ 100 GeV. When corrected for EBL absorption, the intrinsic H.E.S.S. II mono and Fermi-LAT spectrum of PKS 2155-304 was found to show significant curvature. For PG 1553+113, however, no significant detection of curvature in the intrinsic spectrum could be found within statistical and systematic uncertainties.

• 31.
North-West University, South Africa.