Poster abstracts PhD Day 2021

Isabelle Viole, Department of Technology Systems

How can we power a telescope in the Atacama desert sustainably?
Astronomy is a research discipline with a rather high carbon footprint. This is partially due to diesel or gas generators fueling most telescopes today.
The design project AtLAST, Towards an Atacama Large Aperture Submillimeter Telescope, aims to plan the first stationary telescope that implements a sustainable power solution for the observatory already in its design phase. How can we fuel a 50-meter diameter single-dish telescope and its vast cryogenic cooling demand with solar power? This poster presents a potential energy system including a hybrid energy storage system for the telescope and discusses the modeling needed to turn this idea into reality.
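As a purely illustrative sketch of the kind of modeling involved, the snippet below balances hourly solar generation against a telescope load with a battery as storage; all names and numbers (load profile, capacities, efficiency) are assumptions for illustration, not AtLAST design values.

    # Minimal hourly energy-balance sketch for a solar + battery system.
    # All parameters are illustrative assumptions, not AtLAST design values.

    def simulate_day(pv_kw, load_kw, battery_kwh=500.0, eta=0.9):
        """Dispatch a battery against hourly PV and load profiles.

        pv_kw, load_kw: lists of 24 hourly values [kW]
        battery_kwh:    usable storage capacity [kWh]
        eta:            round-trip efficiency (applied on charging)
        Returns the energy that still has to come from a backup source [kWh].
        """
        soc = 0.0          # state of charge [kWh]
        backup_kwh = 0.0   # unmet load a backup generator would have to cover
        for pv, load in zip(pv_kw, load_kw):
            surplus = pv - load
            if surplus >= 0:
                # charge the battery with the surplus, losses on the way in
                soc = min(battery_kwh, soc + eta * surplus)
            else:
                deficit = -surplus
                discharge = min(soc, deficit)
                soc -= discharge
                backup_kwh += deficit - discharge
        return backup_kwh

    # Flat 100 kW load, crude bell-shaped PV profile peaking at 400 kW.
    load = [100.0] * 24
    pv = [max(0.0, 400.0 - 40.0 * abs(h - 12)) for h in range(24)]
    print(f"Backup energy needed: {simulate_day(pv, load):.0f} kWh/day")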

Clarissa Akemi Kajiya Endo, Department of Biosciences

The survival and consequent successful recruitment of Northeast Arctic (NEA) cod is thought to depend on sufficient and suitable prey for the newly hatched larvae, in particular the nauplii stages of the lipid-rich calanoid copepod species Calanus finmarchicus. The role of spatial and temporal variations in prey availability, in combination with temperature and other factors, in influencing growth and survival of cod larvae is, however, incompletely understood. We assessed larval growth and survival until they settle in their feeding habitat in the Barents Sea in early fall by combining an individual-based model for NEA cod larvae at the Norwegian coast with a high-resolution ocean model and a nutrient-phytoplankton-zooplankton-detritus model providing 18 years of daily environmental conditions and prey availability. We find on average a two-week delay from the peak timing of first-feeding cod larvae to the peak in prey availability. In warm years, more larvae experience food limitation than in normal years. The positive effects of high temperature on growth, survival and ultimately recruitment are nonetheless larger than the negative effects of food limitation. Food limitation mainly affects larvae spawned in southern areas or late in the spawning season, as these larvae experience the highest temperatures and have the highest energy requirements. Our findings highlight the spatial and temporal differences in mechanisms that regulate growth and survival of early life stages of NEA cod and suggest that spatially resolved data may be essential for understanding match-mismatch dynamics.
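A minimal, purely schematic sketch of the kind of daily individual-based growth update described above; the functional forms and all coefficients are invented placeholders, not those of the coupled model used in the study.

    # Toy daily update for one cod larva: temperature sets potential growth,
    # prey availability can limit it. Coefficients are invented placeholders.

    def daily_growth(weight_mg, temp_c, prey_density, half_sat=5.0, g_max=0.12):
        """Return new dry weight [mg] after one day."""
        # Temperature-dependent potential specific growth rate (toy linear form).
        g_pot = g_max * (temp_c / 10.0)
        # Food limitation: saturating response to prey density.
        food_factor = prey_density / (prey_density + half_sat)
        return weight_mg * (1.0 + g_pot * food_factor)

    w = 0.05  # initial dry weight [mg]
    for day in range(30):
        w = daily_growth(w, temp_c=6.0, prey_density=3.0)
    print(f"Weight after 30 days: {w:.3f} mg")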

Olga Zlygosteva, Department of Physics

Introduction 
Radiation therapy (RT) is one of the main treatment modalities for cancer, along with surgery, chemotherapy, immunotherapy and hormonal therapy, with approximately 50% of all cancer patients receiving RT during the course of their illness. However, damage to normal tissue following RT causes severe side effects and reductions in quality of life. The aim of this work is to establish a mouse model to study RT-induced responses and underlying mechanisms in order to propose new strategies for mitigating side effects after photon and proton RT. The study was done within the PROCCA convergence environment, consisting of oncologists, physicists, biologists and psychologists, which addresses key life science challenges.
Materials and methods
C57BL/6J mice were irradiated with X-rays to total doses ranging from 30 to 65 Gy, given in 10 fractions over 5 days. The radiation field covered the head and neck region (H&N). Post-RT investigations included macroscopic and microscopic examinations of the tissues within the radiation field. 
Results 
Tissue damage localized to the irradiated region was observed for doses above 44 Gy. Histopathological examination confirmed the tissue injuries found macroscopically. The mouse strain tolerated the fractionated irradiation schedules well, which allowed testing of different schemes and optimization of protocols.
Discussion
The multidisciplinary nature of the project ensures maximum data extraction per mouse. The wide range of doses and analyses used allowed us to establish a preclinical model for further investigation of normal tissue damage after RT of the H&N region.

Sneha Pandit, Institute of Theoretical Astrophysics

Analysis of ALMA full disk maps of the Sun

The Sun, being the nearest star, can be observed well resolved and can thus be used as a reference case for solar-like stars. The Atacama Large Millimetre/sub-millimetre Array (ALMA) offers a new set of eyes to look at stars, including our Sun. Specifically, the brightness temperatures provided by ALMA shed light on the activity and the thermal structure of stellar atmospheres. The overall aim of the presented study is to establish more robust solar/stellar activity indicators using ALMA observations in comparison with classical diagnostics.

Here, full disk solar maps from ALMA are analysed in combination with SDO-AIA and HMI maps and with full disk H-alpha and Ca II maps to understand the correlation between them, which also places constraints on the height range mapped by ALMA. The centre-to-limb variation in temperature observed in the ALMA maps shows limb brightening, which is in line with the expectation that the radiation observed with ALMA originates from the chromosphere. In order to transfer the insights gained from solar ALMA observations to other stars, the full disk solar maps are converted into a corresponding stellar signal. Here we present the first results.
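As an illustration of the final step, converting a resolved full-disk map into a "Sun-as-a-star" signal, the sketch below area-averages brightness temperature over the solar disk; the synthetic map, array sizes and temperature values are assumptions for illustration only.

    import numpy as np

    # Toy "full-disk map": brightness temperatures on a square pixel grid.
    # In practice the map would be read from an ALMA FITS file; here we fake one
    # with a mild centre-to-limb brightening.
    n = 512
    y, x = np.mgrid[:n, :n]
    r = np.hypot(x - n / 2, y - n / 2) / (n / 2)   # radial distance, 1.0 = limb
    disk = r <= 1.0
    tb = np.where(disk, 6500.0 + 800.0 * r**2, np.nan)

    # Disk-integrated ("stellar") brightness temperature: mean over disk pixels.
    tb_star = tb[disk].mean()
    print(f"Disk-integrated brightness temperature: {tb_star:.0f} K")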

Simón Rodríguez-Satizábal, Department of Biosciences

Antifoulants, Veterinary Medicinal Products and Organic Material can affect marine sediment organisms, but to what extent? - ANTIVENOM

Simón Rodríguez-Satizábal (a), Katrine Borgå (b), Tânia Gomes (a), Adam Lillicrap (a), Samantha Martins (a), Anders Ruus (a,b) and Ailbhe Macken (a)
a Norwegian Institute for Water Research, Økernveien 94, 0579 Oslo, Norway
b Department of Biosciences, University of Oslo, PO Box 1066 Blindern, 0316 Oslo, Norway

E-mail contact: simon.rodriguez@niva.no 
---------------------------------------------------------------------------------------------

The ANTIVENOM project is investigating whether the use of chemicals in aquaculture has a different impact on the environment than previously considered, and whether existing guidance on the assessment of their environmental risks is sufficient to cover specifically acting chemicals or their combined effects. We aspire to gain a greater understanding of the hazards and risks of current chemical use in aquaculture. The aquaculture industry can be affected by biofouling and diseases, which can have severe economic consequences. To address these issues, industry has developed different types of antifouling compounds and veterinary medicinal products (VMPs). The knowledge generated from ANTIVENOM will serve as a basis for recommendations to improve the effects and environmental risk assessments (ERAs) of antifoulants and veterinary medicines used in aquaculture to support future sustainable practices within the industry. The project will focus on assessing the single and combined effects of chemicals, with the contribution of organic matter, on non-target sediment organisms, using non-standard hazard assessment strategies. The science delivered through ANTIVENOM will support and influence policy makers on the changes necessary to improve the ERA of chemicals used in aquaculture. It will also ensure better protection and mitigation of the impacts of aquaculture practices on the marine environment. Specific outcomes will be improvements to the regulatory frameworks and guidance documents for veterinary medicines and antifoulants, achieved by proposing new sediment toxicity tests and testing requirements that are most relevant for the protection of northern European marine waters.

Qindong Zhang, Department of Pharmacy

Tumor-associated macrophages (TAMs) are mostly derived from circulating monocytes and are a major component of most solid tumors. Within the tumor microenvironment, TAMs generally polarize into two extreme phenotypes, M1 and M2. M1 macrophages display anti-tumor effects, while M2 macrophages promote tumor progression and induce angiogenesis, metastasis, and immune suppression. Most, if not all, TAMs display the M2 phenotype. Moreover, the presence of TAMs is not only correlated with poor prognosis in most types of cancers but can also interfere with commonly used cancer therapies. Considering the correlation between poor prognosis in most cancer patients and a high density of TAMs, we aim to develop therapeutic peptides or antibodies to target these cells, with the hope of prolonging the life-span and improving the quality of life of patients. To date, we have successfully selected at least three targeting peptides. Among them, the NW peptide binds to monocytes and TAMs with high affinity. In addition, a fused lytic peptide derived from the NW peptide exhibits a strong ability to kill monocytes and TAMs. However, the receptor of the NW peptide remains unknown. To identify the receptor, experiments involving chemical crosslinking and co-immunoprecipitation were performed.
 

Jenny Bjordal, Department of Geosciences

The economic impacts of climate change are highly uncertain. On the climate side, the most important uncertainty is the climate sensitivity, which relates warming to CO2 concentrations. On the economic side, one of the most important uncertainties lies in the so-called damage functions, which relate climate change to economic damages and benefits. Despite broad awareness of these uncertainties, it is unclear which of them is most important, especially at the regional level.
We have constructed regional damage functions based on well-established global functions. We further combined these regional damage functions with different spatial distributions of warming for a given emission scenario, as simulated by two Earth System Models with vastly different climate sensitivity. 
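As a schematic illustration of how such regional damage functions can be combined with model-specific warming patterns, the sketch below applies a toy quadratic damage function to two hypothetical warming patterns; the functional form, coefficients and warming values are assumptions for illustration, not the functions or results of this study.

    # Toy quadratic damage function applied to two regional warming patterns.
    # Coefficients and warming values are illustrative, not study results.

    def damage_fraction(delta_t, a=-0.005, b=0.01):
        """GDP damage fraction for regional warming delta_t [deg C]."""
        return a * delta_t + b * delta_t**2

    # Regional warming [deg C] under the same scenario for two hypothetical
    # Earth System Models with low vs. high climate sensitivity.
    warming = {
        "region_A": {"low_sens": 1.5, "high_sens": 2.8},
        "region_B": {"low_sens": 3.0, "high_sens": 5.2},
    }

    for region, dt in warming.items():
        low = damage_fraction(dt["low_sens"])
        high = damage_fraction(dt["high_sens"])
        print(f"{region}: damages {low:.1%} (low sensitivity) vs {high:.1%} (high)")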
Our results show that uncertainties in climate sensitivity and in economic damage per degree of warming are of similar importance for the global economic impact. Increasing the climate sensitivity or the sensitivity of the damage function both increase the economic damages globally. Yet, at the country level, the effect varies depending on the initial temperature as well as on how much the country warms. This means that a given region can be either a winner or a loser depending on the combination of regional warming and damage function.
Our findings show that which uncertainties matter most can vary between regions. They also emphasise the importance of a regional focus, including uncertainties, in future research on economic impacts and their policy implications.

Federica Ghione, Department of Geosciences

We present a Probabilistic Seismic Hazard Analysis for Northeast India and Bhutan, one of the most seismically active regions of the world. A common approach for hazard evaluation is to use simple area source zones. This approach potentially underestimates the predicted ground motion level, due to the smearing effect of source zones on the distribution of the activity rates. To compensate for this limitation, a “hybrid” model which accounts for the presence of localized and potentially seismogenic structures is used, constrained by both past seismicity and structural geological data. This approach combines two different types of earthquake source models: homogeneous area source zones and finite faults. The study region has been partitioned into ten seismogenic source zones of similar seismic potential and seismotectonic characteristics. Earthquake recurrence parameters for each zone have been obtained by direct magnitude-frequency analysis of local earthquakes from a precompiled catalogue. For the seismogenic faults, the recurrence parameters were indirectly derived from the faults' slip rates calculated from GPS velocity data. Calculations were performed at Peak Ground Acceleration and several Spectral Acceleration periods for a Probability of Exceedance of 10% in 50 years, corresponding to a return period of 475 years, and for a reference rock condition of 800 m/s (shear-wave velocity). The results, presented as a series of hazard curves, maps and Uniform Hazard Spectra, highlight significant acceleration levels in the Arunachal Pradesh region (Northeast India), mostly due to the presence of the Himalayan Frontal Thrust. These results are particularly important for local risk assessment and other disaster mitigation-related studies.
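For reference, the quoted return period follows directly from the standard Poisson assumption linking a probability of exceedance P over an exposure time t to a return period T_R:

    \[ T_R = \frac{-t}{\ln(1 - P)} = \frac{-50}{\ln(1 - 0.10)} \approx 475 \ \text{years} \]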

Ines Junge, Department of Informatics

"To design or not to design" is a fair question to ask, considering that contemporary technology design is part of the problem, namely that planet Earth is urgently endangered as the basis of life for us and other species. In this study of the philosophy of technology in times of striving for sustainability, we regard and explore the potential of a new design philosophy that turns design from part of the problem to part of the solution, hence the title "Design is the Problem is the Solution". This new understanding of design entails replacing the old with a new model that makes the old obsolete (Buckminster Fuller). The prevalent efficiency thinking is thus replaced by sufficiency thinking. "Sufficient" means a little/a bit/a few that is enough to "serve the purpose" (satisfy a need), but not nothing/null (which would mean unmet needs). While there might be one or the other unnecessary thing (technology) that does not meet a real need, "sufficient consumption" is not a call to get rid of everything and "go back" in time or technological progress. Rather, it is a call to "flee forward" by resolving the problematic technology design with (a different) design. The study exemplifies a structuring into work packages for any design workshop or task to tackle this alternation. Using material science based knowledge support for designers, these different design WPs attempt to integrate sustainability with innovation, not going back to the primitive (past) but forward to the simple (future) by acknowledging the complicated between (the present) (Saint-Exupery).
 

Mahdieh Kamalian, Department of Informatics

A user's transport mode (e.g., car, bus, walk) can be detected automatically by smartphones, which are ubiquitous and equipped with many sensors. In addition, running smartphone-based Transport Mode Detection (TMD) in a fog environment extends its capabilities, ensuring low latency, low battery usage and high accuracy.

The main idea is to use a machine learning algorithm for a TMD system that has several steps: 1) data collection or sensing, 2) pre-processing, 3) feature extraction, and 4) classification. We already have a first prototype that uses a local classifier, GPS, accelerometer, and magnetometer. The novelty of this work is the use of the magnetometer data (to improve classification) and the fog approach.
 
The original classifier, generated in Lisbon, was trained on trip data collected there, but its test accuracy on the transport modes used in Oslo is not adequate. Therefore, we are using transfer learning and magnetometer readings to improve the classifier; transfer learning means that we continue training the above-mentioned classifier with trip data collected in Oslo, as sketched below. With this fog approach, we believe the methodology can be applied in any other city in the world.
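A minimal sketch of this transfer-learning step, assuming a scikit-learn-style incrementally trainable classifier and pre-extracted feature vectors; the feature dimensions, mode labels and randomly generated data are placeholders, not the project's real pipeline.

    import numpy as np
    from sklearn.linear_model import SGDClassifier
    from sklearn.metrics import accuracy_score

    MODES = ["walk", "bus", "car"]  # illustrative subset of transport modes

    # 1) "Lisbon" classifier: incrementally trainable so it can be updated later.
    clf = SGDClassifier(random_state=0)
    X_lisbon, y_lisbon = np.random.rand(500, 12), np.random.choice(MODES, 500)  # placeholder features
    clf.partial_fit(X_lisbon, y_lisbon, classes=MODES)

    # 2) Transfer learning: continue training the same model on Oslo trips.
    X_oslo, y_oslo = np.random.rand(200, 12), np.random.choice(MODES, 200)      # placeholder features
    for _ in range(5):  # a few extra passes over the new data
        clf.partial_fit(X_oslo, y_oslo)

    # 3) Evaluate on held-out Oslo trips.
    X_test, y_test = np.random.rand(100, 12), np.random.choice(MODES, 100)
    print("Oslo accuracy:", accuracy_score(y_test, clf.predict(X_test)))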
 
Our results are very encouraging. We obtained 36% accuracy when testing the original classifier (trained with trip data from Lisbon) in Oslo. With transfer learning, however, the accuracy improved to 89% in the fog approach. The magnetometer readings may improve the accuracy further; those results are not yet available.

Åsmund Dæhlen, Department of Informatics

Three of the most valuable companies in the world are Apple, Google, and Microsoft. Each of them is a platform ecosystem company relying on data as its principal value asset. On a global scale, health information systems are following similar models, where the value provided by the system equals the system's ability to share information. A crux of following the commercial entities' example is that these systems were not designed as a public goods service. Based on the belief that health information systems should be systems for the public, the public should also have a say in their trajectory. Using an analogy to four principal functions of citizen participation in indirect democracy, we present a model of Democratic Design of large-scale information systems. We classify four channels through which citizens can participate in a democracy: 1) channels of engagement, such as electoral systems; 2) channels of informing, i.e. media; 3) channels of direct transformative power, such as becoming a politician or staging rallies; and 4) channels of learning, spaces where citizens learn about the various channels of their democracy and how they can affect them. Based on this definition of participatory democracy, we propose four principles of user participation in the design of large-scale information systems for the public good. We use this simplified and stylized presentation of participation in democracy to illustrate what a socio-technically complex participatory practice can look like.

Sanaz Tavakolisomeh, Department of Informatics

Nowadays, there are several Garbage Collector (GC) solutions that can be used in an application. These GCs behave differently with regard to several performance metrics, in particular throughput, pause time, and memory usage. Thus, choosing the correct GC is far from trivial due to the impact that different GCs have on these metrics. The problem is particularly evident in applications that process high volumes of data/transactions, where a poor choice can lead to missed Service Level Agreements (SLAs) or high cloud hosting costs.
In this work, we present: i) a thorough evaluation of several of the most widely known and available GCs, and ii) a method to easily pick the best one. Choosing the best GC is done while taking into account the kind of application being considered (CPU- or I/O-intensive) and the performance metrics that one may want to prioritize: throughput, pause time, or memory usage.
We focus on the most popular GCs available for Java in OpenJDK HotSpot and use a large number of widely available and representative benchmarks from the DaCapo and Renaissance benchmark suites. We also developed an application to evaluate specific aspects of the GCs (read and write operations) and analyze the performance impact of different GCs on PetClinic, a popular microservice benchmark based on Spring Boot. When applied to the same kind of application (CPU- or I/O-intensive), different GCs can improve throughput by up to 6.06%, reduce pause time by up to 9.72%, or reduce memory consumption by up to 4.6%.
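A simplified sketch of how such a comparison can be driven: the same benchmark is launched under different HotSpot collectors, with the GC log collected for later analysis. The jar path, heap size and log parsing are placeholders; the flags shown are the standard HotSpot GC selection and unified logging flags.

    import subprocess

    # Standard HotSpot GC selection flags (ZGC requires a recent JDK).
    COLLECTORS = {
        "Serial":   "-XX:+UseSerialGC",
        "Parallel": "-XX:+UseParallelGC",
        "G1":       "-XX:+UseG1GC",
        "ZGC":      "-XX:+UseZGC",
    }

    BENCHMARK_JAR = "benchmark.jar"  # placeholder: e.g. a DaCapo or Renaissance launcher

    for name, flag in COLLECTORS.items():
        log_file = f"gc-{name}.log"
        cmd = [
            "java", flag, "-Xmx4g",
            f"-Xlog:gc*:file={log_file}",   # unified GC logging (JDK 9+)
            "-jar", BENCHMARK_JAR,
        ]
        print("Running:", " ".join(cmd))
        subprocess.run(cmd, check=False)
        # Throughput, pause times and memory usage can then be parsed from the
        # GC logs and benchmark output for each collector and compared.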

Ingunn Hanson, Department of Physics

How can ionizing radiation increase survival of cells?

Ionizing radiation can damage DNA and kill cells: this is the basis of radiation therapy against cancer. In small doses, however, radiation can make cells more resistant to subsequent doses of radiation – this is what we call priming. If we give the small priming dose over a protracted time, the protection against later radiation damage becomes permanent for the cells and their progeny.

We irradiated cells with gamma rays, x-rays, or a combination. We assessed their survival using the clonogenic survival assay, where we calculate the ratio of cells that are able to form colonies of daughter cells after treatment. We also assessed the cell cycle progression of the cells by measuring the amount of phosphohistone-H3, a marker of cell division. 
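In its standard form, the surviving fraction reported by this assay is normalised to the plating efficiency (PE) of untreated control cells, which is also what allows apparent survival above 100%:

    \[ \mathrm{PE} = \frac{\text{colonies counted}_{\text{control}}}{\text{cells seeded}_{\text{control}}}, \qquad \mathrm{SF} = \frac{\text{colonies counted}_{\text{treated}}}{\text{cells seeded}_{\text{treated}} \times \mathrm{PE}} \]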

Low dose-rate priming doses of 0.3 Gy improved the radioresistance of cells, and removed the native hyper-radiosensitive response permanently. The primed cells experienced cell cycle arrest before division to a larger degree than unprimed cells did. Surprisingly, after challenge irradiation of 0.1-0.3 Gy, the survival of the primed cells increased to more than 100%. 

Activation of a checkpoint before cell division gives the cells time to repair DNA damage and increases their survival. The priming irradiation activates cellular signal molecules that allow this to happen even for very low doses. A survival of more than 100% is possibly explained by increased or re-activated cell division:  cells that were originally resting may have re-entered the cell cycle and started dividing.

Jingpeng Li, Department of Physics

Magnetic Resonance Imaging (MRI) acquisition is an inherently slow process, which limits its use in intraoperative settings. To accelerate data acquisition, compressed sensing (CS) reconstructs images from undersampled measurements by leveraging the transform sparsity of MR images. However, these traditional methods often incur a high computational cost and perform worse at high acceleration factors. Recently, deep learning methods have been shown to outperform current state-of-the-art MR reconstruction methods with respect to both reconstruction accuracy and computational speed. In this paper, we present a novel deep learning architecture integrating a self-attention module with convolutional recurrent neural networks for accelerated MRI reconstruction. By capturing long-range dependencies between image regions, the proposed method is able to effectively remove aliasing artifacts and preserve finer anatomical image details. The proposed method is evaluated on healthy volunteer brain images and intraoperative MRI scans, and we show that it consistently achieves competitive performance and generalizes impressively well to new patient data.
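For context, a common formulation of the compressed-sensing reconstruction referred to above recovers the image x from undersampled k-space data y by solving

    \[ \hat{x} = \arg\min_{x} \; \tfrac{1}{2}\,\lVert F_u x - y \rVert_2^2 + \lambda \lVert \Psi x \rVert_1 , \]

where F_u is the undersampled Fourier (encoding) operator, \Psi a sparsifying transform, and \lambda a regularisation weight; learning-based approaches such as the one presented here effectively replace or complement the hand-crafted sparsity prior with a trained network.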

Heidi Bråthen, Department of Informatics

Introduction 
The need and right of users to have autonomy are vital in the design of assistive technology. Autonomous assistive technologies can increasingly be developed without consideration of user control or understanding. This poses new challenges to user autonomy and to the disciplines of Human-Computer Interaction and Interaction Design, which have traditions of researching and designing for usability and learnability.

The Scandinavian Participatory Design (PD) tradition further advocates the rights of users of technology to take part in decisions regarding the creation of that technology. The black-boxed expression of autonomous technology reinforces existing challenges in communicating about technological possibilities in PD processes.

Material and methods 
To explore this problem of limited access to and insight into the operations of autonomous technology, I explore how a relational perspective could be applied to designing for user autonomy. From a relational perspective, technology can be designed to relate to users by becoming "aware" of user needs and thereby support user autonomy. Relational interdependencies can be designed for, even if autonomous technology may not need them from a strictly functional perspective.
A relational and situated understanding of users' autonomy and abilities requires engaging with real users in the context of interacting with autonomous technologies. To research possibilities for designing for user autonomy, I therefore propose a participatory design process where the relationship between end-user participants and assistive robots is explored through collaborative prototyping.

Results and Discussion
A central contribution of the proposed project will be to explore and test approaches to collaborative prototyping in PD processes, specifically exploring AI and machine learning technologies together with end-user participants.