Parkinson's disease (PD) is a progressive movement disorder caused by the death of dopamine-producing cells in the midbrain. Frequent symptom assessment is needed, since treatment must be individualized as the disease progresses. The aim of this paper was to verify and further investigate the clinimetric properties of an entropy-based method for measuring PD-related upper limb temporal irregularities during spiral drawing tasks. More specifically, properties of a temporal irregularity score (TIS) were investigated for patients at different stages of PD and at different medication time points. Nineteen PD patients and 22 healthy controls performed repeated spiral drawing tasks on a smartphone. Patients performed the tests before a single levodopa dose and at specific time intervals after the dose was given. Three movement disorder specialists rated videos of the patients based on the unified PD rating scale (UPDRS) and the Dyskinesia scale. Differences in mean TIS between the groups of patients and healthy subjects were assessed. Test-retest reliability of the TIS was measured. The ability of TIS to detect changes from baseline (before medication) to later time points was investigated. Correlations between TIS and clinical rating scores were assessed. The mean TIS differed significantly between healthy subjects and patients in advanced stages (p-value = 0.02). Test-retest reliability of TIS was good, with an intra-class correlation coefficient of 0.81. When assessing changes in relation to treatment, TIS contained some information capturing changes from Off to On and wearing-off effects. However, the correlations between TIS and clinical scores (UPDRS and Dyskinesia) were weak. TIS was able to differentiate spiral drawings drawn by patients in an advanced stage from those drawn by healthy subjects, and TIS had good test-retest reliability. TIS was somewhat responsive to single-dose levodopa treatment. Since TIS is based on high-frequency components of upper limb movement, it captures information that cannot be detected by visual inspection during clinical assessment.
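The abstract does not give the exact formula behind TIS, but entropy-based irregularity measures for movement signals are commonly computed as approximate or sample entropy of a derived time series such as drawing speed. The sketch below is a minimal illustration under that assumption, not the authors' actual implementation:

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """SampEn: -log of the ratio of (m+1)- to m-length template matches
    within tolerance r (Chebyshev distance, self-matches excluded)."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def matches(length):
        templates = np.array([x[i:i + length] for i in range(len(x) - length)])
        total = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates - templates[i]), axis=1)
            total += np.sum(dist <= r) - 1   # exclude the self-match
        return total

    b, a = matches(m), matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def drawing_speed(xy, t):
    """Speed series from touch samples: xy is an (n, 2) array of screen
    coordinates, t the matching (n,) array of timestamps in seconds."""
    return np.hypot(*np.diff(np.asarray(xy), axis=0).T) / np.diff(t)

# tis = sample_entropy(drawing_speed(xy, t))  # higher = more irregular
```

A higher value indicates a less predictable, more irregular speed profile, which is the kind of temporal irregularity such a score is intended to capture.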

Parking a vehicle can often lead to frustration, air pollution and congestion due to the limited availability of parking spaces. With increasing population density, this problem will only grow unless addressed. Parking lots occupy large areas of scarce land, so it is necessary to understand driving behaviour within them in order to improve their use. This paper studies driving behaviour in parking lots; to this end, direct observation was conducted in three parking lots, and GPS data collected prior to this study by Dalarna University was used.

To evaluate driving behaviour in the parking lot, direct observation was conducted to obtain overall indices of vehicle movement. The parking route taken by the driver was compared with the optimal path to characterize driving behaviour in terms of distance. The collected data was evaluated, filtered and analysed to identify the route, the distance and the time a vehicle takes to find a parking space.
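As an illustration of the distance comparison, the observed route can be related to the shortest path on a graph of the lot's driving lanes. The sketch below uses hypothetical node names and distances (the study itself relied on manual observation):

```python
import networkx as nx

# Hypothetical driving-lane graph of a parking lot: nodes are junctions
# and bays, edge weights are segment lengths in metres.
G = nx.Graph()
G.add_weighted_edges_from([
    ("entrance", "aisle_A", 30),
    ("aisle_A", "aisle_B", 25),
    ("aisle_B", "bay_17", 10),
    ("aisle_A", "bay_17", 40),
])

def detour_ratio(G, route):
    """Observed driving distance divided by the shortest-path distance
    from the route's start to its end; 1.0 means the driver was optimal."""
    observed = sum(G[u][v]["weight"] for u, v in zip(route, route[1:]))
    optimal = nx.shortest_path_length(G, route[0], route[-1], weight="weight")
    return observed / optimal

# This driver took the 40 m segment although going via aisle_B is shorter:
print(detour_ratio(G, ["entrance", "aisle_A", "bay_17"]))  # 70/65 ≈ 1.08
```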

The outcome of the study shows that driving behaviour in parking lots varies significantly among users, and most of the observed vehicles took an unnecessarily long time to complete their parking. The study shows that 56% of the 430 observed vehicles demonstrated inefficient driving behaviour, taking a longer driving path than the optimal one. The study traces this behaviour to two factors: first, the absence of parking guidance in the lots, and second, drivers' selectivity when choosing a parking space.

The study also shows that the ability of GPS data to identify driving behaviour in parking lots varies with the logging interval and the type of device used. The smaller the time interval, the more accurately the GPS data captures driving behaviour in the parking lot.
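To see why the logging interval matters, note that the driven distance is usually reconstructed by summing straight-line legs between consecutive fixes, so sparse fixes cut corners and hide manoeuvring. A small sketch of this effect, assuming a (lat, lon) track and the haversine formula:

```python
import numpy as np

def haversine(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    R = 6371000.0
    p1, p2 = np.radians(lat1), np.radians(lat2)
    dp, dl = p2 - p1, np.radians(lon2 - lon1)
    a = np.sin(dp / 2) ** 2 + np.cos(p1) * np.cos(p2) * np.sin(dl / 2) ** 2
    return 2 * R * np.arcsin(np.sqrt(a))

def path_length(track):
    """Sum of straight-line leg distances for a track of (lat, lon) fixes."""
    return sum(haversine(*a, *b) for a, b in zip(track, track[1:]))

def retained_fraction(track, k):
    """Keeping every k-th fix of a dense track mimics a longer logging
    interval; the ratio shows how much path detail survives subsampling."""
    return path_length(track[::k]) / path_length(track)
```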

This paper presents a two-step pseudo-likelihood estimation technique for generalized linear mixed models with correlated random effects. The proposed estimation technique does not require reparametrisation of the model. A multivariate Taylor approximation is used to approximate the intractable integrals in the likelihood function of the GLMM. Based on the analytical expression for the estimator of the covariance matrix of the random effects, a condition is presented for when such a covariance matrix can be estimated through the estimates of the random effects. An application of the model with a binary response variable is presented using a real data set on credit defaults from two Swedish banks. Owing to the two-step estimation technique, the proposed algorithm outperforms conventional pseudo-likelihood algorithms in terms of computational time.
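For readers unfamiliar with why these integrals are intractable: in a GLMM the random effects must be integrated out of the joint density, and for non-Gaussian responses the integral has no closed form. In standard notation (ours, not necessarily the paper's):

```latex
L(\beta,\Sigma) \;=\; \int_{\mathbb{R}^q}
  \Big[\prod_{i=1}^{n} f\big(y_i \mid \beta, u\big)\Big]\,
  \phi(u;\,0,\,\Sigma)\,\mathrm{d}u,
\qquad u \sim N(0,\Sigma),
```

where \phi denotes the multivariate normal density; the paper's multivariate Taylor approximation replaces the integrand with a tractable expansion around a working value of u.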

15.

Alam, Moudud

Dalarna University, School of Technology and Business Studies, Statistics.

This thesis deals with developing and testing feasible computational procedures to facilitate estimation of, and prediction with, the generalized linear mixed model (GLMM), with a view to applying them to large data sets. The work is motivated by an issue arising in credit risk modelling. We have access to a huge data set, consisting of about one million observations, on credit histories obtained from two major Swedish banks. The principal research interest in the data analysis is to model the probability of credit default while incorporating the systematic dependencies among default events. In order to model dependent credit defaults we adopt the framework of the GLMM, a popular approach for modelling correlated binary data. However, existing computational procedures for GLMMs did not offer us the flexibility to incorporate the desired correlation structure of default events. For feasible estimation of the GLMM we propose two estimation techniques: the fixed effects (FE) approach and the two-step pseudo-likelihood (2PL) approach. The precision of the estimation techniques and their computational advantages are studied by Monte Carlo simulation and by applying them to credit risk modelling. Regarding prediction, we show how to apply the likelihood principle to carry out prediction with the GLMM. We also provide an R add-in package to facilitate predictive inference for the GLMM.

16.

Alam, Moudud

Dalarna University, School of Technology and Business Studies, Statistics.

This paper presents a two-step pseudo-likelihood estimation technique for generalized linear mixed models with random effects that are correlated between groups. The core idea is to deal with the intractable integrals in the likelihood function by a multivariate Taylor approximation. The accuracy of the estimation technique is assessed in a Monte Carlo study. An application with a binary response variable is presented using a real data set on credit defaults from two Swedish banks. Thanks to the two-step estimation technique, the proposed algorithm outperforms conventional pseudo-likelihood algorithms in terms of computational time.

It is commonly agreed that credit defaults are correlated. However, the mechanism of such dependence is not yet fully understood. This paper contributes to the current understanding of default comovement in the following way: assuming that industries provide the basis of default comovement, it provides empirical evidence as to how such comovements can be modeled using correlated industry shocks. A generalized linear mixed model (GLMM) with correlated random effects is used to model the default comovement. Empirical evidence is drawn by analyzing individual borrower-level credit history data obtained from two major Swedish banks over the period 1994-2000. The results show that defaults are correlated both within and between industries, but not over time (quarters). A discussion is also presented as to how a GLMM for default correlation can be explained.

18.

Alam, Moudud

Dalarna University, School of Technology and Business Studies, Statistics.

This paper presents techniques of likelihood prediction for generalized linear mixed models. Methods of likelihood prediction are explained through a series of examples, from a classical one to more complicated ones. The examples show that, in simple cases, likelihood prediction (LP) coincides with already known best frequentist practice, such as the best linear unbiased predictor. The paper outlines a way to deal with covariate uncertainty while producing predictive inference. Using a Poisson error-in-variable generalized linear model, it is shown that in complicated cases LP produces better results than already known methods.
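A standard special case (our illustration, not taken from the paper) shows what LP reduces to in the linear mixed model: maximising the joint density f(y, u) over u, with the variance components treated as known, recovers the classical BLUP:

```latex
y = X\beta + Zu + e, \quad u \sim N(0, G), \quad e \sim N(0, R),
\qquad
\hat{u} \;=\; G Z^{\top} V^{-1}\big(y - X\hat{\beta}\big),
\quad V = Z G Z^{\top} + R.
```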

In this paper, we discuss how a regression model with a non-continuous response variable, which allows for dependency between observations, should be estimated when observations are clustered and measurements on the subjects are repeated. The cluster sizes are assumed to be large. We find that the conventional estimation technique suggested by the literature on generalized linear mixed models (GLMM) is slow and sometimes fails due to non-convergence and lack of memory on standard PCs. We suggest estimating the random effects as fixed effects by a generalized linear model and deriving the covariance matrix from these estimates. A simulation study shows that our proposal is feasible in terms of mean-square error and computation time. We recommend that our proposal be implemented in GLMM software so that the estimation procedure can switch between the conventional technique and our proposal, depending on the size of the clusters.
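A minimal sketch of the proposal in Python (simulated data, hypothetical names; the paper gives the analytical details): fit one fixed intercept per cluster in an ordinary GLM, then back out the random-effect variance from the spread of the estimates, correcting for their sampling error.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated stand-in data: a binary response, one covariate, 50 large clusters.
rng = np.random.default_rng(1)
n = 5000
df = pd.DataFrame({"x": rng.normal(size=n), "cluster": rng.integers(0, 50, n)})
u = rng.normal(scale=0.7, size=50)                  # true random effects
p = 1 / (1 + np.exp(-(0.5 * df["x"] + u[df["cluster"]])))
df["y"] = rng.binomial(1, p)

# Step 1: estimate the cluster effects as fixed effects in an ordinary GLM
# (one dummy per cluster, no common intercept).
dummies = pd.get_dummies(df["cluster"], prefix="c", dtype=float)
X = pd.concat([df[["x"]], dummies], axis=1)
fit = sm.GLM(df["y"], X, family=sm.families.Binomial()).fit()

# Step 2: derive the random-effect variance from the estimated effects,
# subtracting the average squared standard error, since the raw spread of
# the estimates is inflated by sampling noise (method of moments; sensible
# when clusters are large, as the paper assumes).
effects = fit.params[dummies.columns]
sigma2_u = effects.var(ddof=1) - (fit.bse[dummies.columns] ** 2).mean()
print(f"estimated random-effect variance: {sigma2_u:.2f} (true 0.49)")
```

The correction term matters because the raw variance of the estimated effects overstates the true variance; with large clusters the standard errors are small and the correction is minor.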

This paper traces the developments of credit risk modeling in the past 10 years. Our work can be divided into two parts: selecting articles and summarizing results. On the one hand, by constructing an ordered logit model on historical Journal of Economic Literature (JEL) codes of articles about credit risk modeling, we sort out the articles most related to our topic. The result indicates that JEL codes have become the standard for classifying research in credit risk modeling. On the other hand, compared with the classical review of Altman and Saunders (1998), we observe some important changes in the research methods of credit risk. The main finding is that the focus of credit risk modeling has moved from static individual-level models to dynamic portfolio models.

We consider methods for estimating the causal effects of treatment in the situation where the individuals in the treatment and the control group are self-selected, i.e., the selection mechanism is not randomized. In this case, a simple comparison of treated and control outcomes will not generally yield valid estimates of the causal effect. The propensity score method is frequently used for the evaluation of treatment effects. However, this method is based on some strong assumptions, which are not directly testable. In this paper, we present an alternative modelling approach to drawing causal inferences, using a shared random-effect model, together with the computational algorithm for drawing likelihood-based inference with such a model. With small numerical studies and a real data analysis, we show that our approach not only gives more efficient estimates but is also less sensitive to the model misspecifications we consider than existing methods.

We consider methods for estimating causal effects of treatment in the situation where the individuals in the treatment and the control group are self-selected, i.e., the selection mechanism is not randomized. In this case, a simple comparison of treated and control outcomes will not generally yield valid estimates of causal effects. The propensity score method is frequently used for the evaluation of treatment effects. However, this method is based on some strong assumptions, which are not directly testable. In this paper, we present an alternative modeling approach to drawing causal inference, using a shared random-effect model, together with the computational algorithm for drawing likelihood-based inference with such a model. With small numerical studies and a real data analysis, we show that our approach not only gives more efficient estimates but is also less sensitive to the model misspecifications we consider than existing methods.
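A generic shared random-effect formulation (our illustration; the authors' exact parameterisation may differ) makes the idea in the two preceding abstracts concrete: one latent variable drives both selection into treatment and the outcome, so conditioning on it accounts for the self-selection:

```latex
T_i \mid u_i \sim \mathrm{Bernoulli}\big(g^{-1}(z_i^{\top}\alpha + u_i)\big),
\qquad
\mathrm{E}[y_i \mid T_i, u_i] = h^{-1}\big(x_i^{\top}\beta + \tau T_i + \gamma u_i\big),
\qquad u_i \sim N(0,\sigma^2),
```

where g and h are link functions and \tau is the treatment effect of interest.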

We present a new version (> 2.0) of the hglm package for fitting hierarchical generalized linear models (HGLMs) with spatially correlated random effects. CAR() and SAR() families for conditional and simultaneous autoregressive random effects were implemented. Eigen decomposition of the matrix describing the spatial structure (e.g., the neighborhood matrix) was used to transform the CAR/SAR random effects into an independent, but heteroscedastic, Gaussian random effect. A linear predictor is fitted for the random effect variance to estimate the parameters in the CAR and SAR models. This gives a computationally efficient algorithm for moderately sized problems.

We present a new version of the hglm package for fitting hierarchical generalized linear models (HGLM) with spatially correlated random effects. A CAR family for conditional autoregressive random effects was implemented. Eigen decomposition of the matrix describing the spatial structure (e.g. the neighborhood matrix) was used to transform the CAR random effects into an independent, but heteroscedastic, Gaussian random effect. A linear predictor is fitted for the random effect variance to estimate the parameters in the CAR model. This gives a computationally efficient algorithm for moderately sized problems (e.g. n<5000).
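A numerical sketch of the transformation described in the two entries above, for a hypothetical neighbourhood matrix, assuming the common parameterisation in which the CAR precision is tau * (I - rho * W):

```python
import numpy as np

# Hypothetical symmetric neighbourhood matrix W for four areas in a row.
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

eigvals, V = np.linalg.eigh(W)   # W symmetric: real spectrum, orthogonal V

tau, rho = 2.0, 0.15             # hypothetical dispersion/spatial parameters
# In the eigenbasis the CAR effect decorrelates: v = V.T @ u has independent
# components with heteroscedastic variances
var_v = 1.0 / (tau * (1.0 - rho * eigvals))
print(var_v)

# hglm exploits this: it fits an ordinary HGLM with independent random
# effects whose inverse variance, tau - (tau*rho)*lambda_i, is linear in
# the eigenvalues, i.e. a linear predictor for the dispersion.
```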

Data migration projects have a high risk of failure, often caused by underestimating the importance of planning the migration of data between systems. The most common cause of failure is inadequate validation. Knowing what to be aware of in order to minimize the risks of data migration is therefore helpful.

This includes steps from David Barkaway for data migration projects on how to gather and retain generated knowledge for future projects, as well as steps for how to perform data migration and avoid common causes of failure.

Based on the collected theory, interview questions will be written that enable me to create a list of criteria for evaluating data migration tools.

In this study I have chosen two different tools to evaluate. I will interview four IT consultants with experience of data migration to determine which criteria these tools should fulfil to be of interest to them.

These tools are:

1. SQL Server Integration Services (SSIS): a Microsoft service within the .NET environment for data migration and Business Intelligence.

2. Talend Open Studio for Data Integration: an open-source product from Talend, written in Java, used for data migration.

The result shows that Talend fulfills more of the criteria than SSIS does.

31. Alvarez-Castro, J.M.

et al.

Carlborg, Ö.

Rönnegård, Lars

Dalarna University, School of Technology and Business Studies, Statistics.

We introduce this communication with a brief outline of the historical landmarks in genetic modeling, especially concerning epistasis. Then, we present methods for the use of genetic modeling in QTL analyses. In particular, we summarize the essential expressions of the natural and orthogonal interactions (NOIA) model of genetic effects. Our motivation for reviewing that theory here is twofold. First, this review presents a digest of the expressions for the application of the NOIA model, which are often mixed with intermediate and additional formulae in the original articles. Second, we make the required theory handy for the reader to relate the genetic concepts to the particular mathematical expressions underlying them. We illustrate those relations by providing graphical interpretations and a diagram summarizing the key features for applying genetic modeling with epistasis in comprehensive QTL analyses. Finally, we briefly review some examples of the application of NOIA to real data and the way it improves the interpretability of the results.

This paper describes how distance education programmes in developing countries can enhance interactivity by means of information and communication technologies. It is argued that e-learning involves a shift in the educational structure from traditional transmission of knowledge to interactive creation of knowledge. Our case studies are two distance education programmes in Bangladesh and Sri Lanka that use different technologies for implementing interactivity: Internet and computers in one case, and video and mobile phones in the other. The findings are analyzed based on Structuration Theory, and we compare the two approaches based on emerging norms and beliefs. Findings from both cases show the concurrent enactment of both the transmission and the interactive structure. Whereas peer collaboration and the use of self-assessment tools make students take more ownership of their learning, we also found the idea of a classroom with an instructive teacher to be deeply rooted in the students' minds.

There is a debate about the advantages and disadvantages of using social media in education. Drawing on interviews and surveys with students and teachers in three Swedish schools, this study finds that students as well as teachers find much of the students' social media use distractive to learning. We investigate this by means of an interpretative study of students' and teachers' experiences. We find that concerns relate to how social media use makes students less social, how weaker students are more likely to get distracted, how teachers lack strategies for tackling the problem, and how the responsibility for the use is delegated to the students. We discuss how the distractive use of social media is made possible as a result of education policies requiring a higher degree of individual work, individual responsibility, and educational choices for students. Teachers and school leaders need to jointly reclaim the students, and coping strategies for the distractive use are urgently needed.

Software deployment can be seen as the process comprising all activities needed to make software available to users without manual installation on the user's computer or other machine. Several software deployment tools that manage automated installations are available to enterprises on the market today.

The HVDC department at ABB in Ludvika needs to start using a tool for automated installation of software that is currently installed manually, which is time-consuming. As a Microsoft partner, ABB wants to see how Microsoft's software deployment tool could meet this need.

Our study aimed to investigate what the department's current work with software installations looks like and to find opportunities for improvement for installations that cannot be automated at this time. The study also includes developing a general framework for how businesses can proceed when they want to start using a software deployment tool. The framework includes a requirement specification that is evaluated against Microsoft's tool.

To form a picture of how the work in the business looks today, we performed surveys and interviews with staff at HVDC. In order to develop the framework, we used the data collected from the interviews, questionnaires and group interviews to identify the staff's requirements and wishes for a software deployment tool. Literature studies were also conducted to create a theoretical basis to draw on when developing the framework and the requirement specification.

Our studies have resulted in a description of software deployment, opportunities for improvement in the work with software installations, and a general framework describing how businesses can proceed when they are about to start using a software deployment tool. The framework also provides a set of requirements that have been used to evaluate Microsoft's tool for software distribution. In our study we have not seen that anyone has previously developed a general framework and requirement specification that businesses can use as a basis when starting to use a software deployment tool; our results can fill this knowledge gap.

This BA thesis investigates whether the aspects dynamic range, color reproduction and resolution have any effect on how an audience experiences the quality of video/moving images. Today, with fast-evolving technology, camera manufacturers have to keep producing new cameras to keep up with their competitors. This saturates the market and makes choices harder for filmmakers who are not knowledgeable about how a camera works and what is important. In our experience, filmmakers and camera manufacturers focus more on the technical specifications of cameras than on what the production actually needs. To answer our question about the three camera aspects, we conducted two different kinds of studies. The first is objective, free of opinions, and contains only data on how four different cameras compare with one another. We then compare the data from this study with a public survey in which we show an audience four films that might appear identical but were shot with different cameras, the same cameras as in the objective tests.

What we discovered is that dynamic range, color reproduction and resolution have little to no effect on how the audience experiences the video/moving images. The aspect with the greatest effect on the audience was color, but reproduction accuracy was not the important factor and fell outside the scope of our study.

43.

Andersson, Ing-Marie

et al.

Dalarna University, School of Technology and Business Studies, Occupational science.

Rosén, Gunnar

Dalarna University, School of Technology and Business Studies, Occupational science.

Fällstrand Larsson, Nina

Dalarna University, School of Technology and Business Studies, Occupational science.

Integrating study visits to different workplaces in higher education brings important benefits for course quality. A study visit gives students a better understanding of the real situations they will meet in working life. However, for practical and economic reasons this is not always possible. The purpose of this project is to create a virtual company that can replace a real one for study visits. The goal is to create a realistic picture, so that its intended use can come as close as possible to a real study visit. It is also important to facilitate linking theory and practice. The virtual company is built up from pictures, videos and text. All material is made available on a web page, and on entering, students meet a layout of the company. From that position it is possible to walk around and watch videos from different workstations. Besides that, they can also listen to interviews with managers and staff representatives, as well as read reports concerning productivity and the work environment. The focus of the study visit is work science, so the material also includes some visualized information about work hazards. The web page also contains a number of tasks for the students to carry out. Up to the autumn of 2011, 132 students at Dalarna University had visited and produced reports from the virtual company. They were studying in programmes for mechanical engineering, production technicians and human resource management. An evaluation among some ten students showed that the study visit to the virtual company is flexible in time and effective, but that students wish for even more detailed information about the company. Experience from four years of use in a number of classes shows that the concept is worth further development. Furthermore, with the production of new material, the concept is likely to be applicable for other purposes.

Role-Based Access Control is a standardized and well-established model for handling access rights. However, the accepted ANSI standard 359-2004 lacks support for geographically delimiting role authorizations. Information systems handling geographical data, together with the increasing use of mobile devices, call for a discussion of such spatial aspects within the context of Role-Based Access Control. This thesis seeks to shed light on the current state of knowledge within the subject area and to identify aspects of it that are in need of further development. The theoretical framework conceived in the initial literature review made it possible to conduct a systematic literature review, and the synthesis and analysis of the data, together with the theoretical framework, have led to the work's contributions: an overview presenting the state of the art in the area, and a structured list of research and development needs within the area of study.
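To make the missing capability concrete, here is a toy sketch (our own illustration, not part of ANSI 359-2004 or of any proposal reviewed in the thesis) of a role whose permissions apply only while the user is inside a geographic region:

```python
from shapely.geometry import Point, Polygon

class SpatialRole:
    """A role whose authorizations are spatially delimited: a permission
    is granted only while the user's position lies in the role's region."""

    def __init__(self, name, permissions, region):
        self.name = name
        self.permissions = set(permissions)
        self.region = region  # shapely Polygon in some map projection

    def authorises(self, permission, location):
        return (permission in self.permissions
                and self.region.contains(location))

# Hypothetical example: an inventory role valid only inside a warehouse.
warehouse = Polygon([(0, 0), (100, 0), (100, 60), (0, 60)])
picker = SpatialRole("picker", {"read_inventory"}, warehouse)

print(picker.authorises("read_inventory", Point(50, 30)))   # True: inside
print(picker.authorises("read_inventory", Point(150, 30)))  # False: outside
```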

Usability is a concept that exists to make it easier for visitors to find the right information efficiently and with good satisfaction. There are various guidelines for achieving good usability on websites, each with its own criteria that need to be met before a site can be considered usable. I have explored and evaluated a website that contains an online system, based on previous research on the usability of websites and on data collected through user testing and surveys. I have analyzed the gathered results to try to answer the question: how important is it to follow usability guidelines in the development of a website with a reservation system? My research has shown that guidelines are something one should follow to the best of one's ability, but only if the guidelines are relevant to the product's objectives, and one can argue for the need for usability guidelines targeting sites with specific purposes.

46.

Andersson, Jonas

Dalarna University, School of Technology and Business Studies, Information Systems.

This work is a survey of CGI's internally developed invoice system, CDI. The invoice system is used by Trafikverket, and the foundation of this work is a survey that maps out the problems with the current user interface in CDI. The work focuses only on the view where the user performs accounting of invoices.

The user interface in use today was developed around 2007 and, according to CGI, needs modernization. The overall idea is to replace the user interface with a new, updated one that helps the user more than the current one does.

The purpose of the report is to investigate what problems users are experiencing with the current user interface and, with the help of two self-developed design proposals, evaluate how to fix these problems.

This is done with the help of a zero-state analysis that maps the current problems, and a second analysis that evaluates the design proposals. Both analyses are based on interviews with three persons from Trafikverket who work with the invoice system on a daily basis.

When the evaluation is done, a revised version of the earlier design proposals is produced; this is the final product of this report. The final product will later be used as a foundation in the work on developing the new user interface for CDI.

Digitalization is a hot topic in today's world. It is almost a must for businesses to engage in digitalization if they want to keep thriving on the market. A study conducted by Tillväxtanalys, a Swedish government agency, shows that businesses within the Swedish construction sector are less likely to have a high degree of digital maturity and a digitalized business. The study also shows that small businesses in general are falling behind big businesses with regard to digitalization. In a small business in the Swedish construction sector, the management has tried for years to digitalize its analogue business. The problem, however, is fear of and resistance to change within the organization.

The purpose of this thesis report was to identify and describe the factors needed to engage the employees of a small business in digitalization, to identify the types of change resistance that exist within the organization, and to determine its degree of digital maturity.

This case study was conducted using interviews and a questionnaire answered by the employees of the company. The collected data was then analyzed and compared with previous research. The results show that there are two types of change resistance within the organization and that five factors are important for engaging employees in digitalization within a small business: competence, trust, communication, an organization-wide need and vision for the change, and finally participation.

49.

Andersson, Michael

et al.

Dalarna University, School of Technology and Business Studies, Computer Engineering.

Mickols, Andreas

Dalarna University, School of Technology and Business Studies, Computer Engineering.

The use of Intrusion Detection Systems is normal today in bigger companies, but the solutions found on the market are often too expensive for smaller companies. Therefore, we saw the need to investigate whether there is a more affordable solution. In this report, we show that it is possible to use low-cost single-board computers as part of a bigger centralized Intrusion Detection System. To investigate this, we set up a test system including two Raspberry Pi 3 Model B units, a cloud server, and two home networks, one with port mirroring implemented in firmware and the other with a dedicated SPAN port. The report shows how we set up the environment and the testing we did to prove that this is a working solution.
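The abstract does not name the software stack, so the following is only an illustrative sketch of the architecture: a sensor process on the single-board computer reads frames from the mirrored port and ships lightweight events to the central server (hypothetical address; Linux only, requires root):

```python
import json
import socket
import struct

COLLECTOR = ("ids.example.org", 5140)   # hypothetical central server

def ip_summary(frame):
    """Extract (src, dst, protocol) from an IPv4 packet in an Ethernet
    frame; returns None for non-IPv4 traffic (no VLAN tags assumed)."""
    if struct.unpack("!H", frame[12:14])[0] != 0x0800:   # EtherType != IPv4
        return None
    proto = frame[23]                                    # IP protocol number
    src = socket.inet_ntoa(frame[26:30])
    dst = socket.inet_ntoa(frame[30:34])
    return src, dst, proto

def run_sensor():
    # An AF_PACKET raw socket sees every frame arriving on the mirrored
    # port (ETH_P_ALL = 0x0003).
    sniffer = socket.socket(socket.AF_PACKET, socket.SOCK_RAW,
                            socket.ntohs(0x0003))
    report = socket.create_connection(COLLECTOR)
    while True:
        frame, _ = sniffer.recvfrom(65535)
        summary = ip_summary(frame)
        if summary:
            src, dst, proto = summary
            event = {"src": src, "dst": dst, "proto": proto}
            report.sendall((json.dumps(event) + "\n").encode())

if __name__ == "__main__":
    run_sensor()
```

In practice the detection logic (signatures, anomaly rules) would run either on the sensor or centrally on the collected events; the point of the sketch is only the sensor-to-collector division of labour.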

In this case study I have evaluated the advantages and disadvantages of mob programming at an IT company called CGM (CompuGroup Medical LAB AB) in Borlänge, Sweden.

Mob programming is a new way to work with code creation and system maintenance. A group of developers works together to solve problems, such as bugs, with only one person at the keyboard at a time. It can be seen as an evolution of pair programming.

I have interviewed a selected number of staff members at CGM.

What are the pros and cons of mob programming for information systems engineers?

I have also conducted a test with university students studying information systems science at Dalarna University. The students tried solving a problem with mob programming and then evaluated the experience in a questionnaire.