miun.se Publications
Results 51-100 of 359
  • 51.
    Bäckstedt, Dennis
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Felsökning och optimering av trådlöst nätverk IEEE 802.11ac (2017). Independent thesis, Basic level (university diploma), 10 credits / 15 HE credits. Student thesis.
    Abstract [en]

    Wireless networks are growing ever more common, and the norm is now that in an office or at home you can settle down with your laptop and work without having to worry about network outlets or wires. IP telephony and tablets are now a natural part of many users' daily lives, and IP telephony in particular places high demands on the network to which it is connected. I have therefore chosen to look in this report at the 802.11 wireless technology developed by the IEEE standardization organization. The method I chose to accomplish this was to perform a measurement of an existing wireless network and then investigate the resulting data in order to submit a proposal for a better network design. As a tool for carrying out this measurement I used equipment from Ekahau, a Finnish company whose main focus is measurement, optimization and troubleshooting of wireless networks. What I found was that the wireless network had major shortcomings. In particular, coverage was a matter of concern: not just a lack of coverage, but in some places too good coverage, resulting in sticky clients. I have then presented a design that, in order to minimize cost and environmental impact, uses existing equipment, but requires that it be supplemented with 9 new access points.

  • 52.
    Cai, Zhipeng
    et al.
    Georgia State Univ, Atlanta, GA, USA.
    Chang, Rong N.
    IBM TJ Watson Res Ctr, Yorktown Hts, NY USA.
    Forsström, Stefan
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Kos, Anton
    Univ Ljubljana, Ljubljana, Slovenia.
    Wang, Chaokun
    Tsinghua Univ, Beijing, Peoples R China.
    Privacy in the Internet of Things (2018). In: Wireless Communications & Mobile Computing, ISSN 1530-8669, E-ISSN 1530-8677, Vol. 2018, article id 8281379. Article in journal (Other academic).
  • 53.
    Carlberg, Michael
    et al.
    Örebro Univ, Örebro.
    Koppel, Tarmo
    Tallinn Univ Technol, Tallinn, Estonia.
    Ahonen, Mikko
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Hardell, Lennart
    Örebro Univ, Örebro.
    Case-control study on occupational exposure to extremely low-frequency electromagnetic fields and glioma risk (2017). In: American Journal of Industrial Medicine, ISSN 0271-3586, E-ISSN 1097-0274, Vol. 60, no 5, p. 494-503. Article in journal (Refereed).
    Abstract [en]

    Background

    Exposure to extremely low-frequency electromagnetic fields (ELF-EMF) was in 2002 classified as a possible human carcinogen, Group 2B, by the International Agency for Research on Cancer at WHO.

    Methods

    Lifetime occupations were assessed in case-control studies during 1997-2003 and 2007-2009. An ELF-EMF job-exposure matrix was used for associating occupations with ELF exposure (μT). Cumulative exposure (μT-years), average exposure (μT), and maximum exposed job (μT) were calculated.

    Results

    Cumulative exposure gave for astrocytoma grade IV (glioblastoma multiforme) in the time window 1-14 years odds ratio (OR) = 1.9, 95% confidence interval (CI) = 1.4-2.6, p linear trend < 0.001, and in the time window 15+ years OR = 0.9, 95% CI = 0.6-1.3, p linear trend = 0.44, in the highest exposure categories 2.75+ and 6.59+ μT-years, respectively.

    Conclusion

    An increased risk in late stage (promotion/progression) of astrocytoma grade IV for occupational ELF-EMF exposure was found.

  • 54.
    Carlberg, Michael
    et al.
    Örebro Univ, Örebro.
    Koppel, Tarmo
    Tallinn Univ Technol, Tallinn, Estonia.
    Ahonen, Mikko
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Hardell, Lennart
    Örebro Univ, Örebro.
    Case-Control Study on Occupational Exposure to Extremely Low-Frequency Electromagnetic Fields and the Association with Meningioma (2018). In: BioMed Research International, ISSN 2314-6133, E-ISSN 2314-6141, article id 5912394. Article in journal (Refereed).
    Abstract [en]

    Objective. Exposure to extremely low-frequency electromagnetic fields (ELF-EMF) was in 2002 classified as a possible human carcinogen, Group 2B, by the International Agency for Research on Cancer at WHO, based on an increased risk for childhood leukemia. In case-control studies on brain tumors during 1997-2003 and 2007-2009 we assessed lifetime occupations in addition to exposure to different agents. The INTEROCC ELF-EMF job-exposure matrix was used for associating occupations with ELF-EMF exposure (μT) with meningioma. Cumulative exposure (μT-years), average exposure (μT), and maximum exposed job (μT) were calculated. Results. No increased risk for meningioma was found in any category. For cumulative exposure in the highest exposure category, 8.52+ μT-years, odds ratio (OR) = 0.9, 95% confidence interval (CI) = 0.7-1.2, and p linear trend = 0.45 were calculated. No statistically significant risks were found in different time windows. Conclusion. In conclusion, occupational ELF-EMF exposure was not associated with an increased risk for meningioma.

  • 55.
    Carlson, David
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    UTM-verktyg åt Roxtec (2017). Independent thesis, Basic level (university diploma), 10 credits / 15 HE credits. Student thesis.
    Abstract [sv]

    This report covers a project carried out at Roxtec AB, where the purpose has been to simplify the handling of so-called UTM addresses. We go through the problem statement and how we chose to solve it, and then describe how the problems were solved; the main objective has been to create a simple tool for generating the aforementioned addresses. Some extra functionality around these features is also covered, such as saving history, login, and so on. The project came to contain many lines of code, which the report does not cover in detail, apart from some of the more significant parts of the project. For the more significant sections we go through their function and, with illustrations, try to explain their purpose in a simple way. Finally, we review and discuss how well we succeeded in solving the problems we were faced with. The project was built in .NET and in software suitable for carrying out and solving the stated problems.

  • 56.
    Carlsson, Marcus
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Problemen i ett utvecklingsteam: Inriktning mot versionshantering och agil utveckling (2017). Independent thesis, Basic level (university diploma), 10 credits / 15 HE credits. Student thesis.
    Abstract [sv]

    The goal of my report has been to investigate the difficulties and problems that developers at Barnebys can face in their daily work. To limit the scope, I have chosen two large areas to investigate: the agile way of working and version control systems. The most important focus of the report, however, is the problems that arise from several developers working on the same project, in so-called teams. The report covers a theoretical background to both topics, after which I conduct a survey among the employees of Barnebys' technology department. The survey is carried out with the help of Google Forms and contains questions on these topics. The report also presents suggestions for solutions and/or improvements, and concludes with a presentation of the results and my own reflections.

  • 57.
    Chen, Jian
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Maintaining Stream Data Distribution Over Sliding Window (2018). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis.
    Abstract [en]

    In modern applications, analyzing order statistics over the most recent parts of high-volume, high-velocity stream data is a major challenge. Some online quantile algorithms can keep a sketch of the data in the sliding window and answer quantile or rank queries in a very short time, but most of them use the GK algorithm as a subroutine, which is not known to be mergeable. In this paper, we propose another algorithm to maintain a sketch of the order statistics over sliding windows. For the fixed-size window, the existing algorithms cannot maintain correctness while the sliding window is updated. Our algorithm not only maintains correctness but also achieves performance similar to the optimal algorithm: under the requirement of maintaining correctness, its insert and query times are close to the best results, while the other algorithms cannot maintain correctness. In addition to the fixed-size window algorithm, we also provide a time-based window algorithm in which the window size varies over time. Last but not least, we provide a window aggregation algorithm which helps extend our algorithm to distributed systems.
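
    As a point of reference for the queries discussed in this abstract, the following is a minimal, naive Python sketch of a fixed-size sliding window that answers rank and quantile queries exactly by keeping a sorted copy of the window. It is not the mergeable sketch proposed in the thesis, and all class and parameter names are invented for illustration; a real streaming summary would use sublinear space.

        # Naive exact baseline: keep the window items both in arrival order and sorted.
        import bisect
        from collections import deque

        class SlidingWindowQuantiles:
            def __init__(self, window_size: int):
                self.window_size = window_size
                self.window = deque()      # items in arrival order
                self.sorted_items = []     # same items kept sorted for rank queries

            def insert(self, value: float) -> None:
                self.window.append(value)
                bisect.insort(self.sorted_items, value)
                if len(self.window) > self.window_size:  # evict the oldest item
                    expired = self.window.popleft()
                    self.sorted_items.pop(bisect.bisect_left(self.sorted_items, expired))

            def rank(self, value: float) -> int:
                # number of items in the current window that are <= value
                return bisect.bisect_right(self.sorted_items, value)

            def quantile(self, q: float) -> float:
                # q in [0, 1]; returns the item at that relative rank
                idx = min(int(q * len(self.sorted_items)), len(self.sorted_items) - 1)
                return self.sorted_items[idx]

        sw = SlidingWindowQuantiles(window_size=1000)
        for x in range(10_000):
            sw.insert(x % 377)
        print(sw.quantile(0.5), sw.rank(100))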

  • 58.
    Comstedt, Erik
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Effect of additional compression features on h.264 surveillance video (2017). Independent thesis, Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis.
    Abstract [en]

    In the video surveillance business, a recurring topic of discussion is quality versus data usage. A higher quality allows more details to be captured at the cost of a higher bit rate, and for cameras monitoring events 24 hours a day, limiting data usage can quickly become a factor to consider. The purpose of this thesis has been to apply additional compression features to an h.264 video stream and evaluate their effects on the video's overall quality. Using a surveillance camera, recordings of video streams were obtained. These recordings had constant GOP and frame rates. By breaking down one of these videos into an image sequence, it was possible to encode the image sequence into video streams with variable GOP/FPS using the software FFmpeg. Additionally, a user test was performed on these video streams, following the DSCQS standard from the ITU-R recommendation. The participants had to subjectively determine the quality of the video streams. The results from these tests showed that the participants did not notice any considerable difference in quality between the normal videos and the videos with variable GOP/FPS. Based on these results, the thesis has shown that additional compression features can be applied to h.264 surveillance streams without having a substantial effect on the video streams' overall quality.
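
    A hedged sketch of the re-encoding step described above: invoking FFmpeg from Python to encode an image sequence to H.264 with a chosen GOP length and frame rate. The file names, frame-number pattern and parameter values are illustrative assumptions, not those used in the thesis.

        # Re-encode an image sequence with libx264, controlling GOP length and frame rate.
        import subprocess

        def encode_h264(pattern: str, out_file: str, fps: int, gop: int) -> None:
            subprocess.run(
                [
                    "ffmpeg",
                    "-framerate", str(fps),   # input frame rate of the image sequence
                    "-i", pattern,            # e.g. "frames/frame_%04d.png"
                    "-c:v", "libx264",        # H.264 encoder
                    "-g", str(gop),           # GOP length (distance between I-frames)
                    "-r", str(fps),           # output frame rate
                    out_file,
                ],
                check=True,
            )

        encode_h264("frames/frame_%04d.png", "gop100_fps15.mp4", fps=15, gop=100)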

  • 59.
    Conti, Caroline
    et al.
    University of Lisbon, Portugal.
    Soares, Luis Ducla
    University of Lisbon, Portugal.
    Nunes, Paulo
    University of Lisbon, Portugal.
    Perra, Cristian
    University of Cagliari, Italy.
    Assunção, Pedro Amado
    Instituto de Telecomunicações and Politécnico de Leiria, Portugal.
    Sjöström, Mårten
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Li, Yun
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Olsson, Roger
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Jennehag, Ulf
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Light Field Image Compression (2018). In: 3D Visual Content Creation, Coding and Delivery / [ed] Assunção, Pedro Amado, Gotchev, Atanas, Cham: Springer, 2018, p. 143-176. Chapter in book (Refereed).
  • 60.
    Dahlberg, Ted
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Byte av Ärendehanteringssystem: Förstudie och REST API av kund-databas (2018). Independent thesis, Basic level (university diploma), 10 credits / 15 HE credits. Student thesis.
    Abstract [sv]

    Leeroy Group AB is an IT product company in Sundsvall that also handles support for its products. This is done in an in-house ticket management system that has over time been developed and tailored to the needs of the business. Leeroy is now looking for a more powerful ticket management system that can handle more tickets as the company grows. This thesis project has included a thorough pre-study among the employees at Leeroy. The requirements and needs of Leeroy's support handling have been mapped and compiled into a requirements specification. Leeroy has chosen to look at two selected ticket management systems: Jira Service Desk and Zendesk. Based on the resulting requirements specification, documentation on these requirements and needs has been gathered in order to present a suitable alternative for Leeroy. To make the switch an even smoother process, a REST API and a simple web application have also been developed to retrieve customer data from the existing customer database. The web application serves as a proof-of-concept application where a search on store name or store number returns all the relevant information that Leeroy's support staff need.
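
    To illustrate the kind of REST lookup described above, here is a minimal Flask sketch of a store-search endpoint; the route, field names and in-memory data are invented stand-ins for Leeroy's actual API and customer database.

        from flask import Flask, jsonify, request

        app = Flask(__name__)

        # Stand-in for the existing customer database.
        STORES = [
            {"store_number": "1001", "store_name": "Demo Store", "contact": "support@example.com"},
        ]

        @app.route("/api/stores")
        def find_store():
            # Match the query against store name (substring) or store number (exact).
            query = request.args.get("q", "").lower()
            hits = [s for s in STORES
                    if query in s["store_name"].lower() or query == s["store_number"]]
            return jsonify(hits)

        if __name__ == "__main__":
            app.run(port=5000)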

  • 61.
    Dahlin, Karl
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Hashing algorithms: A comparison for blockchains in Internet of things (2018). Independent thesis, Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis.
    Abstract [en]

    In today's society, blockchains and the Internet of Things are two much-discussed subjects, which has led to thoughts about combining them by using a blockchain in the Internet of Things. The objective of this study has been to determine which hashing algorithm is best suited for a blockchain used in an Internet of Things network. This was done by first selecting a few hashing algorithms and setting up a scenario where a blockchain can be used in an Internet of Things network, and then specifying what to compare: speed, input length and output length. The study was conducted with the aid of literature studies of the hashing algorithms and a program that implements the algorithms and tests their speed. The study has shown that, of the selected hashing algorithms MD5, SHA-256 and SHA3-256, and under the conditions specified for the study, SHA3-256 is the best option if speed is not of the utmost importance in the scenario, since it comes from a newer standard and does not have a maximum input length. If speed is very important, in other words if SHA3-256 is too slow, then SHA-256 would be best for the Internet of Things network.
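
    The kind of speed comparison described above can be sketched with Python's hashlib, timing MD5, SHA-256 and SHA3-256 on the same payload; the payload size and iteration count are arbitrary choices for illustration, not the thesis setup.

        # Time 100k hashes of a 1 KiB payload for each algorithm and report digest sizes.
        import hashlib
        import timeit

        payload = b"x" * 1024  # e.g. a block of IoT sensor readings

        for name in ("md5", "sha256", "sha3_256"):
            seconds = timeit.timeit(lambda: hashlib.new(name, payload).digest(), number=100_000)
            digest_bits = hashlib.new(name, payload).digest_size * 8
            print(f"{name}: {seconds:.3f} s for 100k hashes, {digest_bits}-bit digest")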

  • 62.
    Danielsson, Erna
    et al.
    Mid Sweden University, Faculty of Human Sciences, Department of Social Sciences.
    Petridou, Evangelia
    Mid Sweden University, Faculty of Human Sciences, Department of Social Sciences.
    Lundgren, Minna
    Mid Sweden University, Faculty of Human Sciences, Department of Social Sciences.
    Olofsson, Anna
    Mid Sweden University, Faculty of Human Sciences, Department of Social Sciences.
    Große, Christine
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Röslmaier, Michael
    Mid Sweden University, Faculty of Human Sciences, Department of Tourism Studies and Geography.
    Risk Communication: A Comparative Study of Eight EU Countries (2018). Report (Other academic).
    Abstract [en]

    How do EU member states communicate risks to their citizens? In this study, we define risk communication as the information provided by different levels of government to citizens regarding possible future crises. The questions serving as departure points for this study are as follows: How is the administrative system for risk communication set up in the countries studied? How are the different risk communication campaigns (provided that they exist) embedded in the larger administrative context? How is risk communication strategy formulated in each country and what kinds of threats are emphasized? In order to tackle these questions, we examine the risk communication strategy of eight countries: Sweden, Finland, Germany, England, France, Estonia, Greece and Cyprus. Our data consist of governmental websites, publications and campaigns, as well as other modes of communication, such as videos posted on YouTube, with questions centering on institutional actors, methods of delivery, content, and effectiveness. We acknowledge that risk communication aims at supporting vulnerable populations and evening out imbalances, but at the same time we flesh out the power dimension of risk. In our analysis, we search for the reproduction of norms and social inequality in risk communication practices. The results show that some patterns emerge regarding the way different EU countries convey information to the public, but they do not hold strictly to geography or administrative system. Digital media are the foremost vehicle of risk communication, and the message generally conveyed is geared towards traditional, middle-class households with the main language of the country as their first language. Volunteer organizations are present in all the countries in question, though not to the same degree. The conveyance of “self-protection” guidelines implicitly places the responsibility for protection on the individual. The results also show that in some countries materiality has become more prevalent than the social dimension of risk in the message the public sector conveys, and that there is a move from focusing on risk to focusing on security.

  • 63.
    Darborg, Alex
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Identifiera känslig data inom ramen för GDPR: Med K-Nearest Neighbors (2018). Independent thesis, Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis.
    Abstract [en]

    The General Data Protection Regulation, GDPR, is a regulation coming into effect on May 25th 2018. Due to this, organizations face large decisions concerning how sensitive data stored in databases is to be identified. Meanwhile, machine learning is expanding on the software market. The goal of this project has been to develop a tool which, through machine learning, can identify sensitive data. The development of this tool has been accomplished through the use of agile methods and has included comparisons of various algorithms and the development of a prototype, using tools such as Spyder and XAMPP. The results show that different types of sensitive data give varying results in the developed software solution. The kNN algorithm showed strong results in cases where the sensitive data concerned Swedish social security numbers of 10 digits, phone numbers of ten or eleven digits starting with 46-, 070, 072 or 076, and addresses. Regular expressions showed strong results for e-mail addresses and IP addresses.
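
    As a hedged illustration of the kNN approach mentioned above (not the thesis implementation), the following scikit-learn sketch classifies field values as sensitive or not using k-nearest neighbours over character n-gram features; the training examples, feature choice and k are assumptions made for the example.

        # Classify short field values with kNN over character n-gram counts.
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.pipeline import make_pipeline

        train_values = ["8001011234", "7512249876", "0701234567", "hello world", "green", "42"]
        train_labels = ["ssn", "ssn", "phone", "other", "other", "other"]

        model = make_pipeline(
            CountVectorizer(analyzer="char", ngram_range=(1, 2)),  # character uni/bigrams
            KNeighborsClassifier(n_neighbors=3),
        )
        model.fit(train_values, train_labels)

        # Predictions for unseen values; labels depend entirely on the toy training data.
        print(model.predict(["0761112233", "purple"]))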

  • 64.
    Dibo, Alexandra
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Challenges when making extensive changes to software processes: A case study on a software development department at Scania CV AB (2017). Independent thesis, Advanced level (professional degree), 20 credits / 30 HE credits. Student thesis.
    Abstract [en]

    Organizations go through a variety of change processes during their lifetime, and many of these are necessary for the organizations to maintain their competitiveness. However, a large share of change efforts never achieve their goals, and about 70 percent of all change efforts are considered unsuccessful. A reason for the high percentage of failures is the inability to deal with the challenges that often arise in connection with the change process. In order for a change effort to be successful, it is therefore crucial to be prepared and to know how to identify and handle the challenges and resistance that may arise during the change process.

    The aim of the study has been to identify the challenges of extensive change efforts in software development organizations. Two extensive changes at a software development department at Scania CV AB, (1) changing the software itself by making the software structure modular and (2) changing the software development process by adopting agile methods, have been used as a case study.

    An overall fear of and resistance towards extensive changes was identified. In addition, four main challenges were identified with the first change: difficulties with the software development process, lack of vision and communication from management, fear and uncertainty, and lack of resources and tools. Two challenges were identified with the second change: that it was time consuming, and lack of resources and tools.

    The difficulties with the software development process showed that the major challenge with the modular software structure was maintaining it. The remaining challenges, however, have previously been identified in several studies and can all be related to causes of resistance.

    Also, a comparison between the two changes was made to identify their similarities and differences. This was done to further understand whether the differences between the changes could be related to the challenges. The comparison indicates that a change effort with a clear vision, good communication and management involvement is less likely to encounter challenges.

  • 65.
    Dima, Elijs
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Multi-Camera Light Field Capture: Synchronization, Calibration, Depth Uncertainty, and System Design (2018). Licentiate thesis, comprehensive summary (Other academic).
    Abstract [en]

    The digital camera is the technological counterpart to the human eye, enabling the observation and recording of events in the natural world. Since modern life increasingly depends on digital systems, cameras and especially multiple-camera systems are being widely used in applications that affect our society, ranging from multimedia production and surveillance to self-driving robot localization. The rising interest in multi-camera systems is mirrored by the rising activity in Light Field research, where multi-camera systems are used to capture Light Fields - the angular and spatial information about light rays within a 3D space. 

    The purpose of this work is to gain a more comprehensive understanding of how cameras collaborate and produce consistent data as a multi-camera system, and to build a multi-camera Light Field evaluation system. This work addresses three problems related to the process of multi-camera capture: first, whether multi-camera calibration methods can reliably estimate the true camera parameters; second, what are the consequences of synchronization errors in a multi-camera system; and third, how to ensure data consistency in a multi-camera system that records data with synchronization errors. Furthermore, this work addresses the problem of designing a flexible multi-camera system that can serve as a Light Field capture testbed.

    The first problem is solved by conducting a comparative assessment of widely available multi-camera calibration methods. A special dataset is recorded, giving known constraints on camera ground-truth parameters to use as reference for calibration estimates. The second problem is addressed by introducing a depth uncertainty model that links the pinhole camera model and synchronization error to the geometric error in the 3D projections of recorded data. The third problem is solved for the color-and-depth multi-camera scenario, by using a proposed estimation of the depth camera synchronization error and correction of the recorded depth maps via tensor-based interpolation. The problem of designing a Light Field capture testbed is addressed empirically, by constructing and presenting a multi-camera system based on off-the-shelf hardware and a modular software framework.

    The calibration assessment reveals that target-based and certain target-less calibration methods are relatively similar at estimating the true camera parameters. The results imply that for general-purpose multi-camera systems, target-less calibration is an acceptable choice. For high-accuracy scenarios, even commonly used target-based calibration approaches are insufficiently accurate. The proposed depth uncertainty model is used to show that converged multi-camera arrays are less sensitive to synchronization errors. The mean depth uncertainty of a camera system correlates to the rendered result in depth-based reprojection, as long as the camera calibration matrices are accurate. The proposed depthmap synchronization method is used to produce a consistent, synchronized color-and-depth dataset for unsynchronized recordings without altering the depthmap properties. Therefore, the method serves as a compatibility layer between unsynchronized multi-camera systems and applications that require synchronized color-and-depth data. Finally, the presented multi-camera system demonstrates a flexible, de-centralized framework where data processing is possible in the camera, in the cloud, and on the data consumer's side. The multi-camera system is able to act as a Light Field capture testbed and as a component in Light Field communication systems, because of the general-purpose computing and network connectivity support for each sensor, small sensor size, flexible mounts, hardware and software synchronization, and a segmented software framework. 

  • 66.
    Dima, Elijs
    et al.
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Gao, Yuan
    Institute of Computer Science, Christian-Albrechts University of Kiel, Germany.
    Sjöström, Mårten
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Olsson, Roger
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Koch, Reinhard
    Institute of Computer Science, Christian-Albrechts University of Kiel, Germany.
    Esquivel, Sandro
    Institute of Computer Science, Christian-Albrechts University of Kiel, Germany.
    Estimation and Post-Capture Compensation of Synchronization Error in Unsynchronized Multi-Camera Systems. Manuscript (preprint) (Other academic).
  • 67.
    Dima, Elijs
    et al.
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Sjöström, Mårten
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Olsson, Roger
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Modeling Depth Uncertainty of Desynchronized Multi-Camera Systems (2017). In: 2017 International Conference on 3D Immersion (IC3D), IEEE, 2017. Conference paper (Refereed).
    Abstract [en]

    Accurately recording motion from multiple perspectives is relevant for recording and processing immersive multi-media and virtual reality content. However, synchronization errors between multiple cameras limit the precision of scene depth reconstruction and rendering. In order to quantify this limit, a relation between camera de-synchronization, camera parameters, and scene element motion has to be identified. In this paper, a parametric ray model describing depth uncertainty is derived and adapted for the pinhole camera model. A two-camera scenario is simulated to investigate the model behavior and how camera synchronization delay, scene element speed, and camera positions affect the system's depth uncertainty. Results reveal a linear relation between synchronization error, element speed, and depth uncertainty. View convergence is shown to affect mean depth uncertainty up to a factor of 10. Results also show that depth uncertainty must be assessed on the full set of camera rays instead of a central subset.
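
    The reported linear relation can be illustrated with a back-of-envelope stereo calculation (a simplification, not the paper's ray model): for a parallel two-camera rig with depth Z = f*B/d, a point moving laterally at speed v while one camera lags by dt shifts the disparity by roughly f*v*dt/Z, giving a depth error of about Z*v*dt/B. The function name and example numbers below are invented for illustration.

        # Depth error grows linearly with both the synchronization error and the element speed.
        def depth_uncertainty(Z, v, dt, baseline):
            """Approximate depth error (m) for depth Z (m), lateral speed v (m/s),
            synchronization error dt (s) and camera baseline (m)."""
            return Z * v * dt / baseline

        # Example: a person walking at 1.5 m/s, 5 m away, with a 40 ms sync error
        # and a 0.5 m baseline gives roughly 0.6 m of depth uncertainty.
        print(depth_uncertainty(Z=5.0, v=1.5, dt=0.040, baseline=0.5))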

  • 68.
    Dima, Elijs
    et al.
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Sjöström, Mårten
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Olsson, Roger
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Kjellqvist, Martin
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Litwic, Lukasz
    Ericsson AB.
    Zhang, Zhi
    Ericsson AB.
    Rasmusson, Lennart
    Observit AB.
    Flodén, Lars
    Observit AB.
    LIFE: A Flexible Testbed For Light Field Evaluation (2018). Conference paper (Refereed).
    Abstract [en]

    Recording and imaging the 3D world has led to the use of light fields. Capturing, distributing and presenting light field data is challenging, and requires an evaluation platform. We define a framework for real-time processing, and present the design and implementation of a light field evaluation system. In order to serve as a testbed, the system is designed to be flexible, scalable, and able to model various end-to-end light field systems. This flexibility is achieved by encapsulating processes and devices in discrete framework systems. The modular capture system supports multiple camera types, general-purpose data processing, and streaming to network interfaces. The cloud system allows for parallel transcoding and distribution of streams. The presentation system encapsulates rendering and display specifics. The real-time ability was tested in a latency measurement; the capture and presentation systems process and stream frames within a 40 ms limit.

  • 69.
    Ding, Yuxia
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Internet of Things: Quantitative Evaluation on Microsoft Azure IoT Suite (2017). Independent thesis, Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis.
    Abstract [en]

    The Internet of Things (IoT) is rapidly gaining ground in our daily life, and the number of devices and the amount of data are expected to increase rapidly in the future. Meanwhile, IoT platforms are emerging to let people conveniently deal with these large numbers of devices and volumes of data. The goal of this paper is therefore to perform a quantitative evaluation of Microsoft Azure, one of these IoT platforms, examining its advantages and disadvantages under stress in order to determine whether Azure IoT is fit for the future IoT. To reach this goal, Azure IoT Hub is used as a bridge to connect and manage many IoT devices which send and receive large amounts of data. .NET is used to simulate devices and connect them to the IoT Hub. The two-way communication, from sensor to cloud and from cloud to actuator, is implemented through the MQTT protocol. This paper measures and analyzes three metrics in detail: the response time from a sensor sending a message until an actuator receives it, scalability, and cost. The analysis is also made for a specific scenario with high demands on sensor updates, to see how Azure IoT performs. Finally, conclusions are drawn on the advantages and disadvantages of Microsoft Azure IoT under stress.
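
    A hedged sketch of a simulated sensor publishing telemetry over MQTT, assuming the paho-mqtt 1.x client API; the broker address, credentials and topic are placeholders to be replaced, and Azure IoT Hub in particular requires device-specific authentication and topic conventions that are not shown here.

        import json
        import time

        import paho.mqtt.client as mqtt  # assumes the paho-mqtt 1.x API

        client = mqtt.Client(client_id="simulated-sensor-01")
        client.username_pw_set("device-username", "device-password")  # placeholder credentials
        client.connect("broker.example.com", 1883)                    # placeholder broker
        client.loop_start()

        # Publish ten telemetry messages, one per second, with QoS 1.
        for i in range(10):
            payload = json.dumps({"temperature": 20.0 + 0.1 * i, "seq": i})
            client.publish("devices/simulated-sensor-01/messages/events/", payload, qos=1)
            time.sleep(1)

        client.loop_stop()
        client.disconnect()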

  • 70.
    Domanski, Marek
    et al.
    Poznan University, Poland.
    Grajek, Tomasz
    Poznan University, Poland.
    Conti, Caroline
    University of Lisbon, Portugal.
    Debono, Carl James
    University of Malta, Malta.
    de Faria, Sérgio M. M.
    Instituto de Telecomunicações and Politécnico de Leiria, Portugal.
    Kovacs, Peter
    Holografika, Budapest, Hungary.
    Lucas, Luis F.R.
    Instituto de Telecomunicações and Politécnico de Leiria, Portugal.
    Nunes, Paulo
    University of Lisbon, Portugal.
    Perra, Cristian
    University of Cagliari, Italy.
    Rodrigues, Nuno M.M.
    Instituto de Telecomunicações and Politécnico de Leiria, Portugal.
    Sjöström, Mårten
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Soares, Luis Ducla
    University of Lisbon, Portugal.
    Stankiewicz, Olgierd
    Poznan University, Poland.
    Emerging Imaging Technologies: Trends and Challenges (2018). In: 3D Visual Content Creation, Coding and Delivery / [ed] Assunção, Pedro Amado, Gotchev, Atanas, Cham: Springer, 2018, p. 5-39. Chapter in book (Refereed).
  • 71.
    du Puy, Jakob
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Myndigheters urval, selektering och prioritering av öppna data: En studie om informationsvärdering (2018). Independent thesis, Advanced level (degree of Master (One Year)), 10 credits / 15 HE credits. Student thesis.
    Abstract [sv]

    Open data is becoming an increasingly widespread and cemented part of Swedish government agencies' operations and their pursuit of openness and transparency. The agencies' business processes create an enormous amount of information that has the potential to be published as open data. For a publication to be credible and defensible from a democratic perspective, where open data forms part of an open government, the selection of information for publication must be well thought out and theoretically grounded. This is especially important for open data, since few legal aspects govern the selection and prioritization of which information is to become open data. This study examines on what grounds seven Swedish national agencies make their selection, screening and prioritization of information to be published as open data, with a large focus on the agencies' work with information appraisal. The study is based on eight semi-structured interviews with respondents from seven different national agencies, who answered questions about the open data work carried out by the agency they represent. To analyze the collected data, a categorization and coding grounded in Theodore Schellenberg's theory of primary and secondary values was developed and used to analyze and understand the agencies' information appraisal. The results of the study show that all of the agencies examined base their selection, screening and prioritization of open data primarily on secondary values, meaning that the value of the information is assessed primarily according to its value for the public, the research community, democracy, or the like. Four of the seven agencies also base their selection, screening and prioritization on primary values, meaning that the information is valued according to its value for the agency's own operations. The study also shows that selection, screening and prioritization are strongly based on how many resources the information requires in order to be published as open data.

  • 72.
    Edin, Andreas
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Autentisering med OAuth 2.0 i SiteVision: Jämförelse mellan Java Portlets och WebApps (2018). Independent thesis, Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis.
    Abstract [en]

    The aim of this project has been to explore alternative technical solutions for creating custom extensions in the CMS SiteVision. The purpose of these extensions is to retrieve data from an external API (Office 365) which requires OAuth 2.0 authentication. Additionally, the alternative technical solutions have been evaluated and compared. The comparisons have been made based on criteria developed through interviews with professional IT consultants. The purpose of the project has been to contribute to more efficient digitization, integration and individualization of data systems. Within the project, an applied example (POC) has been created to show how the technology can be used. In this example, Java Portlets have been used to implement the above functionality. WebApps in SiteVision have also been studied, since this technology is an alternative to Java Portlets. The survey shows that it is fully possible to create a custom extension in SiteVision that performs authentication with OAuth 2.0 and then uses it to retrieve data from an external API. The results from the comparison between the two technologies, Java Portlets and WebApps, show that there are pros and cons to each technique. The alternatives studied were comparable in performance; individual circumstances can dictate which alternative is best.
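
    The OAuth 2.0 flow described above can be sketched in Python with the requests library, here using the client-credentials grant; the token URL, API URL, client id/secret and scope are placeholders rather than the exact Office 365 endpoints used in the thesis.

        import requests

        TOKEN_URL = "https://login.example.com/oauth2/v2.0/token"   # placeholder
        API_URL = "https://api.example.com/v1.0/me"                 # placeholder

        # Step 1: exchange client credentials for an access token.
        token_response = requests.post(
            TOKEN_URL,
            data={
                "grant_type": "client_credentials",
                "client_id": "YOUR_CLIENT_ID",
                "client_secret": "YOUR_CLIENT_SECRET",
                "scope": "https://api.example.com/.default",
            },
            timeout=10,
        )
        token_response.raise_for_status()
        access_token = token_response.json()["access_token"]

        # Step 2: call the protected API with the bearer token.
        api_response = requests.get(
            API_URL,
            headers={"Authorization": f"Bearer {access_token}"},
            timeout=10,
        )
        print(api_response.status_code, api_response.json())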

  • 73.
    Edmon, Jannika
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Nya spelregler för skogsindustrin: En studie om möjligheter att gamifiera ett skogsbolags transportstyrningssystem (2018). Independent thesis, Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis.
    Abstract [en]

    Today, companies need to improve constantly in order to be competitive and achieve good results. Gamification is a concept that is being implemented by more and more companies to increase employee motivation and to engage employees to work in a way that helps fulfil the goals of the company, which in the long run can lead to improved results. This study is based on a case with an international IT consulting company and their customer, who operates in the forest industry. The consulting company has developed an IT system for the customer, where timber truck drivers are among the system users. The purpose of the study has been to explore the possibilities of gamifying the existing IT system in order to increase system usage and encourage behaviors of the drivers that are aligned with the customer's goals. Furthermore, the goal of the study has been to follow a gamification framework to develop suggestions for features in the system that can support these behaviors. The framework has been used in combination with different methods for data collection, such as interviews and open questionnaires. The use of the framework has resulted in four features that can be implemented in the system to support certain behaviors of the drivers. All features provide feedback to the user in different ways, following an action taken, to motivate continued use. Prior to implementation, however, it is recommended that measures be taken to ensure that the gamification project is able to succeed.

  • 74.
    Ehrenberg, Mattias
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Ramverket Aurelia och TypeScript (2017). Independent thesis, Basic level (university diploma), 10 credits / 15 HE credits. Student thesis.
    Abstract [sv]

    The purpose of the report is to evaluate the front-end framework Aurelia and the underlying TypeScript, by following the process of creating a web application and through a simple theoretical study, and to present their advantages and disadvantages. The theoretical study looks at functionality, usability and "state of health". Functionality and usability are evaluated through theory, articles written by authorities on the subject, and interviews with experienced developers. "State of health" is studied by examining response times in the communities when design flaws are reported. The web application to be created is a process tool for the internal reporting of status in the company's various projects. The web application must be able to collect data for all projects and present it in an appropriate way, with functions such as sorting, searching and editing. Articles on the subject, interviews and my own experience show that Aurelia has many positive qualities and is a very complete framework. Aurelia works modularly, with "convention over configuration", and builds on web standards, which helps create simple and clear code with good syntax. As a result, a large majority of the authors of articles comparing similar frameworks point to Aurelia as the best choice. The evaluation also showed that reported bugs are answered quickly by the framework's creators. The negative criticism of the framework that has emerged is that, depending on the choice of development environment, getting started can feel time-consuming for a first-time user. Compared with its main competitor Angular 2, Aurelia has fewer users, which means fewer plugins, a smaller community and fewer code examples online. TypeScript provides the underlying functionality and is compiled to JavaScript before execution. TypeScript offers many good features, such as classes, encapsulation of code and the ability to specify types for variables, and it makes it possible to show errors in the code directly while typing and to suggest code. The conclusion is that Aurelia is a very complete and well-functioning framework with a very good and clear code structure.

  • 75.
    Ekelund, Barbro
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Webbutik Second Hand Shop (2018). Independent thesis, Basic level (university diploma), 10 credits / 15 HE credits. Student thesis.
    Abstract [sv]

    This report describes the independent work of creating a web shop for a second-hand store where garments are displayed for sale. The purpose of the project is for the author, together with the supervisor at the web-based magazine Illegal Ground, to develop a website that can easily be administered from an administrator interface and is easily accessible to the customer/visitor. It is tested against, and should comply with, webbriktlinjer.se, meaning that every individual in society should be able to participate on equal terms, regardless of ethnic background or disability. The GDPR, which came into force in May 2018, is complied with through a pop-up window on the start page and at the forms for registering customers and administrators. Among other things, there must be functionality for easily replacing the shop's goods, from garments to articles of another kind, and the application must be able to store information from the various tables so that the administrator can extract data at a later time. One of the pages must be in English, with a link to it on every web page. The web shop should be generally applicable to the second-hand concept, with the ethical aspects that entails, and the application is built on responsive web design and should first and foremost work on mobile devices such as the iPhone 6; the guiding principle is "mobile first". The techniques and tools used are HTML, HTML5, CSS, CSS3, media queries, JavaScript, PHP with a database in the query language SQL, Bootstrap, Adobe Photoshop CC 2017, project management, and more. Course literature from all courses in the web development programme, including recorded lectures and relevant web pages on the internet, was used for study and for deeper investigation of problems that arose during development. The method is primarily to steer the development of the application in the right direction through iterations of prototypes. The result is a web application built mobile first that works well for registering an account and for selecting and ordering from the range of garments for sale. The web shop Second Hand Shop works for both customer and administrator, and the administrator interface contains important functions for organizing a company's records.

  • 76.
    Ekman, Björn
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Bilindustrins förmåga att hantera förändringar i affärsmodellerna: Hur den uppkopplade bilen påverkar bilindustrins affärsmodeller (2017). Independent thesis, Advanced level (professional degree), 20 credits / 30 HE credits. Student thesis.
    Abstract [en]

    The development of IT has enabled the Internet of Things: connected devices that communicate with each other and generate data that can then be used for different purposes. The number of connected devices is estimated to reach 20.4 billion by the year 2020. The technology allows the automotive industry to connect its vehicles, which is called the Internet of Vehicles, an application of the Internet of Things and intelligent cars. The scientific literature that addresses the economic aspects related to the Internet of Things is scarce, and the area needs to be investigated further. The connected car is a disruptive technology that is expected to affect the automotive industry and its business models radically in the next few years, and nobody really knows what the industry may look like in the next 10-15 years. This type of disruptive technology requires companies to have the competence to implement innovative business models. The study focuses on the connected car from a business perspective and does not go into technical aspects or security challenges. The purpose of the study is to create an understanding of the challenges facing automotive business models with the introduction of the Internet of Things, which gives the automotive industry the opportunity to connect its cars to a greater extent than before. The study's results show that the value proposition and the channels are the most important parts of the business model to focus on. In order for companies to capitalize on connectivity, it is extremely important that customers understand the value of the connection and that companies properly manage the information derived from the connected cars. Today, companies offer services or increased value in existing business models to strengthen their brand. The respondents share the opinion that the traditional business models will not change radically as long as the ownership of the car stays the same. Ownership and autonomous cars are the factors that are expected to affect the automotive industry the most. According to the respondents in the study, dealing with disruptive innovations in the industry requires courage, adaptability, foresight and innovative thinking.

  • 77.
    Ekström, Marcus
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Communication tool in virtual reality – A telepresence alternative: An alternative to telepresence – bringing the shared space to a virtual environment in virtual reality (2017). Independent thesis, Advanced level (professional degree), 20 credits / 30 HE credits. Student thesis.
    Abstract [en]

    Videoconferencing is one of the most common telepresence methods today, and educational videos are rising in popularity among distance learners. Traditional videoconferencing is unable to convey gestures and mutual eye contact between participants. This study aims to propose a virtual reality telepresence solution using game engines. A literature study confirmed that the effectiveness achieved in VR is comparable to the effectiveness of face-to-face meetings. The suggested solution implements whiteboard functionality from a real-life perspective, confirming that it is possible to include new functionality and directly transfer existing functionality from today's communication systems to the VR system. The system was evaluated based on response time, packet loss, bandwidth, frame rate and user tests. The evaluation shows that it is possible to design a telepresence system in VR capable of passing the Turing Test for Telepresence. The participants of the user tests did not experience discomfort and were positively inclined towards the telepresence system. However, discomfort may emerge if the VR character is used with a common office workstation. Future studies on this topic would involve modifications of the third-person camera, making the head's rotation follow the direction of the camera view, and implementing movable eye pupils on the VR character using the upcoming eye-tracking accessory.

  • 78.
    Ekstål, Simon
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Kommunikationslösning för GATA-systemet (2017). Independent thesis, Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis.
    Abstract [en]

    Sogeti is an IT consulting company active in many countries. It has many different assignments and develops systems for companies in different industries. One of these assignments is a system called GATA, made for SCA's business branch SCA SKOG. GATA stands for GPS Assisted Transport Announcement and is a comprehensive solution for timber transport from forest to industry. At this point in time, messages that do not belong to the system's main data must be sent and received outside the system. The purpose of this project has been to create a communication solution that addresses this for the system. The basic objective has been to create a local communication solution and then integrate it with the system, in a structured manner during sprints and with a proof-of-concept model. A local communication solution has been created, consisting of a server with a message component, a console application for creating and sending messages, and a website for receiving and presenting messages. The local communication solution has been shown to be consistent with the basic objective. Thereafter the local communication solution was integrated with the GATA system: a message component was created on the system server, a console application was created within the system, and a component was created on the system's website. This integrated communication solution imitates and can perform the same operations as the local communication solution and has been adapted to the GATA system. The integration, and thus the entire project, has been shown to be successful according to the basic target objective.

  • 79.
    Eliasson, Pontus
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Användning av greylisting för att filtrera skräppost för myndigheter (2018). Independent thesis, Basic level (university diploma), 10 credits / 15 HE credits. Student thesis.
    Abstract [en]

    This thesis investigates the usability of greylisting as a means of filtering spam email from the perspective of a Swedish government agency that has legal obligations to be reachable by email and is therefore limited in the ways incoming email may be filtered. By setting up a virtual environment, a few programs for sending bulk mail were tested, and greylisting proved to be very effective at filtering email sent from clients that do not fully support SMTP's functions for retransmission as listed in the RFC. Greylisting has a built-in disadvantage in the way email is filtered: all email from senders that have not been seen before is delayed. In my tests, and with my Postgrey settings, the average delay was approximately 17 minutes.
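
    The core greylisting decision can be sketched as follows (a simplification, not Postgrey itself): each new (client IP, sender, recipient) triplet is temporarily rejected and only accepted once the client retries after a minimum delay. The delay value and function name are illustrative assumptions.

        import time

        GREYLIST_DELAY = 5 * 60          # seconds a new triplet must wait before retrying
        first_seen: dict[tuple[str, str, str], float] = {}

        def check(client_ip: str, sender: str, recipient: str) -> str:
            """Return an SMTP-style verdict for an incoming delivery attempt."""
            triplet = (client_ip, sender, recipient)
            now = time.time()
            if triplet not in first_seen:
                first_seen[triplet] = now
                return "450 try again later"   # temporary rejection, legitimate MTAs retry
            if now - first_seen[triplet] >= GREYLIST_DELAY:
                return "250 accepted"          # retried after the delay: let it through
            return "450 try again later"       # retried too soon

        print(check("192.0.2.10", "a@example.org", "b@example.net"))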

  • 80.
    Eliasson, Stefan
    et al.
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Vängman, Mattias
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Integrering av ny teknologi: Modernisera ett befintligt IT-system med webbtjänst (2018). Independent thesis, Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis.
    Abstract [en]

    Older IT systems are still operational today and may need modernization, for example by integrating a web service developed with new technology. Cybercom is in a situation where this is the case. They have a customer named FÖRETAG X, and this customer uses an application called Master Security which is developed with old technology. Every Master Security installation has a connected local database. These applications and local databases are located at FÖRETAG X's customers. When an update is available for these databases, a reinstallation of Master Security is required. The goal of this project is to create a new implementation of the system in which a web service handles updates, to avoid reinstalling Master Security. The web service shall be developed with new technology and be able to work with Master Security. User tests and time measurements were conducted and then evaluated. The final implementation is created in and for a Windows environment and consists of a server side and a client side. The server side consists of a web service that uses Windows Communication Foundation and a connected local database that stores updates. The updates on the server side can be downloaded from the client side, which consists of an application, a local database, and an external module in the form of a DLL library. The DLL library is the solution for communication between the old and new technology, using a COM interface. The graphical user interface Script Sender was developed to upload new updates. The results from the user tests show that the new user interfaces are easy to work with, but there is room for improvement. The results from the time measurements show that the new implementation is faster than the existing system. The conclusion is that, in this case, it is possible to extend the existing IT system with new technology instead of building it from the ground up.

  • 81.
    Ellström, Jonathan
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Exploring IBM Integration Designer (2017). Independent thesis, Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis.
    Abstract [en]

    The interest in Business Process Management (BPM) is increasing in Sweden. Government agencies such as the Swedish National Board of Student Aid (CSN), the Swedish Companies Registration Office (Bolagsverket) and the Swedish Social Insurance Agency (Försäkringskassan) are implementing BPM in their organizations. Sogeti is an IT consulting company with employees who work at CSN with BPM, and one of the tools they use for integration is IBM Integration Designer. Since this technology is new and gaining popularity, there is a need for increased knowledge about it. This thesis report explores the tool IBM Integration Designer with regard to its different ways of integrating with systems, and compares the different options for exposing the integration solutions. By researching documentation from IBM, and by learning how to use the tool itself, knowledge could be acquired about IBM Integration Designer. The result was an implementation of the five different export bindings SCA, HTTP, SOAP over HTTP, Enterprise JavaBeans (EJB) and Java Messaging Service (JMS), a comparison of these export bindings, and finally a service in IBM Integration Designer that accesses a database and uses an external SMS API to send text messages. The result has been satisfactory with respect to the purpose of this project in giving insight into IBM Integration Designer, one of the popular tools for integrating BPM.

  • 82.
    Embretsen, Axel
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Implementation av virtualiseringstjänst: Skillnader eller brist på det gällande hypervisorer2017Independent thesis Basic level (university diploma), 10 credits / 15 HE creditsStudent thesis
    Abstract [en]

    There is a lot to consider when implementing a virtualization service, and a mistake can be costly later on, since migrating virtual machines from the system and technologies currently in use to new ones can be a troublesome task.

    The aim of this project is to compare different virtualization technologies, mainly hypervisors, to see how they differ in features and performance (or whether they do at all), in order to better understand which technology is appropriate for which purpose and to be able to recommend how a virtualization service should be implemented.

    To accomplish this, I set up a single host machine hosting two different virtual servers: a web server with high security requirements and a video conference server with high quality requirements.

    The host was to be tested in six different configurations, each using a different hypervisor.

    Due to hardware constraints, however, half of the configurations had to be dropped, leaving those using KVM, Xen and LXD.

    To better understand the security and quality requirements that the virtualization service and its components should meet, I created a survey and sent it to people in the industry. It yielded some interesting information, but it lacked detail and was hard to use for any comparisons.

    In the end, the comparisons showed that in most cases it makes little difference which hypervisor is used, and in the remaining cases the results were hard to interpret.

    As for security and quality, there were some minor differences, but they were of little importance.

    LXD does show some minor advantages in specific situations, but at the cost of being unable to run non-Linux virtual machines.

  • 83.
    Engelke, U.
    et al.
    Commonwealth Scientific and Industrial Research Organisation (CSIRO), Hobart, Australia.
    Darcy, D.P.
    Dolby Laboratories, San Francisco, USA.
    Mulliken, G.H.
    Dolby Laboratories, San Francisco, USA.
    Bosse, S.
    Fraunhofer Institute for Telecommunications, Berlin, Germany.
    Martini, M.G.
    Kingston University, London, UK.
    Arndt, S.
    Norwegian University of Science and Technology (NTNU), Trondheim, Norway.
    Antons, J.-N.
    Technische Universität Berlin, Germany.
    Chan, K.Y.
    Curtin University, Perth, Australia.
    Ramzan, N.
    University of the West of Scotland, Hamilton, UK.
    Brunnström, Kjell
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology. Acreo Swedish ICT AB.
    Psychophysiology-based QoE Assessment: A Survey2017In: IEEE Journal on Selected Topics in Signal Processing, ISSN 1932-4553, E-ISSN 1941-0484, Vol. 11, no 1, p. 6-21, article id 7569001Article in journal (Refereed)
    Abstract [en]

    We present a survey of psychophysiology-based assessment for Quality of Experience (QoE) in advanced multimedia technologies. We provide a classification of methods relevant to QoE and describe related psychological processes, experimental design considerations, and signal analysis techniques. We summarise multimodal techniques and discuss several important aspects of psychophysiology-based QoE assessment, including the synergies with psychophysical assessment and the need for standardised experimental design. This survey is not considered to be exhaustive but serves as a guideline for those interested in further exploring this emerging field of research.

  • 84.
    Englevid, Jonas
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Robotic Process Automation: Analys och implementation2018Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE creditsStudent thesis
    Abstract [en]

    Employees today have necessary daily tasks that do not actually require human handling. The objective is to investigate whether two processes are suitable for automation, and to create and evaluate a prototype. The goals are to analyze the processes, examine appropriate tools for automation, compare the tools, create and evaluate a prototype, and perform an acceptance test. Robotic Process Automation is about automating tasks that humans otherwise have to do. Good candidates for automation are time-consuming, repetitive, rule-based tasks, prone to human errors, with clear goals and expectations. The preliminary study was conducted in the form of a literature study of web-based sources, and the analysis was done by breaking down the process into its different parts. The comparison was carried out by investigating the features of the tools. The prototype was created on Windows with the UiPath tool; the robot works in Internet Explorer and Excel, where a macro written in Visual Basic for Applications is used. The client looks at the given criteria and the prototype’s output and provides a subjective response. UiPath, Workfusion and Selenium test programs were created. The prototype automatically logs in to Visma PX by entering username and password. It then navigates, searches for an assignment and retrieves the available data. The data is filtered and entered into Excel for each activity and employee. Finally, a macro creates graphs. Time tests show that UiPath is significantly more optimized and faster at completing the test programs. UiPath has strong benefits with its tools.
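
    As a hedged illustration of the kind of browser automation compared in the thesis, the following Java sketch uses Selenium WebDriver to log in to a web application and read a value. The URL, element identifiers and credentials are hypothetical; the actual prototype was built in UiPath against Visma PX and Internet Explorer, so this only shows the general log-in-and-scrape pattern.

        import org.openqa.selenium.By;
        import org.openqa.selenium.WebDriver;
        import org.openqa.selenium.chrome.ChromeDriver;

        public class LoginBot {
            public static void main(String[] args) {
                WebDriver driver = new ChromeDriver();
                try {
                    // Hypothetical URL and element ids, for illustration only.
                    driver.get("https://example.invalid/login");
                    driver.findElement(By.id("username")).sendKeys("robot-user");
                    driver.findElement(By.id("password")).sendKeys("secret");
                    driver.findElement(By.id("loginButton")).click();

                    // Navigate to an assignment and read a value that would later be written to Excel.
                    driver.get("https://example.invalid/assignments/42");
                    String hours = driver.findElement(By.cssSelector("td.reported-hours")).getText();
                    System.out.println("Reported hours: " + hours);
                } finally {
                    driver.quit();
                }
            }
        }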

  • 85.
    Engström, Anna
    et al.
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Mawlood, Hezwan
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Processutveckling - En fallstudie för att undersöka hur integrationen av BPR och TQM  kan användas i olika faser av kvalitetsutvecklingen2017Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE creditsStudent thesis
    Abstract [en]

    In today’s society, it is necessary for organizations to continuously improve their strategic processes in order to retain competitiveness in the market. In a competitive industry, it is very important that an organization’s internal strategy is smooth and effective. So, which concept or method will most efficiently improve these processes? Will a combination of two different improvement concepts achieve a better result? This study is based on the integration of two methods, applied to two internal processes in a sales agent organization. The two methods are Business Process Reengineering (BPR) and Total Quality Management (TQM). The course of action is systematically described and analyzed to give a clear overview of each process before and after the developed improvements. The purpose of our study is to investigate whether the organization’s internal working routines can be simplified and made more efficient by integrating the two suggested improvement methods. As the study was performed on two different processes, two results were achieved. The organization is already performing one of these processes today, which allows us to accurately follow the developments and improvements. The second process, however, has not been developed further than the early stage of implementation, and it is therefore currently difficult to see a clear result of this change. Instead, an improvement suggestion was delivered to the organization for possible future use, with the purpose of achieving an efficient internal process and thereby creating a competitive advantage in the market. The study has shown that a corporate culture that is open to change, together with committed leadership, is beneficial during development work.

  • 86.
    Engvall, Tove
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Arkivariers roll och perspektiv i arbete med öppna data2018In: Arkiv, Information, Teknik, no 2, p. 12-13Article in journal (Other (popular science, discussion, etc.))
    Abstract [sv]

    Digitalization is often seen as a further industrial revolution, in which information is the central raw material. Open data makes it possible to combine and analyse information from different sources, and the idea is that this can form the basis for new knowledge, innovative services and increased openness that can improve society in various ways. But it also entails risks and ethical considerations that are important to take into account, and here archivists have an important role.

  • 87.
    Engvall, Tove
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Fear, Greed and Lack of Trust in Online Financial Trade2017In: Journal of Administrative Sciences and Technology, ISSN 2165-9435, article id 106163Article in journal (Refereed)
    Abstract [en]

    Trust is a crucial component in business relations, and also a precondition for people’s adoption of electronic services. This study addresses challenges regarding trust in online financial trade. Results indicate that brokerage companies acting in a dishonest way, along with dominant patterns of fear, greed and lack of knowledge and awareness among clients, lead to frequent losses of money, which in turn leads to a lack of trust in brokers, brokerage companies and the business domain. The way technology is used increases the risks and the unequal relation between clients and brokerage companies. Self-ethnography has been used as the method, with observations, interviews and conversations. The suggestion is to conduct further research to identify how mechanisms for trustworthiness, accountability and transparency can be created in the business domain, in which the management of records will play a central role.

  • 88.
    Engvall, Tove
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    In-equalities, Regulation, Ethics & Records - reflections from a case in Online Trade2018In: Archives of Business Research, ISSN 2054-7404, Vol. 6, no 8, p. 135-147Article in journal (Refereed)
    Abstract [en]

    In an ethnographic study of online trade on the financial market, different forms of inequalities in client-broker/business relations have been found. They are expressed, for example, in the design of technology, control of and access to information, competence, and misuse of trust. Trust is at the center of concern, since it is both crucial for success and a risk of being taken advantage of. This relates to power between actors, responsibilities and accountability. This is where records’ role as evidence of human acts and conduct becomes central, both in relation to regulation and to business ethics. As the established structures for ensuring rights and obligations in society are challenged by digitalization and globalization, these have to be recreated in the online context. The article concludes that both regulation and ethics are crucial in forming common values, behaviours and a shared understanding of trust. Otherwise, trust is easily violated, which puts people in trouble.

    In order to ensure trust, accountability is an important factor to consider, as are informal means of ensuring fair behaviour that promotes the building and maintenance of trust among actors. As technological development challenges the processes and means for trust and accountability, new tools ought to be developed, and people-related aspects at the personal, business and societal levels have to be considered.

  • 89.
    Engvall, Tove
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    The data dilemma: who is in control?2017In: The data dilemma: a risk or an asset? eabh international workshop in cooperation with InFuture conference, Zagreb, Croatia, 10 November, 2017, 2017Conference paper (Other academic)
  • 90.
    Engvall, Tove
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Trade, Trust and Information Power - In Transition to Sustainability2017 (ed. 1)Book (Other academic)
    Abstract [en]

    Trade, Trust and Information Power. Lessons learned from Online Trade. This is a book about how trust is used, misused and shaped in an online financial context. The author also suggests ways forward for the financial markets, pointing towards client-oriented e-financial services. Digitalization both challenges and opens up new possibilities for performing financial activities, which also shapes relations of power in new ways. Along with other challenges such as climate change, this requires us to consider how we want to perform human activities in a sustainable way.

    Results based on an ethnographic study indicate that individuals are very vulnerable in a global digital marketplace, and that the means for trustworthiness at different levels are insufficient, exposing individuals to high risks and uncertainty about who can actually be trusted. One important conclusion is that, in order to work for the benefit of many, such a system needs to ensure trust, and that digital records and archives have a crucial role in realizing this.

  • 91.
    Engvall, Tove
    et al.
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Hellmer, Erica
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    As money turns digital and trust turns algorithmic: what ought to be considered?2017Conference paper (Refereed)
    Abstract [en]

    The aim of the paper is to raise questions and challenges regarding the construction of trust in the global online context, using the rapidly growing use of digital currencies as an example of concern. The study raises questions related to cultural values and relations of power, and how these relate to trust, when money is digital and monetary processes are performed by technology. The paper is based on a qualitative study, using semi-structured interviews with key personnel related to three different cryptocurrencies. Results indicate that cultural values affect how trust is constructed and perceived. Trust is placed primarily in technology and its algorithms, and is identified as an expression of a technocratic utopian approach, which might need to be balanced with accountability concerns. Instead of authoritative institutional processes with assigned roles and responsibilities, decentralized forms with user involvement and user control of their money are important in establishing trust. Cultural influences from globalization and marketization affect how trust is created. The findings contribute by elucidating a need for further research if blockchain technology is to be used in a domain where requirements on trust, responsibilities and rights, long-term preservation, and long-term accessibility are of high concern.

  • 92.
    Engvall, Tove
    et al.
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Samuelsson, Göran
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Ekonomiskt värde av information2017Report (Other academic)
    Abstract [sv]

    Information is one of Trafikverket's (the Swedish Transport Administration's) resources, and its value can be expressed in different ways and have different meanings for different actors at different points in time. The information can have direct economic value, but its value can also be expressed in terms of its importance for long-term resource planning and financial management. The information can also contribute to ways of working that make careful use of tax money, and create societal effects in areas such as safety and the environment. In this study we have, among other things, taken the ISO 30300 standard as a starting point, which states:

    "Creating and managing business information is part of every organization's activities, processes and systems. This contributes to efficiency in operations, accountability and risk management, and to the continuity of the business. It also makes it possible for organizations to see the value of information resources as business assets, commercial assets and knowledge assets, and contributes to preserving the collective memory, in response to the challenges of the global and digital environment" (ISO 30300, p. iv).

    In this study we have chosen to examine how information management, with a focus on finance-related information, supports the ongoing work with a life-cycle perspective on the infrastructure. The background is Trafikverket's ambition to create a holistic view of the infrastructure's costs over its entire life cycle, and thereby also of how the financial accounting relates to other information. The study is based on interviews, mainly with employees at Trafikverket but also with certain other key persons. People from the following functions and fields of work have been interviewed: CoClass, BIM, ANDA/GUS, open data, information exchange, information management, finance, management systems, IT strategy, contracts and legal, LCC, internal audit, Riksrevisionen, Transportstyrelsen and Näringsdepartementet.

  • 93.
    Eriksson, Fredrik
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Internet of Things (IoT): avskalad plattform i Java2018Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE creditsStudent thesis
    Abstract [en]

    The need for smart devices that use sensors has never been higher, and by the year 2020 there will be over 50 billion devices connected to the internet. All these devices that use sensors and are connected to the internet are part of what is called the Internet of Things. The purpose of this study has therefore been to implement a stripped-down IoT platform that does not use any external libraries, in order to lower the cost for smaller companies that do not need the more advanced and expensive platforms. After the implementation, various stress tests were performed to evaluate the performance of the platform. The study has been carried out using web-based sources; Java has been used as the programming language in the NetBeans development environment, and the database has been created with MySQL Workbench. The result of the study is a platform that uses REST to post and get data from the database. The external library mysql-connector-java-5.1.45 was essential for connecting to the database and therefore had to be used. The result of the stress tests was that the platform performed well and could handle at least 500 REST calls per second with a small increase in response time, although the standard deviation was considerably higher. The conclusion was that the platform performed stably at 50–250 calls per second, and because it is stripped down, several instances could be used within a company to divide the workload between them, resulting in a solution that is both stable and scalable.
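
    To illustrate the kind of dependency-free REST endpoint the abstract describes, here is a minimal sketch using only the JDK's built-in HTTP server. The path, port and response text are assumptions, and the database insert via mysql-connector-java that the thesis relies on is omitted.

        import com.sun.net.httpserver.HttpServer;
        import java.net.InetSocketAddress;
        import java.nio.charset.StandardCharsets;
        import java.util.concurrent.Executors;

        public class SensorEndpoint {
            public static void main(String[] args) throws Exception {
                HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
                server.createContext("/sensor", exchange -> {
                    if ("POST".equals(exchange.getRequestMethod())) {
                        // Read the posted sensor reading; in the thesis it would be inserted into MySQL here.
                        String body = new String(exchange.getRequestBody().readAllBytes(), StandardCharsets.UTF_8);
                        byte[] reply = ("stored: " + body).getBytes(StandardCharsets.UTF_8);
                        exchange.sendResponseHeaders(200, reply.length);
                        exchange.getResponseBody().write(reply);
                    } else {
                        exchange.sendResponseHeaders(405, -1);   // only POST is handled in this sketch
                    }
                    exchange.close();
                });
                server.setExecutor(Executors.newFixedThreadPool(8));   // handle concurrent REST calls
                server.start();
            }
        }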

  • 94.
    Etembad, Soma
    et al.
    University of Guilan, Sowmeh Sara, Iran.
    Mohammadi Limaei, Soleiman
    University of Guilan, Sowmeh Sara, Iran.
    Olsson, Leif
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Yousefpour, Rasoul
    University of Freiburg, Freiburg, Germany.
    Decision making on Sustainable Forest Harvest Production Using Goal Programming Approach: Case study: Iranian Hyrcanian Forest2018Conference paper (Refereed)
    Abstract [en]

    This paper aims to determine the optimal stock level in the Hyrcanian forest of Iran. In this study, a goal programming technique is used to estimate the optimum stock level of different tree species, considering economic, environmental and social issues. We consider multiple objectives in the decision-making process in order to balance the maximization of annual growth, net present value, carbon sequestration and labor. We use regression analysis to develop a forest growth model, using allometric functions for the quantification of the carbon budget. The expected mean price was estimated to determine the net present value of forest harvesting. We use expert knowledge to weight the goals in order to generate the optimal stock level. Results show that the total optimum stock level is 0.5% lower than the level based on the questionnaires. The results indicate that goal programming is a suitable methodology in this case.
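
    As a rough illustration of the weighted goal programming structure the abstract refers to, a generic formulation with the four goals named (annual growth, net present value, carbon sequestration and labor) could look as follows. The symbols, coefficients and targets are placeholders, not the authors' exact model.

        \begin{align}
        \min_{x,\,d^{\pm}} \quad & \sum_{k=1}^{4} w_k \left( d_k^{-} + d_k^{+} \right) \\
        \text{s.t.} \quad & \sum_{i} g_i x_i + d_1^{-} - d_1^{+} = T_{\text{growth}}, \\
                          & \sum_{i} v_i x_i + d_2^{-} - d_2^{+} = T_{\text{NPV}}, \\
                          & \sum_{i} c_i x_i + d_3^{-} - d_3^{+} = T_{\text{carbon}}, \\
                          & \sum_{i} l_i x_i + d_4^{-} - d_4^{+} = T_{\text{labor}}, \\
                          & x_i \ge 0, \qquad d_k^{-}, d_k^{+} \ge 0,
        \end{align}

    where $x_i$ is the stock level of tree species $i$; $g_i$, $v_i$, $c_i$ and $l_i$ are per-unit contributions to growth, net present value, carbon sequestration and labor; $T_k$ are the goal targets; $w_k$ are the expert-elicited weights; and $d_k^{\pm}$ are the deviation variables that are minimized.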

  • 95.
    Fahlén, Erik
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Androidapplikation för digitalisering av formulär: Minimering av inlärningstid, kostnad och felsannolikhet2018Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE creditsStudent thesis
    Abstract [en]

    This study was performed by creating an Android application that uses custom object recognition to scan and digitalize a series of checkbox forms, for example to correct multiple-choice questions or collect forms in a spreadsheet. The purpose of the study was to see which dataset and hardware configuration, used with the machine learning library TensorFlow, was cheapest, best value for money, sufficiently reliable and fastest. A dataset of filled-in example forms with annotated checkboxes was created and used in the training process. The model used for the object recognition was the Single Shot MultiBox Detector, MobileNet version, because it can detect multiple objects in the same image and has comparatively low hardware requirements, making it suitable for phones. The training process was run in Google Cloud’s Machine Learning Engine with different image resolutions and cloud configurations. After the training process in the cloud, the finished TensorFlow model was converted to a TensorFlow Lite model, the format used on phones. The TensorFlow Lite model was included when compiling the Android application so that the object recognition could work. The Android application worked and could recognize the inputs in the checkbox forms. Different image resolutions and cloud configurations during the training process gave different results in terms of which one was fastest and cheapest. In the end, the conclusion was that Google’s hardware setup STANDARD_1 was 20% faster than BASIC, while BASIC was 91% cheaper and better value for money with this dataset.
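
    For orientation, the following is a minimal sketch of how a converted SSD MobileNet TensorFlow Lite model can be run from Java with the TensorFlow Lite Interpreter. The file name, input size and number of detections are assumptions that depend on how the model was exported, not values from the thesis.

        import org.tensorflow.lite.Interpreter;
        import java.io.File;
        import java.util.HashMap;
        import java.util.Map;

        public class CheckboxDetector {
            public static void main(String[] args) {
                // Assumed file name and a float input of 300x300 RGB, typical for SSD MobileNet exports.
                Interpreter tflite = new Interpreter(new File("checkbox_model.tflite"));
                float[][][][] input = new float[1][300][300][3];   // the preprocessed form image goes here

                // SSD-style models commonly emit boxes, classes, scores and a detection count.
                float[][][] boxes = new float[1][10][4];
                float[][] classes = new float[1][10];
                float[][] scores = new float[1][10];
                float[] count = new float[1];
                Map<Integer, Object> outputs = new HashMap<>();
                outputs.put(0, boxes);
                outputs.put(1, classes);
                outputs.put(2, scores);
                outputs.put(3, count);

                tflite.runForMultipleInputsOutputs(new Object[] { input }, outputs);
                System.out.println("Detections: " + (int) count[0]);
                tflite.close();
            }
        }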

  • 96.
    Fang, Zhuowen
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Java GPU vs CPU Hashing Performance2018Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE creditsStudent thesis
    Abstract [en]

    In recent years, the public’s interest in blockchain technology has been growing since it was introduced in 2008, primarily because of its ability to create an immutable ledger for storing information that never will or can be changed. As the chain structure expands, the act of nodes adding blocks to the chain is called mining, which is regulated by a consensus mechanism. In the most widely used consensus mechanism, Proof of Work, this process is based on computationally heavy guessing of block hashes. Today, thanks to the development of hardware technology, there are several prominent ways of performing this guessing: using the regular, general-purpose central processing unit (CPU), using the more specialized graphics processing unit (GPU), or using dedicated hardware. This thesis studied the working principles of blockchain and implemented the hash function that is crucial to the Proof of Work consensus mechanism and other blockchain structures, using the popular programming language Java on various platforms. The CPU implementation is done with Java’s built-in functions, and for the GPU, OpenCL’s Java binding JOCL is used. This project gives quantified measurements of the hash rate on different devices and determines that all the tested GPUs have an advantage over CPUs in performance and memory consumption. Java’s built-in functions are easier to use, but both implementations are highly platform independent, in that the same code can easily be executed on different platforms. Furthermore, based on the measurements, the principles are explored in depth and future work is proposed, analysing the practical value and future possibilities of blockchain in terms of implementation difficulty and performance.
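
    As a minimal sketch of the CPU side of such a measurement, the following uses Java's built-in MessageDigest to estimate a SHA-256 hash rate. The 80-byte input size and iteration count are illustrative choices, and the JOCL/GPU counterpart used in the thesis is not shown.

        import java.security.MessageDigest;

        public class CpuHashBenchmark {
            public static void main(String[] args) throws Exception {
                MessageDigest sha256 = MessageDigest.getInstance("SHA-256");
                byte[] header = new byte[80];           // roughly the size of a Bitcoin block header
                int iterations = 1_000_000;

                long start = System.nanoTime();
                for (int nonce = 0; nonce < iterations; nonce++) {
                    // Vary the last four bytes as a nonce, then hash the whole buffer.
                    header[76] = (byte) nonce;
                    header[77] = (byte) (nonce >>> 8);
                    header[78] = (byte) (nonce >>> 16);
                    header[79] = (byte) (nonce >>> 24);
                    sha256.update(header);
                    sha256.digest();                    // digest() also resets the digest for the next round
                }
                double seconds = (System.nanoTime() - start) / 1e9;
                System.out.printf("CPU hash rate: %.0f hashes/s%n", iterations / seconds);
            }
        }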

  • 97.
    Farag, Hossam
    et al.
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Gidlund, Mikael
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Österberg, Patrik
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    A Delay-Bounded MAC Protocol for Mission- and Time-Critical Applications in Industrial Wireless Sensor Networks2018In: IEEE Sensors Journal, ISSN 1530-437X, E-ISSN 1558-1748, Vol. 18, no 6, p. 2607-2616Article in journal (Refereed)
    Abstract [en]

    Industrial Wireless Sensor Networks (IWSNs) designed for mission- and time-critical applications require timely and deterministic data delivery within stringent deadline bounds. Exceeding delay limits for such applications can lead to system malfunction or ultimately dangerous situations that can threaten human safety. In this paper, we propose SS-MAC, an efficient slot stealing MAC protocol to guarantee predictable and timely channel access for time-critical data in IWSNs. In the proposed SS-MAC, aperiodic time-critical traffic opportunistically steals time slots assigned to periodic non-critical traffic. Additionally, a dynamic deadline-based scheduling is introduced to provide guaranteed channel access in emergency and event-based situations where multiple sensor nodes are triggered simultaneously to transmit time-critical data to the controller. The proposed protocol is evaluated mathematically to provide the worst-case delay bound for the time-critical traffic. Performance comparisons are carried out between the proposed SS-MAC and the WirelessHART standard, and they show that, for the time-critical traffic, the proposed SS-MAC can achieve, at least, a reduction of almost 30% in the worst-case delay with a significant channel utilization efficiency.
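
    The following Java sketch only illustrates the general idea of deadline-ordered slot stealing described in the abstract: pending critical messages are served earliest-deadline-first, and a slot is taken from periodic traffic only when critical traffic is waiting. Class and method names are hypothetical and do not reflect the actual SS-MAC specification.

        import java.util.Comparator;
        import java.util.PriorityQueue;

        public class SlotStealingScheduler {

            static final class CriticalMsg {
                final int nodeId;
                final long deadlineSlot;   // absolute slot number by which the message must be sent
                CriticalMsg(int nodeId, long deadlineSlot) {
                    this.nodeId = nodeId;
                    this.deadlineSlot = deadlineSlot;
                }
            }

            private final PriorityQueue<CriticalMsg> pending =
                    new PriorityQueue<>(Comparator.comparingLong((CriticalMsg m) -> m.deadlineSlot));

            void arrive(CriticalMsg m) {
                pending.add(m);
            }

            /** Returns the node id allowed to steal the next slot, or -1 if periodic traffic keeps it. */
            int nextSlotOwner() {
                CriticalMsg next = pending.poll();
                return next == null ? -1 : next.nodeId;
            }

            public static void main(String[] args) {
                SlotStealingScheduler scheduler = new SlotStealingScheduler();
                scheduler.arrive(new CriticalMsg(7, 120));
                scheduler.arrive(new CriticalMsg(3, 95));        // tighter deadline, served first
                System.out.println(scheduler.nextSlotOwner());   // prints 3
                System.out.println(scheduler.nextSlotOwner());   // prints 7
                System.out.println(scheduler.nextSlotOwner());   // prints -1: no critical traffic waiting
            }
        }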

  • 98.
    Farag, Hossam
    et al.
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Mahmood, Aamir
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Gidlund, Mikael
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Österberg, Patrik
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    PR-CCA MAC: A Prioritized Random CCA MAC Protocol for Mission-Critical IoT Applications2018In: 2018 IEEE International Conference on Communications (ICC), IEEE, 2018, article id 8423018Conference paper (Refereed)
    Abstract [en]

    A fundamental challenge in Mission-Critical Internet of Things (MC-IoT) is to provide reliable and timely delivery of the unpredictable critical traffic. In this paper, we propose an efficient prioritized Medium Access Control (MAC) protocol for Wireless Sensor Networks (WSNs) in MC-IoT control applications. The proposed protocol utilizes a random Clear Channel Assessment (CCA)-based channel access mechanism to handle the simultaneous transmissions of critical data and to reduce the collision probability between the contending nodes, which in turn decreases the transmission latency. We develop a Discrete-Time Markov Chain (DTMC) model to evaluate the performance of the proposed protocol analytically in terms of the expected delay and throughput. The obtained results show that the proposed protocol can enhance the performance of the WirelessHART standard by 80% and 190% in terms of latency and throughput, respectively, along with better transmission reliability.
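
    As a loose illustration of why drawing random CCA slots reduces collisions among simultaneously triggered nodes, the following Java sketch estimates by simulation the probability that more than one contending node ends up transmitting. The model (uniform slot choice, the earliest slot wins) is a deliberate simplification and not the DTMC analysis used in the paper.

        import java.util.Random;

        public class RandomCcaSketch {

            /** Fraction of trials in which two or more nodes pick the same earliest CCA slot. */
            static double collisionProbability(int nodes, int ccaSlots, int trials, Random rnd) {
                int collisions = 0;
                for (int t = 0; t < trials; t++) {
                    int earliest = Integer.MAX_VALUE;
                    int winners = 0;
                    for (int i = 0; i < nodes; i++) {
                        int slot = rnd.nextInt(ccaSlots);      // each node draws a random CCA slot
                        if (slot < earliest) {
                            earliest = slot;
                            winners = 1;
                        } else if (slot == earliest) {
                            winners++;                         // another node senses idle at the same time
                        }
                    }
                    if (winners > 1) {
                        collisions++;                          // simultaneous transmissions collide
                    }
                }
                return (double) collisions / trials;
            }

            public static void main(String[] args) {
                Random rnd = new Random(42);
                for (int slots : new int[] { 2, 4, 8, 16 }) {
                    System.out.printf("5 nodes, %2d CCA slots: P(collision) ~ %.3f%n",
                            slots, collisionProbability(5, slots, 100_000, rnd));
                }
            }
        }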

  • 99.
    Fasth, Tobias
    et al.
    Inst. för data- och systemvetenskap, Stockholms universitet.
    Larsson, Aron
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology. Stockholms Universitet.
    Ekenberg, Love
    Inst. för data- och systemvetenskap, Stockholms universitet; International Institute of Applied Systems Analysis, IIASA, Austria .
    Danielson, Mats
    Inst. för data- och systemvetenskap, Stockholms universitet; International Institute of Applied Systems Analysis, IIASA, Austria.
    Measuring Conflicts Using Cardinal Ranking: An Application to Decision Analytic Conflict Evaluations2018In: Advances in Operations Research, ISSN 1687-9147, E-ISSN 1687-9155, Vol. 2018, article id 8290434Article in journal (Refereed)
    Abstract [en]

    One of the core complexities involved in evaluating decision alternatives in the area of public decision-making is dealing with conflicts. The stakeholders affected by and involved in the decision often have conflicting preferences regarding the actions under consideration. For an executive authority, these differences of opinion can be problematic, during both implementation and communication, even though the decision is rational with respect to an attribute set perceived to represent social welfare. It is therefore important to involve the stakeholders in the process and to get an understanding of their preferences. Otherwise, the stakeholder disagreement can lead to costly conflicts. One way of approaching this problem is to provide comprehensive yet effective stakeholder preference elicitation methods, where the stakeholders can state their preferences with respect to actions that are part of a government’s current agenda. In this paper we contribute two supporting methods: (i) an application of the cardinal ranking (CAR) method for preference elicitation for conflict evaluations and (ii) two conflict indices for measuring stakeholder conflicts. The application of the CAR method utilizes a do-nothing alternative to differentiate between positive and negative actions. The elicited preferences can then be used as input to the two conflict indices, indicating the level of conflict within a stakeholder group or between two stakeholder groups. The contributed methods are demonstrated in a real-life example carried out in the municipality of Upplands Väsby, Sweden. We show how a questionnaire can be used to elicit preferences with CAR and how the indices can be used to semantically describe the level of consensus and conflict regarding a certain attribute. As such, we show how the methods can provide decision aid in the clarification of controversies.

  • 100.
    Feng, Yuan
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology.
    Improve Data Quality By Using Dependencies And Regular Expressions2018Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE creditsStudent thesis
    Abstract [en]

    The objective of this study has been to find ways to improve the quality of databases. There exist many problems with the data stored in databases, such as missing values or spelling errors. To deal with dirty data in the database, this study adopts conditional functional dependencies and regular expressions to detect and correct data. Based on earlier studies of data cleaning methods, this study considers more complex database conditions and combines efficient algorithms to deal with the data. The study shows that by using these methods the quality of the database can be improved, and that, considering time and space complexity, much remains to be done to make the data cleaning process more efficient.
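
    To make the two techniques concrete, the following sketch checks rows against one regular expression and one hypothetical conditional functional dependency. The rule "if country = SE then phone numbers start with +46" and the postal code pattern are invented examples, not rules from the thesis.

        import java.util.List;
        import java.util.regex.Pattern;

        public class DataQualityCheck {

            record Row(String country, String postalCode, String phone) {}

            // Regular expression: Swedish postal codes are five digits, optionally written "123 45".
            private static final Pattern POSTAL = Pattern.compile("\\d{3} ?\\d{2}");

            /** Hypothetical conditional functional dependency: country = "SE" implies phone starts with "+46". */
            static boolean violatesCfd(Row r) {
                return "SE".equals(r.country()) && !r.phone().startsWith("+46");
            }

            static boolean violatesRegex(Row r) {
                return !POSTAL.matcher(r.postalCode()).matches();
            }

            public static void main(String[] args) {
                List<Row> rows = List.of(
                        new Row("SE", "851 70", "+46701234567"),   // clean
                        new Row("SE", "85170", "0701234567"),      // violates the CFD
                        new Row("SE", "8517", "+46701234567"));    // violates the postal code pattern
                for (Row r : rows) {
                    System.out.println(r + "  cfdViolation=" + violatesCfd(r)
                            + "  regexViolation=" + violatesRegex(r));
                }
            }
        }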
