Mid Sweden University

Publications (10 of 86)
Zhang, M., Gao, B., Groth, G. K., Hermann, D. & Brunnström, K. (2024). Digital rear view mirrors with Augmented Reality in comparison with traditional rear-view mirrors. In: IS and T International Symposium on Electronic Imaging Science and Technology: . Paper presented at IS and T International Symposium on Electronic Imaging Science and Technology. The Society for Imaging Science and Technology, 36(11), Article ID 215.
2024 (English). In: IS and T International Symposium on Electronic Imaging Science and Technology, The Society for Imaging Science and Technology, 2024, Vol. 36, no 11, article id 215. Conference paper, Published paper (Refereed)
Abstract [en]

Recently, traditional rear- and side-view mirrors have started to be replaced by digital versions. The aim of this study was to investigate the difference in driving performance between traditional rear-view mirrors and digital rear-view mirrors, called Camera Monitor Systems (CMS) in the vehicle industry. Two types were investigated: CMS with and without Augmented Reality (AR) information. The user test was conducted in a virtual environment, with four driving scenarios defined for testing. The results revealed that the participants' driving performance using CMS (only cameras and 2D displays, without augmented information) did not improve over traditional mirrors.

Place, publisher, year, edition, pages
The Society for Imaging Science and Technology, 2024
Keywords
Camera Monitor System (CMS), Digital rear view mirrors, Teleoperation, Traditional rear-view mirrors, User study, Virtual Reality (VR) and Augmented Reality (AR)
National Category
Computer and Information Sciences
Identifiers
urn:nbn:se:miun:diva-52044 (URN), 10.2352/EI.2024.36.11.HVEI-215 (DOI), 2-s2.0-85197288997 (Scopus ID)
Conference
IS and T International Symposium on Electronic Imaging Science and Technology
Available from: 2024-08-07 Created: 2024-08-07 Last updated: 2024-08-07
Rafiei, S., Brunnström, K., Andersson, J. & Sjöström, M. (2024). Investigation of human interaction with an augmented remote operating system for scaling in mining applications. Quality and User Experience, 9(4)
2024 (English). In: Quality and User Experience, ISSN 2366-0139, E-ISSN 2366-0147, Vol. 9, no 4. Article in journal (Refereed). Published
Abstract [en]

Thanks to the advent of telepresence applications, we can remotely take control of and operate industrial machinery. Teleoperation removes operators from hazardous workplaces such as mining and plays an essential role in worker safety. In addition, augmented telepresence can introduce information that helps the user understand the remote scene. However, remote operation has challenges when the received information is more limited than what could be perceived on-site, such as judging depth. This study investigates how well operators interact with an Augmented Remote Operation Scaling System (AROSS) in a mining context when different computer-generated visual interfaces are provided. The system can provide five augmented views: Disocclusion Augmentation using selective content removal, Novel Perspective view generation, Lidar view, Right field of view, and Left field of view. We performed two experiments in a mine-like laboratory. The first experiment was a feasibility test to understand what users need to accurately perceive depth. The second experiment was designed to evaluate the user's experience with the different versions of AROSS. To analyze human interaction with the designed prototype, we employed a mixed research methodology using interviews, observations, and questionnaires. This mixed methodology combined quality of experience methods, to discover the users' requirements from a technological standpoint, with user experience methods (i.e., user-centric approaches). We investigated 10 and 11 users' interactions in the two subjective experiments. The first experiment focused on the effects of in-view augmentations and interface distributions on perceiving wall patterns. The second focused on the effects of augmentations on depth and on understanding the 3D environment. Using these data, we analyzed both the quality of experience and the user experience via evaluation criteria consisting of interface helpfulness, task performance, potential improvement, and user satisfaction. The feasibility test results were mainly used to structure the formative investigation. The overall conclusion from the formative testing is that the remote operators preferred using natural views (Original), as this approach made it easier to understand the environment. Although the augmented computer-generated views do not look natural, they support 3D cues. In addition, the combination of Novel Perspective and Lidar interfaces as additional views in depth perception tasks seemed helpful. There was difficulty performing tasks when the robot arm was obscured in the Disocclusion Augmentation view and when video quality was low in the Novel Perspective view. However, participants found the Novel Perspective view useful for geometry and depth estimation.

Place, publisher, year, edition, pages
Springer, 2024
Keywords
User Experience (UX), Quality of Experience (QoE), Augmented Telepresence (AT), Intelligent Mining, Industrial remote controlling, UX and QoE, Mixed methodology
National Category
Engineering and Technology; Computer and Information Sciences
Identifiers
urn:nbn:se:miun:diva-50092 (URN), 10.1007/s41233-024-00068-9 (DOI)
Funder
Swedish Foundation for Strategic Research
Available from: 2023-12-12 Created: 2023-12-12 Last updated: 2025-06-18. Bibliographically approved
Rafiei, S., Brunnström, K., Schenkman, B., Andersson, J. & Sjöström, M. (2024). Laboratory study: Human Interaction using Remote Control System for Airport Safety Management. In: 2024 16th International Conference on Quality of Multimedia Experience, QoMEX 2024: . Paper presented at 2024 16th International Conference on Quality of Multimedia Experience, QoMEX 2024 (pp. 167-170). IEEE conference proceedings
2024 (English). In: 2024 16th International Conference on Quality of Multimedia Experience, QoMEX 2024, IEEE conference proceedings, 2024, p. 167-170. Conference paper, Published paper (Refereed)
Abstract [en]

Remote control technology for machines and robots has seen significant advancement in many domains where visual information delivery is essential. Safety management at airports is one field that benefits from remote control systems, enabling operators to scan the airstrip for obstacles and debris. Depth perception and proper view position are critical in remote control systems, where operators need to accurately perceive 3D coordinates and maintain an appropriate perspective for performing tasks from a distance. This demo paper presents a laboratory platform for an experimental study on human interaction with remote control systems using visual interfaces for inspection tasks to enhance airport safety management. The platform can evaluate the user experience of interfaces provided by First Person View, Third Person View, and Augmentation technologies, and it enables exploration through controlled experiments and user tests. These provide an avenue for assessing how such interfaces may affect human performance, depth perception, and user experience when conducting inspection tasks remotely. The findings will shed light on the strengths and limitations of each interface type, offering insights into their potential applications in domains such as industrial inspection, surveillance, and remote exploration.

Place, publisher, year, edition, pages
IEEE conference proceedings, 2024
Keywords
Augmented Reality, depth perception, Industrial Remote Control System, Quality of Experience, Unmanned Ground Vehicle, User Experience, viewpoint position
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
urn:nbn:se:miun:diva-52178 (URN), 10.1109/QoMEX61742.2024.10598280 (DOI), 001289486600030 (), 2-s2.0-85201060708 (Scopus ID), 9798350361582 (ISBN)
Conference
2024 16th International Conference on Quality of Multimedia Experience, QoMEX 2024
Available from: 2024-08-20 Created: 2024-08-20 Last updated: 2024-10-11. Bibliographically approved
Kawthar, E. O., Rafiei, S., Singhal, C. & Brunnström, K. (2024). User performance and Quality of Experience for a remote-controlled lab-based moving platform. In: IS and T International Symposium on Electronic Imaging Science and Technology: . Paper presented at Electronic Imaging: Human Vision and Electronic Imaging, San Francisco, US, January 21-25, 2024. The Society for Imaging Science and Technology, 36, Article ID 237.
2024 (English). In: IS and T International Symposium on Electronic Imaging Science and Technology, The Society for Imaging Science and Technology, 2024, Vol. 36, article id 237. Conference paper, Published paper (Refereed)
Abstract [en]

In this study, we designed an experiment using a remote-controlled, lab-based moving platform to evaluate the impact of resolution, latency, and field of view on Quality of Experience, performance, user experience, and depth perception. The experiment involved two tasks: driving the platform to a stop point and parking it between two boxes. Participants provided feedback through questionnaires, and their experiences were analyzed. Seven participants between 30 and 57 years old (average 37) took part. We used Google Forms for data collection, including pre-experiment and recurring questionnaires as well as a simulator sickness questionnaire. Despite the low number of test participants leading to uncertainty in the quantitative analysis, significant effects were observed, albeit with some contradictory statistical outcomes. The data suggest that lower latency corresponds to better performance, and that participants do not always perceive higher latency accurately. Video quality notably impacts user experience, with higher resolution being preferred.

Place, publisher, year, edition, pages
The Society for Imaging Science and Technology, 2024
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
urn:nbn:se:miun:diva-50345 (URN), 10.2352/EI.2024.36.11.HVEI-237 (DOI), 2-s2.0-85197224888 (Scopus ID)
Conference
Electronic Imaging: Human Vision and Electronic Imaging, San Francisco, US, January 21-25, 2024
Funder
Swedish Foundation for Strategic Research
Available from: 2024-01-29 Created: 2024-01-29 Last updated: 2024-08-07. Bibliographically approved
Brunnström, K., Djupsjöbacka, A., Billingham, J., Wistel, K., Andrén, B., Ozolins, O. & Evans, N. (2024). Video expert assessment of high quality video for Video Assistant Referee (VAR): A comparative study. Multimedia tools and applications, 83(20), 58783-58825
2024 (English). In: Multimedia tools and applications, ISSN 1380-7501, E-ISSN 1573-7721, Vol. 83, no 20, p. 58783-58825. Article in journal (Refereed). Published
Abstract [en]

The International Football Association Board decided to introduce the Video Assistant Referee (VAR) in 2018. This led to the need to develop methods for quality control of VAR systems. This article focuses on the important aspect of evaluating video quality. Video quality assessment has matured in the sense that there are standardized, commercial products and established open-source solutions for measuring it with objective methods. Previous research has primarily focused on end-user quality assessment; how to assess video in the contribution phase of the chain is less studied. The novelties of this study are two-fold: 1) the user study specifically targets video experts, i.e., it assesses the quality perceived by video professionals working with video production; 2) six video quality models have been independently benchmarked against the user data and evaluated to show which of the models provides the best predictions of perceived quality. Independent evaluation is important to obtain unbiased results, as shown by the Video Quality Experts Group. An experiment was performed in which 25 video experts rated the perceived quality. The video formats tested were High-Definition TV, both progressive and interlaced, as well as a quarter-size format scaled down to half the size in both width and height. The videos were encoded with both H.264 and Motion JPEG for the full size, but only H.264 for the quarter size. Bitrates ranged from 80 Mbit/s down to 10 Mbit/s. For H.264, the quality was overall very good but dropped somewhat at 10 Mbit/s. For Motion JPEG, the quality dropped over the whole range. For the interlaced format, the degradation based on a simple deinterlacing method received overall low ratings. For the quarter size, three different scaling algorithms were evaluated; Lanczos performed the best and Bilinear the worst. The performance of six different video quality models was evaluated for 1080p and 1080i. The Video Quality Metric for Variable Frame Delay had the best performance for both formats, followed by the Video Multimethod Assessment Fusion method and the Video Quality Metric General model.
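As a simple illustration of the objective, full-reference metrics referred to above (PSNR and SSIM appear among the record's keywords), the sketch below computes PSNR between a reference and a distorted frame. This is not code from the paper; the frame shapes, 8-bit value range, and random test data are assumptions made purely for illustration.

```python
import numpy as np

def psnr(reference: np.ndarray, distorted: np.ndarray, max_value: float = 255.0) -> float:
    """Peak Signal-to-Noise Ratio (dB) between two frames of equal shape.

    Assumes 8-bit pixel values (max_value = 255).
    """
    mse = np.mean((reference.astype(np.float64) - distorted.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical frames
    return 10.0 * np.log10((max_value ** 2) / mse)

# Illustrative use on random frames (stand-ins for decoded video frames):
ref = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)
noise = np.random.randint(-5, 6, ref.shape)
dist = np.clip(ref.astype(np.int16) + noise, 0, 255).astype(np.uint8)
print(f"PSNR: {psnr(ref, dist):.2f} dB")
```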

Place, publisher, year, edition, pages
Springer Nature, 2024
Keywords
Contribution, Football, HDTV, PSNR, SSIM, Subjective and objective video quality, Video Assistant Referee (VAR), Video quality, VIF, VMAF, VQM General, VQM_VFD
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
urn:nbn:se:miun:diva-50241 (URN), 10.1007/s11042-023-17741-4 (DOI), 001130236200002 (), 2-s2.0-85180664528 (Scopus ID)
Available from: 2024-01-09 Created: 2024-01-09 Last updated: 2024-06-10. Bibliographically approved
Rafiei, S., Singhal, C., Brunnström, K. & Sjöström, M. (2023). Human Interaction in Industrial Tele-Operated Driving: Laboratory Investigation. In: 2023 15th International Conference on Quality of Multimedia Experience (QoMEX): . Paper presented at 2023 15th International Conference on Quality of Multimedia Experience, QoMEX 2023 (pp. 91-94). IEEE conference proceedings
2023 (English). In: 2023 15th International Conference on Quality of Multimedia Experience (QoMEX), IEEE conference proceedings, 2023, p. 91-94. Conference paper, Published paper (Refereed)
Abstract [en]

Tele-operated driving enables industrial operators to control heavy machinery remotely, allowing them to work in improved and safer workplaces. However, several challenges need to be investigated when presenting visual information from on-site scenes to operators sitting at a distance in a remote site. This paper discusses the impact of video quality (spatial resolution), field of view, and latency on users' depth perception, experience, and performance in a lab-based tele-operated application. We performed user experience evaluation experiments to study these impacts. Overall, user experience and comfort decrease and users' performance error increases with increasing glass-to-glass latency. User comfort also decreases, and user performance error increases, with reduced video quality (spatial resolution).

Place, publisher, year, edition, pages
IEEE conference proceedings, 2023
Keywords
depth estimation, field of view, Industrial Tele-operation, latency, User and Quality of experience, video quality
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
urn:nbn:se:miun:diva-49146 (URN), 10.1109/QoMEX58391.2023.10178441 (DOI), 001037196100017 (), 2-s2.0-85167344580 (Scopus ID), 9798350311730 (ISBN)
Conference
2023 15th International Conference on Quality of Multimedia Experience, QoMEX 2023
Available from: 2023-08-22 Created: 2023-08-22 Last updated: 2025-06-18. Bibliographically approved
Göreke, H. D., Djupsjöbacka, A., Schenkman, B. N., Andrén, B., Hermann, D. S. & Brunnström, K. (2023). Perceptual Judgments of Simulated Low Temperatures in LCD based Vehicle Displays. Paper presented at SID International Symposium Digest of Technical Papers, 2023 (pp. 595-598). John Wiley & Sons, 54(1)
2023 (English). Conference paper, Published paper (Refereed)
Abstract [en]

A well-known drawback of LCD displays in the cold is slow pixel response, which leads to poor picture quality. Low temperatures can thus constitute a hazard when viewing important displays in cars. Perceptual experiments with 20 test persons were conducted to find the ranges of clear and acceptable image quality on screens simulating low-temperature distortions. The results showed that both clear and acceptable image quality were impaired beyond -20°C for the LCD screen used in the experiments.

Place, publisher, year, edition, pages
John Wiley & Sons, 2023
Keywords
Perception, Cold Screens, LCD, Video Quality, Distortions, CMS, Psychophysics
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
urn:nbn:se:miun:diva-49796 (URN), 10.1002/sdtp.16628 (DOI), 2-s2.0-85175305976 (Scopus ID)
Conference
SID International Symposium Digest of Technical Papers, 2023
Available from: 2023-11-08 Created: 2023-11-08 Last updated: 2023-11-08. Bibliographically approved
Brunnström, K., Djupsjöbacka, A., Ozolins, O., Billingham, J., Wistel, K. & Evans, N. (2023). Quality measurement methods for video assisting refereeing systems. Sports Engineering, 26(1), Article ID 17.
2023 (English). In: Sports Engineering, ISSN 1369-7072, E-ISSN 1460-2687, Vol. 26, no 1, article id 17. Article in journal (Refereed). Published
Abstract [en]

Changes in the footballing world's approach to technology and innovation, along with major advancements in broadcasting, contributed to the decision by the International Football Association Board to introduce Video Assistant Referees in 2018. The change meant that, under strict protocols, referees could use video replays to review decisions in the event of a "clear and obvious error" or a "serious missed incident". At the time of writing, 48 Member Associations have introduced the Video Assistant Referee protocol in at least one of their tournaments, and many technology providers work with organisers to implement Video Assistant Referee systems. To ensure that the use of Video Assistant Referees has a positive effect on the game, the Fédération Internationale de Football Association collaborated with RISE Research Institutes of Sweden to develop objective test methods that can be used to ensure that a system provides an adequate solution. Each provider must pass requirements that ensure it can deal with the challenges of processing, coding, decoding, synchronising, and re-formatting the broadcast feeds. This article describes the development of the test methods and illustrates some initial results from a test event on Video Assistant Referee system candidates. The methods have been shown to be robust and appropriate for their intended purpose and will be developed further over the years to ensure the quality of Video Assistant Referees. The developed measurement methods are general and can be applied to other broadcast and video systems as well as to other sports.

Keywords
Broadcast, Football, Latency, Measurements, Synchronicity, Video assistant referee (VAR), Video quality
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
urn:nbn:se:miun:diva-48011 (URN), 10.1007/s12283-023-00408-6 (DOI), 000948344700001 (), 2-s2.0-85150219324 (Scopus ID)
Available from: 2023-03-28 Created: 2023-03-28 Last updated: 2023-04-17. Bibliographically approved
Singhal, C., Rafiei, S. & Brunnström, K. (2023). Real-time Live-Video Streaming in Delay-Critical Application: Remote-Controlled Moving Platform. In: 2023 IEEE 98th Vehicular Technology Conference (VTC2023-Fall): . Paper presented at 2023 IEEE 98th Vehicular Technology Conference (VTC2023-Fall) (pp. 1-7). IEEE conference proceedings
2023 (English). In: 2023 IEEE 98th Vehicular Technology Conference (VTC2023-Fall), IEEE conference proceedings, 2023, p. 1-7. Conference paper, Published paper (Refereed)
Abstract [en]

Recent advancements in multimedia and communication network technology have made interactive multimedia and tele-operation applications possible. Teledriving, teleoperation, and video-based remote controlling require real-time live-video streaming and are delay-critical in principle. Supporting such applications over wireless networks for mobile users poses fundamental challenges in maintaining video quality and meeting service latency requirements. This paper investigates the factors affecting the end-to-end delay in a video-based, remotely controlled moving-platform application. It involves the real-time (visual) acquisition of environmental information and delay-sensitive video streaming to remote operators over wireless networks. This paper presents an innovative experimental testbed built from a remote-controlled toy truck, off-the-shelf cameras, and a wireless fidelity (Wi-Fi) network. It achieves ultra-low end-to-end latency and enabled us to perform delay, network, and video quality evaluations. Extending the experimental study, we also propose a real-time live-media streaming control (RTSC) algorithm that maximizes video quality by selecting the best streaming (network, video, and camera) configuration while meeting the delay and network availability constraints. RTSC improves the live-streaming video quality by about 33% while meeting the ultra-low latency (< 200 milliseconds) requirement under constrained network availability conditions.
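As a rough illustration of the kind of constrained configuration selection described in this abstract, the sketch below picks the highest-quality streaming configuration that satisfies a latency budget and the available bandwidth. This is not the paper's RTSC algorithm; the StreamConfig fields, the 200 ms budget used as a default, and the example values are hypothetical stand-ins chosen only to show the selection pattern.

```python
from dataclasses import dataclass
from typing import Iterable, Optional

@dataclass
class StreamConfig:
    # Hypothetical fields for illustration; the actual RTSC parameters are not given here.
    resolution: str           # e.g. "1920x1080"
    bitrate_kbps: int         # encoder target bitrate
    expected_delay_ms: float  # predicted end-to-end (glass-to-glass) latency
    quality_score: float      # predicted video quality (higher is better)

def select_config(candidates: Iterable[StreamConfig],
                  available_bandwidth_kbps: float,
                  max_delay_ms: float = 200.0) -> Optional[StreamConfig]:
    """Pick the highest-quality configuration that meets the delay and bandwidth constraints."""
    feasible = [c for c in candidates
                if c.expected_delay_ms <= max_delay_ms
                and c.bitrate_kbps <= available_bandwidth_kbps]
    return max(feasible, key=lambda c: c.quality_score) if feasible else None

# Example: three candidate configurations under a constrained network.
candidates = [
    StreamConfig("1920x1080", 8000, 240.0, 0.95),
    StreamConfig("1280x720", 4000, 180.0, 0.85),
    StreamConfig("640x480", 1500, 120.0, 0.60),
]
best = select_config(candidates, available_bandwidth_kbps=5000)
print(best)  # -> the 1280x720 configuration in this example
```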

Place, publisher, year, edition, pages
IEEE conference proceedings, 2023
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
urn:nbn:se:miun:diva-50153 (URN), 10.1109/VTC2023-Fall60731.2023.10333522 (DOI), 001133762500154 (), 2-s2.0-85181173157 (Scopus ID), 979-8-3503-2928-5 (ISBN)
Conference
2023 IEEE 98th Vehicular Technology Conference (VTC2023-Fall)
Available from: 2023-12-20 Created: 2023-12-20 Last updated: 2024-02-09. Bibliographically approved
Rafiei, S. & Brunnström, K. (2023). Sustainability in industrial remote operating. In: MMTC Special Issue Sustainable Multimedia Communications and Services. Paper presented at QoMEX 2023 workshop (pp. 24-28), 18
2023 (English). In: MMTC Special Issue Sustainable Multimedia Communications and Services, 2023, Vol. 18, p. 24-28. Conference paper, Published paper (Refereed)
Abstract [en]

With the advent of technologies for industrial remote controlling (tele-operation), operators interact with machines from a distance. Such systems improve personal safety and work environments, but there are challenges in conveying accurate on-site information to the remote site, such as perceiving (on-site) visual information, which is one of the main inputs remote operators use to interpret real-world events. Providing visual information in remote operating systems requires real-time video streaming, which is energy-demanding. In this research, we investigated the impact of video quality (spatial video resolution) and latency (video buffer size) on the user's experience and on energy consumption. Overall, there is a trade-off between the user's comfort and energy consumption in the lab-based tele-operation system. We observed higher energy consumption, measured as an increase in voltage drop over a fixed time at constant current, with the highest spatial resolution and the highest video latency, while there were no significant differences in the user's comfort. We would thus encourage users to adopt adaptive video streaming, considering the trade-off between acceptable QoE and sustainable video stream choices in tele-operation.

Keywords
Sustainable video streaming, teleoperating, industrial remote operating, greenhouse gas emissions
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
urn:nbn:se:miun:diva-50346 (URN)
Conference
QoMEX 2023 workshop
Funder
Swedish Foundation for Strategic Research
Available from: 2024-01-29 Created: 2024-01-29 Last updated: 2024-11-04. Bibliographically approved
Identifiers
ORCID iD: orcid.org/0000-0001-5060-9402
