Mid Sweden University

Publications (10 of 13)
Brunnström, K., Runsten Fredriksson, L. & Rafiei, S. (2025). Cloud gaming quality based on a passive video quality experiment and bootstrapped analysis. In: IS and T International Symposium on Electronic Imaging Science and Technology: . Paper presented at IS and T International Symposium on Electronic Imaging Science and Technology. The Society for Imaging Science and Technology, 37(11), Article ID HVEI-211.
2025 (English) In: IS and T International Symposium on Electronic Imaging Science and Technology, The Society for Imaging Science and Technology, 2025, Vol. 37, no 11, article id HVEI-211. Conference paper, Published paper (Refereed)
Abstract [en]

The International Telecommunication Union has a project for developing objective quality models called Parametric Bitstream-Based Quality Assessment of Cloud Gaming services. The model will be divided into an interaction quality module and a video coding impairment module. To evaluate these two modules, an experimental campaign was conducted in which labs from different parts of the world performed user studies to collect data for the evaluation. This paper describes an experiment for collecting data to evaluate the video coding impairment module. The analysis is based on a bootstrapping approach.
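The bootstrapped analysis itself is not reproduced in the abstract, but a percentile bootstrap over subjective ratings can be sketched as follows (a minimal illustration with made-up ratings; the function and variable names are assumptions, not the authors' code):

```python
import random

def bootstrap_ci(scores, n_boot=1000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for a mean opinion score."""
    rng = random.Random(seed)
    n = len(scores)
    # Resample with replacement n_boot times and record each resampled mean
    means = sorted(
        sum(rng.choice(scores) for _ in range(n)) / n
        for _ in range(n_boot)
    )
    lo = means[int((alpha / 2) * n_boot)]
    hi = means[int((1 - alpha / 2) * n_boot) - 1]
    return sum(scores) / n, (lo, hi)

# Hypothetical 5-point ACR ratings from one test condition
ratings = [4, 5, 3, 4, 4, 2, 5, 4, 3, 4]
mean, (low, high) = bootstrap_ci(ratings)
```

Conditions whose bootstrapped intervals do not overlap can then be treated as significantly different, which is the usual role of such intervals in subjective-quality analysis.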

Place, publisher, year, edition, pages
The Society for Imaging Science and Technology, 2025
Keywords
Bootstrapping, Cloud gaming, Standardization, Video quality, Visual perception
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
urn:nbn:se:miun:diva-55845 (URN), 10.2352/EI.2025.37.11.HVEI-211 (DOI), 2-s2.0-105019037173 (Scopus ID)
Conference
IS and T International Symposium on Electronic Imaging Science and Technology
Available from: 2025-10-28 Created: 2025-10-28 Last updated: 2025-10-28
Thulinsson, F., Söderlund, N., Rafiei, S., Schenkman, B., Djupsjöbacka, A., Andrén, B. & Brunnström, K. (2025). Impact of Camera height and Field-of-View on distance judgement and gap selection in digital rear-view mirrors in vehicles. In: IS and T International Symposium on Electronic Imaging Science and Technology: . Paper presented at IS and T International Symposium on Electronic Imaging Science and Technology. The Society for Imaging Science and Technology, 37(11), Article ID HVEI-199.
2025 (English) In: IS and T International Symposium on Electronic Imaging Science and Technology, The Society for Imaging Science and Technology, 2025, Vol. 37, no 11, article id HVEI-199. Conference paper, Published paper (Refereed)
Abstract [en]

This study investigates how different camera perspectives presented in digital rear-view mirrors in vehicles, also known as Camera Monitor Systems, impact drivers' distance judgment and decision-making in dynamic driving scenarios. The study examines the effects of (1) field of view and (2) camera height on drivers' ability to judge distances to rearward vehicles and to select safe gaps in potentially hazardous situations. A controlled lab-based video experiment was conducted with 27 participants, who performed distance estimations and last-safe-gap selections using a simulated side-view mirror display. Participants viewed prerecorded driving scenarios with varying combinations of field of view (40°, 76°, 112°) and camera height (1 meter, 2.3 meters). No significant effects were found for camera height, but wider fields of view led to more accurate distance estimations. However, a wider field of view also increased the risk of potentially dangerous overestimations of distance, as evidenced by the last-safe-gap results, suggesting that a wider field of view leads to the selection of smaller and potentially risky gaps. Conversely, narrow fields of view resulted in underestimations of distance, potentially leading to overly cautious and less efficient driving decisions. These findings inform Camera Monitor System design guidelines on how to improve driver perception and road safety and reduce accidents caused by vehicle distance misjudgments.

Place, publisher, year, edition, pages
The Society for Imaging Science and Technology, 2025
National Category
Transport Systems and Logistics
Identifiers
urn:nbn:se:miun:diva-55844 (URN), 10.2352/EI.2025.37.11.HVEI-199 (DOI), 2-s2.0-105019037415 (Scopus ID)
Conference
IS and T International Symposium on Electronic Imaging Science and Technology
Available from: 2025-10-28 Created: 2025-10-28 Last updated: 2025-10-28
Rafiei, S. (2025). Remote Controlled 3D Positioning in Augmented Telepresence: User and Quality of Experience Aspects. (Doctoral dissertation). Sundsvall: Mid Sweden University
2025 (English) Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

Industrial companies increasingly adopt remote operation technologies (teleoperation) to enhance safety and operational reach, bringing new challenges in understanding how users interact with these systems. In particular, when operators rely on flat, video-based displays without natural depth cues, challenges arise in user performance, spatial depth perception, and the overall quality of user experience when interacting with teleoperation systems. There is a growing need for evaluation approaches that extend beyond technical performance to assess the quality of users' experience. This dissertation integrates findings from multiple studies conducted throughout the doctoral project. It explores how a mixed-method, user-centred evaluation strategy, combining system performance measurement with analysis of the quality of user experience, improves understanding of interaction quality in remote operation systems. Empirical investigations in the mining and construction machinery domains examined how visual configurations such as image augmentation, scene presentation, and video degradation influence performance and perceived experience. Grounded in a pragmatic design philosophy, the research applied a combination of quantitative and qualitative methods. Laboratory experiments were conducted using two custom-built remote operation platforms: one simulating robotic arm control in mining, and another emulating construction vehicle teleoperation. Data collection included system-logged performance measures, users' rating scales, and open-ended reflections to capture personal experiences. This multi-perspective approach enabled triangulation of results and a more complete understanding of the functional and experiential aspects of remote interaction. Findings reveal that visual configurations significantly influence performance outcomes, user perceptions, and interaction strategies. Standard views enabled higher precision, while augmented perspectives improved spatial understanding and confidence. Degraded video quality and latency reduced user comfort, control experience, and task clarity. Combining measurable performance data with reflective feedback offered deeper insights into the factors shaping successful and satisfying interaction. This integrated approach contributes new knowledge to the design and evaluation of remote operation systems, ensuring attention to both system efficiency and the human experience. The methodology also provides practical guidance for evaluating complex human-technology interactions in safety-critical contexts. Beyond the mining and construction use cases, the research introduced a third platform aimed at airport safety monitoring that serves as a ready-to-use testbed for future investigations into situational awareness and human-system coordination in remote environments.

Place, publisher, year, edition, pages
Sundsvall: Mid Sweden University, 2025. p. 63
Series
Mid Sweden University doctoral thesis, ISSN 1652-893X ; 433
National Category
Electrical Engineering, Electronic Engineering, Information Engineering; Computer and Information Sciences
Identifiers
urn:nbn:se:miun:diva-54678 (URN), 978-91-90017-29-6 (ISBN)
Public defence
2025-09-02, L111, Holmgatan 10, Sundsvall, 09:15 (English)
Opponent
Supervisors
Funder
Vinnova, dnr. 2023-00755; Vinnova, dnr. 2022-02670; Vinnova, dnr. 2021-02107; Swedish Foundation for Strategic Research, FID18-0030
Note

At the time of the doctoral defence the following paper was unpublished: paper 4 in manuscript.

Available from: 2025-06-18 Created: 2025-06-18 Last updated: 2025-12-04. Bibliographically approved
Rafiei, S., Brunnström, K., Pifferi, G., Schenkman, B., Djupsjöbacka, A., Andrén, B. & Sjöström, M. (2025). User Study on Visual Interface Helpfulness in Remote Inspection Tasks Using Teleoperation. In: 2025 17th International Conference on Quality of Multimedia Experience (QoMEX): . Paper presented at 2025 17th International Conference on Quality of Multimedia Experience (QoMEX) (pp. 1-4). IEEE conference proceedings
2025 (English) In: 2025 17th International Conference on Quality of Multimedia Experience (QoMEX), IEEE conference proceedings, 2025, p. 1-4. Conference paper, Published paper (Refereed)
Abstract [en]

Teleoperation systems are increasingly used in remote inspection and maintenance, especially in safety-critical settings. This paper presents findings from a lab study examining user interaction with a teleoperation system for airport safety inspection. The study investigates five visual interface configurations: First Person View (FPV), Third Person View (TPV), Augmented FPV, FPV+TPV, and Augmented FPV+TPV. Eighteen participants completed inspection tasks under each condition and rated interface helpfulness, workload (NASA-TLX), and simulator sickness. The results were confirmed by bootstrapping with 1000 iterations. All view configurations were rated more helpful than TPV alone, with FPV-based combinations improving depth perception and spatial understanding. NASA-TLX ratings revealed cognitive effort, while SSQ scores decreased post-task, particularly for oculomotor symptoms, suggesting adaptation. These findings support continued research into optimising visual interfaces for teleoperation in safety-critical settings.

Place, publisher, year, edition, pages
IEEE conference proceedings, 2025
National Category
Human Computer Interaction
Identifiers
urn:nbn:se:miun:diva-56256 (URN), 10.1109/QoMEX65720.2025.11219981 (DOI), 2-s2.0-105023902030 (Scopus ID), 979-8-3315-5435-4 (ISBN)
Conference
2025 17th International Conference on Quality of Multimedia Experience (QoMEX)
Available from: 2025-12-11 Created: 2025-12-11 Last updated: 2025-12-16. Bibliographically approved
Rafiei, S., Brunnström, K., Andersson, J. & Sjöström, M. (2024). Investigation of human interaction with an augmented remote operating system for scaling in mining applications. Quality and User Experience, 9(4)
2024 (English) In: Quality and User Experience, ISSN 2366-0139, E-ISSN 2366-0147, Vol. 9, no 4. Article in journal (Refereed), Published
Abstract [en]

Thanks to the advent of telepresence applications, we can remotely take control of and operate industrial machinery. Teleoperation removes operators from hazardous workplaces such as mining and plays an essential role in the safety of workers. In addition, augmented telepresence can introduce information that helps the user understand the remote scene. However, remote operation has challenges when the received information is more limited than what could be perceived on-site, e.g., judging depth. This study investigates how well operators interact with an Augmented Remote Operation Scaling System (AROSS) in a mining context when different computer-generated visual interfaces are provided. The system can achieve five augmented views: Disocclusion Augmentation using selective content removal; Novel Perspective view generation; Lidar view; Right field of view; and Left field of view. We performed two experiments in a mine-like laboratory. The first experiment was a feasibility test to obtain an understanding of what users need to accurately perceive depth. The second experiment was designed to evaluate the user's experience with the different versions of AROSS. To analyze human interaction with the designed prototype, we employed a mixed research methodology that used interviews, observations, and questionnaires. This mixed methodology consisted of quality of experience methods to discover the users' requirements from a technological standpoint and user experience methods (i.e., user-centric approaches). We investigated 10 and 11 users' interactions in the two subjective experiments. The first experiment focused on the effects of in-view augmentations and interface distributions on perceiving wall patterns. The second focused on the effects of augmentations on depth and understanding the 3D environment. Using these data, we analyzed both quality of experience and user experience via evaluation criteria consisting of interface helpfulness, task performance, potential improvement, and user satisfaction. The feasibility test results were mainly used to structure the formative investigation. The overall conclusion from the formative testing shows that the remote operators preferred using natural views (Original), as this approach made it easier to understand the environment. Although the augmented computer-generated views do not look natural, they support 3D cues. In addition, the combination of Novel Perspective and Lidar interfaces as additional views in depth perception tasks seemed helpful. There was difficulty performing tasks when the robot arm was obscured during the Disocclusion Augmentation view and when video quality was low during the Novel Perspective view. However, participants found the Novel Perspective view useful for geometry and depth estimation.

Place, publisher, year, edition, pages
Springer, 2024
Keywords
User Experience (UX), Quality of Experience (QoE), Augmented Telepresence (AT), Intelligent Mining, Industrial remote controlling, UX and QoE, Mixed methodology
National Category
Engineering and Technology; Computer and Information Sciences
Identifiers
urn:nbn:se:miun:diva-50092 (URN), 10.1007/s41233-024-00068-9 (DOI)
Funder
Swedish Foundation for Strategic Research
Available from: 2023-12-12 Created: 2023-12-12 Last updated: 2025-09-25. Bibliographically approved
Rafiei, S., Brunnström, K., Schenkman, B., Andersson, J. & Sjöström, M. (2024). Laboratory study: Human Interaction using Remote Control System for Airport Safety Management. In: 2024 16th International Conference on Quality of Multimedia Experience, QoMEX 2024: . Paper presented at 2024 16th International Conference on Quality of Multimedia Experience, QoMEX 2024 (pp. 167-170). IEEE conference proceedings
2024 (English) In: 2024 16th International Conference on Quality of Multimedia Experience, QoMEX 2024, IEEE conference proceedings, 2024, p. 167-170. Conference paper, Published paper (Refereed)
Abstract [en]

Remote control technology for machines and robots has seen significant advancement in many domains where visual information delivery is essential. Safety management at airports is one field that benefits from remote control systems, enabling operators to scan the airstrip for obstacles and debris. Depth perception and proper view position are critical in remote control systems, where operators need to accurately perceive 3D coordinates and maintain an appropriate perspective for performing tasks from a distance. This demo paper presents a laboratory platform for an experimental study on human interaction with remote control systems using visual interfaces for inspection tasks to enhance airport safety management. The platform can evaluate the user experience of interfaces provided by First Person View, Third Person View, and Augmentation technologies. It enables exploration through controlled experiments and user tests, providing an avenue for assessing how these interfaces may affect human performance, depth perception, and user experience when conducting inspection tasks remotely. The findings will shed light on the strengths and limitations of each interface type, offering insights into their potential applications in various domains such as industrial inspection, surveillance, and remote exploration.

Place, publisher, year, edition, pages
IEEE conference proceedings, 2024
Keywords
Augmented Reality, depth perception, Industrial Remote Control System, Quality of Experience, Unmanned Ground Vehicle, User Experience, viewpoint position
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
urn:nbn:se:miun:diva-52178 (URN), 10.1109/QoMEX61742.2024.10598280 (DOI), 001289486600030 (), 2-s2.0-85201060708 (Scopus ID), 9798350361582 (ISBN)
Conference
2024 16th International Conference on Quality of Multimedia Experience, QoMEX 2024
Available from: 2024-08-20 Created: 2024-08-20 Last updated: 2025-09-25. Bibliographically approved
Kawthar, E. O., Rafiei, S., Singhal, C. & Brunnström, K. (2024). User performance and Quality of Experience for a remote-controlled lab-based moving platform. In: IS and T International Symposium on Electronic Imaging Science and Technology: . Paper presented at Electronic Imaging: Human Vision and Electronic Imaging, San Francisco, US, January 21-25, 2024. The Society for Imaging Science and Technology, 36, Article ID 237.
2024 (English) In: IS and T International Symposium on Electronic Imaging Science and Technology, The Society for Imaging Science and Technology, 2024, Vol. 36, article id 237. Conference paper, Published paper (Refereed)
Abstract [en]

In this study, we designed an experiment using a remote-controlled, lab-based moving platform to evaluate the impact of resolution, latency, and field of view on Quality of Experience, performance, user experience, and depth perception. The experiment involved two tasks: driving the platform to a stop point and parking it between two boxes. Participants provided feedback through questionnaires, and their experiences were analyzed. Seven participants aged between 30 and 57 years (average 37) took part. We used Google Forms for data collection, including pre-experiment and recurring questionnaires as well as a simulator sickness questionnaire. Despite the low number of test participants, which introduces uncertainty into the quantitative analysis, significant effects were observed, albeit with contradictory statistical outcomes. The data suggest that lower latency corresponds to better performance, with participants not always perceiving higher latency accurately. Video quality notably impacts user experience, with higher resolution being preferred.

Place, publisher, year, edition, pages
The Society for Imaging Science and Technology, 2024
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
urn:nbn:se:miun:diva-50345 (URN), 10.2352/EI.2024.36.11.HVEI-237 (DOI), 2-s2.0-85197224888 (Scopus ID)
Conference
Electronic Imaging: Human Vision and Electronic Imaging, San Francisco, US, January 21-25, 2024
Funder
Swedish Foundation for Strategic Research
Available from: 2024-01-29 Created: 2024-01-29 Last updated: 2025-09-25. Bibliographically approved
Rafiei, S., Singhal, C., Brunnström, K. & Sjöström, M. (2023). Human Interaction in Industrial Tele-Operated Driving: Laboratory Investigation. In: 2023 15th International Conference on Quality of Multimedia Experience (QoMEX): . Paper presented at 2023 15th International Conference on Quality of Multimedia Experience, QoMEX 2023 (pp. 91-94). IEEE conference proceedings
2023 (English) In: 2023 15th International Conference on Quality of Multimedia Experience (QoMEX), IEEE conference proceedings, 2023, p. 91-94. Conference paper, Published paper (Refereed)
Abstract [en]

Tele-operated driving enables industrial operators to control heavy machinery remotely, allowing them to work in improved, safer workplaces. However, some challenges need to be investigated when presenting visual information from on-site scenes to operators sitting at a distant remote site. This paper discusses the impact of video quality (spatial resolution), field of view, and latency on users' depth perception, experience, and performance in a lab-based tele-operated application. We performed user experience evaluation experiments to study these impacts. Overall, user experience and comfort decrease, and users' performance error increases, as glass-to-glass latency grows. User comfort also decreases, and user performance error increases, with reduced video quality (spatial resolution).

Place, publisher, year, edition, pages
IEEE conference proceedings, 2023
Keywords
depth estimation, field of view, Industrial Tele-operation, latency, User and Quality of experience, video quality
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
urn:nbn:se:miun:diva-49146 (URN), 10.1109/QoMEX58391.2023.10178441 (DOI), 001037196100017 (), 2-s2.0-85167344580 (Scopus ID), 9798350311730 (ISBN)
Conference
2023 15th International Conference on Quality of Multimedia Experience, QoMEX 2023
Available from: 2023-08-22 Created: 2023-08-22 Last updated: 2025-09-25. Bibliographically approved
Singhal, C., Rafiei, S. & Brunnström, K. (2023). Real-time Live-Video Streaming in Delay-Critical Application: Remote-Controlled Moving Platform. In: 2023 IEEE 98th Vehicular Technology Conference (VTC2023-Fall): . Paper presented at 2023 IEEE 98th Vehicular Technology Conference (VTC2023-Fall) (pp. 1-7). IEEE conference proceedings
2023 (English) In: 2023 IEEE 98th Vehicular Technology Conference (VTC2023-Fall), IEEE conference proceedings, 2023, p. 1-7. Conference paper, Published paper (Refereed)
Abstract [en]

Recent advancements in multimedia and communication network technology have made interactive multimedia and tele-operation applications possible. Teledriving, teleoperation, and video-based remote controlling require real-time live-video streaming and are delay-critical in principle. Supporting such applications over wireless networks for mobile users poses fundamental challenges in maintaining video quality and service latency requirements. This paper investigates the factors affecting end-to-end delay in a video-based, remotely controlled moving-platform application, which involves real-time (visual) acquisition of environmental information and delay-sensitive video streaming to remote operators over wireless networks. The paper presents an innovative experimental testbed developed using a remote-controlled toy truck, off-the-shelf cameras, and a wireless fidelity (Wi-Fi) network. It achieves ultra-low end-to-end latency and enabled delay, network, and video quality evaluations. Extending the experimental study, we also propose a real-time live-media streaming control (RTSC) algorithm that maximizes video quality by selecting the best streaming (network, video, and camera) configuration while meeting delay and network availability constraints. RTSC improves live-streaming video quality by about 33% while meeting the ultra-low latency (< 200 milliseconds) requirement under constrained network availability conditions.
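The core selection step that RTSC performs, maximizing video quality subject to a latency budget, can be sketched as follows (a simplified illustration; the class, field names, and example configurations are assumptions, not the paper's actual algorithm or data):

```python
from dataclasses import dataclass

@dataclass
class Config:
    name: str
    bitrate_kbps: int    # video bitrate of this streaming configuration
    est_delay_ms: float  # estimated end-to-end (glass-to-glass) delay
    quality: float       # predicted video quality score in [0, 1]

def select_config(configs, delay_budget_ms=200.0):
    """Return the highest-quality configuration whose estimated delay
    meets the latency budget, or None if no configuration is feasible."""
    feasible = [c for c in configs if c.est_delay_ms <= delay_budget_ms]
    return max(feasible, key=lambda c: c.quality, default=None)

# Hypothetical candidate (network, video, camera) configurations
candidates = [
    Config("720p/low-fps", 2500, 150.0, 0.70),
    Config("1080p/high-fps", 6000, 230.0, 0.90),  # best quality, too slow
    Config("1080p/low-fps", 4000, 180.0, 0.85),
]
best = select_config(candidates)
```

In this sketch the highest-quality candidate is rejected because it exceeds the 200 ms budget, so the selector falls back to the best configuration that still meets the constraint, which mirrors the trade-off the paper describes.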

Place, publisher, year, edition, pages
IEEE conference proceedings, 2023
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
urn:nbn:se:miun:diva-50153 (URN), 10.1109/VTC2023-Fall60731.2023.10333522 (DOI), 001133762500154 (), 2-s2.0-85181173157 (Scopus ID), 979-8-3503-2928-5 (ISBN)
Conference
2023 IEEE 98th Vehicular Technology Conference (VTC2023-Fall)
Available from: 2023-12-20 Created: 2023-12-20 Last updated: 2025-09-25. Bibliographically approved
Rafiei, S. (2023). Remote controlled 3D positioning in Augmented Telepresence: User and Quality of Experience.
2023 (English) Other (Other academic)
Abstract [en]

Remote control systems enable industrial operators to control heavy machinery remotely, allowing them to work in improved, safer workplaces. Apart from the advantages of remote control systems in industry, there are challenges when the received information is more limited than what could be perceived on-site, e.g., when perceiving depth. Augmented Telepresence combines the concepts of Telepresence, which aims to make users feel as if they are physically present in a remote location, and Augmented Reality, which overlays digital information and virtual objects onto the real-world environment. It is therefore crucial to investigate how well operators experience their work and perform tasks when interacting with remote control systems and using the provided visual interfaces.

National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
urn:nbn:se:miun:diva-50344 (URN)
Funder
Swedish Foundation for Strategic Research
Note

Half-time seminar, presentation.

Available from: 2024-01-29 Created: 2024-01-29 Last updated: 2025-09-25. Bibliographically approved
Identifiers
ORCID iD: orcid.org/0000-0002-5913-3145
