Publications (10 of 64)
Brunnström, K., Dima, E., Andersson, M., Sjöström, M., Qureshi, T. & Johanson, M. (2019). Quality of Experience of hand controller latency in a Virtual Reality simulator. In: Damon Chandler, Mark McCourt and Jeffrey Mulligan (Eds.), Human Vision and Electronic Imaging 2019. Paper presented at Human Vision and Electronic Imaging 2019. Springfield, VA, United States, Article ID 3068450.
Quality of Experience of hand controller latency in a Virtual Reality simulator
2019 (English). In: Human Vision and Electronic Imaging 2019 / [ed] Damon Chandler, Mark McCourt and Jeffrey Mulligan, Springfield, VA, United States, 2019, article id 3068450. Conference paper, Published paper (Refereed).
Abstract [en]

In this study, we investigate a VR simulator of a forestry crane used for loading logs onto a truck, mainly looking at Quality of Experience (QoE) aspects that may be relevant for task completion, but also whether there are any discomfort-related symptoms experienced during task execution. A QoE test has been designed to capture both the general subjective experience of using the simulator and to study task performance. Moreover, a specific focus has been to study the effects of latency on the subjective experience, with regard to delays in the crane control interface. A formal subjective study has been performed in which we added controlled delays to the hand controller (joystick) signals. The added delays ranged from 0 ms to 800 ms. We found no significant effects on task performance or on any of the scales for delays up to 200 ms. A significant negative effect was found for 800 ms of added delay. The symptoms reported in the Simulator Sickness Questionnaire (SSQ) were significantly higher for all the symptom groups, but a majority of the participants reported only slight symptoms. Two out of thirty test persons stopped the test before finishing due to their symptoms.
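
As an illustration of how controlled delays of this kind can be injected into hand controller signals, the sketch below buffers joystick samples in a fixed-length queue. It is a minimal, hypothetical example and not the simulator code used in the study; the polling rate, neutral value and class name are assumptions.

```python
from collections import deque

class DelayLine:
    """Buffers controller samples and releases them after a fixed added delay.

    Assumes the control loop polls at a constant rate (e.g. 100 Hz), so the
    delay is expressed as a whole number of sample periods.
    """

    def __init__(self, delay_ms, sample_rate_hz=100, neutral=(0.0, 0.0)):
        self.n_samples = round(delay_ms * sample_rate_hz / 1000)
        # Pre-fill with a neutral position so the first outputs are defined.
        self.buffer = deque([neutral] * self.n_samples,
                            maxlen=self.n_samples + 1)

    def push(self, sample):
        """Insert the newest joystick sample and return the delayed one."""
        self.buffer.append(sample)
        return self.buffer.popleft()

# Example: an added delay of 200 ms at a 100 Hz polling rate holds each
# sample back by 20 polling periods.
delay = DelayLine(delay_ms=200)
for t in range(25):
    delayed_sample = delay.push((t * 0.01, 0.0))
```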

Place, publisher, year, edition, pages
Springfield, VA, United States, 2019
Series
Electronic Imaging, ISSN 2470-1173
Keywords
Quality of Experience, Virtual Reality, Simulator, QoE, Delay
National Category
Communication Systems; Telecommunications; Media Engineering
Identifiers
urn:nbn:se:miun:diva-35609 (URN)
Conference
Human Vision and Electronic Imaging 2019
Funder
Knowledge Foundation, 20160194
Available from: 2019-02-08 Created: 2019-02-08 Last updated: 2019-02-08
Allison, R. S., Brunnström, K., Chandler, D. M., Colett, H. R., Corriveau, P. J., Daly, S., . . . Zhang, Y. (2018). Perspectives on the definition of visually lossless quality for mobile and large format displays. Journal of Electronic Imaging (JEI), 27(5), 1-23, Article ID 053035.
Perspectives on the definition of visually lossless quality for mobile and large format displays
2018 (English). In: Journal of Electronic Imaging (JEI), ISSN 1017-9909, E-ISSN 1560-229X, Vol. 27, no 5, p. 1-23, article id 053035. Article in journal (Refereed), Published.
Abstract [en]

Advances in imaging and display engineering have given rise to new and improved image and video applications that aim to maximize visual quality under given resource constraints (e.g., power, bandwidth). Because the human visual system is an imperfect sensor, the images/videos can be represented in a mathematically lossy fashion but with enough fidelity that the losses are visually imperceptible—commonly termed "visually lossless." Although a great deal of research has focused on gaining a better understanding of the limits of human vision when viewing natural images/video, a universally or even largely accepted definition of visually lossless remains elusive. Differences in testing methodologies, research objectives, and target applications have led to multiple ad-hoc definitions that are often difficult to compare to or otherwise employ in other settings. We present a compendium of technical experiments relating to both vision science and visual quality testing that together explore the research and business perspectives of visually lossless image quality, as well as review recent scientific advances. Together, the studies presented in this paper suggest that a single definition of visually lossless quality might not be appropriate; rather, a better goal would be to establish varying levels of visually lossless quality that can be quantified in terms of the testing paradigm.

Keywords
visual lossless, visual lossy, image quality, industrial perspective, mobile screen, large format displays
National Category
Communication Systems; Telecommunications; Media Engineering
Identifiers
urn:nbn:se:miun:diva-34722 (URN), 10.1117/1.JEI.27.5.053035 (DOI), 000449229800037 (), 2-s2.0-85054964928 (Scopus ID)
Note

Copyright (2018) Society of Photo-Optical Instrumentation Engineers. One print or electronic copy may be made for personal use only. Systematic reproduction and distribution, duplication of any material in this paper for a fee or for commercial purposes, or modification of the content of the paper are prohibited.

Available from: 2018-10-15 Created: 2018-10-15 Last updated: 2018-12-06. Bibliographically approved.
Brunnström, K., Sjöström, M., Imran, M., Pettersson, M. & Johanson, M. (2018). Quality Of Experience For A Virtual Reality Simulator. In: Human Vision and Electronic Imaging (HVEI): . Paper presented at Human Vision and Electronic Imaging (HVEI), Burlingame, California USA, 28 January - 2 February, 2018.
Quality Of Experience For A Virtual Reality Simulator
2018 (English). In: Human Vision and Electronic Imaging (HVEI), 2018. Conference paper, Published paper (Refereed).
Abstract [en]

In this study, we investigate a VR simulator of a forestry crane used for loading logs onto a truck, mainly looking at Quality of Experience (QoE) aspects that may be relevant for task completion, but also whether there are any discomfort related symptoms experienced during task execution. The QoE test has been designed to capture both the general subjective experience of using the simulator and to study task completion rate. Moreover, a specific focus has been to study the effects of latency on the subjective experience, with regard both to delays in the crane control interface and to lag in the visual scene rendering in the head mounted display (HMD). Two larger formal subjective studies have been performed: one with the VR-system as it is and one where we have added controlled delay to the display update and to the joystick signals. The baseline study shows that most people are more or less happy with the VR-system and that it does not have strong effects on any symptoms as listed in the SSQ. In the delay study we found significant effects on Comfort Quality and Immersion Quality for higher Display delay (30 ms), but very small impact of joystick delay. Furthermore, the Display delay had strong influence on the symptoms in the SSQ, as well as causing test subjects to decide not to continue with the complete experiments, and this was also found to be connected to the longer Display delays (≥ 20 ms).

Keywords
Quality of Experience, Virtual Reality, simulator, Remote operation
National Category
Media and Communication Technology
Identifiers
urn:nbn:se:miun:diva-33073 (URN)
Conference
Human Vision and Electronic Imaging (HVEI), Burlingame, California USA, 28 January - 2 February, 2018
Funder
Knowledge Foundation, 20160194
Available from: 2018-02-26 Created: 2018-02-26 Last updated: 2018-09-19. Bibliographically approved.
Brunnström, K. & Barkowsky, M. (2018). Statistical quality of experience analysis: on planning the sample size and statistical significance testing. Journal of Electronic Imaging (JEI), 27(5), 053013-1-053013-11, Article ID 053013.
Statistical quality of experience analysis: on planning the sample size and statistical significance testing
2018 (English). In: Journal of Electronic Imaging (JEI), ISSN 1017-9909, E-ISSN 1560-229X, Vol. 27, no 5, p. 053013-1-053013-11, article id 053013. Article in journal (Refereed), Published.
Abstract [en]

This paper analyzes how an experimenter can balance errors in subjective video quality tests between the statistical power of finding an effect if it is there and not claiming that an effect is there if the effect is not there, i.e., balancing Type I and Type II errors. The risk of committing Type I errors increases with the number of comparisons that are performed in statistical tests. We will show that when controlling for this and at the same time keeping the power of the experiment at a reasonably high level, it is unlikely that the number of test subjects that is normally used and recommended by the International Telecommunication Union (ITU), i.e., 15, is sufficient, but the number used by the Video Quality Experts Group (VQEG), i.e., 24, is more likely to be sufficient. Examples will also be given of the influence of Type I errors on the statistical significance of comparing objective metrics by correlation. We also present a comparison between parametric and nonparametric statistics. The comparison targets the question of whether we would reach different conclusions on the statistical difference between the video quality ratings of different video clips in a subjective test, based on a comparison between the Student t-test and the Mann–Whitney U-test. We found that there was hardly any difference when only a few comparisons are compensated for, i.e., almost the same conclusions are reached. When the number of comparisons is increased, larger and larger differences between the two methods are revealed. In these cases, the parametric t-test gives clearly more significant cases than the nonparametric test, which makes it more important to investigate whether the assumptions are met for performing a certain test.
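
To illustrate the kind of comparison the abstract describes, the sketch below applies a Bonferroni-corrected significance threshold to both a Student t-test and a Mann–Whitney U-test on simulated rating data. All numbers other than the 24-subject panel size are assumptions, and this is not the analysis code from the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_subjects = 24                 # panel size discussed in the paper (VQEG practice)
n_comparisons = 10              # assumed number of pairwise clip comparisons tested
alpha = 0.05 / n_comparisons    # Bonferroni-corrected per-test threshold

# Simulated 5-point ratings for two video clips (hypothetical data).
clip_a = rng.normal(3.5, 0.8, n_subjects).clip(1, 5)
clip_b = rng.normal(3.0, 0.8, n_subjects).clip(1, 5)

t_stat, p_t = stats.ttest_ind(clip_a, clip_b)
u_stat, p_u = stats.mannwhitneyu(clip_a, clip_b, alternative="two-sided")

print(f"t-test:       p = {p_t:.4f}, significant = {p_t < alpha}")
print(f"Mann-Whitney: p = {p_u:.4f}, significant = {p_u < alpha}")
```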

Place, publisher, year, edition, pages
IS&T - the Society for Imaging Science and Technology, 2018
Keywords
Type-I error, video quality, statistical significance, quality of experience, Student T-test, Bonferroni, Mann–Whitney U-test, parametric versus nonparametric test
National Category
Electrical Engineering, Electronic Engineering, Information Engineering; Media Engineering; Signal Processing; Communication Systems
Identifiers
urn:nbn:se:miun:diva-34508 (URN), 10.1117/1.JEI.27.5.053013 (DOI), 000449229800015 (), 2-s2.0-85054069504 (Scopus ID)
Funder
Knowledge Foundation, 20160194
Note

Copyright (2018) Society of Photo-Optical Instrumentation Engineers. One print or electronic copy may be made for personal use only. Systematic reproduction and distribution, duplication of any material in this paper for a fee or for commercial purposes, or modification of the content of the paper are prohibited.

Available from: 2018-09-26 Created: 2018-09-26 Last updated: 2018-12-06. Bibliographically approved.
Sedano, I., Prieto, G., Brunnström, K., Kihl, M. & Montalban, J. (2017). Application of full-reference video quality metrics in IPTV. In: 2017 IEEE International Symposium on Broadband Multimedia Systems and Broadcasting (BMSB): . Paper presented at 12th IEEE International Symposium on Broadband Multimedia Systems and Broadcasting, BMSB 2017; Cagliari; Italy; 7 June 2017 through 9 June 2017 (pp. 464-467). IEEE Computer Society
Application of full-reference video quality metrics in IPTV
2017 (English). In: 2017 IEEE International Symposium on Broadband Multimedia Systems and Broadcasting (BMSB), IEEE Computer Society, 2017, p. 464-467. Conference paper, Published paper (Refereed).
Abstract [en]

Executing an accurate full-reference metric such as VQM can take minutes on an average computer for just one user. Therefore, it can be infeasible to analyze all the videos received by users in an IPTV network, for example one consisting of 10,000 users, with a single computer running the VQM metric. One solution is to use lightweight no-reference metrics in addition to the full-reference metric. Lightweight no-reference metrics are accurate enough to screen out situations that do not need further evaluation, and the full-reference metric VQM can then be used when more accuracy is needed. The work in this paper is focused on determining the maximum number of situations/users that can be analyzed simultaneously using the VQM metric on a computer with good performance. The full-reference metric is applied at the transmitter using a method specified in the recommendation ITU BT.1789. The best performance achieved was 112.8 seconds per process.
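
A back-of-the-envelope sketch of the capacity question the paper addresses: only the 112.8 s figure comes from the abstract, while the clip length, number of parallel processes and monitoring assumptions are purely illustrative.

```python
# Rough throughput estimate for running a full-reference metric such as VQM
# on one machine. Only the 112.8 s figure is reported in the abstract; the
# other numbers are illustrative assumptions.
seconds_per_process = 112.8   # best measured time to score one sequence
parallel_processes = 8        # assumed number of concurrent VQM processes

sequences_per_hour = parallel_processes * 3600 / seconds_per_process
print(f"Sequences scored per hour: {sequences_per_hour:.0f}")
# At roughly 255 sequences per hour, continuously monitoring 10,000 users
# clearly requires prescreening with lightweight no-reference metrics,
# as the paper proposes.
```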

Place, publisher, year, edition, pages
IEEE Computer Society, 2017
Series
IEEE International Symposium on Broadband Multimedia Systems and Broadcasting
Keywords
IPTV & Internet TV, Objective evaluation techniques, Performance evaluation, QoE
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
urn:nbn:se:miun:diva-32192 (URN), 10.1109/BMSB.2017.7986191 (DOI), 000414279400087 (), 2-s2.0-85027248799 (Scopus ID), 978-1-5090-4937-0 (ISBN)
Conference
12th IEEE International Symposium on Broadband Multimedia Systems and Broadcasting, BMSB 2017; Cagliari; Italy; 7 June 2017 through 9 June 2017
Available from: 2017-11-29 Created: 2017-11-29 Last updated: 2018-02-22. Bibliographically approved.
Brunnström, K. & Barkowsky, M. (2017). Balancing type I errors and statistical power in video quality assessment. In: IS and T International Symposium on Electronic Imaging Science and Technology: . Paper presented at Human Vision and Electronic Imaging 2017, HVEI 2017, San Francisco, USA, 29 January 2017 through 2 February 2017 (pp. 91-96). Society for Imaging Science and Technology, F130042
Balancing type I errors and statistical power in video quality assessment
2017 (English). In: IS and T International Symposium on Electronic Imaging Science and Technology, Society for Imaging Science and Technology, 2017, Vol. F130042, p. 91-96. Conference paper, Published paper (Refereed).
Abstract [en]

This paper analyzes how an experimenter can balance errors in subjective video quality tests between the statistical power of finding an effect if it is there and not claiming that an effect is there if the effect is not there, i.e., balancing Type I and Type II errors. The risk of committing Type I errors increases with the number of comparisons that are performed in statistical tests. We will show that when controlling for this and at the same time keeping the power of the experiment at a reasonably high level, it will require more test subjects than are normally used and recommended by international standardization bodies like the ITU. Examples will also be given of the influence of Type I errors on the statistical significance of comparing objective metrics by correlation.
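
The sketch below shows one way to plan the number of subjects under a Bonferroni-corrected alpha, in the spirit of the trade-off the abstract describes. The effect size, number of comparisons and target power are assumptions, not values from the paper.

```python
# Illustrative power calculation (not reproduced from the paper): how many
# subjects per group are needed to detect a medium effect when the alpha
# level is Bonferroni-corrected for multiple comparisons.
from statsmodels.stats.power import TTestIndPower

effect_size = 0.5            # assumed medium standardized effect (Cohen's d)
n_comparisons = 10           # assumed number of comparisons in the test plan
alpha = 0.05 / n_comparisons # Bonferroni-corrected per-test alpha
power = 0.8                  # assumed target statistical power

n_required = TTestIndPower().solve_power(effect_size=effect_size,
                                         alpha=alpha, power=power)
print(f"Subjects per group: {n_required:.1f}")
```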

Place, publisher, year, edition, pages
Society for Imaging Science and Technology, 2017
National Category
Computer and Information Sciences
Identifiers
urn:nbn:se:miun:diva-32869 (URN), 10.2352/ISSN.2470-1173.2017.14.HVEI-122 (DOI), 2-s2.0-85041524573 (Scopus ID)
Conference
Human Vision and Electronic Imaging 2017, HVEI 2017, San Francisco, USA, 29 January 2017 through 2 February 2017
Available from: 2018-02-22 Created: 2018-02-22 Last updated: 2018-02-22. Bibliographically approved.
Hermann, D., Djupsjöbacka, A., Andrén, B., Brunnström, K. & Rydell, N. (2017). Display panel certification system for the vehicle industry. In: Digest of Technical Papers - SID International Symposium: . Paper presented at SID Symposium, Seminar, and Exhibition 2017, Display Week 2017, Los Angeles, United States, 21 May 2017 through 26 May 2017 (pp. 471-474). , 1(1)
Display panel certification system for the vehicle industry
2017 (English). In: Digest of Technical Papers - SID International Symposium, 2017, Vol. 1, no 1, p. 471-474. Conference paper, Published paper (Refereed).
Abstract [en]

The ever-increasing need for displaying in-vehicle visual information in a non-distracting way requires a high visual performance of automotive displays. For their procurement, deep technical and supply-chain knowledge is required. Therefore, based on our comparisons of in-vehicle and laboratory visual-performance measurements, we propose a certification system for automotive display panels. 

Keywords
Car displays, Certification, Display, Measurements, Panels, Requirements, Safety, Visual performance
National Category
Communication Systems
Identifiers
urn:nbn:se:miun:diva-33505 (URN), 10.1002/sdtp.11653 (DOI), 2-s2.0-85044459309 (Scopus ID)
Conference
SID Symposium, Seminar, and Exhibition 2017, Display Week 2017, Los Angeles, United States, 21 May 2017 through 26 May 2017
Available from: 2018-04-17 Created: 2018-04-17 Last updated: 2018-04-17. Bibliographically approved.
Brunnström, K., Allison, R. S., Chandler, D. M., Colett, H., Corriveau, P., Daly, S., . . . Zhang, Y. (2017). Industry and business perspectives on the distinctions between visually lossless and lossy video quality: Mobile and large format displays. In: IS and T International Symposium on Electronic Imaging Science and Technology: . Paper presented at Human Vision and Electronic Imaging 2017, HVEI 2017, Burlingame, United States, 29 January 2017 through 2 February 2017 (pp. 118-133). Society for Imaging Science and Technology, F130042
Industry and business perspectives on the distinctions between visually lossless and lossy video quality: Mobile and large format displays
2017 (English). In: IS and T International Symposium on Electronic Imaging Science and Technology, Society for Imaging Science and Technology, 2017, Vol. F130042, p. 118-133. Conference paper, Published paper (Refereed).
Abstract [en]

This paper explores the mobile and business perspectives of visually lossless image quality, as well as reviews recent scientific advances. It is the outcome of the Special Session on Visually Lossless Video Quality for Modern Devices: Research and Industry Perspectives, organized by IS&T at Human Vision and Electronic Imaging 2017, San Francisco Airport, Burlingame, California, USA, Jan 29 - Feb 2, 2017. It summarizes four presentations and a panel discussion.

Place, publisher, year, edition, pages
Society for Imaging Science and Technology, 2017
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
urn:nbn:se:miun:diva-32866 (URN), 10.2352/ISSN.2470-1173.2017.14.HVEI-131 (DOI), 2-s2.0-85041502698 (Scopus ID)
Conference
Human Vision and Electronic Imaging 2017, HVEI 2017, Burlingame, United States, 29 January 2017 through 2 February 2017
Available from: 2018-02-20 Created: 2018-02-20 Last updated: 2018-02-22. Bibliographically approved.
Søgaard, J., Shahid, M., Pokhrel, J. & Brunnström, K. (2017). On subjective quality assessment of adaptive video streaming via crowdsourcing and laboratory based experiments. Multimedia tools and applications, 76(15), 16727-16748
On subjective quality assessment of adaptive video streaming via crowdsourcing and laboratory based experiments
2017 (English). In: Multimedia tools and applications, ISSN 1380-7501, E-ISSN 1573-7721, Vol. 76, no 15, p. 16727-16748. Article in journal (Refereed), Published.
Abstract [en]

Video streaming services are offered over the Internet and, since the service providers do not have full control over the network conditions all the way to the end user, streaming technologies have been developed to maintain the quality of service under these varying network conditions, i.e. so-called adaptive video streaming. In order to cater for users' Quality of Experience (QoE) requirements, HTTP-based adaptive streaming solutions for video services have become popular. However, the keys to ensuring users a good QoE with this technology are still not completely understood. User QoE feedback is therefore instrumental in improving this understanding. Controlled laboratory-based perceptual quality experiments that involve a panel of human viewers are considered to be the most valid method of assessing QoE. Besides laboratory-based subjective experiments, crowdsourcing-based subjective assessment of video quality is gaining popularity as an alternative method. This article presents insights into a study that investigates perceptual preferences of various adaptive video streaming scenarios through crowdsourcing-based and laboratory-based subjective assessment. The major novel contribution of this study is the application of Paired Comparison based subjective assessment in a crowdsourcing environment. The obtained results provide some novel indications, besides confirming the earlier published trends, of perceptual preferences for adaptive scenarios of video streaming. Our study suggests that in a network environment with fluctuations in the bandwidth, a medium or low video bitrate which can be kept constant is the best approach. Moreover, if there are only a few drops in bandwidth, one can choose a medium or high bitrate with a single or few buffering events.
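
As an illustration of how paired-comparison judgements of this kind can be turned into a quality scale, the sketch below fits a Bradley–Terry model to a hypothetical win-count matrix. It is a generic scaling example under invented data, not the analysis used in the article.

```python
import numpy as np

def bradley_terry(wins, n_iter=100):
    """Estimate Bradley-Terry scores from a pairwise win-count matrix.

    wins[i, j] = number of times condition i was preferred over condition j.
    Uses the classic minorize-maximize updates and returns scores normalised
    to sum to 1.
    """
    n = wins.shape[0]
    comparisons = wins + wins.T          # total comparisons between i and j
    p = np.ones(n) / n                   # start from uniform scores
    for _ in range(n_iter):
        total_wins = wins.sum(axis=1)
        denom = np.array([
            sum(comparisons[i, j] / (p[i] + p[j]) for j in range(n) if j != i)
            for i in range(n)
        ])
        p = total_wins / denom
        p /= p.sum()
    return p

# Hypothetical crowdsourcing outcome for three adaptive streaming strategies.
wins = np.array([[0, 14, 18],
                 [6,  0, 12],
                 [2,  8,  0]])
print(bradley_terry(wins))
```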

Keywords
quality, video
National Category
Communication Systems; Telecommunications
Identifiers
urn:nbn:se:miun:diva-28751 (URN), 10.1007/s11042-016-3948-3 (DOI), 000404609100030 (), 2-s2.0-84988585391 (Scopus ID)
Funder
VINNOVA
Available from: 2016-09-12 Created: 2016-09-12 Last updated: 2018-11-06. Bibliographically approved.
Engelke, U., Darcy, D., Mulliken, G., Bosse, S., Martini, M., Arndt, S., . . . Brunnström, K. (2017). Psychophysiology-based QoE Assessment: A Survey. IEEE Journal on Selected Topics in Signal Processing, 11(1), 6-21, Article ID 7569001.
Psychophysiology-based QoE Assessment: A Survey
2017 (English). In: IEEE Journal on Selected Topics in Signal Processing, ISSN 1932-4553, E-ISSN 1941-0484, Vol. 11, no 1, p. 6-21, article id 7569001. Article in journal (Refereed), Published.
Abstract [en]

We present a survey of psychophysiology-based assessment for Quality of Experience (QoE) in advanced multimedia technologies. We provide a classification of methods relevant to QoE and describe related psychological processes, experimental design considerations, and signal analysis techniques. We summarise multimodal techniques and discuss several important aspects of psychophysiology-based QoE assessment, including the synergies with psychophysical assessment and the need for standardised experimental design. This survey is not considered to be exhaustive but serves as a guideline for those interested in further exploring this emerging field of research.
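
As a small example of the kind of signal analysis techniques the survey covers, the sketch below computes RMSSD, a standard heart-rate-variability feature derived from electrocardiography and often used as a physiological indicator in QoE studies. The interval data are invented and the code is not taken from the survey.

```python
import numpy as np

def rmssd(ibi_ms):
    """Root mean square of successive differences of inter-beat intervals.

    Input: sequence of inter-beat intervals in milliseconds, e.g. extracted
    from the R-peaks of an ECG recording. Higher values indicate greater
    short-term heart-rate variability.
    """
    ibi = np.asarray(ibi_ms, dtype=float)
    return np.sqrt(np.mean(np.diff(ibi) ** 2))

# Hypothetical inter-beat intervals (ms) from one test participant.
print(rmssd([812, 790, 845, 801, 830, 795]))
```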

Keywords
Psychophysiology, quality of experience, electroencephalography, near-infrared spectroscopy, electrocardiography, electrodermal activity, eye tracking, pupillometry.
National Category
Communication Systems; Telecommunications
Identifiers
urn:nbn:se:miun:diva-28754 (URN), 10.1109/JSTSP.2016.2609843 (DOI), 000395767600002 (), 2-s2.0-85015199281 (Scopus ID), STC (Local ID), STC (Archive number), STC (OAI)
Available from: 2016-09-12 Created: 2016-09-12 Last updated: 2018-11-06. Bibliographically approved.
Identifiers
ORCID iD: orcid.org/0000-0001-5060-9402