Multi-Camera Light Field Capture: Synchronization, Calibration, Depth Uncertainty, and System Design
Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Systems and Technology (Realistic 3D). ORCID iD: 0000-0002-4967-3033
2018 (English). Licentiate thesis, comprising papers (Other academic)
Abstract [en]

The digital camera is the technological counterpart to the human eye, enabling the observation and recording of events in the natural world. Since modern life increasingly depends on digital systems, cameras, and especially multiple-camera systems, are widely used in applications that affect our society, ranging from multimedia production and surveillance to self-driving robot localization. The rising interest in multi-camera systems is mirrored by the rising activity in Light Field research, where multi-camera systems are used to capture Light Fields: the angular and spatial information about light rays within a 3D space.

The purpose of this work is to gain a more comprehensive understanding of how cameras collaborate and produce consistent data as a multi-camera system, and to build a multi-camera Light Field evaluation system. This work addresses three problems related to the process of multi-camera capture: first, whether multi-camera calibration methods can reliably estimate the true camera parameters; second, what the consequences of synchronization errors in a multi-camera system are; and third, how to ensure data consistency in a multi-camera system that records data with synchronization errors. Furthermore, this work addresses the problem of designing a flexible multi-camera system that can serve as a Light Field capture testbed.

The first problem is solved by conducting a comparative assessment of widely available multi-camera calibration methods. A special dataset is recorded, giving known constraints on camera ground-truth parameters to use as reference for calibration estimates. The second problem is addressed by introducing a depth uncertainty model that links the pinhole camera model and synchronization error to the geometric error in the 3D projections of recorded data. The third problem is solved for the color-and-depth multi-camera scenario, by using a proposed estimation of the depth camera synchronization error and correction of the recorded depth maps via tensor-based interpolation. The problem of designing a Light Field capture testbed is addressed empirically, by constructing and presenting a multi-camera system based on off-the-shelf hardware and a modular software framework.

The calibration assessment reveals that target-based and certain target-less calibration methods are relatively similar at estimating the true camera parameters. The results imply that for general-purpose multi-camera systems, target-less calibration is an acceptable choice. For high-accuracy scenarios, however, even commonly used target-based calibration approaches are insufficiently accurate. The proposed depth uncertainty model is used to show that converged multi-camera arrays are less sensitive to synchronization errors. The mean depth uncertainty of a camera system correlates with the rendered result in depth-based reprojection, as long as the camera calibration matrices are accurate. The proposed depth-map synchronization method is used to produce a consistent, synchronized color-and-depth dataset from unsynchronized recordings without altering the depth-map properties. The method therefore serves as a compatibility layer between unsynchronized multi-camera systems and applications that require synchronized color-and-depth data. Finally, the presented multi-camera system demonstrates a flexible, decentralized framework where data processing is possible in the camera, in the cloud, and on the data consumer's side. The multi-camera system is able to act as a Light Field capture testbed and as a component in Light Field communication systems, because of the general-purpose computing and network connectivity support for each sensor, small sensor size, flexible mounts, hardware and software synchronization, and a segmented software framework.
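The depth-map correction idea from the third contribution can be illustrated with a simplified sketch: once the depth camera's synchronization offset is estimated, recorded depth frames are resampled to the color camera's timestamps. The thesis uses tensor-based interpolation; the per-pixel linear interpolation below is a hypothetical, simplified stand-in:

```python
import numpy as np

def resample_depth(depth_t0, depth_t1, t0, t1, t_target):
    """Estimate a depth map at t_target by per-pixel linear interpolation
    between two recorded depth frames taken at times t0 and t1.

    Simplified stand-in for the tensor-based interpolation described above;
    assumes depth changes approximately linearly between adjacent frames.
    """
    w = (t_target - t0) / (t1 - t0)          # interpolation weight in [0, 1]
    return (1.0 - w) * depth_t0 + w * depth_t1

# Hypothetical case: the depth stream lags the color stream by 10 ms,
# with depth frames 33 ms apart (roughly 30 fps).
d0 = np.full((4, 4), 1.00)                   # depth frame at t = 0 ms (metres)
d1 = np.full((4, 4), 2.00)                   # depth frame at t = 33 ms
d_sync = resample_depth(d0, d1, 0.0, 33.0, 10.0)
```

Because the resampling only reweights existing samples, the corrected maps keep the value range and resolution of the recorded depth maps, in the spirit of the "without altering the depth-map properties" claim above.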

Place, publisher, year, edition, pages
Sundsvall, Sweden: Mid Sweden University, 2018. 64 p.
Series
Mid Sweden University licentiate thesis, ISSN 1652-8948 ; 139
Keywords [en]
Light field, Camera systems, Multiview, Synchronization, Camera calibration
HSV category
Identifiers
URN: urn:nbn:se:miun:diva-33622
ISBN: 978-91-88527-56-1 (print)
OAI: oai:DiVA.org:miun-33622
DiVA id: diva2:1205723
Presentation
2018-06-15, 13:00, L111, Holmgatan 10, Sundsvall (English)
Research funder
Knowledge Foundation, 20140200
Note

At the time of the defence the following paper was unpublished: paper 3 manuscript.

Available from: 2018-05-16. Created: 2018-05-15. Last updated: 2018-05-16. Bibliographically checked.
List of papers
1. Modeling Depth Uncertainty of Desynchronized Multi-Camera Systems
2017 (English). In: 2017 International Conference on 3D Immersion (IC3D), IEEE, 2017. Conference paper, published (peer-reviewed)
Abstract [en]

Accurately recording motion from multiple perspectives is relevant for recording and processing immersive multimedia and virtual reality content. However, synchronization errors between multiple cameras limit the precision of scene depth reconstruction and rendering. In order to quantify this limit, a relation between camera desynchronization, camera parameters, and scene element motion has to be identified. In this paper, a parametric ray model describing depth uncertainty is derived and adapted to the pinhole camera model. A two-camera scenario is simulated to investigate the model's behavior and how camera synchronization delay, scene element speed, and camera positions affect the system's depth uncertainty. Results reveal a linear relation between synchronization error, element speed, and depth uncertainty. View convergence is shown to affect mean depth uncertainty by up to a factor of 10. Results also show that depth uncertainty must be assessed on the full set of camera rays instead of a central subset.
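The reported linear relation can be illustrated with a two-camera (stereo) special case of the pinhole model. The first-order derivation below is a simplification of the paper's full ray model, and the baseline, depth, speed, and delay values are hypothetical:

```python
def depth_uncertainty(Z, v, dt, B):
    """First-order depth error for a stereo pair when one camera samples
    dt seconds late and the observed scene point moves laterally at speed v.

    Pinhole stereo gives Z = f * B / d, so |dZ| ~ (Z**2 / (f * B)) * |dd|.
    The delayed view sees the point shifted by v * dt, i.e. a disparity
    error dd = f * v * dt / Z, which yields |dZ| ~ Z * v * dt / B:
    linear in both the synchronization error dt and the element speed v.
    """
    return Z * v * dt / B

# Hypothetical numbers: point at 5 m, moving 2 m/s, 10 ms delay, 0.5 m baseline.
err = depth_uncertainty(Z=5.0, v=2.0, dt=0.010, B=0.5)  # 0.2 m
```

Doubling either `dt` or `v` doubles `err`, matching the linear relation stated in the abstract; the dependence on camera placement (here only the baseline `B`) is what the paper's convergence analysis generalizes.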

Place, publisher, year, edition, pages
IEEE, 2017
Keywords
Camera synchronization, Synchronization error, Depth estimation error, Multi-camera system
HSV category
Identifiers
URN: urn:nbn:se:miun:diva-31841
DOI: 10.1109/IC3D.2017.8251891
000427148600001
Scopus ID: 2-s2.0-85049401578
ISBN: 978-1-5386-4655-7
Conference
2017 International Conference on 3D Immersion (IC3D 2017), Brussels, Belgium, 11th-12th December 2017
Projects
LIFE project
Research funder
Knowledge Foundation, 20140200
Available from: 2017-10-13. Created: 2017-10-13. Last updated: 2019-03-22.
2. Assessment of Multi-Camera Calibration Algorithms for Two-Dimensional Camera Arrays Relative to Ground Truth Position and Direction
2016 (English). In: 3DTV-Conference, IEEE Computer Society, 2016, article id 7548887. Conference paper, published (peer-reviewed)
Abstract [en]

Camera calibration methods are commonly evaluated on cumulative reprojection error metrics, on disparate one-dimensional datasets. To evaluate calibration of cameras in two-dimensional arrays, assessments need to be made on two-dimensional datasets with constraints on camera parameters. In this study, the accuracy of several multi-camera calibration methods has been evaluated on the camera parameters that affect view projection the most. As input data, we used a 15-viewpoint two-dimensional dataset with intrinsic and extrinsic parameter constraints and extrinsic ground truth. The assessment showed that self-calibration methods using structure-from-motion reach intrinsic and extrinsic parameter estimation accuracy equal to that of a standard checkerboard calibration algorithm, and surpass a well-known self-calibration toolbox, BlueCCal. These results show that self-calibration is a viable approach to calibrating two-dimensional camera arrays, but improvements to state-of-the-art multi-camera feature matching are necessary to make BlueCCal as accurate as other self-calibration methods for two-dimensional camera arrays.
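Comparing estimated extrinsics against ground-truth position and direction can be sketched as follows; the function name and the relative-rotation angle metric are illustrative choices, not the paper's exact evaluation protocol:

```python
import numpy as np

def extrinsic_errors(R_est, t_est, R_gt, t_gt):
    """Position and orientation error of one estimated camera pose
    relative to ground truth.

    Position error: distance between camera centres, C = -R^T t.
    Orientation error: rotation angle of R_gt^T R_est, in degrees.
    """
    C_est = -R_est.T @ t_est
    C_gt = -R_gt.T @ t_gt
    pos_err = float(np.linalg.norm(C_est - C_gt))
    cos_a = (np.trace(R_gt.T @ R_est) - 1.0) / 2.0
    ang_err = float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))
    return pos_err, ang_err

# Hypothetical check: identical rotations, camera centres 0.1 units apart.
I = np.eye(3)
pos, ang = extrinsic_errors(I, np.array([0.1, 0.0, 0.0]), I, np.zeros(3))
```

Evaluating pose errors directly against ground truth, rather than through cumulative reprojection error alone, is what allows the per-parameter comparison described in the abstract.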

Place, publisher, year, edition, pages
IEEE Computer Society, 2016
Keywords
Camera calibration, multi-view image dataset, 2D camera array, self-calibration, calibration assessment
HSV category
Identifiers
URN: urn:nbn:se:miun:diva-27960
DOI: 10.1109/3DTV.2016.7548887
000390840500006
Scopus ID: 2-s2.0-84987849952
Local ID: STC
ISBN: 978-1-5090-3313-3
Archive number: STC
OAI: STC
Conference
2016 3DTV-Conference: The True Vision - Capture, Transmission and Display of 3D Video, 3DTV-CON 2016; Hamburg, Germany; 4 July 2016 through 6 July 2016; Category number CFP1655B-ART; Code 123582
Research funder
Knowledge Foundation, 20140200
Available from: 2016-06-17. Created: 2016-06-16. Last updated: 2018-05-15. Bibliographically checked.
3. Estimation and Post-Capture Compensation of Synchronization Error in Unsynchronized Multi-Camera Systems
(English). Manuscript (preprint) (Other academic)
HSV category
Identifiers
URN: urn:nbn:se:miun:diva-33621
Available from: 2018-05-15. Created: 2018-05-15. Last updated: 2018-05-16. Bibliographically checked.
4. LIFE: A Flexible Testbed For Light Field Evaluation
2018 (English). Conference paper, published (peer-reviewed)
Abstract [en]

Recording and imaging the 3D world has led to the use of light fields. Capturing, distributing, and presenting light field data is challenging, and requires an evaluation platform. We define a framework for real-time processing, and present the design and implementation of a light field evaluation system. In order to serve as a testbed, the system is designed to be flexible, scalable, and able to model various end-to-end light field systems. This flexibility is achieved by encapsulating processes and devices in discrete framework systems. The modular capture system supports multiple camera types, general-purpose data processing, and streaming to network interfaces. The cloud system allows for parallel transcoding and distribution of streams. The presentation system encapsulates rendering and display specifics. The real-time ability was tested in a latency measurement; the capture and presentation systems process and stream frames within a 40 ms limit.
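The encapsulation into discrete framework systems can be sketched as queue-connected processing stages. The class and stage names below are hypothetical, and the in-process queues stand in for the network links of the actual system:

```python
import queue
import time

class Stage:
    """One framework subsystem (capture, cloud, or presentation) that pulls
    frames from an input queue, applies its processing step, and pushes
    the result downstream."""
    def __init__(self, name, process, inbox, outbox):
        self.name, self.process = name, process
        self.inbox, self.outbox = inbox, outbox

    def step(self):
        frame = self.inbox.get_nowait()
        self.outbox.put(self.process(frame))

# Wire capture -> cloud -> presentation; queues stand in for network links.
q_cap, q_cloud, q_out = queue.Queue(), queue.Queue(), queue.Queue()
capture = Stage("capture", lambda f: {**f, "t_capture": time.monotonic()},
                q_cap, q_cloud)
cloud = Stage("cloud", lambda f: {**f, "transcoded": True}, q_cloud, q_out)

q_cap.put({"frame_id": 0})
capture.step()
cloud.step()
frame = q_out.get()
latency = time.monotonic() - frame["t_capture"]  # compare against a 40 ms budget
```

Because each stage only touches its own queues, a stage can be replaced (a different camera type, a different renderer) without changing the others, which is the flexibility argument made in the abstract.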

Keywords
Multiview, 3DTV, Light field, Distributed surveillance, 360 video
HSV category
Identifiers
URN: urn:nbn:se:miun:diva-33620
000454903900016
Scopus ID: 2-s2.0-85056147245
ISBN: 978-1-5386-6125-3
Conference
2018 3DTV Conference: The True Vision - Capture, Transmission and Display of 3D Video (3DTV-CON), Stockholm – Helsinki – Stockholm, 3-5 June 2018
Projects
LIFE Project
Research funder
Knowledge Foundation, 20140200
Available from: 2018-05-15. Created: 2018-05-15. Last updated: 2019-02-15. Bibliographically checked.

Open Access in DiVA

MultiCameraLightFieldCapture (5752 kB), 327 downloads
File information
File: FULLTEXT01.pdf. File size: 5752 kB. Checksum: SHA-512
fb55ccd2bb17ac5e74daa48707c80c98e588a1c51d42c9f978630b513f3ed32dfc9eb0532d4fcd24de58129ed62e97e556a0a9d61553bbd58c60222135029445
Type: fulltext. Mimetype: application/pdf

Author: Dima, Elijs

