Thörnberg, Benny
Publications (10 of 59)
Nyström, J., Gradin, P. & Thörnberg, B. (2018). An experimental study of the chipping process with focus on energy consumption and chipping angles. Nordic Pulp & Paper Research Journal, 33(3), 460-467
An experimental study of the chipping process with focus on energy consumption and chipping angles
2018 (English). In: Nordic Pulp & Paper Research Journal, ISSN 0283-2631, E-ISSN 2000-0669, Vol. 33, no. 3, p. 460-467. Article in journal (Refereed). Published.
Abstract [en]

A series of chipping experiments was performed under both dynamic and quasi-static conditions, in a laboratory wood chipper (dynamic) and in an MTS servo-hydraulic testing machine (quasi-static). One aim of the experiments was to investigate the rate dependency of the energy consumption during chipping. Another aim was to try to determine the load per unit knife edge length required to initiate cutting. The experiments were carried out using different combinations of spout and edge angles. It was found that for large edge angles (keeping the spout angle constant at 30°) there was a slight rate dependency, such that the energy consumption was larger at higher cutting rates, which is quite the opposite of what is expected if wood is assumed to be a viscoelastic material. It was also found that determining the force at the initiation of cutting is not a trivial task. Both Acoustic Emission monitoring and visual inspection were used to this end. The wood species used in this study was pine (Pinus sylvestris).
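
As a minimal illustration of the kind of quantity studied here, the sketch below estimates the cutting energy by integrating a recorded force-displacement curve. The triangular force pulse and all numbers are synthetic assumptions for demonstration only, not data from the paper.

```python
# Hypothetical sketch: estimating chipping energy from a force-displacement
# record by trapezoidal integration. Synthetic data, illustrative only.
import numpy as np

def chipping_energy(displacement_m, force_n):
    """Integrate force over displacement (trapezoidal rule) -> energy in J."""
    return float(np.sum(0.5 * (force_n[1:] + force_n[:-1]) * np.diff(displacement_m)))

# Example: a triangular force pulse peaking at 2 kN over 5 mm of knife travel.
x = np.linspace(0.0, 0.005, 200)                  # displacement [m]
f = 2000.0 * np.minimum(x, 0.005 - x) / 0.0025    # force [N]
print(f"Estimated cutting energy: {chipping_energy(x, f):.2f} J")
```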

Keywords
energy consumption, rate dependency, wood chipping
National Category
Chemical Engineering
Identifiers
urn:nbn:se:miun:diva-34599 (URN), 10.1515/npprj-2018-3055 (DOI), 2-s2.0-85052632762 (Scopus ID)
Available from: 2018-10-03. Created: 2018-10-03. Last updated: 2018-10-03. Bibliographically approved.
Rydblom, S., Thörnberg, B. & Olsson, E. (2018). Field Study of LWC and MVD Using the Droplet Imaging Instrument. IEEE Transactions on Instrumentation and Measurement
Field Study of LWC and MVD Using the Droplet Imaging Instrument
2018 (English). In: IEEE Transactions on Instrumentation and Measurement, ISSN 0018-9456, E-ISSN 1557-9662. Article in journal (Refereed). Epub ahead of print.
Abstract [en]

The droplet imaging instrument (DII) is a new instrument for cost-effective in situ measurements of the size and concentration of water droplets. The droplet size distribution and the concentration of atmospheric liquid water are important for the prediction of icing on structures such as wind turbines. To improve the predictions of icing, there is a need to explore cost-effective working solutions. Through imaging, a wide range of droplet sizes can be measured. This paper describes a study of the atmospheric liquid water content and the median volume diameter using the DII and a commercial reference instrument, the Cloud Droplet Probe 2 from Droplet Measurement Technologies Inc. The measurements were made at a weather measurement station in mid-Sweden. For a second validation, the results are compared with predictions from a numerical weather prediction model. The size measurement of the DII is verified using polymer microspheres of four known size distributions. The study shows that the DII measurement is precise, but there is a systematic difference between the two compared instruments. It also shows that droplets larger than 50 μm in diameter are occasionally measured, which we believe is important for the prediction of icing.
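
A hedged sketch of how a systematic difference between two co-located instruments could be quantified from paired samples. The variable names and synthetic numbers are assumptions for illustration, not data or results from the study.

```python
# Illustrative comparison of paired LWC samples from two instruments:
# mean bias (systematic difference), spread of the differences, correlation.
import numpy as np

rng = np.random.default_rng(0)
lwc_reference = rng.uniform(0.05, 0.4, size=500)            # g/m^3, reference probe
lwc_dii = 0.9 * lwc_reference + rng.normal(0.0, 0.01, 500)   # hypothetical DII readings

diff = lwc_dii - lwc_reference
print(f"mean bias   : {diff.mean():+.3f} g/m^3")
print(f"std of diff : {diff.std(ddof=1):.3f} g/m^3")
print(f"correlation : {np.corrcoef(lwc_dii, lwc_reference)[0, 1]:.3f}")
```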

Place, publisher, year, edition, pages
IEEE, 2018
National Category
Other Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
urn:nbn:se:miun:diva-34304 (URN), 10.1109/TIM.2018.2843599 (DOI)
Available from: 2018-08-28. Created: 2018-08-28. Last updated: 2018-09-20. Bibliographically approved.
Fedorov, I., Lawal, N., Thörnberg, B., Alqaysi, H. & O'Nils, M. (2018). Towards calibration of outdoor multi-camera visual monitoring system. Paper presented at ICDSC'18, the 12th International Conference on Distributed Smart Cameras. New York, NY, US: ACM Digital Library
Towards calibration of outdoor multi-camera visual monitoring system
2018 (English). Conference paper, Published paper (Refereed).
Abstract [en]

This paper proposes a method for calibrating multi-camera systems where no natural reference points exist in the surrounding environment. Monitoring the air space at wind farms is our test case. The goal is to monitor the trajectories of flying birds to prevent them from colliding with rotor blades. Our camera calibration method is based on the observation of a portable artificial reference marker made of a pulsed light source and a navigation satellite sensor module. The reference marker can determine and communicate its position in the world coordinate system at centimeter precision using navigation sensors. Our results showed that simultaneous detection of the same marker in several cameras with overlapping fields of view allowed us to determine the marker's position in 3D world coordinate space with an accuracy of 3-4 cm. The experiments were performed in the volume around a wind turbine, at distances from the cameras to the marker within a range of 70 to 90 m.
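
One building block such a system typically relies on is triangulation of the marker's 3D position from its pixel coordinates in several calibrated cameras. The sketch below uses standard linear (DLT) triangulation; the projection matrices are assumed known from calibration, and this is not the authors' exact implementation.

```python
# Hedged sketch: linear (DLT) triangulation of one marker observed in
# several calibrated cameras with overlapping fields of view.
import numpy as np

def triangulate(projections, pixels):
    """projections: list of 3x4 camera matrices P_i.
    pixels: list of (u, v) detections of the same marker, one per camera.
    Returns the estimated 3D world point (needs at least two cameras)."""
    rows = []
    for P, (u, v) in zip(projections, pixels):
        rows.append(u * P[2] - P[0])   # two linear constraints per view
        rows.append(v * P[2] - P[1])
    A = np.vstack(rows)
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]                          # homogeneous solution
    return X[:3] / X[3]                 # Euclidean world coordinates
```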

Place, publisher, year, edition, pages
New York, NY, US: ACM Digital Library, 2018. p. 6
National Category
Other Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
urn:nbn:se:miun:diva-34643 (URN), 10.1145/3243394.3243695 (DOI), 978-1-4503-6511-6 (ISBN)
Conference
ICDSC'18 Proceedings of the 12th International Conference on Distributed Smart Cameras
Available from: 2018-10-05. Created: 2018-10-05. Last updated: 2018-10-09. Bibliographically approved.
Amir, Y. M. & Thörnberg, B. (2017). High Precision Laser Scanning of Metallic Surfaces. International Journal of Optics, Article ID 4134205.
High Precision Laser Scanning of Metallic Surfaces
2017 (English). In: International Journal of Optics, ISSN 1687-9384, E-ISSN 1687-9392, article id 4134205. Article in journal (Refereed). Published.
Abstract [en]

Speckle noise, the dynamic range of light intensity, and spurious reflections are major challenges when laser scanners are used for 3D surface acquisition. In this work, a series of image processing operations, that is, Spatial Compound Imaging, High Dynamic Range Extension, Gray Level Transformation, and Most Similar Nearest Neighbor, is proposed to overcome the challenges coming from the target surface. A prototype scanner for metallic surfaces is designed to explore combinations of these image processing operations. The main goal is to find the combination of operations that will lead to the highest possible robustness and measurement precision at the lowest possible computational load. Inspection of metallic tools, where the edge surface must be measured at micrometer precision, is our test case. The precision of heights measured without the proposed image processing is first analyzed to be ±7.6 μm at a 68% confidence level. The best achieved height precision was ±4.2 μm. This improvement comes at 24 times longer processing time and five times longer scanning time. Dynamic range extension of the image capture improves robustness, since the number of saturated or underexposed pixels is substantially reduced. Using a high dynamic range (HDR) camera offers a compromise between processing time, robustness, and precision.
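
A precision figure quoted as ± a value at 68% confidence corresponds to one standard deviation of repeated measurements. The short sketch below shows how such a figure could be computed; the repeated height readings are synthetic, not the paper's data.

```python
# Hedged sketch: +/- precision at 68 % confidence = one standard deviation
# of repeated height measurements of the same surface point.
import numpy as np

rng = np.random.default_rng(1)
heights_um = 1000.0 + rng.normal(0.0, 4.2, size=200)   # repeated readings [um]

sigma = heights_um.std(ddof=1)
print(f"height precision: +/- {sigma:.1f} um at 68 % confidence")
```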

National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
urn:nbn:se:miun:diva-31363 (URN), 10.1155/2017/4134205 (DOI), 000405332500001 (), 2-s2.0-85024488897 (Scopus ID)
Available from: 2017-08-10. Created: 2017-08-10. Last updated: 2017-11-29. Bibliographically approved.
Rydblom, S. & Thörnberg, B. (2016). Droplet Imaging Instrument - Metrology Instrument for Icing Condition Detection. In: 2016 IEEE International Conference on Imaging Systems and Techniques (IST). Paper presented at the IEEE International Conference on Imaging Systems and Techniques (IST) / IEEE International School on Imaging, Oct 4-6, 2016, Chania, Greece (pp. 66-71). IEEE, Article ID 7738200.
Droplet Imaging Instrument - Metrology Instrument for Icing Condition Detection
2016 (English). In: 2016 IEEE International Conference on Imaging Systems and Techniques (IST), IEEE, 2016, p. 66-71, article id 7738200. Conference paper, Published paper (Refereed).
Abstract [en]

An instrument for measuring water droplets is described and constructed. It is designed to measure the volume concentration and the size distribution of droplets in order to detect icing conditions in natural fog. The instrument works by shadowgraph imaging, with a collimated blue LED as background illumination. We show how to use a reference object to obtain a calibration of the droplet size and the measurement volume. These properties are derived from a measurement of the object's shadow intensity and the second derivative of its edge. From the size of every measured droplet and its expected detection volume, the liquid water content (LWC) and the median volume diameter (MVD) can be estimated. The instrument can be used for continuous measurement at a remote, weather-exposed location and is tested in a small environment chamber. We also describe this chamber and how we change the LWC using an ultrasonic fog generator and a fan.
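
As a rough illustration of shadowgraph sizing, the sketch below estimates a droplet's diameter from its shadow in a background-illuminated image using an intensity threshold and the equivalent-area diameter. The threshold, pixel scale, and single-droplet assumption are mine, not the instrument's actual calibration procedure.

```python
# Hedged sketch: equivalent-area diameter of a single droplet shadow.
import numpy as np

def shadow_diameter_um(image, background, pixel_size_um, threshold=0.5):
    """Pixels darker than `threshold` * background count as shadow;
    return the diameter of a circle with the same area, in micrometers."""
    shadow = image < threshold * background
    area_um2 = shadow.sum() * pixel_size_um ** 2
    return 2.0 * np.sqrt(area_um2 / np.pi)
```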

Place, publisher, year, edition, pages
IEEE, 2016
Series
IEEE International Conference on Imaging Systems and Techniques, ISSN 2471-6162
Keywords
atmospheric measurements, fog chamber, image analysis, liquid water content, machine vision
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
urn:nbn:se:miun:diva-29765 (URN), 10.1109/IST.2016.7738200 (DOI), 000388735200012 (), 2-s2.0-85004010273 (Scopus ID), STC (Local ID), 978-1-5090-1817-8 (ISBN), STC (Archive number), STC (OAI)
Conference
IEEE International Conference on Imaging Systems and Techniques (IST) / IEEE International School on Imaging, Oct 4-6, 2016, Chania, Greece
Available from: 2016-12-22. Created: 2016-12-22. Last updated: 2017-10-12. Bibliographically approved.
Rydblom, S. & Thörnberg, B. (2016). Liquid Water Content and Droplet Sizing Shadowgraph Measuring System for Wind Turbine Icing Detection. IEEE Sensors Journal, 16(8), 2714-2725, Article ID 7384444.
Liquid Water Content and Droplet Sizing Shadowgraph Measuring System for Wind Turbine Icing Detection
2016 (English). In: IEEE Sensors Journal, ISSN 1530-437X, E-ISSN 1558-1748, Vol. 16, no. 8, p. 2714-2725, article id 7384444. Article in journal (Refereed). Published.
Abstract [en]

This study shows that the liquid water content (LWC) and the median volume diameter (MVD) can be derived from images of water droplets using a shadowgraph imaging system with incoherent LED illumination.

Icing on structures such as a wind turbine is the result of a combination of LWC and MVD and other parameters like temperature, humidity and wind speed. Today, LWC and MVD are not commonly measured for wind turbines. Systems for measuring these properties are often expensive or impractical in terms of location or remote reading. The aim of this study is to gain knowledge about how to design a single instrument based on imaging that has the ability to measure these properties with enough precision and accuracy to detect icing conditions for wind turbines.

A method to calculate both the LWC and the MVD from the same images is described in this paper. The size of one droplet is determined by measuring the shadow created by the droplet in background illumination. The concentration is calculated by counting the measured droplets and estimating the volumes in which these droplets can be observed.

In the described study, the observation volume is shown to be dependent on the particle size and the signal-to-noise ratio (SNR) of each measured particle. The expected coefficient of variation of the LWC, depending on droplet size, is shown to be 2.4 percent for droplets 10 µm in diameter and 1.6 percent for 25 µm droplets. This is based on an error estimation of the laboratory measurements, calibrated using a micrometer dot scale.
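
The two derived quantities can be written down compactly: each droplet i with diameter d_i and size-dependent detection volume V_i contributes (π/6)·d_i³·ρ_water / V_i to the LWC, and the MVD is the diameter below which half of the total liquid water volume lies. The sketch below implements exactly that; the variable names are illustrative assumptions, not the paper's code.

```python
# Hedged sketch: LWC and MVD from per-droplet diameters and detection volumes.
import numpy as np

RHO_WATER = 1e6  # density of water in g/m^3

def lwc_and_mvd(diameters_m, detection_volumes_m3):
    d = np.asarray(diameters_m, dtype=float)
    v_det = np.asarray(detection_volumes_m3, dtype=float)
    drop_volumes = np.pi / 6.0 * d ** 3              # water volume per droplet [m^3]
    contrib = drop_volumes / v_det                    # volume concentration per droplet
    lwc = RHO_WATER * contrib.sum()                   # g/m^3

    # MVD: diameter at which the cumulative water volume reaches 50 %.
    order = np.argsort(d)
    cum = np.cumsum(contrib[order])
    mvd = d[order][np.searchsorted(cum, 0.5 * cum[-1])]
    return lwc, mvd
```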

Place, publisher, year, edition, pages
IEEE Sensors Council, 2016
Keywords
LWC, MVD, Icing, Clouds, Image processing, Machine vision, Meteorology, Optical microscopy, Wind power generation
National Category
Other Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
urn:nbn:se:miun:diva-27321 (URN), 10.1109/JSEN.2016.2518653 (DOI), 000372419100061 (), 2-s2.0-84962128668 (Scopus ID), STC (Local ID), STC (Archive number), STC (OAI)
Funder
Swedish Energy Agency
Available from: 2016-03-22. Created: 2016-03-22. Last updated: 2017-10-12. Bibliographically approved.
Imran, M., O'Nils, M., Munir, H. & Thörnberg, B. (2015). Low complexity FPGA based background subtraction technique for thermal imagery. In: ACM International Conference Proceeding Series. Paper presented at the 9th International Conference on Distributed Smart Cameras, ICDSC 2015; Seville; Spain; 8 September 2015 through 11 September 2015; Code 117454 (pp. 1-6). Association for Computing Machinery (ACM)
Low complexity FPGA based background subtraction technique for thermal imagery
2015 (English). In: ACM International Conference Proceeding Series, Association for Computing Machinery (ACM), 2015, p. 1-6. Conference paper, Published paper (Refereed).
Abstract [en]

Embedded smart camera systems are gaining popularity for a number of real-world surveillance applications. However, there are still challenges, e.g. variations in illumination, shadows, occlusion and weather conditions, when employing vision algorithms in outdoor environments. For safety-critical surveillance applications, the visual sensors can be complemented with beyond-visual-range sensors. This in turn requires analysis, development and modification of existing imaging techniques. In this work, a low-complexity background modelling and subtraction technique is proposed for thermal imagery. The proposed technique has been implemented on Field Programmable Gate Arrays (FPGAs) after in-depth analysis of different sets of images characterized by poor signal-to-noise ratio, e.g. motion of high-frequency background objects, temperature variation and camera jitter. The proposed technique dynamically updates the background at pixel level and requires storage of only a single frame, as opposed to existing techniques. A comparison of this approach with two other approaches shows that it performs better under different environmental conditions. The proposed technique has been modelled in Register Transfer Logic (RTL), and implementation on the latest FPGAs shows that the design requires less than 1 percent of the logic resources and 47 percent of the block RAMs, and consumes 91 mW on an Artix-7 100T FPGA.
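
For orientation, the sketch below shows one simple way to realize a per-pixel, single-frame background model: an exponential update with a fixed foreground threshold. The update rule and threshold are my assumptions in the same spirit as the description above, not the authors' exact hardware algorithm.

```python
# Hedged sketch: per-pixel background model with a single stored frame.
import numpy as np

def update_and_subtract(frame, background, alpha=0.05, threshold=10.0):
    """Return (foreground_mask, new_background) for one thermal frame."""
    frame = frame.astype(np.float32)
    foreground = np.abs(frame - background) > threshold
    # Update the background only where no foreground was detected, so that
    # moving warm objects do not leak into the model.
    new_bg = np.where(foreground, background,
                      (1.0 - alpha) * background + alpha * frame)
    return foreground, new_bg.astype(np.float32)
```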

Place, publisher, year, edition, pages
Association for Computing Machinery (ACM), 2015
Keywords
Background modelling, subtraction, FPGA, architecture, smart camera, thermal imaging
National Category
Embedded Systems
Identifiers
urn:nbn:se:miun:diva-25997 (URN), 10.1145/2789116.2789121 (DOI), 2-s2.0-84958251961 (Scopus ID), STC (Local ID), 978-145033681-9 (ISBN), STC (Archive number), STC (OAI)
Conference
9th International Conference on Distributed Smart Cameras, ICDSC 2015; Seville; Spain; 8 September 2015 through 11 September 2015; Code 117454
Available from: 2015-09-28. Created: 2015-09-28. Last updated: 2016-12-23. Bibliographically approved.
Malik, A. W., Thörnberg, B., Anwar, Q., Johansen, T. A. & Shahzad, K. (2015). Real Time Decoding of Color Symbol for Optical Positioning System. International Journal of Advanced Robotic Systems, 12(5)
Real Time Decoding of Color Symbol for Optical Positioning System
2015 (English). In: International Journal of Advanced Robotic Systems, ISSN 1729-8806, E-ISSN 1729-8814, Vol. 12, no. 5. Article in journal (Refereed). Published.
Abstract [en]

This paper presents the design and real-time decoding of a color symbol that can be used as a reference marker for optical navigation. The designed symbol has a circular shape and is printed on paper using two distinct colors. This pair of colors is selected based on the highest achievable signal-to-noise ratio. The symbol is designed to carry eight bits of information. Real-time decoding of this symbol is performed using a heterogeneous combination of a Field Programmable Gate Array (FPGA) and a microcontroller. An image sensor with a resolution of 1600 by 1200 pixels is used to capture images of symbols against complex backgrounds. Dynamic image segmentation, component labeling and feature extraction were performed on the FPGA. The region of interest was further computed from the extracted features. Feature data belonging to the symbol were sent from the FPGA to the microcontroller. Image processing tasks are partitioned between the FPGA and the microcontroller based on data intensity. Experiments were performed to verify the rotational independence of the symbols. The maximum distance between camera and symbol allowing correct detection and decoding was analyzed. Experiments were also performed to analyze the number of generated image components and the sub-pixel precision versus different light sources and intensities. The proposed hardware architecture can process up to 55 frames per second for accurate detection and decoding of symbols at two-megapixel resolution. The power consumption of the complete system is 342 mW.
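
The sketch below mirrors the software stages named above (segmentation, component labeling, feature extraction) on a host PC using scipy, purely to illustrate the pipeline structure; it is not the FPGA implementation, and the threshold and minimum area are assumed values.

```python
# Hedged sketch: segmentation, connected-component labeling and per-blob
# feature extraction, analogous to the stages partitioned onto the FPGA.
import numpy as np
from scipy import ndimage

def extract_candidate_regions(gray, threshold=128, min_area=50):
    """Return (centroid_xy, area) for each bright blob large enough to be a symbol."""
    mask = gray > threshold                  # dynamic segmentation stand-in
    labels, n = ndimage.label(mask)          # component labeling
    features = []
    for i in range(1, n + 1):
        blob = labels == i
        area = int(blob.sum())
        if area >= min_area:                 # reject small noise components
            cy, cx = ndimage.center_of_mass(blob)
            features.append(((cx, cy), area))
    return features
```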

Keywords
Indoor navigation, Reference symbol, Robotic vision
National Category
Robotics
Identifiers
urn:nbn:se:miun:diva-23168 (URN), 10.5772/59680 (DOI), 000350647600001 (), 2-s2.0-84923346270 (Scopus ID), STC (Local ID), STC (Archive number), STC (OAI)
Funder
Knowledge Foundation
Available from: 2014-10-08. Created: 2014-10-08. Last updated: 2017-10-27. Bibliographically approved.
Jonsson, P., Thörnberg, B., Dobslaw, F. & Vaa, T. (2015). Road Condition Imaging: Model Development. Paper presented at the Transportation Research Board 2015 Annual Meeting.
Road Condition Imaging: Model Development
2015 (English). Conference paper, Published paper (Refereed).
Abstract [en]

It is important to classify road conditions in order to plan winter road maintenance, carry out proper actions and issue warnings to road users. Existing sensor systems only cover parts of the road surface, and manual observations can vary depending on who classifies them. One challenge is to classify road conditions with automatic monitoring systems. To address that challenge, an innovative and cost-effective road condition imaging system, capable of classifying individual pixels of an image as dry, wet, icy or snowy, is evaluated. This paper presents a model based on data from winter 2013-2014, retrieved from two installations in Sweden and Norway. The system uses a near-infrared image detector and optical wavelength filters. By combining data from images taken through different wavelength filters, it is possible to determine the road status using multiclass classifiers. One classifier was developed for each road condition, which implies that a pixel can be classified as two or more road conditions at the same time. This multiclass problem is solved by developing a Bayesian network that uses road weather information system data to calculate the probabilities of the different road conditions.
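
To make the last step concrete, the sketch below resolves overlapping per-condition classifier outputs by weighting them with priors derived from road weather data and normalizing, in the spirit of the Bayesian combination described. The network structure, scores and priors are all assumptions for illustration.

```python
# Hedged sketch: resolving a pixel assigned to several road conditions by
# combining per-condition classifier scores with road-weather priors.
import numpy as np

CONDITIONS = ["dry", "wet", "icy", "snowy"]

def resolve_pixel(classifier_scores, weather_priors):
    """Both arguments: dict condition -> probability. Returns (label, posterior)."""
    posterior = np.array([classifier_scores[c] * weather_priors[c] for c in CONDITIONS])
    posterior /= posterior.sum()
    return CONDITIONS[int(posterior.argmax())], posterior

# Example: the "wet" and "icy" classifiers both fire, but sub-zero
# temperatures reported by the weather station favour "icy".
scores = {"dry": 0.1, "wet": 0.7, "icy": 0.8, "snowy": 0.2}
priors = {"dry": 0.2, "wet": 0.1, "icy": 0.5, "snowy": 0.2}
print(resolve_pixel(scores, priors))
```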

Keywords
Road condition, Near Infra Red, classification, remote sensing, Bayesian Networks
National Category
Other Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
urn:nbn:se:miun:diva-24250 (URN), STC (Local ID), STC (Archive number), STC (OAI)
Conference
Transportation Research Board 2015 Annual Meeting
Note

Paper number: 15-0885

Presented at the conference in Washington.

Available from: 2015-01-29. Created: 2015-01-29. Last updated: 2016-12-23. Bibliographically approved.
Jonsson, P., Thörnberg, B. & Casselgren, J. (2015). Road surface status classification using spectral analysis of NIR camera images. IEEE Sensors Journal, 15(3), 1641-1656
Road surface status classification using spectral analysis of NIR camera images
2015 (English). In: IEEE Sensors Journal, ISSN 1530-437X, E-ISSN 1558-1748, Vol. 15, no. 3, p. 1641-1656. Article in journal (Refereed). Published.
Abstract [en]

Considering the vast number of weather-related accidents that occur every winter, there is a need for an automated road status classification system. Previous research has shown that it is possible to detect hazardous road conditions, including, for example, icy pavements, using single-point infrared illumination and infrared detectors. In this paper, we extend this research to camera surveillance of a road section, allowing for classification of area segments of weather-related road surface conditions such as wet, snow-covered or icy. Infrared images have been obtained using an infrared camera equipped with a set of optical wavelength filters. The images have primarily been used to develop multivariate data models and also for the classification of road conditions in each pixel. This system is a vast improvement on existing single-spot road status classification systems. The resulting imaging system can reliably distinguish between dry, wet, icy or snow-covered sections of road surfaces.
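
The per-pixel, multi-band structure of the problem can be illustrated with a minimal classifier: stack the filter images into a feature vector per pixel and assign each pixel to the nearest class spectrum. The paper builds multivariate data models; this nearest-class-mean rule is only an assumed stand-in to show the data layout.

```python
# Hedged sketch: per-pixel classification from a stack of NIR filter images
# using a nearest-class-mean rule (illustrative substitute for the paper's
# multivariate models).
import numpy as np

def classify_pixels(band_stack, class_means):
    """band_stack: (H, W, B) images through B wavelength filters.
    class_means: (C, B) mean spectrum per road condition class.
    Returns an (H, W) map of class indices."""
    h, w, b = band_stack.shape
    pixels = band_stack.reshape(-1, b)                          # (H*W, B)
    dist = np.linalg.norm(pixels[:, None, :] - class_means[None], axis=2)
    return dist.argmin(axis=1).reshape(h, w)
```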

Keywords
Remote sensing, Infrared imaging, Spectral analysis, Image classification
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
urn:nbn:se:miun:diva-24249 (URN), 10.1109/JSEN.2014.2364854 (DOI), 000348858300008 (), 2-s2.0-84921047416 (Scopus ID), STC (Local ID), STC (Archive number), STC (OAI)
Available from: 2015-01-29. Created: 2015-01-29. Last updated: 2017-12-05. Bibliographically approved.