\documentclass[10pt,notitlepage]{report}

\usepackage[utf8]{inputenc}
\usepackage[T1]{fontenc}
\usepackage{url}
\usepackage{graphicx}
\usepackage{fullpage}

% \renewcommand{\chaptermark}[1]{\markboth{#1}{}}
% \renewcommand{\sectionmark}[1]{\markright{\thesection\ #1}}

\title{The SeeMa Lab Structured Light Scanner}
\author{Jakob Wilm and Eyþór Rúnar Eiríksson\\
	\url{{jakw,eruei}@dtu.dk}}
\date{\today}

\begin{document}

\maketitle

\begin{figure}
	\centering
	\includegraphics[width=.9\textwidth]{mesh0.png}
	\label{fig:mesh0}
\end{figure}

\begin{abstract}
This is the official manual for the Seeing Machines Lab Structured Light Scanner -- SeeMa-Scanner for short. The scanner consists of both hardware components (the physical device, including cameras, projector, and rotation stage) and the software GUI needed to perform object surface digitizations in full color with high precision. While most of these components should be self-explanatory, we describe the functional principles of the scanner and give a brief introduction to how to get from a physical object to a complete digital meshed model of it. This document also describes the software components involved, making it possible for students and staff to implement their own scan software and to extend the existing software.
\end{abstract}

\chapter{The scanner}
\section{Getting started}
Welcome to the SeeMaLab 3D scanner documentation. This document describes the main hardware and software parts of the system and provides short directions for performing scans and reconstructing surfaces. Please be careful with this expensive equipment, and be considerate: do not misplace any parts and do not borrow any components of the scanner hardware.

\section{Hardware parts}
\begin{table}
	\begin{tabular}{l l l p{0.3\textwidth}}
		\textbf{Part}              & \textbf{Manufacturer} & \textbf{Model} & \textbf{Specifications} \\
		\hline\\[0.2cm]
		Industrial Cameras & Point Grey Research & GS3-U3-91S6C-C & Color, 9.1 MP, Sony ICX814 CCD, 1'', 3.69 $\mu$m, global shutter, 3376 $\times$ 2704 at 9 FPS \\[0.5cm]
		Camera Lenses & Kowa & LM12SC & 1'', 12 mm, 6 MPix \\[0.5cm]
		Projector & LG & PF80G & DLP, 1080p HD resolution (1920 $\times$ 1080), 1,000 ANSI lumen, LED light source \\[0.5cm]
		Rotation Stage & Newmark & RM-5-110 & 0.36 arcsec resolution, 70 arcsec accuracy, 5 arcsec repeatability, stepper motor, 72:1 gear ratio, home switch, no optical encoder \\[0.5cm]
		Rotation Controller & Newmark & NSC-A1 & Single axis, serial over USB, C API \\[0.5cm]
		Breadboard & Thorlabs & PBG11111 & 4' $\times$ 2.5' $\times$ 1.0'', 21 kg, 1/4''-20 holes on 1'' centers \\[0.5cm]
	\end{tabular}
\end{table}

\begin{figure}
	\centering
	\caption{The physical dimensions of the breadboard, and throw angles of the cameras and projector.}
	\label{fig:hardwaredimensions}
\end{figure}

\subsection{Projector}
The SeeMa-Scanner uses a standard commercial Full-HD projector. This is very cost-effective, but brings a few challenges. The projector is configured for minimal image processing, and its HDMI port is configured for ``Notebook'' use, which gives the lowest possible input lag. The projector uses a micromirror array to produce binary patterns at high refresh rates. Truthful capture of gray-value patterns requires that the camera integration time be a multiple of the projector's 16.7 ms refresh period. It should also be noted that commercial projectors like this one do not have a linear response, which is likewise necessary for truthful capture of gray-value patterns. Gamma can be set to the lowest possible value of $1.6$, and if this is matched in the graphics card configuration of the scan computer, a close to linear response can be achieved. By using only binary patterns, these problems are avoided; however, input lag must still be taken into consideration.
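
As a minimal sketch of the integration-time constraint (only the 16.7 ms period is taken from the description above; the function itself is illustrative and not part of the scanner code), a requested exposure can be rounded up to a whole number of refresh periods:

\begin{verbatim}
// Illustrative sketch: round a requested camera integration time up
// to an integer multiple of the projector refresh period (16.7 ms).
#include <cmath>

double snapToRefreshPeriod(double requestedMs)
{
    const double periodMs = 16.7;  // DLP refresh period from above
    return std::ceil(requestedMs / periodMs) * periodMs;
    // e.g. a request of 30 ms becomes 33.4 ms (two periods)
}
\end{verbatim}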

\subsection{Cameras}
These are high resolution 9 MPx industrial CCD color cameras. While color information is usually not necessary in structured light, it enables us to also capture the color of the object's surface. In the program code, a white balance is applied to the cameras which approximately matches the white light of the projector. To achieve true coloring, a rigorous color calibration would have to be performed.

\subsection{Rotation stage}

\section{\texttt{Projector} Class}
This class provides a fullscreen OpenGL context and the ability to project any texture. The window/context creation is operating system dependent. It works very well on Linux with the proprietary NVIDIA drivers, as found on the scan computer. In order to get a completely independent screen output, which does not interfere with the window manager, the projector needs to be set up as a separate X screen in \texttt{xorg.conf}. The absolute position of this second X screen must leave a small gap to the primary screen. This gives a secondary screen which is not recognized by Compiz (Unity in Ubuntu), but which can be accessed through the \texttt{Projector} class.
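
For illustration, such a layout could look like the following \texttt{xorg.conf} excerpt. The identifiers and geometry are hypothetical: a 1920 pixel wide primary screen is assumed, so an absolute position of 2000 leaves a small gap.

\begin{verbatim}
# Hypothetical excerpt -- identifiers and geometry must be adapted.
Section "ServerLayout"
    Identifier "Layout0"
    Screen  0  "PrimaryScreen"   0 0
    # An absolute x-offset beyond the 1920 px primary screen
    # leaves the required gap:
    Screen  1  "ProjectorScreen" Absolute 2000 0
EndSection
\end{verbatim}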

\section{\texttt{Camera} Class}
An abstraction from the individual industrial camera APIs was created in order to ease replacement and enhance modularity. A concrete implementation for Point Grey cameras is provided. The program is currently designed for ``software triggering'' of the cameras. Due to substantial input lag in the projector and cameras, a certain pause must be made in program execution between projecting a pattern and capturing an image. Close temporal synchronization of both cameras is achieved by calling the trigger method on both cameras and collecting the images subsequently.
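
A minimal sketch of this capture sequence follows; the \texttt{Projector}/\texttt{Camera} method names, the helper types, and the 100 ms lag value are assumptions for illustration, not the actual interface of the scanner software:

\begin{verbatim}
// Illustrative trigger-and-collect sequence (hypothetical interface).
#include <chrono>
#include <thread>

void captureFrame(Projector &projector, Camera &cam0, Camera &cam1,
                  const Texture &pattern, Image &img0, Image &img1)
{
    projector.displayTexture(pattern);
    // Pause so the pattern survives the projector/camera input lag
    // (value assumed here):
    std::this_thread::sleep_for(std::chrono::milliseconds(100));
    cam0.trigger();          // software-trigger both cameras
    cam1.trigger();          // back-to-back, for close temporal sync
    img0 = cam0.getFrame();  // collect the images subsequently
    img1 = cam1.getFrame();
}
\end{verbatim}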

\section{\texttt{RotationStage} Class}
Here, a C++ abstraction for the Newmark motion control API was implemented. The C API essentially sends serial commands over a serial-over-USB connection, and full documentation is provided on the Newmark website. Important things to consider are the latencies of many of these calls. Specifically, reading and writing ``hardware settings'' such as microstep levels and motor current takes a considerable amount of time. The motor controller's inherent positional unit is the ``number of microsteps''. This can be converted to an angular position, $\alpha$, by means of the following formula:
\[
	\alpha = \frac{\textrm{XPOS} \cdot 1.8}{\textrm{MS} \cdot 72} \quad ,
\]
where XPOS is the rotation controller's position value, $1.8$ is the number of degrees per full step on the motor axis, MS is the current microstep setting, and $72$ is the worm-gear ratio. The \texttt{RotationStage} class interface abstracts from this and lets you rotate to a specific angle using the shortest path.
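
A direct transcription of this conversion might look as follows; this is an illustrative sketch using the constants above, not the class's actual code:

\begin{verbatim}
// Illustrative transcription of the formula above.
// xpos: controller position in microsteps; ms: microstep setting.
#include <cmath>

double microstepsToAngle(long xpos, int ms)
{
    return (xpos * 1.8) / (ms * 72.0);  // 1.8 deg/step, 72:1 gear
}

// Inverse, e.g. for commanding a target angle (also illustrative):
long angleToMicrosteps(double alpha, int ms)
{
    return std::lround(alpha * ms * 72.0 / 1.8);
}
\end{verbatim}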

\chapter{Practical scanning}
The following procedure explains the steps involved in calibration and acquisition of a $360^\circ$ scan of an object.

Calibration parameters consist of the camera focal lengths, central points, lens distortion parameters, camera extrinsics (their relative position and angles), and the location and orientation of the rotation stage axis. These parameters are stored in the GUI, but it is highly recommended to perform a new calibration before acquiring new data. Also, the exact position of the cameras may be altered to better fit the object, in which case recalibration is mandatory. The calibration parameters can be exported into a \texttt{*.xml} file through the top bar menu. The global coordinate system, in which everything is expressed, coincides with that of the left camera.
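
As an illustration of what these parameters comprise, a corresponding data structure could look like the sketch below; the names, types, and grouping are assumptions, not the program's actual layout or the \texttt{*.xml} schema:

\begin{verbatim}
// Illustrative grouping of the calibration parameters listed above.
struct CalibrationParameters {
    double f0[2], c0[2], k0[5];      // left camera: focal lengths,
                                     // central point, lens distortion
    double f1[2], c1[2], k1[5];      // right camera: same quantities
    double R1[3][3], T1[3];          // extrinsics of the right camera
                                     // relative to the left
    double axisPoint[3], axisDir[3]; // rotation stage axis location
                                     // and orientation
};
\end{verbatim}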