		\includegraphics[width=.9\textwidth]{mesh0.png}
	\label{fig:mesh0}
\end{figure}

\begin{abstract}
This is the manual for the Seeing Machines Lab Structured Light Scanner (SeeMa-Scanner). The scanner consists of both hardware components (including cameras, projector, and rotation stage) and software for calibration, scanning and reconstruction. While most of the components should be self-explanatory, we describe the hardware and each software component, making it possible for students and staff to extend the scanner with new functionality. We also give a brief step-by-step guide on how to get from a physical object to a digital mesh model of it.
\end{abstract}
 
\chapter{The scanner}
\section{Getting started}
This chapter describes the main hardware and software parts of the system.
 
If your main objective is to digitize objects, you should be able to do so on your own by reading the chapter ``Practical Scanning'', which gives a step-by-step recipe to perform a complete object scan and reconstruction.
 
Technical projects and contributions are very welcome. Please get in touch with the authors if you plan any alterations to the hardware, or would like write access to the SVN repository containing the software. The public read-access URL of the SeeMaLab Scanner repository is \url{http://svn.compute.dtu.dk/svn/seema-scanner/}.

\section{Hardware parts}
\begin{table}
	\begin{tabular}{l l l p{0.3\textwidth}}
		\textbf{Part}              & \textbf{Manufacturer} & \textbf{Model} & \textbf{Specifications} \\
	...
 
The cameras, projector and rotation stage are mounted rigidly with respect to each other, which is important for high-quality results. See figure \ref{fig:hardware0} for an image of the inside of the main scanner assembly. A darkening curtain can be lowered to prevent ambient light from interfering with the measurement procedure.
\begin{figure}[h]
	\centering
		\includegraphics[width=.9\textwidth]{hardware0.jpg}
	\caption{The scanner hardware. Two industrial cameras and one projector constitute the optical parts. An angle figure acts as the scan object and is placed on top of the circular rotation plate. The plate is screwed onto a micro-rotation stage. The calibration target is also seen on its holder.}
	\label{fig:hardware0}
\end{figure}
 
The geometry of the scanner is illustrated in figure \ref{fig:hardwaredimensions}, which also indicates the minimum focus range of the cameras and projector.
\begin{figure}[h]
	...
	\caption{The physical dimensions of the breadboard, and throw angles of the cameras and projector.}
	\label{fig:hardwaredimensions}
\end{figure}

\subsection{Projector}
The SeeMa-Scanner uses a standard commercial Full-HD projector. This is very cost-effective, but brings a few challenges. The projector is configured to perform minimal image processing, and the HDMI port is set to ``Notebook''-mode, which gives the lowest possible input lag (approx. 80 ms). The projector contains a DLP micromirror array to produce binary patterns at high refresh rates (kHz range). Intermediate gray-values are created by the projector by altering the relative on-off cycles of each micromirror. A truthful capture of gray-values with the camera requires an integration time that is a multiple of the 16.7 ms refresh period of the projector.
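
For example, a desired exposure time can be rounded to a whole number of refresh periods before it is applied to the cameras. The snippet below is only an illustration of this timing constraint; the variable names are not part of the scanner software.
\begin{verbatim}
% Round a desired camera exposure to a whole number of projector refresh
% periods (60 Hz refresh, i.e. approximately 16.7 ms per period).
refreshPeriodMs   = 1000/60;              % approx. 16.7 ms
desiredExposureMs = 100;                  % example value
numPeriods = max(1, round(desiredExposureMs/refreshPeriodMs));
exposureMs = numPeriods*refreshPeriodMs;  % integration time to use
\end{verbatim}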

Commercial projectors do not have a linear response, which would be necessary for truthful capture of gray-value patterns. Gamma can be set to the lowest possible value of $1.6$, and if matched in the graphics card configuration of the scan computer, a close-to-linear response can be achieved. By only using binary patterns, this problem is avoided.
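
If gray-value patterns are nevertheless needed, they can be pre-compensated with the inverse of the projector gamma before being displayed. The sketch below assumes the gamma value of $1.6$ mentioned above and is not part of the GUI code.
\begin{verbatim}
% Pre-compensate an 8-bit gray-value pattern with the inverse projector
% gamma, so that the projected intensities become approximately linear.
gammaValue = 1.6;                        % projector gamma setting
pattern = repmat(uint8(0:255), 256, 1);  % example: horizontal gray ramp
patternLin = uint8(255*(double(pattern)/255).^(1/gammaValue));
\end{verbatim}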

\subsection{Cameras}
These are high-resolution 9 MPx industrial CCD color cameras. While color information is usually not necessary in structured light, it enables us to capture the color appearance of the object. In the program code, a white balance is used for the camera, which approximately matches the white light used in the projector. To achieve true coloring, a rigorous color calibration would have to be done.

\subsection{Rotation stage}
This is a so-called micro-rotation stage, commonly used in high-precision photonic research and production. A larger diameter plate was attached. The rotation stage has a stepper motor which drives a worm-gear. This gives high precision and very high repeatability. Note that the rotation stage does not have an optical encoder; it is reset to 0 degrees in software at each program start. The motor controller can be configured for different levels of microstepping and motor current. Higher motor current provides more torque and less risk of missing steps. The load on the plate should not exceed 20 kg and should be centered around the rotation axis. Objects can be stabilized on the plate using e.g. modeling clay.

\subsection{Calibration target}
A calibration target is also part of the scanner. It was produced by printing a checkerboard in vector format and gluing it onto a thick piece of float glass using spray adhesive. Please note that the target is asymmetrical, which is necessary to uniquely match chessboard corners in both cameras. The calibration target was designed to fill the scan object space. If you need a smaller scan area, a smaller calibration target would be beneficial; however, the physical dimensions are currently hardcoded in the scanner GUI. Please also note the minimum focus distance of the projector and cameras.
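
The corner detection itself is handled by the scanner GUI. As an illustration of how the asymmetric board gives an unambiguous corner ordering, the checkerboard can also be detected in Matlab, assuming the Computer Vision Toolbox is available; the image file names below are placeholders.
\begin{verbatim}
% Detect the asymmetric checkerboard in a left/right image pair.
% Because the board is asymmetric, the corner ordering is unambiguous
% and therefore consistent between the two views.
IL = imread('left.png');                      % placeholder file names
IR = imread('right.png');
[cornersL, boardSize] = detectCheckerboardPoints(IL);
[cornersR, ~]         = detectCheckerboardPoints(IR);
\end{verbatim}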

\section{Software components}
The SeeMaLab 3D scanner has a full graphical user interface for calibration and scanning. The output from this software is a number of color point clouds in the PLY format, along with a Meshlab alignment project file (file suffix \texttt{.aln}), which contains orientation information as provided by the rotation stage parameters. This allows the user to import the point clouds for further processing in Meshlab, e.g. to produce a full mesh model of the surface. The rotation axis is determined during calibration, which means that usually no manual or algorithm-assisted alignment of partial surfaces is necessary.

To get fine-grained control over the scan procedure, the user can modify the source code for the GUI application, or use the supplied Matlab wrappers. These wrappers provide basic functionality to capture images with the cameras, project a specific pattern on the projector, or rotate the rotation stage to a specific position. Using these components, a full structured light scanner can be implemented in Matlab with full design freedom.
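
As a rough sketch of how such a Matlab scanner could be structured, the loop below interleaves rotation, pattern projection and image capture. The function names \texttt{rotateTo}, \texttt{projectPattern}, \texttt{captureFrames} and \texttt{makePattern} are hypothetical placeholders, not the actual wrapper interface; please consult the repository for the real function names.
\begin{verbatim}
% Sketch of a scan loop built on the Matlab wrappers. The function names
% rotateTo, projectPattern, captureFrames and makePattern are
% hypothetical placeholders; see the repository for the actual interface.
numPatterns = 10;
angles = 0:30:330;                       % one partial scan per 30 degrees
frames = cell(numel(angles), numPatterns);
for i = 1:numel(angles)
    rotateTo(angles(i));                 % position the rotation stage
    for j = 1:numPatterns
        pattern = makePattern(j);        % placeholder: build the j'th pattern
        projectPattern(pattern);         % show it on the projector
        frames{i, j} = captureFrames();  % grab images from both cameras
    end
end
\end{verbatim}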
	...
\[
	\alpha = \frac{\textrm{XPOS} \cdot 1.8}{\textrm{MS} \cdot 72} \quad ,
\]
where XPOS is the rotation controller's position value, $1.8$ is the number of degrees per step on the motor axis, MS is the current microstep setting, and $72$ is the worm-gear ratio. The \texttt{RotationStage} class interface abstracts from this and lets you rotate to a specific angle using the shortest path.
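
As a worked example, assuming a microstep setting of $\textrm{MS}=8$ (an example value), a plate angle of $90^\circ$ corresponds to $\textrm{XPOS} = 90 \cdot 8 \cdot 72 / 1.8 = 28800$. The conversion in both directions can be written as:
\begin{verbatim}
% Convert between the controller position (XPOS) and the plate angle in
% degrees, using the relation above. MS = 8 is only an example value.
MS = 8;                                    % microstep setting (example)
angleFromXpos = @(xpos)  xpos*1.8/(MS*72); % controller position -> degrees
xposFromAngle = @(alpha) alpha*MS*72/1.8;  % degrees -> controller position
xposFromAngle(90)                          % returns 28800
\end{verbatim}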

\chapter{Practical scanning}
Please be very careful with this very expensive equipment, and be considerate by not misplacing any parts and not borrowing any components of the scanner hardware.
The following guide explains the steps involved in calibration and acquisition of a $360^\circ$ scan of an object.
 
Calibration parameters consist of camera focal lengths, central points, lens distortion parameters, camera extrinsics (their relative position and angles), and the location and orientation of the rotation stage axis. These parameters are stored in the GUI, but in most cases it is recommended to perform a new calibration before acquiring new data. Also, the exact position of the cameras may be altered to better fit the object, in which case recalibration must be done. The calibration parameters can be exported into a \texttt{*.xml} file through the top bar menu. The global coordinate system, in which everything is expressed, coincides with that of the left camera.
 
Image acquisition consists of projecting a sequence of patterns onto the object, which are then converted to depth values by means of the chosen structured light algorithm.
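
As an illustration of such a pattern sequence, the snippet below generates binary Gray code patterns encoding the projector columns; this is a generic sketch and not necessarily the exact encoding used by the GUI.
\begin{verbatim}
% Generate binary Gray code column patterns for a 1920x1080 projector.
% This is a generic illustration, not necessarily the exact encoding
% used by the scanner software.
width = 1920; height = 1080;
numBits = ceil(log2(width));
cols = 0:width-1;
gray = bitxor(cols, bitshift(cols, -1));        % Gray code of column index
patterns = false(height, width, numBits);
for b = 1:numBits
    bitRow = bitget(gray, numBits - b + 1) > 0; % most significant bit first
    patterns(:, :, b) = repmat(bitRow, height, 1);
end
\end{verbatim}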
 
Depending on the surface complexity (blind spots, etc.), multiple $360^\circ$ scans may be necessary. In that case, the following procedure is done multiple times.
\begin{enumerate}