Calibration parameters consist of the camera focal lengths, principal points, lens distortion parameters, the camera extrinsics (their relative position and orientation), and the location and orientation of the rotation stage axis. These parameters are stored in the GUI, but in most cases it is recommended to perform a new calibration before acquiring new data. Also, the exact position of the cameras may be altered to better fit the object, in which case recalibration must be performed. The calibration parameters can be exported to a \texttt{*.xml} file through the top bar menu. The global coordinate system, in which everything is expressed, coincides with that of the left camera.
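For reference, these quantities follow the standard pinhole camera model (the exact parameterization used internally by the SeeMaLab software is not restated here): a 3D point $\mathbf{X}$, expressed in global coordinates, projects into a camera as
\begin{equation*}
s \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}
= K \begin{bmatrix} R & \mathbf{t} \end{bmatrix}
\begin{bmatrix} \mathbf{X} \\ 1 \end{bmatrix},
\qquad
K = \begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix},
\end{equation*}
with the lens distortion model applied to the normalized image coordinates before $K$. For the left camera, $R = I$ and $\mathbf{t} = \mathbf{0}$, since the global frame coincides with it.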

Image acquisition consists of projecting a sequence of patterns onto the object; the captured camera images are then converted to depth values by the decoding algorithm specific to the chosen pattern sequence.
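As one common example of such a conversion (an illustration only, not necessarily the algorithm configured on the scanner), an $N$-step phase shifting sequence records intensities
\begin{equation*}
I_n(u,v) = A(u,v) + B(u,v)\cos\!\left(\phi(u,v) - \frac{2\pi n}{N}\right), \qquad n = 0,\dots,N-1,
\end{equation*}
from which the encoded phase is recovered per pixel as
\begin{equation*}
\phi(u,v) = \operatorname{atan2}\!\left(\sum_{n} I_n \sin\frac{2\pi n}{N},\; \sum_{n} I_n \cos\frac{2\pi n}{N}\right).
\end{equation*}
After unwrapping, the phase identifies the corresponding projector column, and triangulation with the calibration parameters yields the depth.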
\section{Calibration}
\begin{enumerate}
\item The GUI application is started on the scanner computer. The projector is turned on using the remote control or the touch interface on its top. Make sure the proper HDMI input is chosen as the source. Some software settings can be altered through the ``File $\rightarrow$ Preference'' menu, if necessary (the GUI needs to be restarted after altering these settings).
\item Position the calibration target on the circular rotation plate, inside the field of view of both cameras and the projector. White light is provided by the projector for guidance. The GUI will appear as shown in figure \ref{fig:calibration0}.
\item The darkening curtain is lowered to improve the signal-to-noise ratio and to avoid artifacts arising from ambient lighting.
\item A number of calibration sets need to be acquired. The minimum is 3 sets, and more is beneficial. The calibration pattern needs to be fully visible and equally bright in both cameras. The viewing angle must not be too shallow. The preset ``batch acquisition'' gives a reasonable number of calibration sets.
\item After acquisition, individual calibration sets can be re-examined. Calibration parameters are automatically determined by clicking the ``Calibrate'' button. This procedure can take up to a few minutes. The terminal output will show the reprojection errors, which measure the quality of the calibration (an illustrative sketch of this computation follows the list).
\item The calibration result can be examined by changing to the ``Point Clouds'' tab in the GUI (see fig. \ref{fig:pointclouds0}). The left and right cameras are represented by colored coordinate systems (the viewing direction is the positive z-axis, y points down, x to the right). The rotation axis, as determined by the calibration procedure, is shown as a white line segment.
\end{enumerate}
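For reference, the sketch below illustrates the kind of computation behind the ``Calibrate'' button, using Python and OpenCV with a chessboard-style target. It is not the GUI's own implementation: the target layout and file names are assumptions, and the rotation-axis estimation is omitted, but the reported RMS values play the same role as the reprojection errors printed in the terminal.
\begin{verbatim}
import glob
import cv2
import numpy as np

# Hypothetical target layout and image locations -- adjust to the real setup.
rows, cols, square = 6, 9, 0.010     # inner corners and square size [m]
objp = np.zeros((rows * cols, 3), np.float32)
objp[:, :2] = np.mgrid[0:cols, 0:rows].T.reshape(-1, 2) * square

obj_pts, left_pts, right_pts = [], [], []
for lf, rf in zip(sorted(glob.glob("calib/left_*.png")),
                  sorted(glob.glob("calib/right_*.png"))):
    gl = cv2.imread(lf, cv2.IMREAD_GRAYSCALE)
    gr = cv2.imread(rf, cv2.IMREAD_GRAYSCALE)
    okl, cl = cv2.findChessboardCorners(gl, (cols, rows))
    okr, cr = cv2.findChessboardCorners(gr, (cols, rows))
    if okl and okr:                   # use only sets seen by both cameras
        obj_pts.append(objp)
        left_pts.append(cl)
        right_pts.append(cr)

size = gl.shape[::-1]                 # image size as (width, height)
# Per-camera intrinsics: focal lengths, principal point, lens distortion.
rms_l, K1, d1, _, _ = cv2.calibrateCamera(obj_pts, left_pts, size, None, None)
rms_r, K2, d2, _, _ = cv2.calibrateCamera(obj_pts, right_pts, size, None, None)
# Stereo extrinsics: rotation R and translation T of one camera w.r.t. the other.
rms_s, K1, d1, K2, d2, R, T, E, F = cv2.stereoCalibrate(
    obj_pts, left_pts, right_pts, K1, d1, K2, d2, size,
    flags=cv2.CALIB_FIX_INTRINSIC)
print("RMS reprojection errors:", rms_l, rms_r, rms_s)
\end{verbatim}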
\section{Making a 360 degree scan}
Depending on the surface complexity (blind spots, etc.), multiple $360^\circ$ scans may be necessary. In that case, the following procedure is performed multiple times with the object in different orientations.
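Within a single $360^\circ$ scan, the individual views are brought into a common frame using the calibrated rotation stage axis. Conceptually (the software's internal conventions may differ), a point $\mathbf{p}$ reconstructed with the stage rotated by an angle $\theta$ is mapped back to the reference position as
\begin{equation*}
\mathbf{p}_0 = \mathbf{q} + R_{\hat{\mathbf{n}}}(-\theta)\,(\mathbf{p} - \mathbf{q}),
\end{equation*}
where $\hat{\mathbf{n}}$ is the unit direction of the rotation axis, $\mathbf{q}$ is a point on it, and $R_{\hat{\mathbf{n}}}(\cdot)$ denotes rotation about that axis. Rigid transformations of this kind, written as $4\times4$ matrices, are what the exported \texttt{*.aln} file stores for aligning point clouds.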
\begin{enumerate}
\item After successful calibration, data can be acquired for later point cloud reconstruction. Choose the ``Capture'' tab in the GUI -- see figure \ref{fig:capture0} for an illustration.
\item The scan object is now placed on the rotation plate such that it is visible in both cameras, and the darkening curtain is again lowered.
\item Press ``Single Capture'' or ``Batch Capture'' in the GUI.
\item Sequences of patterns are projected onto the object. The captured images can be reviewed, and one or more captured sequences can be reconstructed using the ``Reconstruct'' button.
\item The results will show up in the ``Point Clouds'' tab. Individual point clouds can be shown or hidden, see figure \ref{fig:pointclouds1}.
\item All data can be exported from the GUI program by means of the top bar menus. When the point clouds are exported into a folder, a \texttt{*.aln} file is stored alongside them; it contains pose information in the global coordinate system, which aligns the point clouds correctly relative to each other (a parsing sketch follows this list).
\end{enumerate}
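The exported \texttt{*.aln} file can also be used outside Meshlab. The sketch below parses it and merges the clouds in global coordinates with Python; it assumes the plain-text ALN layout written by Meshlab-compatible tools (a count line, then per cloud a file name, a \texttt{\#} line and a $4\times4$ row-major matrix, terminated by a \texttt{0} line), and the folder and file names are hypothetical -- verify against an actual export. Open3D is used here only as a convenient PLY reader/writer.
\begin{verbatim}
import numpy as np
import open3d as o3d   # any PLY reader/writer would do

def read_aln(path):
    """Return a list of (filename, 4x4 pose matrix) from a .aln project."""
    with open(path) as f:
        lines = [ln.strip() for ln in f if ln.strip()]
    n = int(lines[0])                 # number of point clouds
    entries, i = [], 1
    for _ in range(n):
        name = lines[i]; i += 1       # point cloud file name
        if lines[i].startswith("#"):  # separator line before the matrix
            i += 1
        rows = [list(map(float, lines[i + r].split())) for r in range(4)]
        entries.append((name, np.array(rows)))
        i += 4
    return entries

# Apply each stored pose and merge everything into one global point cloud.
merged = o3d.geometry.PointCloud()
for name, T in read_aln("scan1/pointclouds.aln"):
    pcd = o3d.io.read_point_cloud("scan1/" + name)
    pcd.transform(T)                  # local frame -> global frame
    merged += pcd
o3d.io.write_point_cloud("scan1/merged.ply", merged)
\end{verbatim}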
\begin{figure}[H]
Line 152... |
Line 158... |
\caption{``Point Clouds'' tab with reconstructed point clouds.}
\label{fig:pointclouds1}
\end{figure}
\clearpage

\section{Reconstructing a mesh surface}
Multiple point clouds can be merged into a single watertight mesh representation using Meshlab. Meshlab is available on the scanner computer, and is also freely available for download for multiple platforms. The basic steps involved in merging and reconstruction are outlined below. The input data consist of one or more sets of point clouds acquired with the SeeMaLab GUI. Note that if multiple object poses are desired (for complex geometries/blind spots, etc.), it is recommended to close and restart the GUI for each pose, to clear the captured sequences and free memory.
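The normal-estimation step described in step 2 below can also be scripted outside Meshlab. The following sketch uses the open-source Open3D library (not part of the SeeMaLab toolchain; the file name is hypothetical) and assumes that each exported cloud is still expressed in its own camera frame, with the camera at the origin -- the same assumption behind the ``Flip normals...'' option.
\begin{verbatim}
import numpy as np
import open3d as o3d

# One exported cloud, still in its own camera frame (hypothetical file name).
pcd = o3d.io.read_point_cloud("scan1/pointcloud_0.ply")
# Neighborhood of roughly 10 points, as suggested for the Meshlab filter.
pcd.estimate_normals(o3d.geometry.KDTreeSearchParamKNN(knn=10))
# Counterpart of the "Flip normals..." checkbox: orient every normal
# toward the camera, which sits at the origin of the cloud's own frame.
pcd.orient_normals_towards_camera_location(np.zeros(3))
o3d.io.write_point_cloud("scan1/pointcloud_0_normals.ply", pcd)
\end{verbatim}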
\begin{enumerate}
\item Load a set of point clouds by opening the \texttt{*.aln} file in Meshlab (``File $\rightarrow$ Open Project...''). See figure \ref{fig:meshlab0} for an illustration of one full set of scans loaded into Meshlab.
\item The PLY files contain XYZ and RGB values for all points. You will need to estimate normals in order for the surface reconstruction to succeed. These normals can be estimated and consistently oriented by considering the camera viewpoint. Select each point cloud in turn and, for each, choose ``Filters $\rightarrow$ Point Sets $\rightarrow$ Compute Normals for Point Set''. Make sure the ``Flip normals...'' checkbox is ticked (see fig. \ref{fig:meshlab1}). Suitable neighborhood values are on the order of $10$. You can visualize the estimated normals through the ``Render'' menu.
\item After estimating normals for all point clouds in a set, choose ``Filters $\rightarrow$ Mesh Layer $\rightarrow$ Flatten Visible Layers''. Make sure to retain unreferenced vertices, because at this point none of the points are part of any triangles yet (see figure \ref{fig:meshlab2}). This process will alter all coordinates by applying the pose transformation to all point clouds before merging them.
\item Save the resulting merged point cloud. In the save dialog, make sure to include the normals in the output file (see fig. \ref{fig:meshlab3}).