Before Christmas I was able to spend some time with a Structure 3D sensor, a device that scans everyday objects and produces 3D models of them. It is still quite an experimental piece of technology, and we are using it to explore possible applications of 3D models in engineering teaching in SMCSE.
We hope to use this technology to provide students with 3D scans of objects that are precious, dangerous or difficult to transport, and to familiarise them with equipment before they come to the lab. We are also looking at how 3D models can be annotated with explanations or warnings about parts of machinery. This work is closely linked to our research into 360° video and Virtual Reality for similar purposes.
3D scanning devices have been around since the 1980s, but the machines were extremely expensive and reserved for specialist use. As processing power and storage have increased and become cheaper, all sorts of technology, 3D scanners included, have become available to enthusiasts. The Structure 3D sensor attaches to an iPad and, with the help of the camera, produces 3D models of various objects. You simply point the scanner at an object and walk around it to capture all of its sides; the app shows which parts have already been captured by “painting” them white.
I’ve tested the software provided by the scanner’s developers on several objects: a coffee cup, an old calculating machine, an office chair, a small desk, and several other items of various sizes and textures. The process is quite slow and meticulous, as you have to move around the object steadily and capture it from all sides. It also needs ample light and room, as well as a podium to raise the object. The scanner can scan whole rooms too, although I haven’t experimented with that feature very much.
There are some limitations. Due to the limited processing power of the iPad, small objects and fine details do not come through very well in the current setup: as you can see on the models, the shape and most of the texture are there, but small details are omitted. Metallic and shiny objects, as well as some black and other light-absorbent materials, are effectively invisible to the scanner’s laser. Rough or complex textures are rendered smooth, and the bottoms of objects come out hollow, since they rest on a surface the scanner cannot reach. I haven’t experimented with suspending objects in the air, although that is a possibility.
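To give a rough idea of the kind of clean-up these scans need, the sketch below uses the open-source trimesh library in Python to check whether an exported mesh is watertight and to patch simple holes, such as the missing base. The filename, and the assumption that the scan has been exported as an OBJ file, are hypothetical; this is a minimal sketch, not part of the scanner’s own software.

```python
# Minimal sketch: inspecting and patching a scanned mesh with trimesh
# (pip install trimesh). "scan.obj" is a hypothetical export from the scanner.
import trimesh

# Load the scan as a single mesh, even if the OBJ file contains several parts
mesh = trimesh.load("scan.obj", force="mesh")

print(f"Vertices: {len(mesh.vertices)}, faces: {len(mesh.faces)}")
print(f"Bounding box extents: {mesh.bounding_box.extents}")
print(f"Watertight before repair: {mesh.is_watertight}")

# Try to close small openings, e.g. the hollow base left where the object
# rested on the podium; only simple holes can be patched this way
trimesh.repair.fill_holes(mesh)
print(f"Watertight after repair: {mesh.is_watertight}")

mesh.export("scan_repaired.obj")
```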
Overall, the results suggest that in its current state the scanner is best used for objects around the size of a chair; any smaller or larger and details are lost. These results are still very encouraging, considering the software was effectively a demo. There is also a more professional app available for a licence fee, which streams the data from the iPad to a computer that then processes the models. This should result in much greater fidelity and let us scan smaller, more detailed objects, but we haven’t been able to confirm that yet.
If you’re interested in trying out these kinds of technologies in your teaching, please get in touch via video@city.ac.uk