Photographic Knitting Club

No.01 08.02
No.02 08.22
No.03 08.23
No.04 08.24
No.05 09.25
No.06 10.22
No.07 11.19
No.08 11.21
No.09 12.01

No.10 04.05
No.11 05.05
No.12 07.11
No.13 08.03

Exhibition 09 2021
Exhibition 01 2021

Conversation with Curator 

Tutorial

“Photographic Knitting Club” is a series of participatory workshops that introduce participants to the photographic practice of photogrammetry. Developed in the nineteenth century as a scientific method for obtaining measurements from a series of images in order to remotely survey land, photogrammetry today has many applications, from 3D animation to forensics and state-sponsored surveillance. Photographic Knitting Club reflects on these applications of digital tools, while centering our attention on the domestic space, a site increasingly exposed to corporate surveillance and data extraction.

In the workshops, participants engage in practices that reframe these imaging and quantification tools as digital crafts and connect data visualization with embodied experience. I lead a series of exercises that explore the mechanics of photogrammetry, the material practices of photographic production, and the tactility associated with inhabiting our most familiar spaces. After processing and anonymizing the visual information shared by the participants, I perform handicraft on this data as a way of experimenting with new modes of seeing within 3D software. Inverting the association between photogrammetry and the instrumentalist extraction of data, the artifacts fabricated for this piece instead allow us to reimagine and make strange the spaces we inhabit and the intimate objects with which we share them.

Types of Non-Contact 3D Scanning Methods


2-1. Laser
a. Time-of-flight: (long range, less accurate)
A laser emits a pulse of light, and a detector measures the amount of time before the reflected light returns.

Since the speed of light is known, the round-trip time gives the round-trip distance, and half of that is the distance to the object.
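The calculation above can be sketched in a few lines. This is a minimal illustration, not any particular sensor's firmware; the function name and the example pulse time are my own:

```python
C = 299_792_458.0  # speed of light in a vacuum, meters per second

def tof_distance(round_trip_time_s: float) -> float:
    """Distance to the object from a time-of-flight measurement.

    The light travels to the object and back, so the one-way
    distance is half the round-trip path length.
    """
    return C * round_trip_time_s / 2.0

# A pulse that returns after 20 nanoseconds corresponds to
# an object roughly 3 meters away:
print(tof_distance(20e-9))  # ≈ 2.998 m
```

The tiny time scales involved (nanoseconds per meter) are why time-of-flight sensors favor long ranges: at short range, small timing errors translate into large distance errors.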
b. Triangulation: (short range, more accurate)

A laser emits light, and a camera looks for the location of the laser dot on the object.

Three pieces of information determine the geometry: (1) the known distance between the laser emitter and the camera (the baseline), (2) the angle of the laser emitter, and (3) the angle at the camera, which can be found from the location of the laser dot in the camera's field of view. With (1), (2), and (3), the distance between the laser and the object can be calculated by triangulation.
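The laser, the camera, and the dot on the object form a triangle, so the law of sines turns (1), (2), and (3) into a distance. A minimal sketch, with an illustrative function name and example values of my own choosing:

```python
import math

def laser_to_object_distance(baseline_m: float,
                             laser_angle_rad: float,
                             camera_angle_rad: float) -> float:
    """Triangulate the laser-to-object distance.

    The triangle's vertices are the laser, the camera, and the dot.
    The side opposite the camera's angle is the laser-to-object
    distance, so by the law of sines:
        d = baseline * sin(camera_angle) / sin(object_angle)
    where the object's angle is whatever remains of 180 degrees.
    """
    object_angle = math.pi - laser_angle_rad - camera_angle_rad
    return baseline_m * math.sin(camera_angle_rad) / math.sin(object_angle)

# Laser pointing straight out (90°), camera 10 cm away seeing
# the dot at 60° from the baseline:
print(laser_to_object_distance(0.1, math.pi / 2, math.pi / 3))
```

Because the camera angle changes quickly with depth when the object is close, and slowly when it is far, triangulation is accurate at short range and degrades at long range: the opposite trade-off from time-of-flight.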
For example, LiDAR (Light Detection and Ranging) measures how long it takes for the emitted light to return to the sensor.
Publicly Available LiDAR Data: USGS

Dream Life of Driverless Cars, ScanLAB Projects for The New York Times
2-2. Structured Light
Structured light 3D scanners project a known pattern of light. The distortion of the pattern helps determine the shape of the object.
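Kinect-style structured-light sensors recover depth much like stereo vision: a point on a near surface shifts the projected pattern more (larger disparity) than a point on a far surface. A minimal sketch of that relationship; the function name and the numbers (focal length in pixels, projector-to-camera baseline) are illustrative assumptions, not any sensor's actual calibration:

```python
def depth_from_disparity(focal_px: float,
                         baseline_m: float,
                         disparity_px: float) -> float:
    """Depth of a surface point from the observed shift (disparity)
    of the projected pattern, using the pinhole-camera relation
        depth = focal_length * baseline / disparity.
    Larger disparity (more distortion) means a closer surface.
    """
    return focal_px * baseline_m / disparity_px

# e.g. a 580 px focal length, a 7.5 cm projector-camera baseline,
# and a pattern shifted by 10 px:
print(depth_from_disparity(580.0, 0.075, 10.0))  # 4.35 m
```

Repeating this for every pixel where the pattern can be matched yields a depth map, which the scanning software fuses into a 3D model as you move the sensor.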
There are many different types of sensors that use Structured Light. Here you see an Artec Eva handheld scanner used in the video The President, in 3D by the Smithsonian.

A high-res digital model resulting from the structured light sensor used in The President, in 3D by the Smithsonian.
Another example of a handheld scan using a Shining EinScan scanner.
You can also set up the scanner with a tripod and turntable.

Structured Light scanners like those above cost around $7k (Shining EinScan) to $20k (Artec Eva), and the software is usually proprietary. We will focus on (comparatively) affordable Structured Light sensors.
For example, you can find a used Kinect for about $20 online (the model number at the bottom of the scanner should be “1414”).

Kinect has (1) an IR Projector to project IR light, (2) IR Receiver to see the IR light, and (3) RGB Camera to give 3D model color information.

However, the resolution of the Kinect’s RGB camera is only 640 × 480 pixels.

Studios like Scatter created DepthKit to replace the RGB camera with a DSLR. 
Structure Sensor is another lower cost option (around $300 to $500).

The iPhone X also has an IR Projector and IR Receiver for FaceID.


SKANECT: Windows, macOS (supports Structure Sensor, Kinect)
Windows (supports most sensors)


Itseez3d: iOS, Android (supports Structure Sensor, RealSense)
Canvas.io: iOS (supports the iPhone X built-in sensor)

Tutorial (click to zoom)
Here are the steps to use the software SKANECT with the hardware Structure Sensor or Kinect.

Feedback Screen

Green Pixels: Areas the sensor currently sees and is capturing.

Red Pixels: Areas the sensor no longer sees.

Portrait Capture Tips

(1) Pick a well-lit place (avoid direct sunlight as it will interfere with IR).
(2) Have the subject look at a specific spot or close their eyes.
(3) Take off glasses or any reflective objects.
(4) Try to start and end somewhere that’s not the face.
(5) You can walk around the subject in a circular motion, sweeping up and down slowly (think Tai Chi), or you can have a turntable for the subject to stand on.

Artist Gabe BC scanning an object.

3D scans from Knowing Together workshop participants: Kinect + SKANECT.

Example scan using Structure Sensor + SKANECT.