I finally got the chance to push through a little idea about walking through visual time with a new Twitter bot I’m calling @MechaKubler.
Inspired by Google Cultural Institute’s “X Degrees of Separation”, I was curious to see if it was possible to recreate that app by hand using a more focused collection of works, and constraining the kinds of paths that would get drawn between two given images.
Was it possible to make this path move only forward in time? Or only backwards? To only consider a certain set of objects by type or nationality? The idea had been gnawing at me for some time.
Once an hour, @MechaKubler assembles an 8-image-long path between two objects from the Rijksmuseum, trying to find pictures that are roughly evenly separated across an expanse of visual similarity space. I used the penultimate max pooling layer of the pre-trained VGG-16 convolutional neural network[1] to produce this space of multidimensional features (an involved way of saying “a list of 512 numbers per image”) for over 200,000 images of artworks in the Rijksmuseum collections.
Rather than just find the closest object at hand, it will take a chronological path, expressly moving either forwards or backwards in time as it traverses this visual space. Hence the homage to George Kubler, who considered the seriation of visual form through history in his 1962 The Shape of Time. One of his core arguments was that there exist “prime objects”: ideal solutions to visual problems that artists then manifest through physical variants. This is not unlike how @MechaKubler works. Using an R package I wrote to generate nearest-neighbor paths through numeric matrices, it identifies several ideal points sitting on a line evenly spaced between two randomly-chosen objects. These ideal points in the VGG-16 feature space can’t be directly translated back into images[2] – alone, they’re just separate lists of 512 numbers. But it is possible to search through the real objects to find those whose own 512-number-long feature vectors are very close to the ideal points.
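The idea above can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not the author's R package: the function name, the Euclidean distance metric, and the strictly-forward date constraint are all my assumptions.

```python
import numpy as np

def chronological_path(features, years, start, end, n_steps=8):
    """Interpolate "ideal points" evenly along the line between two
    feature vectors, then snap each one to the nearest real object
    whose date moves strictly forward from the previous pick."""
    a, b = features[start], features[end]
    path = [start]
    for t in np.linspace(0, 1, n_steps)[1:-1]:
        ideal = (1 - t) * a + t * b              # a "prime object": just 512 numbers
        dists = np.linalg.norm(features - ideal, axis=1)
        dists[years <= years[path[-1]]] = np.inf  # chronological constraint
        dists[path] = np.inf                      # no repeats
        path.append(int(np.argmin(dists)))
    path.append(end)
    return path

# Toy collection: 10 objects whose features and dates both increase together.
features = np.arange(10, dtype=float).reshape(-1, 1)
years = np.arange(1600, 1700, 10)
path = chronological_path(features, years, start=0, end=9, n_steps=5)
```

On this toy data the returned indices move monotonically forward in time; with real, high-dimensional features the visual path and the chronological constraint can of course pull in different directions, which is where the interesting juxtapositions come from.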
Read the full post here.