Gian Pablo Villamil

Solving new problems in video, electronics and strategy

Archive for Media

Variable rate time-lapse video using motion detection

As part of a larger project involving timelapse video, I developed a technique that uses motion detection to identify frames for capture, and skips over frames where nothing is happening.

[Video: side-by-side comparison of motion-detected and constant-rate time-lapse]

The video on the left uses this technique: notice how it skips over the stops at the stations. The video on the right is the original source footage, sped up to match the duration of the one on the left; it does not skip the station stops.

I implemented this in Max/MSP/Jitter. The basic algorithm keeps track of the last frame written and continuously compares the incoming video feed against it, computing a difference score that reflects both the number of differing pixels and the magnitude of those differences. When the score exceeds a defined threshold, the current frame is captured.

Comparing against the last frame captured, rather than simply the previous frame, ensures that even gradual changes are eventually recorded.
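The original runs as a Max/MSP/Jitter patch, but the selection logic can be sketched in Python with NumPy (the function name and threshold values here are illustrative, not taken from the patch):

```python
import numpy as np

def select_frames(frames, pixel_thresh=10, score_thresh=0.05):
    """Variable-rate time-lapse: keep a frame only when it differs
    enough from the *last captured* frame, not the previous one."""
    captured = []
    last = None
    for frame in frames:
        if last is None:
            # Always keep the first frame as the initial reference.
            captured.append(frame)
            last = frame
            continue
        # Per-pixel absolute difference against the last captured frame.
        diff = np.abs(frame.astype(np.int16) - last.astype(np.int16))
        changed = diff > pixel_thresh            # which pixels moved at all
        # Score combines how many pixels changed and by how much,
        # normalized to 0..1 so the threshold is resolution-independent.
        score = (diff * changed).sum() / (255.0 * diff.size)
        if score > score_thresh:
            captured.append(frame)
            last = frame                         # new reference frame
    return captured
```

Because the reference frame only advances on capture, a slow drift that never trips the threshold between consecutive frames will still accumulate against the last captured frame and eventually get recorded.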

Bay Bridge (1) – Coming and Going

Exploring a different view of panoramic footage, showing where you’ve been and where you’re going in a single simultaneous stream.

“Visual Streams” at the Exploratorium

"Visual Streams" installation during testing

For the past several months I’ve been volunteering at the Exploratorium, the famous science museum and playground in San Francisco. One of the projects I’ve been working on is called “Visual Streams”, an installation that represents the different aspects of the human visual system. I’ve built on previous work by Bill Meyer and Richard Brown of the Exploratorium, and some extremely useful JavaScript programming by Bill Nye.

The four quadrants of the screen represent color vision without brightness information (equiluminance), motion, face detection, and edge detection. I’ve learned a lot of interesting things in the process, not just about the human visual system, but also about how difficult it is to accurately represent color using digital media. The project relies on many of the reliability and ease-of-use techniques that I’ve developed for use in art projects.
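For the equiluminance quadrant, the general idea can be sketched in Python with NumPy: hold luminance constant while preserving chrominance. This uses standard BT.601 YCbCr coefficients and is only an approximation of what the installation (which is built in Max/MSP) actually does:

```python
import numpy as np

def equiluminant(rgb, y_const=128.0):
    """Flatten luminance while keeping chrominance (BT.601 YCbCr)."""
    rgb = rgb.astype(np.float32)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # Forward transform: compute chroma, discard the real luma.
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 0.5 * r - 0.418688 * g - 0.081312 * b
    y = np.full_like(cb, y_const)        # constant brightness everywhere
    # Inverse transform back to RGB with the flattened luma.
    out = np.stack([
        y + 1.402 * cr,
        y - 0.344136 * cb - 0.714136 * cr,
        y + 1.772 * cb,
    ], axis=-1)
    return np.clip(out, 0, 255).astype(np.uint8)
```

Any neutral gray in the input maps to the same mid-gray in the output, so only hue differences remain visible, which is exactly why equiluminant images feel so strange to look at.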

The installation is all built in Max/MSP, and relies heavily on the OpenCV computer vision library, in its cv.jit incarnation. It’s currently in “soft launch” mode; check it out in the “Seeing” exhibit.

“At The Still Point”, at Smack Mellon gallery

Installation view, photo by Etienne Frossard

Pawel Wojtasik has a new video installation on view at the Smack Mellon gallery in DUMBO (Brooklyn, NY). The soundtrack is by Stephen Vitiello, and I developed the software that manages the HD multi-channel projection.

“At The Still Point” was filmed in India, and covers themes of creation, destruction, and rebirth. It is presented as 5 channels of high resolution video, designed to fit in with Smack Mellon’s unique, cathedral-like physical space.

An excerpt from the piece is online on Vimeo.

It’s on show until April 11, well worth a visit! Smack Mellon is at 92 Plymouth St., Brooklyn, New York.