Archive for the 'film' Category


Tuesday, December 8th, 2015

I spent a couple of months this summer working with JT Petty on GONE, a serialized virtual reality science fiction thriller that was released today on Samsung’s MilkVR interactive spherical video platform. You can’t really experience it unless you have a GearVR with a compatible Samsung phone, but there are enough resources available online to give you an idea of what it’s about, starting with this article on Variety.

GONE is a joint effort by Wevr, Skybound and Samsung to dive deep into the mostly unexplored waters of cinematic storytelling in virtual reality. That meant taking the format more seriously than had been done before: from the scope and length of the story, through cinematic language and camera movements, all the way to building a system that lets the audience actually explore the scene at will through interactive features, bringing them as close as possible to the notion of presence, of being there, that is key to VR. While you watch GONE, you can explore the setting of a scene at will; by doing so you might gain better insight into some aspect of the story, but you might also miss something equally important just behind your shoulder. So every time you play an episode, you experience a slightly different story based on how you chose to explore the scene.

To achieve this, we shot every scene from multiple vantage points and developed a playback system with a universal timeline that lets you “jump” between multiple concurrent video tracks. The sense of being there while the scene unfolds around you is outstanding. When I think about all the limitations imposed on us by our 360° video capture, postproduction and playback tools, I can only begin to imagine how powerful this medium will become once better solutions have been developed to support our creative practice. GONE is the first VR piece I have watched that feels as engaging, in terms of story, as any of my favorite TV shows, and this is the result of JT’s excellent screenwriting combined with the craft and effort we put into understanding what it means to tell a story in VR, and how to do it with the tools we had at the time.
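The heart of that idea can be sketched in a few lines. This is my own hypothetical simplification, not Wevr’s actual player code: all vantage-point videos run as concurrent tracks against one master clock, so “jumping” only changes which track is shown, never the position on the shared timeline, and story time is never interrupted by the switch.

```javascript
// Hypothetical sketch of a universal-timeline player: every vantage
// point is a concurrent video track, and switching tracks preserves
// the position on the shared master clock.
function createPlayer(trackIds) {
  return {
    tracks: trackIds,
    activeTrack: trackIds[0],
    masterTime: 0, // seconds elapsed on the shared timeline
  };
}

function tick(player, dt) {
  // The master clock advances regardless of which track is visible.
  player.masterTime += dt;
  return player;
}

function jumpTo(player, trackId) {
  if (!player.tracks.includes(trackId)) {
    throw new Error("unknown track: " + trackId);
  }
  // Only the active track changes; the timeline position is untouched,
  // so the scene keeps unfolding without a cut in story time.
  player.activeTrack = trackId;
  return player;
}
```

In a real implementation each track would be a decoded video whose playback position is slaved to `masterTime`; the sketch only captures the invariant that makes the scheme work.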

Hello Sundance

Friday, December 4th, 2015

Four Wevr virtual reality productions were just selected for the New Frontier exhibit at the Sundance Film Festival 2016. I worked as Creative Director on two of them, Hard World for Small Things and Waves, and I feel quite accomplished to have my name featured on the Sundance Film Festival website surrounded by such a talented group of people.

Waves, in particular, is one of the two projects where I spent most of my time this year. It’s pretty cool. It was written and directed by Ben Dickinson, and it features Reggie Watts and Nathalie Emmanuel in a reality-bending philosophical-musical comedy where nothing is what it seems, or seems to be what it is. I can’t wait for the day we make it publicly available.


The advent of computational photography

Friday, November 6th, 2015

Ever since I started working in cinematic Virtual Reality I have fantasized about the time when cameras will evolve from optics-based mechanical contraptions into sensor-based computational machines. Instead of using lenses to project light into a flat image, computational photography collects data from the environment and uses it to reconstruct the scene after the fact. I find this subject matter fascinating. In fact, I almost attended Frédo Durand’s Computational Photography class at MIT, but I got too busy fooling around with symbolic programming and pattern recognition instead. I was not surprised to find out that Frédo is an advisor for the upcoming Light L16 digital camera. It looks insane and I definitely want one.

Before the Light L16 there was Lytro, a company famous for its shoot-first, focus-later, funny-looking consumer cameras. To my knowledge it was the first data-driven photography device ever to hit the consumer market. I didn’t get one, and I didn’t get their next-generation DSLR model either, but I always believed the Lytro guys were up to something interesting. It made total sense to me when they announced, a few months ago, that they had begun development of a light field camera for Virtual Reality, and I even thought they might actually be the ones to pull it off.

I later learned that Wevr had been selected as a development partner to try the first working prototypes of Lytro’s VR capture system, called Immerge, and I might get to play with it before the end of this year. It will be a great relief after a couple of years of dealing with custom rigs made of GoPro cameras, and the limitations and difficulties inherent in having to stitch a bunch of deformed images together at the very beginning of the postproduction pipeline. And since capturing light fields delivers data instead of pictures, you can move inside the scene almost as if you were actually there, instead of being limited to just looking around it.

Lytro CEO Jason Rosenthal sums it up in a recent press release: “To get true live-action presence in VR, existing systems were never going to get you there. To really do this, you need to re-think it from the ground up.” I couldn’t agree more.

Lytro Immerge from Lytro on Vimeo.

VR Filming all Weekend

Monday, August 3rd, 2015

After a very busy month of filming all over California for a secret VR project with Skybound and Samsung, I just began the month of August filming all weekend on another two Wevr productions that I supervised as Creative Director. One of them is a VR short film called Hard World for Small Things, directed by Janicza Bravo; the other is a VR music video for the song “Crown” by Run the Jewels, directed by Peter Martin.

On location for Hard World

On location for Hard World

Killer Mike performing on set

El-P performing on set

Coastline Apparition – Old Habits

Wednesday, May 27th, 2015

I just finished my first music video. I shot it last summer in West LA for the band Coastline Apparition, with a Canon 5D Mark II and a Blackmagic Pocket camera. I color graded and cut it in DaVinci Resolve and finished it in Adobe After Effects.

The piece features Swedish model Chloe Cole trying to find a future in a place that has a lot to offer but won’t give anything away. That seemed like a perfectly appropriate subject with which to frame the song in a visual narrative, and it gave Chloe a canvas to perform a fictional character close enough to her real self.

Creative Control SXSW Teaser

Friday, April 24th, 2015

I am working on a secret virtual reality project with musician/comedian Reggie Watts and director Ben Dickinson.

I recently went to a private screening of Ben’s most recent movie. It’s called Creative Control, and it premiered at SXSW about a month ago, but who knows when it will be released to the public. It is not often that I say this, but it’s a pretty good movie, and very difficult to describe. Filmed in black and white, it feels like something Woody Allen, Stanley Kubrick and Pier Paolo Pasolini could have made together.

Here are the official synopsis and teaser from SXSW:

In near-future Brooklyn, David is an overworked, tech-addicted advertising executive working on a high profile campaign for a new generation of Augmented Reality glasses. Envious of his best friend Wim’s charmed life and obsessed with his entrancing girlfriend Sophie, David uses the glasses to develop a life-like avatar of her; but he isn’t prepared for what happens when the line between reality and virtual reality begins to blur.


Friday, December 12th, 2014

Sometime in the nineteen fifties a serious attempt was made to bring stereoscopic photography to the masses. Stereoscopic photography faces adoption challenges similar to those of 3D movies and Virtual Reality, because to this day there is no easy way to experience any of them without attaching a contraption to your face. An interesting note: while 3D movies and Virtual Reality are fairly recent, stereoscopic photography has been around since the eighteen fifties, and commercial viewers were already being mass-produced back then.

In my quest to learn how to make my own Virtual Reality work, I got interested in displaying stereoscopic photography in VR. I was inspired by the idea that, properly placed in VR space, a stereoscopic photo can feel a lot like a VR sculpture or hologram: a moment frozen in time, with great potential to make the viewer feel “there with it,” as opposed to just looking at a flat projection of it in a two-dimensional picture.

While looking for cheap and easy stereoscopic camera systems I came across the historical mid-twentieth-century consumer cameras, and it was easier for me (or at least more reliable, fun and interesting) to get my hands on a couple of film cameras and start taking pictures than to find a digital solution. Eventually I came across a smartphone solution on Kickstarter that worked fairly well, but not before I was already shooting stereoscopic 35mm film all over the place. Here is an inventory of my stereoscopic gadgetry:

  • Revere Stereo 33 35mm camera, released around 1953. In perfect condition; works great.
  • Stereo Realist by the David White Company, available from 1947 to 1971. In perfect condition; works great.
  • Poppy 3D for iPhone, via Kickstarter. This works great too, except it’s a little painful to manage the files. Overall it’s easier than shooting film, developing it, scanning it and putting it together for digital viewing, but the viewing/playback aspect of the product could be easier.

Once I had my pictures, the next step was to build a program that would let me look at them through a VR headset. Since I have no time or interest in learning how to use a game engine, I was left with only one option: the web browser and WebGL/Three.js. I already knew there are experimental builds of Firefox and Chrome that are compatible with the Oculus Rift DK2 headset, and I also wanted something that could run in a mobile browser for Google Cardboard viewing. I knew where to get sample boilerplates from the Chrome VR team and Mozilla, so all I had left to do was find a way to feed two textures onto the same piece of geometry while rendering for each eye. Luckily I found a code example that did just that, and it didn’t take me long to adjust it to my needs. You can find the code here (Stereovision).
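The idea behind feeding each eye a different image can be sketched without any Three.js code at all. Assume (my assumption, not necessarily how the linked Stereovision code does it) an over-under stereo image with the left eye’s photo on the top half and the right eye’s on the bottom: each eye’s render pass then just offsets and scales the texture’s vertical coordinate so the same quad shows a different half of the image. In Three.js the two numbers below would typically end up on `texture.repeat.y` and `texture.offset.y` of two clones of the same texture, one per eye.

```javascript
// Minimal sketch: a per-eye UV window into an over-under stereo image.
// The geometry is shared; only the texture window differs per eye.
// (Assumed layout: left-eye photo on the top half, v in [0.5, 1];
// right-eye photo on the bottom half, v in [0, 0.5].)
function stereoWindow(eye) {
  if (eye !== "left" && eye !== "right") {
    throw new Error("eye must be 'left' or 'right'");
  }
  return {
    repeatY: 0.5,                        // each eye sees half the image height
    offsetY: eye === "left" ? 0.5 : 0.0, // shift the window to the eye's half
  };
}

// Map a full-range v coordinate (0..1) into the chosen eye's half.
function sampleV(eye, v) {
  const w = stereoWindow(eye);
  return w.offsetY + v * w.repeatY;
}
```

A side-by-side layout works the same way with the horizontal coordinate instead; only the axis of the offset/repeat pair changes.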

A stereo card of a woman using a stereoscope circa 1901. Via Wikipedia

My Stereo Realist after a photo session

Stereovision for Cardboard on my iPhone. Featured photo taken with the Revere Stereo 33

Stereovision screenshot. Left and right images are rendered for the corresponding eye