Archive for the 'media' Category

Fast Ride on LifeVR

Wednesday, September 21st, 2016

Time-Life just launched their LifeVR app featuring a 360 video piece I directed called Fast Ride. A few months ago I spent a day with my colleagues from Wevr at the Laguna Seca race track, where the Mazda racing team took all their vintage cars out for a spin like they do every year. J. H. Harper wrote an in-depth piece about the event for The Verge that explains everything about these amazing cars.

The following picture shows driver Jeremy Barnes tagging along for a fast ride with himself just after he finished his real-life lap. It was a lot more intense for him to be the passenger in VR than to be the driver in real life 😀

jeremy barnes vr

openFrameworks + Kinect still working

Thursday, July 28th, 2016

Last night I was looking at DepthKit and 8i while researching options for video-based 3D capture, and I felt inspired to rescue my old Kinect from the bottom of a drawer. I got a fresh copy of openFrameworks, and ten minutes later I had a running build of the openFrameworks Kinect example on my computer. It was like time-traveling to 2011. I remember capturing a point cloud of Amy pregnant, back before Maya was born. A couple of months later the Kinect found oblivion in the bottom of a drawer and I stopped using openFrameworks until now.

IMG_6296-550

IMG_6268-550

Visiting UCSB to talk about Virtual Reality

Tuesday, May 10th, 2016

My friend and computational geometry genius Pablo Colapinto invited me to give a talk on VR at the Media Arts and Technology Graduate Program at his alma mater, UCSB. I met Pablo a while ago when visiting Colombia for a similar reason, where we quickly bonded over symmetry groups, OpenGL and cinema.

Instead of talking about what’s usually understood as Virtual Reality, I decided to talk about what Virtual Reality means to a person like me, currently working in the field within the constraints and requirements of an overly supportive yet confused and almost pathologically optimistic entertainment industry. In that context, there were three points that I wanted to make as clear as possible:

- There are ways to talk to the Web that go beyond the page/scroll metaphor: imagine pulling data from the web like you pull thoughts, impressions and memories from the corners of your mind.
- Immersive video matters and it requires a language of its own: memories and other sampled content are at least as important as simulations.
- The web browser is today the most powerful storytelling machine: it is the only platform we have today that touches every aspect of digital media.

The content of my talk will eventually be available online here.

Luis-IG

After my presentation and the discussion that followed, I was invited by Director JoAnn Kuchera-Morin to play with their AlloSphere, and I spent the next few hours immersed inside the most spectacular stereoscopic scientific visualizations you could ever imagine. There is something about the AlloSphere that makes it incredibly effective at rendering virtual objects in space when wearing 3D glasses. You can almost touch the damn things floating around you. I took some pictures but none of them do it justice. Just like with Virtual Reality, you have to experience it yourself before you can fully understand what it is.

Run the Jewels Crown is out

Thursday, March 10th, 2016

A few months ago I produced a VR music video with my friends at Wevr for the hip-hop duo Run the Jewels. I’m proud to announce that The New York Times just dropped it in their nytvr mobile app.

Here is an insightful article written by @djabatt on the importance of matching hip-hop with virtual reality.

360° Premiere: Run The Jewels's "Crown"

Run The Jewels keeps pushing boundaries with their 360° video for "Crown," premiering on The New York Times's virtual-reality app. Check out the full video from Killer Mike GTO and EL-P: http://nyti.ms/1QOIbed

Posted by The New York Times on Thursday, March 10, 2016

crown-nyt

Synchrony 2016

Saturday, January 9th, 2016

I am attending a demo party called Synchrony NYC. It is hosted at a place near Union Square in Manhattan called Babycastles and organized by my old friend @nickmofo, who invited me to give a talk about virtual reality.

Synchrony103a_white1
Synchrony promotional image by Raquel Meyers.

Anamorphic iPhone lens

Sunday, December 27th, 2015

@djabatt just gave me a 1.33x anamorphic lens adapter from Moondog Labs for my iPhone. It’s great. Now I finally believe you can make movies with an iPhone, or at least use it to reliably preview real movie stuff. I love it.

IMG_8151D

IMG_8151C

IMG_8151b

A-FRAME, a markup language for browser-based VR

Wednesday, December 16th, 2015

aframe-1

I have been fooling around with ThreeJS and virtual reality boilerplates for desktop and mobile browsers using Oculus and Cardboard for a while, but this just takes things to a whole new level.

A-Frame is described by its creators as

an open source framework for easily creating WebVR experiences with HTML. It is designed and maintained by MozVR (Mozilla’s virtual reality research team). A-Frame wraps WebGL in HTML custom elements, enabling web developers to create 3D VR scenes that leverage WebGL’s power, without having to learn its complex low-level API. Because WebGL is ubiquitous in modern browsers on desktop and mobile, A-Frame experiences work across desktop, iPhone (Android support coming soon), and Oculus Rift headsets.

This is not the first time we have seen something like this (remember VRML and, more recently, GLAM), but it is the first time I sense a strong design and content oriented vision behind a toolset of this kind. It has clearly been built with the full spectrum of creative people that currently fuel the web and the mobile space in mind, which I hope will help it stick around. To see what I mean, just open http://aframe.io/ in the browser on your iPhone if you have one (sorry, Androids), browse through the examples, and hit that cardboard icon.
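To give a feel for what "WebGL wrapped in HTML custom elements" looks like in practice, here is a minimal scene sketch. The a-scene, a-box, a-sphere and a-sky elements are A-Frame's, but the exact version pinned in the script URL and the colors and positions are placeholders of my own:

```html
<html>
  <head>
    <!-- A-Frame library; pin whatever release you are actually targeting -->
    <script src="https://aframe.io/releases/0.2.0/aframe.min.js"></script>
  </head>
  <body>
    <!-- The whole 3D scene is declared as markup, no WebGL calls needed -->
    <a-scene>
      <a-box position="-1 0.5 -3" rotation="0 45 0" color="#4CC3D9"></a-box>
      <a-sphere position="0 1.25 -5" radius="1.25" color="#EF2D5E"></a-sphere>
      <a-sky color="#ECECEC"></a-sky>
    </a-scene>
  </body>
</html>
```

Open that file in a browser and you get a navigable stereoscopic scene; that declarative surface is exactly what makes it approachable for people who already write HTML.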

Screen Shot 2015-12-20 at 11.04.51 AM

Finally, I just stole a drawing from an article by @ngokevin where he explains what’s so special about A-Frame and the entity-component-system design pattern at its core.

entity-component
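The entity-component idea in that drawing can be sketched in markup too: an entity is just a container, and behavior is attached by composing components onto it. The registerComponent call is A-Frame's API, but the "spin" component below, its name, schema and speed parameter, are hypothetical, made up here for illustration:

```html
<script src="https://aframe.io/releases/0.2.0/aframe.min.js"></script>
<script>
  // Hypothetical "spin" component: rotates its entity a bit every frame.
  AFRAME.registerComponent('spin', {
    schema: { speed: { type: 'number', default: 45 } }, // degrees per second
    tick: function (time, delta) {
      var rotation = this.el.getAttribute('rotation');
      rotation.y += this.data.speed * (delta / 1000);
      this.el.setAttribute('rotation', rotation);
    }
  });
</script>

<a-scene>
  <!-- The box entity gains spinning behavior just by listing the component -->
  <a-box spin="speed: 90" position="0 1 -3" color="#4CC3D9"></a-box>
</a-scene>
```

Instead of a deep inheritance tree of object classes, you mix small reusable behaviors onto plain entities, which is the same pattern most game engines use under the hood.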