Time-Life just launched its LifeVR app featuring a 360 video piece I directed called Fast Ride. A few months ago I spent a day with my colleagues from Wevr at the Laguna Seca raceway, where the Mazda racing team took out all their vintage cars for a spin, as they do every year. J. H. Harper wrote an in-depth piece about the event for The Verge that explains everything about these amazing cars.
The following picture shows driver Jeremy Barnes tagging along for a fast ride with himself, just after he finished his real-life lap. It was a lot more intense for him to be the passenger in VR than to be the driver in real life 😀
I’ve been working on some new cover art for the Psychedelic Rock band Rest in Haste. I am not done yet, but some interesting things are coming out. I will be posting full resolution updates in my Rest in Haste Flickr album.
Last night I was looking at Depthkit and 8i while researching options for video-based 3D capture, and I felt inspired to rescue my old Kinect from the bottom of a drawer. I got a fresh copy of openFrameworks, and ten minutes later I had a running build of the openFrameworks Kinect example on my computer. It was like time-traveling to 2011. I remember capturing a point cloud of Amy pregnant, back before Maya was born. A couple of months later the Kinect found oblivion in the bottom of a drawer, and I stopped using openFrameworks until now.
I still remember when, exactly ten years ago today, @johnmaeda recommended that I start a blog, shortly after I joined the Media Lab under his tutelage. I think he meant for me to use it as a tool for self-promotion, but my blogging enterprise promptly evolved into a sort of introspective public journal with an audience of approximately one, myself always included. And now ten years have passed: ten years since my first WordPress installation, and ten years since I wrote these words. I still believe that Information is not Knowledge is not Wisdom is not Love, but my priorities, motivations and interests have shifted greatly since that era. This is natural, since much has changed around and within me. Even blogs, which felt so new back then (every cool kid had one), feel like archaeological artifacts compared to contemporary digital media. I have grown very fond of this journal, and I will continue writing in it and keeping my little server running for as long as I can. The privilege of having this window into my personal history is priceless.
A couple of days ago, I participated in a Reddit AMA about the VR series GONE with my friends from @Skybound and @PettyJTyrant. It was a great opportunity to revisit the creative and technical challenges we faced during this crazy adventure, as well as all our accomplishments. So far I have produced or directed over a dozen cinematic VR projects, all of them valuable in their own right, but GONE surpasses all others in breadth and depth. At no other time have I been able to explore, test and develop so many cinematic VR storytelling techniques. From camera moves to visual effects and interactive features, GONE remains at the bleeding edge of cinematic VR today. No other piece of 360 video content out there can compare to it, even though we finished production about a year ago and the first episode aired shortly after. We pulled off some crazy shit on this project. In the future, people will wonder how we achieved what we did at a time when there were ZERO off-the-shelf production and postproduction tools for this kind of filmmaking. From DIY makeshift camera systems to painfully laborious postproduction techniques and previously nonexistent user experience design, we figured out a way to make it all happen. I only wish it had been promoted as well as it deserves.
Instead of talking about what’s usually understood as Virtual Reality, I decided to talk about what Virtual Reality means to a person like me, currently working in the field within the constraints and requirements of an overly supportive yet confused and almost pathologically optimistic entertainment industry. In that context, there were three points that I wanted to make as clear as possible:
First, there are ways to talk to the Web that go beyond the page/scroll metaphor: imagine pulling data from the web like you pull thoughts, impressions and memories from the corners of your mind. Second, immersive video matters and requires a language of its own: memories and other sampled content are at least as important as simulations. Third, the web browser is today the most powerful storytelling machine: it is the only platform we have that can fully reach across all aspects of digital media.
The content of my talk will eventually be available online here.
After my presentation and the discussion that followed, Director JoAnn Kuchera-Morin invited me to play with the AlloSphere, and I spent the next few hours immersed in the most spectacular stereoscopic scientific visualizations you could ever imagine. There is something about the AlloSphere that makes it incredibly effective at rendering virtual objects in space when you wear 3D glasses. You can almost touch the damn things floating around you. I took some pictures, but none of them does it justice. Just like with Virtual Reality, you have to experience it yourself before you can fully understand what it is.