Archive for the 'web' Category

Reddit AMA for VR Thriller Gone

Thursday, June 9th, 2016

A couple of days ago, I participated in a Reddit AMA about the VR series GONE with my friends from @Skybound and @PettyJTyrant. It was a great opportunity to revisit the creative and technical challenges we faced during this crazy adventure, as well as all our accomplishments. So far I have produced or directed over a dozen cinematic VR projects, all of them valuable in their own right, but GONE surpasses all others in breadth and depth. At no other time have I been able to explore, test and develop so many cinematic VR storytelling techniques. From camera moves to visual effects and interactive features, GONE remains at the bleeding edge of cinematic VR today. No other piece of 360 video content out there can compare to it, even though we finished production about a year ago and the first episode aired shortly after.

We pulled off some crazy shit on this project. In the future, people will wonder how we achieved what we did at a time when there were ZERO off-the-shelf production and postproduction tools for this kind of filmmaking. From DIY makeshift camera systems to painfully laborious postproduction techniques and previously non-existent user experience design, we figured out a way to make it all happen. I only wish it were promoted as well as it deserves.

Sports Illustrated Swimsuit Edition in VR

Monday, February 15th, 2016

I have to admit it caught me by surprise. I’ve never found my work at the top of my twitter feed before. The Sports Illustrated Swimsuit Edition 2016 in Virtual Reality is out and available for download today. As Adi Robertson from The Verge pointed out, this might be the first time VR video has been made available to the public for a price. Will people buy it? Time will tell.


And of course hanging out with the Models and my pals from Wevr in the Dominican Republic for a week was a difficult endurance test that challenged every professional skill we’ve developed as virtual reality filmmakers over the last couple of years 😀


A-FRAME, a markup language for browser-based VR

Wednesday, December 16th, 2015


I have been fooling around with ThreeJS and virtual reality boilerplates for desktop and mobile browsers using Oculus and Cardboard for a while, but this just takes things to a whole new level.

A-Frame is described by its creators as

an open source framework for easily creating WebVR experiences with HTML. It is designed and maintained by MozVR (Mozilla’s virtual reality research team). A-Frame wraps WebGL in HTML custom elements, enabling web developers to create 3D VR scenes that leverage WebGL’s power, without having to learn its complex low-level API. Because WebGL is ubiquitous in modern browsers on desktop and mobile, A-Frame experiences work across desktop, iPhone (Android support coming soon), and Oculus Rift headsets.

It is not the first time we see something like this —remember VRML and, more recently, GLAM— but it is the first time I sense a strong design and content oriented vision behind a toolset of this kind. It has clearly been built with the full spectrum of creative people who currently fuel the web and the mobile space in mind, and I hope this will help it stick around. To see what I mean, just open the A-Frame site in the browser on your iPhone if you have one (sorry, Androids), browse through the examples, and hit that cardboard icon.
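If you are curious about what that markup actually looks like, here is a minimal scene in the spirit of the hello-world examples. Treat it as a sketch: the script URL below is a placeholder (grab the current build from aframe.io), and the primitive attributes should be double-checked against the docs.

```html
<!DOCTYPE html>
<html>
  <head>
    <!-- Placeholder path: point this at the current A-Frame build from aframe.io -->
    <script src="https://aframe.io/releases/latest/aframe.min.js"></script>
  </head>
  <body>
    <!-- An entire VR scene declared with custom HTML elements -->
    <a-scene>
      <a-box position="-1 0.5 -3" rotation="0 45 0" color="#4CC3D9"></a-box>
      <a-sphere position="0 1.25 -5" radius="1.25" color="#EF2D5E"></a-sphere>
      <a-plane position="0 0 -4" rotation="-90 0 0" width="4" height="4" color="#7BC8A4"></a-plane>
      <a-sky color="#ECECEC"></a-sky>
    </a-scene>
  </body>
</html>
```

Open that file in a browser and you get a scene you can look around in; on a phone, the cardboard icon turns it into a stereo view.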


Finally, I just stole a drawing from an article by @ngokevin where he explains what’s so special about A-Frame and the entity-component-system design pattern at its core.
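The gist, as I understand it: everything in an A-Frame scene is a plain <a-entity>, and you give it appearance and behavior by attaching components as HTML attributes; primitives like <a-box> are just convenience wrappers. A rough entity-component version of the box from the sketch above (again, check the component syntax against the current docs):

```html
<!-- The same box, expressed as a bare entity with components attached -->
<a-entity
  geometry="primitive: box; width: 1; height: 1; depth: 1"
  material="color: #4CC3D9"
  position="-1 0.5 -3"
  rotation="0 45 0">
</a-entity>
```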


Friday, October 16th, 2015

They say “build and share virtual reality for everyone”: it relies on WebGL and ThreeJS to deliver a browser-based tool that lets users create simple virtual reality pieces that can be explored using Google Cardboard. I tried it, and it took me no time to make a simple scene. As far as I know, the barrier of entry for VR can’t get any lower than this, both in terms of creating and sharing. It is pretty fun to use and delivers interesting results in five minutes. I still don’t know if I’ll ever actually use it, but I seriously recommend you check it out. As VR becomes more relevant to the regular digital user, more of these tools offering simple ways to make and share VR will appear, and I believe a lot of interesting stuff will emerge from this space.

PS: here is my profile page in case I ever do something worth publishing with this toolkit. For now I can already say it’s a useful prototyping tool for VR experience development and I have successfully used it a couple of times.



Delicious, 4397 bookmarks later

Sunday, March 8th, 2015

I posted my first bookmark to delicious on 7/13/06, back when it was still called del.icio.us. In spite of having changed owners a couple of times and survived a couple of not very fortunate redesigns, delicious might be the online service I have most consistently used to aggregate annotated content from the web. Gone are the days when I interacted with it socially; most of the users in my network haven’t used it in a very long time, but I still find pleasure in using it to collect interesting links and track my browsing preferences by exploring my data. Unlike other services from that era, delicious has kept a simple API available without forcing any horrendous authentication protocols on its users. This has allowed me to keep my delicious tags page alive —a simple sketch where I render all my tags using size and color to visualize frequency of usage. At this point, it’s pretty clear what my favorite webpages and websites are about.
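The tags page is not much more sophisticated than the following sketch. This is not the actual code behind it, just the idea in plain JavaScript: given a tag→count object (however you pull it out of your delicious data; the names and numbers here are made up), map frequency to font size and hue.

```js
// Sketch only: `tagCounts` stands in for whatever your delicious export or API call returns.
const tagCounts = { webgl: 42, vr: 37, threejs: 21, processing: 9, cheese: 1 };

const counts = Object.values(tagCounts);
const max = Math.max(...counts);
const min = Math.min(...counts);

const cloud = document.createElement('div');
document.body.appendChild(cloud);

Object.entries(tagCounts)
  .sort((a, b) => b[1] - a[1])                      // most-used tags first
  .forEach(([tag, count]) => {
    const t = (count - min) / (max - min || 1);     // normalize to 0..1
    const span = document.createElement('span');
    span.textContent = tag + ' ';
    span.style.fontSize = (10 + t * 38) + 'px';     // size encodes frequency
    span.style.color = 'hsl(' + Math.round(220 - t * 220) + ', 70%, 45%)'; // so does hue
    cloud.appendChild(span);
  });
```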


I wonder if any knowledge can be inferred from the tag diversity expressed by an active user in a given amount of time. Does it reflect something about the user’s vocabulary as well as the diversity of their interests? Is there something in common about a group of users that grow their collections of tags and bookmarks at similar rates even if the bookmarks and tags have nothing in common? Can this behavior be evidence of a personal and/or philosophical disposition from users towards knowledge? In section 2.2.2 of Mr. Palomar (The cheese museum), Italo Calvino conjectures that a proclivity towards or against sample diversity will influence —and even shape— the nature of the knowledge acquired from a given experience, in his case, the quest for truth in the appreciation of a particular cheese.

If you have been a delicious user, you can visit my delicious tags page and pass along your username as a parameter. My page will return a nicely crafted version of your delicious data, perhaps helping you learn something you didn’t know about yourself.

IML400 Spring 2015

Sunday, February 15th, 2015

The time has come to teach IML-400 at USC again, and this time around things are a little different. It is the first time I get a batch of students who had to take a prerequisite class, IML-300, before they could join mine. This means I can jump ahead and make some assumptions about my students’ general knowledge, which will hopefully help us move faster into the fun stuff and really take advantage of the browser as an interactive programming playground.

In addition to this, the class got split into two smaller groups of around twelve students each, and I am only teaching one of them, while my colleague Raphael Arar teaches the other. When talking with Raphael about previous iterations and the future of the class, we decided to design a new syllabus together based on my previous one, but taking into consideration Raphael’s teaching interests, the more advanced nature of this class, and aspects of the web that are a lot more mature today than they were during my previous iteration of IML-400 a year ago. Specifically, I wanted students to put aside the page-based nature of the web we have today and think about the things they can do with Web Audio and WebGL in emerging contexts like mobile WebVR, for example.

I see my class not as a design class, but as a creative innovation class. When thinking about new media, user interface, user interaction and user experience design are important things to understand, explore and develop as skills, but we are at a point where some design paradigms —like the page/scroll nature of today’s web— have reached a degree of maturity that leaves very little room for the pure, unbiased creative experimentation that will eventually drive the emergence of fresh new media. There is so much more of the web still coming to us.

Having a partner in crime on this teaching adventure is the best thing that has ever happened to me and to this class. We are only a few weeks into the semester, and Raphael and I have already established a relationship where we exchange impressions about how the class is going every week and iterate on our teaching approach together. It’s really great to have someone to talk to at this level 😀





Friday, December 12th, 2014

Sometime in the nineteen fifties, a serious attempt was made to bring stereoscopic photography to the masses. Stereoscopic photography faces adoption challenges similar to those of 3D movies and Virtual Reality, because to this day there is no easy way to experience any of them without attaching a contraption to your face. An interesting note: while 3D movies and Virtual Reality are fairly recent, stereoscopic photography has been around since the eighteen fifties, and commercial viewers were even mass-produced back then.

In my quest to learn how to make my own Virtual Reality work, I got interested in displaying stereoscopic photography in VR, inspired by the idea that, properly placed in VR space, a stereoscopic photo can feel a lot like a VR sculpture or a VR hologram: a moment frozen in time with great potential to make the viewer feel “there with it,” as opposed to just looking at a flat projection of it in a two-dimensional picture.

While I was looking for cheap and easy stereoscopic camera systems, I came across the historical mid-twentieth-century consumer cameras, and it was easier for me —or at least more reliable/fun/interesting— to get my hands on a couple of film cameras and start taking pictures than to find a digital solution. Eventually I came across a smartphone solution on Kickstarter that worked fairly well, but not before I was already shooting stereoscopic 35mm film all over the place. Here is an inventory of my stereoscopic gadgetry:

  • Revere Stereo 33 35mm camera, released around 1953. In perfect condition. Works great.
  • Stereo Realist by the David White Company, available from 1947 to 1971. In perfect condition. Works great.
  • Poppy 3D for iPhone, via Kickstarter. This works great too, except it’s a little painful to manage the files. Overall easier than shooting film, developing it, scanning it and putting it together for digital viewing, but it could be easier on the viewing/playback side of the product.

Once I had my pictures, the next step was to build a program that would let me look at them through a VR headset. Since I have no time or interest in learning how to use a game engine, I was left with only one option: the web browser and WebGL/ThreeJS. I already knew there were experimental builds of Firefox and Chrome compatible with the Oculus Rift DK2 headset, and I also wanted something that could run in a mobile browser for Google Cardboard viewing. I knew where to get sample boilerplates from the Chrome VR team and Mozilla, so all I had left to do was find a way to feed two textures onto the same piece of geometry while rendering for each eye. Luckily I found a code example that did just that, and it didn’t take me long to adjust it to my needs. You can find the code here (Stereovision).
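The linked code has the details; what follows is just a minimal sketch of the general trick in ThreeJS, using object layers so that each eye’s camera only sees the plane carrying its own photo (file names and sizes are placeholders):

```js
// Sketch: display a left/right photo pair in stereo using three.js layers.
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(70, window.innerWidth / window.innerHeight, 0.1, 100);
const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

const loader = new THREE.TextureLoader();

// One textured plane per eye, each carrying one half of the stereo pair.
function addEyePlane(url, layer) {
  const mesh = new THREE.Mesh(
    new THREE.PlaneGeometry(4, 3),                        // rough photo aspect ratio
    new THREE.MeshBasicMaterial({ map: loader.load(url) })
  );
  mesh.position.set(0, 0, -5);
  mesh.layers.set(layer);                                 // visible only to cameras with this layer enabled
  scene.add(mesh);
}
addEyePlane('left.jpg', 1);   // placeholder file; layer 1 = left eye
addEyePlane('right.jpg', 2);  // placeholder file; layer 2 = right eye

// StereoCamera derives a left and a right camera from the main one each frame.
const stereo = new THREE.StereoCamera();
stereo.cameraL.layers.enable(1);  // left camera also sees layer 1
stereo.cameraR.layers.enable(2);  // right camera also sees layer 2

function render() {
  requestAnimationFrame(render);
  camera.updateMatrixWorld();
  stereo.update(camera);

  const w = window.innerWidth / 2;
  const h = window.innerHeight;
  renderer.setScissorTest(true);

  renderer.setViewport(0, 0, w, h);   // left half of the screen
  renderer.setScissor(0, 0, w, h);
  renderer.render(scene, stereo.cameraL);

  renderer.setViewport(w, 0, w, h);   // right half of the screen
  renderer.setScissor(w, 0, w, h);
  renderer.render(scene, stereo.cameraR);
}
render();
```

Side-by-side rendering like this is essentially what Cardboard expects on a phone screen.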

A stereo card of a woman using a stereoscope circa 1901. Via Wikipedia

My Stereo Realist after a photo session

Stereovision for Cardboard on my iPhone. Featured photo taken with the Stereo Revere

Stereovision screenshot. Left and right images are rendered for the corresponding eye