Increasingly, our projects involve video integration that goes beyond the normal use of an HTML5 video element: finely controlling the playhead through mouse or touch interaction, including video in a complex cropping or alpha-channel composition, or simply needing automatic playback in an iOS web environment. We’re developing a set of methods that involve file processing with ffmpeg, canvas drawing, precise framerate control with requestAnimationFrame, and audio synchronization with the Web Audio API. I’d like to walk you through some of these steps to give a general idea of how we do this and what is possible, and to show how the technology plays out when we go on to add crazy things like CSS transforms, SVG programming, and even Voronoi equations to the moving-picture soup. Kind of like a 21st-century Eadweard Muybridge.
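As a flavor of the approach, here is a minimal sketch of the canvas-plus-requestAnimationFrame idea: instead of calling `play()` on the video, we scrub its playhead ourselves and blit each frame to a canvas. The element names, the `frameForTime` helper, and the fixed 30 fps target are all hypothetical choices for illustration, not the presentation's actual code.

```javascript
// Pure helper: which frame index should be visible after `elapsedMs`
// of wall-clock time at a target framerate of `fps`?
function frameForTime(elapsedMs, fps) {
  return Math.floor(elapsedMs / (1000 / fps));
}

// Browser-only sketch (assumes a loaded <video> and a <canvas> exist):
// drive playback manually by setting currentTime each animation frame,
// then draw the current video frame into the canvas.
function startCanvasPlayback(video, canvas, fps) {
  const ctx = canvas.getContext('2d');
  const start = performance.now();
  function tick(now) {
    const frame = frameForTime(now - start, fps);
    video.currentTime = frame / fps;   // scrub the playhead ourselves
    ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
    requestAnimationFrame(tick);
  }
  requestAnimationFrame(tick);
}
```

Because the canvas is just pixels, everything downstream (cropping, alpha masking, CSS transforms) can treat the video as ordinary drawable content, and the same manual-playhead trick is one way around iOS autoplay restrictions.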
An Xtreme Labs lounge presentation
Only at FITC. Free drinks. 30-minute presentations. Cool topics! Unique, on-the-fly presentations from the world’s top creators.