Deployed in London, NYC, London, and back to NYC. The experience comprised three pieces:
A Rube Goldberg-inspired story that played out across multiple synchronized screens, with a modified Jamboard at the center running a custom app that interfaced with the multi-display controller to sync interactions and playback.
Next, an interactive art gallery where visitors find their doppelgänger in the Metropolitan Museum of Art’s vast collection using Google’s machine vision… and get a physical print.
Finally, a room-scale, multi-surface projection that tracked users as they walked and explored the globe according to Google's Physical Network. Built with openFrameworks, which combined Kinect data, ran blob tracking, and broadcast the tracking data to two synced Unity apps.
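The broadcast step (one tracker feeding identical data to two synced render apps) can be pictured like this. A minimal Python stand-in for the openFrameworks sender; the localhost endpoints, port numbers, and JSON wire format are assumptions for illustration, not the original protocol:

```python
import json
import socket

# Hypothetical endpoints for the two synced Unity apps (assumed ports).
UNITY_ENDPOINTS = [("127.0.0.1", 9001), ("127.0.0.1", 9002)]

def encode_blobs(blobs):
    """Serialize tracked blob centroids (normalized 0-1 coords) as JSON."""
    return json.dumps(
        {"blobs": [{"id": i, "x": x, "y": y} for i, (x, y) in enumerate(blobs)]}
    ).encode()

def broadcast_tracking(blobs, sock=None):
    """Send the same tracking packet to every Unity endpoint so both
    apps render from identical data each frame."""
    sock = sock or socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    payload = encode_blobs(blobs)
    for addr in UNITY_ENDPOINTS:
        sock.sendto(payload, addr)
    return payload
```

Sending the identical payload to both endpoints, rather than letting each app track independently, is what keeps the two projected surfaces in lockstep.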
A real-time WebGL particle-system projection using both Kinect and 180-degree cameras for blob and skeletal tracking. I was technical director and developer: I built the tracking system, planned the projection setup, and coordinated all of our partners for on-site hardware and tools for filming and interaction. I worked with Kuva (now lusion) on the WebGL particle visuals, and with Rooftop Films for projection equipment.
Google's ATAP (Advanced Technology and Projects) division created the first affordable portable SLAM device, dubbed Project Tango. We partnered with them and Google's Art, Copy & Code team to create the first retail experience with the new prototype hardware. The idea was to create a real-world-scale game that mapped to the aisles inside Target stores and let kids play an epic snowball fight with characters placed in a winter wonderland. Four stores around the country were selected, and four levels were created that mapped the physical blueprints to world-scale geometry. We conceived and designed custom cases for the limited number of pre-production devices we had, and worked hand in hand with Google's ATAP group to develop the Unity SDK.
Call me Ishmael, or something, because this was a giant whale of a project. DEDON asked SarkissianMason for a fun piece for their NYC showroom. After a collaborative brainstorm, some refinement, and some R&D, we were on our way to building a full-fledged interactive installation that aimed to replicate grass flowing in the 'wind' generated by people moving along the sidewalk. I already had some experience toying around with physical computing, and I had recently had the opportunity to fuss with a Kinect and Processing, so naturally I figured that building an interactive sculpture using that tech was 'totally possible'. The following three weeks saw plenty of Processing and Arduino development, soldering, learning about controlling 30 stepper motors asynchronously, and patience. So much patience. At the end of the day, we had what you see in this video: smiling people. As with any ambitious project, there were some false starts. We discovered, for instance, that a Kinect is blind in direct sunlight, so I had to build a separate application that detected motion during the daytime using RGB data, alongside the IR depth-sensing program for nighttime. There was also some on-site debugging after the install. In the end, though, the final result was a great piece that people genuinely enjoyed, the client loved, and I was happy with.
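The daytime RGB fallback boils down to simple frame differencing: if the average per-pixel change between consecutive camera frames is large enough, something is moving. A minimal sketch in plain Python on grayscale frames, leaving out the camera I/O; the threshold value is a made-up example, not the installation's tuned number:

```python
def frame_delta(prev, curr):
    """Mean absolute per-pixel difference between two same-sized
    grayscale frames (2D lists of 0-255 intensity values)."""
    total, count = 0, 0
    for row_p, row_c in zip(prev, curr):
        for p, c in zip(row_p, row_c):
            total += abs(p - c)
            count += 1
    return total / count

def detect_motion(prev, curr, threshold=8.0):
    """True when the average pixel change exceeds the threshold:
    the core of a daytime RGB motion detector, minus camera capture."""
    return frame_delta(prev, curr) > threshold
```

In practice you'd also blur the frames first to suppress sensor noise, but the comparison itself is this simple.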
I whipped up this little guy to give as a Christmas gift. It was something I'd wanted to make ever since I came across an example on YouTube. I used parts from the Maker Shed but supplied my own enclosure. After some quick cuts, sanding, and paint, it was good to go. I probably should have let the paint dry longer… but Christmas waits for no one.
This is a mobile projection that populated the streets of Austin during SXSW 2011. Working with CNN, we created an RSS reader that served customized content: local SXSW news that updated and informed passing observers. Built as an AIR application, the piece was put together and deployed in one week using reusable components from the Sekati API and an auto-updating Automator script that pulled down the most recent SVN build for automatic remote deployment.
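The heart of a reader like this is just parsing the feed XML and pulling out item titles to rotate on screen. A minimal sketch with Python's standard library; the feed shape is generic RSS 2.0, and CNN's actual feed fields aren't reproduced here:

```python
import xml.etree.ElementTree as ET

def headlines(rss_xml, limit=5):
    """Extract item titles from an RSS 2.0 feed string, in document
    order, capped at `limit`: the rotating headline list, minus the UI."""
    root = ET.fromstring(rss_xml)
    items = root.findall("./channel/item")
    return [item.findtext("title", default="").strip() for item in items[:limit]]
```

A polling loop around this function (fetch, parse, diff against the last result) is enough to keep the display fresh as the feed updates.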
Ferragamo sought a social solution for the seasonal launch of their W bag. The W List was SarkissianMason's solution: a vote-based contest socialized via deep Facebook integration. I worked on the front end and Facebook integration alongside Jason Horwitz and Peter Segerstrom, who handled the backend and voting logic.
lovehatelove is a demo of the Shopmate product, developed while at SarkissianMason. At its core, Shopmate is a p2p video/audio/data chat that enables real-time collaborative shopping. Built on Adobe's Cirrus RTMFP service, the initial concept required no up-front bandwidth costs, but could scale to FMS via a quick configuration change if needed. Several iterations of the Shopmate platform were developed, including full Flash applications on one end and a hybrid JS/HTML/AJAX version with a real-time whiteboard that allowed for collaborative design of outfits from products pulled in from the Shopstyle API.
This project was the result of being tasked by SM's owner, Patrick Sarkissian, to “make something awesome”. Generally, “awesome” implies a great measure of success, and increasingly, success on the web is measured solely by traffic, click-through rate, and return on investment. Coming from the perspective of guys who take pride in crafting interactive experiences, we took this idea to its extreme conclusion and decided to make a site that measured inane statistics around our office and presented them on a beautiful Flash site. To accomplish this we built a wireless mesh of Arduino-driven devices that detected various physical metrics, logged passive network statistics, and created a web interface for people in the office to tally human statistics. All of this data was coordinated in a centralized database that fed a custom-developed socket server, which our Flash application connected to for live data. Things like elevator door opens vs. elevator button presses, toilet flushes, overheard corporate jargon, bathroom methane levels, and smoke breaks were all tallied and represented on the site. It all added up to the message of “The Quantification of Creativity”.
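The central tally can be pictured as a counter map whose snapshots get pushed out to connected clients. A minimal Python sketch; the metric names come from the project, but the newline-delimited JSON wire format is an assumption, since the original socket protocol isn't documented:

```python
import json

class StatTally:
    """In-memory stand-in for the centralized database of office metrics."""

    def __init__(self):
        self.counts = {}

    def record(self, metric, n=1):
        """Bump a counter, e.g. record("toilet_flushes")."""
        self.counts[metric] = self.counts.get(metric, 0) + n

    def snapshot(self):
        """A newline-terminated JSON frame, the shape a simple socket
        server might push to the Flash client after each update
        (assumed wire format, not the original protocol)."""
        return (json.dumps(self.counts, sort_keys=True) + "\n").encode()
```

A socket server wrapping this class would call `snapshot()` on every `record()` and write the frame to all open client connections, which is all "live data" needs to mean here.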
The Web of Secrets is a great little site I made with Roger Braunstein at SarkissianMason. I used FIVE3D for the main menu and the Sekati framework for the guts. The site let people anonymously submit shameful secrets into the ether. Users could browse secrets via a cross-relational database that highlighted words in each secret that also appeared in other secrets, creating an endless web that kept users on the site. Additionally, an iOS app was developed to supplement the site, so people could add to the database without needing Flash.
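The cross-relational trick reduces to set intersection on word lists: a secret's clickable words are the ones that also appear in at least one other secret. A minimal Python sketch; the stopword list and tokenization are illustrative, not the site's actual logic:

```python
import re

# Illustrative stopword set; filler words shouldn't become links.
STOPWORDS = {"the", "a", "i", "to", "my", "and", "of"}

def significant_words(secret):
    """Lowercased word set, minus stopwords, used for cross-linking."""
    return set(re.findall(r"[a-z']+", secret.lower())) - STOPWORDS

def shared_words(secret, others):
    """Words in `secret` that also appear in any other secret:
    these are the ones the site would render as highlighted links."""
    pool = set().union(*map(significant_words, others))
    return significant_words(secret) & pool
```

In production this lookup would be an indexed database query rather than an in-memory scan, but the relation being computed is the same.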
This was a fast-turnaround site made alongside my good friend and cohort Jason Horwitz at SM. Notable bits are the custom-built (pre-Flash 10) 3D rotations and smart use of bitmap slicing on larger assets for full page turns. Not to mention that we built this piece start to finish in about four days!
“Car in a Box”, as we called it, was one of the first large-scale development projects that I led and developed. Working in conjunction with art director Ness Higson, we produced a beautiful FWA-winning website that used Papervision 3D, scrubbable 3D video renders, and plenty of old-school Flashy goodness.
This was my first full piece with Papervision 3D in Flash, and my first at SarkissianMason. The cover gallery was a small value-added module proposed for a larger project, but the client was so happy with the piece that it remains on their site today. This was a fun project for me because my tinkering informed and led the design process, and almost all of the finished piece is in line with the idea I had developed.