04/07/2014 - 12:38
I am honored to be included as a finalist in the inaugural Design for Experience Awards in the "Bridging Digital and Physical Experiences" category. The awards were organized by UX Magazine.
At Unified Field we create content-rich, experiential and interactive media for next-generation digital branding environments, multichannel media experiences and interactive exhibits that engage audiences where the physical and the virtual converge. While working at Unified Field I have participated in and led the software development of a number of these projects.

Most recently I developed simulators and interactive exhibits for the Space Shuttle Atlantis attraction at the Kennedy Space Center in Florida. I built four separate multi-screen simulators that let visitors go on a spacewalk using gesture sensors, operate the shuttle's robotic arm, dock the shuttle to the International Space Station and land the shuttle. Visitors operate the simulator software in unison with realistic physical mock-ups of the shuttle cockpit, a unique way to learn what it's like to train as an actual astronaut. The exhibits are arranged around Atlantis itself, which is on display nearby, and deepen the visitor's connection to the shuttle and its legacy.

I also recently developed an immersive experience for the Birds of Paradise exhibit at the National Geographic headquarters in Washington, D.C. The "Dance Dance Evolution" experience is an interactive exhibit that lets visitors control virtual 3D Parotia birds and replicate moves from the intricate mating dances the birds perform to attract a mate. Two visitors "dance" against one another simultaneously while gesture tracking maps their moves onto the virtual birds; the moves are then algorithmically applied to the animated 3D bird models. The system scores how well each player performs, and surrounding visitors can rate the dancers by voting for one or the other. Spatially, the physical setup mirrors how two male Parotia birds dance in a jungle clearing while females observe from branches above.
Another recent project I worked on was the GE: What Works conference on American competitiveness at the Mellon Auditorium in Washington, D.C. We developed a number of innovative programs, including gesture-based presentations, multi-touch projection tables, mediascapes and an immersive interactive environment that connected GE's key job-creation initiatives to its target audience. Visitors included members of Congress and senior GE managers in addition to the general public. I developed the software for two content-rich multimedia gesture tables that many participants could explore simultaneously using simple gestures above the table surface. I also helped create a library of content-specific gestures for two interactive walls that made product and historical content highly engaging.
02/26/2014 - 10:04

New MTA subway touchscreen displays have finally arrived; a number of the new systems were installed in Grand Central Station. They feature wayfinding and real-time status information, which is useful underground when PA announcements are unintelligible or when a station attendant is nowhere to be found (their numbers are steadily decreasing).



via Gizmodo:

New York subway riders first were promised futuristic touchscreen wayfinding maps a year ago. But the plan to install the futuristic infrastructure stalled as the design team took a step back to improve the hardware. Six months overdue, the first batch is finally live in Grand Central Station. They were worth the wait.

Over the last month, the first 18 MTA On the Go kiosks were installed in the Grand Central subway station. Eight of them are split between the uptown and downtown sides of the major 4/5/6 north-south arteries; the other 10 are scattered throughout the mezzanine above that connects the subway to the century-old commuter rail station. (Expect a wider roll out to more stations by the middle of the year.) The screens are basically huge interactive navigation centers, which serve up real-time information about how to get where you're going, and what (inevitable) service disruptions might get in the way.

More information about the pilot program can be found on the MTA website.
They will be moving ahead with a second phase of the project to expand the kiosks to more stations:

MTA New York City Transit announced today that it will move ahead with the second phase of a pilot project for On the Go! Travel Stations, adding at least 77 of the interactive touch-screen kiosks throughout the system that offer MTA travel information and a whole lot more.

... and the systems will be customized according to their location and the time of day:

The On the Go! Travel Stations can be customized for a specific location and by time of day.  For example, at the Penn Station Travel Station, during the morning, the screen will default to subway information and in the evenings it will default to LIRR service.  All content is remotely managed from a secure web-based management system and applications can be changed or updated as needed.

This is of particular interest to me, since I studied MTA subway information gaps as part of my thesis in the MFADT program at Parsons.

02/06/2014 - 18:44

This is a recap of the Volumetric Meetup I attended on Feb. 5, 2014 featuring Ken Perlin. 

Ken Perlin is a professor in the Department of Computer Science at New York University, directs the NYU Games For Learning Institute, and is a participating faculty member in the NYU Media and Games Network (MAGNET). He was also founding director of the Media Research Laboratory and director of the NYU Center for Advanced Technology. His research interests include graphics, animation, augmented and mixed reality, user interfaces, science education and multimedia. He received an Academy Award for Technical Achievement from the Academy of Motion Picture Arts and Sciences for his noise and turbulence procedural texturing techniques, which are widely used in feature films and television, as well as the 2008 ACM/SIGGRAPH Computer Graphics Achievement Award, the TrapCode award for achievement in computer graphics research, the NYC Mayor's award for excellence in Science and Technology, the Sokol award for outstanding science faculty at NYU, and a Presidential Young Investigator Award from the National Science Foundation.

In short, he is the creator of "Perlin noise," the procedural texturing technique.

He started off by showing procedural characters that can be directed rather than hand-animated, using simple AI, similar to the approach Pixar takes. He then asked: what would it be like to have these characters interact with real objects?
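Since the talk centered on his noise work, a quick illustration is in order. Below is a minimal 1D gradient-noise sketch in the spirit of Perlin noise; the hash and structure are my own simplification for readability, not Perlin's reference implementation.

```cpp
#include <cmath>
#include <cstdint>

namespace noise1d {

// Quintic fade curve from the improved-noise formulation: 6t^5 - 15t^4 + 10t^3.
// It has zero first and second derivatives at t = 0 and t = 1, which hides
// the lattice cell boundaries.
inline double fade(double t) {
    return t * t * t * (t * (t * 6.0 - 15.0) + 10.0);
}

// Hash an integer lattice coordinate to a pseudo-random gradient of +1 or -1
// (a toy stand-in for Perlin's permutation-table lookup).
inline double gradient(int32_t ix) {
    uint32_t h = static_cast<uint32_t>(ix) * 2654435761u; // Knuth-style hash
    return (h & 1u) ? 1.0 : -1.0;
}

// Gradient noise at position x: each neighboring lattice point contributes
// its gradient times the offset to x, smoothly blended by the fade curve.
inline double noise(double x) {
    int32_t i0 = static_cast<int32_t>(std::floor(x));
    int32_t i1 = i0 + 1;
    double f  = x - i0;                    // fractional position in the cell
    double g0 = gradient(i0) * f;          // contribution from left lattice point
    double g1 = gradient(i1) * (f - 1.0);  // contribution from right lattice point
    return g0 + (g1 - g0) * fade(f);       // smooth blend
}

} // namespace noise1d
```

By construction the noise is exactly zero at every integer lattice point; summing several octaves at doubled frequencies and halved amplitudes gives the familiar turbulence used for procedural texturing.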

11/10/2013 - 13:49

Unified Field was part of the developer preview program for the Leap Motion Controller, and we considered using it for the Kennedy Space Center exhibit. The official Leap controller didn't launch until the summer of 2013, around the time the museum opened, so we weren't able to use it there. Since then I created a demo by hooking it up to the Shuttle Landing Simulator I developed for Kennedy, and we set it up at our Unified Field booth at the 2013 ASTC trade show in Albuquerque, New Mexico.

Users advance through the interface by "tapping" a finger in the air; the original installation used a joystick with a button for this. The shuttle's flight path is controlled by tilting one's hand the way an airplane banks and pitches.

An on-screen guide hints at the necessary gestures.
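The hand-to-flight mapping can be sketched roughly as follows. This is a hypothetical illustration, not the demo's actual source: the struct names and the vector math are mine, not the Leap SDK's API.

```cpp
#include <algorithm>
#include <cmath>

// Illustrative types standing in for whatever the tracking layer provides.
struct Vec3 { double x, y, z; };

struct FlightInput {
    double roll;   // bank left/right, radians
    double pitch;  // nose up/down, radians
};

// Treat the hand like the airplane itself: leaning the palm sideways banks,
// leaning it forward/back pitches. `palmNormal` points out of the palm
// (straight up, {0, 1, 0}, when the hand is held flat). Angles are clamped
// so a jittery hand can't command extreme control inputs.
inline FlightInput handToFlight(const Vec3& palmNormal, double maxAngle = 0.6) {
    FlightInput in;
    in.roll  = std::clamp(std::atan2(palmNormal.x, palmNormal.y), -maxAngle, maxAngle);
    in.pitch = std::clamp(std::atan2(palmNormal.z, palmNormal.y), -maxAngle, maxAngle);
    return in;
}
```

A flat hand yields zero roll and pitch, and the clamp acts as a crude stability aid; in practice one would also low-pass filter the tracked normal between frames to smooth sensor noise.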

11/02/2013 - 13:15

I got to meet Col. Chris Hadfield at the Union Square Barnes and Noble signing event for his new book "An Astronaut's Guide to Life on Earth" on November 1st, 2013; my friend Mike Edwards came along as well. Hadfield is the astronaut who covered David Bowie's "Space Oddity" aboard the International Space Station, a video that has been seen over 18 million times. He was also a prolific social media author (YouTube, Twitter, etc.) while in space and recorded random space station sounds. I mentioned to him that I used his recordings of a spacesuit being turned on and off as sound effects in my EVA Spacewalk simulator for the Space Shuttle Atlantis exhibit.



06/29/2013 - 01:44

I was fortunate to be on-site for the opening of the Space Shuttle Atlantis exhibit at Kennedy Space Center in Florida. Forty former astronauts, representing crew members from all of Atlantis' missions, were in attendance, along with about 1,000 guests.

Guests are gathered around the stage and the astronauts under the shuttle during the festivities.


The picture below is a view of the museum's upper floor. It shows the proximity of the viewing platform to the shuttle, with its open payload bay doors, on the right; in the center along the rail is the six-screen multitouch timeline interactive we also built at Unified Field (every shuttle mission can be looked up); on the left are the blue shells that house the EVA Spacewalk Simulator I worked on. In the background is a full-scale model of the Hubble Space Telescope.

Upper floor of the museum.


A closer view of the EVA shells. Astronaut Jon McBride, right, who landed Atlantis on a mission in the '80s, watches another guest use the EVA.

06/07/2013 - 01:24

I presented a case study about our design and development work for the GE Works conference on behalf of Unified Field at the 2013 SEGD National Conference. The conference was titled "Above the Fog" and was held at the Fairmont Hotel in San Francisco, June 6-8, 2013. The photo above, taken during my presentation, is courtesy of the SEGD Twitter feed.

Another shot of a crowd at the conference.


These are some shots from the presentation I delivered about the GE project, including an early 3D rendering of the space from Thinc Design, with whom we worked on the project. They designed the space, and Unified Field designed and developed the media elements within it.

05/09/2013 - 23:28

This is a realistic earth rendered in OpenGL in real time, using GLSL shaders to dynamically blend the day, night, and cloud textures and a light-scattering algorithm to illuminate the atmosphere based on the sun's position. The earth is included in the simulators I developed for the Space Shuttle Atlantis exhibit at the Kennedy Space Center. The sun's elevation and position relative to the earth are accurate to the time and season reported by the computer on which it is running. I also created an app so the earth can be used in other programs with GPS coordinates. I am hoping to release that soon as an addon for openFrameworks.
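The two ingredients described above, the seasonal sun position and the day/night texture blend, can be sketched in plain C++ (the GLSL version is analogous). The constants and function names here are my own approximations, not the exhibit's source.

```cpp
#include <cmath>

const double kPi = 3.14159265358979323846;

// Approximate solar declination (radians) for a given day of the year, using
// the common cosine approximation; accurate to roughly a degree, which is
// plenty for placing the terminator on a visualization.
inline double solarDeclination(int dayOfYear) {
    const double deg = kPi / 180.0;
    return -23.44 * deg * std::cos(2.0 * kPi * (dayOfYear + 10) / 365.0);
}

// Day/night blend factor for a surface point: 1 = fully day texture,
// 0 = fully night texture. `cosSun` is the dot product of the surface normal
// and the unit sun direction; a smoothstep over a narrow band around zero
// softens the terminator line instead of producing a hard cut.
inline double dayNightBlend(double cosSun, double twilight = 0.1) {
    double t = (cosSun + twilight) / (2.0 * twilight); // remap [-tw, tw] -> [0, 1]
    t = std::fmin(1.0, std::fmax(0.0, t));
    return t * t * (3.0 - 2.0 * t); // smoothstep
}
```

In a fragment shader the same blend factor would drive `mix(nightColor, dayColor, t)` per pixel, with the cloud layer composited on top.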

Some of the shader source code was based on Sean O'Neil's article on atmospheric scattering.

Textures are from NASA's Blue Marble site.

01/17/2013 - 08:54

I visited Johnson Space Center in Houston, TX during a research trip for the Shuttle Atlantis project, to help interview astronaut instructors and to try the official NASA shuttle training simulators. It was humbling to be among the staff and astronauts who have kept NASA's manned space flight program running since the 1960s.

The Christopher C. Kraft Jr. Mission Control Center building is where mission control is based. It is hurricane-proof, and the flag on top of the building flies whenever an American astronaut is in orbit.

The original Apollo-era mission control room is still preserved exactly as it looked during the moon landings, Apollo 13, and the early shuttle missions.

The newer mission control room is the base of operations for the International Space Station.

10/10/2012 - 00:03

National Geographic asked Unified Field to build interactive experiences that engage visitors for the “Birds of Paradise” exhibit at the National Geographic headquarters in Washington DC. I authored a post over on the Unified Field blog that covers some of our process building the Dance Dance Evolution game, in which players dance against one another in order to control 3D virtual Parotia birds as they enact actual mating-dance moves that the birds utilize in nature.

The post discusses how we animated the virtual birds in the 3D game environment, the workflow we built to export the models from Maya and import them into my OpenGL/C++-based 3D environment, prototyping, and correlating human skeletal movement to a bird's.

Check it out here.
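As a hypothetical sketch of that retargeting step (not the production code), one basic building block is remapping a tracked human joint angle onto the bird rig's range of motion, since a human arm and a bird wing cover very different angular ranges.

```cpp
#include <cmath>

// Remap a tracked human joint angle (e.g. arm elevation from a skeleton
// tracker) onto a bird rig's joint range (e.g. wing raise). All angles in
// radians; the range endpoints are illustrative tuning values.
inline double retargetAngle(double humanAngle,
                            double humanMin, double humanMax,
                            double birdMin,  double birdMax) {
    // Normalize the human pose to [0, 1] within its expected range.
    double t = (humanAngle - humanMin) / (humanMax - humanMin);
    // Clamp so an out-of-range (or mis-tracked) pose can't break the rig.
    t = std::fmin(1.0, std::fmax(0.0, t));
    // Apply the normalized pose to the bird joint's range.
    return birdMin + t * (birdMax - birdMin);
}
```

One such mapping per correlated joint pair, plus smoothing between frames, is enough to let the animated bird "mirror" a dancer in real time.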

06/10/2012 - 21:42

I've been to a number of "creative technologist" conferences, and Eyeo has consistently featured one of the best speaker line-ups worldwide as far as I can tell. That's all the more surprising given its location in my hometown in the middle of the country. It's great that so many talented people have been making the trek out to Minnesota and getting to experience the Twin Cities in summertime.

The conference ran June 5-8, basically from morning to night each day. A lot has been written about it and much of the content is available online, so I'd like to put a developer lens on it and highlight the following creative technologists.

Robert Hodgin is a super talented developer, a co-founder of the Cinder code library, and one of the best speakers I've seen (this was the second time I'd seen him). He is highly original and humorous, and develops demos unique to each speaking engagement. You could tell he put a lot of time and thought into his talk, and it was entertaining because of it.


Andrew Bell is also a co-founder of Cinder, and his skills and experience are definitely on par with Robert Hodgin's. His presentation was also inspirational, and he touched on more holistic, career-oriented topics; his talk was highly enlightening because he shared so much of his vast experience in the industry.


Theo Watson and Emily Gobeille of Design I/O are former Parsons grads like myself, and they really made it clear why attending conferences can be rewarding. Theo and Emily did a better job than most of the other speakers of diving into their process and sharing material they wouldn't otherwise make public. The images below are an example of how they took the audience through some of their project development.