This project is an interactive and immersive educational installation. Located at the Marine Science Magnet High School, it displays data pulled directly from NERACOOS buoys, as well as custom information entered by students and educators.
Students can interact directly with the screens as they explore the latest developments in marine science technology in the Long Island Sound.
The installation is composed of two large touch screens placed back to back. Each screen is fitted with a PQ Labs frame, which detects touch points. The installation runs on two independent computers, one of which also acts as the server for the CMS.
As I probably mentioned in earlier posts or tweets, Float4 most often uses RealMotion, its own proprietary software. The development was split in two: the RealMotion graph, which handles the interaction and embeds the content, built by Bruno Gohier; and the interface and content, which I built.
Much like Scaleform, RealMotion allows embedding Flash content. Since the content for this installation did not need to live anywhere else, and since it was mostly a two-dimensional multitouch graphical interface, Flash was the obvious technology to choose here.
I may have said in other posts that RealMotion can communicate touch points to the application via the TUIO protocol. I used the TUIO AS3 library, though along the way I ended up finding and fixing some of its memory leaks and other issues. Don't get me wrong: it's still an awesome library.
The development was pretty much straightforward; however, I don't think I had done something this demanding on a machine since the adidas Originals Women's Lookbook. This project runs in full HD (1920 × 1080) at 60 FPS and was required to support multitouch.
The transition animation ended up being a lot more work than expected. Using a 60 FPS video proved impossible: it took too much time to start and skipped too many frames. I tried scrubbing the video manually at every frame, which was even worse. I also tried dynamically loading all 360 frames of the transition and built a class to display the animation. This ran smoothly and was quite satisfying visually. The problem was that once all the images were loaded, the application would freeze while adding them to the stage. And this only worked while developing in FDT; as soon as I tried it in a browser or within the context of RealMotion, it crashed. So I ended up using two 30 FPS videos, one playing forward and one backward, since the user has to be free to come back to the map state.
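The dual-video workaround boils down to a small state machine: entering the detail state plays the forward clip, and returning to the map plays the reversed one. Here is a minimal sketch of that logic in TypeScript; the class and callback names are illustrative, not the project's actual code.

```typescript
// Two pre-rendered 30 FPS clips, one forward and one reversed, stand in
// for a single scrubbed 60 FPS video. The callbacks represent "start
// playing the corresponding clip" in the real application.
type ViewState = "map" | "detail";

class TransitionController {
  private state: ViewState = "map";

  constructor(
    private playForward: () => void,   // map -> detail clip
    private playBackward: () => void   // detail -> map clip (reversed render)
  ) {}

  get current(): ViewState { return this.state; }

  // Toggle between the two states by playing the matching clip.
  toggle(): ViewState {
    if (this.state === "map") {
      this.playForward();
      this.state = "detail";
    } else {
      this.playBackward();
      this.state = "map";
    }
    return this.state;
  }
}
```

Because the return trip is its own pre-rendered file rather than a video played in reverse, seeking backward through keyframes is never needed.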
It was not my first go at multitouch, so I could apply the discoveries I had made on earlier projects. The main one was keeping track of the touch ids: once an item is touched, no matter how many other touches appear on or disappear from the surface, the item is still considered touched as long as the first id is still touching it.
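That ownership rule can be sketched in a few lines. This is an illustrative TypeScript model, not the installation's AS3 code: the first touch id to land on an item claims it, later ids are ignored, and only the owning id can release it.

```typescript
// "First touch id owns the item": the item stays pressed until the id
// that first claimed it is released, regardless of other touches.
class TouchableItem {
  private ownerId: number | null = null;

  get isTouched(): boolean {
    return this.ownerId !== null;
  }

  touchBegin(id: number): void {
    if (this.ownerId === null) {
      this.ownerId = id; // first id claims the item
    }
    // ids arriving while the item is owned are ignored
  }

  touchEnd(id: number): void {
    if (id === this.ownerId) {
      this.ownerId = null; // only the owning id releases the item
    }
  }
}
```

With TUIO, the same idea maps onto cursor add/remove events: store the session id of the first cursor over the item and compare against it on removal.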
Another touch behaviour carried over from the slideshow was the "touch and hold to trigger". You can read a bit more about that logic in a previous post. In this case it was used in an area where images and videos could be dragged, a bit like a vertical carousel. It then became possible to distinguish between dragging to change images and holding to start a video.
Because laser touch detection cannot be as precise as a touch screen, we wanted to avoid small UI elements as much as possible. We decided that text fields would be draggable, like on mobile devices, which also allowed us to remove the scroll bar and the arrows. More space for content.
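At its core, a draggable text field is just a scroll offset that follows the finger and is clamped so the text cannot be pulled past its ends. A minimal sketch of that behaviour, with illustrative names and dimensions:

```typescript
// Drag-to-scroll for a text block, mobile style: the offset tracks the
// finger's vertical movement and is clamped to the scrollable range.
class DraggableText {
  private offset = 0; // current scroll offset in px (0 = top)

  constructor(
    private readonly contentHeight: number,
    private readonly viewportHeight: number
  ) {}

  get scrollOffset(): number {
    return this.offset;
  }

  // deltaY is the finger's movement since the last frame; dragging the
  // finger up (negative deltaY) scrolls the text down.
  drag(deltaY: number): number {
    const maxOffset = Math.max(0, this.contentHeight - this.viewportHeight);
    this.offset = Math.min(maxOffset, Math.max(0, this.offset - deltaY));
    return this.offset;
  }
}
```

A production version would typically add momentum and edge-bounce, but the clamped offset is the part that replaces the scroll bar and arrows.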
The weather data presented in the attract view is pulled from the Yahoo! Weather RSS feed, whereas the buoy data is obtained from NERACOOS. Actually, if you ever need to use buoy data, contact these guys; they were thrilled to help us use their service. Even before completion of the project, they showcased the installation in their newsletter.
All other historical and animal data is entered into the CMS by the users (students or teachers).
The RealMotion graph contained the Flash content and presented it on screen. Since this software is really strong with visual effects, there was no need to try to implement them in the Flash application. We opted for a water-ripple effect, which made sense in the context: as the user touches or swipes the surface, water-like waves spread across the screen. In the map view, we also added a mask so that the waves would ricochet off the coastline.
We had to come up with a way to change the states of the waves according to the states of the Flash application. The best way to do so was to send values via the
I built the information architecture for the CMS and handed it to WLAB, the service provider that developed most of the CMS. They used CakePHP, a framework I didn't know at the time. To be honest, I found it too bloated and convoluted for the needs of this project; a fully custom CMS would have been preferable, quicker, and more flexible. The provider became overloaded with other projects, so I asked Simon Arame to take over and help us finish the work.
- Original Concept: Meka, Alexandre Simionescu
- Art Direction, UX Design, Information Architecture: Mat Janson Blanchet
- Interface Design: Raed Mousa
- Transition Animation: Francis Dakin-Côté
- Interface Development: Mat Janson Blanchet
- RealMotion™ Graph Development: Bruno Gohier
- Back-End Development: WLAB, Simon Arame
- Project Management: Anne Élisabeth Thibault, Géraldine Restani, Mat Janson Blanchet
All copyrights and products belong to their respective owners.
UX Design, Information Architecture, Art Direction, Technical Direction, Development, and Project Management Support
While hired by Float4 for freelance creative development services