A Mezzanine Visit with TechCrunch

2.22.2012
Team Oblong

In mid-2011, we had the privilege of hosting MG Siegler (while he was still at TechCrunch) at an under-the-radar Oblong demo location in the Bay Area. We invited MG to see an early demo of our Mezzanine product and to talk more broadly about the future of computing, a conversation we at Oblong are always happy to have.

As we near Mobile World Congress in Barcelona, where we will be showcasing Mezzanine, we thought we would resurface the article for those interested in a third-party look at the product. Although the TechCrunch story highlights a somewhat early version of Mezzanine's feature set, it's a nice sneak peek of what Oblong will be demonstrating at MWC.

And if you’re interested in the quick summary, here’s a brief excerpt that gets to the core of Mezzanine's in-room collaboration use case:

The idea for Mezzanine is to get people in a room together in order to synthesize information in the most collaborative way imaginable. “We want to get everyone’s pixels in a shared workspace, where they collide,” [Kwin] Kramer says. “The key is to give everyone control over what’s happening,” he continues. And that means interacting with the data on the three main screens from your laptop, iPhone, iPad, directly on the screen with the wands, or even remotely via a device with a web browser.

Using these controls, anyone can rearrange data, push new data into the flow, highlight specific things, and queue stuff up to talk about later. Perhaps the best way to think about it is as a symphony of information that everyone in the room can conduct. Again, it’s a bit hard to describe, but when you see it, it just makes sense.
