Tech-industry veteran Jim Marggraff discusses his thoughts on the collaboration needs of virtual teams, and shares his excitement for the launch of Oblong’s newest creation: Rumpus.
PTS (Problems to Solve):
1) Recover the lost productivity resulting from social isolation of remote workers.
2) Improve the means of sharing content and sharing intent in commonplace tools used for remote collaboration, such as Slack, BlueJeans/Webex/Hangouts/Zoom/Skype, Google Docs and others.
3) Provide tools for remote collaboration that offer mind-to-mind collaboration and communication that EXCEED what’s POSSIBLE in face-to-face meetings.
Imagine a collaborative video environment where ideas, intent, and information can be shared instantaneously by, and between, any individuals at any time. The designers knew that such a new form of virtual team collaboration would be most readily accepted if it began by recreating the familiar, natural dynamics of in-person communication, and then extended them.
Such an experience can then transcend the “one speaker at a time” limit of a group of co-located people sharing the medium of “air” as the ether of local communication. The result, with the simple addition of an initial set of capabilities comprising easy object sharing, simultaneous cursor movement, temporal markup, and contextual emoticons, is an elegant breakthrough riding atop traditional video conferencing.
Rumpus, a new beta offering from Oblong, enhances human collaboration. John Underkoffler, Founder and CEO of Oblong, created the iconic interface Tom Cruise used in Spielberg's Minority Report to manipulate information from a trio of "precogs." Now he and his team are bringing those tenets of intent-transfer into video conferencing.
Rumpus currently rides on BlueJeans for a fully integrated video-and-content experience. It also runs alongside any other video conferencing app, serving as the content-sharing layer for your call. You're only one sent link away from more engaging virtual meetings where you and your colleagues can share docs, point, draw, and share emotions instantly. Try it here! It's powerful!
Rumpus is Mac-only at this stage, with support for PCs and other video conferencing and collaboration platforms coming soon.
Really worth a test-run, for a glimpse of the future, with much more to come!
Disclosure: I’m on the Board of Oblong, and am smitten with this team’s commitment and ability to disrupt and advance collaboration and communication.
Applying AI to Enhance Human Performance, Connectedness, and Communication
Working with Watson
The goal of each Watson Experience Center—located in New York, San Francisco, and Cambridge—is to demystify AI and challenge visitors' expectations through more tangible demonstrations of Watson technology. Visitors are guided through a series of narratives and data interfaces, each grounded in IBM's current capabilities in machine learning and AI. These sit alongside a host of Mezzanine rooms where participants further collaborate to build solutions together.
The process for creating each experience begins with dynamic, collaborative research. Subject matter experts take members of the design and engineering teams through real-world scenarios—disaster response, financial crimes investigation, oil and gas management, product research, world news analysis—where we identify and test applicable data sets. From there, we move our ideas quickly to scale.
Access to the immersive pixel canvas for everyone involved is key to the process. Designers must be able to see their ideas outside the confines of 15″ laptops and prescriptive software. Using tools tuned for rapid iteration at scale, our capable team of designers, data artists, and engineers works side-by-side to envision and define each experience. The result is more than a polished marketing narrative; it's an active interface that allows the exploration of data with accurate demonstrations of Watson's capabilities—one that customers can see themselves in.
Under the Hood
Underlying the digital canvas is a robust spatial operating environment, g‑speak, which allows our team to position real data in a true spatial context. Every data point within the system, and even the UI itself, is defined in real world coordinates (measured in millimeters, not pixels). Gestures, directional pointing, and proximity to screens help us create interfaces that more closely understand user intent and more effectively humanize the UI.
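The idea of defining data and UI in real-world coordinates rather than pixels can be sketched roughly as follows. This is a minimal illustration of the concept, not the actual g‑speak API: the `Screen` type, its fields, and the `mm_to_px` projection are hypothetical names invented for this example.

```python
from dataclasses import dataclass

@dataclass
class Vec3:
    """A point in room space, measured in millimeters (not pixels)."""
    x_mm: float
    y_mm: float
    z_mm: float

@dataclass
class Screen:
    """A physical display described by its real-world placement and size."""
    center: Vec3        # where the screen's center sits in the room
    width_mm: float     # physical width of the display surface
    height_mm: float    # physical height of the display surface
    px_wide: int        # native horizontal resolution
    px_high: int        # native vertical resolution

    def mm_to_px(self, p: Vec3) -> tuple[int, int]:
        """Project a room-coordinate point onto this screen's pixel grid.

        Because every data point carries a physical position, the same
        point lands in the right place on any screen that can see it.
        """
        u = (p.x_mm - self.center.x_mm) / self.width_mm + 0.5
        v = (p.y_mm - self.center.y_mm) / self.height_mm + 0.5
        return round(u * self.px_wide), round(v * self.px_high)

# A 2.4 m-wide wall display, 3840x1080 px, centered at the room origin
wall = Screen(Vec3(0, 0, 0), 2400, 675, 3840, 1080)
print(wall.mm_to_px(Vec3(0, 0, 0)))      # room origin -> screen center
print(wall.mm_to_px(Vec3(600, 0, 0)))    # 60 cm to the right of center
```

Keeping positions in millimeters means gestures, pointing, and proximity can be reasoned about in the same physical frame as the pixels themselves, which is what lets the interface respond to where a user actually is and aims.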
This award-nominated collaboration with IBM is prototyped and developed at scale at Oblong’s headquarters in Los Angeles as well as IBM’s Immersive AI Lab in Austin. While these spaces are typically invite-only, IBM is increasingly open to sharing the content and the unique design ideas that drive its success with the public. This November, during Austin Design Week, IBM will host a tour of their Watson Immersive AI Lab, including live demonstrations of the work and a Q&A session with leaders from the creative team.
Can't make it to Austin? Contact our Solutions team for a glimpse of our vision of the future at our headquarters in the Arts District in Los Angeles.