Humor by Design
Across the verbose landscape of tech and design buzzwords, humor is an essential human expression that isn’t used nearly often enough. As a tool for breaking down communication barriers, humor often comes naturally in day-to-day conversation, but it can be easily overlooked when creating client-facing work. Working within an environment that strives to humanize digital interactions, our goal is to make sure that these moments are not only possible, but encouraged.
After my initial onboarding, the first task I was given was to help put together a presentation demo using an Arduino, an RFID scanner, and g-speak™. As a designer, my goal was to turn the technology into a human-relatable story. The result was Tape Deck, a video player interface held together with hairbands and lollipop sticks that takes the form of an antiquated media format. If that sounds funny, then it was a success—my first client-facing project turned out to be a joke.
Who is it for?
We have long-term relationships with many of our clients, who are familiar with the range of immersive interactive experiences we have designed and built. They use our wands in high-end executive briefing centers to move content across multiple displays and other large-scale environments. In the case of Tape Deck, we wanted the client to understand the power of tangible interfaces—to think beyond the large display wall.
How does it work?
Each of the cassettes we used had its innards swapped out for a small RFID tag, or “fob,” with a unique ID code. When one of these cassettes is placed in the slot of the Tape Deck, an Arduino scans the tag’s code and broadcasts it to the room. g-speak treats multiple connected machines as integral parts of a single room-scale interface, so when a machine receives the message, the corresponding video plays on its display(s).
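In essence, the flow from cassette to screen is a lookup plus a broadcast. The Python sketch below approximates that flow; the tag IDs, video file names, and the `broadcast_to_room` helper are hypothetical stand-ins, and the real Tape Deck rides on g-speak’s messaging rather than a print statement.

```python
# Illustrative sketch only: tag IDs, file names, and the broadcast helper
# are invented for this example, not Tape Deck's actual code or the
# g-speak messaging API.

TAG_TO_VIDEO = {
    "04:A2:5F:1C": "demo_reel.mp4",
    "04:B7:33:9D": "documentation.mp4",
    "04:FF:00:21": "sponge_static.mp4",
}

def broadcast_to_room(message):
    # In the real system, g-speak relays the message to every connected
    # machine in the room; here we simply print it.
    print("room <-", message)

def on_tag_scanned(tag_id):
    """Handle a scan event: look up the tag's video and broadcast a play request."""
    video = TAG_TO_VIDEO.get(tag_id)
    if video is not None:
        broadcast_to_room({"event": "play", "video": video})
    return video
```

Swapping cassettes simply triggers `on_tag_scanned` with a different ID; an unknown tag (like the dried-up sponge’s, before it was mapped) falls through harmlessly.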
What does it do?
There are actually two answers here. At face value, Tape Deck provides a physical interface for triggering video playback, but what we are really showing is that g-speak is flexible and extends to take input from other technology platforms such as Arduino, Raspberry Pi, and many more. While this capability might seem apparent in the abstract, we wanted to make it real. This is what was really being showcased, and also where the punchline lies.
Why a tape deck?
Analogies and metaphors are strong tools for communicating new ideas, and a strategy Oblong uses regularly. Apart from the fact that analog has been making a comeback, the metaphor of the tape deck as media device is instantly recognizable.
Although we were keen to showcase the capabilities exhibited by Tape Deck, there were some subtle antics involved. In the middle of the client meeting and without explanation, my colleague Pete loaded one of the cassettes into the slot of the device, prompting a reel of documentation to play on the Mezzanine displays. The clients were baffled at first, chuckling, but did not object as Pete talked through the demo.
Pete slid out the tape and dropped in another, prompting the next video to play. Again the action was met with amusement.
Following the presentations, the designer from Oblong, a company that pushes for the next generation of digital collaboration, used an old sponge to “clean” the video stream. There was a roar of laughter from the conference room. You don’t have to be familiar with g-speak, Mezzanine, or Arduino to understand this part of the joke. Was it gimmicky? Yes. Effective? Absolutely.
Aristotle observed that humor is “… something unexpected, the truth of which is recognized.” The truth in this case is expressed by an RFID tag encased in a dried-up sponge. Sitting on its own, Tape Deck is really not that funny to look at. It requires a human touch and some comedic timing to set up and deliver the punchline.
We use design across all fields to clearly communicate capabilities of our technology, and humor is a model of efficiency. Bringing a demo like Tape Deck into the meeting not only allows us to create an understanding using a visual reference; it sets the stage for a slapstick performance, provoking laughter and making human connections.
1. Salvatore Attardo (1994). Linguistic Theories of Humor. Walter de Gruyter. pp. 20–. ISBN 978-3-11-014255-6.
Working with Watson
The goal of each Watson Experience Center—located in New York, San Francisco, and Cambridge—is to demystify AI and challenge visitors’ expectations through more tangible demonstrations of Watson technology. Visitors are guided through a series of narratives and data interfaces, each grounded in IBM’s current capabilities in machine learning and AI. These sit alongside a host of Mezzanine rooms where participants further collaborate to build solutions together.
The process for creating each experience begins with dynamic, collaborative research. Subject matter experts take members of the design and engineering teams through real-world scenarios—disaster response, financial crimes investigation, oil and gas management, product research, world news analysis—where we identify and test applicable data sets. From there, we move our ideas quickly to scale.
Access to the immersive pixel canvas for everyone involved is key to the process. Designers must be able to see their ideas outside the confines of 15″ laptops and prescriptive software. Using tools tuned for rapid iteration at scale, our capable team of designers, data artists, and engineers works side by side to envision and define each experience. The result is more than a polished marketing narrative; it’s an active interface that allows the exploration of data with accurate demonstrations of Watson’s capabilities—one that customers can see themselves in.
Under the Hood
Underlying the digital canvas is a robust spatial operating environment, g‑speak, which allows our team to position real data in a true spatial context. Every data point within the system, and even the UI itself, is defined in real world coordinates (measured in millimeters, not pixels). Gestures, directional pointing, and proximity to screens help us create interfaces that more closely understand user intent and more effectively humanize the UI.
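To make the “millimeters, not pixels” idea concrete, here is a toy Python sketch of the kind of mapping such a system has to perform. The `Display` class, its fields, and the numbers are illustrative assumptions for this sketch, not g-speak’s actual data structures or API.

```python
from dataclasses import dataclass

# Illustrative only: a toy millimeters-to-pixels mapping. The Display class
# and its values are invented for this sketch, not g-speak's real API.

@dataclass
class Display:
    origin_mm: tuple   # (x, y) of the display's top-left corner in room coordinates
    pitch_mm: float    # physical width of one pixel, in millimeters

    def to_pixels(self, point_mm):
        """Map a room-coordinate point (in mm) onto this display's pixel grid."""
        px = (point_mm[0] - self.origin_mm[0]) / self.pitch_mm
        py = (point_mm[1] - self.origin_mm[1]) / self.pitch_mm
        return round(px), round(py)

# A wall display whose top-left corner sits 2 m right and 1 m down from the
# room origin, with 0.5 mm pixels.
wall = Display(origin_mm=(2000.0, 1000.0), pitch_mm=0.5)
print(wall.to_pixels((2500.0, 1500.0)))  # a point half a meter into the display
```

Because every data point lives in room coordinates first, the same point can be resolved against any display in the room—which is what lets gestures and pointing carry across a multi-screen canvas.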
This award-nominated collaboration with IBM is prototyped and developed at scale at Oblong’s headquarters in Los Angeles as well as IBM’s Immersive AI Lab in Austin. While these spaces are typically invite-only, IBM is increasingly open to sharing the content and the unique design ideas that drive its success with the public. This November, during Austin Design Week, IBM will host a tour of their Watson Immersive AI Lab, including live demonstrations of the work and a Q&A session with leaders from the creative team.
Can't make it to Austin? Contact our Solutions team for a glimpse of our vision of the future at our headquarters in the Arts District in Los Angeles.