Sony is back in Austin this week for SXSW, the annual tech and entertainment meet-up, with an entire warehouse of weird gadgets, games, and prototypes that all rely, in one way or another, on Sony technology. The exhibit, called the Wow Factory, is an opportunity for the Japanese tech giant’s engineers and artists to collaborate on experimental projects.
These projects are meant to emphasize how Sony’s display technology, particularly its advancements in image sensors and projectors, can be stretched and morphed into hardware and software that go far beyond a standard image on a flat screen. In this way, Sony is able to dabble in areas like augmented reality by using interactive holograms instead of requiring users to wear bulky glasses or helmets. It achieves this with projectors and sensors that track motion and measure depth and pressure, letting you interact with objects made entirely of light.
Sony achieves AR using image sensors and projectors, not glasses or helmets
One such example is a three-way augmented reality air hockey game Sony developed specifically for the Wow Factory this year. The game features a physical hockey puck and three physical paddles arranged around a custom circular table. But the table also makes use of two of Sony’s new IMX382 vision sensors, which can detect and track objects at 1,000 frames per second. One sensor sits above the table to track the puck, and another sits below to track players’ paddles. An overhead projector, meanwhile, overlays the game interface and virtual pucks onto the surface of the table.
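To get a rough sense of how those pieces fit together, here’s a sketch of what a single frame of that loop might look like: read the real puck from the top sensor, read the paddles from the bottom sensor, then hand everything to the projector to draw. Every name here (TopSensor, BottomSensor, Projector, and so on) is an assumption for illustration, since Sony hasn’t published the software behind the demo.

```python
# Illustrative sketch of the tracking-and-overlay loop described above.
# All class and function names are assumptions, not Sony's actual API.
from dataclasses import dataclass


@dataclass
class TrackedObject:
    x: float  # table coordinates, in millimeters
    y: float


class TopSensor:
    """Stands in for the sensor mounted above the table, which tracks the real puck."""
    def read_puck(self) -> TrackedObject:
        return TrackedObject(x=450.0, y=300.0)  # placeholder reading


class BottomSensor:
    """Stands in for the sensor mounted below the table, which tracks the paddles."""
    def read_paddles(self) -> list:
        return [TrackedObject(100.0, 100.0) for _ in range(3)]  # three players


class Projector:
    """Stands in for the overhead projector that draws the game interface."""
    def draw_frame(self, puck, paddles, virtual_pucks):
        pass  # rasterize the overlay onto the table surface


def game_loop(frames: int = 10):
    top, bottom, projector = TopSensor(), BottomSensor(), Projector()
    virtual_pucks = [TrackedObject(200.0, 200.0)]  # the pucks made of light
    for _ in range(frames):  # the sensors can sample at up to 1,000 fps
        puck = top.read_puck()
        paddles = bottom.read_paddles()
        projector.draw_frame(puck, paddles, virtual_pucks)


if __name__ == "__main__":
    game_loop()
```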
This sensor setup is similar to the one contained in Sony’s experimental projectors that it’s brought to SXSW in past exhibits. In those situations, Sony has turned tabletops into touchscreens and created interactive software that overlays onto physical props. For instance, Sony used a copy of Lewis Carroll’s Alice’s Adventures in Wonderland alongside a physical deck of playing cards and a teacup to bring to life the happenings described in the text. The company also built an architectural demo that could turn a standard block of wood into a top-down scale model of a home, with the light shining down onto the table to color and annotate the objects in real time.
In the case of the AR air hockey game, Sony’s software allows the real hockey puck to interact with the virtual ones because the image sensors track both your hand and the paddle as you interact with the objects on the table. So you can hit the virtual pucks with your paddle as if they were real, and the virtual ones even bounce off of the real puck and the sides of the table in realistic fashion.
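Conceptually, that bounce behavior only requires treating the tracked real puck as another circle in the same coordinate space as the virtual ones. The sketch below shows one assumed way the collision step could work, reflecting a virtual puck off the real puck and the circular table wall; it’s an illustration of the idea, not Sony’s actual physics code.

```python
# Minimal sketch of how virtual pucks could bounce off a tracked real
# puck and the table wall. The physics and names are assumptions for
# illustration; Sony hasn't said how its demo resolves collisions.
import math
from dataclasses import dataclass

TABLE_RADIUS = 500.0  # assumed circular table, in millimeters
PUCK_RADIUS = 30.0


@dataclass
class Puck:
    x: float
    y: float
    vx: float
    vy: float


def reflect(puck: Puck, nx: float, ny: float):
    """Reflect the puck's velocity about a unit surface normal (nx, ny)."""
    dot = puck.vx * nx + puck.vy * ny
    puck.vx -= 2 * dot * nx
    puck.vy -= 2 * dot * ny


def step(virtual: Puck, real: Puck, dt: float = 0.001):
    """Advance one virtual puck by dt seconds, bouncing it off the
    tracked real puck and the circular table wall."""
    virtual.x += virtual.vx * dt
    virtual.y += virtual.vy * dt

    # Bounce off the tracked real puck when the two circles overlap.
    dx, dy = virtual.x - real.x, virtual.y - real.y
    dist = math.hypot(dx, dy)
    if 0 < dist < 2 * PUCK_RADIUS:
        reflect(virtual, dx / dist, dy / dist)

    # Bounce off the circular table wall.
    r = math.hypot(virtual.x, virtual.y)
    if r > TABLE_RADIUS - PUCK_RADIUS:
        reflect(virtual, -virtual.x / r, -virtual.y / r)


# Example: a virtual puck heading straight at a stationary real puck.
real_puck = Puck(x=100.0, y=0.0, vx=0.0, vy=0.0)
virtual_puck = Puck(x=0.0, y=0.0, vx=500.0, vy=0.0)
for _ in range(1000):
    step(virtual_puck, real_puck)
```

In a setup like this, the real puck and paddles would never be simulated at all; the sensors simply report where they are each frame, and only the pucks made of light get velocity updates and reflections.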
The game itself is a chaotic one in which all three players are simultaneously defending their own goal and going on the offensive against their opponents. All the while, a half-dozen hockey pucks — all but one of which are made out of light — fly around the table and collide in a nonstop frenzy.
While it’s never going to be a commercial product, Sony has shown time and again that its display and sensor tech can achieve a novel form of AR. These demos show the extent to which sensor data and the right mix of hardware can create immersive experiences that don’t rely on blasting light in your eyes or plastering a screen on your face. AR is often thought of as something that will only truly arrive when it’s packed into a standard pair of eyeglasses. And yet right now, the common conception of the technology is the selfie filters and other animations you get on Snapchat and other apps, as well as the clumsy mixing of real and virtual objects through a smartphone lens, as with Niantic’s Pokémon Go.
But Sony’s showcase here at SXSW illustrates how AR can be achieved through alternative means, if you’re willing to expand how you think about the term and what it requires to function realistically.