Parallel Reality Display
AI and 5G will enable two people to see different content on the same screen at the same time.
The partnership between Delta Air Lines and Misapplied Sciences was a major talking point at CES. With AI, 5G and location capabilities, Delta will enable people standing next to each other to see different, personalized content on the same screen.
While many events have used wearable beacons to trigger customized content for attendees, Misapplied Sciences' system goes further, and no beacon is required. Delta, for example, will assign an anonymous ID to a passenger via ceiling-mounted cameras once a boarding pass is scanned. That ID will trigger personal travel information to appear when the passenger steps into specific locations.
Delta will roll out the new technology at Detroit Metropolitan Airport later this year. Just after passing through security, passengers will see personalized flight and gate information appear.
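For the technically curious, the content-selection side of such a system boils down to a lookup keyed on an anonymous tracking ID and a floor zone. Below is a minimal, hypothetical sketch of that logic in Python; the names, zones and data structures are illustrative assumptions, not Delta's or Misapplied Sciences' actual implementation, and the real magic of steering different pixels to different viewers happens in the display hardware, which this sketch doesn't model.

```python
from dataclasses import dataclass

@dataclass
class Traveler:
    flight: str
    gate: str
    boarding: str

# Populated when a boarding pass is scanned and the vision system
# hands back an anonymous tracking ID for that person (hypothetical).
travelers_by_track_id = {
    "track-7f3a": Traveler(flight="DL 1247", gate="A38", boarding="10:45"),
}

def content_for(track_id: str, zone: str) -> str:
    """Decide what a viewer standing in a given zone should see on the shared display."""
    traveler = travelers_by_track_id.get(track_id)
    if traveler is None:
        return "Welcome to DTW"            # generic content for unknown viewers
    if zone == "post-security":
        return f"Flight {traveler.flight} departs from gate {traveler.gate}"
    if zone == "gate-area":
        return f"Boarding begins at {traveler.boarding}"
    return "Welcome to DTW"

print(content_for("track-7f3a", "post-security"))
```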
If this reminds you of the last few seconds of the "It's A Small World" ride, where screens display goodbyes with guests' names as they pass, that's because one of Misapplied Sciences' founders is a former Imagineer with the Walt Disney Company.
Future Applications at Events
In addition to the obvious wayfinding applications, "parallel reality" can also address many friction points in the customer experience once 5G becomes widespread and the technology becomes more affordable.
The meetings world, with its globally diverse audiences, can benefit, too. Language is part of the information that can be customized to the individual, which means screens can provide subtitles in each viewer's language.
For global brands whose executives speak several different languages, this technology can be a game changer for corporate meetings and events strategy.
Eventually, personalized content may not be the only benefit; the automation of specific experiences will be here before we know it. Tailored to an individual's location, machines will respond and provide that personal touch, like a coffee made just the way you like it, George Jetson style.
Bending and Rotating Screens
New capabilities reshape how we view and interact with screens.
When an emerging technology is first commercialized, it immediately heads in two directions: into the hands and homes of early adopters, and into the events and experiences created by marketers.
Driven by the recent wave of announcements, commercially available bendable and curved display screens are sure to set the tone, and the stages, for event marketers.
Curving with Sets
Last year, curved screens surged in popularity at corporate events and conferences. In April, natural materials and flowing stage designs put the focus on elegantly curved screens at Google Cloud Next '19, while at Coachella, both the Sahara Tent and the Empire Polo Club paired big-time entertainment with massive arrays of curving LED screens.
Rolling Environments
LG, a notable leader in finding new ways to bend and stick video screens, continues to up the ante for all screen manufacturers. At CES 2019, the company showed (and showed off) its leadership in this space: its "Massive Curve of Nature" installation awed attendees by merging organic design with nature-inspired visuals. LG is also pushing from concept to commercialization and is now taking orders for its roll-up OLED televisions.
As these new form factors move into our pockets, homes, and festivals, marketers will quickly follow with brand experiences that showcase an array of flexible screens.
Anything else will fall flat.
Podcast Engagement
Podcasts help create extendable event content experiences on-site and off.
Podcasts and live broadcasts may not be anything new, but their connections to events are becoming more strategic. Beyond dedicated podcast stages, marketers are embedding podcasts into larger events, engaging influencers as hosts and guests, and capitalizing on streaming and downloads to capture audiences beyond the event footprint.
Consider the amplification possibilities, as reported by Digiday: many podcast producers say that live shows, particularly outside the hosts' hometowns, reliably drive big spikes in local downloads.
C2 Montreal embedded podcasts into its commerce and creativity conference in May with The Aquarium, a glass-enclosed on-site broadcast center in the middle of the show floor. The design made the podcast stage both an engagement tool and a spectacle in its own right: passersby could stop to watch the action while tuning in and snapping photos. It also supported the acoustics needed for a successful podcast recording.
For a second year, South by Southwest activated the SXSW Podcast Stage, sponsored by TuneIn. Throughout the event, 42 podcasts in genres ranging from comedy to food to science were featured on stage. The stage also served as an event venue for sponsored happy hours with TuneIn.
A top benefit for brands, events and exhibitors is the opportunity to create compelling content and capitalize on the energy of the live experience. Exhibitors, for example, can ask a keynoter or influential attendee to sit down for five minutes in the podcast zone in their booth for an interview. Attendees will come to watch the VIPs in action, and the content generated will be timely, live and worth amplifying, earning the brand thought-leadership status.
Hearables
How listening technology and the 'ear space' are transforming events.
Move over wearables and other tactile innovations that dominate events. Hearables are the next frontier of event engagement. Experts predict they will proliferate like smartphones, enhancing the attendee journey and how attendees experience session content with real-time language translation, crisp audio and other benefits. Hearables will force marketers to rethink event footprints, as walls and sound barriers become irrelevant.
The ear space, in the wake of voice-assistant technology like Amazon Alexa, will set in motion the biggest interface revolution since the iPhone popularized the touchscreen, says Peter Burrows in a recent Fast Company piece. In "The Future is Ear: Why Hearables Are Finally Tech's Next Big Thing," he quotes Gints Klimanis, former head of audio engineering at the now-defunct smart earbud startup Doppler: "Ultimately, the idea is to steal time from the smartphone. The smartphone will probably never go away completely, but the combination of voice commands and hearing could become the primary interface for anything spontaneous."
The ear-space revolution dialed up with the return of headphone culture (think Beats by Dre) and with earbud innovations, as smart devices and Bluetooth began supporting pocket music and conversation. Silent discos rose in popularity at events as both an individual and a group experience, where music plays in each attendee's ear, sometimes their very own selection. Silent discos have since evolved into activities like Silent Yoga by Sound Off at C2 Montreal.
In the meeting and conference space, headsets have transformed sessions into silent discos. Event Marketer's EventTech show, which is set in an open campus format, uses personal earbuds and receivers to enhance sound quality. At the Adobe Summit Conference in London, partner Silent Conference distributed 2,500 headphones that allowed attendees to tune into five different talks within one conference hall without relocating, meaning there were no breakout rooms. And at PCMA's The Future of Face2Face, an LED installation called The Mix displayed video playbacks of sessions, so attendees could sit down and channel-surf through content on demand via their headsets and the app.
A big player in the space is Google's Pixel Buds, which work with Google Assistant to translate 40 different languages in real time. As event audiences expand globally, planners and attendees won't have to worry about language barriers when conducting business, visiting an expo or networking.
A result that’s crystal clear.
180-Degree Video
Thanks to a new push by YouTube, 360-video is doing a 180.
Recognizing that VR content is cool but often difficult to watch, YouTube this summer announced it is embracing 180-degree video heavily, with the hope of making it more accessible and easier for consumers to create, post and view. Rather than requiring viewers to spin around (either their head or their mouse), this content focuses on one direction with a wide peripheral view on either side.
As part of the push, YouTube and the Google Daydream VR division are partnering with Lenovo, LG and Yi on VR180 cameras available for mass market use and priced like point-and-shoot cameras to democratize VR video creation.
"Our goal with VR180 is to simplify VR video production for all creators, consumers, and high-end video producers as well," YouTube's VR product lead Erin Teague told Mashable.
How will this more mainstream immersive content impact events and marketing campaigns?
Better production techniques. With 180-degree video, there's no need to worry about removing unwanted items from the frame or about being unable to make production cuts. And directors can actually sit behind the camera to manage the shoot. Without the head-spinning requirement of 360-degree content, there will be fewer motion sickness issues, too.
It's not as cumbersome. If you're being honest, you've rarely spun your device around on the couch to watch a friend's 360-degree content on Facebook. That's exactly one of the challenges YouTube sought to solve with its embrace of 180. Could it mean your event or branded content is viewed more, and for longer periods of time? Perhaps.
More people will embrace it. As Teague described to Mashable, current 360-degree stereoscopic cameras require specialized skills, are expensive and involve complex production. With point-and-shoot 180-degree options, more events, brands and consumers will be able to create compelling event and user-generated content. That means more, and better, stories for brands to leverage for maximum amplification.
‘Gesture’ Analytics
Voice- and gaze-based technologies are giving marketers an eyeful and an earful of fresh analytics.
Marketers often rely on event attendees to provide feedback on an experience, whether it’s through a survey or by measuring activities like app downloads. These controlled methods of collecting data, however, don’t always produce unbiased, honest results. Eye-gaze technology and voice analytics are poised to provide an unbridled look into what attendees are focusing on and, perhaps, what they’re saying about your brand or product later on.
In July, Smashbox Cosmetics and ModiFace released insights on the impact of eye tracking on mobile commerce. The two collaborated on the iOS app MAKEUP, which features Smashbox products and ModiFace's face-tracking and video makeup-rendering technology, along with the ability to estimate where on the screen the user is looking based on the front-facing video. That allowed the brand to layer a heat map over the areas of the screen that received the most attention and to measure how long a user read a product's specs. Smashbox could then determine the most popular cosmetics category, the most popular shades and more.
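As a rough illustration of the analytics layer, here is a minimal sketch of how raw gaze samples might be binned into a screen heat map, assuming the eye-tracking layer already outputs per-frame (x, y) coordinates; the function names and grid size are illustrative, not ModiFace's API.

```python
import numpy as np

def gaze_heatmap(gaze_points, screen_w=1080, screen_h=1920, grid=(27, 48)):
    """Bin raw (x, y) gaze samples into a coarse grid of attention counts.

    gaze_points: iterable of (x, y) screen coordinates, one per video frame,
    so each count roughly corresponds to one frame of attention.
    """
    xs, ys = zip(*gaze_points)
    heat, _, _ = np.histogram2d(
        xs, ys,
        bins=grid,
        range=[[0, screen_w], [0, screen_h]],
    )
    return heat / heat.max()  # normalize to 0..1 for overlaying as a heat map

# Example: which grid cell held attention longest over 300 frames
samples = [(540 + 20 * np.sin(i / 10), 600 + i) for i in range(300)]
heat = gaze_heatmap(samples)
print(np.unravel_index(heat.argmax(), heat.shape))  # hottest cell index
```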
On the voice recognition front, researchers at the Mitsubishi Electric Research Laboratory in Cambridge, MA, are training artificial intelligence to pick out individual voices from a crowd. The technique identifies features in a voice that can be used to track a single speaker through a conversation. While the technology could improve speech recognition devices like Amazon Echo (so it can hear you better over a dinner party), it could eventually let marketers listen in on feedback and conversations, perhaps from individual influencers, or pick up key words that could be aggregated and analyzed post-event.
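To make the idea concrete, here is a toy sketch of the general approach of extracting features from a voice and matching them across clips. It uses a crude spectral fingerprint and a correlation threshold purely for illustration; it is not MERL's research method and would not survive contact with a real dinner party.

```python
import numpy as np

def voice_fingerprint(audio, n_bands=32):
    """Crude speaker feature: log-energy across coarse frequency bands."""
    audio = np.asarray(audio, dtype=float)
    spectrum = np.abs(np.fft.rfft(audio)) ** 2
    bands = np.array_split(spectrum, n_bands)
    return np.log1p(np.array([b.sum() for b in bands]))

def same_speaker(clip_a, clip_b, threshold=0.5):
    """Compare two clips' fingerprints with a centered cosine (correlation)."""
    a, b = voice_fingerprint(clip_a), voice_fingerprint(clip_b)
    a, b = a - a.mean(), b - b.mean()
    corr = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return corr > threshold

# Toy example: two clips of a low-pitched voice vs. one high-pitched voice
fs = 16000
t = np.arange(fs) / fs
low_1 = np.sin(2 * np.pi * 120 * t) + 0.1 * np.random.randn(fs)
low_2 = np.sin(2 * np.pi * 125 * t) + 0.1 * np.random.randn(fs)
high = np.sin(2 * np.pi * 600 * t) + 0.1 * np.random.randn(fs)
print(same_speaker(low_1, low_2), same_speaker(low_1, high))  # likely True, False
```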
Beyond analytics, eye-gaze and voice technologies are improving global accessibility and event inclusivity. Over the past few years, a host of eye-tracking startups have been acquired by Silicon Valley heavy hitters Google, Facebook and Apple, and this past August, Microsoft joined them with a Windows 10 test program for an eye-controlled interface. The program works with the Tobii Eye Tracker 4C device and is primarily designed to help people with neuromuscular diseases like ALS and other disabilities control Windows 10 interface elements without a traditional mouse and keyboard, The Verge reports. London's National Theatre, meanwhile, is introducing AR-powered smart glasses for hearing-impaired audiences that display closed captioning live, right before users' eyes.
Sounds like success.
Ultrasonic Beacons at Events
Ultrasonic beacons have entered the concert ticketing space, and events are next in line.
By now you're familiar with proximity beacons, but another kind of beacon that has been around for a few years is beginning to infiltrate the event space. Audio beacons, as they're known, use ultrasonic sound to communicate with our smart devices, reacting to and sending data and content based on personal preferences, without any of us lifting a finger. The technology has been used by advertisers and in retail settings, but Ticketmaster's embrace of it, announced through a partnership with audio beacon company LISNR, signals the beginning of the trend's potential influence on events.
Ticketmaster is testing audio beacons at concerts as part of an e-ticketing system called Presence, implemented in partnership with LISNR. The technology can passively check attendees into events using audio data from their smartphones, reducing entry wait times, The Verge reports. There's no need for manual QR code scanning or paper tickets: the smart-tone technology receives an attendee's data over the smartphone's ultrasonic sound transmission to verify their mobile ticket and identity.
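Conceptually, the data ride on tones just above the range most adults can hear. The sketch below shows one simple way an ID could be encoded and decoded with frequency-shift keying in the near-ultrasonic band; it's an illustrative toy, not LISNR's proprietary smart-tone protocol.

```python
import numpy as np

SAMPLE_RATE = 44100          # standard phone-speaker sample rate
F0, F1 = 18500.0, 19500.0    # near-ultrasonic tones for bit 0 / bit 1
BIT_SECONDS = 0.05           # 50 ms per bit keeps the burst short

def encode_id(ticket_id: int, n_bits: int = 16) -> np.ndarray:
    """Turn a small integer ticket ID into a near-inaudible tone burst (FSK)."""
    bits = [(ticket_id >> i) & 1 for i in range(n_bits)]
    t = np.arange(int(SAMPLE_RATE * BIT_SECONDS)) / SAMPLE_RATE
    chunks = [np.sin(2 * np.pi * (F1 if b else F0) * t) for b in bits]
    return np.concatenate(chunks).astype(np.float32)

def decode_id(signal: np.ndarray, n_bits: int = 16) -> int:
    """Recover the ID by checking which tone dominates each bit window."""
    samples_per_bit = int(SAMPLE_RATE * BIT_SECONDS)
    t = np.arange(samples_per_bit) / SAMPLE_RATE
    ref0 = np.sin(2 * np.pi * F0 * t)
    ref1 = np.sin(2 * np.pi * F1 * t)
    ticket_id = 0
    for i in range(n_bits):
        window = signal[i * samples_per_bit:(i + 1) * samples_per_bit]
        bit = abs(np.dot(window, ref1)) > abs(np.dot(window, ref0))
        ticket_id |= int(bit) << i
    return ticket_id

assert decode_id(encode_id(48213)) == 48213  # round-trip check
```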
In other spaces, audio beacons have helped passively personalize everyday experiences. Jaguar is leveraging LISNR's technology in its Land Rover incubator lab, where beacons detect who's driving the vehicle based on smart tones emitted from their mobile device and then adjust the driver's seat according to their preferences, all thanks to in-car Bluetooth and Wi-Fi. The system can even adjust the interior climate settings based on preferences detected from a person's device through app settings.
As mentioned earlier, the retail space has leveraged similar technologies for a few years now, like Shopkick, which senses consumers' devices, sends them a push notification to use the store's app once they've entered the footprint, and then leads them to deals and away from the clutter that doesn't match their preferences. Another provider, SilverPush, has been using inaudible ultrasonic sounds to help advertisers pinpoint users. If you've installed any app that uses the SilverPush software development kit, it will actually be listening for that sound in the background, and when it detects an audio beacon, it's able to identify that your desktop/laptop computer and your phone/tablet belong to the same person, TechCrunch explains.
The next time you're lined up at registration or managing appointments at a trade show, imagine how inaudible tones could redefine customer service, power preference-based experiences and automate connections like never before.
Empathy VR
How brands are making emotional connections through immersive VR.
As the novelty of virtual reality begins to wane, event marketers are thinking more strategically about how they leverage the immersive technology. They're incorporating multi-sensory elements like moving chairs, wind and scent. They're creating group virtual reality activations that give an experience an extra boost of energy and shorter queues. And they're taking advantage of the principles of storytelling to bring campaigns to life, creating virtual engagements designed to move audiences. Using VR to generate empathy is the latest wave to hit the space.
Excedrin leveraged VR to eliminate a pain point (beyond the physical pain) among consumers who suffer from debilitating headaches and their effects. Excedrin wanted to help those consumers help others understand what it feels like to have a migraine, an often-misunderstood affliction. It created The Migraine Experience, VR content that replicated symptoms like sensitivity to light and sound, disorientation and visual aura disturbances. For the program, migraine sufferers were invited to nominate a friend, colleague or family member to experience it, and an augmented reality technologist programmed the simulator to replicate the sufferer's specific symptoms. The footage of the VR trial was filmed for content. The results? Emotional.
At the American Academy of Dermatology annual meeting, Aqua Pharmaceuticals had dermatologists step into the shoes of their teenage patients through a VR experience that captured the stress of picture day for a teen with skin issues like acne. The content follows a female student's journey through the halls of a high school as she laments her skin problems in front of a locker mirror and tries to ignore the looks and whispers she feels are directed toward her, until she approaches a classroom door and realizes it's yearbook photo day.
It may have been decades since some of the dermatologist-participants had walked the halls of a high school, but the content, with its realistic scenarios and audio, transported them right back into those teenage insecurities, and then led them to consider Aqua Pharmaceuticals' products for their patients afterward. Content that supports the bottom line? Success.
Attendee Heat Maps
Heat maps will transform how we interact with and react to real-time data.
Whether we're aware of it or not, we, as consumers, have been using heat maps and location data on the regular: tracking our runs through fitness apps, locating people through the iPhone's Find My Friends feature and following our route home in our Ubers. In events, the rise of beacons and other sensing technologies has allowed marketers to watch traffic in event footprints and spot hot zones of engagement, or weak ones.
What are heat maps, and how are they generated? The most common method is through proximity beacons (RF, Wi-Fi, NFC or Bluetooth-based technologies embedded in badges or other wearables) that connect to receivers, which map devices to locations and, in turn, generate traffic patterns, dwell times and more. New tools on the market continue to evolve the baseline technologies, including GPS-enabled mobile devices, ultra-wideband RF and Bluetooth LE, which in turn is enabling more variety and broader acceptance of different devices.
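For a sense of what that pipeline produces, here is a minimal sketch that rolls receiver pings into per-badge dwell times and the per-zone totals behind a floor heat map, assuming each receiver logs (badge_id, zone, timestamp) rows; the field names and gap rule are illustrative, not any particular vendor's API.

```python
from collections import defaultdict

def dwell_times(pings, gap_seconds=120):
    """Roll beacon pings into per-zone dwell seconds per badge.

    pings: iterable of (badge_id, zone, unix_ts), assumed sorted by time.
    A gap longer than gap_seconds ends the current visit.
    """
    last_seen = {}                      # badge_id -> (zone, last_ts)
    dwell = defaultdict(float)          # (badge_id, zone) -> seconds
    for badge, zone, ts in pings:
        prev = last_seen.get(badge)
        if prev and prev[0] == zone and ts - prev[1] <= gap_seconds:
            dwell[(badge, zone)] += ts - prev[1]
        last_seen[badge] = (zone, ts)
    return dwell

def zone_heat(dwell):
    """Total dwell seconds per zone: the raw values behind a floor heat map."""
    heat = defaultdict(float)
    for (_, zone), seconds in dwell.items():
        heat[zone] += seconds
    return dict(heat)

pings = [("A1", "sponsor-booth", 0), ("A1", "sponsor-booth", 60),
         ("A1", "coffee-bar", 400), ("B2", "coffee-bar", 420),
         ("B2", "coffee-bar", 480)]
print(zone_heat(dwell_times(pings)))   # e.g. {'sponsor-booth': 60.0, 'coffee-bar': 60.0}
```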
Pixmob, known for its crowd-sourced wearable light-show technology, has introduced a new product called Klik. Available in badges or bands, it contains a wireless Bluetooth chip that connects the wearable to the Klik platform. Attendees sync their personal Klik profile with their devices to manage connections with other people and their overall event experience.
Its Colocator services, like Colocator Tracker, help event planners manage and locate assets and key individuals, like security and staff. The tags attach to a badge or lanyard, and the location and movement of each tag is triangulated by attendees' mobile devices and displayed on a digital map overlaying the venue, which event planners access from the Colocator site. Each tag can be assigned a name, too, for easier tracking.
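The triangulation step itself is well-worn math. Assuming each tag's distance to three or more known receiver (or phone) positions has already been estimated from signal strength, a generic least-squares solve like the sketch below recovers the tag's position; this is a textbook method, not Colocator's implementation.

```python
import numpy as np

def locate_tag(anchors, distances):
    """Estimate a tag's (x, y) from known anchor positions and ranged distances.

    anchors: (n, 2) array of receiver/phone positions in meters.
    distances: length-n array of estimated tag-to-anchor distances.
    Uses the standard linearization: subtract the last anchor's circle
    equation from the others and solve the resulting linear system.
    """
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    x_n, y_n = anchors[-1]
    A = 2 * (anchors[:-1] - anchors[-1])
    b = (d[-1] ** 2 - d[:-1] ** 2
         + np.sum(anchors[:-1] ** 2, axis=1) - (x_n ** 2 + y_n ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Example: three receivers at known floor positions, distances in meters
anchors = [(0.0, 0.0), (30.0, 0.0), (0.0, 20.0)]
true_tag = np.array([12.0, 7.0])
dists = [np.linalg.norm(true_tag - np.array(a)) for a in anchors]
print(locate_tag(anchors, dists))   # ~[12. 7.]
```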
Event app platform Eventbase has been live-wiring large-scale events like SXSW and SAP's SAPPHIRE NOW, and offers integrations between beacons and mobile apps to generate real-time heat maps revealing insights such as dwell time. Other strong players to watch in this space include Gimbal, TurnoutNow and Sendrato.
We can expect the next frontier of wearables and events to transform both the event organizer and the attendee experience. On the organizer side, heat maps visualizing foot traffic in real time can reveal which sponsor area has been sparsely visited; perhaps that means putting the coffee bar in that area the next day. For attendees, it may mean digital event signage that lets them observe visualizations of attendees on the floor and locate people they want to network with.
Seeing is believing with heat mapping, on either side of the event equation.
Brainwear
Why engage with your eyes, when you can engage with your brain?
It's a marketer's dream to know exactly what's happening inside the consumer's mind, and whether an experience is affecting them at a neurological level. Thanks to the emergence of biometrics and brainwear technologies, marketers are getting one step closer to harnessing this data.
Helping catapult brainwear into the mainstream, Oreo teamed up with Shaquille O'Neal and the Muse brain-sensing headband at New York's Chelsea Market (the site of the original Nabisco bakery where the Oreo was born). The basketball star and Oreo enthusiasts sat across from each other wearing Muse headbands. Above them, an Oreo cookie hung from a mechanical arm four feet above a glass of milk, connected to the headpiece, which sensed relaxed, alpha brain-wave patterns. Just by thinking about dunking the cookie, participants could get the arm's motor to lower it halfway into the glass of milk, perfecting the hands-free dunk in a gamified experience.
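Behind that trigger is a fairly simple signal-processing idea: watch for the 8-12 Hz alpha band to dominate the EEG. The sketch below shows one hypothetical way to compute that with an FFT; the sample rate, threshold and single-channel handling are assumptions for illustration, not Muse's actual algorithm.

```python
import numpy as np

FS = 256  # assumed EEG sample rate in Hz

def alpha_power_ratio(eeg, fs=FS):
    """Fraction of 1-40 Hz spectral power sitting in the relaxed alpha band (8-12 Hz)."""
    eeg = np.asarray(eeg, dtype=float)
    eeg = eeg - eeg.mean()                  # remove DC offset
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    total = spectrum[(freqs >= 1) & (freqs <= 40)].sum()
    alpha = spectrum[(freqs >= 8) & (freqs <= 12)].sum()
    return alpha / total if total else 0.0

def should_dunk(eeg_window, threshold=0.4):
    """Trigger the dunk when alpha activity dominates the window."""
    return alpha_power_ratio(eeg_window) > threshold

# Example: a synthetic 2-second window dominated by a 10 Hz alpha rhythm
t = np.arange(2 * FS) / FS
relaxed = np.sin(2 * np.pi * 10 * t) + 0.2 * np.random.randn(len(t))
print(should_dunk(relaxed))   # likely True for this alpha-heavy signal
```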
Brainwear is also leading to more customized and personalized experiences. At Sundance Film Festival, Acura teamed up with brainwear maker Emotiv for a driving simulation that measured participants' brain waves and personalized the route's landscapes, colors, speed and music on a big screen before their eyes. No two driving experiences were the same. IBM used brainwear for its cognitive-themed pop-up experience at SXSW 2016, pairing a toy BB-8 droid with Emotiv headsets that consumers could use to move the droid with their minds, all designed to mimic The Force from Star Wars. And at Adobe MAX, Emotiv activated a creative station that recorded attendees' brainwaves as they doodled, giving an inside look at the creative process.
Coming down the pike are innovations that can actually alter moods, like Thync, a module that attaches to your forehead and uses proprietary neuro-stimulation programs to safely stimulate nerves on your head and neck to energize or relax you. Imagine how this could transform a general session, whether it's taming an unruly crowd or energizing a snoozing one.
Step aside, eye candy: marketers have their sights set on brain candy.
High-Tech Office Space
Modern offices become incubators for high-tech collaboration and organization.
Forget worktables, open concepts and sign-out sheets. Office spaces have never been more collaborative or more efficient, thanks in part to a host of high-tech integrations. In many ways, offices are transforming into mini innovation labs for events. And since it's not just meeting planners thriving in high-tech office spaces (attendees are, too), the expectation to deliver live event experiences that mirror work life is that much greater today.
Among the latest innovations in workspaces is a program between Steelcase and Microsoft: a creative cube of the future for offices. The technology-enabled spaces are designed to integrate Microsoft Surface devices with Steelcase architecture and furniture to foster creativity and collaboration. Among the configurations on offer: a lounge area built around Microsoft Surface Studio and a Maker Commons that integrates Surface Hub and Surface Book. Ideas for transforming your event lounge spaces? Check.
Another collaboration tool, Google Jamboard, expected to be released to the public this year, is a next-gen whiteboard that incorporates Google's G Suite of cloud-based tools. Employees can access it from anywhere via the Jamboard app or an Android device to see a real-time feed of the board. Work on the board can be exported, saved and shared, and users can add items, rearrange them or pull in images and drawings. It's live-wired note-taking, and in the meeting setting it could transform breakout sessions and learning labs, with paper, notepads and handouts no longer required.
In addition to collaboration tools, other offerings are helping offices run more efficiently, like Envoy, an elegant front-desk registration system that tracks room schedules and sign-ins and sends notifications via email and text. Robin's presence screens and beacons track conference room use and provide real-time analytics on who's in a room, which rooms are used most and by how many people, among other functions. Imagine Robin at a user conference, live-wiring bookable spaces for sponsors and allowing attendees to tap and reserve a room for prospecting meetings on the spot.
More players are entering the space, too. In March, Amazon announced Chime, its competitor to Google Hangouts and a PIN-free, automated video conference calling system. It's set to shake up the live video conferencing space already occupied by the likes of Logitech, Slack and Skype. It could also shake up how your attendees communicate with each other, perhaps between attendees on-site and virtual attendees.
And the lines between how we work, learn and play continue to blur.
Live Augmented Reality
Everything we love about augmented reality will transform general sessions and booth spaces into group experiences sans devices.
Everything we love about augmented reality (the data visualizations, the intrigue, the interaction) is coming to the event setting, transforming general sessions and booth spaces into group experiences without the goggles or devices. We've seen the effect in retail settings, through augmented reality dressing rooms and apps that let us view visuals through our screens. We've seen transparent displays that offer a window into a 3D visual. But a couple of different types of technologies worth keeping an eye on are poised to remove all barriers from augmented reality.
Like Kino-Mo, the low-cost holographic-style technology that caught everybody's attention at CES by making 3D videos and images appear as though they are floating in air. The company's latest product, the Hypervsn screen-less LED technology, boasts rotating 3D visuals generated by spinning propellers, crisp enough for bright environments and visible from a distance. Imagine suspending a 3D product animation over your booth to draw in traffic, or suspending your latest product in a large-scale setting like a general session.
Another product, InAiR, a TV-based technology, transforms any screen into a new medium, pulling content from the web and social media and overlaying it on the screen in a Minority Report-style effect. Take this technology and overlay it onto a big screen displaying a live feed of a presenter, and you can bring data and other visuals into the live conversation, transforming a static keynote or session into a dynamic content experience. We've seen the technology in play in the television industry for several years now (think: Fox Sports' robot that dances across the live game on your television screen during NFL broadcasts).
Marketers can expect to see the technology implemented at events to drive content and engagement, as Ford did at the Detroit Auto Show in January when it debuted an augmented reality presentation. During three presentations, on the Ford GT, the 2018 F-150 and the new EcoSport, a live presenter walked around the vehicle while spectators (and those watching the footage at home after the fact) saw a digital overlay of the vehicle on a screen above, bringing the features and visuals being described to life.