How to Create the Killer App for Sports Venues through Mobile Application Development



Top Image Credit: dailymail.co.uk

Transforming the Audience's Mobile Phones into a Crowd-Sourced Interactive Display

Smartphones and sports venues make for an interesting combination. They’re already used to increase the convenience factor at sporting events, letting attendees order food and drinks to their seats via their phones, and they augment the experience with additional real-time game information and video feeds. Audience participation apps are becoming more popular as well.

As a leading digital experience agency, we asked what the smartphone could do to improve the experience not just for its owner, but for everyone in the stadium, particularly when thousands of phones are linked together.

Here are some of the initial ideas we pursued:

  • Controlling the flashlight on the phones for synchronized “camera flash” effects
  • Playing synchronized sounds through the phone’s speakers to get crowd chants going or create synchronized sound effects from various areas of the venue
  • Using the display to emit light of adjustable brightness and color, turning the audience into a large tifo-style “display”

In the end we opted to pursue the concept of controlling the phones’ screens. We were guided by the vision of a basketball arena with dimmed lights before the game and 20,000 roaring fans holding up their phones, turning the stands into a massive full-color screen. A similar concept was used at the 2012 London Olympics using fixed-mounted LED displays.

Stadium seats with LED screens

The goal is to combine 20,000+ iPhone and Android screens into a single display that spans the entire seating area of the venue. For example, if a venue has 50 rows of 500 seats each, then with everyone in the audience holding up their phone we would have a “display” resolution of 500 by 50 pixels. Not quite HD, but enough to create some fun animations.

The London Olympics execution had 70,500 light sources embedded in the stands that allowed for some fantastic animations during the show.

Source: Kate Dawkins

Another example that inspired us was AKQA’s implementation of Carol of the Bells using synchronized smartphones.

So we set out to develop an iPhone app prototype that would test the feasibility of doing something similar using the crowd’s smartphone screens. Right out of the gate we ran into limitations that would test the concept and the very limits of people’s devices:

  • We would be dealing with a wide variety of phones and operating system versions. Sporting events in particular draw a diverse audience from a variety of socioeconomic backgrounds, so we would have to expect the unexpected. Developing for the lowest common denominator would be paramount.
  • In a stadium environment–where thousands of people in a very condensed area are trying to get online–data connectivity can drop significantly. We would need to expect low bandwidth and high lag times, perhaps even no connectivity at all.
  • Different screen brightnesses and screen timeouts could cause some “pixels” to be brighter than others, and some to go dark altogether. We would need a native app that could control screen brightness and timeout settings.
  • What if there’s only partial audience participation? Are the animations still recognizable if only 50% of crowd members participate?
  • To create more intricate animations we would need to know exactly where in the venue each device is located. In a stadium environment we would need each audience member to enter their section, row, and seat number into the phone app, as sketched below.
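
To make the seat-to-pixel idea concrete, here is a minimal sketch of how a seat entry could map onto the crowd “display,” using the 50-row, 500-seat example from above. The layout object and function names are our own illustrations, not the prototype’s actual code:

```javascript
// Hypothetical venue layout matching the 50-row, 500-seat example above.
const venueLayout = { rows: 50, seatsPerRow: 500 };

// Convert a 1-indexed row/seat entry into a 0-indexed (x, y) grid coordinate
// on the crowd "display". Section handling is omitted for brevity.
function seatToPixel(row, seat, layout) {
  if (row < 1 || row > layout.rows || seat < 1 || seat > layout.seatsPerRow) {
    throw new Error(`Seat ${row}/${seat} is outside the venue layout`);
  }
  return { x: seat - 1, y: row - 1 };
}

// Example: row 12, seat 248 becomes pixel (247, 11) on the 500x50 display.
console.log(seatToPixel(12, 248, venueLayout));
```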

To conquer these limitations and prove the concept feasible, we broke the prototype into two main components:

  1. The server software that would let us author animations, store them, and relay them to the phones.
  2. The client interface that would run on the phones and control the screen.

For the server side we selected Node.js and Socket.IO. To start, we created an authoring web interface that allowed configuring a multi-step, full-color animation for up to 24 devices.

Web interface to control smartphone screens
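
To illustrate the architecture, here is a minimal sketch of what such a server could look like with Socket.IO 4.x. The event names (“register”, “frame”, “ping-time”) and the frame format are our illustrative assumptions, not the prototype’s actual protocol:

```javascript
const { Server } = require("socket.io");

// Start a standalone Socket.IO server on port 3000 (assumed port).
const io = new Server(3000, { cors: { origin: "*" } });

// socket.id -> grid coordinate, filled in as phones register their seats.
const devices = new Map();

io.on("connection", (socket) => {
  // A phone announces its seat-derived pixel coordinate after launch.
  socket.on("register", ({ x, y }) => devices.set(socket.id, { x, y }));

  // Answer clock-sync pings with the current server time (used later
  // for latency compensation).
  socket.on("ping-time", () => socket.emit("pong-time", Date.now()));

  socket.on("disconnect", () => devices.delete(socket.id));
});

// Send each registered device the color of its pixel in one animation frame.
// `frame` is a 2D array of hex colors indexed as frame[y][x]; `startAt` is an
// absolute server timestamp at which every phone should apply its color.
function broadcastFrame(frame, startAt) {
  for (const [id, { x, y }] of devices) {
    io.to(id).emit("frame", { color: frame[y]?.[x] ?? "#000000", startAt });
  }
}
```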

In parallel we started developing the smartphone client. We started with a simple JS script that received animation cues from the server. Each phone corresponded to a specific location on the animation grid, assigned when the audience member entered their seat location on first opening the application.
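
A correspondingly minimal client sketch, assuming Socket.IO’s browser client and the same illustrative event names as the server sketch above (the server URL is a placeholder):

```javascript
// Connect to the (placeholder) animation server.
const socket = io("https://venue-server.example.com");

// Seat entered by the audience member on first launch, mapped to a grid
// coordinate with the seatToPixel() sketch from earlier.
const pixel = seatToPixel(12, 248, venueLayout);
socket.on("connect", () => socket.emit("register", pixel));

// Naive version: apply each cue as soon as it arrives by filling the whole
// screen with this phone's assigned color.
socket.on("frame", ({ color }) => {
  document.body.style.backgroundColor = color;
});
```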

At first this resulted in a very uncoordinated animation, due to latency in the animation start time between devices and differences in how quickly each phone executed the color changes. We overcame those challenges by measuring the latency to each device before triggering the animation and driving the timing off the server clock instead of the client’s JS engine. In the end we were able to prove the concept:
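
In sketch form, the fix looks roughly like this: estimate each device’s clock offset with one ping/pong round trip (handled by the “ping-time” event in the server sketch), then schedule each color change against an absolute server timestamp instead of applying it on arrival. Again, this is our reconstruction of the approach, not the prototype’s code:

```javascript
let clockOffset = 0; // estimated serverTime - clientTime, in milliseconds

// Estimate the offset with one round trip, assuming symmetric latency.
function measureClockOffset() {
  const sentAt = Date.now();
  socket.emit("ping-time");
  socket.once("pong-time", (serverTime) => {
    const roundTrip = Date.now() - sentAt;
    clockOffset = serverTime + roundTrip / 2 - Date.now();
  });
}

// Replaces the naive handler: wait until the frame's server-time start before
// changing the screen color, so every device flips in the same instant.
socket.on("frame", ({ color, startAt }) => {
  const delay = startAt - (Date.now() + clockOffset);
  setTimeout(() => {
    document.body.style.backgroundColor = color;
  }, Math.max(0, delay));
});
```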

We then built the control software to load animations and map them to the seat maps of larger stadiums. Animations are imported from animated GIFs, which can be swapped out on the fly.
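
A rough sketch of that mapping step, assuming the GIF has already been decoded into an array of flat RGBA byte arrays (any JavaScript GIF decoder can produce these) and reusing the hypothetical broadcastFrame() from the server sketch; the frame rate and lead time are arbitrary illustrations:

```javascript
// Convert one decoded RGBA frame into the 2D hex-color grid the server
// broadcasts, one entry per seat position.
function frameToGrid(rgba, width, height) {
  const grid = [];
  for (let y = 0; y < height; y++) {
    const row = [];
    for (let x = 0; x < width; x++) {
      const i = (y * width + x) * 4;
      row.push(
        "#" +
          [rgba[i], rgba[i + 1], rgba[i + 2]]
            .map((c) => c.toString(16).padStart(2, "0"))
            .join("")
      );
    }
    grid.push(row);
  }
  return grid;
}

// Play an imported animation: stamp each frame with an absolute server-time
// start so clients can schedule the color changes in sync.
function playAnimation(frames, width, height, msPerFrame = 100) {
  const startAt = Date.now() + 1000; // lead time for delivery (assumed)
  frames.forEach((rgba, n) => {
    broadcastFrame(frameToGrid(rgba, width, height), startAt + n * msPerFrame);
  });
}
```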

The next step is to run this at scale with a larger group and to expand it with Aruba’s Beacon Technology for ultra-accurate indoor location and navigation. We’re excited to take this concept to production and to work with partners who are challenged to bring new innovation to their venues and enhance the experience of their audiences.
