“While McLane Stadium is considered unique because it was built from scratch, the Ferrell Center was a retrofit, so it had its own set of distinct challenges. After our successful deployment at McLane Stadium, we updated the Ferrell Center, which is 43,000 square feet and hosts basketball games and other events. Deploying Wi-Fi technology in an open-air stadium versus a closed stadium is drastically different and comes with a new set of challenges.” – Bob Hartland, Vice President of Information Technology Infrastructure at Baylor University
Bob Hartland is the Vice President of Information Technology Infrastructure at Baylor University. Baylor University is a private university in Waco, Texas, and one of the oldest continually operating universities in the state, with 17,000 students and over 1,200 academic staff. McLane Stadium and the Ferrell Center are the two venues at the center of Baylor’s athletic program, which competes in the NCAA Division I Big 12 Conference.
As part of our Game Day is Everyday series, I chatted with Bob about his unique technology goals and challenges, and how IT solutions are a critical part of McLane Stadium & Ferrell Center’s ‘game day’ playbook.
McLane Stadium was a new construction, built from scratch, and seats 45,000 people. Last weekend we had an announced crowd of 50,000 spectators! McLane Stadium replaced Floyd Casey Stadium, which was around 60 years old and whose most advanced technology feature was the outdoor lights. We had two years to build a new stadium, and we wanted it to be a technology showcase. One of the key components would be data transport, so it was imperative that we have a strong, robust network that could handle the bandwidth requirements. We also looked for a partner to create a mobile app to draw fans in and keep them engaged. There are not many games you can’t stream to your home or mobile devices, so it was important for our IT team and our partner to find a way to make the fan experience at the game more enjoyable.
Oftentimes, we found security to be an afterthought, especially when it comes to our vendors. Operations versus security versus aesthetics was at the center of many decisions. During the construction of McLane, we learned that “form does not always follow function.” It’s important to have an IT group at the table to take on the conductor role and ensure everyone is in sync.
We picked an app provider that offered many features, including instant replay, and were told that we would be the first collegiate venue to have instant replay available within the stadium. Currently, we have six different cameras, so if a fan wants to go back and look at a play, they can see it from six different angles. Despite the Ferrell Center being a closed arena, we were able to cover it satisfactorily with Extreme technology, which was a key win considering we hold over 150 events there each year. Security goals shared by the Ferrell Center and McLane Stadium included protecting critical infrastructure assets and minimizing service interruptions.
Some of the more complex IT needs we bumped into pertained to instant replay, because no collegiate venue had done it before. However, there were a few professional venues dealing with it at the time. We needed proper density in our design to handle the bandwidth that comes with mobile streaming, and our priority going in was figuring out what our bandwidth requirements were. After some trials with rate limiting, we ended up with 5 Mbps in the inner seating bowl and 10 Mbps up in the suites. The risk with unlimited bandwidth in a large venue with thousands of people in the stands is degraded performance, but that has not been the case for us. Whether you’re in line at concessions, in the bowl, or in a suite, you get exactly the quality of service that we advertise.
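To put those rate limits in context, here is a rough back-of-the-envelope sketch. It assumes the 5 Mbps and 10 Mbps caps are applied per client and that only a fraction of fans stream at once; the concurrency figure is an illustrative assumption, not a Baylor number, and this is not the stadium’s actual capacity model.

```python
# Illustrative capacity sketch for a stadium Wi-Fi bowl.
# Seat count, per-client cap, and AP count come from the interview;
# the concurrency factor is an assumed value for illustration only.

BOWL_SEATS = 45_000      # McLane Stadium seating capacity
BOWL_CAP_MBPS = 5        # assumed per-client rate cap in the seating bowl
CONCURRENCY = 0.25       # assumed share of fans actively streaming at once
ACCESS_POINTS = 460      # APs deployed across the venue

def worst_case_demand_gbps(seats: int, cap_mbps: int, concurrency: float) -> float:
    """Aggregate demand if every active client pushes up to its rate cap."""
    return seats * concurrency * cap_mbps / 1_000

if __name__ == "__main__":
    demand = worst_case_demand_gbps(BOWL_SEATS, BOWL_CAP_MBPS, CONCURRENCY)
    print(f"Bowl demand at assumed peak: ~{demand:.1f} Gbps")
    print(f"Average active clients per AP: ~{BOWL_SEATS * CONCURRENCY / ACCESS_POINTS:.0f}")
```

Even with conservative assumptions, the aggregate quickly reaches tens of gigabits, which is why per-client caps and AP density decisions go hand in hand.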
Another unique action we’ve taken is requiring a portal to on-ramp, which is done by clicking a button saying you agree to the terms and conditions of our stadium’s ISP. A side benefit of the portal is that it gives everyone the opportunity to check their progress as they’re on-ramping. To offer instant replay and support streaming video, we needed robust network access, which meant deploying 460 APs. Many systems prefer a separate network, so we looked at air-gapping between them and were surprised by how many interdependencies there were. When it comes to our network, capacity and reliability are paramount, but with data security we try to protect our assets without interfering with our customers’ experience.
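For readers unfamiliar with that on-ramp flow, the sketch below shows a minimal, hypothetical click-to-accept splash page. It is not the stadium’s actual portal: real deployments delegate enforcement to the wireless controller, and the endpoints and in-memory state here are invented purely to show the shape of the flow.

```python
# Minimal captive-portal sketch (hypothetical; not the venue's actual portal).
# A client is held on a splash page until they accept the ISP's terms; in a
# real deployment the WLAN controller, not this app, grants network access.
from flask import Flask, request, redirect

app = Flask(__name__)
accepted_clients = set()  # real deployments track this in the controller, not memory

@app.route("/")
def splash():
    return (
        '<form action="/accept" method="post">'
        "<p>By connecting you agree to the venue ISP terms and conditions.</p>"
        '<button type="submit">Accept and connect</button>'
        "</form>"
    )

@app.route("/accept", methods=["POST"])
def accept():
    accepted_clients.add(request.remote_addr)
    # Here the portal would notify the controller to authorize this client.
    return redirect("/status")

@app.route("/status")
def status():
    if request.remote_addr in accepted_clients:
        return "You're on-ramped. Enjoy the game."
    return redirect("/")

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```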
It’s constantly growing. One of the best examples, although it has nothing to do with sports venues, was the building of a beautiful fountain in the center of campus. While they were building it, I got an email saying they needed an IP address for the fountain to manage the water flow and lights. Everything is networked, and I do not see a plateau; rather, I see more and more requests, which could pose another big challenge.
While the Internet of Things is here to stay, we must find solutions to our security concerns, especially when we’re dealing with concessions at the stadium. We tell vendors they must supply their own networking equipment and that we will manage the physical layer. Some vendors even request their own Wi-Fi, which we deny because we already have public, robust Wi-Fi. We’re not letting any Wi-Fi other than McLane Wi-Fi into the stadium.
As with anything I’ve done in my IT career, I’ve tried to build a network that can handle the unknown. I must have flexibility, and frankly, when I’m picking partners, I need that from them as well. In one case, the coaches on the field used a 2.4 GHz system to communicate with the booth. I’ve heard that in other stadiums these systems would get interference from fans, but Extreme was able to go in and design around the coaches’ system, resulting in no interference on the field.
We also found that photographers’ cameras are internet-capable. We were told during the design phase not to let any Wi-Fi spill onto the field, and now our team is getting requests for additional signal on the field. So we strategically deployed access points, experimented with channels, and eventually installed a solution recommended by Extreme that meets on-field needs without impacting fans in the stands.
Other unique needs include ticketing, security, and maintaining our 160 high-density security cameras. We like air-gapping with physically separate networks, unless interdependencies are required, in which case we use policies and routing tables. Our deployment took place back in 2014, so we are in our sixth season using older technology. We deployed pretty much everything overhead, which became a problem when we learned students were standing on their seats and causing interference. Ultimately, we had to relocate some of our antennas.
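Where interdependencies prevent a full air gap, the policy approach usually boils down to a deny-by-default rule set that permits only the specific flows a system needs. The sketch below is purely illustrative under that assumption; the segment names, subnets, and flows are invented and are not Baylor’s actual design.

```python
# Hypothetical sketch: derive deny-by-default rules from the interdependencies
# that prevent full air-gapping. Segments, subnets, and flows are invented.

SEGMENTS = {
    "security_cameras": "10.10.0.0/24",
    "ticketing":        "10.20.0.0/24",
    "video_storage":    "10.30.0.0/24",
    "fan_wifi":         "10.40.0.0/22",
}

# Only the flows a system genuinely needs are allowed between segments.
REQUIRED_FLOWS = [
    ("security_cameras", "video_storage", "tcp/554"),  # e.g. camera streams to recorders
    ("ticketing",        "video_storage", "tcp/443"),  # e.g. audit uploads
]

def build_rules(segments, flows):
    """Emit permit rules for required flows, then a default deny."""
    rules = [
        f"permit {proto} {segments[src]} -> {segments[dst]}"
        for src, dst, proto in flows
    ]
    rules.append("deny ip any -> any  # default: segments stay isolated")
    return rules

if __name__ == "__main__":
    for rule in build_rules(SEGMENTS, REQUIRED_FLOWS):
        print(rule)
```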
In terms of design and uniqueness, the fan experience can pose challenges, especially with the cameras. On game day, we verify a list of “life safety” applications, including voice evacuation and elevator communications and management, to make sure they are online and ready to go. During the event, we have eight public safety agencies represented, including officers from Parks and Wildlife since we are located on the Brazos River. Since they all have unique communication and security needs, we must ensure audio and video are running smoothly.
As far as looking ahead to future plays, it’s all about the fans for us. We’ve got to engage with them the best way we can, and so far we’ve had success on-ramping people within the two venues. We’re working with our app provider to roll out new features to further engage fans, such as friendly competitions on their mobile devices.
To explore more IT ‘Game Day’ stories from our customers, and to further learn how our networking solutions support the unique challenges across all industries, visit our “Game Day is Everyday” landing page.