

Dinosaurs, Planes, and a VR Haunted House among prototypes at FIEA’s 2025 GameLab Presentations

April 28, 2025

FIEA’s Cohort 21 graduate students demonstrated a variety of interactive prototypes on Wednesday at their final GameLab presentations in the FIEA Sound Stage. Divided into eight teams, 71 students explored a mix of technologies, including voice recognition, AI, VR, and haptics, to solve problems faced by industries adjacent to gaming.
Cohort 21 students help each other suit up for the FIEA Funhouse experience in a VR headset, noise-cancelling headphones, and a rumble vest.

GameLab, a Spring semester course led by FIEA Director of Strategic Partnerships Erik Sand, explores the use of games in non-traditional applications such as medical and military simulation, hospitality, science, and entertainment. Students create prototypes and simulations that teach science to different audiences, train military personnel, simulate social interactions for customer service practice, and more. Each organization’s subject matter experts (SMEs) assisted the students along the way, providing valuable insight as the projects developed.

One of the eight teams was led by Alyona Speranskaia, whose team created an AI-powered simulation to help individuals with intellectual disabilities learn social cues and improve their career opportunities. Speranskaia described how much her team enjoyed the challenging experience: “My teammates shared that one of their favorite parts was seeing everything come together. We all helped each other, and by the end, we could really see the impact and potential of what we’d built.”

Spring 2025 GameLab Projects from Cohort 21:

Dinosaur VR, led by Kurt D’amour

Partner/SME: Nick Zuccarello

FIEA 3D Art Faculty Nick Zuccarello, who regularly presents at the Orlando Science Center, asked Kurt D’amour’s team to design a VR experience that lets guests travel back in time to see life-size, living dinosaurs in their natural habitat.

“Static museum exhibits often struggle to capture the imagination of today’s audiences,” D’amour described. “A fossil on a pedestal doesn’t convey the awe-inspiring scale, movement, or appearance of a living dinosaur. Our solution was to use virtual reality to place guests face-to-face with full-sized, animated dinosaurs—skin, muscle, behavior, and all—letting them explore a scientifically grounded prehistoric environment in a way that feels immersive, memorable, and modern.”

The team had to work on achieving scientific accuracy, which required a deep dive into paleontology. “Balancing entertainment with educational authenticity became a challenge we were proud to tackle,” D’amour said.

Dinosaur VR team photo

Dinosaur VR


UCF Virtual Campus, led by Ash Mckee

Partner/SME: Jeremy Paulding


Photo of programming student navigating through app

FIEA Programming Faculty Jeremy Paulding set out to help resolve problems that students may face at a large university like UCF: anxiety while navigating the many buildings on Main Campus, lack of awareness of traditions and activities around campus, and difficulty finding a sense of community. Ash Mckee’s team created an interactive app-based experience in which the user (a UCF student) can explore UCF in a virtual space.

“The unique challenge of this project was building a server-based multiplayer game,” Mckee said. “Nothing in our curriculum quite prepared our programmers for this task, and it definitely took up the majority of our development time! Luckily, we had someone who had worked with web databases before, and that helped guide our learning of this technology.” Mckee went on to add, “By far the easiest part was getting an accurate map of UCF, as Google Maps data was used to near-instantly generate the template for our playable space.”


System for Emotional Navigation and Service Enhancement (SENSE), led by Alyona Speranskaia

Partner/SME: Toni Jennings Exceptional Education Institute

“Millions of individuals with intellectual disabilities (ID) face barriers to entering the workforce, often due to challenges in interpreting social cues and navigating customer service interactions,” Project Lead Alyona Speranskaia said. “These are skills that can be developed, especially with the help of realistic AI simulations that allow for safe practice.”

The SENSE team created an AI-powered, realistic training simulation to help users practice customer service interactions and learn these social cues, integrating two characters into their simulation. The simulation used adaptive AI dialogue, AI-generated voiceovers, voice input, a scenario builder for customized practice, and a transcript to review.

It didn’t come without its challenges. “As a team, we faced the challenge of making AI feel human and using it to emulate customer service,” Speranskaia described. “That meant aligning narrative design with a MetaHuman character and building logic so the character could react naturally.”



Signal Surge, led by Courtney McCracken

Partner/SME: Team Dark Knights/USAF 338th Cyber Training


While Team Dark Knights teaches apprentices about network and information systems, Project Lead Courtney McCracken said that cybersecurity can be a dense subject, especially in the dynamic and hostile environments where troops may be deployed. The solution McCracken’s team developed was a strategy game designed to introduce and reinforce cybersecurity concepts for airmen. The game runs in real time to simulate the pressure of a real-world situation.

Building something that could be developed further was an important goal of the project. “To encourage our SMEs to continue development on this project further in the future, we decided to create designer-friendly and modular gameplay systems that are easy to iterate on,” McCracken explained. “So, we made sure not to take shortcuts with the development of these tools.”

Team Dark Knights presented each of the students on the development team with a patch to honor their hard work on their project after their GameLab presentation. “The camaraderie of our team, ‘Stalwart,’ has been an awesome and important part of getting this project to this point,” McCracken said. “It’s been such a pleasure seeing everyone’s work come together, and to see everyone uplift each other.”


ATC Experience, led by Jenna Stellmack

Partner/SME: Noa Baggs & Guest SME Clyde Rinkinen (Embry-Riddle Aeronautical University)

Language and phraseology are integral facets of air traffic control, but most current simulations do not prioritize teaching them. Using AI and voice recognition, Jenna Stellmack’s team created a virtual reality teaching simulation for air traffic controllers in which players interact in real time with AI pilots to help planes land and take off safely while practicing proper language and phraseology.

“Our biggest challenges were integrating voice recognition and AI and keeping the simulation accurate while still making it a fun and relatively low-stress experience,” Stellmack said. “Since I’ve never made a game in VR before, it was a cool experience to put on the headset and see everything we’d made come to life!”



AMD FSR Demo, led by Katherine Teng

Partner/SME: Jarrett Wendt, AMD Orlando R&D Center University Program


Katherine Teng’s team assisted AMD in developing a solution to test and debug their frame generation software, FSR. The student team created an in-house test with customizable settings for the software.

“Being a team lead and navigating the responsibilities that come with that was a challenge, as I’ve never been a lead for a team of this size before,” Teng said. “But my team and Erik have really helped me grow and become more comfortable in this position.”


BreakAway, led by Joshua Chan

Partner/SME: Jenn McNamara, BreakAway Games

Joshua Chan’s team was tasked with making a serious game to train college-aged students and young professionals on the steps involved in engineering a concise, effective AI prompt. Chan’s team worked with BreakAway, a renowned game company, which shared what Chan described as valuable, tangible feedback, including obstacles that could appear down the road.

“We chose to abstract this concept into a restaurant setting and provide iterative feedback to various customer orders, which act as different stages of an AI prompt,” Chan said. “In this way, we are able to embody the AI itself and guide the customers through creating a good prompt.”



FIEA Funhouse, led by Greg Kelley

Partner/SME: Chris Roda


Suiting up

FIEA Funhouse, developed by Project Lead Greg Kelley’s team, integrated different technologies into one high-impact, multi-sensory experience. With the support of SME and FIEA technical art faculty Chris Roda, Kelley’s team designed a multi-area horror attraction using directional walking, live-tracked props, and haptics within the confines of Studio 500’s MOCAP stage. While navigating the haunted house, the playtester wore a VR headset, noise-cancelling headphones, and a rumble vest, a wearable that provides tactile feedback as the player interacts with the virtual environment. While the playtester was immersed in the sights, sounds, and feelings of the experience, an actor fully decked out in MOCAP gear could appear in the playtester’s view as a spooky creature.

Kelley said he loved watching playtesters realize that the recording of their real-life path differed from the straight line they felt they were walking while immersed in the game. “Seeing that ‘how did you do that?!’ moment was really cool,” Kelley said.