Software

The IRL has developed, and continues to develop, applications and libraries that support our research in adaptive & intelligent training, game-based learning, multi-modal simulation, and competitive learning.

These fall into two groups, applications and supporting libraries:

Applications

Adaptive & Intelligent Training

DIVAARS: The Dismounted Infantry Virtual After Action Review System, funded by the Army Research Institute, is our first after action review (AAR) system and focuses on dismounted infantry in Military Operations in Urban Terrain (MOUT) scenarios.
JTACAARS: The Joint Terminal Attack Controller After Action Review System, funded by the Office of Naval Research, extends the AAR concept to Fire Support Teams.
GEAARS: This game-based AAR system records video of any game-based training system (using FRAPS or Camtasia Studio) and replays all of the trainee video feeds in a single AAR session, keeping playback synchronized so that any feed may be highlighted (see the sketch after this list).
TeachAARS: Based on GEAARS, this AAR system was prototyped for the training of student teachers.
COGS: The CAN-oriented Objective-based Generator of Scenarios semi-automatically creates adaptive scenarios, producing a variety of qualitatively similar scenarios rather than a "one size fits all" experience.
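
The synchronized playback that GEAARS provides can be pictured as a single master clock that every recorded feed follows, with highlighting applied on top of the shared timeline. The sketch below is a minimal illustration of that idea in Python; the names (Feed, AARSession, step, highlight) are hypothetical and do not reflect the actual GEAARS code.

    # Minimal sketch of synchronized multi-feed playback, in the spirit of GEAARS.
    # All names here are illustrative, not the real system's API.
    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class Feed:
        """One recorded trainee video feed (e.g., a FRAPS or Camtasia capture)."""
        name: str
        position: float = 0.0  # current playback position, in seconds

        def seek(self, t: float) -> None:
            # A real player would seek the video decoder; here we only track time.
            self.position = t

    @dataclass
    class AARSession:
        """Drives every feed from one master clock so playback stays in sync."""
        feeds: List[Feed] = field(default_factory=list)
        master_time: float = 0.0
        highlighted: Optional[str] = None

        def step(self, dt: float) -> None:
            # Advance the master clock and re-seek every feed to match it.
            self.master_time += dt
            for feed in self.feeds:
                feed.seek(self.master_time)

        def highlight(self, name: str) -> None:
            # Highlighting changes which feed the review emphasizes, not the clock.
            self.highlighted = name

    session = AARSession(feeds=[Feed("trainee-1"), Feed("trainee-2"), Feed("instructor")])
    session.highlight("trainee-2")
    for _ in range(90):          # play three seconds at 30 fps
        session.step(1.0 / 30.0)
    print(round(session.master_time, 2), session.highlighted)

Because every feed is slaved to the same clock, highlighting or switching feeds never lets the recordings drift apart.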

Game-based Learning

GamePAB: The Game Performance Assessment Battery, funded by the Army Research Institute, is a system for assessing a user's current ability in areas common to game-based training (movement, targeting and communication).
Marine Corps Planning Process (MCPP) Game: This allows Marines to practice the process of planning a course of action in a game setting.
InnerCell: This system provides an experimental setting for studying user interface variations for game-based training, using a narrative based on the interactions of the immune system.

Multi-modal Simulation

HapMed: Microcode and an Android-based instructor app for haptic part-task trainers for combat medics.
FITT: The Fully Immersive Team Training system provides team training in immersive virtual reality.

Competitive Learning

CORRECT: The Component-based Online Registration and Reporting Environment for Contests and Tournaments is a web interface for handling registrations for a competitive learning event.
SPARTA: The Submission of Programs for Adjudication and Response with Tournament Administration is a web interface for submitting competition entries for judging.
HSPT Compendium: Mobile app for iOS and Android that provides an overview of the UCF High School Programming Tournament.

Supporting Libraries

ATLAS: The Adaptable Tool Library for Advanced Simulation is a library that encompasses many basic, low-level functions needed across multiple applications and libraries, and is used throughout our software.
GEMINI: The General-purpose, Environmental, Modular, Interactive Network Interface is an architecture for distributed, dynamic virtual worlds that supports multiple databases and multiple protocols.
MUSES: Multi-platform Unifying Software for Embedded Systems attempts to encapsulate the differences between mobile platforms to minimize changes needed to run apps across different devices.
PYTHAGORAS: Procedural Yielding Techniques and Heuristics for Automated Generation of Objects within Related and Analogous Scenarios is a scenario-generation engine that builds a variety of scenarios that are qualitatively similar yet different (see the sketch after this list).
SIRENS: The Software Infrastructure for the Recording, Emission, and Networking of Sound is a library for recording, playback, and transport of audio data. It collects all of our work on capturing and replaying sound and forms the basis for our applications in this area.
SOCRATES: The System of Object-based Components for Review and Assessment of Training Environment Scenarios is an engine for performing after action reviews. It forms the basis for our research in the AAR process.
VESS: The Virtual Environment Software Sandbox is a library for developing virtual and augmented environments. It includes various hardware drivers, motion models for connecting hardware with graphical components, and support for visual, aural, haptic and olfactory modalities.
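
The "qualitatively similar, yet different" scenarios that PYTHAGORAS (and COGS) aim for can be thought of as sampling scenario parameters within constraints that preserve the training objective while varying surface details. The sketch below illustrates that idea in Python; the Scenario fields and generate_variants function are hypothetical and do not reflect the actual PYTHAGORAS interface.

    # Illustrative sketch of constrained scenario variation, in the spirit of
    # PYTHAGORAS/COGS.  All names here are hypothetical, not the real engine's API.
    import random
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Scenario:
        # Hypothetical scenario parameters, for illustration only.
        enemy_count: int
        entry_point: str
        time_of_day: str

    def generate_variants(base: Scenario, n: int, seed: int = 0) -> List[Scenario]:
        """Produce n variants that differ in surface details while keeping the
        objective-relevant difficulty close to the base scenario."""
        rng = random.Random(seed)
        entry_points = ["north gate", "south alley", "rooftop"]
        times_of_day = ["dawn", "noon", "dusk", "night"]
        variants = []
        for _ in range(n):
            variants.append(Scenario(
                # Constrain the difficulty driver so the variants stay
                # qualitatively similar to the base scenario.
                enemy_count=base.enemy_count + rng.choice([-1, 0, 1]),
                entry_point=rng.choice(entry_points),
                time_of_day=rng.choice(times_of_day),
            ))
        return variants

    base = Scenario(enemy_count=4, entry_point="north gate", time_of_day="noon")
    for variant in generate_variants(base, 3):
        print(variant)

In this toy version, qualitative similarity is enforced by the tight bound on enemy_count; a real generator would encode the training objectives as richer constraints over many more parameters.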