Equipment
The Human Factors (HF) Lab is equipped with a diverse range of advanced simulators and sensing technologies that support cutting-edge research in human behavior, human-automation interaction, and cognitive ergonomics. From high-fidelity driving and cycling simulators to immersive virtual and augmented reality systems, each piece of equipment is designed to facilitate realistic, data-rich environments for experimentation and analysis. These tools enable the lab to explore critical topics such as driver fatigue, user safety, human-system integration, and training effectiveness across a variety of domains.

RDS-1000 Driving Simulator
The RDS-1000 from Realtime Technologies is a high-fidelity, research-grade, single-cab sedan simulator. In addition to supporting manual driving, the simulator is equipped with multiple levels of advanced driver assistance systems (ADAS) to enable automated driving capabilities. It features a wide 205° field of view across three 65-inch HD displays and includes weather effects, ambient lighting, spatial audio, and head-up displays. The Human Factors Lab has used this simulator to classify driver fatigue during prolonged automated driving scenarios and to design multimodal warning systems tailored to alarm severity. Current research with this simulator focuses on driver behavior, fatigue, human-automation interaction, and drivers' interactions with vulnerable road users.
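
As an illustration of the kind of analysis behind the fatigue work mentioned above, the minimal sketch below trains a classifier on per-window features often used in fatigue research. The feature names, labels, and synthetic data are assumptions for demonstration only, not the lab's actual pipeline.

```python
# Minimal sketch: classifying driver fatigue from simulator-derived features.
# Feature names and values are hypothetical placeholders, not the lab's
# actual logging format or labeling scheme.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
n = 600  # synthetic stand-in for per-window feature vectors

# Hypothetical per-window features often reported in fatigue research.
df = pd.DataFrame({
    "steering_reversal_rate": rng.normal(8, 2, n),
    "lane_position_sd": rng.normal(0.3, 0.1, n),
    "perclos": rng.uniform(0, 0.6, n),          # proportion of eyelid closure
    "blink_duration_ms": rng.normal(250, 80, n),
})
# Hypothetical binary label: 1 = fatigued window, 0 = alert window.
y = (df["perclos"] + rng.normal(0, 0.1, n) > 0.35).astype(int)

X_train, X_test, y_train, y_test = train_test_split(df, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```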

TT-1000 Driving Simulator
The TT-1000 from Realtime Technologies is a single-cab 18-wheel commercial semi-truck simulator that replicates realistic driving dynamics and immersive environmental conditions, including fog, rain, and nighttime driving, with spatial audio. It supports data collection for studies of truck driver behavior; the lab uses it to investigate commercial driver workload, distraction, hazard perception, and responses under varying road and weather conditions.
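
Hazard-response analyses of the kind mentioned above often reduce to measures such as brake reaction time. The sketch below shows one way to compute it from a time-stamped simulator log; the column names, 60 Hz sample rate, and values are illustrative assumptions, not the TT-1000's actual export format.

```python
# Minimal sketch: estimating brake reaction time to a hazard event from a
# time-stamped simulator log. Columns and values are illustrative only.
import pandas as pd

# Synthetic stand-in for a logged drive: time (s) and brake pedal position (0-1).
log = pd.DataFrame({
    "time_s": [t / 60 for t in range(600)],   # 10 s of driving at 60 Hz
    "brake":  [0.0] * 420 + [0.4] * 180,      # brake applied at t = 7.0 s
})
hazard_onset_s = 6.2  # e.g., lead-vehicle brake lights or a pedestrian appears

# First sample after hazard onset where the brake pedal exceeds a small threshold.
after = log[(log["time_s"] >= hazard_onset_s) & (log["brake"] > 0.1)]
reaction_time_s = after["time_s"].iloc[0] - hazard_onset_s
print(f"Brake reaction time: {reaction_time_s:.2f} s")
```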

Bicycle Simulator
The immersive bicycle simulator from Realtime Technologies features a full-size hybrid bike mounted on a steering-enabled platform, combined with realistic virtual environments. It can run as a standalone system or be networked with the sedan simulator, allowing data to be collected from both car drivers and cyclists simultaneously in the same experiment. The simulator includes weather effects, ambient lighting, and spatial audio, and supports forward projection with a rearview inset to simulate real-world cycling conditions. The lab has used this simulator to study cyclist safety through facial expression analysis and to develop a smartphone-based warning system for riders.
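
When the two simulators are networked, analysis typically starts by aligning both logs on a common time base. The minimal sketch below does this with a nearest-timestamp join; the column names and values are hypothetical, not the simulators' actual log format.

```python
# Minimal sketch: aligning logs from the networked sedan and bicycle simulators
# on a shared time base so car-cyclist interactions can be analyzed together.
import pandas as pd

car = pd.DataFrame({"time_s": [0.00, 0.05, 0.10, 0.15],
                    "car_speed_mps": [13.2, 13.1, 12.8, 12.5]})
bike = pd.DataFrame({"time_s": [0.01, 0.06, 0.12],
                     "bike_speed_mps": [5.1, 5.0, 4.9]})

# Nearest-timestamp join (within 40 ms) produces one combined record stream.
combined = pd.merge_asof(car, bike, on="time_s",
                         direction="nearest", tolerance=0.04)
print(combined)
```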

Virtual Reality and Augmented Reality Headsets
The lab features Meta Quest 3, Meta Quest 2, HoloLens 2, and HTC Vive Pro headsets. The HF Lab conducts extensive research on developing virtual reality and augmented reality applications using these immersive technologies. Recently, a bilingual virtual reality training module for modern manufacturing was developed under an NSF-funded project and deployed in the classroom on Meta Quest 3. An augmented reality application for inventory management has been developed for HoloLens 2. Currently, the lab focuses on developing multi-user virtual reality applications and integrating artificial intelligence within the virtual environment to study cognitive ergonomics.
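
In multi-user VR work of the kind mentioned above, clients generally exchange small per-frame state messages. The sketch below shows one possible pose-update message and its serialization; the field names and JSON transport are assumptions for illustration, not the lab's actual implementation.

```python
# Minimal sketch: a serializable pose-update message of the kind a multi-user
# VR session might exchange each frame. Field names and the JSON transport are
# illustrative assumptions only.
import json
from dataclasses import dataclass, asdict

@dataclass
class PoseUpdate:
    user_id: str
    position: tuple[float, float, float]          # head position in meters
    rotation: tuple[float, float, float, float]   # orientation quaternion (x, y, z, w)
    timestamp_ms: int

msg = PoseUpdate("participant_01", (0.1, 1.6, -2.3), (0.0, 0.0, 0.0, 1.0), 1_712_000)
encoded = json.dumps(asdict(msg))            # send over the session's transport
decoded = PoseUpdate(**json.loads(encoded))  # reconstruct on the receiving client
print(decoded)
```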

Vicon Motion Capture System
The Vicon motion capture system captures full-body movement in six degrees of freedom (6DoF). The system uses Valkyrie cameras and Tracker 4 software to deliver highly precise motion tracking, with an accuracy of up to 0.017 mm. The lab currently uses the system for real-time analysis of posture, biomechanics, and tool interaction, and plans to integrate it with the Unity VR platform to support product testing and safety research aimed at optimizing human-system design.
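
Marker trajectories from the motion capture system can be turned into posture measures such as joint angles. The sketch below computes an elbow angle from three 3D marker positions; the coordinates are illustrative placeholders rather than actual Vicon output.

```python
# Minimal sketch: computing an elbow joint angle from three 3D marker positions
# (shoulder, elbow, wrist). Coordinates are illustrative; in practice they
# would come from the motion capture export or streaming data.
import numpy as np

shoulder = np.array([0.10, 1.40, 0.00])   # meters, lab coordinate frame
elbow    = np.array([0.12, 1.12, 0.02])
wrist    = np.array([0.30, 0.95, 0.05])

upper_arm = shoulder - elbow
forearm   = wrist - elbow
cosine = np.dot(upper_arm, forearm) / (np.linalg.norm(upper_arm) * np.linalg.norm(forearm))
angle_deg = np.degrees(np.arccos(np.clip(cosine, -1.0, 1.0)))
print(f"Elbow flexion angle: {angle_deg:.1f} degrees")
```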

Insta360 Pro Camera
The Insta360 Pro, a high-resolution 360-degree camera, is used to capture immersive video and imagery for virtual environment development and user studies. It supports the creation of realistic VR training scenarios and contextual visual data for situational analysis, and its panoramic capture capabilities make it well suited to studying spatial awareness, environmental perception, and user immersion. Recently, the lab has used this camera in several experiments on remote collaboration with robotic arms.
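
Analyses of 360-degree footage often require relating a pixel location in the equirectangular frame to a viewing direction around the camera. The sketch below shows that mapping; the frame size is used only as an example.

```python
# Minimal sketch: mapping a pixel in an equirectangular 360-degree frame to a
# viewing direction (yaw, pitch). The frame size here is illustrative.
def pixel_to_direction(x, y, width, height):
    """Return (yaw, pitch) in degrees for a pixel in an equirectangular image."""
    yaw = (x / width) * 360.0 - 180.0      # -180 (left) .. +180 (right)
    pitch = 90.0 - (y / height) * 180.0    # +90 (up) .. -90 (down)
    return yaw, pitch

# Example: a point slightly right of the frame center.
print(pixel_to_direction(4200, 1900, 7680, 3840))
```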

Pupil Invisible Eye Tracking Glasses
These lightweight, unobtrusive eye-tracking glasses capture real-world gaze behavior without requiring calibration. In the lab, they are used to study visual attention, situational awareness, and decision-making in dynamic tasks such as driving and training simulations. The glasses enable mobile, high-accuracy eye-movement tracking in both indoor and outdoor environments.
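
Gaze recordings are commonly summarized as dwell time per area of interest (AOI). The sketch below illustrates this with a hypothetical export layout and synthetic samples; it does not reflect the device's actual export schema.

```python
# Minimal sketch: summarizing dwell time per area of interest (AOI) from an
# exported gaze recording. Column names, AOI labels, and timestamps are
# hypothetical placeholders.
import pandas as pd

gaze = pd.DataFrame({
    "timestamp_s": [0.000, 0.005, 0.010, 0.015, 0.020, 0.025],
    "aoi":         ["road", "road", "mirror", "mirror", "road", "dashboard"],
})

# Approximate dwell time per AOI as (number of samples) x (sample interval).
sample_interval_s = gaze["timestamp_s"].diff().median()
dwell = gaze.groupby("aoi").size() * sample_interval_s
print(dwell.sort_values(ascending=False))
```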

Actiheart Heart Rate Monitor (CamNTech)
The Actiheart is a compact, wearable device that combines heart rate monitoring with accelerometry to assess physical exertion, stress, and cognitive workload. It supports continuous data collection in real-world and simulated environments. The lab uses the Actiheart to investigate physiological responses in tasks that require sustained attention, mental effort, and multitasking.
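
Inter-beat intervals from such recordings are often summarized with standard heart-rate-variability metrics. The sketch below computes SDNN and RMSSD from synthetic interval values; the numbers are placeholders, not Actiheart output.

```python
# Minimal sketch: two standard heart-rate-variability summaries (SDNN, RMSSD)
# computed from a series of inter-beat intervals. Values are synthetic.
import numpy as np

ibi_ms = np.array([812, 798, 805, 790, 820, 815, 801, 795])  # inter-beat intervals (ms)

sdnn = np.std(ibi_ms, ddof=1)                   # overall variability
rmssd = np.sqrt(np.mean(np.diff(ibi_ms) ** 2))  # short-term (beat-to-beat) variability
mean_hr_bpm = 60_000 / ibi_ms.mean()

print(f"Mean HR: {mean_hr_bpm:.1f} bpm, SDNN: {sdnn:.1f} ms, RMSSD: {rmssd:.1f} ms")
```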