TY - CHAP
AB - The ROV ground control simulator (Fig. 1) used in this multi-sensory research consists of two workstations: pilot and SO. At the left workstation, the pilot controls ROV flight (via stick-and-throttle inputs as well as invoking auto-holds), manages subsystems, and handles external communications. From the right workstation, the SO is responsible for locating and identifying points of interest on the ground by controlling cameras mounted on the ROV. Each station has an upper and a head-level 17″ color CRT display, as well as two 10″ head-down color displays. The upper CRT of both stations displays a ‘God's Eye’ area map (fixed, north up) with overlaid symbology identifying current ROV location, flight waypoints, and current sensor footprint. The head-level CRT (i.e., the “camera display”) presents simulated video imagery from cameras mounted on the ROV. Head-up display (HUD) symbology is overlaid on the pilot's camera display, and sensor-specific data are overlaid on the SO's camera display. The head-down displays present subsystem and communication information as well as command menus. The simulation is hosted on four dual-Pentium PCs. The control sticks are from Measurement Systems Inc., and the throttle assemblies were manufactured in-house.
VL - 7
SN - 978-0-76231-247-4, 978-1-84950-370-9/1479-3601
DO - 10.1016/S1479-3601(05)07011-6
UR - https://doi.org/10.1016/S1479-3601(05)07011-6
AU - Calhoun, Gloria L.
AU - Draper, Mark H.
ED - Cooke, Nancy J.
ED - Pringle, Heather L.
ED - Pedersen, Harry K.
ED - Connor, Olena
PY - 2006
Y1 - 2006/01/01
TI - 11. Multi-Sensory Interfaces for Remotely Operated Vehicles
T2 - Human Factors of Remotely Operated Vehicles
T3 - Advances in Human Performance and Cognitive Engineering Research
PB - Emerald Group Publishing Limited
SP - 149
EP - 163
Y2 - 2024/04/23
ER -