The ROV ground control simulator (Fig. 1) used in this multi-sensory research consists of two workstations: pilot and sensor operator (SO). At the left workstation, the pilot controls ROV flight (via stick-and-throttle inputs as well as by invoking auto-holds), manages subsystems, and handles external communications. From the right workstation, the SO locates and identifies points of interest on the ground by controlling cameras mounted on the ROV.

Each station has an upper and a head-level 17″ color CRT display, as well as two 10″ head-down color displays. The upper CRT at both stations presents a ‘God's Eye’ area map (fixed, north up) with overlaid symbology identifying the current ROV location, flight waypoints, and the current sensor footprint. The head-level CRT (i.e., the “camera display”) presents simulated video imagery from cameras mounted on the ROV. Head-up display (HUD) symbology is overlaid on the pilot's camera display, and sensor-specific data are overlaid on the SO's camera display. The head-down displays present subsystem and communication information as well as command menus.

The simulation is hosted on four dual-Pentium PCs. The control sticks are from Measurement Systems Inc., and the throttle assemblies were manufactured in-house.
Calhoun, G.L. and Draper, M.H. (2006), "Multi-Sensory Interfaces for Remotely Operated Vehicles", in Cooke, N.J., Pringle, H.L., Pedersen, H.K. and Connor, O. (Eds), Human Factors of Remotely Operated Vehicles (Advances in Human Performance and Cognitive Engineering Research, Vol. 7), Emerald Group Publishing Limited, Bingley, pp. 149-163. https://doi.org/10.1016/S1479-3601(05)07011-6
Copyright © 2006, Emerald Group Publishing Limited