Appendix C: Autonomous Mobility
Pages 127-150



From page 127...
... The UGV must be able to avoid positive obstacles, such as rocks or trees, and negative obstacles, such as ditches. It must avoid deep mud or swampy regions, where it could be immobilized, and must traverse slopes in a stable manner so that it will not turn over.
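These go/no-go constraints can be sketched as a per-cell traversability test. The function name and all threshold values below are illustrative assumptions, not figures from the source:

```python
def traversable(slope_deg: float,
                has_negative_obstacle: bool,
                mud_depth_m: float,
                max_slope_deg: float = 30.0,
                max_mud_m: float = 0.2) -> bool:
    """Reject terrain cells containing negative obstacles (ditches),
    deep mud, or slopes steep enough to risk rollover.
    Thresholds are illustrative assumptions only."""
    if has_negative_obstacle:
        return False
    if mud_depth_m > max_mud_m:
        return False
    return slope_deg <= max_slope_deg
```

In a real planner this test would run over every cell of a terrain map, with the slope and mud estimates supplied by the perception system.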
From page 128...
... . Specific perception system objectives for road following, following a planned path cross-country, and obstacle avoidance are derived from the required vehicle speed and the characteristics of the assumed operating environment (e.g., obstacle density, visibility, illumination [day/night]
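One concrete way such an objective is derived from vehicle speed: the obstacle-detection range must cover the distance traveled during perception latency plus the braking distance. A minimal sketch, where the function name and the default latency, deceleration, and margin values are assumptions:

```python
def min_detection_range_m(speed_mps: float,
                          reaction_s: float = 0.5,
                          decel_mps2: float = 3.0,
                          margin_m: float = 2.0) -> float:
    """Range at which a static obstacle must be detected for the
    vehicle to stop in time: distance covered during the perception
    and reaction latency, plus braking distance v^2 / (2a), plus a
    safety margin. Default values are illustrative assumptions."""
    return speed_mps * reaction_s + speed_mps ** 2 / (2.0 * decel_mps2) + margin_m
```

At 10 m/s (about 36 km/h) with these assumed defaults, the required detection range works out to roughly 24 m; doubling the speed roughly triples it, which is why required speed drives the perception-range specification.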
From page 129...
... must at a minimum detect and track a lane to provide an input for lateral or lane-steering control (road following); detect and track other vehicles, either in the lane or oncoming, to control speed or lateral position; and detect static obstacles in time to stop or avoid them. In the urban environment, in particular, a vehicle must also navigate intersections, detect pedestrians, and detect and recognize traffic signals and signage.
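The lateral (lane-steering) control input mentioned here can be sketched as a proportional law on the tracked lane's offset and heading error. The gains, names, and steering limit are illustrative assumptions, not the report's design:

```python
def steering_command(lateral_offset_m: float,
                     heading_error_rad: float,
                     k_offset: float = 0.1,
                     k_heading: float = 1.0,
                     max_steer_rad: float = 0.5) -> float:
    """Steer back toward the lane center: negative feedback on the
    lateral offset and heading error reported by the lane tracker,
    clamped to the steering limit. Gains are illustrative."""
    raw = -(k_offset * lateral_offset_m + k_heading * heading_error_rad)
    return max(-max_steer_rad, min(max_steer_rad, raw))
```

The point of the sketch is the data dependency: without a detected and tracked lane there is no `lateral_offset_m` or `heading_error_rad` to feed the controller, which is why lane detection is the minimum perception requirement.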
From page 130...
... This is important when a vehicle is navigating primarily cross-country but where part of the planned path is on a road segment, probably unstructured, that passes through the ... Active vision refers to the dynamic control of data sources, field of view (e.g., sensor position and focal length), and processes.
From page 131...
... described an improved version of the ROBIN neural-network-based system used in Demo II. This version was used for unstructured road following in the early part of the Demo III program and in other unrelated experiments.
From page 132...
... and Demo III (two vehicles)
From page 133...
... When a different environment is encountered, the use of the initial parameters may lead to degraded performance, requiring manual retuning. Instances of this occurred throughout the ALV, Demo II, and Demo III programs.
From page 134...
... Many of the perceptual cues used to navigate open roads may be available only intermittently because of traffic or parked cars, but these, in turn, can also serve to help define the road. Road following, intersection detection, and traffic avoidance cannot be done in any realistic situation.
From page 135...
... The sensors used on the Demo III XUVs were stereo color video cameras (640 x 480), stereo FLIR cameras (3-5, 320 x 256, cooled, 2-msec integration). "Unknown" means that the operators have not previously seen or walked the terrain.
From page 136...
... FIGURE C-2 Demo III vehicle and PerceptOR vehicle.
From page 137...
... The perception system must assess the density of the surrounding brush to determine if the vehicle can push through or must detour. The Demo III system counted the number of range points in a LADAR voxel to estimate vegetation density.
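The Demo III heuristic the excerpt describes, counting LADAR returns per voxel as a vegetation-density estimate, can be sketched roughly as follows. The voxel size and push-through threshold are illustrative assumptions:

```python
from collections import defaultdict
from math import floor

def voxel_point_counts(points, voxel_size=0.25):
    """Bin 3-D LADAR range points into axis-aligned voxels and count
    returns per voxel, as a crude vegetation-density estimate."""
    counts = defaultdict(int)
    for x, y, z in points:
        key = (floor(x / voxel_size), floor(y / voxel_size), floor(z / voxel_size))
        counts[key] += 1
    return dict(counts)

def can_push_through(counts, voxel_key, max_returns=20):
    """Treat sparse voxels (few returns) as thin brush the vehicle can
    push through; the threshold is an illustrative assumption."""
    return counts.get(voxel_key, 0) <= max_returns
```

Dense brush fills its voxels with many returns, while scattered grass or a few twigs do not, so the per-voxel count separates "detour" from "push through" without any explicit object model.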
From page 138...
... The classification results matched ground truth. The active vision software has not yet been ported to the Demo III XUV.
From page 139...
... Many of the issues identified in the ALV and Demo II programs remain problems today, and many of the capabilities demonstrated in Demo III could be replicated with Demo II algorithms enabled by improved computation. Similarly, there is no way to know how close Demo III performance is to meeting putative FCS or other requirements, since specificity is lacking on both sides.
From page 140...
... The heavy, almost exclusive, dependence of Demo III on an active sensor such as LADAR may be in conflict with tactical needs. Members of the technical staff at the Army NVESD told the committee that LADAR was "like a beacon" to appropriate sensors, making the UGV very easy to detect and vulnerable (U.S.
From page 141...
... Little work has been done on detecting tactical features at ranges of interest. Tree lines and overhangs have been reliably detected, but only at ranges less than 100 meters.
From page 142...
... The studies and experiments on sensor phenomenology supporting the ALV, Demo II, Demo III, and PerceptOR programs, and the experiments at the Jet Propulsion Laboratory (JPL) for a Mars rover, provide evidence that mobility vision requirements can be met by some combination of color cameras, FLIR, LADAR, and radar.
From page 143...
... Multiband FLIR may provide terrain classification capability at night. During the day, FLIR can make use of thermal differences to select correspondences and augment color stereo.
From page 144...
... Table C-4 was initially developed for the Demo III program and was subsequently refined. It summarizes the judgment of robotic vision researchers about the techniques that could potentially lead to the greatest improvement in feature detection and classification.
From page 145...
... They presented such a method and illustrated its application. The value of texture analysis has been suggested in preceding discussions, particularly for terrain classification.
From page 146...
[Fragment of a table, likely Table C-4: terrain features (bush, water/mud, vehicles, humans, trees and tree lines, hills/ridge lines) rated against sensing conditions (grazing angle, illumination, surface texture, resolution, geometry).]
From page 147...
... , depending upon the requirement. Terrain classification is most often done using either unsupervised clustering or a supervised pattern-recognition approach, such as a neural network or other statistical classifier.
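As one concrete instance of the unsupervised route, a minimal k-means clusterer over per-cell feature vectors (e.g., color and texture statistics). The deterministic initialization, feature choice, and iteration count here are assumptions for illustration:

```python
def assign(f, centers):
    """Index of the nearest cluster center (squared Euclidean distance)."""
    return min(range(len(centers)),
               key=lambda i: sum((a - b) ** 2 for a, b in zip(f, centers[i])))

def kmeans(features, k=2, iters=20):
    """Cluster terrain feature vectors into k classes.
    Initializes deterministically from the first k points, then
    alternates nearest-center assignment and center recomputation."""
    centers = [tuple(f) for f in features[:k]]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for f in features:
            groups[assign(f, centers)].append(f)
        centers = [tuple(sum(v) / len(g) for v in zip(*g)) if g else c
                   for g, c in zip(groups, centers)]
    return centers
```

Clustering needs no labeled terrain, but the resulting classes are anonymous; the supervised alternative the excerpt mentions (a neural network or other statistical classifier) trades labeling effort for classes with known meaning such as "grass" or "gravel".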
From page 148...
... SUMMARY In the 18 years since the beginning of the DARPA ALV program, there has been significant progress in the canonical areas of perception for UGVs: road following, obstacle detection and avoidance, and terrain classification and traversability analysis. There has not been comparable progress at the system level in attaining an ability to go from A to B (on-road and off-road)
From page 149...
... Briefing by Gene Klager, CECOM Night Vision and Electronic Sensors Directorate, to the Committee on Unmanned Ground Vehicle Technology, CECOM Night Vision and Electronic Sensors Directorate, Ft. Belvoir, Va., January 14.
From page 150...
... Briefing by Larry Matthies, Supervisor, Machine Vision Group, Jet Propulsion Laboratory, to the Committee on Army Unmanned Ground Vehicle Technology, Jet Propulsion Laboratory, Pasadena, Calif., January 22. Matthies, L., and P

