
3 Ensuring Data Access and Utilization
Pages 42-52



From page 42...
... The group that receives data indirectly includes a very large number of private citizens who individually consume a small amount of data from sources such as AccuWeather and Weather.com. For this second group, the importance of environmental satellite data and the total volume consumed will grow dramatically as mechanisms for delivery of the data continue to improve.
From page 43...
... (Metadata for federal geospatial data are required by Executive Order 12906, "Coordinating Geographic Data Acquisition and Access: The National Spatial Data Infrastructure," which was signed in 1994 by President Clinton.) These metadata can then be used as a mechanism for distributing key words to agency Web sites and public search engines, thus enabling users to quickly and easily find and understand the data.
From page 44...
... While incremental, or even step-function, increases in transmission bandwidth are possible via NOAAPORT-like direct broadcast system technologies, these systems do not scale well and offer limited additional capability unless the majority of the data products being broadcast are required by most of the user base. Although there will always be some high-volume users who can and want to receive all available data (e.g., NCEP)
From page 45...
... Although it is probable that Internet services for data distribution will satisfy a large fraction of the user community, it remains likely that, because of concerns about the availability and quality of commercially provided Internet service as well as the cost of maintaining private networks, direct broadcast will remain a valid option for many users, either as the primary data transport method or as a backup.
From page 46...
... · Peer-to-peer access: As data volumes increase, the traditional "person in the loop" search and order will be increasingly supplemented by peer-to-peer2 system interfaces that automatically harvest NOAA data repositories for the data needed by users to generate their own domain-specific information products.
· Maintaining a capability for data assurance: Data assurance, the guaranteed delivery of scientifically valid data, is a key user requirement that system architecture, design, implementation, and operations must all support.
From page 47...
... In summary, the higher-resolution data and improved temporal coverage offered by the NPOESS and GOES-R systems require innovative approaches to the production, archiving and storage, and distribution of the data so that NOAA's goals for the utilization of environmental satellite data by a broad range of users for the benefit of society can be realized.3

Indirect Users

Internet and cellular radio communication technologies are already expanding the use of NOAA's environmental satellite data by making data-derived products

3 P.E. Ardanuy, W.R.
From page 48...
... With constrained budgets a fact of life, NOAA's emphasis should include the assured availability of a prioritized, validated data product set, beginning with calibrated-at-aperture radiances, atmospherically corrected radiances, cloud and other masks, and basic products of key utility (e.g., measurements of sea-surface temperature) before focusing on the complete spectrum of possible products.
From page 49...
... , which will provide federal, state, and local governments, as well as private citizens, with "one-stop" access to geospatial data. Interoperability tools, which allow different parties to share data, will be used to migrate current geospatial data from all levels of government to the NSDI, following data standards developed and coordinated through the Federal Geographic Data Committee using the standards process of the American National Standards Institute.
From page 50...
... The Terra satellite alone will have doubled NASA's Earth science holdings in less than 1 year. At 194 gigabytes per day, Terra takes in almost as much data each day as the Hubble Space Telescope acquires in an entire year, as much data as the Upper Atmosphere Research Satellite obtains in 1.5 years, and as much data as the Tropical Rainfall Measuring Mission obtains in 200 days.
From page 51...
... In attempting to create global measurements of upper air temperatures with long-term stability, two University of Alabama, Huntsville scientists developed a relatively homogeneous time series of globally gridded, deep-layer temperature measurements in 1990. These products required highly specialized knowledge of the NOAA microwave instruments and of the spacecraft on which they flew.
From page 52...
... NOAA has deemed a few of these data construction activities as vital for its climate monitoring function and so has initiated small contracts with groups such as the University of Alabama, Huntsville, which then provide monthly updates of these products, usually by the tenth day after each month's end. As additional spurious effects are discovered and minimized, these data sets are updated, metadata files describing the issue are created, and if appropriate, the results are published in the peer-reviewed literature.

