Appendix N: TA11 Modeling, Simulation, and Information Technology and Processing
Pages 282-293



From page 282...
... There are also important spacecraft computer technology requirements, including intelligent data understanding, development of radiation-hard multicore chips and GPUs, fault-tolerant codes and hardware, and software that runs efficiently on such systems. Another important challenge is developing improved software for reliably simulating and testing complete NASA missions, including human components.
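The fault-tolerant codes mentioned above are commonly built on redundancy and voting. A minimal sketch of triple modular redundancy (TMR) in software, with all names illustrative rather than taken from the roadmap:

```python
# Illustrative sketch of software fault tolerance via triple modular
# redundancy (TMR): the same computation runs three times and a majority
# vote masks a single transient fault (e.g., a radiation-induced bit flip).
# Names here are assumptions for illustration, not roadmap code.
from collections import Counter

def tmr_vote(results):
    """Return the majority value among three redundant results."""
    value, count = Counter(results).most_common(1)[0]
    if count < 2:
        raise RuntimeError("no majority: more than one copy faulted")
    return value

def run_redundant(computation, *args):
    """Run a computation three times and vote on the outputs."""
    return tmr_vote([computation(*args) for _ in range(3)])
```

A single corrupted result is outvoted by the two correct copies; two simultaneous faults raise an error rather than silently returning wrong data.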
From page 283...
... It has been split in two:
• 11.2.4a, Science Modeling and Simulation, and
• 11.2.4b, Aerospace Engineering Modeling and Simulation.
The content of these two technologies is as described in the TA11 roadmap under Section 11.2.4, Science and Aerospace Engineering Modeling, in the subsections titled Science Modeling and Aerospace Engineering, respectively.
From page 284...
... Develop new flight and ground computing software tools (and engage trained computer scientists) to take advantage of new computing technologies by keeping pace with computing hardware evolution, eliminating the multi-core "programmability gap," and permitting the porting of legacy codes.
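The multi-core "programmability gap" refers to the difficulty of restructuring serial legacy codes so they exploit many cores. A hedged sketch of the kind of transformation involved, using a hypothetical per-element kernel (none of these names come from the roadmap):

```python
# Illustrative sketch (not NASA code) of porting a legacy serial loop to
# a multi-core machine: the per-element work is factored into a pure,
# module-level function so a process pool can distribute it across cores.
from concurrent.futures import ProcessPoolExecutor

def legacy_kernel(x):
    # Stand-in for a legacy per-cell computation.
    return x * x + 1.0

def serial_version(data):
    # The original single-core loop.
    return [legacy_kernel(x) for x in data]

def multicore_version(data, workers=2):
    # The ported version: same kernel, work spread over a process pool.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(legacy_kernel, data))
```

The two versions produce identical results; the porting effort lies in isolating side-effect-free kernels like `legacy_kernel` inside codes that were never written with that separation in mind.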
From page 285...
... [Fragment of the appendix's QFD priority-scoring table; the column layout was lost in extraction. Rows visible include: Simulation-Based Training and Decision Support Systems; 11.4.1 Science, Engineering, and Mission Data Lifecycle; 11.4.2 Intelligent Data Understanding; 11.4.3 Semantic Technologies; 11.4.4 Collaborative Science and Engineering; and 11.4.5, each with its QFD scores, weighted total, and an H/M/L priority rating.]
From page 286...
... Access to the space station would not benefit this technology development. This technology will have significant impact because advanced computer architectures, when eventually incorporated into radiation-hardened flight processors, can be expected to yield major performance improvements in on-board computing throughput, fault management, intelligent decision making, and science data acquisition, and will enable capabilities such as autonomous landing and hazard avoidance.
From page 287...
... [Fragment of a roadmap summary table whose columns were interleaved during extraction; reconstructed, the entries read:]
• Testing: Improve reliability and effectiveness of hardware and software testing and enhance mission robustness via new generations of affordable simulation software
• ... new generations of scientific computers (e.g., for cross-scale simulations and data assimilation and visualization in Earth science, astrophysics, heliophysics, and planetary ...
• ... autonomous hazard avoidance in landing on planetary surfaces; adaptive telescope mirror technology; smart rovers; and autonomous rendezvous)
• ... new computing technologies by keeping pace with computing hardware evolution, eliminating the multi-core "programmability gap," and permitting the porting ...
From page 288...
... Technology development is needed to create software tools to help programmers convert legacy codes and new algorithms so that they run efficiently on these new computer systems. Related challenges are developing improved compilers and run-time algorithms that improve load balancing in these new computer architectures, and developing methods to prevent computer hardware failures in systems with hundreds of thousands to millions of cores from impacting computational reliability.
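One established way to keep hardware failures in systems with very many cores from impacting computational reliability is application-level checkpoint/restart: state is saved periodically so a restarted job resumes from the last checkpoint rather than from scratch. A minimal single-process sketch under assumed names (the file name, state shape, and interval are illustrative, not from the roadmap):

```python
# Hedged sketch of application-level checkpoint/restart. In a real
# large-scale code the "state" would be the full simulation state,
# written in parallel; here a single accumulator stands in for it.
import json
import os

CHECKPOINT = "state.json"  # illustrative file name

def save_checkpoint(step, state):
    """Persist the current step and state."""
    with open(CHECKPOINT, "w") as f:
        json.dump({"step": step, "state": state}, f)

def load_checkpoint():
    """Resume from the last checkpoint, or start fresh."""
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            data = json.load(f)
        return data["step"], data["state"]
    return 0, 0.0

def run(total_steps, every=10):
    """Advance to total_steps, checkpointing every `every` steps."""
    step, acc = load_checkpoint()
    while step < total_steps:
        acc += 1.0  # stand-in for one simulation step
        step += 1
        if step % every == 0:
            save_checkpoint(step, acc)
    return acc
```

If the process dies between checkpoints, a restarted `run` repeats only the steps since the last save; the checkpoint interval trades I/O overhead against the amount of recomputation a failure costs.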
From page 289...
... for the NASA Technology Roadmaps study held a workshop on May 10, 2011, at the National Academies Keck Center in Washington, D.C. It focused on Modeling, Simulation, Information Technology, and Processing (NASA Technology Roadmap TA11)
From page 290...
... She also agrees that there is a significant challenge of properly scaling individual models when they are linked together to model complete architectures. Panel Discussion 2: Re-Engineering Simulation, Analysis, and Processing Codes The next session focused on the new classes of programming languages and how to adapt to new multi-core computers and was moderated by Joel Primack.
From page 291...
... He identified gaps in the roadmap: no discussion of on-board computing for large data flows; no discussion of virtual observatories, clearinghouses, search engines, and other tools for NASA science data necessary to perform multi-mission data analysis or anchor models; and no discussion of frameworks or processes to enable modeling and simulation. He noted the significant potential for autonomous and adaptive systems and said their single biggest challenge is testing.
From page 292...
... He believes that the top challenges for NASA include the following: current data services are not sufficiently interoperable; the cost of future data systems will be dominated by software development rather than computing and storage; development is uncoordinated and the support lifecycle for infrastructure and data analysis tools is unpredictable; and a more coordinated approach to data systems software is needed. However, he thinks that NASA can exploit emerging technologies for most of its needs in this area without investing in development.
From page 293...
... Every mission has its own Context-Driven Content Management system to do configuration management, which is part of product data lifecycle management. It was suggested that there should be an effort to advance the state of the art and share that technology across missions for systems engineering and intelligent data understanding.

