Advances in the Integration of Large Data Sets for Seismic Monitoring of Nuclear Explosions
Saved in:
Main Authors: , , , , , , , , ,
Format: Report
Language: eng
Subjects:
Online Access: Order full text
Summary: The National Nuclear Security Administration (NNSA) Ground-Based Nuclear Explosion Monitoring Research and Engineering (GNEMRE) program has been integrating large sets of seismic events and their associated measurements for almost a decade to support nuclear explosion monitoring. During that time the integration process has changed significantly, generally becoming more complex and more automated as the number of events and the range of associated measurements have steadily grown. In this paper, we explain the methodology for integrating database tables from different products that are part of a Knowledge Base (KB) release. The major effort of KB integration is merging events and their associated information in Oracle database tables. We have developed a substantial foundation of structure and software to assure data integrity in the integration of diverse data sets. The structural part of this foundation utilizes Oracle data dictionary tables along with complementary custom database tables. These custom tables contain information specifically related to how KB database objects are built. Information such as descriptions and database types is stored in these custom tables so that the KB structure is easily modifiable, making it more flexible than it would be under a traditional database design. This metadata of the supporting structures is called the schema schema, and all the integration tools are based on this structure. Los Alamos National Laboratory (LANL) has developed a tool to verify that the information in the product database tables is accurate and valid. Called the Quality Control Tool (QCTool), this tool checks that the database tables conform to the database structures found in the schema schema and that the values in the database tables are reasonable. Database tables are not merged until the errors generated by QCTool are either corrected or explained.
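The metadata-driven validation described in the abstract can be illustrated with a minimal sketch. The table names, column specifications, and function below are hypothetical, standing in for the actual schema-schema tables and QCTool checks, which are not detailed in this record; the sketch only shows the general pattern of checking table rows against stored column metadata.

```python
# Hypothetical miniature of a "schema schema": metadata describing how each
# KB table is built (column types, valid ranges, descriptions). In the real
# system this lives in custom Oracle tables alongside the data dictionary.
SCHEMA_SCHEMA = {
    "origin": {
        "lat": {"type": float, "min": -90.0, "max": 90.0,
                "descr": "event latitude in degrees"},
        "lon": {"type": float, "min": -180.0, "max": 180.0,
                "descr": "event longitude in degrees"},
        "mb":  {"type": float, "min": -2.0, "max": 10.0,
                "descr": "body-wave magnitude"},
    }
}

def qc_check(table_name, rows):
    """Return a list of error strings; an empty list means the table passes.

    Mirrors the QCTool idea: structure and value checks are driven entirely
    by the schema-schema metadata, not hard-coded per table.
    """
    errors = []
    columns = SCHEMA_SCHEMA[table_name]
    for i, row in enumerate(rows):
        for col, spec in columns.items():
            if col not in row:
                errors.append(f"row {i}: missing column '{col}'")
                continue
            val = row[col]
            if not isinstance(val, spec["type"]):
                errors.append(f"row {i}: '{col}' has wrong type")
            elif not (spec["min"] <= val <= spec["max"]):
                errors.append(f"row {i}: '{col}'={val} out of range")
    return errors

good = [{"lat": 37.1, "lon": -116.0, "mb": 4.9}]
bad  = [{"lat": 137.1, "lon": -116.0, "mb": 4.9}]
print(qc_check("origin", good))  # []
print(qc_check("origin", bad))   # one out-of-range error for 'lat'
```

Because the checks read their rules from the metadata tables rather than from code, adding a new KB table or tightening a value range only requires updating the schema schema, which is the flexibility the abstract attributes to this design.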
Presented at the Monitoring Research Review: Ground-Based Nuclear Explosion Monitoring Technologies Conference (29th) held in Denver, CO on 25-27 September 2007. Published in the Proceedings of the Monitoring Research Review: Ground-Based Nuclear Explosion Monitoring Technologies Conference (29th), p927-934, September 2007. The original document contains color images.