Data Audit & Clean Up

Data Audit

Reduce time spent searching for missing data, lower spend on new data, and improve E&P efficiency. Data management budgets are squeezed and resources stretched during challenging times for the E&P industry. Facing increasing volumes of new data, exploration teams assign a low priority to legacy data, and committing resources and budget to sorting legacy data without obvious value can be difficult. However, old 2D seismic sections can support later 3D surveys, and well data of any age helps improve reservoir characterization. Quality data reduces the risk associated with E&P planning and investment decisions. Legacy data therefore does have value, but only when it is fully understood.

Know the data

The key to meeting these three challenges is metadata.

The first task is to create, or complete, the metadata essential to assessing data stored in warehouses, whether on paper, tape or microfiche. Even if it is only a simple index, the metadata must be accurate and complete. More robust metadata translates into quicker and more accurate audits.

Define essential metadata

Essential metadata is defined during project planning. If essential metadata is not captured, the usability and value of the data suffer. For example, a seismic survey without associated spatial data is of little use.
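A minimal sketch of how such a check might look in practice, assuming the essential attributes agreed at project planning are held as a simple list. The field names used here (such as "survey_name" and "spatial_reference") are illustrative only and do not come from any CGG schema.

```python
# Illustrative essential-metadata check. Attribute names are assumptions
# for the example, not a CGG or PleXus data model.
REQUIRED_ATTRIBUTES = ["survey_name", "spatial_reference", "acquisition_date", "media_id"]

def missing_essential(record: dict) -> list[str]:
    """Return the essential attributes that are absent or empty in a metadata record."""
    return [attr for attr in REQUIRED_ATTRIBUTES
            if not str(record.get(attr, "")).strip()]

# Example: a seismic survey record with no spatial reference is flagged.
record = {"survey_name": "LEGACY-2D-0042", "acquisition_date": "1987-06-01", "media_id": "T-1093"}
print(missing_essential(record))   # ['spatial_reference']
```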

Begin clean-up

Once metadata is captured, a data review can commence and redundant data can be identified and destroyed as required. This may include duplicates, old data that changed ownership through M&A activity, or copies of data held elsewhere in updated formats. After this first clean-up stage, the rest of the process can be completed based on client requirements.
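One common way to flag byte-identical duplicates ahead of a review is to group files by content hash. The sketch below shows the generic technique; the root path and chunk size are arbitrary, and this is not a description of CGG's internal process.

```python
# Group files by SHA-256 content hash; any group with more than one entry
# is a set of byte-identical duplicates for review.
import hashlib
from collections import defaultdict
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def find_duplicates(root: Path) -> dict[str, list[Path]]:
    """Group files under `root` by content hash; groups with >1 entry are duplicates."""
    groups: dict[str, list[Path]] = defaultdict(list)
    for path in root.rglob("*"):
        if path.is_file():
            groups[sha256_of(path)].append(path)
    return {h: paths for h, paths in groups.items() if len(paths) > 1}

for digest, paths in find_duplicates(Path("/data/legacy")).items():
    print(digest[:12], [str(p) for p in paths])
```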

CGG’s approach to Data Audit, Clean Up and QC

The combination of people, process, technology and facilities makes CGG the ideal partner for Data Audit, Clean Up and QC projects. CGG combines the latest technology with the expertise of its data analysts: technology increases efficiency, and people improve the quality of the end results.

Step 1

Retrieve data from its current locations. A CGG global hub reviews and updates the available metadata catalogues. A basic audit process suffices where metadata quality is good; data with lower-quality metadata requires a complete re-index. Metadata quality is assessed against the agreed metadata requirements and whether each attribute is deemed essential.

Do not rely on tape labels to assess the quality of metadata stored on tape, particularly for older media. Transcribing tapes and automatically collecting metadata from the stored file headers is more cost-effective and efficient, and produces an accurate catalogue and assessment of tape-stored data.
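As an illustration of header-based metadata capture, the sketch below assumes the transcribed tape images contain SEG-Y files, a common format for legacy seismic data (an assumption, since the source does not name a format). It decodes the 3200-byte EBCDIC textual header into the 40 card lines that typically record survey name, line number and acquisition details.

```python
# Hedged sketch: read the SEG-Y textual header for automatic cataloguing.
import codecs

def segy_textual_header(path: str) -> list[str]:
    """Return the 40 x 80-character textual header cards of a SEG-Y file."""
    with open(path, "rb") as fh:
        raw = fh.read(3200)
    text = codecs.decode(raw, "cp037")          # EBCDIC; some files use plain ASCII instead
    return [text[i:i + 80].rstrip() for i in range(0, 3200, 80)]

# Example file name is a placeholder.
for card in segy_textual_header("line_0042.sgy")[:5]:
    print(card)
```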

Step 2

Load metadata into PleXus, CGG's new-generation data management application. PleXus stores metadata and supports data audit, clean-up and QC projects. It is flexible enough to support any data model and manage any type of data, and its intuitive user interface makes it easy to use with minimal training. Clients can view metadata and data alongside CGG data analysts throughout the project; this collaborative approach produces higher-quality outputs.
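PleXus is proprietary, so the sketch below only illustrates the kind of model-agnostic metadata store described above: each catalogued item is an entity with free-form attributes, so any data type or client data model can be represented without schema changes. None of the class or field names come from PleXus.

```python
# Illustrative flexible metadata catalogue (not the PleXus data model).
from dataclasses import dataclass, field

@dataclass
class MetadataItem:
    item_id: str                          # e.g. tape barcode or document number
    item_type: str                        # e.g. "seismic_line", "well_report"
    attributes: dict[str, str] = field(default_factory=dict)

class MetadataCatalogue:
    def __init__(self) -> None:
        self._items: dict[str, MetadataItem] = {}

    def add(self, item: MetadataItem) -> None:
        self._items[item.item_id] = item

    def find(self, **criteria: str) -> list[MetadataItem]:
        """Return items whose attributes match every supplied key/value pair."""
        return [i for i in self._items.values()
                if all(i.attributes.get(k) == v for k, v in criteria.items())]

catalogue = MetadataCatalogue()
catalogue.add(MetadataItem("T-1093", "seismic_line", {"survey": "LEGACY-2D", "year": "1987"}))
print(catalogue.find(survey="LEGACY-2D"))
```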

Step 3

Scan, digitize and upload documents and other physical data into CGG Managed Cloud Storage (an optional process based on client requirements).

Load the digitized copies into CGG Managed Cloud Storage, built on Microsoft Azure with high levels of physical and cyber security. The storage location is agreed with the client to comply with regulatory requirements and internal data protocols. At the end of the project, data can be transferred to a client-selected location, copied to tape, or stored long term in CGG Managed Cloud Storage.
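For orientation only, this is what an upload to Azure Blob Storage can look like with the azure-storage-blob SDK. The connection string, container name and blob path are placeholders; the actual CGG Managed Cloud Storage configuration, access controls and naming scheme are defined per project.

```python
# Sketch of a digitized-document upload to Azure Blob Storage (placeholder names).
import os
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string(os.environ["AZURE_STORAGE_CONNECTION_STRING"])
container = service.get_container_client("digitized-legacy-data")   # placeholder container

def upload_scan(local_path: str, blob_name: str) -> None:
    """Upload one scanned document, overwriting any previous copy with the same name."""
    with open(local_path, "rb") as data:
        container.upload_blob(name=blob_name, data=data, overwrite=True)

upload_scan("scans/well_report_0042.pdf", "well-reports/0042.pdf")
```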

Step 4

Complete the data clean-up and QC based on the client's data governance standards and an agreed timetable. Data is released for client use once the QC process is complete. A typical clean-up process standardizes naming conventions and formats, and addresses missing metadata attributes such as navigation (NAV) data.
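As a small example of one such clean-up rule, the sketch below standardizes legacy item names. The convention shown (upper case, single hyphens, zero-padded numeric suffix) is an assumption chosen for the example, not a CGG standard; the real convention comes from the client's data governance standards.

```python
# Illustrative naming-convention clean-up rule (assumed convention).
import re

def standardize_name(raw: str) -> str:
    """Normalize a legacy item name, e.g. ' legacy 2d line 42 ' -> 'LEGACY-2D-LINE-0042'."""
    name = re.sub(r"[\s_]+", "-", raw.strip()).upper()
    # Zero-pad a trailing numeric identifier to four digits, if present.
    return re.sub(r"(\d+)$", lambda m: m.group(1).zfill(4), name)

print(standardize_name(" legacy 2d line 42 "))   # LEGACY-2D-LINE-0042
```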

The Business Case

Beyond discovering what data is stored, the sorting process also makes the best possible quality of data available to the end user. CGG’s approach to Audit, Clean-up & QC of legacy data reduces storage costs and releases the hidden value of stored data.

Overall, E&P efforts are enhanced by reducing time searching for missing data, lowering spend on new data, and improving planning and investment decisions.