626075 Rev. B
June 1998
Copyright 1998 Landmark Graphics Corporation All Rights Reserved Worldwide This publication has been provided pursuant to an agreement containing restrictions on its use. The publication is also protected by Federal copyright law. No part of this publication may be copied or distributed, transmitted, transcribed, stored in a retrieval system, or translated into any human or computer language, in any form or by any means, electronic, magnetic, manual, or otherwise, or disclosed to third parties without the express written permission of: Landmark Graphics Corporation 15150 Memorial Drive, Houston, TX 77079, U.S.A. Phone: 713-560-1000 FAX: 713-560-1410
Trademark Notices Landmark, OpenWorks, SeisWorks, ZAP!, PetroWorks, and StratWorks are registered trademarks of Landmark Graphics Corporation. Pointing Dispatcher, Log Edit, Fast Track, SynTool, Contouring Assistant, TDQ, RAVE, 3DVI, SurfCube, SeisCube, VoxCube, Z-MAP Plus, ProMAX, ProMAX Prospector, ProMAX VSP, MicroMAX, DepthTeam and Landmark Geo-dataWorks are trademarks of Landmark Graphics Corporation. Technology for Teams is a service mark of Landmark Graphics Corporation. ORACLE is a registered trademark of Oracle Corporation. IBM is a registered trademark of International Business Machines, Inc. AIMS is a trademark of GX Technology. Motif, OSF, and OSF/Motif are trademarks of Open Software Corporation. UNIX is a registered trademark of UNIX System Laboratories, Inc. SPARC, SPARCstation, Sun, SunOs and NFS are trademarks of SUN Microsystems. X Window System is a trademark of the Massachusetts Institute of Technology. SGI is a trademark of Silicon Graphics Incorporated. All other brand or product names are trademarks or registered trademarks of their respective companies or organizations.
Note The information contained in this document is subject to change without notice and should not be construed as a commitment by Landmark Graphics Corporation. Landmark Graphics Corporation assumes no responsibility for any error that may appear in this manual. Some states or jurisdictions do not allow disclaimer of expressed or implied warranties in certain transactions; therefore, this statement may not apply to you.
Agenda . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
Day 1 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
Introductions, Course Outline, and Miscellaneous Topics . . . . . . . . . . . . . . . . 1
ProMAX 2D Geometry - Manual . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
ProMAX 2D Geometry - Full Extraction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
ProMAX 2D Geometry - Extraction with Editing . . . . . . . . . . . . . . . . . . . . . . 1
Trace Editing using Trace Statistics and DBTools . . . . . . . . . . . . . . . . . . . . . . 1
System Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
Day 2 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
Parameter Selection and Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
Elevation Static Corrections . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
Brute Stack . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
Neural Network First Break Picking . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
Refraction Static Corrections . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
Stack Comparisons . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
Velocity Analysis and the Volume Viewer . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
Day 3 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
Residual Statics Corrections . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
Dip Moveout (DMO) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
PostStack Signal Enhancement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
Velocity: QC, Editing, Modeling . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
PostStack Migration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
Additional Topics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
Landmark
Contents
View Database Attributes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-29
Load Geometry to the Trace Headers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-33
Graphical Geometry QC . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-35
QC your Geometry Assignment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-36
Chapter Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-2
Picking a Time Window for Statistical Analysis . . . . . . . . . . . . . . . . . . . . . . 4-3
Running the Trace Statistics Process . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-4
Displaying the Statistics using DBTools . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-5
Selecting the Data of Interest Graphically . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-9
Focusing on a Range of Data on the Histogram . . . . . . . . . . . . . . . . . . . . . . 4-12
Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-14
Stack Comparisons . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11-1
Chapter Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12-2
Velocity Analysis Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12-3
Velocity Analysis Precompute . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12-4
Precompute Velocity Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12-5
Velocity Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12-9
Velocity Analysis Icons . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12-12
Using the Volume Viewer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12-13
Velocity Analysis PD Tool . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12-16
Chapter Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13-2
Autostatics Flowchart . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13-3
Data Preparation for Input to Residual Statics . . . . . . . . . . . . . . . . . . . . . . . 13-4
Data preparation and horizon picking for residual statics . . . . . . . . . . . . . . . . 13-4
External Model Autostatics Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13-17
External Model Autostatics Flowchart . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13-18
Create Eigen Stack . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13-19
DMO . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14-17
Apply DMO to the data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14-18
Final Stack . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14-20
PostStack Migration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17-1
Chapter Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17-2
PostStack Migration Processes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17-3
Tapering . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17-4
Poststack Migration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17-5
Apply FK Migration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17-5
Apply Phase Shift Migration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17-8
Apply FD Migration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17-10
Compare Migrations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17-11
Preface
About The Manual

This manual is intended to accompany the instruction given during the standard ProMAX 2D course. Because of the power and flexibility of ProMAX, it is unreasonable to attempt to cover all possible features and applications in this manual. Instead, we try to provide key examples and descriptions, using exercises directed toward common uses of the system. For more advanced training, please take the Advanced 2D course.

The manual is designed to be flexible for both you and the trainer. Trainers can choose which topics to present, and in what order, to best meet your needs. You will find it easy to use the manual as a reference document, identifying a topic of interest and moving directly into the associated exercise or reference. You are encouraged to copy the exercise workflows and optimize them for your own situation.

How To Use The Manual

This manual is divided into chapters that discuss the key aspects of the ProMAX system. In general, chapters conform to the following outline:

Introduction: A brief discussion of the important points of the topic and the exercise(s) it contains.

Topics Covered and Chapter Objectives: A brief list of skills or processes, in the order they are covered in the exercise.

Topic Description: More detail about the individual skills or processes covered in the chapter.

Exercise: Details pertaining to each skill in an exercise, along with diagrams and explanations. Examples and diagrams will assist you during the course by minimizing note-taking requirements and providing guidance through specific exercises.

Chapter Summary: A brief list of the skills the chapter was designed to teach.
This format allows you to glance at the topic description to either quickly reference an implementation, or simply as a means of refreshing your memory on a previously covered topic. If you need more information, see the Exercise sections of each topic.
Conventions
Mouse Button Help

This manual does not refer to using mouse buttons unless they are specific to an operation; MB1 is used for most selections. The mouse buttons are numbered from left to right:

MB1 refers to an operation using the left mouse button.
MB2 is the middle mouse button.
MB3 is the right mouse button.

Actions that can be applied to any mouse button include:

Click: Briefly depress the mouse button.
Double Click: Quickly depress the mouse button twice.
Shift-Click: Hold the Shift key while depressing the mouse button.
Drag: Hold down the mouse button while moving the mouse.
Mouse buttons will not work properly if either Caps Lock or Num Lock is on.
Exercise Organization

Each exercise consists of a series of steps that build a flow, help with parameter selection, execute the flow, and analyze the results. Many of the steps give a detailed explanation of how to correctly pick parameters or use the functionality of interactive processes. The flow examples list key parameters for each process of the exercise. As you progress through the exercises, familiar parameters will not always be listed in the flow example.

The exercises are organized so that your dataset is used throughout the training session. Carefully follow the instructor's directions when assigning geometry and checking the results of your flow. An improperly generated dataset or database may cause a subsequent exercise to fail.
Manual Organization

The manual will take you through a typical workflow of a geoscientist processing a land 2D seismic dataset. The processing functions of ProMAX will be introduced and discussed as they appear in the workflow.
Processing WorkFlow
1. Geometry Assignment
2. Trace Editing
3. Parameter Selection
4a. Elevation Statics
4b. Refraction Statics
5. Brute Stack
6. Velocity Analysis
7. Residual Statics
8. Dip Moveout (DMO)
9. PostStack Signal Enhancement
10. PostStack Migration

[The workflow diagram also shows Field Data as the input, with Pick First Breaks and Velocity Modeling as supporting steps.]
Agenda
Day 1
Introductions, Course Outline, and Miscellaneous Topics

ProMAX 2D Geometry - Manual
Input Data into the Spreadsheet
CDP Binning
Loading Geometry to Trace Headers
QC Database Attributes
System Overview
Directory Structure
Program Execution
Ordered Parameter Files
Parameter Tables
Disk Datasets
Tape Datasets
Day 2
Parameter Selection and Analysis
Parameter Table Picking
Parameter Test
IF/ENDIF Conditional Processing
F-K Analysis and Filtering
F-K Filtering Comparisons
Interactive Spectral Analysis (ISA)
Brute Stack
RMS Velocity Field ASCII Import
Brute Stack with Elevation Statics
Stack Comparisons
Compare Stacks
Day 3
Residual Statics Corrections
Data Preparation for Input to Residual Statics
Calculation of Residual Statics
QC and Application of Residual Statics
External Model Autostatics
PostStack Migration
Poststack Migration Processes
Tapering
Poststack Migration
Additional Topics
Chapter 1
Chapter Objectives
[Workflow diagram: step 1, Geometry Assignment, highlighted, with Field Data as input]
We are at step one, Geometry Assignment, of our processing workflow. Geometry is probably the longest and most difficult subject in the manual, as it is in a normal processing sequence. If we get the geometry correct, we are well on our way to having the best possible seismic data for the interpreter.

Upon completion of this chapter you should:

Understand what the Ordered Parameter Files (OPFs) represent
Edit the OPFs via the Geometry Spreadsheet
View Trace Header values for Geometry Attributes
Import Observer Data into the Geometry Spreadsheet
QC and Edit Geometry via DBTools and XDB
Understand ProMAX Sign Conventions
Understand what a Pattern Represents
Understand the steps of Binning
Graphically QC Geometry with Farr Displays
[Flowchart: the full set of geometry assignment paths, involving Manual Input, SEG-? Input, Seismic Data (ProMAX), Extract Database Files, Inline Geom Header Load, and the Geometry Spreadsheet]
Geometry assignment path for this exercise

ProMAX geometry assignment is designed to be both flexible and robust. The previous map, however, displays the complicated price we pay for that flexibility. The following map shows the simplified path that we will use for geometry assignment in this exercise.

[Flowchart: simplified geometry assignment path, starting from SEG-Y Input]
Land Geometry
The 2D Land Geometry Spreadsheet is used to assign the geometry. The spreadsheet is an editor used to input or modify geometry information residing in the ProMAX database. While you can manually key in data, the spreadsheet also has options to import geometry information, such as source and receiver coordinates, from ASCII files. If the input seismic data has pertinent geometry information in the trace headers, you can extract this information using the process Extract Database Files before working with the spreadsheet.
SEG-Y Input
Type of storage to use:                        Disk Image
Enter DISK file path name:                     /misc_files/2d/segy_0_value_headers
----Default the remaining parameters----

Trace Display

Number of ENSEMBLES (line segments)/screen:    2
----Default the remaining parameters----

4. Execute the flow. Use the Next Ensemble icon to move through all 20 shots for this line. Notice how the shot rolls onto the spread and that there is a discontinuity between channels 60 and 61.
[Spread diagrams: 120-channel spread with channel 1 station marked, a gap between channels 60 and 61, and the shot at location 392.5]

Acquisition parameters:
20 Sources
120 Channels
55 ft Receiver Interval
220 ft Source Interval
2 Second Record Length
4 ms Sample Rate
Dynamite Source
Observers Report
Group Int. = 55    Shot Int. = 220    Sample Int. = 4 ms    # of Chan = 120    First Live Station = 387

Shot Loc.  File no.  Depth  Offset  Uphole Time  Chan 1  Chan 60  Chan 61  Chan 120
388.5      2         93     0       22           387     446      449      508
392.5      3         93     0       20           387     446      449      508
396.5      4         93     50      20           387     446      449      508
400.5      5         93     0       23           387     446      449      508
404.5      6         93     15      18           387     446      449      508
408.5      7         93     0       24           387     446      449      508
412.5      8         93     0       20           387     446      449      508
416.5      9         93     0       19           387     446      449      508
420.5      10        93     0       17           387     446      449      508
424.5      11        93     0       20           387     446      449      508
428.5      12        93     0       22           387     446      449      508
431.5      13        93     0       19           387     446      449      508
436.5      14        93     0       19           387     446      449      508
440.5      15        93     0       20           387     446      449      508
444.5      16        93     0       21           387     446      449      508
448.5      17        93     0       23           388     447      450      509
452.5      18        93     0       22           392     451      454      513
456.5      19        93     0       20           396     455      458      517
458.5      22        93     0       20           398     457      460      519
464.5      23        93     0       20           404     463      466      525
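As a quick sanity check on the acquisition parameters above, the nominal CDP spacing and fold can be computed from the receiver and source intervals. These are standard back-of-the-envelope formulas, not a ProMAX utility:

```python
# Nominal 2D line statistics from the acquisition parameters above.
# Standard textbook relations; for checking expectations only.
n_channels = 120
receiver_interval_ft = 55.0
source_interval_ft = 220.0

# CDP (midpoint) spacing is half the receiver interval.
cdp_spacing_ft = receiver_interval_ft / 2.0

# Nominal fold = (channels * receiver interval) / (2 * source interval)
nominal_fold = n_channels * receiver_interval_ft / (2.0 * source_interval_ft)

print(cdp_spacing_ft)   # 27.5
print(nominal_fold)     # 15.0
```

So for this line we should expect roughly 15-fold coverage on 27.5 ft CDPs away from the line ends.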
Load Survey information to the spreadsheet

In this exercise, you will assign geometry to the 2DTutorial dataset, Watson Rise, using the geometry spreadsheet. Two flows are required to accomplish this task. One flow will use the spreadsheet as an editor to enter data into the database. The second flow will load the geometry from the database to the trace headers.

The following spreadsheet guide is designed to help you assign geometry to the line you are processing in the class. It is by no means a complete description of all the capabilities. Please consult the Reference Manual for additional documentation.

1. Build the following flow:
3. Select Setup, and fill out the menu with information from the observer's log.
4. Select to assign midpoints by Matching pattern numbers using first live chan and station.
5. Enter the source and receiver station intervals, and leave the survey azimuth blank; it will be calculated later.
6. Enter the first and last live station numbers, and select Yes to base source station numbers on receiver station numbers. Set the source type to shot holes, and the units to feet. You may also enlarge the font.
7. Select OK when you have entered all the information.
2. Mark all rows active by clicking MB3 on any of the numbered blocks under the Mark Block column. Marked blocks will turn a different color. Station, X, and Y are required in 2D geometry. 3. Insert enough rows to accommodate all receiver stations. Notice how many rows are present in your default spreadsheet (this number will vary depending on your font). There are 139 receiver stations in this survey, so you will need to insert rows into the default spreadsheet so that there are 139 rows. Select Edit Insert, and insert the proper number of rows after the last marked block. Scroll to the bottom of the spreadsheet. If you created more than 139 blocks, mark the excess blocks by selecting block 140 with shift-MB2. This will select all blocks numbered 140 and greater. Select Edit Delete, and OK. After you are certain that you have exactly 139 rows in the spreadsheet, mark all rows active with MB3 again, so that you can easily work with the entire spreadsheet.
4. Fill in the appropriate values for the Station column. Mark the Station column by clicking MB1 on the Station column heading. From the menu bar select Edit Fill. This will bring up a popup menu. Enter 387 as the starting value and an increment of 1, then select OK. (An easier way to fill is to click MB2 on the column header, which immediately displays the fill window.)
5. Follow the same procedure to fill the X coordinate, starting at 0 and incrementing by 55, and the Y coordinate with all 0s. This is an old land line for which no XY values were recorded. We will make up some fake XYs, assuming that the line is straight, runs from West to East, and has a nominal receiver spacing of 55 ft.
6. Import the Elevation values from an ASCII file. Working with ASCII file import involves three required steps:
Open the ASCII file.
Define which numbers are in which columns.
Define which cards or rows to exclude from the import.
7. Select File Import to import ASCII elevation values. Two windows will pop up, allowing you to open an ASCII file.
In the Filter box of the File Import Selection window, enter the directory path (.../misc_files/2d/*) to your ASCII file, followed by /*, then select Filter. Select the ASCII filename and OK.
8. Click Format and enter a name, recs, for a format description containing the ASCII import column definition information. You will see a new window at this point.

Example ASCII Import Column Definition

9. In the Column Import Definition menu, click on a parameter attribute name, such as station, to define that column's information. Note that the selection turns white.

NOTE: Look at the Mouse Button help descriptions at the bottom of the ASCII text window. Note that they reflect the MB1 press-and-drag operation for column definition.
10. Highlight the columns that contain the numbers for the attribute you selected by holding down MB1 and dragging from left to right. 11. Repeat the previous two steps for elevations.
Switch to card or row exclusion mode.
12. Freeze the column definitions by clicking MB3 over the Parameter Column.
13. Click MB3 with the cursor positioned over the word Station or one of the other columnar attributes.

NOTE: Look at the Mouse Button help descriptions at the bottom of the ASCII text window. Note that they now reflect block selection and deletion options.

14. Use MB1 to select the first row to exclude and MB2 to select the last row to exclude, then press Ctrl-d. You will want to exclude title rows, blank rows, and rows with information that you do not want to import. This writes an Ignore Record for Import message on all the defined rows.
15. There are also rows at the bottom of this file containing source information that need to be ignored.
16. From the main import menu, select Filter. This checks for any cards with inappropriate information and allows you to interactively delete them.
17. From the main import menu, select Apply.
18. Select Merge existing station values with matching station data and click OK. This adds the elevations to your spreadsheet by matching the station numbers in the ASCII file with those already in the spreadsheet. The import windows will disappear.
19. Leave the Static column filled with zeros.
20. Make sure you have 139 stations defined in your receiver spreadsheet and that the information looks correct.
21. Select File Save.
22. Use the display capabilities in the spreadsheet to QC the imported elevations. Select View View All XYGraph from the menu bar. Click MB1 in the X column heading, and MB2 in the Elev column heading.
After the XYGraph displays, select Color Bar from the menu.
Notice that the X coordinate is displayed on the horizontal axis, the elevations are on the vertical axis, and the station numbers are represented by color. Activate the Notebook icon. When this icon is activated, you can select a point in the XYGraph and automatically jump to that line in the spreadsheet. Select a point in the XYGraph with MB1. This is a powerful QC tool: you can easily locate bad values in the XYGraph and then edit them in the spreadsheet. Exit the XYGraph by selecting File Exit Confirm.
23. Use the File Exit pulldown menu to save the information and exit the receiver spreadsheet.
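The station-matching merge performed in step 18 can be sketched in a few lines. The station and elevation values below are made up for illustration; they are not the course file:

```python
# Sketch of "merge by matching station" (hypothetical data, not the course file).
# Spreadsheet rows keyed by receiver station number.
spreadsheet = {387: {"x": 0.0, "y": 0.0, "elev": None},
               388: {"x": 55.0, "y": 0.0, "elev": None},
               389: {"x": 110.0, "y": 0.0, "elev": None}}

# (station, elevation) pairs parsed from an ASCII file after column definition.
ascii_rows = [(387, 1203.0), (389, 1198.5), (999, 0.0)]  # 999 has no match

for station, elev in ascii_rows:
    if station in spreadsheet:          # merge only where stations match
        spreadsheet[station]["elev"] = elev

print(spreadsheet[387]["elev"])  # 1203.0
print(spreadsheet[388]["elev"])  # None (no elevation in the file)
```

Rows in the ASCII file with no matching station (like 999 above) are simply ignored, which is why a correct Station column in the spreadsheet matters before importing.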
2. The Sources (SIN) spreadsheet appears. You must go through the same procedure as in the Receivers spreadsheet to make 20 rows in the spreadsheet, to accommodate the 20 shots in this survey.
3. Fill the Station column. Start at 388 and increment by 4. Notice that you did not input 388.5 as the observer's report states; the spreadsheet will only accept integer numbers. You will specify this half-station difference using the Skid column later. Also notice that the x, y, and z values updated. Because you told the spreadsheets that the source and receiver station numbers were linked, the Sources spreadsheet uses the x, y, and z values entered in the Receivers spreadsheet. Therefore, the source elevations are the elevations of the previous receiver location. In our case, you need to interpolate elevations between receiver locations. We will do this later from the Database tool. Finally, you can see from the Observer's Report that a few of the shot station numbers do not increment by four. Fix the station numbers for those shots in the spreadsheet now. Notice that the x, y, and z values change as you change the Station number.
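The elevation interpolation mentioned above is plain linear interpolation between the bracketing receiver stations. Here is a sketch of that calculation; the elevation values are invented for illustration, and this is not the Database tool's actual code:

```python
# Linear interpolation of a source elevation between receiver stations,
# illustrating what the Database tool will do later (sketch only).
def interp_elev(station, rec_stations, rec_elevs):
    # rec_stations must be sorted ascending.
    for (s0, e0), (s1, e1) in zip(zip(rec_stations, rec_elevs),
                                  zip(rec_stations[1:], rec_elevs[1:])):
        if s0 <= station <= s1:
            t = (station - s0) / (s1 - s0)   # fractional position in interval
            return e0 + t * (e1 - e0)
    raise ValueError("station outside receiver range")

# Hypothetical elevations; a shot skidded to station 388.5 sits halfway
# between receivers 388 and 389.
print(interp_elev(388.5, [387, 388, 389], [1200.0, 1202.0, 1206.0]))  # 1204.0
```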
4. Fill the Source column to match the Station column. Source numbers are user defined and could be set to any value. Some people prefer to use this number as a counter, filling the column starting with 1 and incrementing by 1.
5. Fill the FFID column starting at 2 and incrementing by 1. Notice from the Observer's Report that there is a gap in the FFID numbers between 19 and 22. Enter this gap in the spreadsheet.
6. Enter the offsets of 50 and 15 for the appropriate stations in the Offset column of the spreadsheet. Instead of North, South, East, and West, ProMAX uses the following sign convention:
[Diagram: offset sign convention relative to the shot (x,y), with the positive offset direction indicated]

7. Scroll the spreadsheet to the right, and fill the Skid column with 27.5. This is where you specify the inline offsets that move the shots from integer station numbers to half-station numbers. ProMAX uses the following sign convention:

[Diagram: skid sign convention]
8. Import the Uphole time and Hole Depth information from the ASCII file using the same procedure as described for the Receivers spreadsheet.
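The 27.5 ft skid in step 7 is what makes the integer station numbers agree with the half-station shot locations in the observer's report. A sketch of that arithmetic (assuming the skid is measured in feet along the line, with the 55 ft receiver interval):

```python
# Effective (fractional) source station from an integer station plus an
# inline skid. Assumes the skid is in feet along the line (illustration only).
receiver_interval_ft = 55.0

def effective_station(station, skid_ft):
    return station + skid_ft / receiver_interval_ft

print(effective_station(388, 27.5))  # 388.5, matching the observer's report
```

A 27.5 ft skid is exactly half a 55 ft station interval, which is why every shot lands on a half-station.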
Patterns spreadsheet

At this point, leave the Sources spreadsheet and fill in the Patterns spreadsheet. After filling out the pattern, you will finish the remainder of the Sources spreadsheet. There are two methods of defining patterns. If the shot gap stays in a constant location, use the Static Gap Method. This method is only available if you chose to assign midpoints by matching pattern numbers using first live chan and station in the Setup menu. If your shot gap changes locations, use the Dynamic Gap Method. This method is available if you chose either to assign midpoints by matching pattern numbers using first live chan and station, or by matching pattern numbers using pattern station shift.

Static Gap Method:
In this method, the gap size and location are specified in the Patterns spreadsheet. In the Sources and Receivers spreadsheets, each shot or receiver used one row of the spreadsheet. In the Patterns spreadsheet, one pattern can use as many rows of the spreadsheet as necessary.
Patterns Spreadsheet

Pat   Min Chan   Max/Gap Chan   Chan Inc   Min Rcvr   Max/Gap Rcvr   Rcvr Inc
1     1          120            1          387        506            1

Dynamic Gap Method:

In this method, you specify the first and last channels and stations in the Patterns spreadsheet. The shot gap size and location are specified in the Sources spreadsheet.

1. Select Patterns from the main spreadsheet window. You will now define your cable configuration, that is, the relationship of channels to receiver locations. When you enter the Patterns spreadsheet for the first time, a window will appear asking you to enter some information about the number of channels.
2. Enter 120 for the maximum number of channels, select Constant number of channels/record, then OK. These values will be used for error checking when you exit the Patterns spreadsheet. If you define your pattern for more or fewer than 120 channels, the error column in the spreadsheet fills with ***** and forces you to correct the error before exiting the Patterns spreadsheet. If you need to edit the number of channels later, select Edit NChans.
3. Since our shot gap is in a constant location, fill in the Patterns spreadsheet using the Static Gap Method.
4. Select File Exit to save the information and exit the Patterns spreadsheet.
5. Return to the Sources spreadsheet, and reorder the columns so that the pattern description columns are displayed next to the Station column. With the default column order, you cannot see the Station column after scrolling the spreadsheet to the right. To change the displayed order of the columns, select Setup Order from the menu bar. Follow the mouse button help, and click MB1 in the column headings for Station, Pattern, Num Chn, Shot Fold, 1st Live Sta, 1st Live Chn, Gap Chan Dlt, Gap Size Dlt, and Static.
Finish the selection by clicking MB2 in the column heading for Static. The columns you selected will now move to the left of the spreadsheet as pictured below.
6. Fill in the Pattern column with ones. This tells the Sources spreadsheet to use pattern number 1 from the Patterns spreadsheet. Recall that you only defined one pattern for this survey.
7. Fill the Num Chn column with 120. This specifies that there are 120 channels for each shot on this survey.
8. You cannot edit the Shot Fold* column. This column will be calculated and filled when you assign midpoints later in the exercise.
9. Fill the 1st Live Sta column with information from the Observer's Report. Notice that the first live station for this survey is 387 for all but the last five shots.
10. Fill the 1st Live Chn column with ones. This specifies that the first live channel for each shot is 1.
11. Leave the Gap Chan Dlt column blank, and leave the Gap Size Dlt column filled with zeros. The information entered in these two columns depends on which method you chose for entering the pattern in the Patterns spreadsheet. Since you chose the Static Gap method, you have already specified the shot gap's size and location in the Patterns spreadsheet, and do not need to specify it here. If you had chosen the Dynamic Gap method, you would enter the shot gap's location in Gap Chan Dlt and the shot gap's size in Gap Size Dlt.
12. Leave the Static column filled with zeros. If the information were available, you could enter any previously calculated datum static values in this column.
13. Select File Save.
14. Display a basemap of both the shots and receivers, and measure the station azimuth. Select View View All Basemap
Notice that the receivers are displayed as a plus (+) sign, and the shots are displayed as an asterisk (*). Also notice the two offset shots. To get a better view of the shots, select Display Sources Control Points White. Now select the Cross Domain icon to allow you to measure the station azimuth. Press MB3 (notice the mouse button help) near the first shot on the line, and drag the mouse to the end of the line. While still holding down MB3, note the azimuth (Azi) readout in the mouse button help. For this line, the azimuth should be 90 degrees. Select File Exit Confirm in the XYGraph display.
15. From the main Land Geometry window, select Setup, and enter 90 for the Nominal Survey Azimuth. Select OK to save the information and close the window.
16. Make sure that you have only 20 rows in the Sources spreadsheet.
17. Exit the Sources spreadsheet by selecting File Exit.
TraceQC spreadsheet
1. The information in the TraceQC spreadsheet will be calculated by the binning process. You cannot edit this information.
Binning
1. Select Bin from the main window. There are three steps to be completed in order:
• Assign Midpoints
• One of the several Binning options
• Finalize Database
2. Select Assign midpoints by: Matching pattern numbers using first live chan and station, and then select OK. In this case the Assignment step performs the following calculations:
• Computes the SIN and SRF for each trace and populates the TRC OPF.
• Computes the shot-to-receiver offset (distance).
• Computes the midpoint coordinate between the shot and receiver.
• Computes the shot-to-receiver azimuth.
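The quantities listed above can be sketched in a few lines. This is a generic 2D geometry calculation with simple Cartesian coordinates, not the ProMAX code; the function name is illustrative.

```python
import math

def assign_midpoint(sx, sy, rx, ry):
    """Per-trace geometry, as computed by the midpoint-assignment step:
    shot-to-receiver offset, midpoint coordinate, and azimuth
    (degrees clockwise from north)."""
    dx, dy = rx - sx, ry - sy
    offset = math.hypot(dx, dy)
    midpoint = ((sx + rx) / 2.0, (sy + ry) / 2.0)
    azimuth = math.degrees(math.atan2(dx, dy)) % 360.0
    return offset, midpoint, azimuth

# A receiver due east of the shot reproduces the 90-degree azimuth
# measured on the basemap for this line (coordinates are made up).
off, mid, azi = assign_midpoint(0.0, 0.0, 220.0, 0.0)
```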
An Assignment Warning window will pop up warning that some or all of the data in the Trace spreadsheet will be overwritten. Click Proceed.
A number of progress windows will flash on the screen as this step runs. A final Status window appears, notifying you that you Successfully completed geometry assignment. Click OK. If this step fails, you have an error somewhere in your spreadsheets. Not much help is given to you, but the problems are usually related to the spread and/or pattern definitions.
3. Choose Binning with a method of Add source and receiver stations, user defined OFB parameters. Fill in the parameters at the bottom of the window, and select OK.
This step calculates CDP numbers for each trace by adding source and receiver station numbers. The first CDP will be 775 (387 + 388); the last CDP will be 989 (464 + 525). This step also creates the OFB ordered parameter file.
4. Select OK in the final status window when successfully completed.
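The station-sum numbering above is easy to verify; this is illustrative code, not the ProMAX implementation.

```python
def cdp_number(sou_sloc, srf_sloc):
    """CDP number formed by adding source and receiver surface-station
    numbers, as the 'Add source and receiver stations' method does."""
    return sou_sloc + srf_sloc

# First and last CDPs quoted in the text:
first_cdp = cdp_number(387, 388)   # 775
last_cdp = cdp_number(464, 525)    # 989
```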
Click Cancel in the Land 2D Binning window to exit the binning window.
5. Open the Receivers spreadsheet.
6. The binning step filled in the data in the Traces spreadsheet. You can QC this information from a basemap. From the Receivers spreadsheet, select View → View All → Basemap.
7. Highlight the Cross Domain icon. Click and hold MB1 near a source location to see which receivers contributed to that shot. Drag your mouse to the end of the line to see the receiver range change. Click and hold MB2 near a receiver location to see which shots contributed to that receiver.
8. Select File → Exit → Confirm to exit the basemap display.
9. Select File → Exit from the Receivers spreadsheet.
10. Select File → Exit from the main spreadsheet window.
[Diagram: Ordered Parameter File orders, including TRC (Trace) and PAT (Pattern)]
To graphically QC and edit the database, select Database → XDB Database Display.
2. From the XDB window select Database → Get.
3. Project SRF elevations into SIN. By projecting the SRF elevations into the SIN elevations, you will correct for the skid of the elevation being on the half station. For example, compare the land geometry database for receiver and shot elevations at station number 428. You see that they both read an elevation of 842 feet. Looking at the elevation for station number 429, however, you see an elevation of 845.3. From the observer notes and geometry assignment you remember that the shot is actually at station location 428.5, and therefore at an elevation around 843.6. To fix the source elevations, go to the attribute selection window and click on the SIN order, then GEOMETRY ELEV. After this is displayed, click on the SRF order, then GEOMETRY ELEV.
In the popup window, type in ELEV for the new attribute name, then click on OK. Your new attribute will be plotted. Notice how station 428 has been corrected.
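The half-station correction above is just linear interpolation of the receiver elevation profile at the skidded source location. The following sketch (a hypothetical helper, not a ProMAX routine) reproduces the numbers from the text.

```python
def project_elevation(sta, stations, elevs):
    """Linearly interpolate a receiver-elevation profile at a (possibly
    half-station) source location -- the effect of projecting SRF
    elevations into the SIN order. Assumes ascending station numbers."""
    for i in range(len(stations) - 1):
        s0, s1 = stations[i], stations[i + 1]
        if s0 <= sta <= s1:
            w = (sta - s0) / (s1 - s0)
            return elevs[i] + w * (elevs[i + 1] - elevs[i])
    raise ValueError("station outside profile")

# The example from the text: shot skidded to station 428.5, bracketed
# by receiver elevations 842.0 (station 428) and 845.3 (station 429).
elev = project_elevation(428.5, [428, 429], [842.0, 845.3])
```

The interpolated value, 843.65, matches the "around 843.6" quoted in the text.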
4. To save this new attribute, select Database → Save. In the popup list, click on the name of the new attribute, SIN:GEOMETRY:ELEV. Select OK from the overwrite warning and from the acknowledgment window, then Exit the Database tool.
5. You can verify that the source elevation was corrected by going back into the Sources spreadsheet.
1-32 ProMAX 2D Seismic Processing and Analysis Landmark
6. There are several useful QC plots that can be made from the DBTools or from the XDB Database Display. Some examples are listed below.
XDB CDP: GEOMETRY: FOLD (DBTools: double click on FOLD from CDP tab)
Used to check offset distribution in CDPs for velocity analysis placement and DMO binning.
SEG-Y Input
Type of storage to use: Disk Image
Enter DISK file path name: /misc_files/2d/segy_0_value_headers
----Default the rest of the parameters----
6. Edit your flow 1.1-View Shots to check the trace headers of your dataset.
Trace Display
Number of ENSEMBLES (line segments)/screen: 2
Do you want to use variable trace spacing? Yes
----Default the remaining parameters----
7. Change the sort order as shown in the flow.
8. In the trace display, use variable trace spacing to highlight the source gap in the shots.
9. While viewing the data in Trace Display, use the dx/dt icon to measure the first break velocity of a few shots. Write down this value, as it will be used later in the Graphical Geometry QC section.
Graphical Geometry QC
Graphical Geometry QC is a macro designed to quickly find mistakes in your geometry assignment. The process applies linear moveout to shots and splices multiple shots together in a vertical fashion based on receiver surface station. This display is often referred to as a Farr display.
[Figure: a raw shot record, the shot after LMO, and the combined Farr display]
Mistakes in geometry assignment show up as obvious anomalies, such as the last panel in the Farr display. In other cases, you may find that your first break data is far from being flat, with your onset of energy coming in much later with longer offsets. Another indicator is when all first breaks tend to line up at 100 ms, but for one shot they line up at 200 ms. Check the geometry in these areas.
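The linear moveout underlying the Farr display is a simple offset-proportional time shift; a minimal sketch, with an assumed first-break velocity purely for illustration:

```python
def lmo_shift_ms(offset_ft, velocity_ft_per_s):
    """Linear-moveout time shift (in ms) applied to a trace so that
    refracted first breaks with the given apparent velocity flatten
    out in the Farr display."""
    return 1000.0 * offset_ft / velocity_ft_per_s

# With an assumed 10,000 ft/s first-break velocity, a 5,000 ft offset
# trace is shifted up by 500 ms, lining its first break up with the
# near traces. A shot whose breaks do not flatten under the correct
# velocity is a geometry suspect.
shift = lmo_shift_ms(5000.0, 10000.0)
```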
The spikes will bias the entire screen scaling scalar and cause many of the traces to appear to have zero amplitude.
7. Execute the flow using MB2. This process uses Screen Display, instead of Trace Display, to display your data. When you execute with MB2, the data is automatically displayed. Use the Header tool icon to check that SRF_SLOC trace header values are vertically constant. Note what shot you are on. Look for anomalies, such as a back spread shifted 50-100 ms higher than a front spread, or severely undercorrected or overcorrected shots. Also, any reversed traces should remain at a constant surface location.
NOTE: If you find any mistakes you must go back to the spreadsheets and correct them. Then you will need to rebin. Finally, to get the proper trace headers loaded, you need to rerun the inline header load flow.
Chapter Summary
Upon completion of this chapter you should be able to answer the following questions:
• Do you understand what the Ordered Parameter Files represent?
• Can you edit the OPFs via the Geometry Spreadsheet?
• Can you view Trace Header values for Geometry Attributes?
• Can you import Observer Data into the Geometry Spreadsheet?
• Can you QC and edit Geometry via DBTools and XDB?
• Do you understand ProMAX Sign Conventions?
• Do you understand what a Pattern Represents?
• Do you understand the steps of Binning?
• Can you graphically QC Geometry with Farr Displays?
Chapter 2
Chapter Objectives
1. Geometry Assignment
Field Data
This is an alternative method of completing step one, Geometry Assignment, of our processing workflow. For reprocessing data this method can be very fast and efficient. Upon completion of this chapter you should:
• Understand how to Remap SEG-Y headers
• Create Database Files from Extraction
Full Extraction
Field Data
SEG-Y Input
1. Create a new Line. Make sure you are in your Area. Go to the Line level of the ProMAX User Interface and click on Add. Type in the line name, Database Full Extraction, and then press Enter.
SEG-Y Input
Type of storage to use: Disk Image
Enter DISK file path name: /misc_files/2d/segy2d_remap
Remap SEGY header values?: Yes
Input/override trace header entries:
sou_sloc,,4I,,181/srf_sloc,,4I,,185/
cdp_sloc,,4I,,189/cdp_x,,4I,,193/
cdp_y,,4I,,197/cdp_elev,,4I,,201/
CDP_ELEV. These are not standard SEG-Y headers, and therefore must be stored in the extended header section of the SEG-Y data. Choose the remap option to read in these values.
4. In Extract Database Files, select No for Pre-Geometry database initialization. Enter FFID for the Source index method. Select Stations for the Receiver Index Method; alternatively, Coordinates could be selected, since the SEG-Y file contains both station numbers and x,y values. Select No for Pre-Geometry initialization because you have receiver information in the input SEG-Y headers, and thus the SRF OPF directory will be properly built.
NOTE: If no receiver information exists in the input trace headers and you answer No to Pre-Geometry Initialization, the job will fail. If no receiver information exists in the input trace headers and you answer Yes to Pre-Geometry Initialization, the SRF OPF will be built anyway. You must then enter the missing information into the Receivers spreadsheet, as well as define pattern information in the Sources and Patterns spreadsheets.
5. Output a new ProMAX disk dataset.
6. Execute the flow.
7. Now confirm that the SEG-Y headers were complete by doing some QC plotting from the Database to check that the trace, receiver, shot, and CDP OPF files look proper.
8. Did you notice how the receivers (SRF GEOMETRY X_COORD) were out of order?
9. Do you believe the extracted geometry?
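As a sketch of what the remap string in the SEG-Y Input step asks for, the following reads 4-byte integers at the quoted (1-based) byte positions of a 240-byte trace header. Big-endian byte order is assumed, and this is an illustration, not the ProMAX reader itself.

```python
import struct

# Byte positions from the remap string in the flow above
# (1-based, SEG-Y convention); 4I denotes a 4-byte integer.
REMAP = {
    "sou_sloc": 181, "srf_sloc": 185, "cdp_sloc": 189,
    "cdp_x": 193, "cdp_y": 197, "cdp_elev": 201,
}

def read_remapped(header):
    """Decode the non-standard headers from the extended part of a
    240-byte SEG-Y trace header (big-endian assumed)."""
    out = {}
    for name, start in REMAP.items():
        raw = header[start - 1:start + 3]       # 4 bytes, 1-based start
        out[name] = struct.unpack(">i", raw)[0]
    return out

# Round-trip check with a dummy header carrying sou_sloc = 428.
hdr = bytearray(240)
hdr[180:184] = struct.pack(">i", 428)
vals = read_remapped(bytes(hdr))
```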
Chapter Summary
Upon completion of this chapter you should be able to answer the following questions:
• Can you Remap SEG-Y headers?
• Can you Create Database Files from Extraction?
Chapter 3
Chapter Goals
1. Geometry Assignment
Field Data
This is another alternative method of completing step one, Geometry Assignment, of our processing workflow. For reprocessing data this method can be fast and efficient. Upon completion of this chapter you should:
• Better Understand OPF/SpreadSheet operations
• Learn How to Finalize the Database
• Load Geometry to Trace Headers
Extraction + Editing
Field Data
SEG-Y Input
Inline Geom Header Load Valid Trace Numbers Overwrite Trace Headers Seismic Data (ProMAX)
The following spreadsheet guide is designed to help you assign geometry to the line you are processing in the class. It is by no means a complete description of all the capabilities. Please consult the Reference Manual for additional documentation. 1. Create a new Line. Make sure you are in your Area. Go to the Line level of the ProMAX User Interface and click on Add. Type in the line name, Database Partial Extraction, and then press Enter.
SEG-Y Input
Type of storage to use: Disk Image
Enter DISK file path name: /misc_files/2d/segy2d_remap
Remap SEGY header values?: Yes
Input/override trace header entries:
sou_sloc,,4I,,181/srf_sloc,,4I,,185/
4. In Extract Database Files, select No for Pre-Geometry database initialization. Enter FFID for the Source index method. Select Stations for the Receiver Index Method; alternatively, Coordinates could be selected, since the SEG-Y file contains both station numbers and x,y values. Select No for Pre-Geometry initialization because you have receiver information in the input SEG-Y headers, and thus the SRF OPF directory will be properly built.
NOTE: If no receiver information exists in the input trace headers and you answer No to Pre-Geometry Initialization, the job will fail. If no receiver information exists in the input trace headers and you answer Yes to Pre-Geometry Initialization, the SRF OPF will be built anyway. You must then enter the missing information into the Receivers spreadsheet, as well as define pattern information in the Sources and Patterns spreadsheets.
5. Output a new ProMAX disk dataset. This disk dataset is used in Flow 03 as input to the Inline Geom Header Load.
6. Execute the flow.
Spreadsheet completion and binning
1. Since some of the OPF files were not complete in the Database, you will need to build the following flow:
Since you used Extract Database Files, the default option in Setup is to Assign midpoints by existing index number mappings in the TRC. Reset the units to feet, leave the rest of the Setup window blank, and select OK.
4. Select Receivers from the main window. All of this information should be correct. You may notice that some of the receivers are not in sequential order. You can sort these by selecting Setup → Sort Ascending. Choose OK in the warning window that appears, and then select the Station column with MB2. This will sort the spreadsheet by ascending station number. Check for incorrect information, and select File → Exit. Choose Proceed and then OK in response to the following messages.
5. Select Sources. All of this information should be correct. Check for incorrect information, and select File → Exit.
6. The Patterns spreadsheet option should be grayed out and not functional. The Patterns spreadsheet and the pattern-related columns in the Sources spreadsheet are deactivated when you select Assign midpoints by existing index number mappings in the Setup menu. If Assign midpoints by pattern number in the source and pattern spreadsheets were selected, the pattern columns in the Sources spreadsheet and the Patterns spreadsheet would have to be completed.
7. Select Bin.
NOTE: You must execute all three options available in this window. Each of these options may be time consuming in the case of 3D data, so they are separated out in this menu.
Select Assign midpoints by existing index number mappings in the TRC. Click OK, then select Proceed in the warning window.
Select only one of the three Bin midpoints options. In this case, select Using previously assigned CDP numbers, user defined OFB parameters, since our input SEG-Y trace headers included CDP numbers. Use a Binning bias of 0 and an offset bin center increment of 55. Click OK. Select OK when successfully completed.
8. Select Finalize Database, then OK. This step fills in the LIN order of the database with the final survey information. Click OK in the Status window when successfully completed. Click Cancel in the Land 2D Binning window to exit Bin.
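One plausible reading of the offset-bin parameters above — a sketch, not the exact ProMAX OFB formula — snaps each absolute offset to the nearest bin center spaced by the increment:

```python
def offset_bin_center(offset, increment=55.0, bias=0.0):
    """Assign a trace to an offset-bin center: nearest multiple of
    `increment` from `bias` (bias 0 and increment 55, as in the
    exercise). Illustrative only; the real OFB assignment may differ."""
    return bias + increment * round((abs(offset) - bias) / increment)

# A 137 ft offset falls nearest the 110 ft bin center (2 * 55).
center = offset_bin_center(137.0)
```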
9. Select File → Exit from the main Geometry window.
10. Exit the current flow. From the Flows window, access the database with the Database global command option, and check various attributes for correctness.
All traces in the dataset are described in the geometry. If there are any missing traces in the input file, the job will fail.
3. In Inline Geom Header Load, select Yes to Match by valid trace number. (There will be no Primary or Secondary headers listed.) The Inline Geom Header Load uses the valid trace number found on each trace of each ensemble to assign geometry.
4. In Disk Data Output, select Overwrite the input dataset. Overwrite allows us to process and overwrite only the trace header files (HDRs). If the existing HDR files are not large enough to accept the data to write out, you must:
• Change Process trace headers only in Disk Data Input to No.
• Change Overwrite to New in Disk Data Output.
• Name a new ProMAX disk dataset in Disk Data Output.
5. Execute the flow.
6. Create a simple display flow to check the trace headers of your dataset.
Chapter Summary
Upon completion of this chapter you should be able to answer the following questions:
• Are you confident in OPF/SpreadSheet operations?
• What does it mean to Finalize the Database?
• Can you Load Geometry to Trace Headers?
Chapter 4
Chapter Objectives
2. Trace Editing
In this chapter you learn some of the power of DBTools by using it to isolate bad traces via Trace Statistics. Upon completion of this chapter you should:
• Understand how to run Trace Statistics
• Be functional at using DBTools
• Understand how the Pointing Dispatcher communicates between processes
Trace Display
----Default all parameters----
2. Execute the flow.
3. We first need to pick a time gate that will be used by the Trace Statistics process. On the first shot, select Picking → Pick Miscellaneous Time Gate... → Trace Stats Gate by AOFFSET.
Pick the top of the gate following the first break times. Use MB3 to add a NEW LAYER for the bottom gate. Track the end of the reflection data, in this case near 2 seconds:
When you are done picking, choose File → Exit/Stop Flow. Select Yes to save edits before exiting.
Trace Statistics
Types of trace statistics to compute: select all
Use first breaks or time gates? TIME GATE
Time gate reference: Time 0
Get analysis gates from the DB? Yes
Select time gate parameter file: Trace Stats Gate
Form of statistical output: Database & Headers
>Trace Display<
2. Execute the flow. Trace Statistics is run to write the statistical attribute values to the database.
Database/Header Transfer
Direction of Transfer: Load TO header FROM database
Number of parameters: 4
First DB parameter: TRC TRCSTATS PRE_FB_A
First Header: PRE_FB_A
Second DB parameter: TRC TRCSTATS TRC_AMPL
Second Header: TRC_AMPL
Third DB parameter: TRC TRCSTATS T_SPIKES
Third Header: T_SPIKES
Fourth DB parameter: TRC TRCSTATS AMPDECAY
Fourth Header: AMPDECAY
Trace Display
----Default all parameters----
2. Use Database/Header Transfer to selectively move the values of interest to the headers. You will need to User Define the trace header words.
3. Execute this flow and wait for the display.
You should get the IDA window and the trace display window. Make sure IDA is working by using the forward and reverse arrows.
4. Add a header plot of TRC_AMPL in the Trace Display: View → Header Plot → Configure → TRC_AMPL.
5. Leave the Trace Display running, but exit from the flow menus and press Database on the User Interface. The main DBTools window will appear.
6. Generate a pre-defined Source Fold map. Use the View → Predefined → Source fold map pull-down menu.
For this example you may elect to change the background to white and then change to a monochrome color using the Options → White Background and Color → Monochrome and Color → Edit pull-downs, respectively.
7. Because of the dynamic range of the data, large amplitude spikes obscure the rest of the data. Let's attack this problem by taking the log of the trace amplitude: Edit → Attribute → Apply a Function.
8. Let's call the new attribute LOG_AMP, with an infotype of TRCSTATS. Select OK when done.
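The Apply a Function step above amounts to taking a logarithm of the amplitude attribute; a minimal sketch (the floor value is an assumption to guard against zero-amplitude dead traces, and the base-10 log is illustrative):

```python
import math

def log_amp(trc_ampl, floor=1e-12):
    """Compress dynamic range by taking the base-10 log of trace
    amplitude, as when deriving a LOG_AMP attribute; the small floor
    guards against zero or dead traces."""
    return math.log10(max(trc_ampl, floor))

# Three amplitudes spanning the dynamic range, including a dead trace:
values = [log_amp(a) for a in (1.0, 1000.0, 0.0)]
```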
9. Use the View → Summary Statistics... pull-down menu to generate a Summary Statistics plot of OFFSET, SIN, SRF, and the trace statistics TRC_AMPL, LOG_AMP, T_SPIKES, and AMP_DECAY from the TRC database. (Click MB1 on OFFSET, then CTRL MB1 on the others, and then click on OK to generate the histogram display.)
Using MB1, drag the cursor across the anomalous range of the plot. The points will turn red and all the others will turn black. Notice that a few points will also turn red on the other displays. This is the power of the summary statistics plot. This demonstrates that the high amplitude traces are distributed amongst the shots and receivers (i.e., there does not appear to be any single high amplitude shot or receiver; these are randomly placed traces).
2. SELECT these points using the Select All Highlighted pull-down. The points that were highlighted red will turn pink, indicating that they are now selected. Notice the difference between HIGHLIGHTING points and actually SELECTING them.
3. PROJECT these points to the shot domain using the Project SIN pull-down so that we know which shots contain these high amplitude traces.
Notice that some of the shots on the shot location map turned black:
These are the shots that have the high amplitude traces.
4. PD these shots to the trace display using the bow and arrow PD icon so that the display will only show you the shots that contain the high amplitude traces. This way you are only presented with a few shots to examine instead of the entire data volume of shots. You should only have three shots available to page through in the display.
5. Open a Trace Kill table using the Picking → Kill Traces... pull-down. Assign this table a name such as "Kill list from DBTools interactive" and choose CHAN as the secondary sort key for the list.
7. To check that you killed the proper traces, select the Paintbrush icon, which toggles the kills on and off.
8. Repeat the sequence, choosing different ranges of different attributes, until you are happy that you have found all of the bad traces and have added them to the list.
9. As you change the attribute of interest, add a header plot of that attribute to the Trace Display to help identify the anomalous traces.
1. Highlight the one line of the histogram that represents all of the traces except for the highest amplitudes on the TRC_AMPL plot. Notice that almost the entire plot remains red, except for a few traces that are marked in black.
2. SELECT these traces, thus excluding the extremely high amplitude traces, using the Select All Highlighted pull-down.
3. Now we need to re-focus the display on only the selected data points using the Focus On Selection pull-down.
4. You can now highlight a new range of points of interest.
5. Select these new points using the Select All Highlighted pull-down.
6. Project these points to the shot map using the Project SIN pull-down.
7. PD these shots to the Trace Display using the bow and arrow PD icon. You can always reset the range of points displayed on the histogram by using the Focus On All pull-down.
8. After all traces of interest have been selected to the edit list, Exit from the Trace Display, saving the results. Exit from the main DBTools window with the Database → Exit pull-down, and select Commit to save the LOG_AMP attribute you created to the database.
Chapter Summary
Upon completion of this chapter you should be able to answer the following questions:
• Can you run Trace Statistics?
• Are you functional at using DBTools?
• Are you comfortable with how PD communicates between processes?
Chapter 5
System Overview
In this chapter we discuss some of the behind-the-scenes program operation, as well as the basic ProMAX framework. Understanding the ProMAX framework and its relationship to the UNIX directory structure can be useful. The ability to manipulate the various components of the line database, such as ordered parameter files, from the User Interface is critical to smooth operation of the software.
Chapter Objectives
This chapter gives the processor a basic understanding of how ProMAX interacts with the operating system. Upon completion of this chapter you should:
• Understand where and how data files are stored
• Know where menus and program executables are stored
• Know how data passes through a ProMAX flow
Directory Structure
/ProMAX (or $PROMAX_HOME)
The directory structure begins at a subdirectory set by the $PROMAX_HOME environment variable. This variable defaults to /ProMAX, and is used in all the following examples. Set the $PROMAX_HOME environment variable to /my_disk/my_world/ProMAX to have your ProMAX directory tree begin below the /my_disk/my_world subdirectory. All ProMAX development tools are included within the following subdirectories: /ProMAX/sys/lib, /ProMAX/sys/obj, /ProMAX/port/src, /ProMAX/port/bin, /ProMAX/port/include and /ProMAX/port/man.
/sys
    /exe    exec.exe, super_exec.exe, *.exe called from programs
    /bin    *.exe called from the command line
/port
    /help
        /promax, /promax3d, /promaxvsp    *.lok - Frame help, *.help - ASCII help
    /menu
        /promax, /promax3d, /promaxvsp    *.menu - Processes
    /misc    *_stat_math, *.rgb - colormaps, ProMax_defaults
    /bin    Promax start-up executable
    /lib/X11/app-defaults    Application window managers
/etc
/ProMAX/sys
Software that is Operating System Specific resides in /ProMAX/sys, which is actually a symbolic link to subdirectories unique to a given hardware platform, such as: /ProMAX/rs6000 for IBM RS6000 workstations; /ProMAX/sparc for Sun Microsystems SPARCstations running SunOS; /ProMAX/solaris for Sun Microsystems SPARCstations and Cray 6400 workstations running Sun Solaris OS; /ProMAX/sgimips for Silicon Graphics Indigo workstations using the 32-bit operating system; and /ProMAX/sgimips4 for Silicon Graphics Indigo and Power Challenge workstations using the 64-bit operating system. This link facilitates a single file server containing executable programs and libraries for all machine types owned by a client.

Machine-specific executables invoked from the UNIX command line are located in /ProMAX/sys/bin. Operating System specific executables and libraries, called from ProMAX, are located under /ProMAX/sys/exe. These machine-dependent directories are named after machine type, not manufacturer, to permit accommodation of different architectures from the same vendor. Accommodating future hardware architectures will simply involve the addition of new subdirectories. Unlike menus, help and miscellaneous files, a single set of executables is capable of running all ProMAX products, provided the proper product-specific license identification number is in place.

Third-party software distributed by ProMAX will now be distributed in a subdirectory of /ProMAX/sys/exe using the company's name, thus avoiding conflicts where two vendors use identical file names. For example, SDI's CGM Viewer software would be in /ProMAX/sys/exe/sdi and Frame Technology's FrameViewer would be in /ProMAX/sys/exe/frame.
/ProMAX/port
Software that is Portable across all Platforms is grouped under a single subdirectory, /ProMAX/port. This includes menus and Processes (/ProMAX/port/menu), help files (/ProMAX/port/help), and miscellaneous files (/ProMAX/port/misc). Under the menu and help subdirectories are additional subdirectories for each ProMAX software product. For instance, under /ProMAX/port/menu, you will find subdirectories for ProMAX 2D (promax), ProMAX 3D (promax3d), and ProMAX VSP (promaxvsp). Menus for additional products are added as new subdirectories under /ProMAX/port/menu. If your system administrator is not afraid of the LISP programming language, you can have them customize the ProMAX menus and defaults. The /ProMAX/port/bin directory contains a very special file, Promax, which is the ProMAX start-up script. You may want to edit this file and personalize it to your environment. The /ProMAX/port/lib/X11/app-defaults directory contains the color attributes and window configurations for the individual applications.
/ProMAX/etc Files unique to a particular machine are located in the /ProMAX/etc subdirectory. Examples of such files are the config_file, which contains peripheral setup information for all products running on a particular machine, and the product file, which assigns unique pathnames for various products located on the machine.
/ProMAX/scratch
The scratch area defaults to /ProMAX/scratch. This location can be overridden with the environment variable PROMAX_SCRATCH_HOME. We recommend you point this to the biggest file system to which you have write permission. DMO, migrations, and the spreadsheets are heavy users of this file system. We also recommend that you periodically clean this file system.
/ProMAX/data (or $PROMAX_DATA_HOME)
The primary data partition defaults to /ProMAX/data, with new areas being added as subdirectories beneath this subdirectory. This default location is specified using the entry
    primary disk storage partition: /ProMAX/data 200
in /ProMAX/etc/config_file. This location can also be set with the environment variable $PROMAX_DATA_HOME. We also recommend that you point this to a large file system you can write to.
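The environment-variable overrides described in this section follow one simple pattern; a sketch (the helper name is illustrative, not part of ProMAX):

```python
import os

def promax_path(var, default, env=os.environ):
    """Resolve a ProMAX location: the environment variable overrides
    the built-in default, as with $PROMAX_DATA_HOME and
    $PROMAX_SCRATCH_HOME."""
    return env.get(var, default)

# With no override set, the documented defaults apply.
data_home = promax_path("PROMAX_DATA_HOME", "/ProMAX/data", env={})
scratch = promax_path("PROMAX_SCRATCH_HOME", "/ProMAX/scratch", env={})
```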
/Line
    DescName
    17968042TVEL, 31790267TGAT, 36247238TMUT          1) Parameter Tables
    12345678CIND, 12345678CMAP
    /12345678    HDR1, HDR2, TRC1, TRC2               2) Trace/Trace Headers
    /Flow1       DescName, TypeName, job.output, packet.job    3) Flows
    /OPF.SIN     OPF60_SIN.GEOMETRY.ELEV              4) Database subdirectory and a non-spanned file
    /OPF.SRF     #s0_OPF60_SRF.GEOMETRY.ELEV          Database subdirectory and a spanned file
Each region identifies a collection of files and directories which can be summarized as the Area and Line separated into four main file types: 1) Parameter Tables, 2) Trace/Trace Headers, 3) Flows, and 4) Ordered Parameter Files database.
Program Execution
User Interface ($PROMAX_HOME/sys/bin/promax)
Interaction with ProMAX is handled through the User Interface. As you categorize your data into Areas and Lines, the User Interface automatically creates the necessary UNIX subdirectories and provides an easy means of traversing this data structure. However, the primary function of the User Interface is to create, modify, and execute processing flows. A flow is a sequence of processes that you perform on seismic data. Flows are built by selecting processes from a list, and then selecting parameters for each process. A typical flow contains an input process, one or more data manipulation processes, and a display and/or output process. All information needed to execute a flow is held within a Packet File (packet.job) within each Flow subdirectory. This Packet File provides the primary means of communication between the User Interface and the Super Executive program (see the next section, Super Executive Program). In addition, the User Interface provides utility functions for:
• copying, deleting and archiving Areas, Lines, Flows, and seismic datasets
• accessing and manipulating ordered database files and parameter tables
• displaying processing histories for your flows
• providing information about currently running jobs
The User Interface is primarily mouse-driven and provides point-and-click access to these functions.
Super Executive Program (super_exec.exe) Execution of a flow is handled by the Super Executive, which is launched as a separate task by the User Interface. The Super Executive is a high level driver program that examines processes in your flow by reading packet.job and determines which executables to use. The majority of the processes are subroutines linked together to form the Executive. Since this is the processing kernel for ProMAX, many of your processing flows, although they contain several processes, are handled by a single execution of the Executive. Several of the processes are stand-alone programs. These processes cannot operate under the
control of the Executive, and handle their own data input and output by directly accessing external datasets. In these instances, the Super Executive is responsible for invoking the stand-alone programs and, if necessary, making multiple calls to the Executive in the proper sequence. The Packet File, packet.job, defines the processes and their type for execution. The Super Executive concerns itself with only two types of processes:
• Executive processes
• Stand-alone processes
Executive processes are actually subroutines operating in a pipeline, meaning they accept input data and write output data at the driver level. Stand-alone processes, however, cannot be executed within a pipeline, but rather must obtain input and/or produce output by directly accessing external datasets. The Super Executive sequentially gathers all Executive-type processes until a stand-alone is encountered. At that point, the Packet File information for the Executive processes is passed to the Executive routine (exec.exe) for processing. Once this is completed, the Super Executive invokes the stand-alone program for processing, and then another group of Executive processes or another stand-alone process. This continues until all processes in the flow have been completed.
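The grouping behavior described above — consecutive Executive processes batched into one exec.exe run, each stand-alone run on its own — can be sketched generically (the process names below are hypothetical, and this is a model of the scheduling, not ProMAX code):

```python
from itertools import groupby

def pipeline_segments(flow):
    """Split a flow (list of (name, kind) pairs, kind 'exec' or
    'standalone') into the runs the Super Executive would launch:
    consecutive Executive processes form one exec.exe pipeline,
    and each stand-alone process runs by itself."""
    return [(kind, [name for name, _ in grp])
            for kind, grp in groupby(flow, key=lambda p: p[1])]

# A toy flow with a stand-alone process in the middle:
flow = [("Disk Data Input", "exec"), ("AGC", "exec"),
        ("Stand-alone Migration", "standalone"),
        ("Disk Data Output", "exec")]
segments = pipeline_segments(flow)
```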
Executive Program (exec.exe) The Executive program is the primary processing executable for ProMAX. The majority of the processes available under ProMAX are contained in this one executable program. The Executive features a pipeline architecture that allows multiple seismic processes to operate on the data before it is displayed or written to a dataset. Special processes, known as input and output tools, handle the tasks of reading and writing the seismic data, removing this burdensome task from the individual processes. This results in processes that are easier to develop and maintain. The basic flow of data through the Executive pipeline is shown below:
5-10
Landmark
Processing Pipeline
Each individual process will not operate until it has accumulated the necessary traces. Single-trace processes will run on each trace as the traces come down the pipe. Multi-channel processes will wait until an entire ensemble is available. For example, in the example flow the FK filter will not run until one ensemble of traces has passed through the DDI and AGC. If we specify for the Trace Display to display 2 ensembles, it will not make a display until two shots have been processed through the DDI, AGC, and FK filter. No additional traces will be processed until Trace Display is instructed to release the traces that it has displayed and is holding in memory, either by clicking on the traffic light icon or by terminating its execution (but continuing the flow). Note: All the processes shown are Executive processes and thus operate in the pipeline. An intermediate dataset and an additional input tool process would be needed if a stand-alone process were included in this flow. A pipeline process must accept seismic traces from the Executive, process them, and return the processed data to the Executive.
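The single-trace versus multi-channel buffering described above can be illustrated with Python generators. This is a sketch under invented assumptions (traces as dicts tagged with a shot number, a halving function standing in for AGC, a pass-through standing in for any multi-channel tool), not the actual Executive implementation.

```python
# Illustrative pipeline sketch: a single-trace tool handles traces as they
# arrive, while an ensemble tool buffers until a whole ensemble is available.

def single_trace_tool(traces, fn):
    for trace in traces:
        yield fn(trace)                 # runs trace by trace

def ensemble_tool(traces, fn, key):
    buffer = []
    for trace in traces:
        if buffer and key(trace) != key(buffer[0]):
            yield from fn(buffer)       # full ensemble available: process it
            buffer = []
        buffer.append(trace)
    if buffer:
        yield from fn(buffer)           # flush the final ensemble

# Traces tagged with a shot number; the lambda halving amplitudes stands in
# for any single-trace tool, the pass-through for any multi-channel tool.
traces = [{"shot": 1, "amp": 2.0}, {"shot": 1, "amp": 4.0}, {"shot": 2, "amp": 6.0}]
agc = single_trace_tool(traces, lambda t: {**t, "amp": t["amp"] / 2})
fk = ensemble_tool(agc, lambda ens: ens, key=lambda t: t["shot"])
print([t["amp"] for t in fk])
```

Note how the ensemble stage emits nothing until the first trace of shot 2 arrives, which is exactly why a multi-channel tool downstream of a display can appear to "wait" on the pipe.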
5-12
Landmark
(Figure: a flow split into pipes, with CDP Stack, Bandpass Filter, and Disk Data Output shown in a new pipe.)
Disk Data Input, Tape Data Input, and stand-alone tools always start new pipes within a single flow. One pipe must complete successfully before a new pipe will start processing.
Types of Executive Processes The table below describes the four types of processes defined for use in the Executive. Table 2: ProMAX Executive Process Types
simple tools: Accepts and returns a single seismic trace.
ensemble tools: Accepts and returns a gather of seismic traces.
complex tools: Accepts and returns a variable number of seismic traces, such as a stack. This type of process actually controls the flow of seismic data.
panel tools: Accepts and returns overlapping panels of traces to accommodate a group of traces too large to fit into memory. Overlapping panels are processed and then merged along their seams.
Stand-Alone Processes and InterProcess Communication Tools
Some seismic processing tools are not well suited to a pipeline architecture. Typically, these are tools making multiple passes through the data or requiring self-directed input. These tools can be run inline in a ProMAX job flow and appear as ordinary tools, but in reality are launched as separate processes. The current version of ProMAX does not provide the ability to output datasets from a stand-alone process. InterProcess Communication tools start a new process and then communicate with the Executive via UNIX interprocess communications. InterProcess Communication tools have the singular advantage of being able to accept and output traces in an asynchronous manner.
This section discusses the following issues relating to the Ordered Parameter Files database:
Organization
Database Structure
File Naming Conventions
The Ordered Parameter Files database serves as a central repository of information that you or the various tools can rapidly access. Collectively, the ordered database files store large classes of data, including acquisition parameters, geometry, statics and other surface consistent information, and pointers between the source, receiver and CDP domains. The design of the Orders is tailored for seismic data, and provides a compact format without duplication of information. The Ordered Parameter Files database is primarily used to obtain a list of traces to process, such as traces for a shot or CDP. This list of traces is then used to locate the index to actual trace data and headers in the MAP file of the dataset. Once determined, the index is used to extract the trace and trace header data from their files.
Organization The Ordered Parameter Files contain information applying to a line and its datasets. For this reason, there can be many datasets for a single set of Ordered Database Files. Ordered Parameter Files, unique to a line, reside in the Area/Line subdirectory. The Ordered Parameter Files database stores information in structured categories, known as Orders, representing unique sets of information. In each Order, there are N slots available for storage of information, where N is the number of elements in the order, such as the number of sources, number of surface locations, or number of CDPs. Each slot contains various attributes in various formats for one
particular element of the Order. The Orders are organized as shown in the table below. Table 3: Organization of Ordered Parameter Files
LIN (Line): Contains constant line information, such as final datum, type of units, source type, total number of shots.
TRC (Trace): Contains information varying by trace, such as FB Picks, trim statics, source-receiver offsets.
SRF (Surface): Contains information varying by surface receiver location, such as surface location x,y coordinates, surface location elevations, surface location statics, number of traces received at each surface location, and receiver fold.
SIN (Source Index): Contains information varying by source point, such as source x,y coordinates, source elevations, source uphole times, nearest surface location to source, source statics.
CDP (Common Depth Point): Contains information varying by CDP location, such as CDP x,y coordinates, CDP elevation, CDP fold, nearest surface location.
CHN (Channel): Contains information varying by channel number, such as channel gain constants and channel statics.
OFB (Offset Bin): Contains information varying by offset bin number. OFB is created when certain processes are run, such as surface consistent amplitude analysis.
PAT (Pattern): Contains information describing the recording patterns.
OPF Matrices
The OPF database files can be considered matrices or flat files; they are not a relational database. Each OPF is indexed against the OPF counter, with a single value per index for each attribute. Note the relative size of the TRC OPF compared to the other OPF files: the TRC is by far the largest contributor to the size of the database on disk.
Database Structure
The ProMAX database was restructured for the 6.0 release to handle large 3D land and marine surveys. The features of the new database structure are listed below:
Each Order is contained within a subdirectory under Area and Line. For example, the TRC is in the subdirectory OPF.TRC.
There are two types of files contained in the OPF subdirectories:
Parameter: Contain attribute values. There may be any number of attribute files associated with an OPF.
Index: Holds the list of parameters and their formats. There is only one index file in each OPF subdirectory.
The exception to this is the LIN OPF. The LIN information is managed by just two files, one index and one parameter, named LIN.NDX and LIN.REC.
OPF files are of two types:
Span: These files are denoted by the prefix #s; non-span files lack this prefix. The TRC, CDP, SIN, and SRF OPF parameters are span files. The first span of 10 MB for each parameter file is always written to primary storage. Newly created spans are written to the secondary storage partitions listed in the config_file with the OPF keyword, in a round-robin fashion, until the secondary storage is full; then subsequent spans are created in primary storage. Span files may be moved to any disk partition within the secondary storage list for read purposes. Span file size is currently fixed at 10 megabytes, or approximately 2.5 million 4-byte values per span file.
Non-span: All other OPFs are non-span.
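The span-allocation policy just described can be modeled in a short sketch. This is a simplified, hypothetical model (partition names, capacities, and the `place_spans` function are invented), not the actual ProMAX allocator: the first 10 MB span goes to primary, later spans cycle round-robin through the OPF-flagged secondary partitions, and once those fill, spans fall back to primary.

```python
# Hedged sketch of span placement: span 0 on primary, subsequent spans
# round-robin over the OPF secondary partitions until full, then primary.

SPAN_MB = 10  # current fixed span size from the text

def place_spans(n_spans, secondaries, capacity_mb):
    free = {p: capacity_mb for p in secondaries}
    placement, nxt = [], 0
    for i in range(n_spans):
        if i == 0:
            placement.append("primary")      # first span always on primary
            continue
        placed = False
        for _ in range(len(secondaries)):    # round robin over secondaries
            p = secondaries[nxt % len(secondaries)]
            nxt += 1
            if free[p] >= SPAN_MB:
                free[p] -= SPAN_MB
                placement.append(p)
                placed = True
                break
        if not placed:
            placement.append("primary")      # secondaries full: back to primary
    return placement

# Two hypothetical secondary partitions, each with room for one span
print(place_spans(5, ["data2", "data3"], capacity_mb=10))
```

With each secondary holding only one 10 MB span, the third and later spans spill back to primary, mirroring the fallback behavior in the text.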
Given that each parameter is managed by its own file, it may be necessary to increase the open-file limit on some systems, specifically SUN, Solaris, and SGI. From the csh, the following command increases the limit to 255 open files: limit descriptors 255. The geometry spreadsheet is a ProMAX database editor. Modifying information within a spreadsheet editor and saving the changes will automatically update the database. There is no longer an import or
export from the geometry database to the ProMAX database files as was required prior to the 6.0 release.
Database append is allowed. Data can be added to the database via the OPF Extract tool or the geometry spreadsheet. This allows the database to be constructed incrementally as the data arrives.
There is improved network access to the database. Database I/O across the network is optimized to an NFS default packet size of 4K; all database reads and writes are in 4K pages.
Existing and restored 5.X databases are automatically converted to the 6.0 (and later) database format.
File Naming Conventions
Parameter file names consist of information type and parameter name, preceded by a prefix denoting the Order of the parameter. For example, the x coordinate for a shot in the SIN has the following name: #s0_OPF60_SIN.GEOMETRY.X_COORD, where #s0_OPF60 indicates a first span file for the parameter, _SIN denotes the Order, GEOMETRY describes the information type of the parameter, and X_COORD is the parameter name. Index file names contain the three-letter Order name. For example, the index file for the TRC is called OPF60_TRC.
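The naming convention above can be decomposed mechanically. The regular expression below is an assumption inferred from the single example given in the text, not a documented grammar, so treat it as a sketch rather than a parser for every real OPF file name.

```python
# Sketch: split an OPF parameter file name into span, Order, information
# type, and parameter name, per the convention described in the text.
import re

NAME_RE = re.compile(r"^(#s(\d+)_)?OPF60_([A-Z]{3})\.([A-Z_]+)\.([A-Z_]+)$")

def parse_opf_name(name):
    m = NAME_RE.match(name)
    if not m:
        raise ValueError("not an OPF parameter file name: " + name)
    return {
        "span": int(m.group(2)) if m.group(2) is not None else None,
        "order": m.group(3),        # e.g. SIN, TRC, CDP, SRF
        "info_type": m.group(4),    # e.g. GEOMETRY
        "parameter": m.group(5),    # e.g. X_COORD
    }

print(parse_opf_name("#s0_OPF60_SIN.GEOMETRY.X_COORD"))
```

The example name from the text parses to span 0, Order SIN, information type GEOMETRY, and parameter X_COORD; a name without the #s prefix parses as a non-span file.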
NOTE: The index file for each Order must remain in the primary storage partition. Span parameter files may be moved and distributed anywhere within primary and secondary storage.
Within each Order, there are often multiple attributes, with each attribute being given a unique name.
Parameter Tables
Parameter Tables are files used to store lists of information in a very generalized structure; examples include mute functions and decon design time gates. To increase access speed and reduce storage requirements, parameter tables are stored in binary format. They are stored in the Area/Line subdirectory along with seismic datasets, the Ordered Parameter Files database files (those not in separate directories), and Flow subdirectories. Parameter Tables are often referred to as part of the database, but they differ from the OPF database: OPF files contain attributes with a single value per database element, while parameter tables contain multiple values per element. For example, a velocity function contains multiple velocity-time pairs at one CDP.
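The one-value-per-element versus many-values-per-element distinction can be made concrete with two small data structures. These are illustrative Python structures with made-up values, not the actual binary layouts of OPF or parameter table files.

```python
# Illustrative contrast (values invented): an OPF attribute holds exactly
# one value per database element; a parameter table holds a variable-length
# list per element.

# OPF-style attribute: one static value per source index
sin_statics = {1: -4.0, 2: -3.5, 3: -5.2}

# Parameter-table-style entry: several time-velocity pairs at one CDP
velocity_table = {
    101: [(0, 1500.0), (800, 2200.0), (1600, 3000.0)],
    201: [(0, 1480.0), (900, 2350.0)],
}

# Every OPF value is a single number; each table entry is a list of pairs
assert all(isinstance(v, float) for v in sin_statics.values())
print(len(velocity_table[101]))
```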
Creating a Parameter Table
Parameter tables are typically created in three ways:
Processes store parameters to a table for later use by other processes.
Parameter tables can be imported from ASCII files that were created by other software packages or hand-edited by you.
Parameter tables can be created by hand using the Parameter Table Editor, which is opened by the Create option on the parameter table list screen.
An example is the interactive picking of time gates within the Trace Display process. After seismic data is displayed on the screen, you pull down the Picking Menu and choose the type of table to create. The end result of your work is a parameter table. If you were to pick a top mute, you would generate a parameter table ending in TMUT. If you were picking a time horizon, you would generate a table ending in THOR. These picks are stored in tabular format, where they can be edited, used
by other processes in later processing, or exported to ASCII files for use by other software packages.
WARNING: Remember, you name and store the parameter tables in their specific Area/Line subdirectory. Therefore, you can inadvertently overwrite an existing parameter table by editing a parameter table in a different processing flow.
ASCII Import to a Parameter Table
File Import reads either ASCII or EBCDIC formatted files with fixed columnar data into the spreadsheet editor. When the application is initialized, two windows appear: the main ASCII/EBCDIC File Import window and the File Import Selection dialog. After a file has been selected, it is displayed, and you can select rows. Note: Filter and Apply appear grayed out and are insensitive to mouse button actions until Format has been pressed and a columnar format selected; they then appear normally and are available for use. The steps involved in performing a file import are as follows:
1. Select File: Select a file to import. If the text file does not contain valid line terminators, use Width to set the line width and then reread the file.
2. Select Format: Select a previous format or create a new format.
3. Review or Edit Column Definitions: Review the previously defined columns in an existing format by selecting all the columns. Review the highlighted regions in the file display for accuracy. Columns can be edited either by hand-entering Start Col. and End Col. values or by performing click-and-drag column definition.
4. Save the Column Definition: Save any changes to the current column definition to disk for later retrieval.
5. Filter the File for Invalid Text: Search the marked columns and rows for any invalid text. Text may be excluded or replaced within this interactive operation.
6. Perform the Import: Select the Apply button. The application windows will close and the focus will return to the calling spreadsheet.
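The fixed-column extraction at the heart of the import can be sketched as follows. The column layout, field names, and `import_rows` helper are invented for illustration; only the idea — a saved format as a list of 1-based, inclusive (start, end) column definitions applied to each selected row — comes from the text.

```python
# Minimal sketch of fixed-column parsing as File Import performs it:
# a "format" is a list of (name, start_col, end_col), 1-based inclusive.

def import_rows(lines, columns):
    rows = []
    for line in lines:
        row = {}
        for name, start, end in columns:
            row[name] = line[start - 1:end].strip()  # 1-based inclusive cols
        rows.append(row)
    return rows

# Hypothetical three-column layout over two fixed-width records
fmt = [("FFID", 1, 4), ("X_COORD", 6, 13), ("ELEV", 16, 20)]
data = ["1001 12345.67  103.4",
        "1002 12401.20   98.7"]
print(import_rows(data, fmt))
```

A real import would add the Filter step — scanning the marked columns for text that cannot convert to the expected numeric type — before handing rows to the spreadsheet.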
ASCII File Export from the Parameter Table Editor
Export writes either ASCII or EBCDIC formatted files with fixed columnar data from a spreadsheet editor. When the application is initialized, the main ASCII File Export window will appear. After a file and format have been selected, the ASCII text is displayed and the Apply button is activated. The steps involved in performing a file export are as follows:
1. Select File: Select a file for export within the File Export Selection dialog.
2. Select Format: Select a previous format or create a new format.
3. Review or Edit Column Definitions: Review the previously defined columns in an existing format by selecting all the columns. Review the highlighted regions in the file display for accuracy. Columns can be edited either by hand-entering Start Col. and End Col. values or by performing click-and-drag column definition.
4. Save the Column Definition: Save any changes to the current column definition to disk for later retrieval.
5. Perform the Export: Select the Apply button.
6. Cancel the Export Operation: Press the Cancel button to close the export windows and return to the calling spreadsheet.
Disk Datasets
ProMAX uses a proprietary disk dataset format that is tailored for interactive processing and random disk access. Disk dataset files can span multiple filesystems, allowing for unlimited filesize datasets. A typical set of files might look like this:
/ProMAX/data/usertutorials/landexample/12345678CIND
/ProMAX/data/usertutorials/landexample/12345678CMAP
/ProMAX/data/usertutorials/landexample/12345678/TRC1
/ProMAX/data/usertutorials/landexample/12345678/HDR1
These files are described in more detail in the table below. Table 4: Composition of a Seismic Dataset
Trace (...TRCx): File containing the actual sample values for each data trace.
Trace Header (...HDRx): File containing trace header entries corresponding to the data samples for traces in the trace file. This file may vary in length, growing as new header entries are added. Trace headers are kept in a separate file so they can be sorted without needing to skip past the seismic data samples.
Map (...CMAP): File that keeps track of trace locations, even if data flows over many disks. Given a particular trace number, it finds the sequential trace number within the dataset, providing rapid access to traces during processing. The map file is a separate file because it may grow during processing; it is always held in the line directory.
Index (...CIND): File with free-form format information relating to the entire dataset, including sample interval, number of samples per trace, processing history, and names of trace header entries. This file may grow during processing, and it is also always held in the line directory.
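The map file's role in random access can be sketched with a toy model. The data structures here are invented for illustration (the real CMAP format is proprietary and undocumented in this text); the sketch shows only the idea — the map resolves an (ensemble, channel) position to a sequential trace number, which then indexes directly into the fixed-length trace file.

```python
# Toy model of random trace access via the map file: position -> sequential
# trace number -> byte offset into the trace file (4-byte samples assumed).

TRACE_LEN = 1000  # samples per trace, as recorded in the index (...CIND)

def trace_offset(map_table, ensemble, channel):
    seq = map_table[(ensemble, channel)]   # sequential trace number
    return seq * TRACE_LEN * 4             # byte offset of the trace's samples

map_table = {(1, 1): 0, (1, 2): 1, (2, 1): 2}
print(trace_offset(map_table, 2, 1))
```

Because every trace record is the same length, one map lookup plus one multiplication replaces a sequential scan, which is what makes interactive, non-sequential access cheap.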
Secondary Storage
In a default ProMAX configuration, all seismic dataset files reside on a single disk partition. The location of this disk partition is set in the $PROMAX_HOME/etc/config_file with the entry: primary disk storage partition: /ProMAX/promax/data 200. In addition to the actual trace data files, the primary storage partition will always contain your flow subdirectories, parameter tables, ordered parameter files, and various miscellaneous files. The ...CIND and ...CMAP files, which comprise an integral part of any seismic dataset, are always written to primary storage. Since the primary storage file system is of finite size, ProMAX provides the capability to have some of the disk datasets, such as the ...TRCx and ...HDRx files, and some of the ordered parameter files span multiple disk partitions. Disk partitions other than the primary disk storage partition are referred to as secondary storage. All secondary storage disk partitions must be declared in the appropriate $PROMAX_HOME/etc/config_file. Sample entries are:
secondary disk storage partition: /ProMAX/promax/data2 20 TRC OPF
secondary disk storage partition: /ProMAX/promax/data3 20 TRC
secondary disk storage partition: /ProMAX/promax/data4 20 OPF
secondary disk storage partition: /ProMAX/promax/data5 20
Refer to the ProMAX System Administration guide for a complete description of the config_file entries for primary and secondary disk storage. In these entries, 20 is the default disk file size in megabytes. This default is probably too small for modern surveys, as it was based on the old UNIX 2 GB filesystem limitation. A better value would be 4000, or as large as your dataset, or as large a file as your system will allow.
WARNING: If the Primary file system fills up, ProMAX will crash and will not be able to launch until space on Primary has been cleaned up.
Under the default configuration, the initial TRC1 and HDR1 files are written to the primary storage partition. It is possible to override this behavior by setting the appropriate parameter in Disk Data Output. If the parameter Skip primary disk partition? is set to Yes, then no TRC or HDR files will be written to the primary disk partition. This can be useful as a means of maintaining space on the primary storage partition. (To make this the default situation for all users, have your ProMAX system administrator edit the diskwrite.menu file, setting the value for Alstore to t instead of nil.) Secondary storage is used in an as-listed-and-available fashion. In an attempt to minimize data loss due to disk hardware failure, ProMAX tries to write a dataset to as few physical disks as possible. If the primary storage partition is skipped by setting the appropriate parameter in Disk Data Output, the CIND and CMAP files are still written to the primary storage partition, but the TRCx or HDRx files will not be found there.
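The placement rules just described reduce to a small decision per file. The sketch below is a simplified model (file names and the `placement` function are invented, and it ignores later spans spilling to secondary storage): CIND and CMAP always land on primary, while TRCx/HDRx honor the skip-primary setting.

```python
# Simplified model of dataset file placement under the skip-primary option:
# index and map files always go to primary; trace and header files go to
# secondary storage when the primary partition is skipped.

def placement(files, skip_primary):
    where = {}
    for f in files:
        if f.endswith(("CIND", "CMAP")):
            where[f] = "primary"       # always written to primary storage
        else:
            where[f] = "secondary" if skip_primary else "primary"
    return where

print(placement(["CIND", "CMAP", "TRC1", "HDR1"], skip_primary=True))
```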
Tape Datasets
Tape datasets are stored in a proprietary format, similar to the disk dataset format, but incorporating required structures for tape input and output. Tape input/output operates either in conjunction with a tape catalog system, or without reference to the tape catalog. The tape devices used for the Tape Data Input, Tape Data Insert, and Tape Data Output processes are declared in the ProMAX device configuration window. This allows access to tape drives anywhere on a network. The machines that the tape drives are attached to do not need to be licensed for ProMAX, but the fclient.exe program must be installed.
Tape Trace Datasets A ProMAX tape dataset is similar to a disk dataset in that the index file (...CIND) and map file (...CMAP) still reside on disk in the Line/survey database. Refer to the documentation in the Disk Datasets portion of this helpfile for a discussion of these files. Having the index and map files available on disk provides you with immediate access to information about the dataset, without needing to access any tapes. It also provides all the information necessary to access traces in a non-sequential manner. Although the index and map files still reside on disk, copies of them are also placed on tape(s), so that the tape(s) can serve as a self-contained unit(s). If the index and map files are removed from disk, or never existed, as in the case where a dataset is shipped to another site, the tapes can be read without them. However, access to datasets through the index and map files residing solely on tape must be purely sequential. Tape datasets are written by the Tape Data Output process, and can be read using the Tape Data Input or Tape Data Insert processes. These input processes include the capability to input tapes by reel, ensemble number, or trace number. Refer to the relevant helpfile for a complete description of the parameters used in these processes. The use or non-use of the tape catalog in conjunction with the tape I/O processes is determined by the tape catalog type entry in the appropriate $PROMAX_HOME/etc/config_file. Setting this variable to full activates catalog access, while an entry of none deactivates catalog access. An entry of external is used to indicate that an external tape catalog, such as the Cray Reel Librarian, will be used. You can override the setting provided in the config_file by setting the environment
variable for BYPASS_CATALOG to t, in which case the catalog will not be used. The actual tape devices to use for tape I/O must also appear as entries in the config_file, under the tape device: stanza.
Getting Started
The first step in using the ProMAX tape catalog is to create some labeled tapes. The program $PROMAX_HOME/sys/bin/tcat is used for tape labelling, catalog creation and maintenance, and for listing current catalog information. The program is run from the UNIX command line. The following steps are required to successfully access the tape catalog:
1. Label tapes.
2. Read and display tape labels.
3. Add labeled tapes to a totally new catalog. Before adding the tapes to a new catalog, it is a good idea to visually inspect the contents of the label information file for duplicate or missing entries. The contents typically look like:
0 AAAAAA 0 1 4
1 AAAAAB 0 1 4
2 AAAAAC 0 1 4
3 AAAAAD 0 1 4
4 AAAAAE 0 1 4
The fields are: volume serial number (digital form), volume serial number (character form), tape rack slot number, site number, and media type, respectively. You can manually edit these fields.
4. Write a label information file from the existing catalog.
5. Add labeled tapes (and datasets) to the existing catalog.
6. Merge an additional catalog into the existing catalog.
7. Delete a dataset from the catalog.
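The visual inspection in step 3 can be backed up with a small script. This is an illustrative sketch, not part of tcat: the `parse_labels` function is invented, but the five fields follow the order given in the text.

```python
# Sketch: parse a tcat label information file and flag duplicate volume
# serial numbers before adding tapes to a new catalog. Field order per the
# text: VSN (digital), VSN (character), rack slot, site number, media type.

def parse_labels(lines):
    tapes = []
    for line in lines:
        vsn_num, vsn_chr, slot, site, media = line.split()
        tapes.append({"vsn": int(vsn_num), "volser": vsn_chr,
                      "slot": int(slot), "site": int(site),
                      "media": int(media)})
    return tapes

labels = ["0 AAAAAA 0 1 4", "1 AAAAAB 0 1 4", "2 AAAAAC 0 1 4"]
tapes = parse_labels(labels)
# Duplicate check: any repeated character-form serial number?
dupes = len(tapes) != len({t["volser"] for t in tapes})
print(dupes)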
Chapter Summary
Upon completion of this chapter you should be able to answer the following questions:
Where is the Seismic Data stored?
Are the Trace Headers stored separate from the Data?
Where are the Ordered Parameter Files stored?
Where are the Parameter Tables stored?
Can you build a ProMAX start-up Script?
Can you personalize/change a default Menu?
Can you explain how data passes through: single trace tools, ensemble tools, interprocess communication and stand-alone tools?
Chapter 6
Chapter Objectives
3. Parameter Selection
This chapter gives the processor a framework for how to define and test parameters, gates, windows, and processing flows. Upon completion of this chapter you should:
Be proficient at Picking Gates and Windows in Trace Display
Know how to Pick Bad Traces
Understand Automatic Parameter Testing
Be able to design IF/ENDIF conditional processing
Be able to interactively test FK Filters and Spectral Analysis
Pick Parameter Tables
In this exercise you will pick a top mute, deconvolution design gate, trace kill list, and a trace reversal list.
1. Build the following flow in your Watson Rise Line.
Trace Display
Number of ENSEMBLES (line segments)/screen: -------2
Do you want to use variable trace spacing?: ----------Yes
2. In Disk Data Input, select Shots-with geometry. Sort the data by FFID and OFFSET.
3. Replace the AGC operator length default value with 1000.
4. In Trace Display, use variable trace spacing. This will use the secondary sort key of OFFSET to variably space the traces. Also, set the number of ensembles per screen to 2.
5. Execute the flow. (The resulting display is annotated with the trace to be killed, the mute, the traces to be reversed, and the decon gate.)
Parameter tables
6. If you did not save your trace kill table from chapter 4, go ahead and pick the bad traces here: Picking Kill traces... Kill list from Trace Display.
7. Pick a top mute to get rid of first break and refracted energy: Picking Pick Top Mute... FB Mute by AOFFSET. Use the Paintbrush icon to see the effects of your current picks. In this case you should see only hyperbolas after Paintbrush applies the top mute.
8. Pick a deconvolution gate on the first shot: Picking Pick Miscellaneous Time Gates... decon gate by AOFFSET. Remember to use MB3 New Layer to pick the bottom of the decon gate.
9. After projecting your windows, use Interactive Data Access to move through all the shots and QC the windows.
10. If you desire, you can pick the reverse traces: Picking Reverse traces... reverse traces by AOFFSET. In general, the reverse traces will be flagged in the field by the observer's log. The statics routines will also detect the reverse traces for you.
11. Select File Save Picks, then select File Exit/Stop Flow.
Parameter Test
The Parameter Test process provides a mechanism for automatically testing simple numeric parameters by creating multiple copies of input traces and replacing a key parameter in the process to be tested with specified test values. The output consists of copies of the input data with a different test value applied to each copy. Parameter Test creates two header words. The first is called REPEAT data copy number and is used to distinguish each of the identical copies of input data. The second is called PARMTEST and is an ASCII string, uniquely interpreted by the Screen Display processes as a label for the traces.
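The copy-and-label behavior of Parameter Test can be sketched as follows. This is an illustrative model, not the macro's implementation: trace records are simplified dicts rather than real ProMAX headers, and the `parameter_test` function and its label format are invented.

```python
# Illustrative sketch of Parameter Test: replicate the input once per test
# value, numbering copies in REPEAT and labeling them in PARMTEST.

def parameter_test(traces, param_name, values):
    out = []
    for copy_number, value in enumerate(values, start=1):
        for trace in traces:
            t = dict(trace)
            t["REPEAT"] = copy_number               # distinguishes each copy
            t["PARMTEST"] = "%s=%s" % (param_name, value)  # display label
            t[param_name] = value   # the value the tested process picks up
            out.append(t)
    return out

shots = [{"shot": 16, "chan": 1}, {"shot": 16, "chan": 2}]
copies = parameter_test(shots, "db_per_sec", [12, 9, 6])
print(len(copies), copies[0]["PARMTEST"], copies[-1]["REPEAT"])
```

Two input traces and three test values yield six output traces, one pair per dB/sec value, which is exactly what the comparison display in the next exercise relies on.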
Test True Amplitude Recovery with Parameter Test In this exercise, you will use Parameter Test to compare True Amplitude Recovery on shot gathers with different values for dB/sec.
Trace Kill/Reverse
Trace editing MODE: ---------------------------------------------Kill
Get edits from the DATABASE?: ----------------------------Yes
SELECT trace Kill parameter file: --------------------------Kill list from DBTools interactive
Trace Kill/Reverse
Trace editing MODE: --------------------------------------Reverse
Get edits from the DATABASE?: -----------------------------No
Trace selection MODE: --------------------------------EXCLUDE
PRIMARY edit list header word: ----------------------------SIN
SECONDARY edit list header word: ------------SRF_SLOC
TERTIARY edit list header word: -----------------------NONE
SPECIFY trace to be edited: ------------------------1-17:469/
Trace Muting
Re-apply previous mutes: ---------------------------------------No
Mute time reference: ----------------------------------------Time 0
TYPE of mute: ------------------------------------------------------Top
Starting ramp: ------------------------------------------------------30.
EXTRAPOLATE mute times?: ---------------------------------Yes
Get mute file from the DATABASE?: -----------------------Yes
SELECT mute parameter file: -------------------------FB Mute
Disk Data Input Trace Kill/Reverse Trace Kill/Reverse Trace Muting Parameter Test
Enter Parameter VALUES: ------------------------------ 12|9|6
Trace grouping to reproduce: ----------------------Ensembles
Trace Display
Number of ENSEMBLES (line segments)/screen: --------2
Trace scaling option: ------------------------------Entire Screen
Number of display panels: ---------------------------------------2
2. In Disk Data Input, select Shots-with geometry. Choose to sort data by SIN, and read shot number 16. Parameter Test will not work with Interactive Data Access, so set this to No.
3. In the first Trace Kill/Reverse, select your trace kill file.
4. In the second Trace Kill/Reverse, choose to reverse SRF_SLOC 469 for SINs 1-17.
5. In Trace Muting, select your top mute file.
6. Specify values for Parameter Test. Enter a list of parameter values for the dB/sec correction constant, each separated by a vertical bar (|). To determine the format (real, integer, sequence) and a realistic range of test values, look at the default value in the True Amplitude Recovery process. (Use values of 12, 9, and 6 dB/sec for this exercise.)
7. Specify True Amplitude Recovery parameters. Select Yes to apply spherical divergence, and enter the following velocity-time pairs (0-7000, 850-9000, 1300-12000, 2000-15000). Select Yes for Apply dB/sec Correction, and enter five nines (99999) for the dB/sec correction constant.
NOTE: Entering five nines (99999) is a flag that tells the process to use the values found in Parameter Test for this parameter.
8. In Trace Display, choose to display 2 ensembles/screen and 2 display panels, and change the trace scaling from individual to entire screen. This will display the original shot plus the three parameter tests on a single screen.
9. Execute the flow to compare displays.
Viewing Parameter Tests After viewing the tests and deciding on the most appropriate value for the dB/sec correction, select File Exit/Stop Flow.
10. Select View from the flow builder menu and look at the processes that were actually executed in your flow. Near the bottom of the job.output file is a listing of the executed processes. There are some additional processes listed here that were not in your original flow. Also notice that Parameter Test is absent. This occurs because Parameter Test is a macro, built from other processes. If you have problems with a ProMAX flow that you cannot solve, simply email the job.output file to support@advance.com and they will help out in any way they can.
11. Edit your flow again, and change the following Trace Display parameters:
Disk Data Input Trace Kill/Reverse Trace Kill/Reverse Trace Muting Parameter Test True Amplitude Recovery Trace Display
Number of ENSEMBLES (line segments)/screen: --------1
Automatically SAVE screens: --------------------------------Yes
Trace scaling option: ------------------------------Entire Screen
Number of display panels: ---------------------------------------1
12. Execute the flow.
13. Use the Next ensemble icon to display the four tests, then use the Animation tool to review the tests. Check to see if you would still use the same value for dB/sec as you chose before.
14. Select File Exit/Stop Flow when finished.
One method of generating multiple data copies is to use the Reproduce Traces process. This is actually the same process that is built into the Parameter Test macro. Reproduce Traces generates a specified total number of copies and appends a header word to each trace, allowing you to distinguish between the multiple versions of data. This header word is known as Repeated Data Copy Number, or REPEAT for short; it is a numeric value from 1-N, where N is the total number of generated copies. You should place Reproduce Traces after any processing that is common to all copies of the data, but prior to the processes you wish to compare. Branching the flow is a conceptual term for controlling the processes your dataset utilizes. In other words, you do not actually break up any single flow into separate flows; rather, you utilize the capability of the IF, ELSEIF, and ENDIF processes to select and direct traces for processing. This is handled automatically by the Parameter Test process, as you saw if you looked at the View information when you executed the previous flow. More specifically, each copy of the data is passed to a different process, or to the same process with different parameter selection, using a series of IF, ELSEIF, and ELSE processes in the flow. For example, if the data copy number (REPEAT) is 1, then pass that copy of the data to the next process. If the data copy number is 2, pass that copy to a different process, and so on until all copies of the data have been passed to unique processes. The series of conditions is ended with ENDIF.
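The routing described above maps naturally onto an if/elif/else chain, with plain Python standing in for the IF, ELSEIF, ELSE, and ENDIF flow processes. The path names below are invented labels for this chapter's three-copy comparison; the only assumption carried over from the text is that REPEAT distinguishes the copies.

```python
# Sketch of flow branching on the REPEAT header word: each copy of the
# data is routed to a different processing path (labels are illustrative).

def route(traces):
    out = []
    for t in traces:
        if t["REPEAT"] == 1:
            out.append(dict(t, path="control"))       # IF REPEAT is 1
        elif t["REPEAT"] == 2:
            out.append(dict(t, path="decon"))         # ELSEIF REPEAT is 2
        else:
            out.append(dict(t, path="filter+decon"))  # ELSE
    return out                                        # ENDIF

copies = [{"REPEAT": r} for r in (1, 2, 3)]
print([t["path"] for t in route(copies)])
```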
Finally, you may use a process called Trace Display Label to generate a header word for posting a label on the display.
Compare Data With and Without Deconvolution Incorporate Reproduce Traces with IF and ENDIF to compare processed and unprocessed data. In this exercise you will compare unfiltered shot gathers with deconvolution, and filtered shot gathers with deconvolution. It is always a good idea to have a control copy, the original input, for further comparison. This flow illustrates how to compare these three copies.
Disk Data Input Trace Kill/Reverse Trace Kill/Reverse Trace Muting True Amplitude Recovery
dB/sec correction constant: ------------------------------------- 9
Reproduce Traces
Trace grouping to reproduce: ----------------------Ensembles Total Number of datasets: ----------------------------------------3
IF
Trace selection MODE: ------------------------------------Include SELECT Primary trace header word: --------------REPEAT SELECT secondary trace header word: --------------NONE SPECIFY trace list: ---------------------------------------------------1
ELSEIF
Trace selection MODE: ------------------------------------Include SELECT Primary trace header word: --------------REPEAT SELECT secondary trace header word: --------------NONE SPECIFY trace list: ---------------------------------------------------2
Spiking/Predictive Decon Trace Display Label ELSE Spiking/Predictive Decon Trace Display Label ENDIF Trace Display
Disk Data Input Trace Kill/Reverse Trace Kill/Reverse Trace Muting True Amplitude Recovery Reproduce Traces IF Trace Display Label ELSEIF Spiking/Predictive Decon
TYPE of deconvolution: -----------Minimum phase spiking
Decon operator length(s): --------------------------------------160
Operator white noise level(s): -------------------------------0.1
Window rejection factor: ------------------------------------------2.
Time gate reference: ----------------------------------------Time 0
Get decon gates from the DATABASE?: ------------------Yes
SELECT decon gate parameter file: -------------decon gate
Output traces or filters: ---------------Normal decon output
Apply a bandpass filter after decon? ----------------------No
Re-apply trace mute after decon?: -------------------------Yes
2. Use the same parameters as the previous flow for the first four processes. 3. In True Amplitude Recovery, set the dB/sec to the value you chose in the previous flow. 4. In Reproduce Traces, enter 3 for the total number of datasets. You will generate two additional copies (3 total), one ensemble at a time. 5. Select REPEAT for Select Primary trace header word in IF and ELSEIF. IF acts as the gatekeeper, providing the mechanism for selecting or restricting the traces that will be passed into a particular branch of the flow. Header words are used (just as in Disk Data Input) to uniquely identify the traces to include or exclude in a particular branch. In the first IF conditional, select REPEAT as the primary trace header and 1 (copy number) as the trace list entry. Data copy 1 is passed to Trace Display Label in this example. This will be the control copy. The ELSEIF condition passes the second data copy (REPEAT=2) to Spiking/Predictive Decon. The ELSE process selects all traces not previously selected with IF or ELSEIF. In our case, having selected two of the three copies of the data, only the third data copy (REPEAT=3) is left for the ELSE branch. In this example, you will apply deconvolution and a filter to it. 6. Use Trace Display Label to create labels for each copy. Label the copies according to their unique processing. For example, label the first copy Original Input, the second Decon, and the final copy Decon with filter. 7. In Trace Display, choose to display all 3 copies on one screen. 8. Execute the flow. After viewing the data in this mode, you may choose to display each copy on a different screen, and use the screen swap mode.
F-K Analysis In this exercise you will bring in one shot with some slow linear noise. After inspection in both the time domain and F-K space, design a filter to reject the noise. You will want to try a polygon filter as well as a fan filter to attenuate the noise.
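For orientation, the F-K panel is just a 2D Fourier transform of the gather: time maps to frequency f and offset maps to wavenumber k, so a linear event with velocity v plots along the line f = v * k, which is why slow ground roll separates from faster reflections. A minimal sketch, assuming NumPy and a regular trace spacing (function name is illustrative):

```python
import numpy as np

def fk_spectrum(gather, dt, dx):
    """2D amplitude spectrum of a gather (nt samples x nx traces).
    Time maps to frequency f (Hz), offset to wavenumber k; a linear
    event with velocity v lies along the line f = v * k."""
    nt, nx = gather.shape
    amp = np.abs(np.fft.fftshift(np.fft.fft2(gather)))
    f = np.fft.fftshift(np.fft.fftfreq(nt, d=dt))   # Hz
    k = np.fft.fftshift(np.fft.fftfreq(nx, d=dx))   # cycles per unit distance
    return amp, f, k

# 256 samples at 4 ms, 64 traces at 55 ft spacing:
amp, f, k = fk_spectrum(np.random.randn(256, 64), dt=0.004, dx=55.0)
print(amp.shape)  # (256, 64); f spans up to 125 Hz, k up to 1/110 cycles/ft
```

The slow linear noise in this exercise occupies the steep (low f/k ratio) fans of this spectrum, which is the region the reject polygon will cover.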
NOTE: This is not a real processing flow, since you would normally apply the F-K filter before the deconvolution. For class purposes we are using the deconvolution to enhance the ground roll so that we can demonstrate how powerful F-K filters are at attenuating ground roll.
1. Copy your previous flow, and add/delete processes so that it looks like the following:
Disk Data Input Trace Kill/Reverse - optional Trace Kill/Reverse - optional Trace Muting True Amplitude Recovery - optional Spiking/Predictive Decon Automatic Gain Control >F-K Filter< F-K Analysis
Panel width in traces: ------------------------------------------120
Starting time for analysis: ---------------------------------------0.
Ending time for analysis: ----------------------------------------0.
Distance between input traces: ------------------------------55.
Starting display configuration: ----------------TX-TK-FX-FK
Position of zero wavenumber in display: --------CENTER
Position of zero frequency in display: --------------------TOP
Plot FK,TK,TX panels in DB or Linear: ----------DBSCALE
Initial TX gain setting (percentile): --------------------------98.
Initial FK maximum gain setting (db): -----------------------0.
Initial FK minimum gain setting (db): ------------------------0.
Percent flat for trace ramping: ------------------------------100.
Percent flat for time ramping: -------------------------------100.
Select mute polygon table: ------------------------------fk mute
Mode of F-K filter operation: ---------------------------REJECT
Percent flat for F-K filter windowing: ----------------------90.
Time length of F-K filter (ms): -------------------------------500.
Spatial extent of F-K filter (traces): --------------------------50
2. For all processes prior to F-K Analysis, use the same parameters as the previous flow.
3. In F-K Analysis, enter 122 for the panel width to account for the shot gap in the transform. 4. Set 55 ft for the distance between traces (do not let this default to 0). 5. Add an output mute polygon table, fk mute. 6. Execute the flow. 7. With the default display you will see four panels. View only the TX and FK panels by selecting Configuration TX-and-FK. 8. Use the dx/dt icon. You should identify the ground roll energy in the F-K domain by the velocity you measure in T-X space. 9. You may find it helpful to rotate the color scale using Controls Edit Colormap.
10. With the F-K data displayed, select the Picking tool icon to build a table for interactively picking a reject zone.
Picking Mute for FK Filter 11. Pick a polygon to include all the noise to filter. It is best to start with a square or rectangle and then use MB1 to add new control points and MB3 to move the control points to customize the shape of the polygon as illustrated on the previous page.
12. After building the desired polygon, examine the response of the data to the filter by selecting FilterResponse FilteredOutput.
FK Filtered Output 13. You may also want to view the impulse response of the filter by selecting FilterResponse ImpulseResponse. To better view the operator, select Controls TX Display... Clip by amplitude .008 and then select OK. 14. After using the Interactive Data Access option to view other shots, select File Exit/Stop Flow, and then select Yes to save your polygon.
Compare F-K filtered shots using an IF loop 1. Edit your flow to include an IF loop:
Trace Kill/Reverse - optional Trace Kill/Reverse - optional Trace Muting True Amplitude Recovery - optional Spiking/Predictive Decon Automatic Gain Control Reproduce Traces IF F-K Filter
Type of F-K filter: -----------------------------Arbitrary Polygon
Distance between input traces: -------------------------------55
Panel width in traces: ------------------------------------------122
Test the filter impulse response?: ---------------------------No
Percent flat for time ramping: -------------------------------100.
Percent flat for offset ramping: -----------------------------100.
Get polygon mute file from the database: ---------------Yes
Select mute parameter file: ------------------fk mute
Mode of F-K filter operation: ---------------------------REJECT
Percent flat for F-K filter windowing: ----------------------90.
Time length of F-K filter (ms): -------------------------------500.
Spatial extent of F-K filter (traces): --------------------------50
Re-apply T-X trace mute after filter?: ---------------------Yes
Percentage of K-space to keep around K=0: --------------0.
2. Use the same parameters as before for the first seven processes, except turn off the Interactive Data Access. 3. Create two copies of the shot with Reproduce Traces. 4. Use the REPEAT option in IF to send one copy of the shot to the F-K Filter process. 5. Execute the flow, and review the results. 6. Modify the F-K filter to use a fan filter instead of the arbitrary polygon. 7. Execute the flow, and review the results. 8. You may try changing the mode of operation from REJECT to ACCEPT and re-running. If you see any signal/hyperbolas, you have removed some signal.
Spectral Analysis In this exercise you will run Interactive Spectral Analysis in all three modes, and then compare the results of running deconvolution on the data. Deconvolution testing may become very involved in certain situations. One criterion that you may use to help decide on decon parameters is to look at amplitude (or power) spectra of the trace data before and after decon. If the decon has worked properly, you should see some flattening, or whitening of the spectrum after decon relative to before. In this exercise we will look at such a comparison on a single shot record.
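The whitening criterion can be checked numerically: compute the amplitude spectrum in dB before and after decon and compare its spread. A self-contained sketch using a plain DFT (function names are illustrative, not ProMAX code):

```python
import math

def amplitude_spectrum_db(trace):
    """Amplitude spectrum in dB via a plain DFT (real-valued input)."""
    n = len(trace)
    spec = []
    for kf in range(n // 2):
        re = sum(trace[t] * math.cos(2 * math.pi * kf * t / n) for t in range(n))
        im = -sum(trace[t] * math.sin(2 * math.pi * kf * t / n) for t in range(n))
        spec.append(20 * math.log10(math.hypot(re, im) + 1e-12))
    return spec

def flatness(spec_db):
    """Spread of the spectrum in dB; a smaller spread means a whiter spectrum."""
    return max(spec_db) - min(spec_db)

# A spike (a perfectly white wavelet) has a perfectly flat spectrum:
print(flatness(amplitude_spectrum_db([1.0] + [0.0] * 63)))  # 0.0
```

If decon has worked, the flatness measure over the signal band should decrease from the before-decon trace to the after-decon trace, which is exactly what the Interactive Spectral Analysis comparison shows visually.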
Interactive Spectral Analysis - Simple Mode 3. Change the contents of the display by using the View Visibility pull down menu, and selecting the individual tiles of interest. 4. Exit from the display using the File Exit and Stop Flow pull down menu. 5. Edit the parameters of the Interactive Spectral Analysis to execute the Single Subset mode instead of the Simple mode.
Interactive Spectral Analysis - Single Subset Mode In this mode you can select a single subset of the available data for the purposes of computing the average power and phase spectra.
7. Click on the Select Rectangular Region icon and then draw a box around an area of interest. The data window and spectral windows will change configuration to match your data selection.
You can move or redraw this window as many times as you wish. 8. Exit from the display using the File Exit and Stop Flow pull down menu. 9. Edit the parameters of the Interactive Spectral Analysis to execute the Multiple Subset mode instead of the Single Subset mode. Also choose to Freeze the selected subsets.
11. Click on the Select Rectangular Region icon and draw a box around an area of interest and then select the Options Spectral Analysis pull down menu.
12. If you select a new area and repeat the Options Spectral Analysis pull down selection, a new window will appear. In this way you can compare the spectral results for different areas. 13. Select File Exit and Stop Flow. 14. Copy your ow to compare a shot before and after deconvolution with an IF-ELSEIF loop.
IF
SELECT Primary trace header word: ------------- REPEAT SPECIFY trace list: ---------------------------------------------------1
ELSEIF
SELECT Primary trace header word: ------------- REPEAT SPECIFY trace list: ---------------------------------------------------2
Trace Muting
Select mute parameter file: --------------------------FB Mute
Spiking/Predictive Decon
Decon operator length(s):---------------------------------------160
Select decon gate parameter file: -------------decon gate
16. You can use the Slope icon to calculate the dB roll on/off of the amplitude spectrum. 17. Click on the Next ensemble icon to display the data after decon. 18. Select the Options Spectral Analysis pull down menu again to show the spectral estimate for the data after decon. Observe the flattened amplitude spectrum and the change in the dB scale. Do you believe the amplitudes above 80 Hz? 19. When done, select File Exit and Stop Flow from each of the display windows.
Chapter Summary
Upon completion of this chapter you should be able to answer the following questions: Can you Pick Gates and Windows in Trace Display? How do you Pick Bad Traces? Do you understand Automatic Parameter Testing? Can you design an IF/ENDIF conditional processing tree? Can you interactively test FK Filters and Spectral Analysis?
Chapter 7
Chapter Objectives
This chapter explains how to calculate and apply elevation statics. Upon completion of this chapter you should: Understand the concept of Elevation Statics Know how to choose a proper Processing Datum Be able to Calculate and Apply Elevation Statics Be able to Import and Apply User Statics
Elevation Statics
All statics computations are performed in the database. Datum Statics Calculation* calculates the elevation (datum) static corrections. Datum Statics Apply applies the static corrections to the input data. Datum Statics Calculation* performs the following functions: Compute static time shifts that take the seismic data from their original recorded times to a time reference as if the data were recorded on a final datum, F_DATUM (usually flat), using a replacement velocity (usually constant). Compute N_DATUM (a smooth surface used as the processing datum). Partition the total statics into two parts, the Pre (before) NMO and Post (after) NMO terms relative to N_DATUM.
Datum Statics Apply performs the following function: Apply the Pre (before) NMO portion of the statics and write the remainder to the trace header.
In Datum Statics Calculation* you have the option to shift prestack data to a floating datum or a final datum. You supply a final datum elevation and a replacement velocity. The elev_stat_math file then establishes values in the database for F_DATUM, N_DATUM, S_STATIC, R_STATIC, and C_STATIC. Details of this process can best be understood by examining the contents of the elev_stat_math file, which typically resides in $PROMAX_HOME/port/misc. Datum Statics Calculation* then creates four new header entries for statics: NMO_STAT, FNL_STAT, TOT_STAT, and NA_STAT. The portion of NMO_STAT that is an integer multiple of the sample period (usually a multiple of 2 or 4 ms) is automatically applied by Datum Statics Apply, shifting traces to the floating datum. The fractional-sample portion is written to the NA_STAT header entry and applied later, normally during NMO, which interpolates the data to the fractional static properly. If you choose to process to a final datum, C_STATIC is set to zero. Recall that NMO_STAT = S_STATIC + R_STATIC + C_STATIC and that C_STATIC = -1.0*FNL_STAT. NMO_STAT is the static that shifts
Landmark ProMAX 2D Seismic Processing and Analysis 7-3
traces to the final processing datum, and FNL_STAT is zero because your data are at the final datum.
[Figure: elevation statics geometry. The shot, the base of the weathering layer, and the replacement velocity (Vreplacement) are shown, with the S_STATIC, R_STATIC, C_STATIC, NMO_STAT, and FNL_STAT shifts drawn relative to the final datum F_DATUM.]
Database Attributes:
N_DATUM = floating datum
F_DATUM = final datum
S_STATIC = (F_DATUM - ELEV + DEPTH) / DATUMVEL
R_STATIC = [(F_DATUM - ELEV + DEPTH) / DATUMVEL] - UPHOLE
C_STATIC = 2 * [(N_DATUM - F_DATUM) / DATUMVEL]
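These attribute formulas can be evaluated directly. The sketch below plugs hypothetical station values into the relationships above (all numbers are invented for illustration; ProMAX evaluates the same relationships via the elev_stat_math file):

```python
def elevation_statics(f_datum, n_datum, elev, depth, uphole, datum_vel):
    """Evaluate the database attribute formulas (times in seconds, lengths in ft)."""
    s_static = (f_datum - elev + depth) / datum_vel
    r_static = (f_datum - elev + depth) / datum_vel - uphole
    c_static = 2.0 * (n_datum - f_datum) / datum_vel
    nmo_stat = s_static + r_static + c_static   # NMO_STAT = S + R + C
    fnl_stat = -c_static                        # C_STATIC = -1.0 * FNL_STAT
    return s_static, r_static, c_static, nmo_stat, fnl_stat

# Hypothetical station: final datum 1000 ft, floating datum 980 ft,
# surface elevation 950 ft, shot depth 30 ft, 10 ms uphole, 8000 ft/s:
s, r, c, nmo, fnl = elevation_statics(1000.0, 980.0, 950.0, 30.0, 0.010, 8000.0)
print(s, r, c)  # 0.01 0.0 -0.005
```

Note that when processing to the final datum, N_DATUM equals F_DATUM, so c_static (and hence the post-NMO remainder) collapses to zero, matching the text above.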
Calculate Elevation Statics 1. Create the following flow to calculate elevation statics for your data.
The smoother is defined as the number of CDPs to smooth over. This parameter may require some testing to generate the desired N_DATUM. 5. Select NMO Datum (floating) for Processing datum. 6. Choose a Run ID of 01. This will generate S_STATIC, R_STATIC, and C_STATIC and copy them to S_STAT01, R_STAT01, and C_STAT01. 7. Execute the flow. 8. When the job completes, exit the flow and select the Database menu. 9. From the DBTools window select the SRF tab (order), and then, by double clicking, view the following attributes: R_STAT01, F_DATUM, DATUMVEL, and ELEV (receiver elevation). Notice the inverted relationship between the static and the elevation. Select the SIN tab, and view the following attributes: S_STAT01 and ELEV (elevation of the surface at the shot locations). From the CDP order, view the C_STAT01 attribute. 10. Why are the source and receiver statics of opposite sign? Perhaps the shots are buried beneath the final datum?
11. Now from the DBTools window select Database XDB Database Display. 12. From the XDB display select Database Get. From the CDP order, view ELEV and N_DATUM (floating datum). Notice the effect of the 51 point CDP smoother you applied.
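Conceptually, the floating datum is a smoothed version of the surface elevation; a plain moving average over n CDPs illustrates the idea (the actual N_DATUM computation is defined in the elev_stat_math file, so this is only a conceptual sketch):

```python
def smooth_datum(elevations, n):
    """Moving average over n CDPs, clipped at the line ends (a conceptual
    stand-in for the smoother that derives N_DATUM from surface elevation)."""
    half = n // 2
    out = []
    for i in range(len(elevations)):
        lo, hi = max(0, i - half), min(len(elevations), i + half + 1)
        window = elevations[lo:hi]
        out.append(sum(window) / len(window))
    return out

# A sharp 100 ft elevation step becomes a gentle ramp after smoothing:
elev = [100.0] * 10 + [200.0] * 10
print(smooth_datum(elev, 5)[9])  # 140.0 (window [100,100,100,200,200])
```

A longer smoother gives a flatter N_DATUM (larger pre-NMO statics but gentler residual structure), which is why the smoother length may need testing.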
Apply Elevation Statics 1. Copy the flow 3.4-FK Analysis/Filter to apply pre-processing and elevation statics to your data.
Trace Kill/Reverse Trace Kill/Reverse Trace Muting True Amplitude Recovery F-K Filter Spiking/Predictive Decon Datum Statics Apply
Source datum statics database parameter: ----------------------------------------------SIN GEOMETRY S_STAT01 Receiver datum statics database parameter: ------------------------------------------SRF GEOMETRY R_STAT01 CDP datum statics database parameter: ------------------------------------------------CDP GEOMETRY C_STAT01
Apply External Statics In this exercise, you will import static data calculated elsewhere, and then apply it to your trace data. For this class, no ASCII format statics file is available, therefore, you will use the XDB Database ASCII Save functionality to output an ASCII file of shot and receiver statics created in the previous exercise. You will then import these statics back to the database. This will allow you to see both the ASCII import and export portions of the database.
Caution: Apply User Statics is an alternate method for applying datuming-type statics. Only one of the datuming processes should be run on a dataset: use either Datum Statics Apply, Apply User Statics, or Apply Refraction Statics, but only one. Refer to the help files for additional statics-related information.
1. Place a copy of the statics file in a directory accessible by ProMAX. Since no ASCII statics file is available, create one with the database ASCII Save function. To initiate the save procedure, bring up DBTools with the Database global command in the Flows window, and then select Database XDB Database Display. Bring up the attribute selection with Database Get. Display SRF: R_STATIC. This attribute can now be saved to an ASCII file. Select ASCII Save from the global commands to get the following window.
Step 1 - Select User-defined File in the popup window and enter a full directory path and filename without an extension. (The extension .a_db is created by the program.) Select OK. The defined path, filename, and attribute name appear. Step 2 - Click on the attribute name, in this case R_STATIC. Enter your own description or accept the default description and click on OK to create the ASCII file. Repeat this entire procedure to save S_STATIC to an ASCII file. You now have ASCII files that are ready to import in the next part of the exercise. When finished, select Cancel from the ASCII Save window.
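An exported statics file of this kind is plain ASCII, so its station/static pairs can be read back with a few lines of code. The sketch below assumes a simple two-column (station, static) layout, which is purely illustrative; match the real rows and columns when you paint them during the actual import:

```python
def parse_statics_ascii(lines):
    """Parse whitespace-delimited (station, static) rows, skipping
    comment and header lines. The two-column layout here is illustrative;
    match it to the columns you paint in the actual import."""
    statics = {}
    for line in lines:
        fields = line.split()
        if len(fields) < 2 or not fields[0].lstrip('-').isdigit():
            continue  # header or comment line
        statics[int(fields[0])] = float(fields[1])
    return statics

sample = ["# R_STATIC export", "101  -4.0", "102  -3.5"]
print(parse_statics_ascii(sample))  # {101: -4.0, 102: -3.5}
```

The interactive import described next does the same job graphically: you point at the station-number column and the statics column instead of coding the positions.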
3. Click on File and enter the full path and filename (including extension) of the ASCII file. Click OK and the contents of the ASCII file are displayed. The ASCII/CLIENT path is a generic ASCII file import functionality. 4. Once the ASCII file is displayed, select the Order (SRF or SIN), Infotype (GEOMETRY), and Attribute (USERSTAT). 5. Click on Location Index and then define the rows and columns to import. The rows and columns containing the values to be imported are identified in one of two ways: Rows can be painted by holding down MB1 and moving the mouse over all rows; the included rows are highlighted in black. Columns for Location Index numbers (station numbers) are painted using MB2 and highlight in red. Columns for Attributes (statics values in this example) are painted using MB3 and highlight in blue. Click on any of the Rows or Columns buttons and you will be prompted to manually enter starting and ending values.
6. Click on Display.
You will be prompted for an attribute description. Enter your description of the USERSTAT attribute, R_STATIC USER, and select OK. This displays the data you defined on import. 7. Save the new attribute in the database. Click Cancel in the Client ASCII Import window, then select Database Save from the main menu bar. Click on USERSTAT in the On-Screen Attributes to save window. Wait a moment and click OK in the acknowledgment window. Your USERSTAT values are now saved in the database. Be sure you complete the ASCII Import steps for both the shot and receiver ASCII files. 8. Copy flow 4a.2-Apply Datum Statics. Replace the static values in Datum Statics Apply with the user statics.
Disk Data Input Trace Kill/Reverse Trace Kill/Reverse Trace Muting True Amplitude Recovery F-K Filter Spiking/Predictive Decon Datum Statics Apply
Source datum statics database parameter: --------------------------------------------SIN GEOMETRY USERSTAT Receiver datum statics database parameter: ------------------------------------------SRF GEOMETRY USERSTAT CDP datum statics database parameter: -------------------------------------------------CDP GEOMETRY C_STATIC
9. Datum Statics Apply will know to use the user_stat_math file for the Database Math Method. The user_stat_math file generates S_STATIC and R_STATIC by copying the SRF and SIN USERSTAT values that were imported to the database. These are partitioned into the database parameter C_STATIC and the trace header value FNL_STAT. 10. Execute the flow. The trace headers are updated and the traces are shifted to the floating datum. 11. Once the job finishes, view the shots with flow 1.1-View Shots. Examine the trace headers for NMO_STAT, FNL_STAT, TOT_STAT, and NA_STAT using the Header icon.
Chapter Summary
Upon completion of this chapter you should be able to answer the following questions: What are Elevation Statics? What is a good smoother for the Processing Datum? How do you Calculate and Apply Elevation Statics? Can you Import and Apply User Statics?
Chapter 8
Brute Stack
In this chapter you will import a velocity field. You will then use this field to apply NMO and create a stack.
Chapter Objectives
This chapter creates your first QC stack of the data. Upon completion of this chapter you should: Understand how to Import Velocities Understand the NMO and Stack Parameters
2. A list of possible parameter tables will appear. Use the scrollbar located on the right-hand side of the window to scroll to the bottom of the list. 3. Select the VEL (RMS (stacking) Velocity) table. This will take you to the RMS velocity table menu. 4. Click on Create. Do not click on Add.
5. Enter the description name for your imported velocity. Use a name similar to imported from ascii file. This opens a parameter table editing window in the form of a spreadsheet.
6. Click on the File Import pull down menu. This opens two new windows, an empty viewing window and a File selection window.
7. Input the absolute path name to the directory where the velocity file is stored and append a /* to the end of the pathname. Click on Filter. (/misc_files/2d/*.) 8. Select the file as indicated by your instructor and click on OK. The ASCII file is opened, and its contents displayed in the Import viewing window.
9. Click on Format. 10. Enter a new format definition name, Vels Import Format, or select a previously defined format (you probably do not have any yet). 11. Click on OK. A format window will open.
12. Click on CDP and then drag the mouse over the appropriate columns in the import file window to define the correct columns for the CDP value.
15. Select Overwrite ALL existing values with new import values and OK.
16. The XCOOR and YCOOR columns are ignored for 2D. 17. Click on File Exit to save the parameter table and exit from the editor. 18. Check the table for correctness by going back to the list of tables from the User Interface and choosing to edit the table. 19. Click on Edit and then select the table name.
20. Verify that the file has been saved properly. 21. Click on File Abort to exit from the editor.
CDP/Ensemble Stack
You will now use the CDP/Ensemble Stack process to create a stacked section of the data with elevation statics. 1. Build the following ow.
CDP/Ensemble Stack
Sort order of input ensembles: ------------------------------CDP
METHOD for trace summing: ------------------------------Mean
Root power scalar for stack normalization: -------------0.5
Apply final datum statics after stack? -------------------Yes
Has NMO been applied?: --------------------------------------Yes
2. In Disk Data Input, select your shots with elevation statics applied, and sort by CDP. 3. Add a trace display label. 4. Apply Normal Moveout Correction. Select the imported velocity file. 5. Stack the data with CDP/Ensemble Stack. 6. Write a new stacked dataset to disk. 7. Execute the flow.
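Step 4 applies normal moveout using the hyperbolic relation t(x) = sqrt(t0^2 + x^2/v^2): each sample at output time t0 is pulled back from its moved-out input time. A nearest-neighbour sketch (illustrative only; real implementations interpolate the fractional sample, which is also where the NA_STAT fractional static is honored):

```python
import math

def nmo_time(t0, offset, v_rms):
    """Hyperbolic moveout: t(x) = sqrt(t0^2 + (x / v)^2)."""
    return math.sqrt(t0 ** 2 + (offset / v_rms) ** 2)

def nmo_correct(trace, dt, offset, v_rms):
    """Pull each output sample back from its moved-out input time.
    Nearest-neighbour for brevity; production code interpolates."""
    n = len(trace)
    out = [0.0] * n
    for i in range(n):
        j = int(round(nmo_time(i * dt, offset, v_rms) / dt))
        if j < n:
            out[i] = trace[j]
    return out

# An event recorded at its moved-out time is flattened back to t0 = 1.0 s:
trace = [0.0] * 500
trace[258] = 1.0            # ~1.031 s at 2000 ft offset, 8000 ft/s
out = nmo_correct(trace, 0.004, 2000.0, 8000.0)
print(out[250])             # 1.0 at the 1.0 s (sample 250) position
```

After NMO, traces within a CDP align, so the mean summation in CDP/Ensemble Stack reinforces reflections while attenuating moveout-dependent noise.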
Display Stack
1. Build the following flow to display your stack.
Bandpass Filter
Ormsby filter frequency values: ------------------ 3-6-50-60
----Default all other parameters----
Trace Display
Primary trace LABELING header entry: --------------NONE
Secondary trace LABELING header entry: ------------CDP
2. Execute the flow.
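The 3-6-50-60 values define a trapezoidal Ormsby passband: zero below 3 Hz and above 60 Hz, linear ramps from 3 to 6 Hz and from 50 to 60 Hz, and flat in between. A sketch of the frequency-domain weight (the function name is illustrative, not the ProMAX implementation):

```python
def ormsby_weight(f, f1, f2, f3, f4):
    """Trapezoidal Ormsby passband weight for corner frequencies
    f1-f2-f3-f4 (e.g. 3-6-50-60 Hz): ramp up f1..f2, flat f2..f3,
    ramp down f3..f4, zero outside."""
    if f <= f1 or f >= f4:
        return 0.0
    if f < f2:
        return (f - f1) / (f2 - f1)
    if f <= f3:
        return 1.0
    return (f4 - f) / (f4 - f3)

print(ormsby_weight(4.5, 3, 6, 50, 60))  # 0.5 on the low-cut ramp
print(ormsby_weight(30, 3, 6, 50, 60))   # 1.0 in the passband
print(ormsby_weight(55, 3, 6, 50, 60))   # 0.5 on the high-cut ramp
```

The gentle ramps (rather than brick-wall cuts) limit ringing in the time domain, which matters on a display-only bandpass like this one.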
4. Exit the flow. 5. You may also stack and display the user statics dataset STK-user statics as a QC.
Chapter Summary
Upon completion of this chapter you should be able to answer the following questions: Can you import Velocities? Do you understand the NMO and Stack Parameters?
Chapter 9
Chapter Objectives
This chapter serves as set-up for Chapter 10 Refraction Static Corrections. Refraction statics are necessary in areas of severe topography or areas of complex weathering zones. First break picks are a required input to the refraction statics algorithms. Upon completion of this chapter you should: Understand how to Pick First Breaks Be able to Train the Neural Network
Interactive Training 1. Copy your flow 4a.2-Apply Datum Statics and add/delete/edit processes so that it looks like the following:
Trace Kill/Reverse
----Use the same parameters as before----
Trace Kill/Reverse
----Use the same parameters as before----
Trace Display
----Default all parameters for this process----
2. Kill and reverse appropriate traces and apply true amplitude recovery before picking first breaks. Do not apply Trace Muting.
[Flowchart: two picking paths are available after training.
Interactive NN FB picking: Trace Display -> FirstBreakPicker -> Neural Net Recall -> Continuous Recall; if the picks look good (YES), go to the next gather and repeat.
Batch NN FB picking: Exit Trace Display -> Run NN First Break Picker -> Run Trace Display.]
4. From the main Trace Display menu bar, select FirstBreakPicker Set Neural Network Parameters. The following menu will appear.
Select the pick polarity and the signal/noise gate length. The neural network works well with peaks and a gate length of 100 ms. Select OK to accept these parameters. The neural network itself, however, may key off instantaneous phase/frequency, amplitude before or after the first break, or any other pattern it can recognize. 5. From the main menu bar, select FirstBreakPicker Create Training Data set. A First Break NN Dataset window appears. Type in a name, nn first break gate, for your first break time gate, and select OK. A second window will appear for selecting a secondary key. Choose AOFFSET, and then OK. The Picking tool icon appears on the left side of the display. There will be two entries in the Pick Layers box: FB Training Data and the nn first break gate. 6. Select the nn first break gate table from the Pick Layers window, and pick the top of the gate. It is not necessary to make a pick on every trace, as the gate is interpolated between picks. The network tries to follow the slope of the top gate when picking first breaks, so it is necessary that the top gate closely follow the trend of the first breaks. Usually, picking about 25 ms above the first break, at a timing line intersection, works quite well. To pick the bottom of the gate, click MB3 in the data window and select New Layer. The gate should contain at least three peaks, but not be so large as to lengthen execution time. It will be helpful to zoom in on the first breaks before picking.
7. Click on FB Training Data in the Pick Layer window and manually pick the first breaks.
Manually pick first breaks using MB1. Pick first breaks on 20-30 traces. Because training is interactive, you can incrementally train the network: you do not need many picks to begin training, as more picks can be added in future training runs. More picks mean longer training time. Use MB3 to select the Snap to peak option.
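Snap to peak simply moves a pick to the nearest local amplitude maximum, so your manual picks land consistently on the chosen polarity. A sketch of that behaviour (the search radius is an assumed, illustrative parameter):

```python
def snap_to_peak(trace, pick_index, search=5):
    """Move a manual pick to the largest amplitude within +/- `search`
    samples (a sketch of how a 'Snap to peak' option can behave)."""
    lo = max(0, pick_index - search)
    hi = min(len(trace), pick_index + search + 1)
    return max(range(lo, hi), key=lambda i: trace[i])

trace = [0.0, 0.1, 0.3, 0.9, 0.4, 0.2, 0.1, 0.0]
print(snap_to_peak(trace, 5))  # 3, the peak sample
```

Consistent pick placement matters here because the network learns the waveform context around your picks; sloppy placement trains it on inconsistent patterns.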
A First Break NN Training window appears, including a list of First Break Weight Tables. Create a new table weight1 and select OK. The network will be trained using your picks. While the network is training, the cursor will change from an arrow to a wristwatch. When the cursor changes back to an arrow, training is complete. 9. From the menu bar, select FirstBreakPicker Neural Net Recall One Time Recall.
The One time Recall option applies the neural network to the currently displayed gather. A First Break NN Recall window appears.
You will be prompted to either choose an Ordered Parameter File (OPF) from the list, or create a new OPF for storing picks. Create a new OPF called NN training test picks, enter 1000 for the offset to start picking, and default all other parameters. Select OK. The Neural Network is applied to the current gather display, and the results of the picking are displayed. 10. If the picks are bad, modify your FB Training Data and retrain the network. To modify training picks, click on the Picking tool icon. Your new table of picks appears in the Pick Layers window. Remove the table from the list and activate the FB Training Data. Modify or add to these training picks, select First Break NN Training, and use the same weight table. Iterate through steps 6, 7, and 8 until you are satisfied with the results. If you still cannot get satisfactory results, try purging the Neural Network (FirstBreakPicker Purge Neural Net) and starting over. 11. Set Neural Net Recall to Continuous and click the Next ensemble icon to go to the next shot.
You can retrain if necessary, or, if you think the picks are close enough, select File Exit/Stop Flow and choose to save edits before exiting. The weight table and time gates are saved and can be used in the batch NN First Break Picker process to pick the entire dataset.
Pick First Breaks for the entire survey In the previous exercise, we interactively created and saved an fb_weight matrix file and a time gate. Now we will use these as input to the NN First Break Picker to pick all shots in batch mode. 1. Alter the existing flow as follows:
Trace Kill/Reverse Trace Kill/Reverse True Amplitude Recovery >Trace Display< NN First Break Picker
Select weight matrix parameter file: ----------------weight1
Number of traces in median line fit: --------------------------5
Maximum trace to trace static:---------------------------------20
Starting offset to determine first break pick slope: 1000
Select time gate parameter file: -------nn first break gate
First break storage: ------------------Header and Database
4 digit ID to store pick time in TRC database: ------ 0001
2. In Disk Data Input, input your entire dataset. Some preprocessing may be necessary, such as trace edits, filtering, and scaling. Preprocessing is the same as the input to the interactive NN First Break Pick Training. 3. Select NN First Break Picker parameters. Data-dependent parameter selections are based on testing or experience. Parameters are consistent with those for the interactive NN FB Pick Training. Input the fb_weight matrix file weight1. You must specify a starting offset for the picker. Specify an offset with good S/N and no shingling of refractors. For this data, an offset value of about 1000 ft is adequate. 4. Execute the flow. 5. Once the picker is completed, QC your picks. Edit the same flow, toggle NN First Break Picker inactive and Trace Display active, and execute the flow. From the menu bar in the Trace Display window, select Picking Edit Database Values (first breaks)... Select NN_PICK as the Infotype, and PICK0001 (the 12345678 picks are from the interactive picker) from the OPF File Selector, and use the same name to save edits. Don't spend too much time editing picks here. The easiest way to view and edit your picks is to use the first break editing capabilities of the Refraction Statics process in the next chapter. Also, do not worry about zero picks on the dead traces.
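The median line fit and maximum trace-to-trace static parameters suggest an outlier test: compare each pick to the median of its neighbours and reject picks that deviate too far. A sketch of that idea (not the ProMAX algorithm; the function name and rejection rule are illustrative):

```python
def reject_outlier_picks(picks, n_fit=5, max_static=20.0):
    """Compare each pick (ms) to the median of its n_fit-trace
    neighbourhood and drop (None) any pick deviating by more than
    max_static ms - a sketch of median-fit editing of first breaks."""
    half = n_fit // 2
    edited = []
    for i, p in enumerate(picks):
        lo, hi = max(0, i - half), min(len(picks), i + half + 1)
        window = sorted(picks[lo:hi])
        median = window[len(window) // 2]
        edited.append(p if abs(p - median) <= max_static else None)
    return edited

picks = [100.0, 104.0, 108.0, 250.0, 116.0, 120.0, 124.0]
print(reject_outlier_picks(picks))  # the 250.0 pick is rejected (None)
```

A median, unlike a mean, is not dragged toward the bad pick itself, which is why median-style fits are the usual choice for editing first breaks.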
Chapter Summary
Upon completion of this chapter you should be able to answer the following questions:
• Can you Pick First Breaks?
• Do you know how to train the Neural Network?
Chapter 10

Refraction Statics
Chapter Objectives
This chapter serves as an alternative to Chapter 7, Elevation Statics Corrections. Refraction statics are necessary in areas of severe topography or areas of complex weathering zones. Upon completion of this chapter you should:
• Understand the difference between Refraction and Elevation Statics
• Be able to Calculate Refraction Statics
• Be able to Apply Refraction Statics
Refraction Statics
ProMAX provides an interactive interface for final editing of first-break picks, layer assignment, velocity and delay time editing. The final results of this process are a near-surface depth/velocity model and travel-time corrections to the final datum written to the database. Solutions are calculated by three methods: Generalized Reciprocal Method (GRM), Standard Delay Time (DLT), and Diminishing Residual Matrices (DRM). Each solution is written to the database, giving you the option of selecting the most appropriate solution.
NOTE: First breaks must be picked and written to the database prior to this exercise. Please refer to the Neural Network First Break Picking exercise earlier in this manual.
Refraction Statics - 2D

In this exercise you will use the Refraction Statics* process and first-break pick times to calculate a near-surface model and travel-time corrections.
NOTE: This process does not use XY values; therefore it is not applicable to crooked lines. Crooked may be defined as any line with a greater than 15 degree bend. If you are calculating refraction statics on a crooked line, refer to the Refraction Statics Calculation* process described later in this chapter.
This process calculates shot and receiver refraction statics to shift to the final datum and updates the database. Results of this exercise will be used by Datum Statics Apply in a later exercise.
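The shift to the final datum in the one-layer case can be sketched as follows. This is a minimal illustration of a standard datum-statics formula, not necessarily ProMAX's exact algorithm; the elevation and weathering thickness values are hypothetical, while the 800 ft datum and 8000 ft/s replacement velocity come from this exercise.

```python
# Hedged sketch: remove the travel time through the weathering layer and
# through bedrock down to the final datum. All station values are made up.

def datum_static_ms(elev, h_weather, v0, v_repl, final_datum):
    """Time shift (ms) to move a surface source/receiver to the final datum.

    elev        surface elevation (ft)
    h_weather   weathering-layer thickness below the surface (ft)
    v0          weathering velocity (ft/s)
    v_repl      replacement velocity (ft/s)
    final_datum datum elevation (ft), assumed below the base of weathering
    """
    base = elev - h_weather                    # base of weathering
    t_weather = h_weather / v0                 # time through weathering
    t_bedrock = (base - final_datum) / v_repl  # time from base to datum
    return -(t_weather + t_bedrock) * 1000.0   # negative = shift up

# Example with the course values: 800 ft datum, 8000 ft/s replacement velocity
print(round(datum_static_ms(900.0, 40.0, 2000.0, 8000.0, 800.0), 2))
```

A station at 900 ft with 40 ft of 2000 ft/s weathering gets a static of -27.5 ms under these assumptions.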
Refraction Statics*
Select display DEVICE: -----------------------------This Screen
Select First Break Times file: -TRC:NN_PICK:PICK0001
Get LAYER Picks from DATABASE: -------------------------No
Get Refractor Velocities from DATABASE: ----------------No
Select TRACE data file: ---------------Shots-with geometry
Compute V0 from UPHOLE data?: -------------------------Yes
Number of layers: ----------------------------------------------------1
Use Delay Times in velocity/depth model?: ------------Yes
Use Deep Hole delay time algorithm?: ---------No
Use GRM in velocity/depth model?: -----------------------Yes
Specify GRM minimum XY distance: -------------0.
Specify GRM maximum XY distance: ------------0.
Specify GRM XY distance increment: ----------55.
Final datum Elevation: -----------------------------------------800
Replacement Velocity: ----------------------------------------8000
Use Uphole Time in source statics algorithm?: ---------No

2. Select Refraction Statics parameters. Select your first break pick file. Picks are typically in the database in the TRC order and NN_PICK Infotype. Select the batch PICK0001 file for this exercise. Input trace data will be the raw shots. Enter a final datum of 800 ft and a replacement velocity of 8000 ft/sec.
A menu appears with a list of options. Follow the normal sequence from top to bottom using mouse button helps.
Use the Edit Picks option for final editing of first-break picks prior to inversion. Use the mouse button helps to guide your editing; use the options on the right side of the screen to edit your data. To guide your editing you may want to turn on the seismic by toggling on Add Traces. Click MB2 below the data to move to the next set of shots, or MB3 to move backwards. Select Done to go back to the main menu. Select Yes to Output Updated Picks to the Database, and provide a name RefrEdit for the pick file.
Warning: The editing in this function currently snaps to a sample and not necessarily the true peak. This could lead to up to a 4 ms pick error. Residual statics, however, should correct for these slight errors.
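One common remedy for sample-snapped picks (shown here for illustration; this is not what the ProMAX editor does internally) is to refine the pick with a parabolic fit through the three samples around the snapped peak:

```python
# Hedged sketch of parabolic subsample peak refinement. The function name
# and the synthetic trace values are illustrative, not from ProMAX.

def subsample_peak(trace, i, dt_ms):
    """Refine a peak picked at sample index i by fitting a parabola through
    the three samples around it; returns the refined time in ms.
    Assumes 0 < i < len(trace) - 1 and that i is a local extremum."""
    ym, y0, yp = trace[i - 1], trace[i], trace[i + 1]
    denom = ym - 2.0 * y0 + yp
    frac = 0.0 if denom == 0 else 0.5 * (ym - yp) / denom
    return (i + frac) * dt_ms

# True peak of this pulse lies slightly before sample 3 (4 ms sampling)
trace = [0.0, 0.3, 0.9, 1.0, 0.5, 0.1]
print(subsample_peak(trace, 3, 4.0))
```

With 4 ms sampling, this kind of refinement can recover most of the sub-sample error the warning describes.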
This option displays pick times for both sides of the spread, as in the case of split spread shooting. Define the offset range for each layer by holding down and dragging MB1 over the corresponding range, then releasing MB1. This is an interpretive process. Note: The displayed velocity is only a guide; you are not assigning a velocity for the layer. Avoid inflection points where refractors are shingling. Also avoid low S/N areas. The velocity you get should be on the order of 7500 ft/s. Select Done and then Yes to Output Refractor Picks to Database.
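The displayed velocity comes from the trend of first-break time versus offset over the range you drag out: the reciprocal of the slope of that line is the refractor velocity. A minimal sketch with synthetic picks (the 7500 ft/s refractor and 40 ms intercept are assumed values, not from the course data):

```python
import numpy as np

# First-break time vs offset is roughly linear over a single refractor;
# the inverse of the fitted slope is the refractor velocity.

offsets = np.arange(1000.0, 4000.0, 220.0)           # ft
picks_ms = 40.0 + offsets / 7500.0 * 1000.0          # synthetic picks, ms

slope, intercept = np.polyfit(offsets, picks_ms, 1)  # slope in ms per ft
v_refractor = 1000.0 / slope                         # ft/s
print(round(v_refractor, 1))
```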
This option provides for interactive editing of the weathering velocity and the calculated velocity model for each refractor. The top display is a graph of velocity vs. station. Editing or smoothing of the velocity values is done only in the bottom display, which is a zoomed version of the top display. Select a velocity to edit by selecting the appropriate box in the upper right, such as Edit V1. Refer to mouse button helps for editing functionality. The plotted points represent the layer number and are color coded by the calculation method used. There may be a problem in V0 resulting from questionable uphole times; you may want to smooth through V0 or replace it with a constant 5000 ft/s. Select Done when editing is complete, and Yes to Output Refractor Velocities and V0 to Database.
This option allows interactive editing of the calculated intercept times for each layer. You will first view/edit Receiver Delay Time Solutions, then Source Delay Time Solutions. The top display is a graph of intercept time vs. station. Editing of the intercept values is done in the bottom display. Refer to mouse button helps for editing functionality. Select Done when editing is complete. Select Yes to view either Receiver Delay Time corrected Shot Records or Source Delay Time corrected Receiver Records, depending on what you edited. Select Done when finished viewing.
This option allows viewing the calculated near-surface depth model, calculated from the velocity and intercept data. Although there is editing functionality in this option, if the depth model is not geologically plausible you may want to re-edit first break picks, velocities or intercept times, and then rerun this step. Notice how the GRM method falls short where there are no sources for the reciprocal method to calculate the receiver static. Select Done and Yes to Output Refractor DEPTHS to Database.
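The conversion from delay time to depth follows the textbook delay-time relation h = t_d v0 v1 / sqrt(v1^2 - v0^2). A sketch of that standard formula (not necessarily the exact ProMAX implementation; the 10 ms delay and the two velocities are assumed values):

```python
import math

# Standard one-refractor delay-time depth formula: a delay time t_d maps to
# weathering thickness h = t_d * v0 * v1 / sqrt(v1**2 - v0**2).

def depth_from_delay(t_d_ms, v0, v1):
    """Depth (ft) of the refractor below a station with delay time t_d_ms
    (ms), weathering velocity v0 and refractor velocity v1 (ft/s)."""
    t_d = t_d_ms / 1000.0
    return t_d * v0 * v1 / math.sqrt(v1 ** 2 - v0 ** 2)

# 10 ms delay time, 5000 ft/s weathering over a 7500 ft/s refractor
print(round(depth_from_delay(10.0, 5000.0, 7500.0), 1))
```

Note the formula blows up as v1 approaches v0, which is why low-contrast refractors give unstable depths.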
This option allows viewing the shot and receiver statics calculated from the model data. Source statics from the elevation of the shot through the model to the final datum are displayed with the character s. Receiver statics from the elevation of the receiver through the model to the final datum are displayed with the character r. Select Yes to Output STATICS to the DATABASE.
NOTE: In the main menu, click MB2 on any previous box to view its current values or MB1 to re-edit those values. If you choose to re-edit, be sure to step through all subsequent options to correctly recalculate your final statics.
11. Exit the current flow. From the Flows window, access the database with the Database global command option. To view the calculated refractor depths and statics solutions, simply double click on the appropriate attribute. You can also go to the XDB Database Display to overlay the static values. The Order is SRF or SIN, and the Infotype for the total static is Geometry. This is the value that will be used by the process Datum Statics Apply. In the Statics Infotype, there are incremental statics that represent the difference between the total refraction statics and the original elevation statics. To the right of the attributes are detailed descriptions of each.
Very robust for noisy first break picks. Works independently of shooting geometry. First break picks are not required for every shot.
The main disadvantage is that there is no graphical interface for editing. The source and receiver static solutions are applied to the data in a future step, Apply Refraction Statics.
NOTE: First break times must be picked and written to the database prior to this exercise. Please refer to the Neural Network First Break Picking exercise earlier in this manual.
As a part of this exercise you will see that there are two ways to enter the refractor offset ranges:
• Manually
• By picking a Pick Top Mute in Trace Display
In this exercise you will use first-break pick times to calculate a near-surface model and travel-time corrections. This process calculates shot and receiver refraction statics to shift to the final datum and updates the database. Results of this exercise will be used by Apply Refraction Statics in the next exercise.
2. Select Refraction Statics Calculation* parameters. Select the first break time to use for the statics decomposition. These time picks will be in the TRC OPF and will normally be of the type NNPICK. Select the PICK0001 file. If you have output an edited pick file, it will be stored with an Infotype of FBPICK. Enter the number of layers to model; in this case use one layer. The identification number will be 1 for the first run through the process. The shooting geometry is 2D split spread. There are 5 steps to Refraction Statics Calculation* described in the menu. They may all be turned on for refraction statics computation, or you may select to run one option at a time and view the output in the database.
3. INPUT V0 and REFRACTOR OFFSET. In this exercise we'll compute V0 from uphole times and manually type in the refractor OFFSET range. Five database entries are created in the SIN OPF:
SIN REFR_OFF OFFPSS11 --- Near positive offset of refractor.
SIN REFR_OFF OFFPSE11 --- Far positive offset of refractor.
SIN REFR_OFF OFFNGS11 --- Near negative offset of refractor.
SIN REFR_OFF OFFNGE11 --- Far negative offset of refractor.
SIN VELOCITY V0INIT11 --- Weathering velocity.
These database attributes may be edited. The V0INIT11 is written over each time you rerun the module. If you want to make a permanent change, edit the uphole times.
4. COMPUTE REFRACTOR VELOCITIES. With this subheading turned on, a refractor velocity is calculated based on the first break times and the offset range from the previous step. Although you can smooth the velocity model in the menu, you may wish to look at your model in the database before smoothing. You could then either smooth in the database (good for seeing the immediate results of smoothing) or define a smoother in the menu.
There is also an option to edit the first break picks automatically by setting a deviation from the median velocity described by the offsets. If any picks deviate more than the selected amount they will be killed and set to NULL in a new first break picks database file TRC F_B_PICK FBPEDITX, where X is the run identification number. Only the good picks will be included in this file. Remember to examine this edited file. Three database entries are created:
CDP VELOCITY VCINIT11 --- CDP velocity for 1st refractor.
SIN VELOCITY VSINIT11 --- Source velocity for 1st refractor.
TRC F_B_PICK FBPEDIT1 --- Edited first break pick file.
These database attributes may be edited.
5. COMPUTE DELAY TIMES. Once CDP velocity is available, delay times for shots and receivers may be computed. This is done by iteration, starting with source delay time estimates, followed by receiver delay time estimates, and (optionally) finalized by CDP velocity updating. Values are not computed for any SIN, SRF or CDP that does not meet the minimum fold (menu parameter) criterion. Once the decomposition is complete for each refractor, these missing values are interpolated based on X and Y. Three database entries are created:
SIN DELAYTIM SDELAY11 --- Source delay times.
SRF DELAYTIM RDELAY11 --- Receiver delay times.
CDP VELOCITY VCFIN011 --- Final CDP velocities.
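The iteration in step 5 can be sketched as alternating source and receiver delay estimates against the model t = s[shot] + r[recv] + |offset|/V. This is a hedged illustration of the general delay-time decomposition idea, not ProMAX's exact algorithm; the geometry, velocity, and delay values below are synthetic.

```python
import numpy as np

# Synthetic survey: every shot is recorded by every receiver, and picks obey
# t = s[shot] + r[recv] + |offset| / v exactly (no noise, for clarity).
rng = np.random.default_rng(0)
n_src, n_rec, v = 5, 20, 7500.0
s_true = rng.uniform(5e-3, 20e-3, n_src)         # source delays (s)
r_true = rng.uniform(5e-3, 20e-3, n_rec)         # receiver delays (s)
src_x = np.linspace(0.0, 2000.0, n_src)
rec_x = np.linspace(0.0, 2000.0, n_rec)

shots, recs = np.meshgrid(np.arange(n_src), np.arange(n_rec), indexing="ij")
shots, recs = shots.ravel(), recs.ravel()
offs = np.abs(src_x[shots] - rec_x[recs])
picks = s_true[shots] + r_true[recs] + offs / v

# Alternate the two estimates: average the residual per shot, then per receiver
s = np.zeros(n_src)
r = np.zeros(n_rec)
for _ in range(50):
    resid = picks - offs / v - r[recs]
    s = np.bincount(shots, resid, n_src) / np.bincount(shots, None, n_src)
    resid = picks - offs / v - s[shots]
    r = np.bincount(recs, resid, n_rec) / np.bincount(recs, None, n_rec)

# Delays are only resolved up to a constant traded between s and r,
# so check the summed delay per trace rather than s and r individually.
err = np.max(np.abs((s[shots] + r[recs]) - (s_true[shots] + r_true[recs])))
print(err < 1e-6)
```

The constant trade-off between source and receiver terms is the same ambiguity the real decomposition has; fold-weighted averaging and interpolation (as the text describes) handle stations that do not meet the minimum fold.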
6. COMPUTE REFRACTOR DEPTH MODEL. The depth model stage inputs delay times and refractor velocities in CDP, interpolates refractor velocity into SIN and SRF, and computes a depth model for sources and another for receivers. Optionally, the first refractor depth in SRF may be projected into CDP, smoothed, projected back into SRF, V0 recomputed in SRF based on the smoothed depths, the new V0 projected from SRF to SIN, and finally the SIN and SRF depth models computed. Six database entries are created:
SIN REFDEPTH SDEP_011 --- Source refractor depth.
SIN VELOCITY VSFIN011 --- Final source velocity for 1st refractor.
SIN VELOCITY V0FIN011 --- Final weathering velocity.
SRF REFDEPTH RDEP_011 --- Receiver refractor depth.
SRF VELOCITY VRFIN011 --- Final receiver velocity for 1st refractor.
SRF VELOCITY V0FIN011 --- Final weathering velocity.
7. COMPUTE SOURCE AND RECEIVER STATICS. The statics computation stage inputs refractor velocities and refractor depths, computes source and receiver depths to the FINAL datum of 800 feet, and outputs static values. We have the choice of inputting a constant velocity or the bottom refractor velocity. For this exercise choose a user specified value of 8000 ft/sec. Two database entries are created:
SRF GEOMETRY RSTAT00X --- Receiver statics.
SIN GEOMETRY SSTAT00X --- Source statics.
2. In Datum Statics Apply, select your Source and Receiver statics. You have the option of choosing the statics from any of the refraction statics calculation methods. For Source statics, the Order is SIN and the Infotype is Geometry. You will have an available list of parameter files saved in Refraction Statics*. Select one of the following statics files:
• GRM Refraction Statics for Sources to Final datum. The GRM method is not valid for this line since it is not split spread.
• DRM Refraction Statics for Sources to Final datum.
• DLT Refraction Statics to Final datum.
• Coordinate based - Source statics to Final datum from the coordinate based method.
For Receiver statics, the Order is SRF and the Infotype is Geometry. Select one of the following statics files:
• GRM Refraction Statics for Receivers to Final datum. The GRM method is not valid for this line since it is not split spread.
• DRM Refraction Statics for Receivers to Final datum.
• DLT Delay Time Refr. Statics - Receivers to Final datum.
• Coordinate based - Receiver statics to Final datum from the coordinate based method.
3. Add a new output dataset Shots-decon/refr statics.
4. Execute the flow. Traces are shifted to the floating or final datum, depending on your selection.
5. Build a flow to display gathers with refraction statics applied, and use the Header icon to check the updated statics header entries. Display gathers with elevation statics applied instead of refraction statics and check these trace header values on the same trace. Note the differences due to Datum Statics Apply.
6. Use your previous flow 5.1-Stack to stack the refraction corrected shots to a dataset STK-refr statics.
Chapter Summary
Upon completion of this chapter you should be able to answer the following questions:
• What is the difference between Refraction and Elevation Statics?
• How do you Calculate Refraction Statics?
• When do you Apply Refraction Statics?
Chapter 11
Stack Comparisons
In this chapter you will use Trace Display to compare two stacks. This flow is used throughout the rest of the class to compare stack sections.
Chapter Objectives
5. Brute Stack
In this chapter you learn a slick way to compare stack datasets. This technique is quite valuable in testing processing flows and parameters. Upon completion of this chapter you should:
• Be able to graphically compare any two stacks
Compare Stacks
1. Build the following flow to compare stacks:
Bandpass Filter
Ormsby filter frequency values: ------------------ 3-6-50-60
----Default all other parameters----
Trace Display
Primary trace LABELING header entry: --------------NONE
Secondary trace LABELING header entry: ------------CDP
----Default all other parameters----

2. Execute the flow. The stack with elevation statics will appear first. Use the Next ensemble icon to display the stack with refraction statics. After both stacks have been displayed, use the animation tool to compare the stacks. You may want to execute this flow again and display both stacks on a single screen.
Chapter Summary
Upon completion of this chapter you should be able to answer the following question:
• How do you graphically compare two stacks?
Chapter 12

Velocity Analysis
Chapter Objectives
6. Velocity Analysis
Velocity analysis is a critical aspect of any processing workflow. This chapter explores one of ProMAX's techniques for picking and quality controlling velocities. Upon completion of this chapter you should:
• Comprehend the parameters input to Velocity Analysis
• Understand how to use the Velocity Analysis Viewer in conjunction with the Volume Viewer/Editor
• Be confident in picking reasonable stacking velocity functions
Precompute Velocity Analysis

1. Build the following flow to start Velocity Analysis Precompute:
Supergather Formation*
Read data from other lines/surveys?: ---------------------No
Select dataset: -----------------------Shots-decon/refr statics
Presort in memory or on disk?: -----------------------Memory
Maximum CDP fold: ---------------------------------------------180
Minimum center cdp number: --------------------------------825
Maximum center cdp number: -------------------------------950
Cdp increment: ------------------------------------------------------25
Cdps to combine: ---------------------------------------------------9
Bandpass Filter
Ormsby filter frequency values: -------------------3-6-50-60
----Default all remaining parameters----
Supergather Formation*
Bandpass Filter
Automatic Gain Control
Velocity Analysis Precompute
Number of CDPs to sum into gather: --------------------------9
Apply partial NMO-to-binning: --------------------------------Yes
Apply differential CDP mean statics?: ---------------------Yes
Absolute offset of first bin center: -------------------------27.5
Bin size for vertically summing offsets: -------------------55
Maximum offset: ---------------------------------------------6572.5
Use absolute value of offset for stacking?: --------------Yes
Minimum semblance analysis value: -------------------7000
Maximum semblance analysis value: ----------------20000
Number of semblance calculations: --------------------------50
Semblance sample rate (in ms): ------------------------------20
Semblance calculation window (in ms): -------------------40
Number of stack velocity functions: -------------------------17
Number of CDPs per stack strip: ------------------------------5
Scale stacks by number of live samples summed: ---Yes
Method of computing stack velocity functions: -----------------------------------------------------------Top/base range
Velocity variation at time 0: ---------------------1000
Velocity variation at maximum time: ---------3000
Velocity guide function table name: -----------------------------------------------------------------imported from ascii file
Maximum stretch percentage for NMO: --------------------30
Long offset moveout correction?: -------------------------NONE
2. Select your best prestack dataset for Supergather Formation. Supergather Formation is a macro that reads the data as CDPs and combines them into supergathers. Data should be preprocessed gathers without NMO. Set the Maximum CDP fold to 180 (9 CDPs times 20 fold per CDP). Set the Min and Max CDP centers to 825 and 950, respectively. Set the CDP increment to 25. This will give you six analysis locations with supergathers at CDPs 825, 850, 875, 900, 925, 950.
3. Apply a bandpass filter. For velocity analysis, it is usually desirable to limit the frequency range of the input data. Select Ormsby filter values of 3-6-50-60.
4. Apply Automatic Gain Control. For velocity analysis, a relatively short AGC window is usually desirable. The default value of 500 ms will work fine for this exercise.
5. Set parameters for Velocity Analysis Precompute. Set the number of CDPs to sum into gathers to 9, and set the bin sizes.
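The bookkeeping in step 2 can be checked quickly. A minimal sketch (plain arithmetic, nothing ProMAX-specific):

```python
# Analysis-location and fold arithmetic from step 2: center CDPs 825 to 950
# on an increment of 25, with 9 CDPs of roughly 20-fold each combined into
# every supergather.

first_cdp, last_cdp, inc = 825, 950, 25
centers = list(range(first_cdp, last_cdp + 1, inc))
max_fold = 9 * 20    # 9 CDPs combined, ~20-fold each

print(centers)
print(max_fold)
```

This confirms the six supergather centers (825, 850, 875, 900, 925, 950) and the Maximum CDP fold of 180 entered in the menu.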
6. Select Yes to Apply partial NMO-to-binning. Supergather input to Velocity Analysis has reduced spatial separation between traces compared to the original CDP gather.
Partial NMO and SUM Move the Traces to the NMO of the Bin Centers
Full NMO and SUM Flatten the Traces to the Zero Offset Time of the Gather
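Both corrections rest on the hyperbolic moveout t(x) = sqrt(t0^2 + x^2/v^2): full NMO shifts a sample all the way back to its zero-offset time t0, while partial NMO only moves it from its true offset to the bin-center offset. A sketch with assumed values (the 27.5 ft half-bin spacing echoes the menu's first bin center; t0, velocity, and offsets are illustrative):

```python
import math

# Hyperbolic moveout underlying both full and partial NMO.

def t_hyper(t0, x, v):
    """Two-way time (s) at offset x (ft) for zero-offset time t0 (s) and
    stacking velocity v (ft/s)."""
    return math.sqrt(t0 ** 2 + (x / v) ** 2)

t0, v = 1.0, 8000.0                 # s, ft/s (assumed)
x_true, x_bin = 3300.0, 3272.5      # actual and bin-center offsets (ft)

full_nmo_shift = t_hyper(t0, x_true, v) - t0
partial_nmo_shift = t_hyper(t0, x_true, v) - t_hyper(t0, x_bin, v)

print(partial_nmo_shift < full_nmo_shift)
```

The partial shift is tiny compared with the full correction, which is why partial NMO-to-binning can regularize offsets without committing to a final velocity.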
7. Select minimum and maximum semblance values of 7000 and 20000, and set the number of stack velocity functions to 17.
8. Select Top/base range as the method of computing stack velocity functions.
9. Create a new Disk Data Output file called Precomputed Velocity Analysis.
10. Execute the flow.
Velocity Analysis

In this flow we will set the parameters for velocity analysis to use the precomputed data from the previous flow.
Velocity Analysis
Select display DEVICE: -----------------------------This Screen
Is the incoming data Precomputed?: ----------------------Yes
Set which items are visible?: --------------------------------No
Set semblance scaling and autosnap parameters?: --No
--------------------------------------------------------------------------
Semblance normalization mode: ---Scale Time Slice
Contrast power factor: ------------------------------------- 1.
Contrast noise factor: ------------------------------------- 0.1
Automatically snap: -----------------------------------------No
Maximum velocity % change for snapping: ----------5
--------------------------------------------------------------------------
Maximum vertical change for snapping: ------------------40
Display horizon(s)?: -----------------------------------------------No
Use neural network velocity picker?: -----------------------No
Interact with other processes using PD?: -----------------Yes
Get guide function from existing parameter table?: ---Yes
Velocity guide function table name: ----------------------------------------------------imported from ascii file
--------------------------------------------------------------------------------
Maximum stretch percentage for NMO: --------------------30
Long offset moveout correction?: -------------------------NONE
Interval velocity below last knee: ------------------------------0
Table to store velocity picks: -------vels from precompute
Copy picks to next location: -----------------------------------No
Submenu Controls
>Volume Viewer/Editor*<
2. Set the Disk Data Input parameters as shown. Make sure to sort the input data by the user-defined header word SG_CDP.
3. Set the Velocity Analysis parameters. When you first parameterize the Velocity Analysis process, a subset of the parameters will be visible, so begin by setting the global parameters highlighted in the flow. Be sure to create a table to store velocity picks, such as vels from precompute. Next, select Yes for Set semblance scaling and autosnap parameters to display the semblance submenu. The default settings will work fine, so turn off the semblance submenu by clicking No for Set semblance scaling and autosnap parameters. The submenu parameter settings will be retained and used even though they are not visible. The parameter Set which items are visible works the same way. Both the visibility and semblance parameters can also be changed interactively from within the velocity analysis tool.
NOTE: The Velocity Analysis parameters are only our initial guesses. Once inside the Velocity Analysis Viewer we can change any of the parameters interactively.
4. Execute the flow. The display shows a velocity semblance plot, a corresponding CDP gather or CDP supergather sorted by absolute offset, the dynamic stack in positive and negative polarity, and the varying velocity stack strip panels.
5. The panel menus allow you to control several other items, including the semblance scale, interval velocities derived from the RMS picks, and guide functions from previous velocity picks. From this menu you can also change the trace scaling.
Velocity Analysis Icons

Next ensemble: Proceed to and process the next ensemble in the dataset. If you are currently processing the last ensemble in the dataset, this button is inactive.
Previous ensemble: Step backward one ensemble and process. If you are currently processing the first ensemble of the dataset, this button is inactive.

Rewind: Rewind the dataset and go back to the first ensemble as specified in the sort order. If you are currently processing the first ensemble in the dataset, this button is inactive.
Point Dispatcher (PD): Save and send the velocity picks in the current ensemble to the Velocity Viewer/Editor. This icon works only when Velocity Viewer/Editor is running and you have told it to interact with Velocity Analysis.
6. Pick a stacking velocity function for the first ensemble. Activate the picking icon, and begin picking a function with MB1. You can pick in either the semblance display or the velocity stack strips display. As you pick velocities on the semblance plot, the picks are also displayed on the velocity strips, and vice versa. Use the Next ensemble icon to move to the next analysis location. After you pick the first location and move to the second, you may want to overlay the function you just picked as a second guide. You can do this by clicking on View → Object visibility... → Average of all CDPs (blue). This will display the average of all of the functions that have been picked in the output table to date.
7. Experiment with some of the other display attributes, such as View → Object visibility... → Velocity Color Key and View → Object visibility... → Interval Velocity. If your workstation performance suffers, such as slow redraws, turn off the more resource intensive attributes. Once you have determined your favorite settings, you can set the flow parameters so your Velocity Analysis display is automatically configured that way.
NOTE: Your velocity picks are automatically saved to an RMS velocity ordered parameter file when you move from one location to the next or Exit the program. You also have the option to save picks using the Table/Save Picks option.
Using the Volume Viewer

As you pick velocities along a line using the Velocity Analysis tool, you may want to QC the picked velocity field. This can be accomplished by simultaneously viewing a color isovelocity display of the entire velocity volume. The tool used for this is a standalone process called the Volume Viewer/Editor, which should be executed while you are running Velocity Analysis, as outlined below.
1. After picking and saving at least one velocity analysis location, iconify the Velocity Analysis window.
2. Return to the ProMAX User Interface. Toggle off all processes and add Volume Viewer/Editor to the flow.
3. Parameterize Volume Viewer/Editor as follows.
4. Execute the flow containing the Volume Viewer/Editor, and return to the Velocity Analysis display. The Volume Viewer/Editor window will eventually appear. Unless you have two screens, you will want to try different ways of arranging the windows on the screen until you have found an arrangement that is workable for you. The following diagram illustrates one way to arrange the windows on the screen:
Possible Window Arrangement

If you have not picked any velocities, the display will contain zero values: the screen will be all blue and the velocity scale will be very large. If you have picked at least one velocity function, you will see only a vertical color variation in the Cross Section window.
5. From the Velocity Viewer/Editor window, click on View → Volume Display. A Volume Controls window will appear. Click on the Cross-section Nodes button, then Ok. This will display vertical lines in the Cross Section window indicating the positions of the Velocity Analysis centers already saved to the velocity table. The locations of these lines are referred to as nodes.
6. In the Velocity Analysis window, pick or modify the velocity function for the current location.
7. In the Velocity Analysis display, click on the bow-and-arrow PD icon to send the new information to the Volume Viewer/Editor. The velocity displayed in Volume Viewer/Editor updates in response to picks made in Velocity Analysis. You should now see a vertical line in the Cross Section window at the CDP location of the velocity function just picked.
8. In the Velocity Analysis window, click on the Next ensemble icon, and pick the next analysis location. When you are finished picking this new analysis location, click on the Next ensemble icon again. This will not only move you to the next analysis location, but will automatically send the velocity picks just made to the Volume Viewer/Editor displays.
9. In the Volume Viewer/Editor window, click on the PD icon. Any Velocity Analysis CDP location can be easily retrieved or deleted from Volume Viewer/Editor through the use of the mouse. This allows random access to any of the precomputed and picked locations.
Velocity Analysis Pointing Dispatcher

By activating this icon, you can select a CDP and send it to Velocity Analysis. This icon does not appear if No was selected for Interact with Velocity Analysis? in the Velocity Viewer/Editor menu. With the PD icon activated, position the mouse cursor over a node. The cursor should change from an x to an o. Click MB1 to retrieve that velocity function into the Velocity Analysis display. Clicking MB2 deletes that analysis location.
10. Try moving to a previous location by selecting it in the Volume Viewer window.
11. Continue picking velocities in Velocity Analysis until you finish all of the locations on this project. Remember, you may either use the bow-and-arrow PD icon to send the picks from Velocity Analysis to the Volume Viewer/Editor displays for QC before moving to the next analysis location, or you may move directly to the next ensemble and your previous picks will be automatically sent to the Volume Viewer/Editor displays.
12. To finish picking, first make sure that the Point Dispatcher PD icon in Volume Viewer is deactivated. Then click on the File → Exit/stop flow pull-down menu in Velocity Analysis and the File → Exit pull-down in the Volume Viewer/Editor.
Chapter Summary
Upon completion of this chapter you should be able to answer the following questions:
• Do you understand the parameters input to Velocity Analysis?
• Can you operate the Velocity Analysis Viewer in conjunction with the Volume Viewer/Editor?
• Are you confident in picking a reasonable stacking velocity function?
Chapter 13

Residual Statics
Chapter Objectives
7. Residual Statics
To correct for high frequency variations in the near surface weathering not solved by elevation/refraction statics or velocities, some type of residual statics is almost always applied to land data. High frequency can be thought of as shorter than one cable length. This chapter explores some of ProMAX's techniques for calculating residual statics. Upon completion of this chapter you should:
• Know how to prepare data for input to Residual Statics
• Understand how Surface Consistent Statics are calculated
• Understand how Trim Statics are calculated
• Be able to build a Model Stack to pilot some of the statics routines
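The surface-consistent idea behind these routines can be sketched as a least-squares decomposition of per-trace time shifts into a source term plus a receiver term. This is a hedged illustration of the general model, not ProMAX's implementation; the structure and residual-moveout terms of the full model are omitted for brevity, and the geometry and shift values are synthetic.

```python
import numpy as np

# Model: measured shift on each trace = source static + receiver static.
rng = np.random.default_rng(1)
n_src, n_rec = 4, 10
s_true = rng.normal(0.0, 8.0, n_src)     # source statics (ms)
r_true = rng.normal(0.0, 8.0, n_rec)     # receiver statics (ms)

shots, recs = np.meshgrid(np.arange(n_src), np.arange(n_rec), indexing="ij")
shots, recs = shots.ravel(), recs.ravel()
shifts = s_true[shots] + r_true[recs]

# Build the design matrix t = A @ [s; r] and solve with least squares
a = np.zeros((shifts.size, n_src + n_rec))
a[np.arange(shifts.size), shots] = 1.0
a[np.arange(shifts.size), n_src + recs] = 1.0
sol, *_ = np.linalg.lstsq(a, shifts, rcond=None)

# The solution is unique only up to a constant traded between source and
# receiver terms, so compare the per-trace sums instead of s and r alone.
pred = sol[shots] + sol[n_src + recs]
print(np.allclose(pred, shifts))
```

In practice the shifts come from cross-correlating each trace against a pilot (model) trace, which is what the exercises below prepare.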
Autostatics Flowchart
1. Pre-Process
(geometry, gain recovery, noise reduction, deconvolution, refraction or elevation statics, NMO, BPF, AGC)
RMS Velocities
2. Apply NMO and Sort to CDPs
CDP Stack
Pick Autostatics Horizon
Data preparation and horizon picking for residual statics

In this exercise you are simply preparing a prestack dataset for input to autostatics, as well as picking your autostatics horizons on poststack data.
Bandpass Filter
Ormsby filter frequency values: ---------------- 5-10-40-50
7. Residual statics processes require that reference horizons (autostatics horizons) be picked from a preliminary stack and saved in a parameter table. Edit the following flow:
>Disk Data Input< >Normal Moveout Correction< >Automatic Gain Control< >Bandpass Filter< >Disk Data Output< Disk Data Input
Select Dataset: -----------------------------------STK-refr statics Trace read option: --------------------------------------------Get All
Trace Display
Primary trace LABELING header entry: --------------NONE
Secondary trace LABELING header entry: ------------CDP
8. Input your refraction statics stack.
9. Execute the flow.
10. From the menu bar in Trace Display, select Picking Pick Autostatics Horizons...
Picking Autostatics Horizon
11. A Table Selector window appears. Enter a new table name, horizon1, and select OK. Enter smash=11 (CDP traces). Smash is the number of CDPs to sum along the horizon to form the model trace for correlation. The gate width is symmetric about the picks. For steeply dipping areas a smash of 3 to 5 should be used; for flatter areas smash values of 11 to 21 are valid. Enter a gate width=100 (ms). The gate width should be larger than twice the maximum residual static expected. In swampy/marshy areas this may be a large value. Click on OK when finished.
12. Pick a horizon using MB1. This identifies the center of the time gate. Horizons may extend across the entire dataset or cover only a portion of the data. CDPs not included in a horizon will not be included in residual statics calculations for that horizon.
NOTE: Autostatics horizons are picked from stacked data that has been shifted to the final datum. The residual statics processes automatically shift these time horizons to the processing datum, the same datum the input CDP gathers are referenced to. This process of applying C_STATIC to the horizons is automatic and transparent to the user.
13. Additional horizons (up to 500) may be picked by clicking in the trace display area with MB3 and choosing a new layer. You will be prompted to enter a new smash value and time gate for each horizon. Notice also that each new horizon is represented in the Pick Layers window with a number in parentheses. The residual statics process will average the static solutions in areas of overlapping windows. About a 10-trace overlap should provide a smooth transition between static solutions. Too much overlap can lead to abrupt edges in the static solution.
14. To quit and save the autostatics horizon parameter table, select File Exit/Stop Flow. Select Yes when asked to save your work.
The most commonly used surface-consistent methods are Correlation Autostatics and Maximum Power Autostatics. The max power method has proven very robust in both good and bad data areas. Its only downside is that it is somewhat more expensive in terms of CPU usage. Max power simply maximizes the power of the stack by shifting each trace and stacking for the maximum power. CDP Trim Statics works similarly, except the shifts are applied in the CDP domain, and thus the shifts are blind to surface consistency. Most of the methods have some problems at the edges and in low-fold areas. These edge problems are often corrected by editing the erroneous values in the database. Choose a method based on data quality, the magnitude of the statics problem, and the merits of each residual statics method. Parameter selection for each method is likewise based on data quality and the magnitude of the statics problem. If nothing seems to work, use Gauss-Seidel External Model Autostatics.
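The shift-and-correlate idea behind these methods can be sketched in a few lines. This is an illustrative toy, not the ProMAX implementation; the function name and the synthetic wavelet are invented for the example.

```python
import numpy as np

def estimate_static(trace, pilot, max_lag):
    """Return the lag (in samples) maximizing the cross-correlation of
    `trace` with `pilot`, searched over +/- max_lag samples."""
    best_lag, best_corr = 0, -np.inf
    for lag in range(-max_lag, max_lag + 1):
        corr = np.dot(np.roll(trace, lag), pilot)
        if corr > best_corr:
            best_lag, best_corr = lag, corr
    return best_lag

# Synthetic check: a pilot wavelet and a copy arriving 3 samples late.
t = np.arange(200)
pilot = np.exp(-((t - 100) / 5.0) ** 2)
trace = np.roll(pilot, 3)
print(estimate_static(trace, pilot, 20))   # -> -3 (shift needed to align)
```

In a surface-consistent scheme, lags like this one are not applied directly; they are decomposed into source and receiver terms shared by all traces recorded at each station.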
Landmark ProMAX 2D Seismic Processing and Analysis 13-9
Autostatics calculation
In this exercise, you will calculate residual statics using Maximum Power and Correlation Autostatics. An additional exercise at the end of this section describes the external model routines, Gauss-Seidel External Model Autostatics and Cross Correlation Sum External Model Autostatics.
1. Build the following flow:
Correlation Autostatics*
Select Trace data file: -------------CDP-input to res. statics Select Autostatics HORIZON file: --------------------horizon1 Select Autostatics VELOCITY file: vels from precompute Maximum velocity error (percent): ----------------------------5. Number of CDPs for velocity smoothing: -----------------51 Minimum # of traces for vel. estimate: ---------------------36 Minimum % of offset range for vel. estimate: ------------25 Maximum statics allowed (milliseconds): -----------------20 Statics partitioning iterations: -----------------------------------4 Minimum live samples in a gate (percent): ---------------60 Seek/report reversed sources/receivers/channels: Yes Create a NEW database entry for each run?: -----------No
Compare Static Solutions in the Database
A quick QC is to examine the computed static values with the Database display tool.
1. From the Flows menu select the Database option, and then select Database XDB Database Display from the main DBTools menu. Select Database Get from the XDB display.
2. Select SRF order, Statics infotype, and the two statics files RCOR0000 & RPWR0000.
The source (SIN) and receiver (SRF) statics for Correlation Autostatics are SCOR0000 and RCOR0000. The source (SIN) and receiver (SRF) statics for Maximum Power Autostatics are SPWR0000 and RPWR0000. Computed static values from several methods can be plotted simultaneously for comparison. Values can then be zeroed or edited from this database display. See the mouse button helps for instructions.
3. Quality factors computed by each method are also output to the database. Quality factors can be used as a criterion for zeroing statics values or editing shots and receivers. The quality factor file naming convention is S_CQ0000 and R_CQ0000. Quality factors from several methods can be displayed simultaneously to compare the reliability of the computed statics. Statics with low quality factor values relative to neighboring values can be zeroed, or the receivers can be edited. Quality factors can also be used to weight traces before CDP stack. For more information see the Residual Statics helpfile.
Compare two or more Autostatics Stacks
1. Build a flow for comparing stacked results from two or more statics methods. Use the following flow to get started:
Reproduce Traces
Trace grouping to reproduce: --------------------------All Data Total number of datasets: ----------------------------------------2
[Flowchart] External model autostatics: inputs are RMS Vels, an Eigen Matrix Time Gate, and an Autostatics Horizon. 3. Eigen Stack feeds 5a. EMC Gauss-Seidel correlations (trace data; TRC STATICS TRM0001), which outputs the statics SIN:STATICS:SGEMxxxx / SRF:STATICS:SGEMxxxx and SIN:STATICS:SPEMxxxx / SRF:STATICS:SPEMxxxx.
Create Eigen Stack
The Eigen Stack process uses eigenvector decomposition techniques to isolate the principal component of the trace matrix from a supergather of prestack traces. Conceptually, the wavelet on the stack trace after Eigen Stack is more similar to the wavelets on the prestack data than that of a conventional stack. In theory, all of the wavelets recorded from a reflection point are the same; they are time shifted due to near-surface velocity variations. Typically we measure these time variations by cross-correlating the prestack traces with a stacked trace. The Eigen Stack process attempts to make a stacked-trace wavelet that is as similar as possible to the wavelet of the prestack traces. This should improve the cross-correlation process by creating a higher-resolution pilot trace. The cost of this process, however, is that some of the structural information may be lost. Some writings refer to an Eigen stack as a K-L (Karhunen-Loeve) transform. This is the same technology used by the government for pattern recognition in scanning retinas or for enhancing faces in photographs.
Input: Traces on CDP with NMO applied
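The principal-component idea can be sketched with a singular value decomposition of the trace matrix, which is mathematically equivalent to the eigenvector decomposition of its covariance. This is a generic K-L illustration, not the ProMAX Eigen Stack code:

```python
import numpy as np

def eigen_pilot(gather):
    """First principal component of an (ntraces, nsamples) NMO-corrected
    gather: the single waveform that best explains all the traces."""
    u, s, vt = np.linalg.svd(gather, full_matrices=False)
    pilot = s[0] * vt[0]
    # Fix the arbitrary sign so the pilot correlates positively with the mean.
    if np.dot(pilot, gather.mean(axis=0)) < 0:
        pilot = -pilot
    return pilot

# Flat event plus noise: the eigen pilot recovers the common wavelet.
rng = np.random.default_rng(0)
t = np.arange(100)
wavelet = np.exp(-((t - 50) / 4.0) ** 2)
gather = np.array([wavelet + 0.1 * rng.standard_normal(100) for _ in range(24)])
pilot = eigen_pilot(gather)
```

Because the top singular vector weights coherent energy and downweights incoherent noise, this pilot is a higher-fidelity correlation reference than a plain mean stack when the gather is noisy.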
You first need to pick a time gate that will be used in the Eigen Stack process:
1. Build the following flow to pick an eigen matrix time gate on NMO-corrected CDP gathers.
3. From the Trace Display menu bar, select Picking Pick Miscellaneous Time Gates...
Pick Time Gate
4. Input a gate name like eigen gate.
5. Select a secondary key of CDP, and pick a window from a data area that has a high signal/noise ratio. Make sure that your window includes the area of interest. Use MB3 inside the Trace Display area to select a new layer for the bottom of the window. This display is also a good QC check of your velocities: if the CDP gathers are not flat, you may have a problem with your velocities.
6. Save picks and exit the Trace Display.
Eigen Stack
Mode: --------------------------------------------Output Eigenstack Get matrix design gates from DATABASE?: ------------Yes SELECT design gate parameter file: ------------eigen gate Type of Computations?: ---------------------------------------Real Horizontal window width: ----------------------------------------5 Number of iterations: -----------------------------------------------0 Apply final datum statics after stack?: ------------------Yes
>Trace Display<
The Eigen Stack process stacks flat events in a CDP gather. Events with large trace-to-trace moveout will not be included in the output Eigen Stack.
8. In Disk Data Output, output a new dataset STK-eigen. This is used for subsequent input to the external model correlation builder.
9. Execute the flow.
>Disk Data Input< >Trace Display< Disk Data Input
Select dataset: -------------------------------------------STK-eigen Trace read option: --------------------------------------------Get All
Trace Display
Primary trace LABELING header entry: ---------------None
Secondary trace LABELING header entry: ------------CDP
11. Execute the flow.
12. From the menu bar in Trace Display, select Picking Pick Autostatics Horizons...
Pick horizon for Autostatics
Enter a new table name and click on OK. Enter smash=1 (in traces) and the gate width=300 ms, and click on OK. For an external model, the smash is not used; therefore, give it a value of 1. The pilot traces have already been somewhat smashed together by creating the Eigen Stack.
Pick a horizon using MB1. This identifies the center of the time gate. Horizons may extend across the entire dataset or cover only a portion of the data. CDPs not included in a horizon are not included in residual statics calculations for that horizon.
NOTE: Autostatics horizons are picked from stacked data that has been shifted to the final datum. The residual statics processes automatically shift these time horizons to the processing datum, the same datum the input CDP gathers are referenced to. This process is automatic and transparent to the user.
13. Additional horizons (up to 500) may be picked by clicking MB3 in the trace display area and choosing a new layer. You will be prompted to enter a new smash value and time gate for each horizon. Notice also that the new horizon is represented in the Pick Layers window with a number in parentheses.
14. To automatically move picks to the nearest peak or trough, click with MB3 in the data area and choose the appropriate snap.
15. Save your autostatics horizon picks and exit Trace Display.
21. Build the following flow to calculate your residual statics using EMC Autostat: Gauss-Seidel*:
28. After comparing the various autostatics solutions, build the following flow to apply the best solution.
Trace Display
Primary trace LABELING header entry: ---------------None
Secondary trace LABELING header entry: ------------CDP
29. Fill in the parameters as listed above, and then execute the flow.
30. When you are finished viewing the stack in Trace Display, select File Exit/Continue Flow.
Chapter Summary
Upon completion of this chapter you should be able to answer the following questions:
Can you prepare data for input to Residual Statics?
What does it mean to be Surface Consistent?
How are Surface Consistent Statics calculated?
How are Trim Statics calculated?
Can you build a Model Stack to pilot some of the statics routines?
Chapter 14
Chapter Objectives
The advent of DMO in the 1980s greatly improved velocity calculation, migration of dipping reflectors, and noise reduction. DMO improves the data because it migrates each sample to its zero-offset position, collapsing the CMP smear induced by dipping reflectors. The DMO process itself is quite simple; however, preparation of the input data can be a little tricky for the new user. Upon completion of this chapter you should:
Be able to Common Offset Bin data for input to DMO
Be familiar with some of the input parameters to DMO
Understand the basic theory of what DMO does to the data
It is relatively simple to determine the center of the first common offset bin. For example, assume a survey with off-end shooting where the shot interval is 200 m, the group interval is 100 m, and the distance to the first group is 300 m. The shot-receiver offsets within this survey would be 300, 400, 500, 600, 700, 800, 900, .... Since the shot interval of 200 m yields a bin increment of 400 m, the center of your first offset bin would lie midway between 400 and 500, or at 450 m. Your DMO offset bins would then be 450 ± 200, 850 ± 200, 1250 ± 200, etc. See the diagram below:
300 400 500 600 700 800 900 1000 1100 1200 1300 1400 ...
450
850
1250
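The bin-center arithmetic above can be reproduced directly (values taken from the off-end example; all distances in meters):

```python
# Off-end example from the text: shot interval 200 m, group interval
# 100 m, 300 m to the first group.
shot_interval = 200.0
group_interval = 100.0
near_offset = 300.0

bin_increment = 2 * shot_interval                  # 400 m
# Offsets recorded: 300, 400, 500, ...; the first bin center sits midway
# between the second and third offsets (400 and 500).
first_center = near_offset + 1.5 * group_interval  # 450 m
centers = [first_center + i * bin_increment for i in range(3)]
print(centers)   # -> [450.0, 850.0, 1250.0]
```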
Skidded shots, or other irregular shooting geometry, may place two or more traces per CDP within a given DMO bin. Within the DMO process, traces within the same DMO bin having identical CDP numbers are stacked together prior to DMO.
For regular shooting geometry, the recommended approach is to use Trace Binning as a function of absolute offset, followed by F-K DMO on common offset ensembles. For datasets where the geometry is irregular and it is difficult to get good population of the offset planes, Ensemble DMO in the T-X Domain should be run on common shot ensembles.
Determine trace binning parameters
In this exercise, you will compute common offset bin centers using the off-end shooting assumptions. Recall that the first few shots on the line resembled off-end shooting, and the last few shots were the normal split-spread geometry. We will then look at several database and trace displays to check the binning parameters. Bin centers may be based on either the signed offset or the absolute value of the offset. If the absolute value of offset is used, traces with the same magnitude of offset are combined in the same ensemble. You also have the opportunity to vary the width of the offset bins (the bin increment) as a function of trace offset. This may become important for lines that were collected with a regular, but asymmetric, split-spread geometry.
1. Build the following flow to view the first offset bin in your survey:
Ensemble Stack/Combine
Type of operation: -------------------------Combine and Stack Input ensembles per output ensemble: ----------------------1 How are trace headers determined: ----------------Average Secondary key bin size: ------------------------------------------1. Maximum traces per output ensemble: ------------------215 Warnings if max traces/ensemble exceeded?: --------Yes Select PRIMARY Trace Order Header Word: ----OFB_NO Average the X and Y coordinates of primary key?: ---No Select SECONDARY Trace Order Header Word: -----CDP Output trace secondary key order: --------------Ascending Print results?:---------------------------------------------------------No
Pad Traces
Header word to use for padding: --------------------------CDP Spacing of header value: ------------------------------------------1 Remove traces?: ----------------------------------------------------No Explicitly dene the bounds of header values?: ------Yes First trace header value: --------------------------------------775 Last trace header value: ---------------------------------------989
Trace Display
Primary trace LABELING header entry: ----------OFB_NO
Secondary trace LABELING header entry: ------------CDP
2. Use Disk Data Input to sort the data first by offset bin number (from the Alternate List of header words) and then by CDP.
3. Use Ensemble Stack/Combine to combine offset bins, and stack any CDPs which are the same. For this first exercise, we will only display the first offset bin, so set the number of input ensembles per output ensemble to 1. We will use this process later to combine more than one offset bin for display.
4. Pad missing CDPs with Pad Traces. This process will insert a dead trace anytime the spacing between CDPs is greater than 1.
5. Set the primary and secondary labeling headers in Trace Display to OFB_NO and CDP.
6. Execute the flow.
Notice that there are very few live CDPs for this single offset bin. Since DMO operates in the offset domain, it would be desirable to have offset bins that contain live traces for nearly all CDPs. This is the reason we merge several offset bins prior to performing DMO.
7. Compute a first guess at the bin width and the center of the first bin. For this geometry, the average shot interval is 220 ft., so the first guess at a DMO offset bin width (using the off-end assumption) would be 220 * 2 = 440 ft. The near offset of this data is 27.5 ft. and the traces are 55 ft. apart, so the source-receiver offsets would be:
27.5 82.5 137.5 192.5 247.5 302.5 357.5 412.5 467.5 ...
220
660
For an offset bin width of 440 ft., the center of the first bin would lie halfway between 192.5 and 247.5, or at 220 ft.
8. Modify the flow to display the data with a bin width of 440 ft.
Notice that most of the CDP locations are filled by live traces. This is what we want for DMO binning. If you were to display the data with a bin width of 220 ft. (4 offset bins) you would see that a width of 220 ft. is adequate for the near offsets, but too small at the farther offsets. Remember that we have two conflicting goals in our DMO binning process:
Create continuous CDP coverage in the offset bins to eliminate DMO artifacts.
Create the greatest number of offset bins to enhance DMO velocity analysis.
11. Next we will view the DMO binning parameters in the database. First we need to transfer the AOFFSET header to the database: Database Edit Attribute Apply a Function...
12. Choose the abs function, and the OFFSET attribute. Type in AOFFSET GEOMETRY for the result attribute and then select OK. Remember, if OFFSET is used, for every positive offset bin that is defined, a negative offset bin is also created. For example, in split-spread shooting, the above example would contain DMO bins +220, -220, +660, -660. In marine shooting, where the first channel is farthest from the boat, all of the offsets are negative, and only the negative DMO bins would be populated.
13. To QC that the function worked, choose View Tabular... AOFFSET and then OK. You can now use MB2 to drag OFFSET from the DBTools window to the Tabular View window. Close the Tabular View when satisfied.
14. Permanently write AOFFSET to the database by selecting Database Commit.
15. Now let's go into Database XDB Database Display. Be patient: the more data you put into the database, the longer the initial delay when you first execute the database.
16. Plot two 3D XYGraphs. The first will be TRC: OFFSET, CDP, SRF and the second will be TRC: AOFFSET, CDP, SRF. The XYGraph using AOFFSET will look similar to the following:
Check the offset bin centers by looking at the graphs and verifying that each offset bin is evenly populated with CDPs. Also determine whether it is appropriate to combine the traces by absolute offset or whether the negative offsets should be processed separately from the positive offsets. A general rule is to process like offsets simultaneously. Use the Grid tool to analyze your bins on the display. Select Grid Display. This will generate new icons to rotate and move the grid, modify the cell size, and generate spider or histogram plots of the cells. Now select Grid Parameterise, and fill in the values as displayed.
The dy should be the length of the line in CDPs. The dx should be the offset bin spacing. Click on the Green Light icon. You should have 15 offset bins displayed. Click MB3 on the top left icon in XYGraph, and see how your CDPs increment within each bin. For this data, if OFFSET is used with a bin increment of 440, the near offset bin centers are -220 and +220, the far offset bin centers are -3300 and +6380, and we have 23 bins. If AOFFSET is used, the near offset bin center is 220, the far offset bin center is 6380, and we have 15 offset bins. Notice that the near offsets would only need a bin width of 220 ft. for continuous CDP coverage, but the far offsets need a bin width of 440 ft. If you change dy to 1, ny to 215, and Y origin to 775 you will see a tight grid with a cell size of one CDP.
From this display you can zoom in and QC that each offset bin has cells populated with continuous CDPs.
Assign DMO offset bins to the data
In this exercise you will offset-bin the data using the Trace Binning process, apply NMO in preparation for DMO, and QC the output common offset ensembles.
1. Build the following flow:
Trace Binning
Header entry to bin: ------------------------------------AOFFSET Binned header entry: -----------------------------------DMOOFF Binned entry format: -------------------------------------------Real Header entry bin centers: ---------------------------------------------------- 110-1430(220),1760-6160(440) Binned header entry values: ----------------------------------------------- 110-1430(220),1760-6160(440) Set OFFSET and AOFFSET headers to bin center: --Yes
Database/Header Transfer
Direction of transfer: ------------------------------------------------------------Load FROM Trace header TO database Number of parameters: --------------------------------------------1 First database parameter: -------------------------------------------------------TRC:Geometry:New - Enter DMOOFF First Header Entry: -------------------------------------DMOOFF
3. Enter the bin centers from 110 to 1430 with an increment of 220, and bin centers from 1760 to 6160 with an increment of 440, based on the absolute offset. The discussion at the beginning of this section describes how to calculate these numbers. Remember, our primary goals are to have the maximum number of bins yet still have 100% CDP coverage. If you wish, you can vary the bin spacing more along the line. Output the same values to a new header entry called DMOOFF. If you plan to further process the DMO gathers, you may want to set the OFFSET header word equal to the DMOOFF values.
4. Transfer the DMOOFF header to the database by setting a new database value DMOOFF (floating point).
5. Output the gathers to a new dataset CDP-dmooff/input to DMO.
6. Execute the flow.
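As a sketch of what the binning step does with these parameters, the bin-center lists and a nearest-center assignment can be written as follows. The `dmooff` helper is hypothetical, purely to illustrate the mapping from AOFFSET to a bin center:

```python
import numpy as np

# Bin-center lists from the exercise: 110-1430 by 220, then 1760-6160 by 440.
centers = np.concatenate([np.arange(110, 1430 + 1, 220),
                          np.arange(1760, 6160 + 1, 440)])

def dmooff(aoffset):
    """Nearest DMO bin center for an absolute offset (illustrative)."""
    return centers[np.argmin(np.abs(centers - aoffset))]

print(len(centers))    # -> 18 bin centers
print(dmooff(412.5))   # -> 330
```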
7. Now let's QC the DMO offset bin centers in the database. Select Database XDB Database Display 3D XYGraph TRC:AOFFSET,CDP,DMOOFF. You may have to use the mouse button help at the bottom left of the window to help you locate which database entry is DMOOFF, since its label will be an 8-digit UNIX-parsed name. When selected, click Display.
8. From the XYGraph menu, select Color Edit. From the color editor menu select File Open, and select the contrast.rgb color file. Each DMO offset bin will now be displayed in a different color.
Ensemble Stack/Combine
Type of operation: --------------------------------------Stack only How are trace headers determined?: --Average Secondary key bin size: -------------------------------1 Maximum traces per output ensemble: ------------------215 Warnings if max traces/ensemble exceeded?: --------Yes Select PRIMARY Trace Order Header Word: ---DMOOFF Average the X and Y coordinates of primary key?: ---No Select SECONDARY Trace Order Header Word: -----CDP Output trace secondary key order: --------------Ascending
Pad Traces
Header word to use for padding: --------------------------CDP Spacing of header value: ------------------------------------------1 Remove traces?: ----------------------------------------------------No Explicitly dene the bounds of header values?: ------Yes First trace header value: --------------------------------------775 Last trace header value: ---------------------------------------989
Trace Display
Primary trace LABELING header entry: ---------DMOOFF
Secondary trace LABELING header entry: ------------CDP
10. In Disk Data Input, input the gathers sorting on DMOOFF:CDP. In Trace Display, annotate DMOOFF and CDP. The Ensemble Stack/Combine step should be used when DMOOFF has been built from AOFFSET. It is used here to stack all like-numbered CDPs within the same offset panel.
Use Pad Traces to insert a dead trace whenever the spacing between CDPs is greater than 1. This display will show how many CDP traces exist per bin and will also show any gaps or unpopulated CDPs in the offset plane.
11. Execute the flow.
12. Are the trace gaps in the DMO offset panels reasonable? If not, you will need to adjust your offset binning parameters and re-run flow 8.2.
DMO
Both F-K and integral DMO methods are available in ProMAX. Common Offset F-K DMO works in F-K space using the Stolt stretch technique to account for vertical velocity variations. The data should be binned into common offsets in preparation for this process. Ensemble DMO in the T-X Domain is a Kirchhoff implementation, which can be applied to arbitrary ensembles of input traces, such as common shot gathers, common receiver gathers, or common offset data. This process may be run in the shot mode for datasets with irregular shooting geometries instead of the F-K DMO. In a typical processing sequence, DMO follows statics and NMO. Since DMO may enhance stacking velocity picks, try the following sequence.
NMO
DMO
Inverse NMO
Velocity Analysis
Re-iterate this process until the difference between input and output velocities in velocity analysis is small.
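The NMO step in this loop removes the hyperbolic moveout t(x) = sqrt(t0^2 + x^2/Vrms^2); inverse NMO restores it. A sketch of the traveltime arithmetic, with illustrative numbers (not course data):

```python
import numpy as np

def nmo_time(t0, offset, vrms):
    """Recorded two-way time for a flat reflector: hyperbolic moveout."""
    return np.sqrt(t0 ** 2 + (offset / vrms) ** 2)

t0 = 1.0           # zero-offset time, s (illustrative)
offset = 1500.0    # source-receiver offset, m
vrms = 2500.0      # stacking (RMS) velocity, m/s

t = nmo_time(t0, offset, vrms)
moveout = t - t0   # the shift NMO removes and inverse NMO restores
print(round(float(t), 4))   # -> 1.1662
```

If the picked velocity is too low, the computed moveout is too large and the corrected gather over-flattens (curves upward); too high, and it under-flattens. That residual curvature is what the repicking pass after DMO is meant to remove.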
Trace Display
2. In Disk Data Input, input the gathers using DMOOFF as the primary sort key and CDP as the secondary key. 3. Select Common Offset F-K DMO parameters.
Input the same velocity field used to NMO correct the input gathers.
4. In Disk Data Output, output the DMO-applied data.
5. In Trace Display, label DMOOFF and CDP.
6. Execute the flow, and view the common offset planes after DMO.
7. After viewing a few of the offset planes after DMO, select File Exit/Continue Flow. Remember to select the continue flow option; otherwise the job will stop without outputting the entire dataset.
8. Optional: After the DMO job finishes, build a flow that sorts the data to CDP and views the gathers. Toggle off the first three processes in the above flow. Change the sort to CDP in Disk Data Input, change the Trace Display ensembles per screen to 215, and execute. In a typical processing sequence you would apply inverse NMO and repick velocities.
CDP/Ensemble Stack
----Default all parameters----
Trace Display
Primary trace LABELING header entry: ---------------None
Secondary trace LABELING header entry: ------------CDP
2. In Disk Data Input, input your DMO data in CDP sort order.
3. Select an output dataset name.
4. Execute the flow, and examine the stack. If you notice any large artifacts from the DMO process, they probably resulted from a bad trace. You could either go back and kill the bad trace, or apply an AGC prior to DMO.
5. Use your flow 11-Compare Stacks to examine the differences between the DMO stack and the regular stack.
6. Optional: Replace Common Offset F-K DMO with Ensemble DMO in the T-X Domain. Build and execute a flow to compare the output gathers and stacks. The Ensemble DMO in T-X Domain process will output as many traces per CDP per bin as were input by producing copies of the output traces.
Chapter Summary
Upon completion of this chapter you should be able to answer the following questions:
Can you Common Offset Bin data for input to DMO?
Are you familiar with the input parameters to DMO?
Do you understand the basic theory of DMO?
Chapter 15
Chapter Objectives
To further clean up and optimize the stack, some type of poststack signal enhancement is almost always applied. This chapter explores some of ProMAX's techniques for reducing noise and enhancing signal in poststack data. Upon completion of this chapter you should:
Be familiar with F-X Decon and Dynamic S/N Filtering techniques
Understand how to use Trace Math to enhance stacks
Signal Enhancement
In this exercise, you will compare the results of your residual statics stack with stacks that are processed with signal enhancement techniques.
Reproduce Traces
Trace grouping to reproduce: --------------------------All Data Total Number of datasets: ----------------------------------------4
ELSEIF
SPECIFY trace list: ---------------------------------------------------2
F-X Decon
TYPE of filter: ------------------------------------Wiener Levinson Percentage of white noise: ---------------------------------------0. Horizontal window length: -------------------------------------50 Number of filter samples: -----------------------------------------5 Time window length: -----------------------------------------1000 Time window overlap: ------------------------------------------100 F-X filter start frequency: -----------------------------------------3 F-X filter end frequency: -----------------------------------------90 Re-apply trace mute after filter?: ---------------------------Yes
ELSEIF
Trace Display Label
BLEND
F-X Decon
ELSEIF
Trace Display Label
Dynamic S/N Filtering
ENDIF
Disk Data Output
Automatic Gain Control
Trace Display
Disk Data Input
Reproduce Traces
IF
Trace Display Label
ELSEIF
Trace Display Label
F-X Decon
ELSEIF
SPECIFY trace list: ---------------------------------------------------3
BLEND
Ratio of processed/original: ----------------------------------1:2
F-X Decon
----Use same parameters as the previous F-X Decon----
ELSEIF
SPECIFY trace list: ---------------------------------------------------4
Trace Display
Primary trace LABELING header entry: ---------------None Secondary trace LABELING header entry: ------------CDP
2. In Disk Data Input, input your residual statics stack.
3. Make four copies of your stack with Reproduce Traces and choose a trace grouping of All Data.
4. Set up an IF-ELSEIF-ENDIF conditional with Trace Display Labels to easily compare the results of the different signal enhancement tools with your original stack. Enter the Repeat number to pass through this portion of the flow. Please refer to the chapter on parameter analysis if you are not familiar with the IF-ENDIF conditional logic.
5. Select F-X Decon and Dynamic S/N Filtering parameters. Refer to the online helpfiles for parameter selection of these processes. Note that the BLEND function applies to the process immediately following it.
6. Output the four copies of the dataset with Disk Data Output. This dataset will be used in the next flow.
7. Execute the flow. View the 2D filtered data, and compare the stacks using the animation tool. Which dataset looks the most mixed?
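The prediction idea behind F-X Decon can be sketched with a one-coefficient toy: transform each trace to the frequency domain, then at each frequency predict every trace spectrum from its neighbor with a single least-squares complex coefficient. The real process uses a longer Wiener-Levinson filter in sliding spatial and temporal windows; this sketch (with an invented function name) only shows the principle that laterally coherent energy is predictable and random noise is not:

```python
import numpy as np

def fx_decon_1coef(data):
    """Toy f-x prediction filter for an (ntraces, nsamples) panel: at each
    frequency, fit one complex coefficient predicting every trace spectrum
    from its neighbor, and output the predicted panel."""
    nx, nt = data.shape
    spec = np.fft.rfft(data, axis=1)            # (ntraces, nfreq)
    out = np.empty_like(spec)
    for f in range(spec.shape[1]):
        x = spec[:, f]
        prev, cur = x[:-1], x[1:]
        denom = np.vdot(prev, prev)             # prev^H prev
        a = np.vdot(prev, cur) / denom if abs(denom) > 0 else 0.0
        pred = np.empty_like(x)
        pred[0] = x[0]                          # no neighbor to predict from
        pred[1:] = a * prev                     # laterally predictable part
        out[:, f] = pred
    return np.fft.irfft(out, n=nt, axis=1)

# A perfectly flat event is predictable trace-to-trace, so it passes through.
wav = np.exp(-((np.arange(64) - 32) / 3.0) ** 2)
panel = np.tile(wav, (10, 1))
filtered = fx_decon_1coef(panel)
```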
Trace Math
Trace Math will allow you to add, subtract, multiply or divide adjacent traces, or apply a scalar to the traces. We will use this process to subtract stacks created using different processing techniques.
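The Trace/Trace subtract mode amounts to a sample-by-sample difference of corresponding traces; with toy numbers (not course data):

```python
import numpy as np

# Toy values standing in for two stacks (2 traces x 3 samples each).
stack_fx   = np.array([[0.9, 1.8, 0.1],
                       [0.0, 2.1, 0.4]])
stack_orig = np.array([[1.0, 2.0, 0.0],
                       [0.1, 2.0, 0.5]])

# Subtract Traces: ideally only the noise removed by F-X Decon remains.
difference = stack_fx - stack_orig
```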
Use Trace Math to view differences between stacks
1. Create the following flow:
Trace Math
MODE of operation: ---------------------------------Trace/Trace TYPE of trace/trace operation: ------------Subtract Traces Honor ensemble boundaries: ----------------------------------No
Trace Display
Primary trace LABELING header entry: ---------------None
Secondary trace LABELING header entry: ------------CDP
2. In Disk Data Input, input the file you just created, and select the F-X Decon copy and the Original Input copy.
3. In Trace Math, select Trace/Trace for the mode of operation, and Subtract Traces.
4. In Trace Display Label, indicate which stacks have been subtracted.
5. Execute the flow.
A display representing the difference between the two stacks appears on your screen. Ideally, all of the energy in this difference display would be noise that was removed by the F-X Decon. The presence of actual signal in this display might indicate a need to try different parameters in the F-X Decon.
6. Experiment with the different trace scaling methods from within Trace Display.
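Trace/Trace subtraction itself is just a sample-by-sample difference between two stacks of identical geometry. A minimal numpy sketch (illustrative only; ProMAX operates on trace ensembles internally):

```python
import numpy as np

def trace_subtract(stack_a, stack_b):
    """Trace/Trace subtraction: difference section between two stacks
    with identical geometry (n_traces x n_samples)."""
    a = np.asarray(stack_a, dtype=float)
    b = np.asarray(stack_b, dtype=float)
    assert a.shape == b.shape, "stacks must have matching trace counts and lengths"
    return a - b

fx_stack = np.array([[1.0, 2.0], [3.0, 4.0]])
orig_stack = np.array([[1.0, 1.5], [3.0, 5.0]])
diff = trace_subtract(fx_stack, orig_stack)  # ideally only removed noise remains
```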
Chapter Summary
Upon completion of this chapter you should be able to answer the following questions:
Are you comfortable with F-X Decon and Dynamic S/N Filtering?
Are you comfortable with Trace Math?
Chapter 16
Chapter Objectives
Velocity Modeling
Preparation of your velocity field for migration is a crucial step in the imaging process. You may need to smooth the field in space and/or time and change the field type from a stacking (RMS) field to an interval velocity field. You will also need to convert your velocities from the floating processing datum to the final flat datum. In this chapter, we will discuss how to edit and modify velocity fields using two different velocity tools. Upon completion of this chapter you should:
Be able to use the Velocity Viewer/Point Editor
Be able to manipulate velocities into various formats
Smooth RMS velocities and convert to interval velocity
In this exercise, you will edit and smooth a velocity field that was created with the Velocity Analysis tool. You will then convert the RMS velocities to interval velocities in time. These edited velocity fields will be used in the chapter on migrations later in the manual. Remember that FK Migrations need Vrms(t,x), Phase Shift Migrations need Vint(t), and Finite Difference Migrations need Vint(t,x).
[Velocity Viewer display: Function to Edit (pink), Function to Compare (blue), Interval Velocity]
4. Edit velocity control points. To edit the velocity field, you must edit the control points that define the velocity. A velocity control point generally consists of a vertical group (or function) of velocity-time pairs at a certain CDP location. To view these control points, click on the Edit Vel Function icon and move your mouse into the velocity field. Move the mouse pointer from location to location and watch as the blue function changes in the edit window. You will also notice that the function nearest the mouse pointer changes from a solid line to a dashed line. Click MB2 near one of the locations to freeze the function in the edit window; the function then does not change as the mouse moves across the velocity locations.
Click MB1 near a different velocity location. You should now have a blue line and a pink line in the edit window. Click the Edit icon on the right of the edit window, and follow the mouse button help to edit the pink function. After editing, the velocity field can be updated by clicking on the Update button at the top of the edit window.
5. Once your velocity field has been edited to your satisfaction, apply a general smoothing. From the menu bar select Modify → Smooth Velocity Field. This brings up the Smoothing Parameters window.
Smoothing Parameters window
6. Enter the smoothing parameters as indicated above, and select OK. Examine the results of the smoothing process. If you want to undo the smoothing and try again with different parameters, select Modify → Undo last change, then re-smooth the velocities.
7. Once your velocity field is sufficiently smooth, select File → Save table to disk. This saves the Smoothed for FK Mig file to disk for use in the FK Migration.
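Conceptually, smoothing a gridded velocity field is a moving average over time and CDP. A generic sketch, assuming a simple boxcar smoother (ProMAX's smoothing algorithm and parameter names may differ):

```python
import numpy as np

def smooth_velocity_field(vel, n_cdp=3, n_time=5):
    """Boxcar-average a gridded velocity field (time samples x CDPs).
    A generic moving-average sketch, not the ProMAX implementation."""
    v = np.asarray(vel, dtype=float)
    out = np.empty_like(v)
    ht, hc = n_time // 2, n_cdp // 2
    nt, nc = v.shape
    for it in range(nt):
        for ic in range(nc):
            # Clip the averaging window at the edges of the grid.
            t0, t1 = max(0, it - ht), min(nt, it + ht + 1)
            c0, c1 = max(0, ic - hc), min(nc, ic + hc + 1)
            out[it, ic] = v[t0:t1, c0:c1].mean()
    return out
```

Larger windows give a smoother field at the cost of losing genuine lateral and vertical velocity detail, which is why the exercise encourages undoing and re-smoothing with different parameters.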
8. After you have saved your smoothed velocity field, you may compute and display interval velocities by selecting Modify → Convert RMS to Interval Velocity → Smoothed gradients Dix equation. If there are large anomalies in the interval velocity field, you may need to select Modify → Undo last change, perform more editing on the RMS field, and then convert to interval velocities again. You can also directly edit and smooth this interval velocity field in the same manner as described above for the RMS velocities.
9. Once you are satisfied with your interval velocity field, select File → Save table to disk and exit. This will save the Converted Stacking Vels file to disk for use in the FD Migration.
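The RMS-to-interval conversion is based on the Dix equation. A sketch of the basic (unsmoothed) form, where each interval velocity comes from adjacent RMS picks:

```python
import numpy as np

def dix_interval(times, vrms):
    """Convert an RMS velocity function to interval velocities with the
    Dix equation: Vint_k^2 = (Vrms_k^2*t_k - Vrms_{k-1}^2*t_{k-1}) / (t_k - t_{k-1}).
    Basic form only; the 'smoothed gradients' variant in the menu differs."""
    t = np.asarray(times, dtype=float)
    v = np.asarray(vrms, dtype=float)
    vint = [v[0]]  # first interval velocity equals the first RMS value
    for k in range(1, len(t)):
        num = v[k] ** 2 * t[k] - v[k - 1] ** 2 * t[k - 1]
        vint.append(np.sqrt(num / (t[k] - t[k - 1])))
    return np.array(vint)
```

Because the numerator differences two large squared terms, small bumps in the RMS picks are amplified into large interval-velocity anomalies; this is why editing and smoothing the RMS field before conversion matters.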
Velocity Manipulation
Velocity Manipulation is used to convert one type of velocity to another, datumize velocities, apply a percentage, and/or smooth velocities. In this section, you will output three new velocity functions. The new functions will be a RMS field shifted to final datum, an interval velocity field shifted to final datum, and a single average interval velocity function shifted to final datum.
Shift smoothed RMS velocities to final datum
In this exercise, you will shift your smoothed RMS velocity field to final datum for later use in F-K migration.
Shift interval velocities to final datum
In this exercise, you will shift your interval velocities to final datum for later input to FD migration.
1. Edit your flow to output interval velocities at final datum:
Output a single interval velocity function
In this exercise you will output a single interval velocity function in time to be used in Phase Shift migration.
1. Edit your flow to output a single average function:
Chapter Summary
Upon completion of this chapter you should be able to answer the following questions:
Are you comfortable with the Velocity Viewer/Point Editor?
Can you manipulate velocities into various formats?
Chapter 17
PostStack Migration
The ProMAX suite of 2D migration tools includes pre- and poststack time and depth migration and migration velocity analysis. The available poststack migrations are of the F-K, Finite Difference, Reverse Time, Phase Shift, and Kirchhoff types. The goal is to migrate the stack section with the most appropriate poststack migration process. To aid in this selection, this chapter includes a brief description of the processes. The Reference Manual and cited references give further detail.
Chapter Objectives
PostStack Migration
The ProMAX suite of 2D migration tools includes pre- and poststack time and depth migration and migration velocity analysis. The goal here is to migrate the stack section with the most appropriate poststack migration process. To aid in this selection, this chapter includes a brief description of the processes. Upon completion of this chapter you should:
Understand tapering and other migration parameters
Be familiar with running FK, Phase Shift, and FD Migrations
ProMAX menu names: Memory Stolt F-K, Phase Shift, Fast Explicit FD Time, Steep Dip Explicit FD Time, Kirchhoff Time, Reverse Time, T-K, Explicit FD Depth, Kirchhoff Depth

Migration        Domain   Velocity     Lateral Velocity   Steep Dips
F-K              Time     VRMS(x,t)    Poor               Poor
Phase Shift      Time     VINT(t)      None               Good
FD               Time     VINT(x,t)    Fair               Good
FD (70 deg)      Time     VINT(x,t)    Fair               Good
FD (50 deg)      Time     VINT(x,t)    Fair               Good
Kirchhoff        Time     VRMS(x,t)    Fair               Good
Reverse Time     Time     VINT(t)      None               Good
FD Imp.          Depth    VINT(x,z)    Good               Good
Eikonal          Depth    VINT(x,z)    Fair               Good
Max.Amp.         Depth    VINT(x,z)    Good               Good
Mult. Arr.       Depth    VINT(x,z)    Excel.             Excel.
NOTE: These tests were run on an IBM 370 RS6000 system. Your times will depend on your specific environment, workload, dataset, and processing parameters.
Tapering
Tapering is automatically applied to samples at the bottom and edges of the seismic section prior to migration. This prevents migration artifacts due to the abrupt truncation at the bottom of the input section (see diagram below). The magnitude of the edge taper should normally increase with depth, as migration artifacts originating deeper in the section tend to move a longer distance.
A Hamming taper is used, which consists of a cosine weighting that varies from 100% to 8% over the length of the horizontal taper. The bottom taper goes from 100% to 0%. In the migration processes, there is a parameter that asks if you want to change the default tapering. This does not turn off the taper; instead, it allows you to change the tapering values. If you have steeply dipping events near the edge of your data, you may want to pad traces rather than taper the edges.
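The edge taper described above can be sketched as a cosine weighting that falls from 100% to an 8% floor, in the spirit of a Hamming window (the taper length and the way it is applied here are illustrative assumptions):

```python
import numpy as np

def edge_taper(n, floor=0.08):
    """Hamming-style half-taper: cosine weights falling from 1.0 (100%)
    to `floor` (8%) over n samples."""
    i = np.arange(n)
    return floor + (1.0 - floor) * 0.5 * (1.0 + np.cos(np.pi * i / (n - 1)))

# Apply to the last 10 traces of a toy section (samples x traces).
section = np.ones((100, 50))
w = edge_taper(10)
section[:, -10:] *= w
```

A bottom taper would be the same idea with `floor=0.0`, applied down the sample axis instead of across traces.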
Poststack Migration
At this point, you should have your best stacked dataset with statics and velocities applied, a pre-processed input velocity parameter table (edited, smoothed, shifted to datum), and an idea of the types of migrations you would like to run.
Apply FK migration
In this exercise, you will run an FK migration on your data.
Bandpass Filter
Ormsby filter frequency values: --------------------- 5-10-60-70
Trace Display
Primary trace LABELING header entry: ------------------NONE
Secondary trace LABELING header entry: -----------------CDP
2. In Disk Data Input, input your best (DMO) stack dataset.
3. Set FK migration parameters. Select your smoothed velocity field at final datum. Set the velocity scaling factor. Normal ranges are 85-100 percent.
4. Execute the flow.
5. After examining your migration in Trace Display, select File → Exit/Continue Flow.
Apply Phase Shift Migration
1. Copy your previous flow, and add Phase Shift Migration:
Bandpass Filter
Ormsby filter frequency values: ------------------------ 5-10-60-70
Trace Display
Primary trace LABELING header entry: ---------------------NONE
Secondary trace LABELING header entry: --------------------CDP
2. In Disk Data Input, input your best (DMO) stack dataset.
3. Select Phase Shift migration parameters.
Select your smoothed velocity field at final datum. Set the velocity scaling factor. Normal ranges are 85-100 percent.
4. Execute the flow.
5. After examining your migration in Trace Display, select File → Exit/Continue Flow.
Bandpass Filter
Ormsby filter frequency values: ------------------------ 5-10-60-70
Trace Display
Primary trace LABELING header entry: ---------------------NONE
Secondary trace LABELING header entry: --------------------CDP
2. In Disk Data Input, input your best (DMO) stack dataset.
3. Select FD migration parameters. Select your interval velocity field at final datum. Choose to retain the input sample rate.
4. Execute the flow.
5. After examining your migration in Trace Display, select File → Exit/Continue Flow.
Compare Migrations
1. Use your previous flow, 5.3-Compare Stacks, to compare the various migration datasets to the input stack.
Chapter Summary
Upon completion of this chapter you should be able to answer the following questions:
Do you understand tapering and other migration parameters?
Are you comfortable running FK, Phase Shift, and FD Migrations?
Appendix 1
OPTIONS
* Pre-Initialization
* Full Extraction
* From Field Notes and Survey
QUESTIONS
* Does Shot and Receiver X, Y, and station information exist in the headers and do you want to use it?
* Do you want to minimize the number of times that you have to read the data?
Table Diagram
Question: Is shot and receiver station, and x,y information in the headers; do you want to use it?
  Yes → Full Extraction
  No  → (see next question)
Question: Do you want to minimize the number of times to read the data?
  Yes → Partial Extraction
  No  →
Transferring the Database to Trace Headers
When the database is completed, the information contained in it is transferred to trace headers. The following question determines how to match a trace in the data file to a trace in the database:
Question: Was a Full or Partial Extraction used to create the database and a new output file written?
  No  → Inline Geom Header Load by Chan and other trace header words.
  Yes → Inline Geom Header Load by valid trace number.
Inline Geom Header Load is the main program used to assign geometry values to individual trace headers from the OPF database files. One of the main issues related to this geometry assignment procedure is to define how a trace in a data file will be identified in the Trace Ordered Parameter file. One of the options is to use a specific trace header word called the "valid trace number". In order to utilize the "valid trace number" we will have to spend some time discussing its origin and how it can be used. Another program that may be used in the geometry assignment procedure is called Extract Database Files. We will see that this program is one of the ways that the "valid trace number" can be generated by running it in either the Partial or Full extraction modes. Geometry Header Preparation is another program that may be selected in the geometry assignment procedures. This program can be used for a variety of different purposes. We will look specifically at how it can be used when dealing with the problem of duplicate Field File Identification Numbers.
Steps Performed by Inline Geom Header Load
Inline Geom Header Load is the program that populates the trace headers of an input data file with the geometry information stored in the database. The outcome from running this program is to have a database and a data file that "match".
This means that every trace in the output data file exists in the database and there is a one-to-one correspondence of all values in the trace header to those in the database. After a successful run, each trace will also be assigned the "valid trace number" if it was not pre-assigned using Extract Database Files.
There are two major options in this program pertaining to how to identify a trace in the input data file with a trace in the database. These options are:
1. to read the "valid trace number" from the input trace header, or
2. to read the recording channel number (automatic) and 1 or 2 trace header words that can uniquely identify this trace as having originated from a unique shot (SIN) that exists in the shot database.
Once a trace in a data file has been identified in the Trace OPF, the information in all of the OPFs for that trace is copied to the trace header.
Valid Trace Numbers
Before we proceed, let's make sure that we understand the idea of the "valid trace number". Understanding this will help us decide on the "best" course of action for our data. The "valid trace number" is simply a ProMAX trace header word. Every trace in the database is numbered from 1 to N, where N is the total number of individual traces in the database. This is a unique number for each trace in the line or 3D project. A "valid trace number" combined with matching geometry is a flag that will allow fast random access sorting of disk datasets. Every trace in the TRC database is assigned to a single SIN (shot), SRF (receiver), and CDP. Every trace has an individual shot-to-receiver offset distance, an individual midpoint X and Y location, and many other values that are single numbers that may or may not be different for every trace.
Inline Geom Header Load matches the current trace being processed to the database and then copies all of the trace dependent values as well as the other order values to the trace header. The last thing that happens is that the traces are "stamped" as matching the database.
Valid Trace Number Origin
Where does the "valid trace number" trace header word come from? Luckily, the answer to this is very simple. The Extract Database Files program writes this trace header word after it reads and counts a trace that it is entering into the TRC database. In this case the "valid trace number" is pre-assigned. If it is not pre-assigned, the Inline Geom Header Load process will create it after it determines which trace in the database corresponds to a trace in a data file.
The "valid trace number" is a unique number for every trace and is stored in the trace header as TRACE_NO. This trace header word continues to exist ONLY if you write a new trace file after the extraction procedure. A common question that arises concerns the decision to pre-assign the "valid trace number" using Extract Database Files or to rely on the alternate header identification on the first read of the input data. You may consider using Extract Database Files if there is sufficient information in the trace headers that can be transferred to the database, which will save time and increase accuracy of the geometry definition process. The extraction may be run in either the partial extraction or full extraction modes depending on what information is available in the trace headers of the input data.
Steps Performed By Extraction
The steps performed by the extraction options are:
Pre-Geometry Initialization (or partial extraction), which is sometimes used when no receiver information exists in the incoming headers. Partial Extraction counts each of the following:
the number of traces encountered
the number of shots encountered
the number of traces per shot
and then writes the trace count number and SIN to the trace header
Full Extraction is used when you want to extract the shot and receiver location and coordinate information from the incoming headers. Full Extraction counts each of the following:
the number of traces encountered
the number of shots encountered
the number of traces per shot
the number of receivers encountered
the number of traces per receiver
and then writes the trace count number and SIN to the trace header
IF you have run the extraction in either mode, AND written a new trace data file, AND have not altered the number of traces in the database, you now have valid trace numbers in the headers of the output data set which you can use to map a trace in a data file to a trace in the database. This mapping will be performed by Inline Geom Header Load after the database is completed.
Between Extraction and Geom Load After running Extract Database Files in either mode there are many steps that need to be completed prior to running the Inline Geom Header Load. The extraction only partially populates the database. More work will generally need to be done in the Spreadsheets to input the remaining information. After the Spreadsheets are complete, the next step would be to complete the CDP binning procedures and then finalize the database. With the database complete, you can continue with the next step of loading the geometry information from the databases to the trace headers. You may elect to address a trace by its "valid trace number"
assigned during the extraction or you may read a combination of trace headers to identify the trace.
Geometry Load Procedures
For the first option, Inline Geom Header Load operates as follows: 1) it identifies the TRACE_NO of the incoming trace and finds that trace in the TRC database; 2) it copies the appropriate TRC order values to the trace header; and then 3) it finds the shot, receiver, CDP, inline, crossline, and offset bin for that trace. The appropriate values from those orders are then copied to the trace headers as well. In the second option, Inline Geom Header Load does not know exactly which TRACE_NO it is looking for. It does know which channel and shot to look for based on the header word(s) that you selected. Given that this mapping is unique, the program now knows which SIN and CHAN to look for in the TRC database. Once the entry is found, the TRACE_NO is copied to the headers and the steps outlined in the first option are performed. Again, the key to the second option is that you need to identify which shot a trace came from by a "unique" combination of header words for that shot.
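The two matching options can be sketched with hypothetical in-memory stand-ins for the TRC ordered parameter file (the dictionary names and fields here are illustrative, not the actual OPF layout):

```python
# Hypothetical stand-in for the TRC ordered parameter file, keyed by
# valid trace number (TRACE_NO). Fields are illustrative.
trc_by_trace_no = {
    1: {"SIN": 10, "CHAN": 1, "CDP": 501, "OFFSET": -100.0},
    2: {"SIN": 10, "CHAN": 2, "CDP": 502, "OFFSET": -150.0},
}
# Reverse index for option 2: unique (SIN, CHAN) pair -> TRACE_NO.
trc_by_sin_chan = {(rec["SIN"], rec["CHAN"]): n
                   for n, rec in trc_by_trace_no.items()}

def load_geometry(header):
    """Option 1: match by TRACE_NO if present; option 2: fall back to the
    (SIN, CHAN) pair that uniquely identifies the trace's shot and channel."""
    trace_no = header.get("TRACE_NO")
    if trace_no is None:
        trace_no = trc_by_sin_chan[(header["SIN"], header["CHAN"])]
    header.update(trc_by_trace_no[trace_no])
    header["TRACE_NO"] = trace_no  # stamp the trace as matched
    return header
```

Either path ends the same way: the database record is copied onto the trace header and the trace is stamped with its valid trace number.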
Manual Input
Builds TRC and SIN OPFs only (Pre Geom Init = yes)
This option may be appropriate for relatively small datasets which have only FFID and CHAN in the input trace headers. This option should be used when reading the field data and writing the data to disk for the first time. In so doing, information such as FFID, number of shots, and number of channels is written to the database and is then available when the geometry is completed. Selecting this option will also stamp the output dataset with valid trace numbers, which allows you to process with trace headers only and overwrite the dataset with updated geometry from the database files. This is an important concept for the Inline Geom Header Load process. In the following example, you will assume that only the FFID and recording channel number exist in the incoming trace headers. This information will be extracted using the perform pre-geometry database initialization option in Extract Database Files.
Pre Geometry Initialization flow
1. Make a new line called from pre-initialization.
2. Build the following flow:
SEGY Input
Type of storage: --------------------------------------- Disk Image
Enter DISK file path name: --------------/misc_files/2d/segy_0_value_headers
MAXIMUM traces per ensemble: ---------------------------120
Remap SEGY header values: -------------------------------- NO
Receivers: identify by STATIONS.
5. In Extract Database Files, select Yes for the option Pre-geometry extraction. This initializes the SIN and TRC domains of the Ordered Parameter Files, stamps the dataset with valid trace numbers, and allows for the use of overwrite mode when performing the Inline Geom Header Load step later.
6. In Disk Data Output, enter the name for a new output file, such as Shots-raw data.
7. Execute the flow.
After the Flow Completes
8. Exit the flow building level and select Database from the global command line.
9. Check the OPFs, verifying the number of records in the dataset, the number of channels/record, and the FFID range. The only OPF files that should exist are LIN, SIN, and TRC. If SRF exists, this means that you identified traces for receivers by coordinates. You will also find that the SRF OPF has 1 value in it.
Complete the Spreadsheet
In this sequence, the next steps would be to complete the Sources, Receivers and Patterns Spreadsheets and perform the CDP binning similarly to the sequence used in Chapter 1: Geometry Assignment.
Load Geometry to Trace Headers
1. If the geometry in the database looks good, build the following flow:
3. In Inline Geom Header Load, match the traces by their valid trace numbers. Since the traces were read and counted with Extract Database Files, you have a valid trace number to identify a trace. You have binned all traces; therefore, do not drop any traces. Unless you have a problem, there is no need for verbose diagnostics.
4. In Disk Data Output, output to the same dataset as specified in Disk Data Input. We will use the overwrite option in conjunction with trace-header-only processing in the Disk Data Input.
5. Execute this flow.
In the Extract Database Files path, the Inline Geom Header Load process operates on a sequential trace basis, and includes a check to verify that the current FFID and channel information described in the OPFs matches the FFID and channel information found on each trace of each ensemble. The Inline Geom Header Load process will fail if these numbers do not correspond. You must then correct the situation by changing the geometry found in the OPFs, or possibly by changing the input dataset attributes.
Appendix 2
Supergathers
Supergathers are ensembles which were created by combining two or more regular CDPs to form a single ensemble. Supergathers are commonly used for velocity analysis and quality control, post-NMO mute definition, and any other processes which might benefit from reduced spatial separation between traces in a CDP gather. ProMAX incorporates the functionality to create supergathers in a number of analysis and quality control processes. Examples include Velocity Analysis, Interactive Velocity Analysis, and Velocity Quality Control. This exercise is useful to help understand the mechanism employed in creating supergathers in these various processes.
Create Supergather
Creating supergathers is really a matter of redefining a trace flag which establishes the end-of-ensemble; in other words, how the traces are grouped. The header word is called the End-of-ensemble flag (END_ENS) and its value is either 1 or 0 (one or zero). When END_ENS = 1, this alerts any process that the last trace in an ensemble, such as a shot record, CDP, or offset gather, has been reached. This way, if a process redefines the value for the END_ENS header word, then it is able to regroup the traces. The following exercise will illustrate how you may control trace grouping with a process called Ensemble Stack/Combine.
1. Build a simple flow to input and display three CDPs.
3. In Trace Display, set the Number of ensembles/screen large enough to allow all three CDPs on the screen at one time. Also, label primary and secondary header entries as CDP and OFFSET in the Trace Display menu.
4. Execute the flow. Your screen should look similar to the following:
5. Use the Header icon to display several trace headers. The value of the End-of-ensemble flag (END_ENS) header word can change from trace to trace. Be sure to check the last trace in any one of the ensembles. What is different?
6. Activate Ensemble Stack/Combine in your flow to create one super-CDP. Select Combine only for the Type of Operation, inputting three ensembles per output ensemble. This option will only reset the END_ENS flag for the first three CDPs so that the result is one single CDP. Also in this menu, select primary and secondary header words as OFFSET.
Trace Display
7. Execute the flow and compare your results to the original. It should look similar to the following:
You might use this type of operation to create a super-CDP with better offset coverage prior to Velocity Analysis.
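The END_ENS regrouping that Combine only performs can be sketched as follows, with each trace reduced to a dict holding just its flag (an illustrative model, not the ProMAX internals):

```python
def combine_ensembles(traces, n_combine):
    """Reset END_ENS so every n_combine input ensembles become one.
    Each trace is a dict with an END_ENS flag (1 = last trace of ensemble)."""
    seen = 0
    for tr in traces:
        if tr["END_ENS"] == 1:
            seen += 1
            if seen < n_combine:
                tr["END_ENS"] = 0  # swallow this ensemble boundary
            else:
                seen = 0           # keep this boundary: ensemble ends here
    return traces

# Three 2-trace CDPs combined into one super-CDP.
traces = [{"END_ENS": f} for f in (0, 1, 0, 1, 0, 1)]
combined = combine_ensembles(traces, n_combine=3)
flags = [t["END_ENS"] for t in combined]  # [0, 0, 0, 0, 0, 1]
```

Only the sample data stays untouched; regrouping is purely a header operation, which is why downstream processes see one ensemble instead of three.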
Trace Display
2. Modify Ensemble Stack/Combine to use the Combine and Stack option for the type of operation. After making this selection you will see a new parameter called Secondary Key Bin Size which was previously hidden. Set this value to 350.
3. Execute the flow. Notice the difference between this display and your last. Why are they different this time?
One observation that should jump out is that there are fewer traces on the screen. This is due to the summation of adjacent traces performed by the Stack portion of the Combine and Stack option. The summation is dependent on which header word you select as a secondary key, and by the secondary key bin size. You might use this type of operation to reduce the amount of data going into a Prestack Migration.
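The Combine and Stack behavior can be sketched as binning traces by a secondary key and summing each bin (the field names and bin logic here are illustrative):

```python
from collections import defaultdict
import numpy as np

def combine_and_stack(traces, bin_size=350.0):
    """After combining CDPs, sum traces whose secondary key (OFFSET)
    falls in the same bin; a sketch of the Combine and Stack behavior."""
    bins = defaultdict(list)
    for tr in traces:
        bins[int(tr["OFFSET"] // bin_size)].append(tr["DATA"])
    # One output trace per occupied offset bin.
    return {b: np.sum(d, axis=0) for b, d in sorted(bins.items())}

traces = [
    {"OFFSET": 100.0, "DATA": np.array([1.0, 2.0])},
    {"OFFSET": 300.0, "DATA": np.array([0.5, 0.5])},  # same 0-350 bin as above
    {"OFFSET": 400.0, "DATA": np.array([2.0, 0.0])},
]
stacked = combine_and_stack(traces)  # two output traces instead of three
```

A bin size of 350 (as in step 2) means any traces whose offsets fall within the same 350-unit interval are summed into one output trace, which is exactly why fewer traces appear on screen.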
Appendix 3
CVS Analysis
CVS Analysis is a macro process. CVS can be helpful in areas of complex structure where velocity trends can change along the seismic section. Constant velocity stacks of the entire line or a subset are produced for a specified range of velocities. Horizons may be easier to track if the whole section is seen. Random picks can be made on any constant velocity stack panel and a final gridded velocity table is output. Creating constant velocity stacks can be a time-consuming event, especially if you have a large dataset with many panels to create. In this case you may want to use the process Constant Velocity Stacks to create and output the CVS panels ahead of time. Then Stack Display can be used to display and pick the velocities the same way as done with the macro.
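The core of a constant velocity stack is hyperbolic normal moveout at each trial velocity. A sketch of the moveout equation and the panel velocities (linear spacing of the trial velocities is an assumption; ProMAX may distribute panels differently):

```python
import numpy as np

def nmo_time(t0, offset, v):
    """Hyperbolic moveout: travel time at a given offset for zero-offset
    time t0 and a trial constant velocity v."""
    return np.sqrt(t0 ** 2 + (offset / v) ** 2)

def cvs_panel_velocities(vmin, vmax, n):
    """Trial velocities for n constant-velocity stack panels
    (linear spacing assumed)."""
    return np.linspace(vmin, vmax, n)

# 16 panels over 7000-17000 ft/sec, as in the exercise parameters.
vels = cvs_panel_velocities(7000.0, 17000.0, 16)
```

Each panel corrects and stacks the gathers with one of these velocities; the panel on which an event stacks most coherently indicates its approximate stacking velocity.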
Apply AGC to the data?: ---------------------------------------Yes
AGC operator length: --------------------------------------------500
Maximum wavelet stretch: --------------------------------------30
Velocity input option: ----------------------------------Calculated
Minimum velocity: ----------------------------------------------7000
Maximum velocity: -------------------------------------------17000
Number of velocity panels: -------------------------------------16
Select display DEVICE: -----------------------------This Screen
Number of traces per display screen: ----------------------215
Do you wish to SCROLL your data?: -----------------------No
Trace scaling option: -----------------------------------Individual
SCALAR for sample value multiplication: ------------------1.
2. Select Disk Data Input parameters. A Disk Data Input step is required since it is not included within the macro. Data should be preprocessed gathers without NMO and should have a bandpass filter and scaling function applied. Sort to CDP and include the range of CDPs to be stacked. For this exercise, use all the CDPs in the line.
3. In CVS Analysis, apply an AGC prior to stacking with constant velocities. Specify the minimum velocity, maximum velocity, and number of velocity panels. Enter 7000 - 17000 ft/sec for a velocity range and create 16 panels.
4. Select the number of traces to display per screen and an appropriate trace scaling option, such as Individual. Selecting Individual as the trace scaling option will make sure that spikes do not dominate the display.
5. Execute the flow with MB2. The display will appear in the old Stack Display tool. The last constant velocity panel will appear along with 16 screen swap boxes in the upper right of the display.
6. Click on the Pick icon to create a pick table. Click on Pick CVS/CVM panels → Create new file and name it cvs vels.
7. Activate the CVS/CVM picking mode.
Move the cursor to the screen swap boxes in the upper right hand corner of the display and use MB2 to enable picking of the CVS panels.
NOTE: You should see all the icons disappear except for the scroll icon, and you should be able to move your cursor into the data area. If you don't see the scroll icon and your cursor remains in the screen swap boxes, you are not in CVS/CVM picking mode. To correct this, click MB1 in the screen swap boxes and then click MB2 in the screen swap boxes. You should see the scroll icon remaining and you should be able to move your cursor into the data area.
8. Pick velocities on the display by using MB1. Move your cursor into the displayed stacked section. While holding down MB2, move the cursor back and forth within the stacked section. This will enable the screen swapping.
Once you have found a velocity panel that stacks an event best, use MB1 to make a pick and to display it on-screen. You can manually enter a velocity value at your cursor location by double-clicking MB1. A pop-up box will appear in the upper left hand corner of the display. You may now type in a velocity value. Hit Enter to accept. This can be useful if the desired velocity falls between the displayed panel velocities.
9. To finish picking, click MB1 on any of the screen swap boxes in the upper right hand corner of the display.
10. Click the red Stop icon to exit the display. Choose Save all work to the database before quitting.
NOTE: Upon exiting the CVS Analysis display, two velocity tables are written to disk. One file contains just the picks you made in the CVS Analysis display. The second file is a fully interpolated velocity table based on the sparse picks you made on-screen.
2. Select Fully Interactive for the Operation Mode. Fully Interactive allows you to choose random locations once IVA is displayed and does not involve any precomputation. Precompute then Interactive allows you to specify CDPs at which to precompute analyses. Once IVA is displayed, you can move between locations more quickly. However, other random locations can still be selected.
3. Select your range of data to process. IVA allows the selection of a specific range of CDPs to process or All Available Data. Supergathers can also be created for analysis.
4. Select or create the Input (Initial) and Output Velocity Table. The input velocity table and the output velocity table can be the same file, or a new output table can be added. The output table is continuously updated as each new velocity function is picked.
5. Provide a CDP Mute Table and/or a Horizon Data Table. A post-NMO mute table can be supplied provided it was created as a function of CDP:AOFFSET, or one can be created and interactively picked once IVA is running. The same can be done for a horizon data table.
6. Enter velocity information and the maximum frequency of the data. The menu asks for a minimum and a maximum velocity of interest plus an interval velocity below the last picked time or knee. For this data use a velocity range of 7000-17000 ft/sec and an interval velocity of 17500 ft/sec. Velocity Uncertainty at Tzero and Tmax basically defines the bounds of your velocity fan. For example, the default uses 900 ft/sec for Tzero, meaning the fan will be no wider than 1800 ft/sec (900 on each side of the reference function) at Tzero. The maximum frequency of the data is requested simply to internally resample the data to optimize screen resolution and execution time. It is not a filter. 70 Hz is reasonable for this data.
7. Enter the number of velocity functions and the number of CDPs for the stack panels. As the number of velocity functions increases, the resolution improves, but the run time and resource requirements also increase. For this data, you can get adequate resolution using 15-17 velocity functions and 7 consecutive CDPs in the stack panels.

8. Execute the flow with MB2.
The initial display includes a semblance plot, an isovel plot, and a portion of the data stacked with the input velocity table.

9. Scan your data. The portion of the stacked data displayed is defined by the rectangular box in the isovel plot. Use MB3 with the cursor located in either the stack or the isovel to scroll through the data. Clicking MB3 will
quickly jump to a new location on the isovel plot. The size of the box in the isovel plot is controlled by the Horizontal and Vertical Enlargement Factor.
NOTE: The mouse button help labels are very important in this process because they change according to where the cursor is located on the screen.
10. Select your CDP analysis location by clicking MB2 for a previously picked location or MB1 for a new location. CDP numbers are displayed below the stack section, along with time and velocity. Use MB1 to pop up a menu with analysis mode options. Select Semblance, Stacks and Gathers. In the lower right corner of the screen, the Notification Window shows that the stack, semblance, and gather images are being computed.

11. Click on Config and select Expand/UnExpand Top. This option appears below the semblance plot and allows the reconfiguration of the display. Eliminate the isovel plot to allow more room for the semblance, stack, and gather.
12. Begin picking the velocity function by clicking on Pick. Select Velocity Function from the Pick Operation menu. A message appears in the notification window that reads "Picking Function. Auto_scroll enabled." You can freely scroll the mouse up and down the display. MB1 adds a control point (knee) to your function, and MB3 deletes a control point. As you move the cursor within the calculated functions, the active screen images change within the flip stack and gather. Once you have finished picking your velocity function, keep the cursor within the semblance plot and use MB2 to save and write the function to the velocity table. Go back to the Config option and UnExpand your window. Your function is displayed as a downline in the isovel plot. Start again and select a new CDP location for analysis.

13. You can change the velocity bounds of your fan using the Vbound option located to the right of Pick. Click here, and the notification window reads "Picking Vbound 1. Auto_scroll enabled." Use MB1 to add control points to the lower vbound (left side). When you are finished, click MB2, and the notification window reads "Picking Vbound 2. Auto_scroll enabled." Use MB1 to select your control points for the upper vbound. When finished, MB2 recomputes a new velocity fan with new gather and stack panels.

14. Mute Analysis can be run at any CDP location. Click on your analysis location. When the Analysis Mode menu pops up, select Mute Analysis and wait for the computations to complete. Following the same procedure as picking a function, click on the Pick option and select Top Mute from the menu. You will notice in the gather display that mute points have already been selected. To choose your own mute, use MB1 to select time/aoffset points. When finished, use MB2 to save the output. Gathers and stacks are recalculated, and you are prompted to Update the Semblance. A mark is displayed on the isovel where the analysis was done.
Your mute is saved in the Parameter File menu for Mute Gates and is automatically labeled as IVA with a time/date stamp.

15. Restack Line. To restack your line with the new velocities, click on Action and select Restack Line from the popup menu. The notification window informs you that your CDPs are being restacked.
16. Exit. When you are ready to exit IVA, use the Exit button located at the bottom of the screen. Select from the menu either to save to the database or to abort the IVA session. Your velocity table can be found in the Parameter Files menu for RMS (stacking) Velocity.
Appendix 4
Database/Header Manipulation
The database is critical to ProMAX. Many processing attributes, such as statics and first break picks, are kept in the database. In this chapter we examine the links between the database and the trace headers by determining first break linear moveout corrections. We will also create and alter trace headers and database attributes.
Apply a Linear Moveout Correction In this exercise, you will compute and apply a linear moveout (LMO) correction to the data. This will create a new trace header that you can transfer to the database. Finally, you will view your new attribute in the database.
Database/Header Transfer
Direction of transfer: ---------------------- From Trace header to database
Number of parameters: ----------------------- 1
First database parameter: ------------------- TRC:Geometry:New - Enter LMO
First Header Entry: ------------------------- LMO
Header Statics
Bulk shift static: -------------------------- 0
What about previous statics?: --------------- Add to previous statics
Apply how many static header entries?: ------ 1
First Header word to apply: ----------------- LMO
HOW to apply header statics?: --------------- Add
3. In Trace Header Math, create a static for applying a linear moveout correction to your data. The following equation creates the LMO static time:

LMO = 100 - (AOFFSET/8000) * 1000

where: 100 is a bulk shift that moves the trace samples away from time zero by 100 ms, 8000 is the refractor velocity, and 1000 converts seconds to ms. Except for the near offsets, the final LMO corrections are fairly large negative numbers.

4. Select Database/Header Transfer parameters. Select to load FROM trace header TO database. For First database parameter, select TRC: Geometry: New to enter a name for your LMO static header, and make it Floating Point. For First Header entry, select User Defined and enter LMO.

5. Select Header Statics parameters. Add the LMO header entry, created in Trace Header Math, to the previous statics.

6. Set the Trace Display parameters. You may find that setting this display to four panels and limiting the time range from 0 to 500 ms is useful.

7. Execute the flow, and observe the effects of the LMO.

8. View your new LMO attribute in the database.
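The Trace Header Math expression in step 3 is plain arithmetic, so it can be checked with a short script. This is just a transcription of the formula above for hand-checking values; the function name is ours, and AOFFSET is in feet as in the exercise.

```python
def lmo_static_ms(aoffset_ft, refractor_vel_ft_s=8000.0, bulk_shift_ms=100.0):
    """LMO static (ms): bulk shift minus the refractor traveltime at this offset."""
    return bulk_shift_ms - (aoffset_ft / refractor_vel_ft_s) * 1000.0

for offset in (0.0, 800.0, 4000.0, 16000.0):
    print(offset, lmo_static_ms(offset))
# Only the near offsets stay near or above zero; far offsets give
# large negative statics, as the exercise notes.
```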
Appendix 5
Training Summary
This summary may be used as a quick reference for some of the most useful charts of information you have worked with during the week.
Reference Tables
Organization of Ordered Parameter Files
LIN (Line)        Contains constant line information, such as final datum, type of units, source type, and total number of shots.
TRC (Trace)       Contains information varying by trace, such as FB Picks, trim statics, and source-receiver offsets.
SRF (Surface)     Contains information varying by surface receiver location, such as surface location x,y coordinates, surface location elevations, surface location statics, number of traces received at each surface location, and receiver fold.
SIN (Source)      Contains information varying by source point, such as source x,y coordinates, source elevations, source uphole times, nearest surface location to source, and source statics.
CDP (CDP)         Contains information varying by CDP location, such as CDP x,y coordinates, CDP elevation, CDP fold, and nearest surface location.
CHN (Channel)     Contains information varying by channel number, such as channel gain constants and channel statics.
OFB (Offset Bin)  Contains information varying by offset bin number, such as surface consistent amplitude analysis. OFB is created when certain processes are run, such as surface consistent amplitude analysis.
PAT (Pattern)     Contains information describing the recording patterns.
The Ordered Parameter Files database stores information in structured categories, known as Orders, representing unique sets of information applying to an individual line.
Migration      Domain   Velocity Field   Velocity Variation   Dip Handling
F-K (Stolt)    Time     VRMS(x,t)        Poor                 Poor
Phase Shift    Time     VINT(t)          None                 Good
FD             Time     VINT(x,t)        Fair                 Good
FD (70 deg)    Time     VINT(x,t)        Fair                 Good
FD (50 deg)    Time     VINT(x,t)        Fair                 Good
Kirchhoff      Time     VRMS(x,t)        Fair                 Good
Reverse Time   Time     VINT(t)          None                 Good
FD Imp.        Depth    VINT(x,z)        Good                 Good
Eikonal        Depth    VINT(x,z)        Fair                 Good
Max. Amp.      Depth    VINT(x,z)        Good                 Good
Mult. Arr.     Depth    VINT(x,z)        Excel.               Excel.

(Corresponding ProMAX processes: Memory Stolt F-K, Phase Shift, Fast Explicit FD Time, Steep Dip Explicit FD Time, Kirchhoff Time, Reverse Time T-K, Explicit FD Depth, and Kirchhoff Depth.)
To help you decide on the optimal migration for a given situation, the above table is a summary of the poststack migrations and how they handle changes in velocity and dip.
For Elevation Statics
1) Remove previously applied statics if TOT_STAT not equal 0
2) Compute S_STATIC and R_STATIC to Final Datum
3) Compute N_DATUM (smooth surface / processing datum)
4) Partition the statics into PRE and POST NMO terms
   - NMO_STAT (pre)
   - FNL_STAT (post)
5) Apply the PRE NMO term NMO_STAT
6) Update NA_STAT and TOT_STAT in the Trace Headers

For Refraction Statics
1) Remove previously applied statics if TOT_STAT not equal 0
2) Copy refraction statics to S_STATIC and R_STATIC
3) Compute N_DATUM (smooth surface / processing datum)
4) Partition the statics into PRE and POST NMO terms
   - NMO_STAT (pre)
   - FNL_STAT (post)
5) Apply the PRE NMO term NMO_STAT
6) Update NA_STAT and TOT_STAT in the Trace Headers
ProMAX uses the above logic when applying datum statics. Refer to the following Datum Statics Terminology graph for a further description of the statics variables.
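The partition in step 4 can be sketched as follows. The split rule shown here is our assumption, inferred from the C_STATIC formula in the Datum Statics Terminology attributes: the post-NMO term FNL_STAT is taken as the two-way correction from the floating datum N_DATUM to the final datum F_DATUM, and NMO_STAT as the remainder of the total. ProMAX's internal partition may differ in detail.

```python
def partition_statics(s_static, r_static, n_datum, f_datum, datumvel):
    """Split the total datum static into pre-NMO and post-NMO terms.

    Assumption: FNL_STAT = 2 * (N_DATUM - F_DATUM) / DATUMVEL (the
    C_STATIC formula), and NMO_STAT is whatever remains of
    S_STATIC + R_STATIC.
    """
    total = s_static + r_static
    fnl_stat = 2.0 * (n_datum - f_datum) / datumvel
    nmo_stat = total - fnl_stat
    return nmo_stat, fnl_stat

nmo, fnl = partition_statics(0.004, 0.003, 120.0, 100.0, 8000.0)
print(nmo + fnl)  # pre + post equals the total S_STATIC + R_STATIC
```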
Reference Graphs
Datum Statics Terminology
[Figure: Datum Statics Terminology. A cross-section showing the shot point (S.P.) and CDP locations, the base of weathering, and the replacement velocity (Vreplacement), annotated with the statics terms NMO_STAT, FNL_STAT, S_STATIC, R_STATIC, and C_STATIC relative to the final datum F_DATUM.]
Database Attributes:
N_DATUM = floating datum
F_DATUM = final datum
S_STATIC = (F_DATUM - ELEV + DEPTH) / DATUMVEL
R_STATIC = [(F_DATUM - ELEV + DEPTH) / DATUMVEL] - UPHOLE
C_STATIC = 2 * [(N_DATUM - F_DATUM) / DATUMVEL]
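The attribute formulas above translate directly into code. This is a literal transcription for checking hand calculations; the function names are ours, and the example values (90 ft elevation, 10 ft source depth, 100 ft final datum, 8000 ft/s datum velocity) are made up for illustration.

```python
def s_static(f_datum, elev, depth, datumvel):
    """Source static: one-way time from the buried source to the final datum."""
    return (f_datum - elev + depth) / datumvel

def r_static(f_datum, elev, depth, datumvel, uphole):
    """Receiver static: the same correction, less the uphole time."""
    return (f_datum - elev + depth) / datumvel - uphole

def c_static(n_datum, f_datum, datumvel):
    """CDP static: two-way correction from floating to final datum."""
    return 2.0 * (n_datum - f_datum) / datumvel

# Example: elevation 90 ft, source depth 10 ft, final datum 100 ft, 8000 ft/s
print(s_static(100.0, 90.0, 10.0, 8000.0))  # 0.0025
```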
[Figure: Geometry data flow among Manual Input, SEG-? Input, the Geometry Spreadsheet, and Seismic Data (ProMAX), linked by Extract Database Files and Inline Geom Header Load.]
[Figure: ProMAX installation directory tree. Under /promax: /sys (with /exe holding exec.exe, super_exec.exe, and *.exe from programs, and /bin holding *.exe from the command line), /port, /etc, /help (*.lok Frame help for /promax, /promax3d, /promaxvsp), /lib/X11/app-defaults (*.help ASCII help, application window managers), /menu (*.menu process menus for /promax, /promax3d, /promaxvsp), /misc (*_stat_math, *.rgb colormaps, ProMax_defaults), and /bin (start-up executables).]
/Line
    DescName, 17968042TVEL, 31790267TGAT, 36247238TMUT, 12345678CIND, 12345678CMAP
    /12345678
        HDR1, HDR2, TRC1, TRC2
    /Flow1
        DescName, TypeName, job.output, packet.job
    /OPF.SIN    Database subdirectory with a non-spanned file, e.g. OPF60_SIN.GEOMETRY.ELEV
    /OPF.SRF    Database subdirectory with a spanned file, e.g. #s0_OPF60_SRF.GEOMETRY.ELEV
Understanding the ProMAX directory structure and file naming conventions will be crucial for debugging flows and managing disk space.
[Figure: example flows menu.]
Upon completion of the course, your flows menu should look similar to the above.
Datasets: Seismic
Upon completion of the course, your processing should have created the datasets shown above. Note how the naming convention gives clues to each dataset's contents.
Datasets: OPF-TRC
The TRC trace database is the largest of the Ordered Parameter Files, since it contains information varying by trace, such as FB Picks, trim statics, and source-receiver offsets. Note that each entry in the database table gives the variable name, variable/info type, and variable description.
Datasets: OPF-SRF
The SRF receivers OPF contains information varying by surface receiver location, such as surface location x, y coordinates, surface location elevations, surface location statics, number of traces received at each surface location and receiver fold.
Datasets: OPF-SIN
The SIN source OPF contains information varying by source point, such as source x, y coordinates, source elevations, source uphole times, nearest surface location to source, and source statics.
Datasets: OPF-CDP
The CDP OPF contains information varying by CDP location, such as CDP x, y coordinates, CDP elevation, CDP fold, and nearest surface location.
Datasets: OPF-CHN
The CHN channel OPF contains information varying by channel number, such as channel gain constants and channel statics.
Datasets: OPF-OFB
The OFB offset bin OPF contains information varying by offset bin number, such as surface consistent amplitude analysis. OFB is created when certain processes are run, such as surface consistent amplitude analysis.
Datasets: OPF-PAT
The PAT pattern OPF contains information describing the recording patterns.
The End