With years of collective experience in geoscience software, our skilled scientists and technicians have served the industry’s leading companies over the past 30 years. Here, their knowledge is shared to help clients. Discover new insights and get a better understanding of your reservoir dynamics. Expand your GeoSoftware expertise with our Tips & Tricks.
RockMod is a tool for geostatistical seismic reservoir characterization, or probabilistic modeling, that is part of the Jason Workbench. RockMod takes as input all kinds of geological data, statistical information such as models and trends, and geophysical data such as seismic and wavelets.
Although RockMod is typically used for simultaneous probabilistic modeling of elastic properties, reservoir properties and facies, this example illustrates the workflow through simple acoustic impedance modeling. What follows is an example of how you can use Jason RockMod probabilistic modeling in combination with Petrel geological modeling and HampsonRussell Strata geophysical inputs.
This is an easy and streamlined workflow, since the Jason Workbench has direct reading access to data in HampsonRussell and can easily connect to Petrel through a plugin. These connections allow data to be shared back and forth without data files having to be transferred. So you can access Jason data directly from HampsonRussell Geoview!
Step 1. Import horizons into Petrel to make a simple CPG model. Here we show a pre-built CPG model in Petrel.
Step 2. Import the model and seismic volumes from Petrel into RockMod. Let us say there are 10 inlines and 229 crosslines. Here the Petrel model is shown at the top of the seismic section in RockMod.
Step 3. Use wells from the GeoSoftware database for statistical modeling in RockMod. The GeoSoftware database is commonly used by both the Jason Workbench and HampsonRussell. Here we show pdf and variogram modeling in RockMod.
Step 4. Since you can connect directly to HampsonRussell projects from the Jason Workbench, simply use the extracted wavelets from your Strata project.
a. Connect to HRS project from the Jason Workbench:
b. Extract wavelets from your Strata project. Here is an example of a scaled wavelet coming from Strata:
Step 5. Run the inversion in RockMod. Here is an example of one realization from RockMod inversion.
This was a quick and easy example of how you can use Jason RockMod and HampsonRussell Strata together on the same data. Having direct access to pre-stack and post-stack data makes cross-product workflows easier. Many more cross-product workflows are possible, so you don't have to duplicate data, meaning that you save both time and disk space! But more importantly, users have access to more capabilities and functionalities for customized workflows. Do more using both Jason and HampsonRussell together.
How can we calculate the Frequency Volume for continuous properties realizations?
Turn the continuous properties (e.g. porosity) into discrete properties using FunctionMod, by setting your preferred cut-offs. For example, using the FunctionMod expression:
P=(porosity<0.35 ? 1 : porosity>= 0.35 and porosity<0.4 ? 2 : porosity>=0.4 ? 3 : 0)
You define a categorical data type for porosity in your Jason Workbench project, then assign value 1 to porosity below 0.35, value 2 to porosity from 0.35 to 0.4, and value 3 to porosity greater than 0.4. In this manner, you have captured different porosity ranges in discrete values. Do this for all realizations and bring them back into RockMod to derive Frequency and Most Probable Volumes by selecting QCs Multiple Realizations.
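The cut-off logic of that FunctionMod expression can also be sketched outside the Workbench. Here is a minimal Python/NumPy equivalent (the function name and example values are ours, for illustration only; FunctionMod's own syntax is shown above):

```python
import numpy as np

def classify_porosity(porosity):
    """Map continuous porosity to discrete classes with fixed cut-offs,
    mirroring the FunctionMod expression above:
      class 1: porosity < 0.35
      class 2: 0.35 <= porosity < 0.40
      class 3: porosity >= 0.40
    """
    porosity = np.asarray(porosity, dtype=float)
    classes = np.zeros(porosity.shape, dtype=int)
    classes[porosity < 0.35] = 1
    classes[(porosity >= 0.35) & (porosity < 0.40)] = 2
    classes[porosity >= 0.40] = 3
    return classes

print(classify_porosity([0.30, 0.36, 0.42]))  # -> [1 2 3]
```

Running such a classification over every realization gives the discrete volumes that RockMod can then summarize as Frequency and Most Probable Volumes.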
Here is an illustration of the example given above:
Figure 1. Porosity as a continuous (left) and as a discrete (right) property.
Figure 2. Frequency Volume of multi-realization with porosity>=0.4
In Figure 2, events that are colored dark red have a porosity greater than or equal to 0.4 in all realizations. This means we can place very high confidence in the highly porous areas shown in this section, as all realizations confirm the same porosity in certain locations.
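Conceptually, the Frequency Volume is just the per-sample count, or fraction, of realizations that satisfy the cut-off. A minimal sketch of that idea (array names and toy values are ours; RockMod's implementation is not shown here):

```python
import numpy as np

def frequency_volume(realizations, cutoff=0.4):
    """Per-sample fraction of realizations meeting the cut-off.

    realizations: array of shape (n_realizations, ...) holding, e.g.,
    porosity for every realization on the same grid. A value of 1.0
    means every realization agrees at that sample.
    """
    realizations = np.asarray(realizations, dtype=float)
    return (realizations >= cutoff).mean(axis=0)

# Three toy realizations of a two-sample trace
reals = [[0.41, 0.30],
         [0.45, 0.42],
         [0.40, 0.10]]
print(frequency_volume(reals))  # first sample: all 3 realizations agree
```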
Watch as Brian Russell shows us a few tips for using new and existing features inside HampsonRussell AVO to statistically analyze geobodies in your AVO analysis and reservoir characterization.
PowerLog is the benchmark for petrophysics, rock physics, facies analysis and statistical mineralogy. As a multi-user, multi-well, multi-interpreter software, PowerLog is tuned for flexible petrophysical workflows. Using PowerLog, you can calculate modeled Vp and Vs curves from mineral volume and well data within our RPM Module.
In this example, we will be focusing on our Differential Effective Medium Models, where we will be establishing our Vp with respect to our aspect ratio. Having this piece of information, as well as a host of others, is essential to calculate our velocities.
The necessary information for the workflow can be broken into two main categories based on where you get it. PowerLog excels at calculating pressure, temperature, water saturation, etc. However, information such as Grain-Pore Structure will need to come from prior knowledge of your region of interest.
For this particular project we will need prior knowledge on the Gulf of Mexico. The image below is the raw data of the zone we are focusing on.
The workflow that is covered in this Tips and Tricks was developed by one of our top rock physicists. If you’d like to use this workflow in one of your projects, contact GeoSoftware support and ask for the DEM workflow.
To better understand the workflow we have broken it down into color coded windows and nodes. The yellow nodes are the major inputs necessary for this workflow to function. We will be going over them in minor detail after going over where to place the inputs.
The green boxes contain not only inputs but also known constants needed for the calculations; these can be changed to fit your needs at your own discretion.
The red boxes with the round nodes are the outputs for the entire module. These outputs are composed of all of the links that connect to them and can easily be traced back to their sources.
In the Fluid Property Inputs, you will need curves or constants appropriate for your particular project. Double-click on each node and enter the input in the Parameters section. If you wish to have an output, also enter the new curve name in the designated block. This is not strictly necessary at this point, but if you want to see these curves you do need to create an output name.
The Petrophysical Inputs are all values that are calculated within PowerLog itself. We have multiple modules that will calculate Water Saturation, Total Porosity, and Clay Volume in the same step. The HINDEX curve represents the locations of hydrocarbon in your well.
The P-Sonic, S-Sonic and Density curves must come from the well itself. Without these three curves it’s not possible to use this workflow. If you do not have them, there are a few options, but these depend on the data you have available.
The DTS Shale Trend for Clay and DTP Shale Trend for Clay Moduli are unique inputs; they can be either curves or constants, and the choice is up to you. If you believe your system has a working constant, you can assign it as such, but PowerLog also lets you create a varying set of values if required.
These nodes run equations that are programmed within RPM. Some of them can also be manipulated within the node.
These nodes are categorized as follows:
It is essential to use the calculated mineral volumes in our calculations for the modeled velocities, since the effects of the minerals on those curves can be substantial. If you have a system with more minerals, you simply need to add them.
Now that we’ve gone through all of the preceding nodes, we can get to the final outputs.
The final outputs for the workflow are modified Vp, Vs and Density curves as shown below. These curves take into account our well and mineral data. They can now be used to further your knowledge of your reservoir and what it holds.
Jason Well Manager provides a central place for analyzing, managing, and editing well data in your project, and for facilitating analysis of the data in the viewers. The Curve Browser provides centralized control for launching and updating QC views such as Crossplots and Histograms, Well View and Section View. The Curve Browser lets you easily select and update wells and curves within a user-defined vertical gate.
In the Curves tab, open the Curve Browser by selecting Browse curves with – Curve Browser as shown below in Fig. 1.
Figure 1: Opening the Curves Browser
From the Curve Browser, you can open a crossplot, a well view, or a section view.
Figure 2: The Curves Browser
In this example, P-impedance, Vp/Vs and lithology curves from five wells are selected. Clicking the Section View icon will open an embedded Section View showing the five wells and the three curves, along a trace gate automatically defined for all selected wells.
The embedded viewers can be updated with a different combination of wells and/or curves by clicking Update All.
To selectively update the embedded viewers with the new selection of wells and/or curves, use Access > Manage Apps.
Figure 3: Access Manage Applications
This opens the Manage Embedded Applications window which lists the embedded viewers that can be individually updated with a different set of curves.
Figure 4: Manage Embedded Applications window showing the embedded applications and their status
By default, the data is displayed using the whole vertical extent. You can define the display limits for the embedded viewers with Access > Vertical Settings: select the option Control viewer vertical gate, then click Settings to edit the axes of the viewers.
Figure 5: Vertical Settings
Using these options, the Curve Browser makes it easy to browse through the list of curves and keep the viewers up to date.
Did you know that you can display culture data in Jason Map View? Including culture data, such as block boundaries or waterways, into your map helps you make the connection between your data and the surface. Simply go to Map View and select Input > Culture Data to load the culture data into the display.
Jason has its own file format (.cul) for culture data, but since version 9.0 it can also display ESRI shapes directly. ESRI shapes are an industry standard for culture data. An ESRI shape is usually defined in multiple files. Make sure they are all located in your Jason project before loading the shape! For more information, refer to the Jason online Help.
Jason can also display culture data in formats other than ESRI: Landmark Metafile or Tobin TDRBM (since version 5.0). These, however, have to be converted into the Jason .cul format before Map View can display them. Select Data > Import > Culture Data to do so.
The image below shows a shape file defining waterways, cities, counties and roads. Next to it, the same location is shown on Google maps.
You can of course edit the display attributes in the usual manner. The image shows the display attributes for an ESRI shape:
For .cul files, the display attributes are limited to color, line thickness and annotations (on/off).
When an ESRI shape spans multiple Jason projects, you will want to use the same display attributes throughout. On first loading an ESRI shape, Jason automatically creates a file in which the display attributes are stored. You can copy this along with the other shape files to other projects to use the same display attributes in all of them.
Did you know that you can manually edit your wavelets in HampsonRussell? Or that you can apply operations like phase rotations and constant time shifts? This can be done in the Wavelet Data Explorer.
Select Options > Edit from the bottom toolbar of the Wavelet Explorer.
The Edit Wavelet dialog will appear. Initially, the original wavelet in blue is completely overlain by the edited wavelet in red; no edits have been applied at this point.
The most common editing options are available as indicated by the tabs in the Edit Wavelet dialog: phase rotation, time shift, resampling, resizing and direct editing of the samples.
The wavelet shown above looks reasonable. However, you might prefer to apply a (minor) time shift so that the central peak coincides exactly with zero time.
To do this a constant time shift is applied using the “Time Shift” option. The original wavelet in blue is displayed for reference; the time shifted wavelet is shown in red. It can be seen that its central peak now coincides exactly with zero time.
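Conceptually, the required shift is simply the distance between the wavelet's largest peak and zero time. A rough sketch of that idea (whole-sample shifts only; a sub-sample shift would need interpolation, and this is not HampsonRussell's actual implementation):

```python
import numpy as np

def center_peak(wavelet, dt):
    """Shift a wavelet by whole samples so its largest peak sits at t = 0.

    wavelet: 1-D amplitude array sampled every dt seconds, with t = 0
    assumed at the centre sample. Returns the shifted wavelet and the
    applied shift in seconds.
    """
    wavelet = np.asarray(wavelet, dtype=float)
    centre = len(wavelet) // 2
    peak = int(np.argmax(np.abs(wavelet)))
    shift = centre - peak              # samples needed to centre the peak
    return np.roll(wavelet, shift), shift * dt

w = [0.0, -0.2, 1.0, -0.3, 0.0, 0.0, 0.0]   # peak one sample early
shifted, applied = center_peak(w, dt=0.002)
print(applied)  # -> 0.002
```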
The amplitudes of the updated wavelet are displayed when you press “Refresh”. You can choose to display them in time or frequency domains (amplitude or phase).
The time shift wavelet (magenta) and constant phase rotation wavelet (brown) both have their central peak exactly coincident with zero time.
The “Resample” option in the Edit Wavelets dialog allows you to resample the wavelet.
The “Resize” option allows you to change the length of the wavelet.
The “Edit Sample” option allows you to edit the actual values directly. As an example of the latter, in the next picture the original wavelet amplitude spectrum values (blue) are edited resulting in a smoother spectrum (brown).
If you are interested to learn more about HampsonRussell software capabilities or require technical support, please feel free to contact our Regional Sales Team or our Technical Support Team.
Emerge is a powerful geostatistical module from GeoSoftware’s HampsonRussell Software that can predict log property volumes from well logs and attributes from seismic data.
Beyond predicting a log property volume, Emerge can increase the resolution of the inversion results and can be a useful tool to delineate thin reservoirs.
Using multi-linear regression or neural network analysis, Emerge trains itself at the well locations to learn the optimum transform that relates the logs and seismic data. Traditionally, Emerge is trained on a single analysis window, as shown below:
When multiple lithologies are present within the specified analysis zone, you may want to exclude certain samples. This can be done using a secondary curve, which can be of any log type, e.g. a Porosity, Gamma Ray or Lithology log. You can also create a so-called “indicator” curve of 0s and 1s, with 1s marking the intervals you choose to include. In the example below, we created a classification log from the original porosity log by dividing it into 3 zones:
Then, for Emerge training purposes, we want to include only zone 2 and 3 classes:
Interactive correlation and auto-stretch are newer and less known options available in HampsonRussell 10.0 to make correlation easier and faster. HampsonRussell users are familiar with the stretch and squeeze process of correlating well depths with seismic times by manually pairing events on the blue synthetic traces with reflections on the red seismic traces.
The Use Interactive Synthetic option allows the well synthetic to be moved throughout the seismic volume using mouse control.
Using the interactive synthetic feature, picks can be added, stretched and squeezed, in the same way as the more traditional synthetic display.
Automatic correlation can make the picks for you under user-parameters that change the number of picks and the amplitude range of events to be analyzed.
The Use Tops and Horizons Auto Stretch option allows for quick and easy pairing via comparative drop downs with no need to worry about naming conventions.
If you'd like to learn more about how these options can help you complete your correlations easier and faster, send an email to our regional contacts, or request an evaluation copy of our software. Why wait? See these new tools in real time now.
Emerge is a powerful geostatistical module from HampsonRussell. It can predict log property volumes from well logs and attributes from seismic data.
Besides predicting log property volumes, Emerge can increase the resolution of the inversion results. It can also be used to predict missing logs by using existing logs that are common to the available wells.
This article has been written by Tanya Colwell, GeoSoftware Product Manager, and it describes a series of advanced options that you can use in order to improve your Emerge training and obtain more accurate and better final predicted results. To download a PDF copy of HampsonRussell Emerge Best Practices, please click here.
Emerge training should include at least three wells. We recommend adding more well curves if possible. Since Emerge assumes that the target log is noise-free, you must edit the target logs before applying Emerge. For the time domain predictions, since Emerge will be correlating the target logs with seismic data on a sample by sample basis, the proper depth-to-time correlation is critical. For this reason, check-shot corrections and manual correlation are usually necessary.
You can bring in more external attribute volumes. For example, if you have pre-stack seismic, you can bring in the near-stack, mid-stack and far-stack volumes, flagging one of them as “seismic” and the other two as “external attributes”. You can use AVO attributes, inversion output volumes, LMR volumes or any other attributes such as Coherence, Curvature, etc.
For the depth domain volume predictions, note that HRS10.1 Emerge does not use the active well depth to seismic depth mapping table. This functionality will be added in HRS10.2.
Sometimes the correlation can be improved by applying residual time-shifts to the target log relative to the attribute. Run a single attribute analysis. Determine the best single attribute. Apply the residual time/depth domain shifts to your target logs with respect to this attribute. Rerun the single attribute training. The correlations should improve.
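One common way to estimate such a residual shift is a cross-correlation lag search between the target log and the best attribute; here is a sketch of that idea (function name and toy traces are ours, and Emerge's internal method may differ):

```python
import numpy as np

def best_lag(target, attribute, max_lag=10):
    """Lag (in samples) that maximizes the cross-correlation between a
    target log and an attribute trace. The returned lag is how far the
    attribute must be rolled to line up with the target."""
    target = np.asarray(target, dtype=float)
    attribute = np.asarray(attribute, dtype=float)
    target = target - target.mean()
    attribute = attribute - attribute.mean()
    lags = list(range(-max_lag, max_lag + 1))
    scores = [np.sum(target * np.roll(attribute, lag)) for lag in lags]
    return lags[int(np.argmax(scores))]

# A sine trace and a copy delayed by 3 samples
t = np.sin(np.linspace(0, 6 * np.pi, 200))
a = np.roll(t, 3)
print(best_lag(t, a))  # -> -3 (roll the attribute back by 3 samples)
```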
Remove the outlier points by using exclusion zones on the Emerge cross plot. After the points are removed, they will be marked with gray squares on the target logs and will not be used in the subsequent Emerge training.
Test various convolutional operator lengths under the Multi-Attribute List training. Using the Convolutional Operator is like adding more attributes: it will always improve the Prediction Error, but the Validation Error may not improve – the danger of over-training increases. As the operator length increases, the Training Error always decreases. The Validation Error decreases to a minimum and then increases again for longer operators.
In the example below, the validation plot on the right shows that the optimal operator length was a 7-point operator with three attributes.
For volume predictions, when you do not need to predict the zone over the entire domain window, we recommend narrowing the training interval. Emerge will give better results if the zone has consistent lithology.
If multiple zones are present, you can use an “indicator” log to exclude certain intervals with different lithology.
Emerge is a statistically based program, so sometimes it is difficult to bring in geological meaning to some of the attributes that are selected by Multi-linear regression training.
If you doubt that a specific attribute affects the results, you can exclude it from training.
For example, Time, X-Coordinate and Y-Coordinate are not selected for the training below:
If you use more than one seismic volume, e.g. near/mid/far stack, then please select an option to apply internal attributes (such as Amplitude Envelope, Derivative, etc.) to the search. However, you do not need to apply these additional internal attributes to the already complicated “external” attributes, e.g. P-Impedance inversion or LMR volumes.
In the attribute list, we have a series of default frequency bandpass filters, ranging from 5 Hz to 70 Hz. Nowadays, some seismic surveys (e.g. oil sands) may contain frequencies higher than 100 Hz. Therefore, our default set of frequency bandpass filters may not be enough to include all the frequencies of the seismic data. Emerge allows us to define a new set of frequency bandpass filters instead of the default ones.
For volume predictions, you may filter target logs to seismic frequency bandwidth. However, note that filtering the target logs will always produce an improvement in the numerical correlation result, but that may be achieved at the expense of the resolution of the output.
Compared to step-wise regression, Neural Networks can enhance the high frequency resolution.
Start Neural Network training with the best Multi-linear regression list you have identified. Choosing “yes” here means that the neural network will have exactly the same attributes and the same operator length as the selected multi-attribute transform.
Try PNN with a cascading option, since sometimes it can give a better result. With the cascading mode, the first calculation that the network performs is the multi-linear regression with the same four attributes. The predicted log from that calculation is then smoothed with a smoother length given on the Neural Network training dialog. The PNN Neural Network is then used to predict the residual (the high-frequency component of the logs that is not contained within the smooth trend). You then get the final predicted log by adding the trend from the multi-linear regression and the predicted residual from the Neural Network.
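The cascading idea can be sketched in a few lines: a multi-linear regression supplies the smooth trend, and a kernel regressor (standing in here for the PNN) predicts the residual. Everything below (function names, the simple Gaussian kernel, the boxcar smoother) is illustrative, not Emerge's actual implementation:

```python
import numpy as np

def cascade_predict(attrs, target, attrs_new, smooth_len=5, sigma=1.0):
    """Cascaded prediction sketch: multi-linear regression gives a smooth
    trend; a Gaussian-kernel regressor predicts the residual; the final
    prediction is the sum of the two.

    attrs:     (n_samples, n_attrs) training attributes
    target:    (n_samples,) training log
    attrs_new: (m, n_attrs) attributes at prediction locations
    """
    attrs = np.asarray(attrs, dtype=float)
    target = np.asarray(target, dtype=float)
    attrs_new = np.asarray(attrs_new, dtype=float)

    # 1. Multi-linear regression trend (with intercept term)
    A = np.column_stack([attrs, np.ones(len(attrs))])
    coeffs, *_ = np.linalg.lstsq(A, target, rcond=None)

    # 2. Smooth the trend; the residual is what the kernel model learns
    kernel = np.ones(smooth_len) / smooth_len
    trend_smooth = np.convolve(A @ coeffs, kernel, mode="same")
    residual = target - trend_smooth

    # 3. Gaussian-kernel prediction of the residual at the new locations
    d2 = ((attrs_new[:, None, :] - attrs[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    resid_pred = (w @ residual) / w.sum(axis=1)

    # 4. Final prediction = smoothed linear trend + predicted residual
    A_new = np.column_stack([attrs_new, np.ones(len(attrs_new))])
    trend_new = np.convolve(A_new @ coeffs, kernel, mode="same")
    return trend_new + resid_pred
```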
Try RBF as a neural network. Because the RBF network is an exact mathematical interpolation scheme, the training data will be optimally fit. For small training datasets, the RBF network may give a higher frequency result than the PNN. Also, the RBF network can run considerably faster than the PNN.
If you are aiming to fit Emerge predictions to the wells better, we suggest a more aggressive use of Neural Networks. Try MLFN with many nodes in the hidden layer and a significant increase in overall iterations. The run-time would go up and the validation error would not necessarily be lowered, but that could be offset by other benefits, like fitting the target data better and having greater geological continuity.
Download a copy of HampsonRussell Emerge Best Practices: Download (PDF, 1MB)
A new feature in PowerLog version 9.x, Multiwell QC is designed to improve project efficiency. With our new viewing tools, you can easily QC and optimize all your well data in a fraction of the time it currently takes.
Multiwell QC Grid is an analysis and editing tool that enables you to readily assess your well data (curves, deviations, tops etc.) using a variety of criteria. It enables you to quickly and easily perform well data QC in a multi-well mode, giving a rapid overview of your entire well database in one pass. Each of the tabs of the grid (see figure 1) provides a view of your different well data types, in addition to edit options that enable you to adjust parameters such as properties, curve types, units, type resolutions and alias resolutions.
Figure 1. Available Well Paths, such as X, Y coordinates, lat/lon coordinates, TVD type. You can reassign x, y, and TVD unit types from within this tab.
You can filter your data in accordance with: Well Name, Curve Name and Data Type.
Figure 2. Available Curves - Lists all the curves for the selected wells.
With this tool, you can easily evaluate your measured intervals for applying depth shift or other editing methodologies. You can rename or delete curves from within this tab.
Figure 3. Available Curves in a “Display Data/Depth Range” mode.
Figure 4. Curves by Types - Lists the property types and curve names for the selected wells. You can assign property types from within this tab.
With this tool, you can rename curves [to ensure consistent naming across wells], delete curves, assign types and units to curves in multiple wells, and protect or unprotect curves and wells.
Multiwell QC Viewer provides both log plot and histogram views for investigating the quality and consistency of your well data. Using this tool you can view logs and tops for several wells in one window; so much more efficient than creating individual log plots per well to check the quality of your data!
Figure 5. Multiwell QC Viewer for four wells in Log Plot mode. The color coded interval for each well is the depth range for measured data (i.e. the depth range for each well for which some form of log data actually exists).
Figure 6. Multiwell QC Viewer for four wells in Histogram mode.
These fantastic enhancements were created to improve project efficiency and speed up the data QC stages of any petrophysical project. In particular, we here, at GeoSoftware in the UK, love the enhanced Histogram visualization of data and hope you do too!
If you'd like to learn more about how this tool can help you rapidly evaluate your well data within PowerLog, send an email to our regional contacts, or request an evaluation copy of our software. Why wait? See these new tools in real time now.
Selecting the most appropriate Petro-Elastic Model (PEM) is a critical step in Rock Physics Modeling. Once the model has been selected and appropriately parameterized, it can be used to create elastic property logs in all project wells, particularly where the measured log data are invalid or missing. In addition, the Petro-Elastic Model is used to create Rock Physics Templates for crossplot overlays, Monte Carlo Simulations of additional data points of defined or anticipated litho-facies, and in interpretations of initial and 4D reservoir studies.
Here are some of the considerations in PEM selection process:
Reservoir Rock Type: Is the reservoir comprised of mainly clastic or carbonate rocks?
Porosity versus Compressional Velocity Analysis: Are the changes in porosity and compressional velocity in the well logs predominately driven by changes in effective pressure, or by changes in pore shapes?
Analysis of the Effect of Changing Pore Shapes: Are pore shapes only related to changes in lithological rock types, or are they also related to various porosity types such as cracks, interparticular, moldic, etc, sometimes within a common lithology?
For granular rocks with small amounts of clay or cementation at the grain contacts, a Grain Contact model works best. This model takes into account the difference in the overburden pressure and the pore pressure in the rock along with the number and types of contacts between the grains to estimate velocity from porosity as illustrated in Figure 1 below.
Figure 1 – Grain Contact Model Concepts
For consolidated rocks, such as most carbonates and other consolidated lithologies, the Inclusion Models work best. The Inclusion Models represent the porosity as inclusions whose geometry is specified by a pore aspect ratio, defined as the pore height divided by the pore width. In many cases, the aspect ratios tend to be mineral specific. An Inclusion Model is illustrated in Figure 2 below.
Figure 2 – Inclusion Model Concepts
In other cases, the aspect ratios of the pores are related to the type of porosity where a common lithology might contain various types of porosity such as cracks, interparticular, moldic, etc. This is more common in carbonate reservoirs. In this case, a petro-elastic model such as defined by Xu and Payne could be most appropriate. An example of porosity types in regard to the porosity-velocity relationship is illustrated in Figure 3 below.
Figure 3 – Xu-Payne Model Concepts
By examining various crossplots of the porosity and velocity logs along with other petrophysical information including mineral volumes, the most important drivers in the porosity-velocity relationship can be observed and understood. They allow the analyst to work with the most appropriate Petro-Elastic Model. HampsonRussell RockSI™ provides interactive and highly integrated processes to work with the PEMs, Monte Carlo simulation of litho-facies, and in "What-if" scenarios in interpretations of inversion results including 4D projects.
If you'd like to learn more about how this tool can help you to identify the relationships between petrophysical evidence at wells and seismic inversion results to predict the distribution of your reservoir's properties, send an email to our regional contacts, or request an evaluation copy of our software. Why wait? See these new tools in real-time now.
AVO modeling tools can sometimes be overwhelming for simple modeling and synthetic studies. HampsonRussell gives you an intuitive AVO modeling technique that helps build blocky models either from table input, or from rock physics Petro Elastic Models. To see AVO modeling in action, keep reading.
If you'd like to learn more about how this tool can help your understanding of reservoir reconnaissance work, send an email to our regional contacts, or request an evaluation copy of our software.
The specified layer models, thickness and amplitude, can be edited interactively and the response on the AVO synthetics and AVO curves is updated in real time (Figures 1 and 2).
Figure 1. Definition of a 3-layer Ostrander model with two shales with a middle gas sand
Figure 2.1 & Figure 2.2: Interactive layer editing to study of the effects of parameter changes
Figure 3.1 & Figure 3.2: The ability to use Petro Elastic Models (PEMs) in 1D AVO modeling adds a layer of flexibility that makes the tool even more powerful. This feature is available in HampsonRussell RockSI™ and supports the specification of more complex rock physics models.
2D and 3D modeling options, including wedge, anticlinal and dipping layer models, can be found in the upcoming HampsonRussell 10.2 release. Why wait? Send an email to our regional contacts, or request an evaluation of our software to see these new tools in real time.
Reservoir Characterization is a challenging but critical process for Geoscientists. Here are some of the issues our clients have encountered:
The GeoSoftware Reservoir Characterization tool kit addresses these problems using HampsonRussell RockSI™, GeoSI and Emerge. The objective in this interpretation phase is to convert the elastic properties from seismic inversions to volumes of reservoir properties such as porosity, fluid saturation or clay volume.
By combining quantitative interpretation with qualitative analysis we can now provide a relative estimation of hydrocarbon-in-place. This provides crucial information for identifying appropriate drilling locations and well paths to optimize proven reserves and production in the project area. Below is a suggested workflow.
To see these tools in action, watch the presentation here, or read the brief summary below.
First, use RockSI™ to better understand and honor the impact of the target reservoir parameter variations within the study area; reservoir parameter changes such as variation in effective pressure, aspect ratio, mineral volumes or fluid type can be modeled.
Then use GeoSI to produce elastic properties with higher detail to better image thin layers or compartmentalization, and obtain the detailed 3D litho-facies volumes focused on the target reservoir. Probability analysis of the litho-facies and a probabilistic connectivity analysis of the reservoir bodies as defined by the preferred litho-facies can also be obtained.
Finally, use Emerge to convert the detailed elastic property volumes and associated litho-facies probabilities from GeoSI to reservoir properties such as porosity, fluid saturation or clay volume.
Start orchestrating Your Solutions with the HampsonRussell Analysis Tool Kit and improve your reservoir characterization. Visit our website to request a free software evaluation.
Geostatistical inversion yields multiple realizations that are equally probable to help you reduce the risk and uncertainty associated with oil and gas exploration.
Clients using this technology tell us that these numerous realizations can cause some "creative tension" between their geoscience and engineering departments. On one side of the loop, geoscientists see the value in every realization and seek to be helpful by providing all the available information. On the other, engineers just want a P10/50/90 to run their simulation.
GeoSoftware's RockRank tool helps you close that loop by ranking realizations based on any number of criteria of interest: volume of pay, volume of net-pay, STOOIP, etc. Several GeoSoftware tools utilize this valuable ranking technology: StatMod®, RockMod, and the new HampsonRussell tool, GeoSI.
To see how these tools can help you reduce risk and uncertainty, contact your local Business Manager.
We'll use a RockMod example to illustrate how to use the tool. Once you choose the ranking criteria and calculate the results, histograms, cumulative probabilities and an embedded map view are automatically displayed for any of the criteria, and P10/50/90 realizations are highlighted (Figure 1).
Figure 1. Ranking result based on three criteria
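The core idea behind the ranking can be sketched in a few lines. This is a conceptual illustration, not RockMod's API; the criterion values and the P10/P50/P90 convention (P10 = value exceeded by 10% of realizations) are assumptions:

```python
import numpy as np

# Rank equally probable realizations by a criterion and pick the
# realizations nearest the P10/P50/P90 exceedance probabilities.
rng = np.random.default_rng(42)
n_real = 50

# Hypothetical per-realization criterion, e.g. net-to-gross ratio
ntg = rng.uniform(0.2, 0.6, n_real)

# Order realizations by decreasing criterion value
order = np.argsort(ntg)[::-1]

def pick(p):
    """Index of the realization at exceedance probability p (percent)."""
    idx = int(round(p / 100 * (n_real - 1)))
    return order[idx]

p10, p50, p90 = pick(10), pick(50), pick(90)
print(f"P10: realization {p10} (NTG {ntg[p10]:.3f})")
print(f"P50: realization {p50} (NTG {ntg[p50]:.3f})")
print(f"P90: realization {p90} (NTG {ntg[p90]:.3f})")
```

The engineer then takes the three highlighted realizations into simulation, while the full ranked list remains available for sensitivity work.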
The tool provides several pre-defined criteria, such as Net to Gross Ratio, Volume of Net Pay (HCPV) from Sw, and OOIP/STOOIP from Sw, but you can also define your own (Figure 2).
Figure 2. Pre-defined criteria
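The pre-defined criteria reduce to simple volumetric formulas applied to the property grids. The sketch below illustrates two of them; the cutoffs, cell size, and synthetic grids are assumptions for illustration, not the tool's definitions:

```python
import numpy as np

# Illustrative ranking criteria computed from property volumes.
rng = np.random.default_rng(1)
shape = (10, 20, 30)                   # a small hypothetical grid
phi = rng.uniform(0.05, 0.30, shape)   # porosity
sw = rng.uniform(0.2, 1.0, shape)      # water saturation
cell_vol = 25.0 * 25.0 * 2.0           # bulk volume per cell (m^3), assumed

# Assumed net-pay cutoffs: porous enough, hydrocarbon-bearing enough
net = (phi > 0.12) & (sw < 0.6)

ntg = net.mean()                                   # net-to-gross ratio
hcpv = np.sum(cell_vol * phi * (1.0 - sw) * net)   # hydrocarbon pore volume
print(f"NTG = {ntg:.3f}, HCPV = {hcpv:.0f} m^3")
```

A user-defined criterion is just another such reduction of the realization's grids to a single number.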
Ranking can be restricted to connected bodies: you can enter a minimum body size and, optionally, require that bodies intersect either any or all of the selected wells to be taken into account (Figure 3).
Figure 3. Connectivity definition
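Conceptually, body connectivity means labeling face-connected pay cells, discarding bodies below the minimum size, and keeping only bodies that hit a well. A minimal 2D sketch of that logic (not the tool's implementation, which works on the full 3D grid):

```python
from collections import deque

def connected_bodies(pay, min_size, well_cells):
    """Label face-connected pay cells; keep bodies that are large enough
    and intersect at least one well cell. Pure illustration."""
    rows, cols = len(pay), len(pay[0])
    seen = [[False] * cols for _ in range(rows)]
    bodies = []
    for r0 in range(rows):
        for c0 in range(cols):
            if pay[r0][c0] and not seen[r0][c0]:
                body, queue = [], deque([(r0, c0)])
                seen[r0][c0] = True
                while queue:  # breadth-first flood fill
                    r, c = queue.popleft()
                    body.append((r, c))
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        nr, nc = r + dr, c + dc
                        if 0 <= nr < rows and 0 <= nc < cols \
                                and pay[nr][nc] and not seen[nr][nc]:
                            seen[nr][nc] = True
                            queue.append((nr, nc))
                if len(body) >= min_size and any(w in body for w in well_cells):
                    bodies.append(body)
    return bodies

pay = [[1, 1, 0, 0],
       [0, 1, 0, 1],
       [0, 0, 0, 1]]
bodies = connected_bodies(pay, min_size=2, well_cells=[(0, 0)])
print(len(bodies))  # only the body touching the well at (0, 0) survives
```

The ranking criterion is then evaluated over the surviving bodies rather than the whole grid.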
You can limit the volume to be ranked by defining a working area: select one or more layers together with a trace gate or a region, or define the working area by wells or by a volume mask (Figure 4).
Figure 4. Limiting the volume to be ranked by defining a working area.
An example of ranking by Net to Gross Ratio is shown in Figure 5. The realizations are ordered by decreasing value of the ranking criterion, and the realizations corresponding to the P10, P50 and P90 values are highlighted. The histogram shows the distribution of Net to Gross values, which are sorted into the cumulative probability curve. The QC map shows the Net to Gross, including any assigned constraints, for a selected realization.
Figure 5. Ranking by Net to Gross Ratio
As you can see, GeoSoftware's RockRank tool helps you close the loop between geoscience interpretation and engineering simulation by ranking realizations based on any number of criteria of interest. Several GeoSoftware tools utilize this valuable ranking technology: StatMod®, RockMod, and GeoSI.
Determining the facies from a relative inversion alone just doesn’t work well in some plays. In these instances, Low Frequency Models (LFMs) can be used to better define and describe your pay zones. The following tip describes such a play, and how you can leverage all of your data without resorting to the assumptions involved in simple trends.
To illustrate the value of using LFMs for inversion, let’s start with very simple trends on a facies-by-facies basis. We will use these to do an inversion and then follow it with Bayesian facies identification with Jason Fluid & Facies Probability (FFP).
In the first pass of inversion, we don’t know where the facies are so we will just use the shale trend.
No pay, or even much sand, was found. Why? Because some of the higher frequencies in the LFM carry the sand information, and the trend did not contain them. Can they be put into the trend?
Sometimes trends are just too simple to match the true geology.
Depending on acquisition parameters, the seismic band might not fully represent the play. Lower frequencies might be critically necessary. If that is the case, then they can be provided from logs or rock physics modeling.
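The mechanics of splicing a log-derived low-frequency trend with a band-limited inversion result can be illustrated with a simple frequency-domain merge. All signals, the 4 ms sampling, and the 4 Hz crossover below are assumptions for this sketch:

```python
import numpy as np

# Why missing low frequencies matter: a relative (band-limited) inversion
# carries no absolute level; the LFM supplies the 0-f_cut band from logs
# or rock physics. Synthetic example only.
dt = 0.004                                  # 4 ms sample rate
n = 512
t = np.arange(n) * dt

# Hypothetical absolute impedance: slow trend + reservoir-scale detail
trend = 6000 + 800 * t                      # compaction-style background
detail = 150 * np.sin(2 * np.pi * 30 * t)   # in-band variation
ip_true = trend + detail

def lowpass(x, f_cut):
    """Zero out spectral components above f_cut (Hz)."""
    X = np.fft.rfft(x)
    f = np.fft.rfftfreq(len(x), dt)
    X[f > f_cut] = 0.0
    return np.fft.irfft(X, n=len(x))

f_cut = 4.0
lfm = lowpass(ip_true, f_cut)               # trend from logs / rock physics
relative = ip_true - lfm                    # what band-limited seismic sees
merged = lfm + relative                     # absolute impedance recovered

err = np.max(np.abs(merged - ip_true))
print(f"max reconstruction error: {err:.2e}")
```

The `relative` signal alone oscillates around zero with no absolute level: facies cutoffs applied to it will miss sands that the full-bandwidth impedance would find.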
Using GeoSoftware's drill-bit-proven, interactive tools, you have a way to define a constant trend on structure for each of the native inversion properties that more closely matches the true geology.
Using Jason RockTrace, make simple trends hung on structure and otherwise identical at each well location. Add extra horizons (see above) so that the necessary upper part of the low-frequency band is captured. Do this for all RockTrace parameters: Ip, Vp/Vs, density. Now redo the inversion and FFP. You begin to see a clearer picture right away!
Next, include stacking velocity information (0-2 Hz) and facies-specific trend information where the sands and pay have been identified. Check out the final result below.