This newsletter is sent out to all members of the Geomodeling Network every second month.
Aug 2008 The Geomodeling Network Newsletter

1. Member Articles, Reviews & Questions

1.1 Modeling NTG and Associated Properties - Workflows and Pitfalls
Jose Varghese (Jose.varghese@shell.com)

Defining Net to Gross (NTG) at the well level and modeling it in the static reservoir model play an important role in the hydrocarbon volume calculation. Though it may sound simple, NTG calculations during reservoir modeling can easily go wrong. This article highlights some of the issues and recommended workflows, and invites comments and suggestions from readers.

WORKFLOW PRACTICES

When it comes to defining and modeling NTG, many workflows are in use, such as:

- Use gamma ray, Vshale, porosity, or a combination of these to define net and non-net intervals at the well log scale.
- Use the facies log (created from log cut-offs or interpreted manually) to create an NTG log (e.g. NTG = if(facies=0, 0, 1); i.e. if the facies code is non-reservoir, set NTG to zero, else set it to 1).
- Once the 3D facies model is created, generate a binary NTG model from the facies model itself (e.g. NTG_model = if(faciesmodel=1, 1, 0); this creates an exact copy of the facies model).
- Upscale the binary NTG log into the geocellular model and interpolate it independently.
- Upscale the binary NTG log into the geocellular model and interpolate it conditioned to the facies model, keeping NTG = 0 in non-reservoir facies.
- Upscale the binary NTG log into the geocellular model and interpolate it conditioned to the facies model, modeling it in all facies (including non-reservoir facies).
- Keep the property values as zero in the non-reservoir interval, then upscale and model conditioned to facies.
- Upscale cut property logs (porosity, permeability, etc.) and model them conditioned to facies, assigning zero values in non-reservoir facies.
- Upscale cut property logs (porosity, permeability, etc.) and model them conditioned to facies, modeling them in all facies.

Watson: "Holmes! What kind of rock is this?"
Holmes: "Sedimentary, my dear Watson." ...I'll get my coat!

It can be seen that there are issues with some of these workflows. The issues become significant when the cell thickness or layering scheme in the model is very coarse and the heterogeneity seen in the logs is not captured by the layering scheme. We will look at some of these issues in detail in the following sections.

DEFINITION

As mentioned earlier, NTG is defined either by using a log cut-off or by using a reservoir / non-reservoir discriminator log (a facies log). One QC method for checking the upscaled/modelled result is to compare the Equivalent Pore Column (EPC) between the well-level properties (as shown below) and the corresponding model-derived properties. They should match when the layering scheme has properly captured the heterogeneity seen in the wells.

The Geomodeling Network – Sponsored by Blueback Reservoir (www.blueback-reservoir.com)
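The EPC comparison described above can be sketched in a few lines of numpy. This is a minimal illustration, not from the newsletter; the function name, sample spacing, and log values are all hypothetical.

```python
import numpy as np

def equivalent_pore_column(thickness, ntg, porosity):
    """Equivalent Pore Column: sum of thickness * NTG * porosity over an
    interval. The result carries the units of thickness (e.g. metres)."""
    thickness = np.asarray(thickness, dtype=float)
    ntg = np.asarray(ntg, dtype=float)
    porosity = np.asarray(porosity, dtype=float)
    return float(np.sum(thickness * ntg * porosity))

# Well level: fine 0.5 m samples with a binary NTG log (hypothetical values).
well_epc = equivalent_pore_column(
    thickness=[0.5] * 8,
    ntg=[1, 1, 0, 1, 1, 0, 1, 1],
    porosity=[0.2, 0.22, 0.0, 0.18, 0.2, 0.0, 0.21, 0.19],
)

# Model level: two coarse 2 m cells covering the same 4 m interval,
# with upscaled (non-binary) NTG and net-rock porosity.
model_epc = equivalent_pore_column(
    thickness=[2.0, 2.0],
    ntg=[0.75, 0.75],
    porosity=[0.2, 0.2],
)

print(well_epc, model_epc)  # should agree if upscaling preserved the heterogeneity
```

If the two numbers diverge across many wells, the layering scheme is too coarse to carry the heterogeneity from the logs into the model.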
Many of the issues come up during the modeling stage, not at the well log scale of interpretation. For mappable shale layers (deterministic shales), as shown in the figure below, NTG can simply be assigned to zero. The accuracy of the layer boundary definition is user controlled, and the net pore volume in this case is zero; there is no need even to make layers in such a unit. But for shale intervals interpreted in wells that are not correlatable (present in the reservoir units and needing to be distributed stochastically in a model), the definition and modeling of NTG bring up issues in the workflows practiced.

MODELING ISSUES

"Can ye make a model of it? If ye can, ye understands it, and if ye canna, ye dinna!"
- Lord Kelvin (supposedly)

Consider a perfect case of layering, as shown below, in which the original facies log and the upscaled cells match exactly; in other words, each cell represents 100% of the same facies at the well log level. This remains an ideal case, since the user does not have control over an exact match between facies boundaries and layer boundaries. It can be approximated by using a fine layering, but sometimes the model dimensions and the modeling strategy require a coarser layering.
Now consider the following situation, where the layering is a bit coarser. When the facies logs are upscaled, the resulting cell facies do NOT represent 100% of the same facies at the well log level. Consider the reservoir and non-reservoir facies cells inside the red circles: the reservoir cell is not 100% reservoir, and similarly the non-reservoir cell is not 100% non-reservoir. In such cases, the different upscaling and modeling practices have different impacts on the calculated GIIP or STOIIP.
Consider the following case. If NTG is created by using the facies in the model as the criterion (i.e. NTG_model = if(faciesmodel=1, 1, 0), which creates an exact copy of the facies model), then, as seen in the red rectangle, we overestimate reservoir facies in some cases (green rectangle) and underestimate it in others (pink rectangle). So it is clear that the "binary NTG in the model" is not a representation of the facies at the well log level.

The correct procedure is therefore to upscale the binary NTG log (the raw log) so that it becomes a non-binary log (all values between 0 and 1 are possible). This accounts for the reservoir and non-reservoir fractions lost during upscaling. Consider the upscaled non-binary NTG in the figure above (blue rectangle), and look at the cell with a yellow boundary. It is a non-reservoir cell in the model, but it actually contains about 5% reservoir facies. A binary NTG of 0 will not account for this, whereas the upscaled NTG value of 0.05 does. When there are many wells and similar discrepancies due to coarser layering occur, the sum of all such discrepancies results in an incorrect GIIP/STOIIP.
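The 5% example above can be reproduced numerically. This is an illustrative sketch only; the fine-scale sample values and the cell volume are hypothetical.

```python
import numpy as np

# Fine-scale binary NTG samples falling inside one coarse cell:
# 1 = reservoir, 0 = non-reservoir. Here 1 of 20 samples is reservoir.
ntg_log = np.array([0]*19 + [1])

# A binary NTG copied from the facies model records 0 for this cell,
# because the dominant facies is non-reservoir (crude "most of" pick).
facies_cell = int(round(ntg_log.mean()))
binary_ntg = 1 if facies_cell == 1 else 0

# Upscaling the NTG log itself (arithmetic mean) keeps the 5%
# reservoir fraction that the binary value throws away.
upscaled_ntg = ntg_log.mean()

gross_volume = 1.0e6  # m3, hypothetical cell volume
print(binary_ntg * gross_volume)    # net rock volume lost entirely
print(upscaled_ntg * gross_volume)  # 5% of the cell preserved as net rock
```

Summed over thousands of cells along many wells, the first line is exactly the GIIP/STOIIP discrepancy the article describes.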
Summary

- Always use a facies interpretation (reservoir / non-reservoir discrimination).
- Always use a binary NTG at log level and upscale it to the model.
- Make sure that the R/NR log used in defining NTG is consistent with the current facies log/model used.
- Do not use a binary NTG created directly in the model from facies.

MODELING PROPERTIES

"The primary role of the geologist is to recognise the existence of phenomena before trying to explain them"
- B.M. Keilhau, 1828

Once the NTG is upscaled, how do we proceed with modeling NTG as a full 3D model? The practices seen for creating a 3D NTG property are:

- Create the NTG model directly from the facies model (discussed in the earlier section, and identified there as the wrong method).
- Model NTG using the upscaled NTG log, independently of facies.
- Model NTG using the upscaled NTG log, conditioned to facies, but assigning zero in non-reservoir facies.
- Model NTG using the upscaled NTG log, conditioned to facies, modeling all facies.

If NTG is modeled independently of facies, it may result in scenarios where the interpolated NTG model shows a low value in a place where the facies model shows a good reservoir facies. This results in inconsistency. But if the modeling is conditioned to the facies (NTG modeled in each facies separately), the resulting NTG model will be consistent with the facies model.

While conditioning to facies, if NTG is assigned as 0 in the non-reservoir facies, the same mistake of making a binary NTG in the model is repeated. In other words, the NTG value in a non-reservoir cell is not necessarily always zero: that cell may have a (perhaps low) representation of reservoir facies as well, and hence a low but non-zero NTG. If we assign this to zero, we are effectively losing that much reservoir volume. Summation of all such "small" errors in a case with many wells results in volumetric discrepancies.
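The effect of modeling NTG in all facies, including non-reservoir, can be illustrated with a deliberately crude stand-in for geostatistical interpolation: filling each facies class with the mean upscaled NTG seen in that class at the wells. All values here are hypothetical; a real workflow would use kriging or simulation per facies, not a class mean.

```python
import numpy as np

# Upscaled (non-binary) NTG at well cells, with each cell's facies code
# (1 = reservoir, 0 = non-reservoir). Hypothetical well data.
well_facies = np.array([1, 1, 1, 0, 0, 0])
well_ntg    = np.array([0.95, 0.80, 0.90, 0.05, 0.00, 0.10])

# Per-facies mean of the upscaled NTG, used to populate the 3D grid.
mean_ntg = {f: well_ntg[well_facies == f].mean() for f in (0, 1)}

grid_facies = np.array([1, 1, 0, 0, 1, 0])  # facies model away from wells
grid_ntg = np.array([mean_ntg[f] for f in grid_facies])

print(mean_ntg[0])  # non-reservoir cells get a LOW but NON-ZERO NTG
print(mean_ntg[1])  # reservoir cells get a high NTG, consistent with facies
```

Forcing `mean_ntg[0]` to zero instead would discard the small reservoir fractions carried by the non-reservoir cells, which is exactly the volumetric error described above.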
Summary

- Model NTG by conditioning to the facies model.
- Model NTG in all facies, including non-reservoir.

UPSCALING/MODELING OF OTHER PROPERTIES

Similar issues can be seen with the upscaling and modeling of other properties, like porosity and permeability. Usually, when the petrophysicist delivers the processed porosity log, it has a zero value in the non-reservoir intervals. These zero values influence the upscaling process and can cause double dipping in the net pore volume, and hence in the GIIP/STOIIP results, because zero is treated as a value and used for averaging during upscaling.

Consider the example shown in the following figure. The cell with a red rectangle on it has a reservoir facies, but it is not 100% reservoir facies: as shown by the upscaled NTG, the cell is 60% reservoir and 40% non-reservoir. Yet for the 60% reservoir facies, the corresponding porosity is a low value of 0.12, which is not representative of the reservoir facies. In other words:
Net Pore Volume = Gross Volume * NTG * Porosity of the reservoir facies
                = Gross Volume * 0.6 * 0.12
                = 0.072 * Gross Volume   --- double dipping

But it should have been:

Net Pore Volume = Gross Volume * 0.6 * 0.2
                = 0.12 * Gross Volume

This issue can be solved by making the porosity values (and other property values) in the non-reservoir interval "Undefined". Consider the following illustration. The impact of undefining the property values before upscaling can be seen in the property values of the upscaled cells. Consider the cell with the reservoir facies (red rectangle outline): it has 30% non-reservoir in it, but the porosity assigned to that cell is representative of the reservoir facies only. Now consider the cell with the non-reservoir facies (orange rectangle outline): though it is a non-reservoir cell, it is not 100% non-reservoir; it has 20% reservoir facies (with 20% porosity as well). That brings up another question.
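The double-dipping arithmetic above can be verified directly. This sketch reproduces the 0.6 / 0.12 / 0.2 numbers from the example; the ten fine-scale samples are a hypothetical discretisation of that cell, and NaN stands in for "Undefined".

```python
import numpy as np

# Fine porosity samples inside one coarse cell: 60% reservoir at 0.20
# porosity, 40% non-reservoir delivered as zeros by the petrophysicist.
porosity_with_zeros = np.array([0.2]*6 + [0.0]*4)
ntg_cell = 0.6  # upscaled non-binary NTG of the cell

# Averaging the zeros in drags porosity down to 0.12, and the NTG
# discount is then applied a second time: double dipping.
wrong_por = porosity_with_zeros.mean()
wrong_npv = ntg_cell * wrong_por   # 0.072 * Gross Volume

# Setting non-reservoir samples to Undefined (NaN) and using a
# NaN-aware mean keeps the porosity representative of net rock only.
porosity_undefined = np.where(porosity_with_zeros == 0.0, np.nan, porosity_with_zeros)
right_por = np.nanmean(porosity_undefined)
right_npv = ntg_cell * right_por   # 0.12 * Gross Volume

print(wrong_npv, right_npv)
```

The NTG term already accounts for the non-reservoir fraction, so porosity must be averaged over net rock only; NaN-aware averaging is the numerical equivalent of the "Undefined" recommendation.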
If porosity is modeled by conditioning to facies AND assigning a zero value to non-reservoir facies, what happens to those non-reservoir cells which have a reservoir facies fraction in them? It can be seen that if a zero porosity value is assigned to all non-reservoir facies, the cumulative effect of many such non-reservoir facies cells (with some fraction of reservoir facies in them) results in volumetric discrepancies. Although the recommended approach creates porosity values in non-reservoir facies, these co-exist with very low NTG values, and hence the net pore volume will be correct.

Summary

- Set the properties to Undefined in the non-reservoir sections.
- Model the properties conditioned to facies, and in all facies.

------------------------------

1.2 To model Net to Gross or not to model Net to Gross!
Juan Cottier – Blueback Reservoir (juan.cottier@blueback-reservoir.com)

"The purpose of war is not battle but victory."
or: The purpose of analysis is not modelling but understanding.
- Sun Tzu, The Art of War, ca 500 BC

In my view there are two overwhelmingly important and connected issues to consider here:

Firstly: net to gross is an artificial construction; it does not exist and it should not be modeled in 3D.

Secondly: any attempt to model N:G will fail because of simple issues regarding scale and selection.

So, why does net:gross not exist? The concept of "net rock" has been around since Schlumberger ran the first log in 1927, and probably before that. The idea of "pay" or "producing intervals" or "kh" from a well test is standard oilfield practice for reservoir engineers, but it is very different from a geoscientist's idea of N:G.
Net to gross is an artificial construction that allows one-dimensional data, namely a well log, to be used to explain a three-dimensional asset. If the well results are not considered in three dimensions, then a "good rock / bad rock" discriminator can be applied and everyone can be happy with their "understanding" of their lovely new well and pat themselves on the back. As geologists, we should know that a 4 metre thick channel with a 2 cm shale drape on the top can extend for kilometres in one direction, and can pinch out into a 4 m shale sequence within a few metres in another direction.

"You can have it good, fast, or cheap: pick any two."
- The Project Manager's Maxim

Texans drilling wells in the 1930s and 1940s used net to gross because they had no choice. Those net to gross values were contoured up to create reservoir quality maps. When mapping software came into play in the 80s, the old-school geologists would complain about the mapping algorithms, and for very good reason, because they were not thinking geologically: fluvial channels, stacked dunes, delta-front beach sands, or offshore sand bars.

In 3D we should model what we think represents the subsurface. We can use facies modeling; we can attempt to describe a 3D volume with the detail, the heterogeneity, and the complexity. We don't need to start the process by defining "good rock / bad rock". Nor should we. All rock is equal, comrades, even if some rock is ultimately more equal than others.

So, why is it impossible to model correctly? Well, there is a scale issue to start with. See the attached photo. There is no doubting that there is excellent sand (orange) and non-porous shale (grey). There is also little doubt that it would be possible to sum the relative proportions of sand and shale and come up with an N:G. But look at the lens cap(*).
We are looking at beds considerably smaller than 6 inches, which is the standard sampling interval for logs; the resolution of the tools may well be coarser than that, so how could you possibly get a correct result from log data? A gamma log, for example, would be smeared with "average values". It is possible to use curve inflections, or the curve's tendency towards a value rather than an absolute value, to identify thin beds. But still. It is also worth noting that these are not what would be described as "thin beds" in a "thinly bedded" or "tiger stripes" reservoir. I have worked on a field in West Africa where the sand-shale couplets were providing sand beds of 2-5 mm.
Rockefeller once explained the secret of success: "Get up early, work late - and strike oil."
- Joey Adams

It is also worth considering what is used as a cut-off. Vshale? From a gamma log? How is the radioactivity of a rock any indication of its ability to flow oil? Well, it's probably because you get a gamma log with any tool run, and it's the one bit of log analysis any geoscientist can do. What about porosity? A porosity cut-off at 10%? 5%? Why? Absolutely no reason. Probably because, like many decisions in the oil world, it "feels about right".

But surely porosity can be a direct link to permeability? And permeability is about flow, and flow is the discrimination between "good rock / bad rock". Excellent! That means we should use permeability as a cut-off, doesn't it? The rule of thumb in the oil patch is 1 millidarcy for oil and 0.1 millidarcy for gas. Well, for a start, there is no way to directly measure permeability in the subsurface, other than perhaps with NMR tools. Most perm logs come from a transform of a porosity log, and that transform often comes from a relationship identified in core plugs. These plugs are samples of rock that have been sitting around for millions of years doing nothing and then, within a matter of weeks, are taken from pressures of thousands of PSI to atmospheric, shipped across the world, washed and cooked, and then tested. No wonder a core plug perm is always 20 times less than that identified by a well test. Which brings us
back to scale. So do we apply our N:G cut-off to the core perm or the well test perm? Air perm or in-situ perm?

"The product of an arithmetical computation is the answer to an equation; it is not the solution to a problem."
- Ashley Perry

Consider the following: we use a permeability cut-off for defining net:gross. We now have non-net rock (bad) which doesn't flow and net rock (good) which does flow. But of course it's not the perm that flows oil, it's the relative perm. The relative permeability depends on the saturation of oil versus the saturation of water. Which is the continuous phase, oil or water? So should our N:G cut-off be relative permeability? As we produce oil and inject water, our oil and water saturations will change, and so will our relative perms, and so will our continuous and non-continuous fluids. And this is only two phases; shall we add gas and make it three? Do we need to constantly update our N:G? Time-lapse net to gross modeling?

For those of you still reading, here is Darcy's Law, straight from Wikipedia: "The total discharge, Q (units of volume per time, e.g., m³/s) is equal to the product of the permeability (κ, units of area, e.g. m²) of the medium, the cross-sectional area (A) to flow, and the pressure drop (Pb − Pa), all divided by the dynamic viscosity μ (in SI units e.g. kg/(m·s) or Pa·s), and the length L the pressure drop is taking place over."

"Say you were standing with one foot in the oven and one foot in an ice bucket. According to statistics, you should be perfectly comfortable"

That means that if permeability is calculated from Darcy's Law, then it is proportional to viscosity and inversely proportional to the pressure drop. So if gas evolves from the oil, then the perm will change, and the rel perm will change, and our net to gross will change? Really? No, of course not, because net to gross does not exist.
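The quoted definition of Darcy's Law is simple enough to write down directly. This is a hypothetical helper for illustration; the input numbers (a 100 mD sand, 1 bar drop over 10 m, 1 cP oil) are example values, not from the article.

```python
def darcy_flow(k, area, dp, mu, length):
    """Darcy's Law in SI units: total discharge Q = k * A * dP / (mu * L),
    with k in m^2, area in m^2, dp in Pa, mu in Pa.s, length in m -> Q in m^3/s."""
    return k * area * dp / (mu * length)

MD_TO_M2 = 9.869e-16  # 1 millidarcy expressed in m^2 (approximate)

q = darcy_flow(
    k=100 * MD_TO_M2,  # 100 mD sand
    area=1.0,          # 1 m^2 cross-section
    dp=1.0e5,          # 1 bar pressure drop
    mu=1.0e-3,         # 1 cP oil
    length=10.0,       # over 10 m
)
print(q)  # discharge in m^3/s
```

Note that k appears only as a proportionality constant between flow and pressure gradient; nothing in the formula defines a cut-off between "net" and "non-net", which is the author's point.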
- Bobby Bragan, 1963

Then again, try telling a senior geoscience manager who has had a long and successful career using N:G that it doesn't exist.
Much of this could well be rubbish... but it's worth a thought for a moment or two.

----------------------------

1.3 Probabilistic vs Deterministic Results when computing GIIP or STOIIP
Jose Varghese – Shell (jose.varghese@shell.com)

Question 1: When we compute GIIP or STOIIP using probabilistic methods and deterministic methods (low case, mid or most-likely case, and high case), is it ALWAYS true that the P50 case of the probabilistic method should be near the deterministic mid (most likely) case, and likewise Low near P90 and High near P10?
Question 2: Can there be a situation where the deterministic cases, like the low case and the high case, are not captured at all in the probabilistic ranges? (Logically, they should be captured in the probabilistic range.) If the answer to this question is NO (i.e. the deterministic cases should always be within the probabilistic range), then my real problem comes up (next figure).

Example case: varying three contacts in a probabilistic workflow.
While running the Monte Carlo workflow, let's say each of the contacts is set up as a variable with a normal distribution. In every run, a random value is chosen for each contact. But that random selection of all three contacts NEED NOT be like the one shown in blue (all lows in one run, or all highs in one run). Only if this is achieved would we be able to include the deterministic low and high cases within the probabilistic ranges. If the situation is like the one shown in red, the probabilistic range will not include the deterministic low and high values.

The deterministic low case is calculated by taking ALL low-case contacts (and probably, but not necessarily, all low cases of the other parameters like NTG, porosity, etc.). That is, for the contacts in all three zones shown above, the deterministic low case takes the lowermost value only. Similarly for the base/mid case and the high case.
Do the simplest thing that could possibly work.
- Kent Beck

One solution is to run the simulation for a large number of runs, expecting that some runs will capture all low values and some all high values, thereby simulating GIIP/STOIIP values near the deterministic low and high cases. But again, this NEED NOT always be true.

When a deterministic low or high is calculated, we introduce a DEPENDENCY: if a low value is selected for one parameter, the other parameters are also selected from their low values. This kind of dependency is difficult to introduce in a Monte Carlo workflow, or a very large number of runs may be needed. (Please correct me if I am wrong: in Petrel it is difficult; what about other 3D modeling software?)

If the parameters can be related in some way, the dependency can be achieved, e.g. Contact 3 = Contact 1 - XX metres. Now vary Contact 1 with a normal distribution: in every run, for every value of Contact 1, Contact 3 also gets a value with a similar trend (i.e. high or low).
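The point about dependency can be demonstrated with a small numpy Monte Carlo sketch. All numbers (contact depths, standard deviation, offsets) are hypothetical; the "all-low" threshold is taken at roughly the P10 level of each contact (mean - 1.28 standard deviations).

```python
import numpy as np

rng = np.random.default_rng(42)
n_runs = 10_000

# Three fluid contacts, each Normal(mean, sd), depths in metres (hypothetical).
means = np.array([2000.0, 2100.0, 2200.0])
sd = 10.0
low = means - 1.28 * sd  # per-contact "low case" at roughly the P10 level

# Independent sampling: each contact drawn separately in every run.
indep = rng.normal(means, sd, size=(n_runs, 3))
# With independence, P(all three low at once) is roughly 0.1^3 = 0.001,
# so the deterministic all-low combination is almost never reproduced.
all_low_indep = np.mean((indep < low).all(axis=1))

# Dependent sampling: tie the contacts together with fixed offsets
# (Contact 2 = Contact 1 + 100, Contact 3 = Contact 1 + 200), so one
# random draw moves all three contacts up or down in step.
c1 = rng.normal(means[0], sd, size=n_runs)
dep = np.column_stack([c1, c1 + 100.0, c1 + 200.0])
# Now all three are low together whenever Contact 1 is low: about 10% of runs.
all_low_dep = np.mean((dep < low).all(axis=1))

print(all_low_indep, all_low_dep)
```

This is why simply increasing the run count helps only slowly: with three independent parameters the all-low combination appears in about one run in a thousand, whereas the fixed-offset dependency recovers it in about one run in ten.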
BUT this is possible only if the two parameters have the same standard deviation (please correct me if I am wrong). The same question applies to dependency between properties like porosity, permeability, saturation, etc.

------------------------------