Albena Mateeva: DAS VSP applications to reservoir surveillance – overview of status


Albena Mateeva is an Earth Sciences Expert at Shell, which she joined 17 years ago after completing her PhD in geophysics at the Center for Wave Phenomena, Colorado School of Mines, in 2003. She has devoted most of her career to the development of novel technologies for borehole geophysics and low-cost areal surveillance, and her research contributions have been awarded no fewer than three best paper awards from the SEG. Today she will talk about distributed acoustic sensing for reservoir surveillance. So please, Albena, the screen is yours. ALBENA MATEEVA: Thank you very much, Adrien, and everybody. It's really my pleasure to be with you today.

Okay, here we go. I'll give you a little bit of an industry perspective on one of the most exciting technologies, I think, to come around in geophysics in the last decade: distributed acoustic sensing. DAS, as it's called, turns an optical fiber into an array of seismic receivers, and it has many possible applications. I'm only going to discuss one of them, which is DAS VSP, because this is one of the most mature and most versatile applications out there.

It's interesting to think that just 10 or 12 years ago it sounded almost like science fiction. I still remember the very first DAS VSP that was acquired by Shell, in 2009. Those were early days; people were looking at the data, squinting at it one way or another, and saying, well, maybe we can get a checkshot out of that. These days you can't go to a big geophysical conference without seeing several sessions on DAS VSP; it's considered technically proven, commercially available, and waiting for businesses to pick it up. It has really changed the way in which we can acquire, and aspire to use, borehole seismic data. And the biggest beneficiary of this change has been time-lapse surveillance. That's why the topic of today's presentation is DAS VSP applications to reservoir surveillance.

I'll give you a little bit of an overview of where we stand with that. Before I get going, two things. Many, many of my Shell colleagues have contributed to the work that I will show you. I can't name them all on this title slide, but you can find their names, of course, on some of the papers that I mention. The other thing is, I'm obliged to show you this very long and interesting disclaimer, which basically says: please take with a grain of salt any future-oriented statements that I make.

Okay, with that, we are free to go. I assume that most people in the room are in fact familiar with the technology, but just to make sure that everybody's on the same page: DAS VSP stands for vertical seismic profiling with distributed acoustic sensing. In a VSP survey we have seismic sources on the surface and receivers in a borehole, and in the case of DAS the receivers are a fiber. Having fiber in a borehole allows us to obtain 3D and 4D images around the well.

Let me just check, and sorry for interrupting, but you can still hear me, right? (HOST: Yes, we can hear you, no problem.) Okay, I'm going back to full screen. Okay. So these images that we get from DAS VSP are very, very similar to normal seismic images, but they have limited lateral extent, centered on the well, basically a small halo around the well. The exact size of that image depends on the geometry and the depth of the targets and so on, but generally speaking, imagine an image that extends only a kilometer or two around the borehole, as opposed to the tens of kilometres of normal surface seismic. Okay. So, for time-lapse surveillance we use fibers that are permanently installed in boreholes, either on tubing or behind casing.

But I should mention that it's also possible to acquire DAS VSP using temporarily deployed fiber, for example fiber that is incorporated in the wirelines that are normally used for well logging. Those wireline deployments are most suitable for exploration settings, or generally for wells in which we have some intervention planned anyway. For the purpose of surveillance, we really try to avoid entering the well; we don't want any well intervention. That's why we use permanent fibers.

They make acquisition very safe and non-intrusive, because the only thing you need to do to get data is hook up an interrogator to the surface end of the fiber and bring the seismic sources. Such easy acquisition is what allows us to do on-demand time-lapse monitoring. The good thing is that we can install permanent fiber in almost any type of well, injectors or producers, and the wells can be active during the VSP survey; we don't need to shut in the wells in order to get these data. That's impossible with geophones, for example. And of course the same fiber can be used for many other applications, such as distributed temperature sensing, flow monitoring and so on, which enhances the value of the fiber installation. Maybe the most important thing to realize about the use of fiber is that it's not simply a poor man's geophone; instead, the way we look at it is as an enabler of novel borehole seismic applications that were previously impossible with geophones, either for technical or for logistical and cost reasons. I already mentioned that the biggest beneficiary of this is reservoir surveillance. That's why I'll spend most of this talk on 4D applications, onshore and offshore, but with an emphasis on offshore and especially deep water, because that is where the biggest impact is. However, before we get to 4D novelties, I would like to revisit some classical 3D VSP applications, because a DAS VSP is still a VSP, and if you are getting 4D data you are anyway getting 3D data too, so you may want to know what to use it for.

You will notice that my slant here is very much about why we use, or don't use, certain applications, and not so much about how we actually do it. I know that you care about the how; I'll say a few words about that towards the end, but please feel free to ask me anything that I don't mention in the talk. Okay, so starting from 3D: a textbook case for using a VSP is imaging under complex overburden such as, for example, salt. For subsalt imaging it's known that 3D VSP can provide an uplift to image quality, but it used to be very, very expensive to do 3D VSP with geophones at the scale required for subsalt imaging. So this is one example in which DAS basically enables us to do something that has long been wanted but was cost-prohibitive in most cases.

In this example here, let me turn on my pointer, okay, this example of subsalt imaging is from a well, a very active producer, that was drilled through salt. You can see that the image you get from the DAS VSP is not very extensive laterally, but we get to see exactly where we care to see, which is where the well penetrates the subsalt sediments, and the image is an obvious uplift compared to surface seismic in that area.

Another type of complex overburden is the near surface onshore. It's notorious for causing problems with surface seismic in 3D, but it's even worse for 4D. In this case, DAS VSP helps in several ways. One way is that reflections pass through the complex near surface only once instead of twice. Another is that the receivers are below the surface, and therefore they hear less surface noise, most notably less ground roll, which is very difficult to clean up from 4D data. And perhaps the most important part is that any change that happens near the source, whether due to the near surface or the source itself, can be measured using the direct downgoing arrivals from the source to the downhole receivers, and then used to correct the reflections for that change. That's why DAS VSP is also beneficial to onshore surveillance.
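To make that last idea concrete, here is a minimal sketch, in Python, of a source-side correction of the kind described: estimate a time shift and an amplitude scalar from the windowed direct arrival of each vintage, then apply the inverse to the monitor trace. The simple shift-plus-scale model and the function name are illustrative assumptions, not Shell's actual workflow.

```python
import numpy as np

def direct_arrival_correction(base, monitor, dt, win):
    """Illustrative sketch: match a monitor trace to a base trace using
    only the direct downgoing arrival (selected by the slice `win`).
    Assumes the source-side change is a simple time shift plus scalar."""
    b, m = base[win], monitor[win]
    # Time shift from the peak of the cross-correlation of the direct arrivals
    lags = np.arange(-len(b) + 1, len(b))
    shift = lags[np.argmax(np.correlate(m, b, mode="full"))] * dt
    # Amplitude scalar from the RMS ratio of the windowed direct arrivals
    rms = lambda x: np.sqrt(np.mean(x**2))
    scale = rms(b) / rms(m)
    # Undo the shift on the full monitor trace via a phase ramp in frequency
    n = len(monitor)
    freqs = np.fft.rfftfreq(n, dt)
    spec = np.fft.rfft(monitor * scale) * np.exp(2j * np.pi * freqs * shift)
    return np.fft.irfft(spec, n)
```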

Okay, another iconic application of 3D VSP data is using the first arrivals for traveltime information. In the case of DAS, the notable thing is that you very easily get large surveys, 3D or 2D walkaways, as opposed to just one shot next to the well, and that allows you to do some pretty good traveltime inversions, for example for velocities in subsalt sediments. It also allows you to create virtual source data. You might remember that 10 or 15 years ago the virtual source method was a very hot topic in the VSP community, but it requires as input at least a walkaway VSP into downhole geophones, and that was a stumbling block, because it was expensive to get anything beyond a single walkaway line. So essentially, here again, this is a case in which the simplicity and lower cost of DAS enables us to get the necessary input data. Virtual source is useful for seeing under complex overburden; it's also used for getting very accurate interval velocities.

Another thing is, if you have a well that is near salt, whether just outside the salt or just inside it, you can use the data for salt proximity work. The nice thing with DAS here is that you have a very long receiver array, which allows you to move from ray-based applications, the classical salt proximity type of survey, to using full waveforms to image the salt flank. For example, RTM is a pretty good tool for this, because it can handle the multipathing that can occur quite easily if you have a very rough salt surface. So, the bottom line is that DAS VSP, even though it's mainly used for 4D, can also help the various classical 3D applications simply by making the data available. Okay, so now let's go into time lapse. For time lapse, let's first look at onshore. Onshore there are several domains of application. The very first idea we had when we became able to record 3D DAS VSPs was that we could use them in fields that are undergoing enhanced oil recovery.

Some of those fields, especially in the Middle East, tend to be drilled with very dense well patterns. So we thought, okay, we can install fibers in a number of wells and have them all listen to the same carpet of shots, and get time-lapse surveillance through repeated 3D DAS VSPs that overcome all of those challenges with the near surface that I mentioned two slides ago. However, when we tried that, we very quickly found a problem: these EOR fields have very congested and fast-evolving surface infrastructure. The poorly repeatable shot positions harm both the 3D images and the 4D differences. So this application is currently on hold because of the surface obstructions. Another domain of application is unconventionals.

That's really the birthplace of the DAS technology, and today there's an amazing variety of fiber-optic applications in unconventionals. DAS VSP is only a very small portion of that, but it allows you to see some very interesting things: you can do repeated VSPs between frac stages to monitor the dynamic behavior of the induced fractures, their opening and closing, their geometry, and so on. It all sounds very attractive, but the missing link here is more integration with other fiber-optic applications, so that these observations of the fractures actually lead to actions in the field; without such actions, businesses simply will not deploy a technology. So in the case of unconventionals this is partially a technical issue, but partially also a stakeholder-management one. Okay, one last domain of application, arguably a growing one, is CO2 sequestration. You know that humanity has the ambition to store more CO2 in the ground, and in doing that it's critical to be able to monitor the safe storage of the CO2.

It's mandatory, related to the license to operate, to have some monitoring and verification plan. DAS VSP, because it has better repeatability than other tools onshore, is actually an excellent candidate: you can put your fiber in the injection wells and then, even without stopping the CO2 injection, monitor what's going on around them, both for the purpose of conformance, which means knowing how your plume is evolving, and containment, which means making sure that the CO2 stays in the reservoir. We have one such example at one of the very first commercial CO2 sequestration deployments, Quest, which is onshore Canada. Since 2015 they have injected more than 5 million tons of CO2, and they do that through three injection wells.

Each of those wells has fiber behind casing, and for each of them multiple vintages of multi-azimuth walkaway DAS VSP have been acquired. As I mentioned, those are used to verify conformance and containment, and it's quite easy to observe the evolution of the plume over time. Careful reprocessing of this data set in fact led to a repeatability level, in what we call normalized RMS (NRMS, a measure of 4D noise), below 10%, which is truly excellent for onshore. Okay, one of the challenges here is that sooner or later these CO2 plumes will outgrow the area of illumination from the DAS VSP, and then the question will be what to do next. That will depend partially on what the need for monitoring is at that point in time, which in turn depends on how predictive we have proven the reservoir models to be. So one of the things that needs to be done with DAS VSP data in the early stages of injection monitoring, other than simply looking at the plume evolving, is using the data to calibrate the reservoir models so that they have better predictive power and less monitoring is required in the future. So much for onshore.
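For reference, the NRMS repeatability metric mentioned here is simple to compute between two co-located traces. A minimal sketch in Python, using the standard definition (0% for identical traces, roughly 140% for uncorrelated noise); this is generic, not Shell-specific code:

```python
import numpy as np

def nrms(base, monitor):
    """Normalized RMS difference between two co-located traces, in percent.
    Standard 4D repeatability metric: 200 * RMS(m - b) / (RMS(b) + RMS(m))."""
    rms = lambda x: np.sqrt(np.mean(np.square(x)))
    return 200.0 * rms(monitor - base) / (rms(base) + rms(monitor))
```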

Offshore the story's a little different. There is one main domain of application, which is frequent monitoring in fields that are undergoing some complex type of recovery, such as water injection or some other kind of stimulation, injecting something. Normally these fields are monitored with OBN, which is the gold standard for time-lapse monitoring in deep water. OBN is a type of survey that uses seismic nodes on the sea floor. It gives fantastic data over large areas, but the problem is that it's very expensive, so you cannot afford to do it very often. If you do surveys that are several years apart, that is good for making decisions that have a long lead time, such as where to drill the next well, but it's not enough for making decisions on a much shorter timescale, such as optimizing operations in existing injectors and producers. That is where new technologies such as DAS come in: being lower cost, they can show you what's happening in a certain small area that is of interest at this particular moment. And you can imagine that if you can see just what you need to see at any given moment, that may also allow you to postpone the wider, more expensive survey a little bit. So in the long run, flexibility with OBN scheduling is a cost-saving strategy.

However, doing such frequent monitoring is actually not easy, because there is a set of non-trivial requirements for the monitoring tool. It must be low cost, in order to be able to afford frequent acquisition. It must be available on demand, to answer urgent questions. It must be non-intrusive, so as not to disturb operations. And at the same time, it must be sensitive enough to detect the small changes that happen over a small time interval. You can see that some of these requirements are contradictory, for example low cost and sensitivity.

For that reason, the entire evolution of DAS VSP in deep water has been very much centered on striking the right balance between sensitivity and cost, or, to put it another way, on finding the lowest-cost acquisition that would still give you enough sensitivity to see what you want to see with enough confidence. I won't bore you with all the details of the maturation steps in deep water, but I can tell you that it took a lot of field testing of everything you could possibly imagine: from the logistics of acquisition, to the well conditions (flowing or not flowing), to where the fiber is, what type of fiber, what type of geometry. It also took lots of effort to reduce the source effort to the minimum possible, by shooting sparser shot lines and by using smaller sources that can be handled by lower-cost vessels. Basically, we do everything possible to harm our data quality, but amazingly DAS VSP keeps performing, and we are able to get fairly robust time-lapse results: we get NRMS, this measure of 4D repeatability, below 10% very routinely, even under horrible weather conditions in deep water.

We have tested this performance in two ways. One is simply in terms of time-lapse noise, the NRMS. The other is by comparing the results from DAS VSP to OBN 4D, which means we have eavesdropped on various OBN surveys, recording DAS VSP data during the OBN shooting. However, I want to point out the obvious: the only time you get new information from DAS VSP is if you use it between OBN surveys, not during them. Anyway, let me jump in and show you some examples of what type of information you can get from DAS. Here's one example; it's about monitoring the startup of a new injector. We acquired a DAS VSP between two OBN surveys that happened to bracket the startup of a new horizontal injector, which had been online for less than six months. You could see the hardening signal, this blue blob related to injection, very clearly, but it was only at the toe of the well. This is not what was expected; the reservoir model predicted something else, and it prompted an investigation: what is the root cause, the completion or the reservoir model?

Eventually we found that it's probably the reservoir model, because we had a saturation log showing the residual oil saturation in this injector before injection started. It showed more oil toward the heel and more water toward the toe, which was exactly the opposite of our starting water saturation model: according to the model, we expected more water toward the heel and more unswept oil toward the toe. So, log plus DAS led us to update the reservoir model to honor this more complex permeability pattern.

If we hadn't gotten this DAS VSP in 2017 and had waited for the next OBN, which was just one year later, it would have been a lot harder to spot this type of behavior; it was still there, just harder to see. So then the next question is, does it matter, and what does the water do next? Typically, reservoir engineers want to know whether the waterfront might be headed preferentially toward one producer or another. So, okay, we got another DAS survey in 2019, and to the surprise of everybody we found that the hardening signal corresponding to injection had not expanded. Basically, the data told us there was little progress of the observable injected water, with emphasis on observable, because that doesn't mean the water is not moving. It just means that perhaps it's moving away from this injector along some thin layer, the net thickness of which is not enough to be visible in the 4D DAS data, especially when its visibility is offset by a certain level of softening that can be related to pressure buildup, of which we see some indication in the data.

Blue in our plots generally means hardening, while red means softening, so we can see that this blue blob is less blue than that blue blob, which can happen when you have some softening on top of the hardening. Anyway, this points to the fact that this is a very complex reservoir that needs further monitoring, because the current hypothesis is that injection is happening into a sand body with limited connectivity. This type of information is really of interest to the assets that operate these fields, because if their sweep is not performing as intended, they need to know that and do certain things to help improve the sweep. Now another type of application, in a different reservoir in which injection is older. By the way, this reservoir is in the same field as the previous one, just at a different depth.

So it's being monitored by the same DAS surveys. We had a DAS survey in 2017 that basically showed that water was progressing updip (these are depth contours, so updip is this way) from its previously known location, and it seemed to be following a certain preferential path that looked nothing like the fairly uniform reservoir model we had at the time. So once again, an opportunity to calibrate the reservoir model.

Then, when an OBN came the following year, we noticed that the water propagation had changed direction: instead of continuing updip, it was now headed directly toward this producer. The interesting thing about that is that this enables a very tight prediction.

Typical reservoir models have an uncertainty on the order of years as to when the water will show up at a producer, but what production technologists want is a prediction with better than six months' uncertainty. That is their magical number that allows them to plan certain well treatments, as well as logistical things like the frequency of certain routine operations. So in this case we could meet this ambition with confidence, based on the DAS. The next thing we wanted to see was what the water sweep would do after it reached this producer, because there are several other producers in this field: two of them are in the attic over here, and one is to the south, with a suspected baffle between it and the nearest injector.

So we got another data set with DAS in 2019, and we saw very clearly that the baffle is indeed there and that the sweep is progressing updip toward the attic producers, but not toward the producer to the south. That helped the asset prioritize which wells to treat for scale; that's the type of treatment they do before water arrives. That doesn't mean surprises don't happen, though: a little later, some water did arrive at this producer sooner than we expected. So now the question is: was the baffle in fact leaking all along, along some thin layer that is basically below the noise floor of our 4D data? Or did water, soon after we acquired the survey, find its way around the baffle and onto the highway toward this producer? That is what we hope to find out with the next survey.

I hope you gather from these examples that we can see various quite useful and interesting things in our data. And you might say, okay, we're in great shape, the industry must be using this technology all over the place. It's not so simple, because in order to deploy this, what you need for a particular field is something like: if I see a certain time-lapse signal, I am going to do this and that, and the business impact from that would be such and such. It's easier said than done. The part about expected and observed signals, geophysicists are excellent at; we can predict what kinds of things can be seen.

But what can be done about it, that is in fact the specialty of other disciplines, the disciplines that actually operate the wells, which are the reservoir engineers and the production technologists, not the geophysicists. So here, cross-discipline collaboration is absolutely essential; the disciplines must speak each other's language and understand each other's options and problems as well. People are working on that, but that's one reason why it isn't quite so simple to deploy these things.

Okay, with that I will stop about the organizational challenges and say a few words about how we actually do things. Currently we do things in the simplest way possible. For example, for imaging we currently use primary reflections, and in order to make the image wider, what we normally do is record DAS VSP data in several wells at the same time, all listening to the same carpet of shots, and then we combine the images from the various wells.

There is another way to do this, which is based on using multiples. The benefit of using multiples over primaries is not only that you can get a wider image from one well, but also that you get a nice image above the VSP receivers, which you normally wouldn't see with primary reflections. In 3D, imaging with multiples, in its various kinds and flavors, has been around for many, many years; for 4D, however, it is still not mature, because handling the multiples' contributions in a repeatable way is more difficult.

Another thing that people often ask about is the pre-processing before imaging. If you look at the processing workflow, you will notice that it's in fact a combination of some very typical VSP processing steps and some very typical 4D processing steps. The only thing that is DAS-specific is one step, which is depth calibration.

So what is the issue here? The issue is that when you use DAS, you know your receiver positions along the fiber, but the fiber position with respect to the rock formation is uncertain, and that's related to complications with the fiber installation in the wells. In industry there are various depth calibration methods out there; one of the methods that Shell favors utilizes DAS amplitude information. Think about Hooke's law: DAS essentially measures strain. If the receiver is next to a soft rock and you impinge on it with a certain pressure wave, it will deform a lot; it will deform much less if the rock is stiff. So basically, that is the mechanism through which DAS amplitude depends on the local formation properties. This can be used in two ways.

First, in 3D, to find the absolute position of your fiber with respect to the formation, you can correlate DAS amplitude to available well logs. So you take the DAS amplitudes, extract amplitude scalars along the well, and try to correlate those scalars to the well logs. Right now in the DAS community there is quite a lot of debate about what the proper combination of density and sonic logs is to correlate DAS amplitude against. This question can be quite important if one's ambition is to use DAS amplitude to invert for local properties of the formation, such as stiffness.

However, for the purposes of depth calibration, I don't want to say we don't care, but we can be a little more relaxed: there are many, many combinations of logs that can visually, very clearly, tell you where your fiber is with respect to the formation. So for depth calibration you can get away with more qualitative things; we typically use something like ρc², from the density and sonic logs, to correlate against. Another thing is that in time lapse you don't even need the well logs. What you can do is take the amplitude scalars extracted from each vintage of the DAS VSP and cross-correlate them between themselves. That's much easier, because you have DAS amplitudes all along the well, unlike logs, which are usually only present over a short interval of the well; not to mention that you're correlating the same type of data between vintages. Anyway, we can talk a lot more about this, but I don't want to spend a disproportionate amount of time on it. Let me say a few more words about outstanding challenges, beginning with skills. Notice how here we have classical VSP processing combined with classical 4D processing.
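The time-lapse variant is essentially a one-dimensional cross-correlation of amplitude-scalar profiles along the well. A minimal sketch of that idea in Python; the channel spacing, search window, and function name are illustrative assumptions:

```python
import numpy as np

def depth_shift(scalars_base, scalars_monitor, dz, max_shift_m=20.0):
    """Estimate the relative depth shift between two vintages by
    cross-correlating their DAS amplitude-scalar profiles along the well.
    Assumes both profiles have the same length; `dz` is the channel
    spacing in meters; returns the estimated shift in meters."""
    a = (scalars_base - scalars_base.mean()) / scalars_base.std()
    b = (scalars_monitor - scalars_monitor.mean()) / scalars_monitor.std()
    max_lag = int(max_shift_m / dz)
    lags = np.arange(-max_lag, max_lag + 1)
    # Correlate a[i] with b[i + lag] over the overlapping samples
    cc = [np.dot(a[max(0, -l):len(a) - max(0, l)],
                 b[max(0, l):len(b) - max(0, -l)]) for l in lags]
    return lags[int(np.argmax(cc))] * dz
```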

There is no magic, but you may be surprised to realize how few people actually have this combination of skills; many people have one or the other, but not both. So when it comes down to processing, DAS processing success hinges on hitting the right combination of skills. That is also true for other things: the right combination of technical competencies can help you get the most out of DAS VSP data, for example knowing how to use seismic and in-well surveillance together, say for production profiling or injection profiling. I already emphasized multidisciplinary engagement; it's never too early to start learning the language of other disciplines, because geophysicists cannot work in isolation. And last but not least, stakeholder management. People usually think that technologies will popularize themselves if they're sufficiently good; nothing is further from the truth. It's all about being able to balance the needs, requirements, and viewpoints of the different stakeholders in a given field. That is also a skill very much worth developing; it's a learned skill.

And again, it's never too early to start. Okay, now on to some more technical things. There are things that have been a problem in VSP pre-processing for many, many years and somehow have never been fully addressed. One of them: I told you that we like the opportunity to use the direct arrivals, basically the downgoing arrivals, as measurements of what's happening in the near surface, and then to correct the reflections for overburden effects.

Those overburden effects can include shallow multiples. The issue is that with far-offset VSPs, the paths of the downgoing and upgoing arrivals to the receivers are not similar anymore. In the classical VSP application you have a 1D situation, the source next to a vertical well, so you can quite literally deconvolve the upgoing wavefield with the downgoing wavefield to suppress all of this overburden complexity. Not so at large offsets. There are some methods for dealing with the difference between downgoing and upgoing waves at larger offsets, but those rely on laterally invariant, basically horizontally layered, media. When you have a more complex medium, there is no analytical solution, no general solution to that. So these days one of the ideas is that it may be worthwhile to try machine learning, basically to learn to predict and clean up the reflections from these shallow complications, given that you have recorded both the downgoing and the upgoing wavefields.
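For the 1D case mentioned above, the classical up/down deconvolution is a stabilized spectral division of the upgoing by the downgoing wavefield. A minimal sketch in Python; the water-level parameter `eps` is an illustrative choice:

```python
import numpy as np

def updown_deconvolution(up, down, eps=1e-3):
    """Classical 1D VSP deconvolution: divide the upgoing wavefield by the
    downgoing wavefield in the frequency domain to collapse overburden
    effects such as shallow multiples. `eps` sets the water level."""
    n = len(up)
    U = np.fft.rfft(up, n)
    D = np.fft.rfft(down, n)
    water = eps * np.max(np.abs(D) ** 2)
    R = U * np.conj(D) / (np.abs(D) ** 2 + water)  # stabilized division U/D
    return np.fft.irfft(R, n)
```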

Okay, another thing: DAS is not a geophone. It differs in many ways, and instead of fighting that we'd better use it. One example is that we record DAS VSP data in active wells. That means a tremendous amount of noise, but instead of simply throwing it away, we can try using it; the typical example is using tube waves for completion monitoring, and that's something that is going on at various levels in different projects. Another thing is that DAS is not a point sensor. What DAS really measures is the stretch or squeeze of a certain length of fiber, called the gauge length.

So it's a linear sensor lying along the borehole. One of the choices people make in acquisition is what gauge length they want their data at, and that is related to the signal-to-noise ratio as well as to the resolution of the output data set. The typical direction of thought has been: let's optimize the gauge length, since for different wavelengths, for different formations with different velocities, you may want a different gauge length, so let's choose the best one. What about, instead of choosing the best gauge length, trying to combine different gauge lengths in order to exploit them somehow and get data with higher spatial resolution? Okay, I'm just mentioning these things in case you have some energy to work on any of them; they could be of interest.
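For intuition on the gauge-length effect, a DAS sample can be idealized as the relative elongation of a fiber segment of length G, which acts as a spatial filter with notches at wavelengths related to G. A toy sketch in Python, ignoring fiber coupling and interrogator details:

```python
import numpy as np

def das_response(displacement, dz, gauge_length):
    """Idealized DAS measurement: strain averaged over one gauge,
    [u(z + G) - u(z)] / G, for a displacement field `u` sampled
    every `dz` meters along the fiber."""
    g = int(round(gauge_length / dz))
    return (displacement[g:] - displacement[:-g]) / gauge_length

# Example: the same displacement field seen at two gauge lengths.
z = np.arange(0.0, 1000.0, 1.0)                      # fiber axis, 1 m channels
u = np.sin(2 * np.pi * z / 50.0)                     # 50 m wavelength signal
short = das_response(u, dz=1.0, gauge_length=10.0)   # finer spatial detail
long_ = das_response(u, dz=1.0, gauge_length=50.0)   # notches this wavelength
```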

And then one last thing, which is totally within the industry domain, not so much academia: maturing DAS systems for subsea wells. Everything I showed you so far was for wells that have a dry tree, so basically you have a platform offshore to which the fiber goes. But in subsea wells the wellhead is at the sea floor, and in those wells, for now, there are very few fiber installations. They're starting to come, but it's slow, because the available hardware still has some limitations on the conditions it can handle. Eventually it will come, and when it does, the applicable base will be much larger. One thing to know is that it's not just about the hardware; it's also about the interrogation system, because in subsea wells you would have many connections between different pieces of fiber, and each of those connections is a big optical loss; the fibers tend to be longer as well, which is an additional optical loss. So basically, interrogation systems that are suitable for short wells onshore are not going to be suitable for subsea, and vendors are working to address that. With that, let me sum up so that we have a few minutes for questions. DAS enables both classical and novel VSP applications, mainly by allowing non-intrusive, low-cost acquisition, and it's a step change in our ability to monitor rather localized subsurface changes.

The technical capabilities of DAS VSP have been proven very well by now by multiple field trials. Onshore, maybe the most important application is CO2 sequestration monitoring, because there DAS has benefits over other seismic methods. Offshore, the biggest thing is that it is complementary to OBN. And there are definitely further opportunities to leverage this technology.
