GeoTeam_GT-150-13_Data Processing Report_Rev01
Data Processing Technical Report

Processing is the art of how to make the data match reality...

This document contains my technical comments on free span detection and CP
measurements for the Bight of Biafra Field project, covering the data
acquisition, data processing and deliverables status from my point of view
as a senior data processor.

Mohamed Metwally
GeoTeam
11/4/2015
GeoTeam | Confidential
Page | 2
Data Processing Technical Report
By: Mohamed Metwally
CONFIDENTIAL DOCUMENT
FOR
GEOTEAM OFFICE
Table of Contents

Data Acquisition
    TSS Verification
    Digital Terrain Model
Data Processing
    Navigation
    Blocks Overlap
    Sensors
    Digitizing the Pipe
    Fuzz Factor
    Events
    Events in Overlap Area
    Offline Processing Log
Recommendations
Table of Figures

Figure 1  TSS parameters
Figure 2  Pipe tracker depth above the DTM in a burial area
Figure 3  PL#19 bad DTM
Figure 4  PL#21 bad DTM
Figure 5  Kalman filter parameters
Figure 6  PL#30 navigation area to be removed
Figure 7  Overlap points causing bad SD calculations
Figure 8  Interpolate outliers
Figure 9  Interpolate points with no data from the TSS
Figure 10 Save Kalman filter as digitized line
Figure 11 Pipe tracker filter flexibility settings
Figure 12 Digitized line properties
Figure 13 Drape and smooth digitized line
Figure 14 Join digitized lines into one line
Figure 15 Event KP is an alternative for event position
Figure 16 Event position user offset in NE
Figure 17 Event taken in a blocks overlap area
Data acquisition
TSS Verification:
At first look at the data, I observed a difference between the pipe
tracker depths and the pipeline crown in the DTM in sections where the
pipe was exposed and the pipe crown was visible in the DTM.
Prior to commencing the project survey operations, it is standard
procedure to conduct an intersystem comparison between the TSS TOP and
the TOP derived from either the DHSS or the MBES. This provides the
client with proof that the TSS is configured correctly and that the
scale factor is valid.
The following steps should be followed when conducting a TSS
verification survey:
1. Identify an area of exposed product (approx. 200 m).
2. Conduct the background compensation and seawater check as per
procedure.
3. Starting at an altitude of 0.5 m, fly the ROV along the product for
20 m, maintaining a speed of approx. 600 m/hr.
4. Increase the ROV altitude by 0.5 m every 20 m until the TSS is no
longer able to track the product.
5. Having logged 20 m of data where the TSS is unable to track, decrease
the altitude again in 0.5 m increments until the ROV is as low as
possible.
6. After 20 m, rise the ROV to the optimum survey altitude, dictated by
the TSS target scaling report, and log 40 m of data at this altitude.
7. The online surveyor should ensure that there are at least 20 m of
good-quality TSS data before stepping the altitude up or down; this may
require a distance far in excess of 20 m to be flown.
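The stepped-altitude run described in steps 1-7 can be sketched in a few lines of Python. This is only an illustration of the leg schedule: the altitude at which the TSS loses track and the optimum survey altitude are made-up example values (in reality they come from the site test and the target scaling report).

```python
# Illustrative sketch of the TSS verification altitude schedule.
# lost_alt (where the TSS can no longer track) and optimum_alt are
# assumptions for this example, not values from the project.

def tss_verification_schedule(start_alt=0.5, step=0.5, lost_alt=3.0,
                              optimum_alt=1.5, leg_len=20.0):
    """Return a list of (altitude_m, leg_length_m) legs."""
    legs = []
    alt = start_alt
    # Step up in 0.5 m increments, logging 20 m per altitude,
    # until the TSS loses track of the product
    while alt <= lost_alt:
        legs.append((alt, leg_len))
        alt += step
    # Step back down in 0.5 m increments to the lowest practical altitude
    alt -= 2 * step
    while alt >= start_alt:
        legs.append((alt, leg_len))
        alt -= step
    # Final leg: 40 m at the optimum survey altitude
    legs.append((optimum_alt, 2 * leg_len))
    return legs

schedule = tss_verification_schedule()
```

With these example values the run consists of six legs up, five legs down and one final 40 m leg.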
I explained the importance of the TSS verification to the Report
Coordinator, who told me that he had not conducted this test before. He
contacted the party chief by phone, who asked me about the test
procedure, so I explained the above steps to them. I was then told that
the altimeter of the ROV was not working properly and that we could not
conduct the TSS verification test.
Up to my crew change, the only TSS-related check carried out was the
background compensation test, which is correct in itself but serves a
completely different purpose from the TSS verification, as explained
below.
In order to tare the TSS system against the static noise generated by
the vehicle, the online surveyor must conduct a background compensation
check. This quantifies the amount of background magnetic noise and
effectively zeros the system prior to tracking the product.
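What the compensation achieves can be shown with a toy example: readings logged away from any product are averaged and subtracted, so the system reads near zero until a conductive target appears. All numbers here are invented for illustration; the real procedure is carried out in the TSS surface unit.

```python
# Toy illustration of background compensation: subtract the mean of
# readings logged with no product present, so only the product's
# signal remains. Values are made up.

def compensate(readings, background):
    offset = sum(background) / len(background)
    return [r - offset for r in readings]

background = [10.2, 9.8, 10.0]          # static vehicle noise, no product
raw = [10.0, 10.1, 35.0, 60.2, 10.0]    # product appears mid-run
zeroed = compensate(raw, background)
```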
When encountering different types of product, it is important to
understand and predict how the TSS will perform. Failing to account for
these differences will undoubtedly lead to an incorrect calculation of
the TOP and could eventually lead to false reporting to the client.
Figure 1 TSS Parameters
Because a correct target scaling factor and coil thresholds were missing
from the TSS parameters in the ROV surface unit PC, we had a lot of
trouble gathering the data. A couple of days before my crew change, the
ROV was unable to detect the pipe because it was going deeper into
burial. I asked the RC to check the threshold used to detect the pipe,
and following the feedback from the ROV pilot the threshold was changed;
the pipe tracker now gives better data and detects the pipe deeper than
before.
Figure 2 Pipe tracker depth above DTM in burial area
Digital Terrain Model:
For the reasons given in the previous section regarding the TSS
parameters, the ROV was not able to fly on top of the pipe at a
reasonable altitude to get good coverage from the profilers (swath
width), because the pipe tracker signal would be lost if it flew much
higher. With the lack of visibility, the only option was to rely on the
pipe tracker to follow the pipe even where it was exposed.
This low altitude degrades the quality of the DVL data and the DTM data,
because of objects thrown up from the seabed by the ROV thrusters.
Furthermore, if we secured good data from the pipe tracker or DVL, it
could be used to remove the swell artifact and obtain a better-quality
DTM.
Figure 3 PL#19 Bad DTM
Figure 4 PL#21 Bad DTM
All sensor offset values should be recorded in the mobilization report
and kept up to date with any change. At the start of my trip, the
profiler offsets were being changed during ROV maintenance without
measuring the new values and applying them in NaviScan. This creates
fake relief in the DTM at the meeting point between data gathered with
the old values and data gathered with the new ones.
No alignment test was conducted for the profilers, even when the
profilers were relocated between vehicles (Cougar/Panther). This causes
problems in the DTM, especially in the roll values and the profiler Z
offsets, as noted in the mail from the GeoTeam office suggesting a roll
calibration test, attached below.
Data Processing
Navigation:
For the reasons mentioned above about DVL data quality, it was not
possible to apply the DVL to the navigation for all blocks, or to use it
to remove the swell artifact from the data.
Every time you apply the Kalman filter you add some rotational error to
the navigation, because there is always an error angle between the
navigation and the Doppler data. Therefore, apply the filter with a
small mean position error value, and over as many files as possible in
one pass, provided there is no time gap between the files. If you apply
it to an individual SBD file, you then need to rotate the navigation
slightly to match the next file that has no time gap.
Apply good DVL data to the blocks, and stop applying it to blocks with
poor-quality DVL according to the EIVA standard in Figure 5, using a
small mean position error value (0.1).
Applying the DVL to every single block separately, even for blocks with
no time gap between them, using the suggested Kalman filter parameters
with a mean position error value of 0.4 (the practice on board up to my
crew change), is not recommended at all for the above reasons.
The navigation needs more cleaning, especially in areas where the ROV
struggled against strong current or where data was recorded during CP
stabs, producing loops in the navigation. These need to be deleted from
the navigation window in NaviEdit; this also deletes the bad scans
attached to those points from the DTM, improving DTM quality.
Figure 6 PL#30 Navigation area to be removed
Blocks Overlap:
There were two ways used on board to cut the overlap area:
1. Cut a part from each block at the middle of the overlap area, keeping
1 m of overlap to avoid gaps in the ROV track. This disregards the
fact that the area is covered by two blocks: we should check which
block contains the better data, keep that one, and cut from the bad
one. It also causes trouble if an online event taken in the overlap
area belongs to the deleted part, since we then cannot find the
corresponding event time in the track inside NaviModel.
2. Keep the navigation lines in the overlap area inside NaviEdit as
they are, reject the DTM points that are not needed, and keep 1 m of
DTM overlap inside NaviModel. This way is better, but when
reprocessing the pipe in the office the data processor should cut
these overlaps before smoothing the navigation for all pipe blocks
together, because the overlap points would otherwise be included in
the standard deviation calculations, shifting the pipe position away
from reality. The overlap-event problem will also show up again in
the final deliverables after cutting the overlap.
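The office-side trim in the second method can be sketched as follows. The KP extents (in km) and the `trim_overlap` helper are invented for illustration; which side actually gets cut should still follow the data-quality check described above, not a fixed mid-point rule.

```python
# Illustrative sketch: trim the overlap between two adjacent blocks at
# its middle, retaining 1 m of total overlap so no gap appears in the
# ROV track. KP values are in km and are made up.

def trim_overlap(block_a, block_b, keep_m=1.0):
    """block_* are (kp_start, kp_end) in km; returns trimmed extents."""
    ov_start, ov_end = block_b[0], block_a[1]
    assert ov_start < ov_end, "blocks do not overlap"
    mid = (ov_start + ov_end) / 2.0
    half_keep = (keep_m / 1000.0) / 2.0   # keep 1 m of overlap in total
    a = (block_a[0], mid + half_keep)
    b = (mid - half_keep, block_b[1])
    return a, b

a, b = trim_overlap((0.000, 1.050), (0.950, 2.000))
```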
Figure 7 Overlap points causing bad SD calculations
Sensors:
For the DVL (Vx/Vy/Vz) and pipe tracker (X/Z) channels, it is
recommended to interpolate the spike points and the points with no data
from the sensor, in the sensor window in NaviEdit, given the currently
poor data quality from these sensors for the reasons mentioned
previously in this report.
Create a region around the points, then right-click on the region and
choose the Interpolate points option from the menu.
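The interpolation amounts to replacing each flagged sample by linear interpolation between its nearest good neighbours, which is roughly what the NaviEdit option does over a selected region. A minimal sketch, with invented depth values and restricted to interior points for brevity:

```python
# Hedged sketch of spike/no-data interpolation on a sensor channel:
# flagged samples are replaced by linear interpolation between the
# nearest good neighbours. Handles interior points only, for brevity.

def interpolate_flagged(values, bad):
    out = list(values)
    good = [i for i in range(len(values)) if i not in bad]
    for i in sorted(bad):
        lo = max(j for j in good if j < i)   # nearest good sample before
        hi = min(j for j in good if j > i)   # nearest good sample after
        t = (i - lo) / (hi - lo)
        out[i] = values[lo] + t * (values[hi] - values[lo])
    return out

depths = [10.0, 10.1, 99.9, 10.3, 10.4]   # 99.9 is an obvious spike
clean = interpolate_flagged(depths, {2})
```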
Digitizing the pipe:
Digitizing the pipe from the DTM when the pipe is buried, using low DTM
opacity and following the Kalman filter, is not the best way to do the
job, especially when 80% of the pipe is buried and the pipe crown
sometimes shows in the DTM for only a couple of metres.
Moreover, that crown is sometimes not the pipe crown at all but a
mattress or a large anode (3 m / 5 m long) on which the ROV takes a CP
stab while the pipe remains buried.
1. Import the pipe tracker into the final NaviModel project.
2. Right-click on the pipe tracker range and choose the Save as
digitized line option from the menu.
Figure 10 Save Kalman filter as digitized line
3. Import the saved digitized line into the project and interpolate it
every 1 m.
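The 1 m interpolation in step 3 amounts to resampling the line at a fixed spacing along its length. NaviModel does this internally; the sketch below only illustrates the idea, in 2D and with invented coordinates.

```python
import math

# Sketch of resampling a digitized line to a fixed 1 m spacing along
# its 2D length. Vertices are (E, N) tuples; illustrative only.

def resample(line, spacing=1.0):
    # cumulative chainage at each vertex
    ch = [0.0]
    for (e0, n0), (e1, n1) in zip(line, line[1:]):
        ch.append(ch[-1] + math.hypot(e1 - e0, n1 - n0))
    total = ch[-1]
    out = []
    d = 0.0
    seg = 0
    while d <= total:
        # advance to the segment containing chainage d
        while ch[seg + 1] < d:
            seg += 1
        t = (d - ch[seg]) / (ch[seg + 1] - ch[seg])
        (e0, n0), (e1, n1) = line[seg], line[seg + 1]
        out.append((e0 + t * (e1 - e0), n0 + t * (n1 - n0)))
        d += spacing
    return out

pts = resample([(0.0, 0.0), (5.0, 0.0)], spacing=1.0)
```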
No one can predict the pipe movement while buried better than the Kalman
filter, but it is recommended to use different pipe tracker settings in
NaviModel according to the pipe diameter and save them in the project
templates, because the horizontal and vertical flexibility of the pipe
tracker filter changes with the pipe diameter.
Figure 11 pipe tracker filter flexibility settings
4. Select the digitized line and, in the properties window, change the
type to Pipe, the depth alignment to Line Z and the burial status to
Auto.
Figure 12 Digitized line properties
5. Generate the pipe preliminarily, to be able to QC the digitized line
vertically in the long profile window in NaviModel.
6. The digitized line is now ready for horizontal and vertical QC in
the areas that need horizontal or vertical intervention from the data
processor while the pipe is buried, by deleting the bad points.
7. Where the pipe is exposed, break the digitized line at the start and
end points of the exposed area by selecting the point to break at and
pressing (SHIFT+DEL).
8. Drop the digitized line in the exposed area onto the DTM to get the
DTM depth.
9. Smooth the line with a reasonable number of iterations so that it is
well represented graphically in the long profile window.
Figure 13 Drape and smooth digitized line
10. Reconnect the digitized line at both ends of the break, to avoid a
spiky pipe in horizontal or vertical position.
11. When the digitized line(s) are finished, join them into one line
named (PL Name_QC).
Figure 14 Join digitized lines into one line
12. Recalculate the pipe and side flags, then fly along the pipe for a
final QC.
The above steps for creating the digitized pipe give the data processor
the ability to make the interventions needed to produce a good product,
even in areas with no reliable pipe tracker data in burial condition, a
capability that is needed here given the bad TSS data, for the
previously mentioned reasons.
Fuzz Factor:
Because of the bad DTM, when we drape the cover flags to the terrain
they sometimes rest on the TOP or sink inside the pipe, adding the fuzz
factor to the TOP depth when it is exported for charting. This creates
fake vertical spikes in the pipe profile produced by AutoChart, spikes
that do not exist in the long profile window in NaviModel. If we use the
digitized line properties in Figure 12 to produce the digitized pipe, as
explained before, this problem disappears completely.
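As a quick QC sketch of the symptom (not a NaviModel or AutoChart function), one could flag exported charting depths that diverge from the long-profile depths; the tolerance and the depth values below are made up for illustration.

```python
# Toy QC check: flag points where the exported charting depth differs
# from the long-profile depth by more than a tolerance, i.e. where a
# fuzz-factor offset has produced a fake vertical spike.

def flag_fake_spikes(profile_depth, export_depth, tol=0.05):
    return [i for i, (p, e) in enumerate(zip(profile_depth, export_depth))
            if abs(p - e) > tol]

profile = [12.30, 12.31, 12.32, 12.33]   # long profile window depths (m)
export  = [12.30, 12.31, 12.71, 12.33]   # exported depths; one fake spike
bad = flag_fake_spikes(profile, export)
```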
Events:
Using the online event KPs as the reference to import MariSoft and CP
events into NaviModel (per the marisoft_02 template) is not recommended
at all, because the KP column is an alternative column for the event
position.
Figure 15 Event KP is an alternative for event position
The online event KP comes from the raw event position, which NaviPac
projects perpendicularly onto the run line to get the equivalent raw KP.
NaviModel, however, uses the event position after moving it to the TOP,
projecting it perpendicularly onto the run line to get the equivalent
processed KP. The TOP position should be the reference position for the
deliverables, so moving the events in NaviModel to be identical to the
raw KPs is completely wrong.
It is recommended to use the raw event position as the reference when
importing events into NaviModel, move the events to the right event time
along the ROV track, and then drape them to the TOP. If the event time
is close to the event time in the video, and the KPs are within a
reasonable difference of the raw KPs (2-3 m), the events are fit to
deliver to the client.
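Both KPs can be illustrated with a perpendicular projection onto a straight run line. The coordinates and offsets below are invented, and real run lines are curved and handled by NaviPac/NaviModel; the sketch only shows why the raw and processed KPs legitimately differ by a small amount.

```python
import math

# Sketch of KP derivation: project a position (raw from NaviPac, or
# moved to the TOP in NaviModel) perpendicularly onto the run line and
# report the chainage of the foot point in km. Straight line, (E, N)
# tuples, illustrative values only.

def kp_on_runline(p, a, b, kp_a=0.0):
    """Project p onto the run line a-b; return KP in km measured from a."""
    ax, ay = a; bx, by = b; px, py = p
    vx, vy = bx - ax, by - ay
    t = ((px - ax) * vx + (py - ay) * vy) / (vx * vx + vy * vy)
    return kp_a + t * math.hypot(vx, vy) / 1000.0

runline = ((0.0, 0.0), (1000.0, 0.0))
raw_kp = kp_on_runline((250.0, 3.0), *runline)    # raw NaviPac position
top_kp = kp_on_runline((252.0, 0.4), *runline)    # position moved to the TOP
```

Here the two KPs differ by 2 m, within the 2-3 m tolerance mentioned above.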
I was surprised that they tried to solve the event problem by deleting
the user offset created in NaviEdit for the event position (used to
export the ROV track) and changing the online settings to push NaviPac
to send the event position to NaviScan instead of the CRP position. This
shifts all sensor positions away from reality in the Y direction.
(Data processing is the art of how to make the data match reality.)
The data processor should use the event user offset created in NE as the
reference to export the ROV track (*.etr) file for import into the final
NM project. In particular, when the data for the same pipeline is
gathered using the two ROVs (Cougar / Panther), the ROV track (*.etr)
should be exported separately for the blocks gathered by each ROV.
Figure 16 Event position user offset in NE
Events in overlap area:
With the two ways used on board to handle the block overlaps, described
in the Blocks Overlap section, NaviModel most of the time picks up the
time stamp from the wrong ROV track in the overlap area, especially when
the DNU block tracks are imported into the project so that every event
has a time stamp when the events are imported.
Figure 17 Event taken in blocks overlap area
I changed the eventing criteria, asking the online eventor to take the
events in the overlap again in the rerun block whenever a block is rerun
with an overlap area for any reason, with both events included in the
MariSoft event list.
We then have two events for the same object, one from each of the
overlapping blocks, and it is up to the data processor to check which
block has the better sensor data, keep that one and delete the bad one
(keeping 1 m of overlap to avoid gaps), keep the event taken in the
retained block in the final event list, and delete the event belonging
to the deleted block.
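The office-side deduplication step can be sketched as follows. The event fields, object IDs and block names are hypothetical; the real event lists live in MariSoft, and the choice of retained block comes from the sensor-data check described above.

```python
# Sketch of overlap-event deduplication: each object in an overlap now
# has two events, one per block; keep the one from the block whose
# sensor data was retained and drop its twin. Fields are illustrative.

def dedupe_overlap_events(events, kept_block):
    """events: dicts with 'object_id' and 'block'; keep one per object,
    preferring the event logged in the retained block."""
    by_obj = {}
    for ev in events:
        cur = by_obj.get(ev["object_id"])
        if cur is None or (ev["block"] == kept_block
                           and cur["block"] != kept_block):
            by_obj[ev["object_id"]] = ev
    return list(by_obj.values())

events = [
    {"object_id": "ANO-014", "block": "B12"},
    {"object_id": "ANO-014", "block": "B13"},   # rerun block, better data
    {"object_id": "FJ-002",  "block": "B13"},
]
final = dedupe_overlap_events(events, kept_block="B13")
```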
An event template has been created and saved under the name (MM) to
import events into NaviModel using the event position as the reference
instead of the KP, with all the static event data, so that it can be
compared with the dynamic event data before exporting. It is designed to
match the last MariSoft event list template approved by the RC.
Offline processing log:
I created this sheet before the operation started. It contains all the
processing steps, in sequence, through all the EIVA processing modules,
to control the data flow through the processing operation and to
document every edit made to the data during processing. It improves
cooperation between the data processors, especially at the daily
handover, and I think it is worth keeping up to date.
An additional column named (Vehicle) was added to the sheet this trip,
to trace which ROV gathered each data block and so determine which event
user offset to use as the reference when exporting the ROV track (*.etr)
for the final NaviModel project.
Recommendations
Conduct a pre-shift meeting every day to avoid miscommunication within
the team and improve cooperation between the different departments.
Conduct the TSS verification test to avoid many future problems in the
final deliverables, as explained earlier in this report.
Include the TSS verification in the GeoTeam operation procedures for
future projects.
Replace the profilers with a dual-head multibeam.
Conduct a full alignment calibration for the profilers whenever the
profiler frame is reinstalled on the vehicle for any reason.
Conduct a roll calibration for a profiler when just one head is moved
from the profiler frame.
Measure the profiler offsets and import them into NaviScan after any
change.
The ROV should fly gently on TOP while gathering data in curved
sections, especially at the pipeline ends close to the platform, to give
the profilers the ability to cover the pipe curvature without gaps.
Pay more attention to keeping the project documentation, such as the
mobilization report, up to date.
Use a different project settings template for each pipe diameter,
reflecting the different flexibility of the pipe and of the pipe tracker
filter for that diameter.
Do not import the DNU block track (DNU.etr) into the final NM project;
with the new eventing criteria, this avoids picking up the wrong event
time stamp, as explained previously.
Pay more attention to keeping the offline processing log up to date with
all processing steps.