Get efficient pipeline integrity data for Support with airSlate SignNow
See airSlate SignNow eSignatures in action
Our user reviews speak for themselves
Why choose airSlate SignNow
- Free 7-day trial. Choose the plan you need and try it risk-free.
- Honest pricing for full-featured plans. airSlate SignNow offers subscription plans with no overages or hidden fees at renewal.
- Enterprise-grade security. airSlate SignNow helps you comply with global security standards.
Pipeline integrity data for Support
Pipeline integrity data for Support How-To Guide
Experience the benefits of airSlate SignNow and streamline your document signing process today. With airSlate SignNow, you can increase efficiency, reduce costs, and improve overall workflow. Sign up for a free trial now and see the difference for yourself.
Try airSlate SignNow and take your document management to the next level!
airSlate SignNow features that users love
Get legally-binding signatures now!
FAQs: online signature
- How reliable are pipelines?
Yes, pipelines are safe. They are the safest and most environmentally responsible method of transporting large volumes of petroleum products and natural gas over long distances.
- What does a pipeline integrity engineer do?
Typical duties include evaluating pipeline casings and road and water crossings; developing and optimizing inspection plans; identifying preventative and mitigative measures, re-assessment intervals and re-assessment methods; and monitoring and surveillance of integrity parameters to ensure reliable operations.
- What is the integrity of the pipelines?
Pipeline integrity (PI) is the degree to which pipelines and related components are free from defect or damage.
- What is the pipeline integrity management process?
Pipeline integrity management (PIM) programs are systems managed by pipeline owner-operators that consider all stages of the pipeline life cycle, from conception through engineering and design, construction, operation and inspection, and finally to repair or replacement when necessary.
- What is an integrity test for pipes?
Pipeline integrity testing refers to various processes, such as hydrostatic testing, used to test the structural integrity of a pipe. Hydrostatic testing is applied to certain pressure vessels, such as plumbing systems or pipelines, and aims to examine the strength of the vessel, in this context a pipeline.
- What is structural integrity of pipe?
A pipeline has structural integrity when it is designed so that it can safely withstand all external and internal loads, including its own weight and all dynamic forces. In addition, the quality of all parts must be inspected at the time of installation.
- How to ensure data integrity in a data pipeline?
To enhance data quality in your data pipeline: understand data quality in the context of the pipeline; assess your current data quality; implement data-cleansing techniques; apply data validation and verification strategies; run regular data quality audits; and use automation for continuous data quality improvement. A minimal validation sketch follows this list.
- How do Transnet Pipelines monitor the integrity of its pipeline network?
Transnet Pipelines continually monitors the integrity of its pipeline network. Internal inspection tools, known as intelligent pigs, are valuable devices for this work. They use the magnetic stray flux principle to determine and record possible areas of metal loss from corrosion or any other cause.
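To make the validation and audit steps above concrete, here is a minimal sketch in Python. The record fields (weld_id, pressure_kpa, wall_mm) and the plausibility ranges are illustrative assumptions, not a prescribed schema.

```python
# Minimal sketch of rule-based validation in a data pipeline.
# Field names and plausibility ranges are illustrative only.

def validate_record(rec: dict) -> list[str]:
    """Return a list of data-quality issues found in one record."""
    issues = []
    # Completeness: every expected field must be present and non-null.
    for field in ("weld_id", "pressure_kpa", "wall_mm"):
        if rec.get(field) is None:
            issues.append(f"missing {field}")
    # Validity: values must fall inside physically plausible ranges.
    if rec.get("wall_mm") is not None and not (1.0 <= rec["wall_mm"] <= 50.0):
        issues.append("wall_mm out of range")
    if rec.get("pressure_kpa") is not None and rec["pressure_kpa"] < 0:
        issues.append("negative pressure_kpa")
    return issues

def audit(records: list[dict]) -> dict[str, int]:
    """Aggregate issue counts across a batch: a simple quality audit."""
    counts: dict[str, int] = {}
    for rec in records:
        for issue in validate_record(rec):
            counts[issue] = counts.get(issue, 0) + 1
    return counts

if __name__ == "__main__":
    batch = [
        {"weld_id": "GW-0001", "pressure_kpa": 6500.0, "wall_mm": 9.5},
        {"weld_id": None, "pressure_kpa": -10.0, "wall_mm": 9.7},
    ]
    print(audit(batch))  # {'missing weld_id': 1, 'negative pressure_kpa': 1}
```

The same pattern extends naturally to cleansing (fix or quarantine failing records) and to scheduled audits over each batch the pipeline ingests.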
Trusted e-signature solution — what our customers are saying
Video transcript: a consultant's perspective on pipeline integrity and digitalization
Hello everybody. What I'd like to talk about today is a consultant's perspective on recent innovations in pipeline integrity and the future of pipeline integrity, particularly with reference to digitalization. I'd like to add that this is our perspective: operators would have a different perspective, and technology providers a different perspective again. I've been in the business for about 25 years, so this talk is the combination of that knowledge.

First, to make sure everybody has the same understanding, I'll cover pipeline integrity and how it forms part of enterprise risk management; what we're calling the industry quadrilemma; the typical scenario for pipeline integrity management as it stands today; the developing background in technology and expectations, known as the fourth Industrial Revolution or Industry 4.0, or what we'd like to call Pipeline 4.0; a little about new data sources; some examples of our current developments and research; and where we think the challenges and roadblocks are to what we, and the rest of the industry, are trying to do.

Firstly, most operators, if not all, have some kind of enterprise risk management; there is a standard for it. This includes the broad existential risks, for example commodity risk such as oil price fluctuations, and risks such as pandemics or strikes, but part of it is the risk associated with the asset, in this case the pipeline. Pipeline integrity management is really managing the risk that the asset poses to the environment, people, reputation and the business. You'll find the same idea in FIMS (facilities integrity management systems), SIMS (structural integrity management systems) and even WIMS (well integrity management systems): the concept is the same, risk management through the life of the asset as part of the enterprise risk management run by most operators.

One important observation we've made over the last few years is what we're calling the quadrilemma: four challenges facing the industry. The first, obviously core to integrity management, is aging assets. You may be familiar with the bathtub curve: early in an asset's life you get infant-mortality failures, then a long period with a reduced number of failures, and at the end the wear-out phase. Pipelines tend to exhibit this behavior. In the UK alone there are pipelines built in the 40s, 50s and certainly the 60s and 70s, and in the US we may even see 100-year-old pipelines before long. So there are lots of pipelines with developing issues, and one aspect of this, of course, is that pipelines built in the 40s, 50s and 60s weren't built to the standards we apply today.
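As a minimal illustration of the bathtub shape described above, the sketch below sums three Weibull hazard terms: a decreasing infant-mortality term, a near-constant useful-life term and an increasing wear-out term. All parameters are invented for illustration and are not fitted to any real pipeline failure data.

```python
import numpy as np

# Illustrative bathtub failure-rate curve built from three Weibull hazards.
# Parameters are made-up assumptions, not fitted to real pipeline data.

def weibull_hazard(t, shape, scale):
    """Weibull hazard rate h(t) = (k/lam) * (t/lam)**(k - 1)."""
    return (shape / scale) * (t / scale) ** (shape - 1.0)

t = np.linspace(0.5, 80.0, 160)        # asset age in years
h = (weibull_hazard(t, 0.5, 5.0)       # infant-mortality failures
     + weibull_hazard(t, 1.0, 40.0)    # constant background rate
     + weibull_hazard(t, 6.0, 70.0))   # wear-out phase

for age in (1.0, 20.0, 75.0):
    i = int(np.argmin(np.abs(t - age)))
    print(f"age {age:4.0f} yr -> failure rate {h[i]:.3f} per year")
```

Printed rates are high at year 1, low at year 20 and rising again by year 75, which is the bathtub shape in miniature.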
The second challenge is scarcity of resources, and I'm not talking about steel or oil: this is really people. The ups and downs of the oil price, retirements, movement of staff and, I think fundamentally, the industry's limited ability to attract young engineers have unfortunately led to a scarcity of resources. We run a global business and you see this pretty much everywhere: there is a shortage of people. Try recruiting an experienced integrity engineer; they take a long time to grow, typically needing a science-based degree and then probably ten years of experience.

The third challenge is increasing legislation. Governments are aware of the risk that pipeline assets pose to their taxable revenue and also to the environment and to people. In the UK, for example, there is new legislation on maximizing the economic recovery of oil, which gives the Oil and Gas Authority powers to remove licenses and to issue penalties to operators that can't demonstrate they are maximizing the recovery of oil resources in the UK. That's clearly a good thing, but it's another burden for operators to worry about.

The fourth challenge is the new financial outlook. Significant fluctuations in oil and commodity prices, or even negative oil prices, lead to uncertainty and to less money being available, and if you charge a tariff for transportation, to less transportation revenue as well. These four things work together to make life tough for operators, and obviously we support that process as consultants.

Now to the broad topic of pipeline integrity management. There are lots of advisory codes and standards, all written in the last fifteen or twenty years: some American, some European, some Norwegian. We came out with our own, which we call the seventeen elements: a broad, holistic way of looking at pipeline integrity management, because, as you're probably aware, to manage integrity properly you need to cover a wide variety of aspects of an organization, including quality assurance, health, safety and environment, human resources, legal, projects, maintenance and procurement. Done properly, it is a holistic process.

Another way of looking at it, again back to risk management, is managing the risk throughout the life of the asset. During the design phase this is projected risk: the asset doesn't yet pose a risk to the environment or to people, so you project it forward. During design you optimize the design and introduce control measures, which could be based on routing for a high-pressure gas line, on wall thickness or on class location, and eventually you typically need to demonstrate that the risk is as low as reasonably practicable (ALARP) and that no further measures are required on an ALARP basis.
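As a minimal illustration of a design-phase control measure, the sketch below sizes wall thickness from the hoop-stress (Barlow) formula, with a design factor that varies by location class. The factors and inputs are illustrative assumptions, not values from any specific design code edition.

```python
# Sketch: design-phase wall-thickness check from the hoop-stress (Barlow)
# formula t = P*D / (2*S*F), with the design factor F varying by location
# class as in gas pipeline design practice. All values are illustrative.

DESIGN_FACTORS = {1: 0.72, 2: 0.60, 3: 0.50, 4: 0.40}  # location class -> F

def required_wall(p_mpa: float, d_mm: float, smys_mpa: float, loc_class: int) -> float:
    """Minimum wall thickness (mm) so hoop stress stays at the design factor."""
    f = DESIGN_FACTORS[loc_class]
    return p_mpa * d_mm / (2.0 * smys_mpa * f)

# Example: 7 MPa design pressure, 610 mm pipe, 360 MPa SMYS.
for lc in sorted(DESIGN_FACTORS):
    t = required_wall(7.0, 610.0, 360.0, lc)
    print(f"location class {lc}: minimum wall {t:.1f} mm")
```

The point of the example is simply that a more populated class location drives a lower design factor and therefore a thicker wall: one of the control measures mentioned above.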
Then you move into the construction phase. The risk produced during construction will almost certainly be different from that predicted during design, but here we've assumed it to be more or less the same. Finally you go into operation, and this is where the bathtub curve comes into play. As you get towards the end of the asset's life you reach the wear-out phase, and you start to get issues such as coating degradation, fatigue damage, external corrosion, internal corrosion and general system decay. Then, based on decisions or inspection findings, you might choose to act: repair the pipeline, introduce inhibition or reduce the operating pressure; there are a number of different scenarios. You go on managing the risk through the pipeline's life until eventually the risk becomes unacceptable, or the asset stops providing a positive net present value, in other words it no longer satisfies a business function, and then you can decommission it. The risk then reduces to almost nothing, but not absolutely nothing: depending on where the pipeline is you will still have some ongoing obligations. The concept of abandonment is a complicated one, because an abandoned pipeline that is no longer operated can still pose a risk to the environment or to people.

In our experience of looking at the way operators manage this process, it is very patchy. We've not really come across many operators that have a full enterprise solution. Most operators, large and small, have a proliferation of spreadsheets and multiple legacy ERP systems, particularly bearing in mind the acquisitions and mergers that have gone on in the past: you'll find all sorts of enterprise systems still in use. They have multiple, highly disparate data sources: some on paper, some in spreadsheets, some in databases, some in documents, and unfortunately we've often come across situations where data has simply been lost. There are often single-user, thick-client applications doing one very specific thing with no real connectivity to other applications. With limited resources it is often difficult for operators, in our experience, to achieve even the minimum, and when you start to think about the requirements for producing KPIs, where you need to collect information from the whole organization, it becomes very unwieldy. In a lot of cases there is a real loss of the ability to generate good insights: a lot of effort goes into turning the handle and not much into getting the big picture.

Now, I used to be very skeptical about enterprise solutions to this. In the nineties there were a lot of attempts by larger oil operators to integrate all of this together, and in our experience not many of those were very successful; it is a difficult thing to do given the situation that currently exists.
However, things are changing quite dramatically, certainly over the last ten years, the last five years, even the last couple of years. Against a background of evolving technology, with artificial intelligence on the rise, there have been considerable advances in computing power: roughly a 10,000-fold increase in processing power since 2000 and a 3,000-fold reduction in storage cost since 2000, and, typically across most industries, exponential growth in data. As the chart on the right shows, growth in CPU computing power has actually slowed since about 2010, but particularly in the context of artificial intelligence, GPU computing (graphics processing unit computing) has taken over for the very computationally expensive work. A typical top-of-the-range Intel i9 has 10 cores and 20 threads; GPU cores have a slower clock speed and a smaller instruction set, but there are thousands of them. The NVIDIA GeForce RTX 2080 Ti has 4,352 CUDA cores, and 5,000-plus cores are available in virtual GPUs as well. So for computationally expensive calculations the GPU has really taken over from the CPU, and it is also available virtually (there is a minimal sketch of the CPU-to-GPU switch at the end of this section).

The other very important development is cloud-based computing generally. A whole wealth of web services has been developed covering all sorts of technologies, for example optical character recognition, taking payments, gathering data, performing analysis and storing data, and much of it is free or very low cost. The same goes for open-source applications: MongoDB, a NoSQL database; PostgreSQL; Kubernetes, which lets you run a variety of different languages in an online cloud environment and do calculations server-side; Jupyter notebooks for sharing computer code; Angular for graphical user interfaces; Elasticsearch for very powerful searching. These have thousands of users and thousands of developers and are constantly being improved. So there is incredible technology being developed and generally available, and expectations have risen with it: most of us at home are used to getting everything a few clicks away, and using Amazon and Google we can get pretty much anything we like. There is no real reason why some of this technology couldn't be applied to the pipeline integrity industry, and I don't think there is any way back from this.

You may or may not have heard the term the fourth Industrial Revolution. It refers to the progression from steam power (Industry 1.0) to electric power (Industry 2.0) to automation (Industry 3.0, starting maybe 20 years ago) and now to hyper-connectivity: the Industrial Internet of Things (IIoT), artificial intelligence, machine learning and big data. This is computers making good decisions and developing insights effectively without interaction from humans.
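To make the CPU-versus-GPU point concrete, here is a minimal sketch of how vectorized NumPy code can move to a GPU through CuPy's largely drop-in API. The optional cupy package and a CUDA GPU are assumptions; without them the same code simply runs on the CPU via NumPy.

```python
import numpy as np

# Minimal sketch of the CPU-to-GPU switch: CuPy mirrors much of the NumPy
# API, so vectorized code can often move to a GPU by swapping the array
# module. Assumes the optional cupy package and a CUDA GPU; otherwise the
# fallback below keeps everything on the CPU.
try:
    import cupy as xp      # GPU arrays
    backend = "GPU (CuPy)"
except ImportError:
    xp = np                # CPU fallback
    backend = "CPU (NumPy)"

n = 10_000_000
x = xp.random.rand(n)          # large array of random samples
y = xp.sqrt(x) * xp.sin(x)     # elementwise math parallelizes well
print(f"{backend}: sum = {float(y.sum()):.1f}")
```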
There is also a proliferation of low-cost sensors aimed at taking advantage of this technology, and these can give live, or quasi-live, real-time data. Sources include pipeline strain gauges, which have been around for some time but not in such an affordable form or with such ease of communication; pressure monitoring; temperature monitoring; vibration monitoring; process monitoring; cathodic protection monitoring; satellite and lidar data; field verifications; weather forecasting, again available free from web services; seismic monitoring, also from web services; and one-call data, where people phone a single number to report activities close to a pipeline.

There are new data sources as well. We are getting increasingly good satellite data from lower-orbit satellites, and there are new techniques using synthetic aperture radar that can measure ground displacement to millimeter precision. These are used extensively in cities with poor ground conditions to monitor the displacement of buildings for risk assessment, although I'm not sure they have been applied to pipelines yet. Lidar certainly has been applied to pipelines, but it is harder to collect, normally requiring a plane or an autonomous vehicle; potentially, though, this could be done over the net, effectively buying the data from a third party.

So how does Industry 4.0 benefit the pipeline business? In some places we are already there. You are probably aware that there are fiber-optic systems that can give very good determinations of strain and temperature. They can be used for leak detection and for right-of-way monitoring: they can detect digging or vibration, which could be used to prevent theft, and ground movements. Not many of these are integrated, but depending on the threat they can be included in live fitness-for-purpose assessments. For example, some of our clients have pipelines where the main threat is land movement; with ground-movement sensors or fiber-optic cables you can get a real-time measurement of strain and relate it to a failure criterion to give a live fitness-for-purpose result. Incorporating all of this into live tools, you can have live validation of inspection intervals, predictive analytics for live risk management, pre-emptive maintenance planning and dynamic life-cycle costing. Dynamic life-cycle costing is already used in other industries, for jet engines and mining equipment: a forward projection of your life-cycle cost helps you make decisions about maintenance, obsolescence and repairs, and there is no reason why we couldn't apply the same technology to pipelines.
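To illustrate the live strain-based fitness-for-purpose idea above, here is a minimal sketch that relates a measured strain to an allowable limit. The 0.4% limit, the 80% alarm threshold, the section IDs and the readings are all illustrative assumptions; a real criterion would come from the applicable code or a finite-element assessment of the actual pipeline.

```python
# Sketch: relate a live strain reading from distributed fiber-optic sensing
# to an allowable strain limit and flag sections for review. The limit,
# alarm threshold, section IDs and readings are illustrative assumptions.

STRAIN_LIMIT = 0.004     # illustrative allowable longitudinal strain (0.4%)
ALARM_FRACTION = 0.8     # flag when 80% of the limit is consumed

def check_section(section_id: str, strain: float) -> str:
    """Return a status line comparing measured strain to the limit."""
    utilization = abs(strain) / STRAIN_LIMIT
    status = "REVIEW" if utilization >= ALARM_FRACTION else "OK"
    return f"{section_id}: strain {strain:.4f}, utilization {utilization:.0%} -> {status}"

# Simulated live readings keyed by hypothetical kilometer-point sections.
for sid, eps in [("KP12+300", 0.0011), ("KP47+900", 0.0035)]:
    print(check_section(sid, eps))
```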
Now I'll talk a little about some of the experiments we've been doing with this methodology. For integrity assessments, matching successive sets of in-line inspection data is a critical process: you need it to predict corrosion rates, or rates of degradation, and it all revolves around the girth weld number, because to make a judgment about a pipeline you really have to relate features to girth welds; KPs, or kilometer points, tend to drift along the length of the pipeline. We've tried a number of different techniques, fast Fourier transforms and annealing and relaxation methods, and they both work very well and very quickly, taking fractions of a second (a minimal cross-correlation sketch follows below). Compared with the manual methodology we've used in the past, they spot things you wouldn't normally spot, so they are much better, and the benefit is much better predictions.

We've also been experimenting with variable corrosion rates. For a long pipeline, remaining life, typically for internal corrosion, is based on a single corrosion rate, but in practice there can be a lot going on along a long line: drains, cathodic protection systems, third-party interactions damaging coatings, changes in corrosion mechanism, natural causes, lightning strikes and so on. So we've been using this technology to develop corrosion rates along long pipelines, which provides a better diagnosis and a better understanding of the mechanisms, allowing better failure predictions and, most importantly in the end, better decisions.
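As a hedged sketch of the matching idea, the code below estimates the constant odometer offset between two inspection runs by FFT cross-correlation of binned girth-weld indicator signals. The weld positions are synthetic, and a real run-to-run match also needs local stretch correction, which the annealing and relaxation methods mentioned above address and this constant-offset sketch does not attempt.

```python
import numpy as np

# Sketch: estimate the constant odometer offset between two in-line
# inspection runs by FFT cross-correlation of binned girth-weld indicator
# signals. Weld positions below are synthetic.

rng = np.random.default_rng(1)
welds_a = np.cumsum(rng.uniform(10.0, 14.0, 300))     # run-1 weld odometers, m
welds_b = welds_a + 7.3 + rng.normal(0.0, 0.05, 300)  # run 2: offset + noise

bin_m = 0.1                                           # 10 cm bins
n = int(welds_b.max() / bin_m) + 1
sig_a = np.zeros(n)
sig_b = np.zeros(n)
sig_a[(welds_a / bin_m).astype(int)] = 1.0            # weld indicator spikes
sig_b[(welds_b / bin_m).astype(int)] = 1.0

# Cross-correlation via FFT; the peak lag is the best constant offset.
spec = np.fft.rfft(sig_a, 2 * n) * np.conj(np.fft.rfft(sig_b, 2 * n))
xcorr = np.fft.irfft(spec)
lag = int(np.argmax(xcorr))
if lag > n:                                           # unwrap negative lags
    lag -= 2 * n
print(f"estimated offset: {-lag * bin_m:.2f} m (true offset 7.30 m)")
```

Once runs are aligned, per-segment depth differences divided by the inspection interval give local corrosion rates, which is one simple route to the variable-rate profiles described above.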
The other area of exploration for us has been massively parallel computing using virtual or physical GPUs. One example is computing a failure surface for a modified B31G assessment. Conventionally you would do this with the first-order or second-order reliability method (FORM or SORM), but these have some computational disadvantages. A very robust way of dealing with the problem is Monte Carlo simulation, but this is very computationally expensive: if you are trying to produce a complete failure surface, runs can go into years. Using GPU parallelization we managed to make this much, much faster: runs that could potentially take days, months or even years we've got down to seconds. We've also been optimizing this using subset simulation, which focuses the sampling on the regions of failure, and we've been using artificial intelligence, or machine learning, to fill in the gaps: we populate the failure surface with a number of points and then interpolate between them. The benefit is that we can shift to probabilistic remaining-life calculations and probabilistic assessments. Rather than the deterministic statement that your pipeline will fail after 20 years, you can say that your pipeline has this probability of failing by 20 years and that probability by 30 years, which allows you to make more and better-informed decisions.

We're applying the same technology to very complex corrosion features as well. One example is a 12-meter-long corrosion feature we experimented with. Using the RSTRENG methodology with a 1.5-millimeter spacing, which assesses combinations of the various parts of the defect, gives about 32 million combinations for this feature. A probabilistic calculation on 32 million combinations would take an almost infinite time; it sounds ridiculous, but we worked it out at about a hundred thousand years using our previous methodology. By combining all of these enhancement techniques we got it down to tens of minutes.
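As a hedged illustration of the Monte Carlo approach described above, the sketch below estimates the probability of failure of a single corrosion feature under the Modified B31G failure-pressure model, vectorized in NumPy. All distributions and parameter values are illustrative assumptions; the talk's GPU parallelization, subset simulation and machine-learning gap-filling are not reproduced here, although swapping NumPy for CuPy would be one route to the GPU.

```python
import numpy as np

# Sketch: vectorized Monte Carlo estimate of the probability of failure of a
# single corrosion feature under the Modified B31G failure-pressure model.
# All distributions and parameter values are illustrative assumptions.
rng = np.random.default_rng(42)
N = 1_000_000                       # samples; loop-free NumPy code like this
                                    # is also what maps well onto a GPU
D = 610.0                           # outside diameter, mm
SMYS = 360.0                        # specified minimum yield strength, MPa
flow = SMYS + 69.0                  # Modified B31G flow stress, MPa
P_op = 7.0                          # operating pressure, MPa

t = rng.normal(9.5, 0.3, N)                        # wall thickness, mm
d = np.clip(rng.normal(5.5, 1.0, N), 0.0, None)    # defect depth, mm
L = rng.normal(150.0, 15.0, N)                     # defect length, mm

z = L**2 / (D * t)
M = np.where(z <= 50.0,                            # Folias bulging factor
             np.sqrt(1.0 + 0.6275 * z - 0.003375 * z**2),
             0.032 * z + 3.3)
S_f = flow * (1.0 - 0.85 * d / t) / (1.0 - 0.85 * (d / t) / M)
P_f = 2.0 * S_f * t / D                            # failure pressure, MPa

pof = float(np.mean(P_f < P_op))                   # fraction of failing samples
print(f"estimated probability of failure at {P_op} MPa: {pof:.2e}")
```

For the very small failure probabilities typical of real features, plain Monte Carlo needs enormous sample counts, which is exactly why the talk reaches for GPU parallelization and subset simulation.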
Another thing we've invested a lot of time and effort in is automating assessments; that is a core business tool for us. You can take multiple sets of in-line inspection data, together with construction data and historic operational data, and produce an assessment literally in 30 seconds: you ingest the data and go right the way through. The benefits are obvious: it is highly accessible, it is very fast, and you can apply insights on top.

We are also contemplating and working on dynamic life-cycle costing. Currently this is a long, drawn-out process: collecting pressure data, which could be thousands of points; calculating equivalent life cycles; fatigue analysis; manually aligning ILI data; estimating the cost of anomalies; estimating whole-life costs; and calculating NPVs for the economics. It can take months, but we are pretty confident we can automate the process.

Our vision, getting back to the enterprise solution for pipelines, is to develop a native cloud-based solution, not virtual machines or remote desktops, and to integrate with the existing ERP solutions that are surely here to stay with most operators, including the likes of Maximo, ESRI and SAP; to automate data ingestion and alignment using smart recognition, so that pretty much any data format can be taken in; and to include document management, live data feeds, flexible user-defined holistic KPIs, role-based portals for collecting KPI data, learning solutions so you can train people up, document libraries and benchmarking. We've completed some of this, but that is our vision for a cloud-based enterprise solution.

In terms of challenges, there is a key question: this type of technology can be very accessible, but do operators really want to take responsibility for it and do it themselves? That is not clear, though they certainly will for some aspects, and it is certainly an efficiency for us as a consultant. There is also the investment in legacy systems: a lot of people, including us, have investments in legacy systems that don't conform well to this type of thinking. There is sensorization (I think I've invented a new word there), which is obviously expensive: would you apply it to all pipelines? And there is the lack of interoperability between IT systems. I'm actually running a bit short on time, so I'm going to speed up a little.

What we have found is changing skill requirements. We are automating our assessments and some of our work, so people have time to spend more on insights and on taking a high-level view. Data science and programming are becoming more important skills for us than simply being capable of doing an assessment, which is changing the dynamics of our organization, and it is affecting our ability to hire people as well: we are offering a different kind of role.

Security is a concern for a lot of operators. In the Middle East there is quite a high resistance to cloud-based solutions; some operators have ruled them out completely, with general concerns over unauthorized access. But we have our ISO 27001 certification, which provides a very important checklist; you can use multi-factor authentication, you can hire ethical hackers, and we have applied this methodology to our own environment. I haven't got time to go through this in detail, but effectively what we are using is a combination of open-source technologies providing a whole range of functionality, based on file-based data, using APIs to gather data from enterprise systems or real-time devices, and procedurized processes to produce coordinated output across a whole range of different things.

In summary, digital transformation using cloud computing, open source, web services and Industry 4.0 concepts holds a lot in store for the industry. Once implemented, we think it will give easy access to resources, better insights and better decisions. There are roadblocks, but we can see that some of them will be overcome. The industry could benefit from closer cooperation in some areas, particularly data transfer protocols, and I think acceptance of cloud computing will come eventually. The other thing is hiring engineers who understand data science and can code: in our experience, trying to explain pipeline integrity concepts to computer programmers can become a bit of a challenge, one we face on a day-to-day basis, and no disrespect to computer programmers.

Finally, I think that is it from me, and I believe Annie is going to facilitate a question-and-answer session for us.