Empower Your Security with Pipeline Integrity Data

airSlate SignNow offers secure and scalable solutions that help businesses protect pipeline integrity data. Experience great ROI and superior 24/7 support.

airSlate SignNow regularly wins awards for ease of use and setup

See airSlate SignNow eSignatures in action

Create secure and intuitive e-signature workflows on any device, track the status of documents right in your account, build online fillable forms – all within a single solution.

Collect signatures 24× faster
Reduce costs by $30 per document
Save up to 40h per employee per month

Our user reviews speak for themselves

Kodi-Marie Evans
Director of NetSuite Operations at Xerox
airSlate SignNow provides us with the flexibility needed to get the right signatures on the right documents, in the right formats, based on our integration with NetSuite.
Samantha Jo
Enterprise Client Partner at Yelp
airSlate SignNow has made life easier for me. It has been huge to have the ability to sign contracts on-the-go! It is now less stressful to get things done efficiently and promptly.
Megan Bond
Digital marketing management at Electrolux
This software has added to our business value. I have gotten rid of repetitive tasks and can create mobile-native web forms. Now I can easily make payment contracts through a fair channel, and managing them is very easy.
Walmart
ExxonMobil
Apple
Comcast
Facebook
FedEx

Why choose airSlate SignNow

  • Free 7-day trial. Choose the plan you need and try it risk-free.
  • Honest pricing for full-featured plans. airSlate SignNow offers subscription plans with no overages or hidden fees at renewal.
  • Enterprise-grade security. airSlate SignNow helps you comply with global security standards.

Pipeline integrity data for security

When it comes to ensuring the security of your pipeline integrity data, airSlate SignNow is the ideal solution for businesses. With its user-friendly interface and cost-effective features, airSlate SignNow empowers organizations to securely send and eSign documents with ease.


With airSlate SignNow, businesses can streamline their document signing processes while ensuring the security of their pipeline integrity data. Start using airSlate SignNow today to experience a more efficient way of handling document workflows.

Sign up for a free trial of airSlate SignNow and take the first step towards securely managing your pipeline integrity data.

airSlate SignNow features that users love

Speed up your paper-based processes with an easy-to-use eSignature solution.

Edit PDFs online
Generate templates of your most used documents for signing and completion.
Create a signing link
Share a document via a link without the need to add recipient emails.
Assign roles to signers
Organize complex signing workflows by adding multiple signers and assigning roles.
Create a document template
Create teams to collaborate on documents and templates in real time.
Add Signature fields
Get accurate signatures exactly where you need them using signature fields.
Archive documents in bulk
Save time by archiving multiple documents at once.
Be ready to get more

Get legally-binding signatures now!

Online signature FAQs

Here is a list of the most common customer questions. If you can’t find an answer to your question, please don’t hesitate to reach out to us.

Need help? Contact support

Trusted e-signature solution — what our customers are saying

Explore how the airSlate SignNow e-signature platform helps businesses succeed. Hear from real users and what they like most about electronic signing.

Life made easy
5/5
User in Real Estate

What do you like best?

Saves me time and is easy for my client to sign.

Read full review
Super User Friendly
5/5
User in Hospitality

What do you like best?

Really easy and convenient for securing contracts and documents

Read full review
Easy, efficient and effective
5/5
User in Medical Devices

What do you like best?

Easy and fast way to get documents signed.

Read full review

Webinar transcript: Digitalization of pipeline integrity management

So that's what I'd like to cover today. This may be a recap for some, but first a little about our perspective on pipeline integrity and what the objectives are; that's really background. Then the background to digitalization: something we're calling the industry quadrilemma; some technology opportunities that have come about through advances; what we feel are typical operator scenarios; the multiple new and shared data sources that are now available; and the fourth industrial revolution and how it might apply to pipeline integrity. Then our vision for our cloud-based service, where Penspen are with this, and what we see as challenges and potential roadblocks to this type of thinking and technology.

So, pipeline integrity management. There's a broader concept that most operators will be familiar with, which is enterprise risk management. This covers all kinds of risks; it could be riots, pandemics, currency fluctuations or the oil price. Part of that risk is the risk the asset poses to the environment, people, reputation and the business, and that can be wrapped up under one umbrella called asset integrity management. We're really focusing on one part of it, pipeline integrity management, but there are structural integrity management programs and well integrity management systems as well. So that's our view of pipeline integrity management: it's part of enterprise risk management. Next slide, please.

Over the years, and some of you are familiar with these, a lot of codes and standards have been generated that provide guidance on this. Penspen have been involved in developing these methodologies, auditing clients and developing systems for clients for many years, and we've come up with what we call the 17 Elements view, which is, in our view, everything you need to effectively manage pipeline integrity. This is a matrix that compares the various codes and standards with that view; you can see that ISO 55000, a very general standard on asset management, broadly covers all these topics. So we have a lot of guidance on how to actually undertake this task. Next slide, please.

Another way of looking at it is risk management throughout the life of the pipeline. Starting with the design phase shown here, the vertical axis is risk and the horizontal axis is time. In the design phase you're not really exposing people, the environment or your business to risk, or only very minimal risk; you're predicting forward what that risk might be. You undertake studies like HAZOPs and FMEAs, and then you reduce that risk by design until it's as low as reasonably practicable, putting those reduction measures in place. Then you go through the construction phase, during which nothing much changes; your future predicted risk is the same. But then you go into operation, and operation is where you start to get degradation. The horizontal dotted line is what we consider a tolerable level of risk. At some point you intervene.
That could be depressurization, undertaking a repair, or in fact inspecting. Inspecting alone won't change your risk, but it gives you the opportunity to do something that will, and it decreases uncertainty: you discover the condition of your line. You then continue through that process of intervening, undertaking repairs or other mitigating actions, until the risk eventually becomes intolerable. Ideally that would happen the day before your field becomes uneconomic. When you do a remaining life study, that's what you're trying to achieve, effectively: you want the risk to become unacceptable at about the same time as, or just shortly before, the end of the required life. After that you depressurize and decommission. Interestingly enough, in most cases it's very difficult to abandon an asset completely; there's always some kind of residual risk. For an offshore line that might be a risk to the environment or to fishing; for a large-diameter onshore line it could be a risk to other land users, water courses and so on. It's very hard to completely eliminate risk. So in a pipeline integrity management system, this is what we're trying to achieve in general terms: managing the risk of our asset throughout its life. Next slide, please.

This is an observation we've made that I hope some of you are familiar with; we're calling it the industry quadrilemma: four things that make our life as integrity engineers a bit more difficult. The first is an increased risk of failure. You may have heard the term "bathtub curve": at the beginning of an asset's life you get early-life failures, then typically a fairly long period of stability, and then you enter the wear-out phase and start to get failures. A lot of assets around the world are actually in this wear-out phase, many way beyond their initial design life; in the US many pipelines are more than 50 years old, and there are certainly 50-year-old pipelines in the UK as well. The second is scarcity of resources. For one reason or another, fewer people have been coming into the oil and gas business, people have been retiring, the average age is very high compared with other industries, and it takes a long time to train a pipeline integrity engineer. One and two also play together, because the skill set is changing: the skills required for aging assets are different from those for a business-as-usual asset, so they compound the problem. The third is more and more legislation. Governments and regulators are aware of the potential liabilities of aging assets, and there's also legislation regarding abandonment; you may be required to continue operating an asset even though you no longer have a use for it, because it's strategic infrastructure. So there's more and more, and more prescriptive, legislation. And the fourth is the new financial outlook.
Volatility of oil prices reduces confidence in investment, and a lot of fields are at plateau as well, producing less oil, so there's simply less money around and budgets are tighter. All these things work together to make life difficult for operators: more problems to worry about, harder to hire people, more legislative requirements, and less money going around. So that's the industry background. Next slide, please.

In the meantime, we have substantially evolving technology. These slides are about increasing computing power. The graph on the left shows the increase between 2000 and 2017: we have roughly 10,000 times more processing power in 2017 than we had in 2000. The cost of storage is radically cheaper, and the amount of data collected is radically higher as well, so we have cheap storage and high computing power. On the right, you may have come across Moore's law: this is the performance of CPUs, central processing units, on a single thread, multiplying roughly 1.5 times per year. Interestingly, around 2000 the technology started to make use of graphical processing units. The difference between a CPU and a GPU is that a CPU typically has a maximum of about 16 processors, more typically four or eight, whereas a GPU, even though each processor is a lot simpler, can have thousands, potentially 2,000 or even more. So computing power is massively greater and developing yearly. Next slide, please.

The other thing happening in the background is cloud computing. The first point is that with software as a service you can get enterprise-class, very high-reliability hardware at a snap of your fingers: with something like Amazon Web Services or Azure you can just enter your credit card details and it's there, up and running, and very reliable. On top of that we have a lot of different web services for maps, payments, accessing data, storing data, optical character recognition and much more; a lot of it is absolutely free, and if it's not free it's very low cost or pay-as-you-go. There's also collaboration all around the world developing open-source code; some of the most advanced computer science research is being done using open source. Some examples we're actually using ourselves are MongoDB, PostgreSQL, Kubernetes, Jupyter notebooks, Java and Angular: all free to use, actively developed, and very powerful. This is very empowering for industry, and for everybody as a whole; it gives fantastic opportunities for collaboration and problem solving across all industries. Next slide, please.

And there are expectations as well. We've got faster computers and all sorts of cloud-based services, and we all expect things to be available in a few clicks. Amazon is a good example: you can search for something in the morning, a few clicks, and it arrives the next day, or even the same day if you live in a city. People can find new skills, buy things, listen to music and read books
very quickly, just over the internet. The way we interact with data has radically changed, and there's no going back; that's the way things will be going forward. Next slide, please.

However, Penspen have been looking at pipeline integrity management systems and working with operators for many years, and, with no disrespect, I would say that even the most sophisticated operators still have a proliferation of spreadsheets, multiple legacy ERP systems, and a lot of manual movement of data between systems. Combined with the lack of resources and the lack of interoperability, it's hard to get insights and hard to make decisions, and there's a dependence on contractors as well. As I said, even the most sophisticated operators struggle with this. Next slide, please.

Not surprisingly, the oil and gas industry reflects this. This is some work done by Deloitte, who developed a digital maturity index, and the Oil and Gas Authority in the UK undertook research using it. You can see that oil and gas, compared with all these other industries, is one of the lowest: against telecoms, hospitality, travel, banking and securities it is very low down the list. Next slide, please.

In the background there are also more and more data sources. More and more autonomous sensors are being developed for real-time or quasi-real-time data: sensors to measure strain, curvature, vibration, position and pressure; sensors to monitor CP (cathodic protection) systems, undertake corrosion assessments and measure coupons; and online systems for things like weather forecasting and wave and wind predictions. There are also connections to third-party systems that are now quite common in the industry, OSIsoft's data historian being one example, and SCADA (supervisory control and data acquisition) systems. So we have a lot of new data sources coming into play, which relates to the explosion of data I just talked about. Next slide, please.

And there are advances in data sources as well. This is one particular system which uses a satellite to measure ground movements, applying synthetic aperture radar, or SAR. You've heard of LiDAR, but LiDAR typically has to be flown from a vehicle or plane relatively close to the ground and is quite expensive, whereas this technology allows us to measure ground movement from a satellite, so it's much more convenient and potentially a lot cheaper. It's typically being used now for building settlement, but there's no reason why it couldn't be used for pipelines, and it can achieve sub-centimeter measurements of ground displacement, so it's very accurate. Next slide, please.

Faster and better data sources, too: here are two examples of digitalization in data gathering. The first is the handheld 3D laser scanner. Typically, if substantial external corrosion is found during a DCVG survey, the pipeline might be exposed for direct measurement.
Pit measurements are then taken with a pit gauge and used in an assessment to understand the remaining life, or whether to undertake a repair. That's quite a tedious exercise, but the handheld 3D laser scanner doesn't require any particularly sophisticated setup: you just point it at the pipe, it has a stereo laser, and it performs a very accurate calculation of the outside surface of the pipeline. It has built-in analysis and diagnosis software as well, so it's many times the speed, and actually four times the resolution, of a pit gauge. The second example is fast digital imaging inspection for subsea survey of pipelines. Historically this was done using video from an ROV, which has to travel relatively slowly to capture the data, yet things typically don't happen that quickly on pipelines. Fast digital imaging inspection runs at something like five times the speed and takes very high-definition photographs intermittently along the pipeline, from which 3D models can be produced. It can combine side-scan, multibeam, laser, high-definition images, field gradients and non-contact CP data all simultaneously. So again, five times quicker: faster and much better data sources. Next slide, please.

Some of the audience may have heard about Industry 4.0, which is sweeping, and benefiting, a lot of industries around the world. To recap: Industry 1.0 was steam energy, a long time ago in the 18th century; then Industry 2.0, electrical power; Industry 3.0, automation; and now Industry 4.0, hyper-connectivity. This includes interconnection, the industrial internet of things, and the use of artificial intelligence, machine learning, big data and live data. So the interesting question is: how could that relate to pipelines for us? Next slide, please.

One important area is aligning data. We undertake inspection to understand the condition of our pipeline, whether from the inside or the outside, using various techniques; but if you want to predict into the future, the best way is to undertake another survey at an appropriate time and then match the data sets together. We've found that using some of these mathematical techniques we can match data together very accurately, whether based on the shape of the pipeline's curvature, on the welds and joint lengths, or on individual defects. The ability to align data is critical, because once data is aligned you can undertake an assessment, calculate the degradation rate, and use that to predict remaining life, and that might be across two, three, four or many data sets. Using computer science, this can be done very effectively. The benefit is better failure predictions and much better decisions in calculating things like remaining life. Next slide, please.
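[Editor's note: the run-to-run matching described above can be illustrated with a small sketch. The snippet below assumes two inspection runs that report girth-weld odometers and metal-loss defect lists; all names, numbers and tolerances are invented for illustration, and real tools also match on joint lengths, curvature and defect shape.]

```python
# Minimal sketch of run-to-run ILI alignment: estimate the odometer
# stretch/offset between two runs from girth-weld positions, then pair
# defects by aligned position and derive a growth rate per defect.
import numpy as np

def fit_odometer_map(welds_a, welds_b):
    """Fit odo_b ~= scale * odo_a + offset from candidate weld pairs."""
    welds_a, welds_b = np.asarray(welds_a), np.asarray(welds_b)
    # Pair each weld in run A with the nearest weld in run B (coarse guess).
    idx = np.abs(welds_b[None, :] - welds_a[:, None]).argmin(axis=1)
    A = np.column_stack([welds_a, np.ones_like(welds_a)])
    scale, offset = np.linalg.lstsq(A, welds_b[idx], rcond=None)[0]
    return scale, offset

def match_defects(defects_a, defects_b, scale, offset, tol_m=0.15):
    """Pair defects whose mapped positions agree within tol_m metres."""
    pairs = []
    for pos_a, depth_a in defects_a:
        mapped = scale * pos_a + offset
        best = min(defects_b, key=lambda d: abs(d[0] - mapped))
        if abs(best[0] - mapped) <= tol_m:
            pairs.append((pos_a, depth_a, best[0], best[1]))
    return pairs

# Synthetic example: run B's odometer is stretched 0.2% and shifted 3.1 m.
welds_2015 = np.arange(0, 12_000, 12.1)            # weld odometers, metres
welds_2020 = welds_2015 * 1.002 + 3.1
defects_2015 = [(1_250.0, 0.22), (7_830.5, 0.35)]  # (odometer m, depth fraction)
defects_2020 = [(1_255.6, 0.28), (7_849.3, 0.47)]

s, o = fit_odometer_map(welds_2015, welds_2020)
for pa, da, pb, db in match_defects(defects_2015, defects_2020, s, o):
    rate = (db - da) / 5.0                          # 5 years between runs
    print(f"defect @ {pa:.1f} m: depth {da:.2f} -> {db:.2f}, ~{rate:.3f}/yr")
```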
Being able to do that gives us a better understanding of degradation. Typically, corrosion rates have been calculated as one figure along the whole length of the pipeline, but these more sophisticated techniques let us calculate credible corrosion growth rates along the length of the line, which in turn lets us change our strategy depending on where we are on the line. Variances along the line could be caused by different soil characteristics, by different operating temperatures at one end of the line versus the other, or by different construction techniques; it could pick up things like preferential weld corrosion due to field-joint coating issues, or preferential weld corrosion due to cathodic and anodic differences inside the pipeline. So more sophisticated analysis gives a better understanding of mechanisms, better failure predictions and more optimal repairs. Next slide, please.

Better computing power also allows us to delve further into probabilistic assessment. Typically, a lot of work in the integrity area is done deterministically. In a deterministic analysis you take account of uncertainty by including safety factors: a safety factor on the yield strength of the material, safety factors in the choice of tolerances for your measuring device or tool, tolerances on the operating pressure. You can end up with quite conservative results, particularly for something involving many variables, like remaining life; done deterministically, you necessarily end up doing it in a very conservative way. A much better approach is to take account of the variation in the various inputs, and in the equations themselves, which lets you take a P50 result, or an average, or a P90. It's a much more sophisticated way of handling uncertainty, but numerically a lot more complicated. You may have heard the term structural reliability analysis: there are the first-order and second-order reliability methods (FORM and SORM), which are computationally quite hard to make robust, and there's a very robust way which is often used, called Monte Carlo simulation, but that is very computationally expensive; it can take literally hours, months or even years to undertake a calculation if you're looking for a very low probability. With modern computers, though, certainly with GPU parallelization, which gives you a thousand-plus processors on online services or two thousand on physical cards, and with other techniques, you can populate a surface based on Monte Carlo simulation and then fill in the gaps using interpolation, and there are further techniques like subset simulation for additional optimization. So faster computing and the use of machine learning can bring these probabilistic methods into a more practical domain.
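[Editor's note: to make the probabilistic idea concrete, here is a minimal Monte Carlo sketch of a remaining-life distribution. The distributions, the 80%-of-wall failure criterion and the sample count are illustrative assumptions, not any particular code's method.]

```python
# Minimal Monte Carlo sketch of a probabilistic remaining-life estimate.
import numpy as np

rng = np.random.default_rng(42)
N = 1_000_000                                   # vectorised samples; GPUs push this far higher

wall = 12.7                                     # nominal wall thickness, mm
depth = rng.normal(4.0, 0.8, N)                 # measured pit depth + tool tolerance, mm
rate = rng.lognormal(np.log(0.15), 0.4, N)      # corrosion growth rate, mm/yr
limit = 0.8 * wall                              # assumed failure criterion (80% of wall)

life = np.clip((limit - depth) / rate, 0, None) # years until the limit is reached

print(f"P50 remaining life: {np.percentile(life, 50):.1f} yr")
print(f"P90 (90% chance of exceeding): {np.percentile(life, 10):.1f} yr")
print(f"P(failure within 10 yr): {(life < 10).mean():.3%}")
```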
The benefit is probabilistic remaining life and probabilistic net present value: you can say there's a 90% chance of your remaining life being X. It leads to more informed, better decisions, taking account of safety and uncertainty in a more logical and scientific way. Next slide, please.

Automation is very much part of pipeline 4.0, Industry 4.0. Going through survey data manually can be very tedious. A good example is external survey data: if you have a three-hour video along a pipeline, you literally have to sit there for three hours and watch it. It's possible to speed it up a bit, but it's easy to miss something when you do. In other sciences, particularly medical science, it has been found that convolutional neural networks are actually better than the human eye at detecting and classifying features. You do this by assembling a set of training images and then training the neural network to detect the features. This means you can go through the video frame by frame, using the neural network to detect and highlight features. Generally speaking you may have to go back and review the results, but it should give a very good first pass to see whether anything critical has been found, whether there are any real showstoppers. The examples on the right are pipeline spans, weight-coating damage and movements. We can use this for a lot of the ILI data, predicting or looking for particular features; we can use it for CP data, external video images or internal damage, and there's been quite a lot of success with it. The benefit: a faster, better indication of anomalies. Next slide, please.
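[Editor's note: a toy version of the frame-screening approach might look like the following. The network, the class names and the 0.9 review threshold are hypothetical stand-ins; a real system would train a much larger model on labelled survey footage.]

```python
# Minimal sketch of frame-by-frame anomaly screening with a small CNN,
# assuming pre-extracted 128x128 greyscale video frames.
import torch
import torch.nn as nn

CLASSES = ["no_feature", "free_span", "coating_damage"]  # assumed labels

class FrameClassifier(nn.Module):
    def __init__(self, n_classes=len(CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.Flatten(), nn.Linear(32 * 32 * 32, n_classes)  # for 128x128 input
        )

    def forward(self, x):
        return self.head(self.features(x))

model = FrameClassifier().eval()              # untrained here; sketch only
frames = torch.rand(8, 1, 128, 128)           # a batch of 8 synthetic frames
with torch.no_grad():
    probs = torch.softmax(model(frames), dim=1)

# Flag frames where an anomaly class clears a (hypothetical) 0.9 threshold.
for i, p in enumerate(probs):
    cls = int(p.argmax())
    if CLASSES[cls] != "no_feature" and p[cls] > 0.9:
        print(f"frame {i}: review - {CLASSES[cls]} ({p[cls]:.2f})")
```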
There's also the analysis of complex features, quite often the interaction between various anomalies in the pipeline. More often than not it's easy to simplify them and make a very conservative judgment; there are more sophisticated methods, but they are again computationally expensive. This is an actual example of a five-metre-long defect which has been analyzed probabilistically, something that with conventional techniques would have been extremely onerous from a computational point of view; using GPU computing and artificial intelligence it can be done very quickly. The benefit here is much better decisions and much better predictions for complex defects, things like interacting dents and interacting anomalies. Next slide, please.

And this one, I think, is a really interesting one. If you're applying Industry 4.0, pipeline 4.0, principles, you can be collecting real-time data. You can use it, for example, to calculate fatigue life on the fly, taking account of variations in product composition and variations in pressure; with a digital twin, effectively, you can calculate real-time cost of failure, real-time net present value or economics, real-time time-to-next-inspection, and so on. This gives a better understanding of the long-term consequences of operating: better scenarios, better decisions. Next slide, please.
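[Editor's note: as a rough illustration of "fatigue life on the fly", the sketch below accumulates Miner's-rule damage from a window of live pressure samples. It uses simple turning-point extraction rather than full rainflow counting, and the S-N constants and hoop-stress factor are invented for the example.]

```python
# Minimal sketch of on-the-fly fatigue tracking from live pressure data.
import numpy as np

C, m = 1e12, 3.0                      # assumed S-N curve constants, N = C / S^m (S in MPa)
D = 16.0                              # assumed hoop-stress factor: sigma = p * D/(2t)

def turning_points(p):
    """Keep only local peaks/troughs of the pressure trace."""
    d = np.diff(p)
    keep = np.r_[True, d[1:] * d[:-1] < 0, True]
    return p[keep]

def damage_increment(pressure_window_bar):
    """Miner's-rule damage for one window of pressure samples."""
    tp = turning_points(np.asarray(pressure_window_bar, float))
    ranges_bar = np.abs(np.diff(tp))          # successive half-cycle ranges
    stress = ranges_bar * 0.1 * D             # bar -> MPa, times hoop factor
    stress = stress[stress > 1.0]             # ignore tiny fluctuations
    return float(np.sum(0.5 * stress**m / C)) # half cycles -> damage fraction

# Feed in an hour of (synthetic) pressure samples as they arrive:
rng = np.random.default_rng(0)
pressures = 60 + 5 * np.sin(np.linspace(0, 20, 3600)) + rng.normal(0, 0.5, 3600)
total = damage_increment(pressures)
print(f"damage this window: {total:.2e}  (life used up when the running sum hits 1.0)")
```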
There's also the turnaround of data: going from running a tool to obtaining a report. Historically, from my own experience, and to be fair I know a lot of inspection companies are already improving on this, you'd run the tool and download an enormous amount of data; it would be put onto a hard disk, couriered to a lab, processed, and written up into a report, which would then be reviewed and sent off to the client. Within a pipeline 4.0 environment, there's no reason why you can't run the tool, upload the data to the cloud, do your automatic anomaly detection to give an extremely early indication of any serious issues, process the data in the cloud using cloud computational power, and deliver the report to your client in the cloud as well. The benefit is a much faster turnaround between running the tool and getting the results, and better results. Next slide, please.

So, taking all this into consideration: Penspen have been in the pipeline integrity business for about 25 years, probably closer to 30, and we set about creating a pipeline integrity management system in the cloud. The vision was that it would be cloud-native, taking advantage of the most up-to-date cloud computing: not a virtual machine or remote desktop, but completely native cloud, so that all you need is a browser. We knew we'd never compete with ERP solutions, Maximo, Esri or SAP for example; we provide subject matter expertise, so we wanted to allow those systems to be integrated and to take data from them to do better analysis. Data ingestion we wanted to be drag-and-drop, so you don't have to do lots of the formatting that has typically been done in the past, and alignment between data sets would be automatic, with smart recognition of any kind of format. Also data and document management: we know that operators with even a relatively small pipeline network, just a few pipelines over many years, develop quite a complex data management requirement, so operators with hundreds of pipelines have a very large data management task; building in document management was, we thought, an inherent part of the system. Also access to live data feeds, things like dynamic life-cycle costing, and KPIs such as remaining life and time to next inspection; we thought that was a very important part of the vision. Also flexibility, particularly in defining key performance indicators: we've observed and set up a lot of KPIs for our clients, and I'd say no two are the same; they're all very different. So our vision is a system where clients can be totally flexible, at the level of the organization and of individual pipelines. Also role-based portals, to make things simpler: different people are potentially involved, CP engineers, asset managers, and an important thing is to provide only the information relevant to each person, to keep it straightforward. We've also invested a lot over the years in e-learning material related to pipeline integrity, and we want that built into the system, and in a similar way document libraries, so access to methodology and standards is built in. Also benchmarking: by benchmarking we mean comparing an operator's pipeline integrity management system, on some sort of qualitative methodology, with others, so you can compare and plot your progress. And of course the last one, which is very important: enhanced security and control of data has to be very good to be acceptable to operators. Next slide, please.

So Penspen have roughly 25 years in this business and have developed a lot of code and methods, but never a truly cloud-based solution. To solve this we partnered with a company called QiO, who have developed an industrial analytics platform and have a number of key clients, including Rolls-Royce, using the system. We built our product, which we call Thea, on their platform, and that gives us a lot of things for free. Their platform has three layers: a data layer, a model layer and an application layer. Within the data layer, using the QiO platform, we get a way to connect to edge computers and real-time devices, and if there's an API we can connect to it, which allows us to connect to things like SAP, Maximo and Esri, for example, or we can drag in files and ingest those. In the bottom layer, on the right-hand side, we have visualization engineers and data engineers to understand the data. Then our subject matter experts work at the next layer, where we develop models; that includes our risk assessments and our integrity assessments, and this is where our main input really is. Then we turn those into applications, which move to the application layer for our clients. The system is multi-tenant, so it doesn't require a separately hosted instance for any particular operator; it can be shared amongst operators, though it doesn't preclude anyone from having their own instance. So that's how the system works, in very general terms; it's a lot more complicated of course, but we've built it on QiO's industrial analytics platform, so we get a lot of the functionality for free, effectively. Next slide, please.

So where are we? We're somewhere about a year or so into our development. Obviously one of the very important things is getting data in, so we've developed a drag-and-drop methodology; that was one of the challenges. We recognize all the various columns you can potentially have in comma-separated value (CSV) files or Excel files, and the way it works is that we have a large number of these file formats built in. If a user comes across one that doesn't exist, it's possible to model it: there's a way of picking the columns and choosing them, and that can be saved so we have it for the future. The more the system is used, the more it learns, and the better the ingestion routine gets. The other thing is getting from SQL into Thea: we can accept a PODS database, or in fact any other live data source, using a number of different protocols. So, largely thanks to the QiO platform, it is pretty good at ingesting data. Next slide, please.
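[Editor's note: the drag-and-drop ingestion described above could be sketched as a learnable header-mapping step. The canonical field names, aliases and JSON persistence below are illustrative guesses, not Thea's actual scheme.]

```python
# Minimal sketch of "learnable" column recognition for drag-and-drop ingestion.
import json, pathlib

ALIASES = {
    "odometer_m":  {"odometer", "log dist", "distance (m)", "chainage"},
    "depth_pct":   {"depth %", "depth (%wt)", "metal loss %"},
    "length_mm":   {"length", "axial length (mm)"},
}
LEARNED = pathlib.Path("learned_formats.json")   # persisted user mappings

def map_headers(headers):
    """Map raw file headers onto canonical fields via aliases + learned picks."""
    learned = json.loads(LEARNED.read_text()) if LEARNED.exists() else {}
    mapping, unknown = {}, []
    for h in headers:
        key = h.strip().lower()
        hit = learned.get(key)
        if hit is None:
            hit = next((f for f, al in ALIASES.items()
                        if key == f or key in al), None)
        if hit:
            mapping[h] = hit
        else:
            unknown.append(h)
    return mapping, unknown

def learn(header, field):
    """Save a user's manual pick so future files ingest automatically."""
    learned = json.loads(LEARNED.read_text()) if LEARNED.exists() else {}
    learned[header.strip().lower()] = field
    LEARNED.write_text(json.dumps(learned, indent=2))

headers = ["Log Dist", "Depth (%WT)", "Length"]
mapping, unknown = map_headers(headers)
print(mapping, "unmapped:", unknown)             # all three resolve via aliases
```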
We thought it very important that one of our early module developments was around dashboards; pretty much all clients want dashboards, so we came up with a completely flexible system. It can take data from within Thea, in other words the results of your calculations; from outside Thea, which could be manually entered; and it can connect to third-party systems. Top right is a matrix view where you can see all your pipelines across the top and the different criteria down the side, with a traffic-light system that is totally customizable, with up to five different colours if required. There's a current-status perspective, but you can also toggle between current status and trend, so you can see how your KPIs are varying over time. Bottom right we have various graphs, with the traffic-light colours replicated on the graphs, so we can have a field overview, a trend overview, and organizational KPIs as well, which could be progress against competencies, against hiring, or against overall inspection targets or budget. Again, completely flexible for operators to design and create as they wish. Next slide, please.

The other thing we've done is build in the ILI assessment routines. You can drag your data in, set the criteria for the assessment and the methodology you want to use, immediately get results in a graphical, explorable form, and then create an automated report. The automated report contains the methodology, which varies depending on the assessment; a narrative; a conclusion; a summary of the data; and insights developed from the analysis. The way we've written it, it can also be produced in multiple languages; the example here is actually in Spanish. Next slide, please.
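[Editor's note: for a flavour of what an automated ILI assessment routine computes, here is a sketch in the style of the Modified B31G (0.85 dL) metal-loss method. The webinar doesn't say which methods Thea implements, so treat this as a generic example with illustrative inputs.]

```python
# Minimal sketch of a metal-loss acceptance check (Modified B31G style).
import math

def modified_b31g_safe_pressure(smys_mpa, d_mm, t_mm, depth_mm, length_mm,
                                design_factor=0.72):
    """Estimated safe operating pressure (MPa) for one corrosion defect."""
    flow = smys_mpa + 69.0                       # flow stress: SMYS + 69 MPa
    z = length_mm**2 / (d_mm * t_mm)
    if z > 50:                                   # long-defect regime
        m = 0.032 * z + 3.3
    else:
        m = math.sqrt(1 + 0.6275 * z - 0.003375 * z**2)   # Folias factor
    dt = depth_mm / t_mm
    sf = flow * (1 - 0.85 * dt) / (1 - 0.85 * dt / m)     # failure stress
    return design_factor * sf * 2 * t_mm / d_mm           # Barlow, with factor

# Example: X52 pipe (SMYS 359 MPa), 508 mm OD, 7.1 mm WT, 3 mm x 120 mm defect
p_safe = modified_b31g_safe_pressure(359, 508, 7.1, 3.0, 120)
maop = 6.0                                        # MPa, assumed
print(f"safe pressure {p_safe:.2f} MPa, ERF = {maop / p_safe:.2f}")
```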
And e-learning: over the last 20 years or so Penspen have developed a very large set of e-learning material, something like ten and a half thousand slides, in e-learning format, SCORM, in fact HTML5, and this is built into the Thea platform. It covers defect assessment, geohazards, pipeline materials, onshore and offshore pipeline engineering, and risk management. You can dip into it via a search routine, searching for the slides you want to get the background on the assessment you're doing, or you can go through it systematically. It goes up to university postgraduate level; it has actually been used on a couple of postgraduate programs here in the UK and in Mexico as well. Next slide, please.

And obviously data visualization is very important as well: superimposing pipeline shapes from one set of data against another, and superimposing profiles against maps. In the bottom-right example a pipeline has been displaced quite considerably, and you can pick that up very quickly. You can use algorithms to detect insights within your data and then use the data visualization to drill down to where that actually is along your pipeline length. So you can very quickly find problematic areas before the situation escalates: a very quick route to prognosis, diagnosis and determining the required outcome. Next slide, please.
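[Editor's note: detecting the kind of displacement shown in that example can be as simple as differencing two resampled survey profiles. The sketch below flags spans where the lateral offset between runs exceeds a hypothetical 0.5 m threshold, using synthetic data.]

```python
# Minimal sketch of flagging pipeline movement by superimposing two surveys.
import numpy as np

def lateral_offsets(chain_a, y_a, chain_b, y_b, step=1.0):
    """Interpolate both profiles to a shared chainage grid and difference them."""
    lo = max(chain_a.min(), chain_b.min())
    hi = min(chain_a.max(), chain_b.max())
    grid = np.arange(lo, hi, step)
    return grid, np.interp(grid, chain_b, y_b) - np.interp(grid, chain_a, y_a)

# Synthetic surveys: the later run shows a 0.8 m lateral shift near 425 m.
chain = np.linspace(0, 1000, 2001)
y_2018 = 0.1 * np.sin(chain / 90)
y_2021 = y_2018 + 0.8 * np.exp(-((chain - 425) / 25) ** 2)

grid, dy = lateral_offsets(chain, y_2018, chain, y_2021)
moved = grid[np.abs(dy) > 0.5]                    # hypothetical 0.5 m threshold
if moved.size:
    print(f"possible movement between {moved.min():.0f} m and {moved.max():.0f} m")
```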
Having said that, we've invested a lot of time and money in this, but there are some challenges we've come across. Certainly, for the automation of anomaly detection you need a lot of data, probably more than any one operator has, so I think there's a lot of benefit in collaboration: if operators were to pool their data, there would be a lot of benefit in doing that, to get a better, faster prognosis. There's also quite a bit of investment in legacy systems; getting people to adopt new things when they've already invested in software and their own methodology is hard, and part of that is change management: you have to have a compelling case, on cost and centralization, and get the communication right as well. For very high-risk issues, or very specific areas, it makes sense to instrument; it doesn't make a lot of sense to put strain gauges on pipelines that aren't subject to movement, and putting sensors along a whole pipeline is very expensive, really only justified where the consequences of failure are very high, although there is a big gain to be had. Lack of interoperability between IT systems is again a challenge; most operators complain about this, and even operators with very sophisticated systems still end up manually transporting data between one and the other. We've also had some issues with the lack of common data formats; there are some standards around this, PODS, the Pipeline Open Data Standard, which I'm sure some of you are familiar with, and some new ones being developed as well. And of course there are security concerns; some operators just won't accept a cloud environment, but I think that is changing. Next slide, please.

Another observation, which I think is an important one: we're providing Thea as software as a service, but do operators really want to do all this by themselves? We certainly hope so, and we're certainly encouraging it; some already do, and some hopefully can be persuaded to do it themselves. But there is another option, which we call augmented consultancy: Penspen would use these tools to provide a better service and deliver the information, using Thea, to the client. The client just gets the results; they don't undertake the work themselves. The other important point is that the routine tasks tend to be dealt with very quickly using Thea, which gives people time to provide better results and better insights to their clients.

The development and use of these types of systems is also driving new skill sets. Pipeline and subsea engineers of the future will, I think, need digital skills, and engineers who possess both pipeline and digital competencies are likely to become the key people. The ability to use these systems, get insights and define actions is an incredibly important skill. And this matters: we talked about the difficulty of recruiting in oil and gas, which is a real issue; if we mix data management and data science into the skills we recruit for, I think it makes for a much more attractive career path. Next slide, please.

There are some positives with regard to security as well. There's a standard, ISO 27001, which has very clear guidelines about security; you can have penetration audits, what we call ethical hackers; there's multi-factor authentication; and there's the possibility of a private cloud, so if multi-tenancy is not acceptable it's possible to do it completely privately within an operator's own secure environment. An example of how things are changing generally is that Microsoft won the Pentagon's 10-billion-dollar JEDI contract against Amazon Web Services; this is putting the whole Joint Enterprise Defense Infrastructure into the cloud, the Pentagon's enterprise cloud solution, effectively. If it's good enough for the Pentagon, I think we should take some comfort in that. Next slide, please.

And nearly last, I'd like you to meet Raheel. [Video] "A day in the life of Raheel, a pipeline engineer. With a fresh pot of Arabica coffee on the go, Raheel's day is off to a smooth start. But here comes the latest in-line inspection data: 10 gigabytes. Looks like you're going to need more of that coffee, Raheel. There's so much data, all with different formatting, and so much work to do. But Raheel has an idea. Thea is our new secure integrity management solution in the cloud. Thea makes light work of all the data, utilizing advanced data and engineering analytics to provide useful insights into Raheel's pipelines. Now that Raheel is spending less time managing and formatting data, he can focus on the safe and efficient running of his pipelines. Go on, Raheel, have another coffee: you've earned it." Next slide, please.

So, just to summarize: digital transformation and Industry 4.0 concepts have a lot to offer the continued safe operation of pipelines, and will certainly lead to better insights and better decisions. There are some roadblocks, but I think the
industry could benefit from closer cooperation, and if some of these things could be non-proprietary, with open outputs, that would help; this applies to data standardization, data formats and protocols. Doing this in an open way will benefit the industry as a whole. The link between engineering and data science is becoming increasingly important; we can see that from our own work, and I think it's important for the industry as well. Thea is very much part of this, and we're really looking for opportunities to try it out as much as possible, to improve it. So thank you very much for your attention. I believe we have eight minutes left, so I'm happy to answer any questions.

[Q&A] Thanks very much, Nigel, for the presentation. We've had a few questions come in. First: I notice you didn't talk much about risk assessment. Is this planned?

Yes, risk assessment is an important part of Thea, and it's our next major development. We're going to approach it from a qualitative point of view, for RBI, and from a quantitative point of view as well. It was always part of the roadmap; we had to start somewhere, so risk assessment is our next major challenge.

Thanks, Nigel. Can you provide Thea in a private cloud environment?

Yes, absolutely. It's a lot more expensive, of course, but for clients that are very concerned about security we can certainly provide it in a private cloud environment.

A question has just come in as well: is Thea available now?

Yes, it is. We launched it officially about a month ago. There's a "request a demo" button on the website; click that and our team will contact you and show it to you. We've got a bit of a roadmap as well, so clients can see what we're planning to add over the next year or so. It's software as a service on a subscription model: just log on from a web browser, drag your data in, and then undertake assessments and build KPIs; it's all there.

Fabulous, thank you very much. If you have any further questions you'd like to put to Nigel Curtin after the webinar, I've posted his email address in the Q&A, so feel free to contact him directly. To round today up, on behalf of Penspen, we hope you enjoyed the webinar. As mentioned at the beginning, Penspen is running a series of webinars; these will be announced on our Penspen LinkedIn page and website. The next webinar will be on the 18th of November and is entitled "QRA for HP Gas Pipelines in Europe". Thank you for joining, and I hope you enjoy the rest of your day. Thank you very much.

Be ready to get more

Get legally-binding signatures now!

Sign up with Google