Pipeline integrity data management for product quality
Why choose airSlate SignNow
- Free 7-day trial. Choose the plan you need and try it risk-free.
- Honest pricing for full-featured plans. airSlate SignNow offers subscription plans with no overages or hidden fees at renewal.
- Enterprise-grade security. airSlate SignNow helps you comply with global security standards.
Pipeline integrity data management for product quality: How-To Guide
With airSlate SignNow, you can enjoy the benefits of a user-friendly interface, secure document storage, and seamless collaboration. The platform empowers businesses to send and eSign documents with ease, making it a cost-effective solution for your document management needs.
Streamline your document signing process today with airSlate SignNow and experience the efficiency of pipeline integrity data management for product quality.
FAQs
- How do you ensure data integrity in a data pipeline?
  Understand data quality in the context of your pipeline, assess your current data quality, implement data cleansing techniques, apply data validation and verification strategies, run regular data quality audits, and leverage automation for continuous data quality improvement.
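The validation-and-audit steps above can be sketched as a simple quality gate that a pipeline stage runs before loading data. This is a minimal illustration, not a production framework; the field names (`batch_id`, `pressure_psi`) and the accepted pressure range are hypothetical.

```python
# Minimal sketch of a per-record quality gate in a data pipeline.
# Field names and the valid range are illustrative assumptions.

def validate_record(record: dict) -> list:
    """Return a list of quality issues found in one record."""
    issues = []
    if not record.get("batch_id"):
        issues.append("missing batch_id")
    pressure = record.get("pressure_psi")
    if pressure is None:
        issues.append("missing pressure_psi")
    elif not (0 <= pressure <= 5000):  # plausible operating range (assumed)
        issues.append("pressure_psi out of range: %r" % (pressure,))
    return issues

def run_quality_gate(records):
    """Split records into accepted and rejected, keeping the issues
    found with each rejected record as a lightweight audit log."""
    accepted, rejected = [], []
    for rec in records:
        issues = validate_record(rec)
        if issues:
            rejected.append((rec, issues))
        else:
            accepted.append((rec, issues))
    return accepted, rejected
```

Rejected records are kept alongside their reasons rather than silently dropped, which supports the "regular audits" step: the rejection log itself becomes auditable evidence.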
- What are the issues with pipeline integrity?
  Flaws can be introduced by improper processing of the metal or by welding defects during initial construction, and handling of the pipe during transportation may cause dents or buckling that compromise the pipeline.
- What is pipeline integrity management?
  Pipeline Integrity Management (PIM) is the cradle-to-grave approach to understanding and operating pipelines in a safe, reliable manner.
- How do you ensure data quality and integrity?
  Employ data wrangling measures, including cleaning, updating, removing duplicates, correcting errors, and filling in missing values.
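The wrangling measures just listed can be sketched in a few lines of plain Python. This is an illustrative sketch only; the record keys and default values are hypothetical.

```python
# Sketch of the wrangling steps named above: normalize string fields,
# fill missing values with defaults, and drop exact duplicates.
# Record keys and defaults are illustrative assumptions.

def wrangle(rows, fill_defaults):
    seen = set()
    cleaned = []
    for row in rows:
        # cleaning: strip whitespace from string fields
        row = {k: v.strip() if isinstance(v, str) else v
               for k, v in row.items()}
        # filling in missing values
        for field, default in fill_defaults.items():
            if row.get(field) in (None, ""):
                row[field] = default
        # removing duplicates: skip rows we have already emitted
        sig = tuple(sorted(row.items()))
        if sig in seen:
            continue
        seen.add(sig)
        cleaned.append(row)
    return cleaned
```

Note the order matters: normalizing and filling values before deduplicating means that `" Ada "` and `"Ada"` are recognized as the same record.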
- How do you ensure data integrity in a database?
  Best practices include data validation and verification, access control, data encryption, regular backups and recovery plans, data versioning and timestamps, audit trails and logs, and error-handling mechanisms.
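One of the practices listed, audit trails, can be made tamper-evident by chaining entry hashes, so that editing any earlier record invalidates everything after it. The sketch below is a minimal illustration using the standard library, not a full audit-log implementation; the entry fields are assumptions.

```python
import hashlib
import json

# Minimal sketch of a tamper-evident audit trail: each entry's hash
# covers its own content plus the previous entry's hash, so any edit
# to an earlier record breaks the chain. Fields are illustrative.

GENESIS = "0" * 64

def append_entry(trail, action, user, when):
    """Append one audit entry, linking it to the previous entry's hash."""
    prev_hash = trail[-1]["hash"] if trail else GENESIS
    entry = {"action": action, "user": user, "ts": when, "prev": prev_hash}
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["hash"] = hashlib.sha256(payload).hexdigest()
    trail.append(entry)
    return entry

def verify_trail(trail):
    """Recompute every hash; return False if any entry was altered."""
    prev = GENESIS
    for entry in trail:
        body = {k: v for k, v in entry.items() if k != "hash"}
        if body["prev"] != prev:
            return False
        payload = json.dumps(body, sort_keys=True).encode()
        if hashlib.sha256(payload).hexdigest() != entry["hash"]:
            return False
        prev = entry["hash"]
    return True
```

The same idea underlies append-only log designs: integrity is verified by recomputation rather than trusted storage alone.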
- How do you ensure data quality in a data pipeline?
  Data quality checks can include uniqueness and deduplication checks (identify and remove duplicate records), validity checks (validate values against domains, ranges, or allowable values), and data security checks (ensure that sensitive data is properly encrypted and protected).
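The uniqueness and validity checks just described can be expressed as a small reporting function. This is a sketch under stated assumptions: the key column `sample_id`, the `status` field, and its allowed domain are all hypothetical.

```python
from collections import Counter

# Sketch of two of the checks listed above: a uniqueness check on a
# key column and a validity check against an allowed domain.
# Column names and the allowed values are illustrative assumptions.

ALLOWED_STATUS = {"pass", "fail", "pending"}  # hypothetical domain

def quality_report(rows, key="sample_id", status_field="status"):
    """Return duplicate keys and keys of rows with invalid status."""
    counts = Counter(r[key] for r in rows)
    dup_keys = [k for k, n in counts.items() if n > 1]
    invalid = [r[key] for r in rows
               if r[status_field] not in ALLOWED_STATUS]
    return {"duplicates": dup_keys, "invalid_status": invalid}
```

In practice a check like this would run on each pipeline batch, with nonempty results either blocking the load or raising an alert, depending on risk tolerance.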
- What does a pipeline integrity engineer do?
  Typical responsibilities include evaluating pipeline casings and road and water crossings, developing and optimizing inspection plans, identifying preventative and mitigative measures along with re-assessment intervals and methods, and monitoring and surveilling integrity parameters to ensure reliable operations.
- How do you ensure data quality and integrity in a data pipeline?
  Implementing robust validation and verification strategies is essential. Data validation involves checking incoming data against predefined rules and criteria to ensure it meets quality standards.
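"Checking incoming data against predefined rules" can be modeled directly as a rule table: each rule pairs a human-readable description with a predicate. The rules below are illustrative examples, not a prescribed rule set.

```python
# Sketch of rule-based validation: each rule is a (description,
# predicate) pair applied to an incoming record. The specific rules
# and field names are illustrative assumptions.

RULES = [
    ("id is non-empty",
     lambda r: bool(r.get("id"))),
    ("value is numeric",
     lambda r: isinstance(r.get("value"), (int, float))),
    ("value is non-negative",
     lambda r: isinstance(r.get("value"), (int, float)) and r["value"] >= 0),
]

def check(record, rules=RULES):
    """Return the descriptions of all rules the record fails."""
    return [desc for desc, pred in rules if not pred(record)]
```

Keeping rules as data rather than inline `if` statements makes the criteria reviewable by quality staff and lets failure messages trace back to a named rule.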
Webinar transcript: Integrating Data Integrity Principles into Your Quality Management System
Hi everyone, welcome, and thank you for joining us for today's webinar, "Integrating Data Integrity Principles into Your Quality Management System." My name is Ashley Sharik; I'm one of the growth marketing managers here at Qualio, and I'll be moderating today's session. The webinar will run about 45 minutes, with a question-and-answer session directly after. If you have any questions during the webinar, please type them in the chat and we'll address them at the end of the session. If we can't get to your question today, we'll follow up via email in the next couple of days, and you'll also receive a copy of this webinar through the email you registered with. A quick note about Qualio: Qualio enables life science teams to launch and scale life-saving products. We get new products to market faster by simplifying and uniting your team's processes and data in all-in-one quality management software. I'm very excited to introduce our speakers today: Shirley Hart, a consultant and owner of SM Hart Systems LLC, and Tanya Sharma, co-founder of Ashura LLC and Smart Consulting. On that note, I'll pass it on to the presenters and let them take it from here.

Great, thanks. As Ashley said, I'm an independent consultant; I've been working in lab informatics for about 30 years and consulting for over 10. Tanya, if you want to give a brief introduction.

Thank you so much, Ashley, for such a kind introduction. I'm the co-founder of Ashura LLC, and I have expertise in data integrity; I work on projects involving data integrity and emerging technologies like blockchain, and I also lead several initiatives within ISPE, so I'm happy to connect on LinkedIn as well. Today, as Ashley mentioned, our topic is integrating data integrity principles into your QMS. We're going to talk about what data integrity is and why we're talking about it, give an introduction to data integrity by design, cover incorporating standardized data integrity templates into your QMS, have a discussion on computer software assurance, and then leave some time for Q&A.

We talk about data integrity a lot in our industry. Data integrity is the completeness, consistency, and accuracy of data. Oftentimes it's treated as a project, a remediation effort, or a matter of being inspection-ready, but data integrity should really be built into the fabric of the organization. It starts with building a culture of data integrity, with leadership taking ownership of communicating data integrity principles into the organization so they channel through to others: spreading awareness and holding data integrity trainings. Only then can we have policies and procedures related to data integrity, control behavioral, technical, and procedural gaps, and have supporting processes like validation that build consistency into our process. Being inspection-ready, or qualifying a system, comes with it, as you'll see in this presentation. Once lost, data integrity can never be regained, and that's why it's so important to look at data integrity holistically.

So why are we talking about data integrity, and what exactly does it mean? If you follow FDA inspections, a lot of the recent 483s, probably over the last ten years, cite data integrity, and a lot of these problems come from silos: you have data in different areas, or two or more areas doing the exact same thing but doing it differently, so you have similar data housed or managed differently. For example, I recently had a customer whose GCP and GMP operations had two different quality managers and completely different SOPs for change control. My project came along and just needed change control for software; in order not to conflict with their existing SOPs, we ended up writing a software change control SOP just to avoid that problem, but there really should be just one SOP for change control.

There's also the risk-based approach, which has been around for a long time with GAMP 5 and even before that. What does it mean? It boils down to: if you have higher risk, you want more controls. But determining whether something is high risk can be very subjective, so you're better off having some way of calculating the risk, or at least doing it in a consistent manner, and that involves training and defining what it means for your company and what you're using it for. Risk determinations can be done at different levels. At a corporate level it could be your comfort level: whether you're willing to take on risk by doing less validation, or whether you want more because you have a comfort level you need to fulfill. For software, it's usually the risk of something failing, or of not catching it when it does. Higher-risk items like security warrant more testing; the text in a tooltip is pretty low risk. But I've seen places that test everything to the same level across the board. Critical thinking comes into play: you need to document what your approach is and how you're going to align those. It also aligns with current regulatory thinking: the little "c" in cGMP means "current," so the bar is always rising over time, and the data integrity approaches and CSA go along with that.

One question that comes up a lot is data integrity versus data quality. They're often used interchangeably, but there are differences. Data quality means fitness for the intended use, including the metadata and context around the data. My favorite example: I've had systems where a number is meaningless unless I have units to go with it, and in some cases, with the wrong units, trouble can happen. The data needs to be useful. Data quality usually deals with processes. A long time ago, a new system was put in and I was asked to validate it; it had an audit trail, but the audit trail was not human-readable. I could open it up, and all I could see was basically garbage, so I couldn't validate it to Part 11 requirements; the quality wasn't there because it wasn't usable. Data integrity comes into play when you're maintaining that quality of the data over time, making sure it's there and you can trust the data. That's where you often see technology and surrounding processes like backups, reviews, and periodic reviews. You can have one without the other; they overlap a little, but both are needed. Recently, I'd say data integrity has been the more overlooked of the two, which is why you've seen so many 483s and findings around it.

Here are some examples of having one without the other. Data quality without data integrity: instruments transfer data to a LIMS, but users on the instruments can modify or delete the data prior to the transfer. You have a good process for transferring the data, but the data coming in is very questionable. The flip side, data integrity without data quality: you have a secure transfer of data to the LIMS, but test methods on different instrument types, different groups, or even different sites appear differently in the LIMS for the same exact measurement. The data is trusted, but it's very difficult to use for trending or reports. You have data integrity, but not the quality; the data isn't as usable as it should be.

Thank you, Shirley. I also wanted to share data integrity by design. Data Integrity by Design is a guidance document by ISPE, and it has a really good visual showing that achieving data integrity requires unifying systems and walking through the risk elements throughout your system life cycle, applying, as Shirley mentioned, critical thinking and an understanding of risk throughout the process. Risk should be evaluated based on patient safety, product quality, and data integrity, because ultimately we are responsible for patient data and for serving patients. If you start from the business process, you can understand and determine the inputs and outputs of the data being created, how it's transferred within the functional area, and how it interacts with different systems. That way we can control data integrity by understanding how our data flows within that function, and we can plan our data life cycle better. In parallel, we need periodic review and change control procedures as well. We really need to emphasize using a risk-based approach and critical thinking throughout the computerized system life cycle.

Before the next slide, I also wanted to add that, based on FDA findings and inspections, 41 to 48 percent of errors happen in the specification phase, which is why standardized document workflows matter so much. There are several issues when it comes to data integrity, but we want to highlight consistency, standardization, and traceability. We need to eliminate barriers and silos in documentation, and that's where we can leverage quality management systems as a repository to maintain consistency throughout the document life cycle. Some examples of how a QMS helps keep everything in one place: if we're qualifying a new system and we have a standardized document workflow, we use the same steps, maintain consistency in our qualification process, and can visualize what comes next as we go. We can assess the impact of the system with a GxP impact assessment, see what the periodic review and change control procedures are, and, as we plan our qualification, use the document workflow as a tool, with standard procedures for validation templates, data integrity questionnaires and checklists, and supplier information.

On the other hand, if the same system is deployed at multiple sites, there can be redundancy: departments might have different user requirements and different risk assessments for the same computerized system, and that system might have different naming conventions, different test results, and a different process for reviewing documents. This adds inconsistency across the organization, leads to inadequate risk assessments, and isn't a unified approach across sites, especially when we want to trend data, look at root causes, or troubleshoot. That's why it's really important to have the same procedures and processes for all sites and to use the quality management system to convey the importance of following a document workflow that everybody has access to and can use.

Some questions to consider when we add approvals and reviews to our documents: do we have the right people on board for review, are the process owners and system owners providing the feedback we need, and are we capturing the right reviews? Having data integrity governance, quality standards, computerized system planning, data integrity training, and vendor management provides traceability for regulatory purposes, along with access and guidance for everyone. You'll be able to trend and deep-dive into root causes, and you can trend the system itself: if the same problems are happening with the system at two different sites, you can leverage that information and see the system's history. Especially in computerized system planning, requirements can be tested correctly, test scripts can be developed from the same requirements, and configuration can be correctly captured in a system admin SOP. If we build our data integrity workflow into our QMS, we really reduce the risk of deviations and data integrity issues, because everything is in one place rather than siloed across several document management systems, and we can problem-solve and troubleshoot a lot better. Another example: if you have a URS, you can include all of the data integrity requirements as part of the URS, so every system goes through the same process and the same critical thinking. The foundational data integrity documents should not be siloed across different document management systems, so it's really worth considering a unified approach with everything related to data integrity in one place.

And like Tanya was saying, data integrity and data quality go hand in hand. Data integrity often looks at security and reviews, while data quality looks specifically at processes and naming conventions, and she touched on naming conventions as being key. From the quality side, make sure your internal data audits also include all your data integrity aspects: how often are reviews conducted, how often is backup and restore tested? I've had instances where the validation went great and the backup was validated, but after the validation the backup never actually occurred on the instrument and was never verified. And restores do need to be tested.

Computer software assurance (CSA) is a term that's been floating around for a couple of years now. It recently appeared in the ISPE Data Integrity by Design guide, in Appendix S2. The background: it came out of the FDA-industry CSA team, FI CSA (I don't know if they pronounce that "fixa"), which started in 2014 and has members from across the industry as well as consulting companies: there's FDA representation, Gilead, Johnson & Johnson, Medtronic, Siemens, and a whole lot of others. Their mission statement, taken right off their LinkedIn profile, is to drive a paradigm shift to a leaner, value-driven, and patient-focused transformation from CSV to CSA. The big push for this: mention validation and most people cringe, unless you're a consultant getting paid to do it. It's very labor-intensive and very documentation-intensive. I can tell you I've spent days sometimes just writing out deviations for simple little protocol errors, the PGEs (protocol generation errors). It's a lot of documentation and a lot of screenshots, and there's not that much value in it; it doesn't produce a better product. I've also heard a lot of people say, "Why is there an error? We validated it." Well, validation is only as good as your testing.

Right now the ISPE Data Integrity by Design guide covers CSA in its appendix; ISPE also has a new document that just came out this month (I have it in a later slide) that goes into a little more detail. An FDA guidance document is expected to come out this year, and a lot of people are waiting for it, but there's really no reason to wait. This is here; you should be doing this now.

What exactly is CSA? First of all, it applies to pretty much any software, though not software on a medical device, and you can leverage it for a lot of different aspects of validation. It's very similar to CSV (computer systems validation), but it relies more on critical thinking, which to me is just common sense: going back to your risk assessments and asking, is there value in this, is it high risk, is it detectable? And then generating and reviewing less documentation. Nobody really likes generating the documents; nobody in quality (and I've been in quality) likes reviewing them; and the FDA doesn't like reviewing them either, because you get these piles and stacks of documents that nobody wants to go through. I remember a validation manager at one company where I had a large packet for a large enterprise software system; he picked it up and said, "Yeah, it weighs about right; it's probably good." That's what we've been doing: over-documenting, pretty much just to cover ourselves.

The CSA approach is more risk-based and gives you more guidance on how to do that. It will actually let you produce software with fewer bugs, because you're testing it more, the way I used to when I was doing software development. I was a QA manager for a LIMS vendor, and we used to let people loose testing the different modules to see how we could break them; in some cases we had games over who could find the most errors. You can do that if you leverage different ways of documenting it and plan for it. You actually get faster validations this way, because you have less documentation and fewer protocol generation errors, and overall you save money: you're not writing long, detailed protocols, you're not processing all the protocol deviations, and review times are much shorter.

If you were going to go down that path, where would you start? First, get educated. The ISPE Data Integrity by Design guide, with that appendix, is the place to start. From there, ISPE has a new guide called the ISPE Good Practice Guide: Enabling Innovation. To tell you the truth, I haven't had a chance to read it (it's a little over 100 pages), but it does cover critical thinking and risk assessments. Look for seminars and gather as much information as you can. One good resource is Compliance Group Incorporated; they're on LinkedIn, they have some YouTube videos, and they were one of the consultants that worked with the FDA on the team we were talking about earlier to help develop these tools. Once you're on board and know where you're going to start, your first steps would be to look at your own validation SOPs and make sure they allow for risk-based validations, and then draft a validation plan. Determine the scope: I've seen validation plans that cover departments, areas, groups of equipment, or individual systems; once you determine the scope, that narrows things down to a smaller way to start. The big key piece is training: people need to understand what the approach means, how to write the validation documents, and how to execute the protocols.

One of the key takeaways, besides critical thinking (and unfortunately we don't have time to go into all the details of CSA), is that it allows two different types of protocols. One is the typical one you're used to: formal, scripted testing. The other is more flexible, I want to say free-form: unscripted testing that doesn't detail every step. I deal a lot with LIMS systems, so say you want to accession a material into a supplies module: you'd give the actual user a really high-level script covering only that portion of the system (add a new supply and make sure it's usable). If they come up with any problems, they document the bug. At this point it's more important not to document and screen-capture that it worked, but to capture what didn't work: capturing those bugs is a key portion of it. It's almost like a pilot plan or a feasibility study: you let the end users play with the system and do what they would do in a normal day-to-day work situation, without huge step-by-step documentation ("click here, do this") for every single step. For the high-risk items, you would still want scripted protocols; I still want to know the software version, the hardware components, where it's located, and so on. There will always be some critical things that need to be documented in more detail, but the goal is roughly 80 percent unscripted and about 20 percent scripted tests, which carry the higher documentation overhead. That alone saves so much time, along with not generating those PGEs. Again, you need people trained in how to do this and how to write the tests, and you probably need somebody to manage the project and track the bugs. You can track them in a spreadsheet, or in whatever method you already use: Teams, Slack, SharePoint, whatever you commonly use to track issues. I wouldn't put them in formal help desk software; you just want to track: did this occur, is it reproducible, was it fixed? A lot of findings will be configuration changes, or learnings where somebody is just doing something wrong, which helps you develop better SOPs in the long run. You don't want an SOP developed straight from your vendor's manual; it should reflect exactly what people are doing, and be short enough to make sense. Once you've resolved those bugs, either by changing the configuration or, in some cases, getting a new release from the vendor, you have a good system with fewer bugs that people will enjoy more, because it's going to work once it's actually released. You still need to review the validation documents, but the number of documents is so much smaller that the review process goes a lot faster.

I know I'm jumping around a lot; it's kind of a crash course in all of validation in five minutes or less, which is hard to do. Go out and get as much information as you can from the resources that are out there, and if there's an area where you're a little weak, there are lots of consultants, including me and Tanya, and, as I said, Compliance Group Incorporated has been developing a lot of tools to help you do this. Overall it will save you money, time, and resources. All I can say, with apologies to Nike, is: just do it. If you don't start now, you're going to get left behind. The bar is always rising; it's the "current" in cGMP, and the GCP regulations, that you need to stay up to date with. It looks like I ended a little early (I probably talked way too fast), but thank you, and I'm sure we might have some questions.

Yes, absolutely. Thank you so much, Shirley and Tanya. We do have a handful of questions we can go over. The first: how should we get our organizations to consider CSA?

It's sometimes a hard sell. Some of my customers say they're waiting for the FDA guidance to come out; that's supposed to be out this year, so it won't be too much longer. I'd recommend trying a small project, even if it's just a change control, just to get their feet wet. The biggest thing is making sure your SOPs support a risk-based approach, so that you can move forward that way.

Just to add to that: maybe also leverage a hybrid approach, where we introduce CSA in segments rather than across the entire program. Some of my clients are also waiting for the FDA guidance, but they're open to the idea of using critical thinking and a risk-based approach, so we can certainly do a hybrid model: introducing CSA while showing the value of less documentation and the importance of critical thinking in developing the life cycle approach.

The ISPE Data Integrity by Design guide is definitely a good starting point, but I would also look at some of the other ISPE webinars and at LinkedIn and YouTube videos, because a lot of them feature companies that have already gone through this and the value they're seeing: many report 40 to 50, even up to 60 percent reductions in review times and protocol generation time across the board. On a large project, that's a lot of money and time. If you want to get a system up and running quicker, this is the way to do it.

Great, thank you, Shirley and Tanya. The next question: during planning, what are some of the documents we should be developing?

During planning, a GxP impact assessment would definitely help categorize the system better: is it non-configured, is it configured? After that, have a really good URS with all of the data integrity requirements as part of it.

The next question: what are some considerations for a system risk assessment?

For a system risk assessment, as Shirley was mentioning during her slides (and I touched on it too), critical thinking is number one. There are a lot of resources on ISPE about system categorization. Really look at the complexity of the system and build the risk assessment based on that: what are we using the system for, what is its complexity? Build the risk assessment on that rather than just testing everything. And as Shirley mentioned, the FDA guidance for CSA is coming too, so it's important to have a risk assessment that shows critical thinking and really tests high-impact items.

Next: what are some recommendations for where to find additional resources?

I went over those a little bit. The biggest thing would be looking for other webinars. There's one coming up that I'm going to be attending: a company called KENX is doing one on software and validation. It's here in San Diego but also virtual; I believe the October one is a webinar, and there's another in December. Look for the seminars if you want to attend (I know those cost money); if you want to bootstrap it, look for LinkedIn and YouTube videos from people who are supporting CSA. There's a lot of information out there.

Good to know. The next question: is CSA required?

Validation is required in most regulated industries; using the CSA approach is not. But it's better: it saves you money and time, and I don't know why you wouldn't want those things. So no, the approach itself isn't required, but again, the bar keeps rising, and they're going to expect it soon. At one point the agencies only expected you to have a plan: if you had deficiencies in software, you could have procedural controls instead of technical controls, and that was okay for a while. Then you had to have a plan; now you need a plan with dates for when you're going to put those technical controls into place. That bar is always moving.

Awesome. Now, an interesting question: how do you deal with reluctance from management?

That's a hard one; it depends on your management. I've talked to some people who have no idea what CSA is, and there it's just educating them, providing materials so they can get to know it and understand what it is, and then coming up with a plan for how to implement it. You know your own company and your management better than anybody, hopefully. It depends on the manager and the company's comfort level with change: some companies don't like change at all, while I have one client right now who knew about this before I did, and they're all on board, ready to jump in and show the way. There's a lot of variance out there, but I'd say education more than anything, and if you can find other companies that have gone through this and could talk to them, see what they think about it.

I also just wanted to add that it depends on the maturity of the organization in adopting CSA. Some organizations already have SMEs, experts in the company, or they're able to work with consulting companies to get that expertise. Understanding the data integrity maturity of the organization is an important element when it comes to reluctance to support something like CSA.

That's very true, you're right.

Great. And the final question looks like: what are some challenges of using an all-in-one QMS platform?

I think that in planning, when you're a startup or a mid-sized company, you might have a lot of options for quality management systems...
or you have different systems to do different things but that's where really going back to the business process and going back to mapping it out that's where it's really important to see the value of having everything in one quality management system and the importance of walking through the life cycle um and when it comes to challenges i think the challenges lie in the planning and everything moving in our industry so fast um and that's where i think that a lot of people struggle is they have a lot of options a lot of different segmented um sites and the standardization is the challenge and i think that goes back to the maturity as to whether or not you have it um one of the other challenges i would say that goes along those same lines is you know i deal with a lot of customers where they're going through mergers of you know multiple companies so you've got multiple systems because of those multiple companies and by default they're already doing things differently and have different naming conventions and it takes a while to consolidate that information in any system i know places that are still going through it even after you know multiple years so it's sometimes it does take time and also i think if you already have a good document workflow and you know even if it's just on paper for example that you know systems coming in and it's going to go through the gxp impact assessment as a first step and after that there's all these different uh workflows and if you already have that process and that's able to be duplicated no matter where then that's easy to replicate that into a quality management system um so having just that thought process having that down on you know just having the basics down of how does that workflow look like and having a visual then you can replicate it replicate that into any quality management system as well and that's where i think that a lot of the challenges would also be reduced um in having those kind of conversations yeah that's very 
true you can't automate a bad um process it's not going to fix it you need to have that you know solid paper process already defined and then you can automate it you know regardless that applies to qms system instrument systems limb systems you know anything um yeah that process needs to be pretty solid before you can automate it and make some improvements that way great um thank you too so much that actually is all of our questions i believe well actually we might have one more oh so we did get a question um but i do not have the answer right now um we did have an attendee asked is qualio building document templates for these kinds of di documentation activities um i will get back to the people that have asked um for those specific answers with our product and quality team in the next 24 to 48 hours um so i just wanted to let you guys know um that i am reading that i understand it and i will absolutely get back um and other than that uh there are no further questions so again on behalf of qualio i really appreciate all of your all of the attendees uh spending the last hour with us today to learn about just integrating data integrity and you know quality management systems so thank you so much uh shirley and tanya for taking the time and creating this fantastic presentation and as i uh remind everyone this will be available to watch um within the next 24 hours and you'll also get a live session link that you can share with others who may want to see it at another time um but other than that we are all set and you guys have a great uh rest of your day thank you so much thanks ashley hi hi goodbye everyone goodbye
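As a minimal illustration of the risk-based, CSA-style thinking discussed in the Q&A (categorize the system, weigh GxP impact against complexity, and scale the assurance effort accordingly), here is a small sketch. All category names, scores, and thresholds below are illustrative assumptions, not taken from any guidance or from the speakers:

```python
# Illustrative sketch of a risk-based assurance decision, in the spirit
# of the CSA discussion above. The impact/complexity categories, the
# scoring, and the suggested activities are hypothetical examples only.

def assurance_level(gxp_impact: str, complexity: str) -> str:
    """Suggest an assurance activity for a computerized system.

    gxp_impact: "high" (direct product-quality or patient-safety impact),
                "medium", "low", or "none".
    complexity: "non-configured" (used as supplied by the vendor),
                "configured", or "custom".
    """
    impact_score = {"high": 3, "medium": 2, "low": 1, "none": 0}[gxp_impact]
    complexity_score = {"custom": 3, "configured": 2, "non-configured": 1}[complexity]
    score = impact_score * complexity_score

    if impact_score == 0:
        return "no formal validation (non-GxP)"
    if score >= 6:
        # High-impact, complex systems get the most scripted rigor.
        return "scripted testing with documented evidence"
    if score >= 3:
        return "unscripted testing with a summary record"
    # Low-risk, off-the-shelf systems lean on vendor assurance.
    return "vendor assurance plus a record of acceptance"

# Example: a configured system with high GxP impact (e.g. a LIMS)
print(assurance_level("high", "configured"))
```

The point of the sketch is the shape of the decision, not the specific outputs: testing effort concentrates on high-impact items rather than being applied uniformly to everything, which is where the review-time reductions mentioned earlier come from.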