Create a Payment Reminder Sample for Product Quality Effortlessly
Move your business forward with the airSlate SignNow eSignature solution
Add your legally binding signature
Integrate via API
Send conditional documents
Share documents via an invite link
Save time with reusable templates
Improve team collaboration
See airSlate SignNow eSignatures in action
airSlate SignNow solutions for better efficiency
Our user reviews speak for themselves
Why choose airSlate SignNow
- Free 7-day trial. Choose the plan you need and try it risk-free.
- Honest pricing for full-featured plans. airSlate SignNow offers subscription plans with no overages or hidden fees at renewal.
- Enterprise-grade security. airSlate SignNow helps you comply with global security standards.
Payment reminder sample for Product quality
When managing product quality, timely communication can significantly strengthen relationships with partners and customers. One effective way to keep payments on track is to send payment reminders. Tools like airSlate SignNow let businesses seamlessly send eSignature invites for these documents. Follow the guide below to get the most out of your payment reminders with airSlate SignNow; a short API sketch after the steps shows how the same flow could be automated.
- Open your web browser and navigate to the airSlate SignNow homepage.
- Register for a complimentary trial or log into your existing account.
- Select the document that requires signing or needs to be sent for e-signatures and upload it.
- If you plan to use this document multiple times, save it as a reusable template.
- Access your document to make necessary modifications: add fillable sections or input required information.
- Insert your signature and designate fields for the recipients' signatures.
- Proceed by clicking Continue to arrange and send out your eSignature invitation.
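If you would rather automate steps 3 through 7 (see "Integrate via API" above), the Python sketch below shows the general shape of an upload-and-invite flow against SignNow's REST API. Treat it as a minimal sketch, not a verified integration: the endpoint paths, payload fields, and role name are assumptions based on SignNow's public API documentation, and the access token, file name, and email addresses are placeholders.

```python
"""Minimal sketch of an upload-and-invite flow over SignNow's REST API.

Endpoint paths and payload shapes are assumptions based on the public API
documentation and may differ from the current version; check the docs
before relying on this. ACCESS_TOKEN is a placeholder obtained via OAuth2.
"""
import requests

API_BASE = "https://api.signnow.com"       # assumed base URL
ACCESS_TOKEN = "YOUR_OAUTH2_ACCESS_TOKEN"  # placeholder credential
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}


def upload_document(path: str) -> str:
    """Upload a local PDF (e.g. the payment reminder) and return its document id."""
    with open(path, "rb") as fh:
        resp = requests.post(f"{API_BASE}/document", headers=HEADERS,
                             files={"file": fh})
    resp.raise_for_status()
    return resp.json()["id"]


def send_invite(document_id: str, sender: str, signer: str) -> None:
    """Send a signature invite for the uploaded document (assumed payload shape)."""
    payload = {
        "from": sender,
        "to": [{"email": signer, "role": "Signer 1", "order": 1}],
    }
    resp = requests.post(f"{API_BASE}/document/{document_id}/invite",
                         headers=HEADERS, json=payload)
    resp.raise_for_status()


if __name__ == "__main__":
    doc_id = upload_document("payment_reminder.pdf")
    send_invite(doc_id, sender="billing@example.com", signer="client@example.com")
    print(f"Invite sent for document {doc_id}")
```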
By incorporating airSlate SignNow into your workflow, you empower your organization with a budget-friendly solution that greatly enhances document management efficiency.
With user-friendly features, transparent pricing, and robust customer support available 24/7 for all paid plans, consider leveraging airSlate SignNow for your payment reminder needs today!
How it works
airSlate SignNow features that users love
Get legally-binding signatures now!
FAQs
- What is a payment reminder sample for Product quality?
A payment reminder sample for Product quality is a customizable template designed to help businesses remind clients about upcoming or overdue payments. It ensures that payments are made promptly while maintaining a professional tone. Using such a sample can improve cash flow management and strengthen client relationships.
- How can I use a payment reminder sample for Product quality with airSlate SignNow?
With airSlate SignNow, you can easily personalize a payment reminder sample for Product quality to fit your business needs. The platform lets you add your branding, adjust the content, and automate sending reminders to your clients (a minimal template sketch follows these FAQs). This streamlines the payment process and helps ensure timely payments.
- What features does airSlate SignNow offer for payment reminders?
airSlate SignNow offers several features for sending payment reminders, including customizable templates, automated notifications, and eSignature capabilities. You can create effective payment reminder samples for Product quality that integrate seamlessly with your existing workflow, improving efficiency and reducing time spent on manual follow-ups.
- Are there any pricing options for using airSlate SignNow to manage payment reminders?
Yes, airSlate SignNow offers flexible pricing plans to accommodate businesses of various sizes. Each plan includes access to the features needed to create and send payment reminder samples for Product quality, so you can choose the plan that best fits your budget and needs.
- What are the benefits of using a payment reminder sample for Product quality?
Using a payment reminder sample for Product quality helps maintain cash flow, reduces the chance of late payments, and fosters professional communication with clients. It improves transparency and accountability, keeping your business organized and efficient, and it cuts the time spent on follow-ups so you can focus on other important tasks.
- Can airSlate SignNow integrate with other tools for payment reminders?
Absolutely. airSlate SignNow integrates with various accounting, CRM, and invoicing tools, so you can use your payment reminder sample for Product quality seamlessly within your existing systems. This keeps your reminders timely and synchronized with your financial records (a sketch of flagging overdue invoices from such an export also follows these FAQs).
- How do I customize a payment reminder sample for Product quality?
Customizing a payment reminder sample for Product quality in airSlate SignNow is straightforward. You can modify the text, add your company logo, and include details specific to each client. The user-friendly interface makes it easy to tailor every reminder so it reflects your brand and communicates clearly.
- Is there customer support available for using airSlate SignNow for payment reminders?
Yes, airSlate SignNow provides customer support to assist with any questions or issues regarding payment reminders. You can reach out via email, live chat, or phone; the support team is familiar with creating and using payment reminder samples for Product quality, ensuring you get the most out of the platform.
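To make the customization answer above concrete, here is a minimal sketch of generating a personalized reminder body with Python's string.Template before uploading it for signature. The template wording and field names are hypothetical examples, not an airSlate SignNow artifact; swap in whatever fields your records actually carry.

```python
from datetime import date
from string import Template

# A hypothetical reminder template; the wording and field names are
# illustrative only, not an airSlate SignNow template.
REMINDER = Template(
    "Dear $client,\n\n"
    "This is a friendly reminder that invoice $invoice_no for $amount, "
    "covering the product quality inspection completed on $service_date, "
    "is due on $due_date. If you have already sent payment, please "
    "disregard this notice.\n\n"
    "Kind regards,\n$sender"
)


def render_reminder(record: dict) -> str:
    """Fill the template from one client record (e.g. a CRM or invoicing export)."""
    return REMINDER.substitute(record)


if __name__ == "__main__":
    body = render_reminder({
        "client": "Acme Manufacturing",
        "invoice_no": "INV-2041",
        "amount": "$1,250.00",
        "service_date": "2024-05-02",
        "due_date": str(date(2024, 6, 1)),
        "sender": "Billing, Example QA Services",
    })
    print(body)
```

The rendered text can then be saved as a PDF or DOCX and pushed through the upload-and-invite sketch shown earlier.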
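And to illustrate the integration answer, this hedged sketch decides who needs a reminder from a plain CSV export of an invoicing or CRM tool. The column names (invoice_no, client_email, due_date, status) are hypothetical; map them to whatever your export actually uses.

```python
import csv
from datetime import date, datetime


def overdue_invoices(csv_path: str, as_of: date) -> list[dict]:
    """Return rows whose due date has passed and which are still unpaid.

    Column names (invoice_no, client_email, due_date, status) are
    hypothetical placeholders for your accounting or CRM export.
    """
    with open(csv_path, newline="") as fh:
        rows = list(csv.DictReader(fh))
    return [
        row for row in rows
        if row["status"].lower() != "paid"
        and datetime.strptime(row["due_date"], "%Y-%m-%d").date() < as_of
    ]


if __name__ == "__main__":
    for row in overdue_invoices("invoices.csv", as_of=date.today()):
        print(f"Reminder needed: {row['invoice_no']} -> {row['client_email']}")
```

Each returned row can feed the template sketch above and then the API sketch to send the reminder out for signature.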