2015-08-14

Hello, and welcome to the Academy of Oncology Nurse & Patient Navigators live webinar, "What is the Patient Navigator Role in Quality Metrics?" with Elaine Sein.

Elaine Sein:  Thank you for joining us today. Welcome to our webinar. My name's Elaine Sein. I am the Chair for the Quality Outcomes and Performance Improvement Committee for the Academy of Oncology Nurse & Patient Navigators. Our webinar today is really being sponsored by our subcommittee.

We conducted a needs assessment of our membership to glean the areas where we could support you in your role as a navigator and in beginning to become involved in quality improvement and research, in the broadest sense of the word.

Basically, quality metrics came out as one of the top items navigators are looking for guidance on today: how to develop them and how to create dashboards. Everyone is really working in an environment where we need to justify our existence.

I know that we have a varied audience of clinical navigators, nurses, social workers, case managers, and non‑clinical navigators, as you work in a variety of practice settings. Our goal today is to give you the nuts and bolts of how to engage in developing your own quality process, regardless of the model of navigation you work in.

We will discuss quality metrics and their impact on cancer programs, define quality metrics as they relate to your role as the patient navigator, look at some best practice models for how you could possibly integrate quality platforms into your own patient navigation programs, and hopefully provide tools to assist navigation programs in developing a navigation quality dashboard.

In 2014, all cancer programs are data‑driven, so we need to find a way to make navigation data as meaningful as possible. Let's get started with the basics of quality care, which really dates back to the Institute of Medicine's 1999 position statement.

We need to be evaluating care and processes on evidence, with the patient at the center of the process and having the most beneficial effect for the majority of patients in a system‑minded approach. Does that sound like 2014?

There are many different definitions for quality, especially now, with healthcare reform in place. This definition from the Agency for Healthcare Research and Quality states that healthcare quality is getting the right care to the right patient, at the right time. This probably rings a bell for us as navigators because our goal is to ensure access and care coordination for the right patient each and every time.

The second definition is from the Institute of Medicine. They define quality care as care that is consistently safe, timely, effective, efficient, equitable, and patient‑centered. Does that ring true to our navigator role?

There are a lot of key players that are part of quality measures today. Many of you are probably very familiar with these key players and stakeholders. I just wanted to give you a snapshot of what's really out there in the cancer domain.

The Joint Commission we are all familiar with, because they have been measuring accountability in hospital settings, both inpatient and outpatient, for a long period of time.

HCAHPS, if you work in a hospital setting, you're very familiar with HCAHPS. That's Hospital Consumer Assessment of Healthcare Providers and Systems. Long name for surveys where we actually ask the patients how we take care of them, what their experience has been like. More customer service‑based.

HEDIS measures are the Healthcare Effectiveness Data and Information Set. That's really a set of 70 measures used to measure effective care, continuously updated with new scientific evidence that helps to raise the bar. In the cancer domain, there are measures looking at breast, cervical, and colorectal screening, and smoking cessation.

The ACOS, or the American College of Surgeons, is the accrediting body for all of our cancer programs. Basically, in 2012, they changed the format of a lot of their standards to really base it on patient‑centered care, with new requirements expanding the focus on the quality care of patients, looking at patient outcomes.

The NCDB database, which actually falls under the American College of Surgeons because it's a joint program of the Commission on Cancer, the American College of Surgeons, and the American Cancer Society, is a national oncology outcomes database for more than 1,500 Commission on Cancer‑accredited cancer programs in the United States and Puerto Rico.

About 70 percent of cancer cases nationwide have their data captured in the NCDB in order to explore trends in cancer care, to create regional and state benchmarks for participating hospitals, and to serve as a basis for quality improvement.

The NAPBC is the National Accreditation Program for Breast Centers. That's looking at breast cancer measures. That falls under the American College of Surgeons as well. Then we have the National Quality Forum, which are endorsed measures that are considered the gold standard for healthcare measurement in the United States.

They come from an expert committee made up of various stakeholders, including patients, who evaluate the measures for NQF endorsement. The federal government and many private‑sector organizations use the NQF measures above all others because of the rigor and consensus process behind them.

The Oncology Nursing Society has nurse‑sensitive measures. Many started with breast and survivorship. The NCCN guidelines are really the gold standard of care for oncology.

In 2008, the National Quality Forum started to zero in on care coordination, developing a portfolio of care coordination preferred practices and performance measures that look at structure, process, and outcome in order to evaluate physician office capacity, access, [inaudible 08:38], communications, and tracking of patients across providers and settings.

Given the nature of transitions in care, this work is going to build on efforts to establish principles of effective patient handoff among clinicians and providers. Does that sound like what you do in your role as a navigator? Of course it does.

High‑priority research areas are to advance the evaluation of care coordination as a quality improvement tool. Accountable care organizations and value‑based payments are another area that's coming into play now.

The National Quality Forum actually commissioned the Rand Corporation, about a year or two ago, to identify key areas where measures are needed for payment reform that really rewards value over volume, looking at outcomes, care coordination, patient engagement, and the longer view of the patient's experience.

The type of measures in use in payment programs that really will impact navigation include the patient experience, preventative‑type services, clinical care processes, and care coordination with a patient survey.

The Centers for Medicare & Medicaid Services are also looking at clinical quality measurement, aiming to measure the quality of patient care to help drive improvements in healthcare: identifying areas for improvement, looking at differences in care outcomes among various populations, and, again, improving care coordination between healthcare providers.

As you can see from these first few slides, I'm really attempting to tell a story about where healthcare quality management and payment systems are rapidly moving. We want to be sure, as care coordinators and as navigators, that we are being proactive and that we are setting ourselves up to have a seat at this table.

The National Committee for Quality Assurance, a nonprofit since 1990, brings state and federal government, consumer, and business leaders together to find ways to improve quality: what is the best way to come up with good quality measures, what's going to work for us, and what is clinically important to the patient, to the payers, and to the providers of care.

Evidence‑based, with solid science. Consensus on what to do. Transparency: what's being measured, why is it being measured, how is it being measured. Whether these measures are feasible and reliable, and whether accurate results can be obtained with reasonable effort. Whether they're actionable, whether they show a difference, whether they show where improvement is needed, whether we're rigorously looking at the results, and whether they can be independently verified.

How do we make sure we're choosing the right metrics? It can be very daunting when you're trying to get started. You may have an idea for a process improvement or a quality improvement you would like to do, but how are you really going to pull it off and make sure it's something that's going to be workable in your individual setting?

The C4QI scoring methodology was presented at an Advisory Board meeting back in 2009. For those of you who are not familiar with it, the Advisory Board and its Oncology Roundtable is a think tank in Washington, DC, that surveys cancer programs across the country about emerging developments in oncology care, clinical practice, programmatic operations, and revenue‑generating ideas for oncology management.

The C4QI is a group of quality directors from comprehensive NCI‑Designated Cancer Centers throughout the country who have gotten together to pool resources on the best mechanisms for evaluating measures for cancer programs, regardless of the type of program you have.

When you're looking at criteria for a measure, you want to look at whether it's meaningful, whether the data is easy to collect, whether its resource intensity is feasible, whether there's literature to support it, and what its breadth of applicability is to your patient population. You can score these from one to five, with one being the least effective and five being the most effective for what you're looking for. This is just an easy way to look at the measure that you want to track.
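Just to make that scoring concrete, here is a minimal sketch, not part of the C4QI tool itself, of how you could tally a candidate measure against those five criteria on a one‑to‑five scale. The example measure and its scores are hypothetical.

```python
# Hypothetical sketch of C4QI-style scoring for a candidate measure.
# The criteria and the 1-5 scale follow the description above; the
# example scores are made up for illustration only.

CRITERIA = [
    "meaningful",
    "ease_of_data_collection",
    "resource_intensity_feasible",
    "literature_support",
    "breadth_of_applicability",
]

def score_measure(name: str, scores: dict) -> float:
    """Average the 1-5 scores across the five criteria for one candidate measure."""
    missing = [c for c in CRITERIA if c not in scores]
    if missing:
        raise ValueError(f"Missing scores for: {missing}")
    total = sum(scores[c] for c in CRITERIA)
    average = total / len(CRITERIA)
    print(f"{name}: total {total}/{len(CRITERIA) * 5}, average {average:.1f}")
    return average

# Example: a hypothetical 'biopsy-to-first-appointment turnaround' measure
score_measure(
    "Biopsy-to-appointment turnaround",
    {
        "meaningful": 5,
        "ease_of_data_collection": 4,
        "resource_intensity_feasible": 3,
        "literature_support": 4,
        "breadth_of_applicability": 5,
    },
)
```

Comparing the averages across several candidate measures is one simple way to decide which one to track first.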

These are the different types of quality measures that are out there. Access measures assess the ability to obtain timely and appropriate care. Outcome measures reflect the patient's health status after receiving a health service. They can be used to evaluate the quality of care to the extent that healthcare services influence the likelihood of the desired outcome.

The patient experience measure we are all familiar with as navigators. That aggregates patients' reports of what they observed and how they participated in their healthcare. The process measure assesses the healthcare service that was provided for or on behalf of the patient.

Structure measures describe features of an organization or a clinician relevant to its capacity to provide care. From the nursing perspective, that could be the nurse‑to‑patient ratio, the number of beds, or the number of patient navigators relative to the patient population you're actually dealing with. Then resource use and efficiency measures look at the resources that would be available to provide that service.

The basic terminology, which is probably a refresher for most people: a quality indicator is the definition of the standard of care you're looking for. The quality measure is the mechanism you will use to quantify adherence to that standard.

The numerator is a statement that defines the cases within the population of interest that received the specific care being measured. The denominator is the statement that defines the patient population. Later on in this webinar, we'll show examples of how you can develop this into the format for your own process improvement project.
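To make the numerator and denominator concrete, here is a minimal sketch, with purely hypothetical counts, showing how a measure rate falls out of those two statements.

```python
# Minimal sketch: a quality measure rate is just numerator cases divided by
# denominator cases, expressed as a percentage. The counts are hypothetical.

numerator = 42    # cases in the population that received the specific care being measured
denominator = 60  # all cases in the defined patient population

rate = (numerator / denominator) * 100
print(f"Measure rate: {numerator}/{denominator} = {rate:.1f}%")  # 70.0%
```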

Why quality measures? Basically, in one sentence: what gets measured gets improved. We as navigators are always trying to enhance our program on a regular basis. Sometimes we just do it anecdotally, from the work that we do every day. If there's a way you can put it on paper and document what you're doing, that's really where you're going to be able to show the validity and sustainability of what you're doing.

Let's look at how some quality measures relate to patient navigation. Patient navigation becomes a mandated standard for all Commission on Cancer‑approved cancer programs starting in January of 2015.

As we read this statement, it becomes crystal clear that evaluating, documenting, and recording our navigation processes is not only expected, but part of the standard, and it's meant to be on a continuous basis. Really, we have no choice. This is a standard. Going forward, we are all going to have to constantly monitor the process that we're doing.

When looking at key elements of patient navigation and patient‑centered care, these are some of the areas that dovetail with the various models of patient navigation you are all practicing in. Community navigators working in prevention and early diagnosis can carve out specific measures or process improvements that can impact earlier diagnosis or earlier screening of patients.

Disease‑specific navigators who participate in tumor boards or cancer conferences can track the use of NCCN guidelines. Case managers can track emergency room utilization against baseline information, work with a team to create a process improvement plan, and then actually reduce emergency room utilization and monitor and report it.

Tracking patient education and decision‑making support sessions, and adding navigation‑specific questions to a patient satisfaction survey, is another mechanism that can be used. All aspects of direct support care have navigation flowing through them.

Depending on the model you work in, whether it's psychosocial distress, survivorship, genetics, pain, or palliative care, these are all areas from which metrics are being pulled on the direct support side of navigation. We all want customer feedback defining the aspects of navigation that led to an exceptional patient experience.

What is the navigator's role in the multidisciplinary team? The navigator really is a member of the team, but since the navigator also is the eyes and the ears of their patient, the navigator is a reality check on the patient care experience, regardless of which aspect of the continuum of care you're working in. A successful navigator is a relationship builder and the glue of the multidisciplinary team.

I'd like to take a look at the navigator checklist, the six areas that cross over all of navigation. This, again, came from surveys of navigation programs throughout the country about where navigators felt they had the most impact: care coordination, timely care, multidisciplinary clinics, business development, patient education and the empowerment you give your patient, as well as psychosocial support.

Some of the responsibilities that came out of all six of these domains when they surveyed oncology practices across the country were that navigators serve as the main point of contact for patients and families in each of these areas, that they schedule appointments, that they work with referring physicians to understand preferences for communication about test results, and that they prepare letters for referring physicians and summarize recommendations. These are all areas navigators should be using to develop measures that prove your value and your sustainability.

How do we select the right metrics? For anyone who's heard me lecture before on navigation programs, I always say there are no cookie‑cutter programs. Each navigation program is going to fit the needs of your own individual program, regardless of whether it's a clinical program or a non‑clinical program. You're going to need to pick and select the metrics that are going to best fit what you're looking for in your program.

You could be looking at community outreach programs. You could be looking at advocacy programs. These are just a snapshot of the various types of quality measures you might want to look at.

Patient‑centered measures might look at retention rate, new patient volumes, and patient satisfaction. Physician‑focused metrics might look at the number of physician referrals, if you're trying to grow a disease‑specific program, and physician satisfaction with the program.

Clinical quality measures are looking at timeliness to care, whether a patient might be accrued to a clinical trial, and whether patients are being cared for in concordance with either NCCN or ASCO guidelines.

At the end of the presentation I have a resource list with a direct link to the Association of Community Cancer Centers, which has a whole portfolio of patient navigation tools and resources you can pull from that gives a little more detail. You do not have to be a member of the Association of Community Cancer Centers to access them; they're directly on their link.

I would like to share with you some ways that you could incorporate a quality platform into your own individual program. Let's look at quality teams. Who is the quality team in your institution and in your setting? Do you have any connection with them to support your programs?

What goals does the quality team have? Normally, in most quality teams, the patient experience is always a quality goal, in order to report back on what's actually happening within your institution.

How is the data you're looking for already being collected? Do you track your touch points, your volumes, your EMR migration? Any specific project you're already doing? Either on paper, in an Excel spreadsheet, or in an Access database? Are you part of an EMR? Do you have any kind of a software tracker that you're using?

What metrics are really going to make sense for your own institution in your own setting? How are your metrics reported out? What actions are actually taken? Do you bring these metrics to cancer committees, to a disease‑specific team, to administration?

Do the metrics you're looking at matter to patients, and are there [inaudible 22:25] that are actually going to support the metrics? Or are the metrics you come up with going to be revenue‑generating metrics that really prove the value and efficacy of your role?

Quality team members. I put this slide together really to just give you an overview of what could be a quality team. This may actually be something that might be a disease‑specific quality team, where the patient is always the center of your team.

But you always do need to have a physician champion when you're trying to begin any kind of a clinical enterprise, especially if it's disease‑specific, to have buy‑in from the rest of the staff and so that the physician understands the value of what you're doing. Normally, you will then be able to get some buy‑in from administration, quality directors, that type of thing.

This would be the team. The patient, the physician, you as a navigator, the nursing administrator or the hospital administrator, the cancer registrar, the director of quality, and the IT staff. The nurse, the navigator, the physician, the patient, that's more or less understood. But these other three members need to really be defined so that we as navigators have a really good understanding of how they can enhance and help us with our quality improvement projects.

We do not work in silos. We need to work as a team. My experience in working with those [inaudible 23:50] and with community navigators is that the cancer registrar is the navigator's best friend.

The registrar has the ability to pull datasets to support enhancements to clinical care, information related to stage at diagnosis for various disease sites that may actually be the quality initiative your team wants to work on, as well as a lot of the treatment timing that can be pulled from the cancer registry abstract.

Your quality director knows all the processes and the administrative ways to put this into place so it will be meaningful for your institution, and your IT staff, hopefully, has the capability to make the process as seamless as possible and to provide the systems or electronic mechanisms that make it easy for you to track what you're going to track.

Some of the processes and best practices out there, and ways that you as a navigator can embed yourself in quality improvement initiatives: if you're a single solo navigator, try to get yourself onto your institutional quality council or quality improvement committee. Usually these committees are multidisciplinary in nature, and they work on areas that impact both inpatient and outpatient care.

As a navigator you have a unique perspective on the patient experience and can provide data on process issues as they are actually happening in real‑time. This is your way of bringing navigation to the forefront within your institution. All it is is a matter of asking to be a part of something that's already in existence.

Another option, if you are at an institution that has multiple navigators, is to form your own process improvement team or an institutional navigation committee that would include both navigators and your administrators.

There you could hold regular meetings to identify processes you have in place that you want to change, either disease‑specific ones or ones that cross navigation processes in general, in order to generate a quality measurement schema, develop your baseline data, and monitor the results. Figure out how you're going to report them out so that you're improving care, and therefore be able to validate your role.

If you're a member of a disease‑specific team, the navigator can choose care coordination metrics that improve patient care for that particular disease team and add to the disease‑specific dashboard. An example would be what happens in a lung nodule clinic and how the navigator would have an impact on that, or in a GI clinic, what happens when a patient has a positive colonoscopy and how the next steps could be tracked.

As we move forward and really look at these things, we have to take in the fact that measurement is critical for justifying our ongoing investment. Given that navigation is an unreimbursed service, cancer program leaders have faced increasing pressure to quantify the benefits of this investment.

Even though in healthcare everything is about efficiency and care coordination, we still need to find a way to show that what we're tracking for these patients is providing a better‑quality patient experience and, in the end, better outcomes for that patient. That's really how we will ultimately justify long‑term sustainability for this position going forward.

These are some areas you can carve out when you're looking at value and sustainability for your role: outcome measures; timing, access, and throughput‑type measures; things that have to do with patient education and patient satisfaction. When we look at patient satisfaction, many of these patient satisfaction surveys are really general to cancer programs and not specific to your role as a navigator.

I would strongly encourage you, even as a first project, if you have a really good program in place, to capture the specific interactions you have with the patient in a patient satisfaction survey that asks about your role, so that it gets back to administration and they can see the value you're bringing to your patients.

Whether or not you're involved in educating patients and screening and accruing them to clinical trials, these are all things that are important to institutions as well.

The key thing on this slide is that you as a navigator need to report out what you're doing on a regular basis. Whether it's a report format or a dashboard, tracking some of your key functions will really be valuable to you in the end.

We really wanted to be able to provide some tools that would be able to assist with quality dashboard development. Creating an outcomes database really is something that's going to be unique to each institution.

Because it's going to depend on the type of program you have, whether it's a clinical program, whether it's a community‑based program. How involved you are in the outcomes process. What kind of technology is there to actually help you? Again, are you having to work from paper? Do you work from an Excel database? Do you have any kind of a tracker?

What kind of resources are there to help you? Do you have an IT program that's sophisticated enough to actually be able to make a navigation tracker or a navigation dashboard meaningful for the institution to be able to work with you?

Because most nurses are not trained in how to develop dashboards, at least not in my generation. Maybe in the younger generation it's probably something that's much more evident in training today. But you need to utilize the resources that are actually available within your own setting.

Navigation software is the buzzword everyone is really looking for today, because these products provide platforms that track your activities so that you can evaluate your quality improvement needs. It makes our lives so much easier if we actually have these databases in place.

Some options for these would be, depending again on your institution, whether or not you can have a homegrown quality improvement database that is specific to navigation that might actually intertwine with other systems within your organization so that you or your administrator can pull reports off that show the efficacy of your program.

Then there are commercial software navigation programs out there that your institution can actually purchase, and many of them are online. Our quality outcomes and performance improvement committee has developed some links on our website for resources that actually take you to different websites of a variety of different commercial software products that are out there.

We are just listing them as a resource because they're out there. We're not endorsing them in any way, shape, or form. It's really just for your information.

These are some quality improvement standards that we're all looking at. Many of them were already mentioned earlier on, so I'm not going to go into them. I just want you to know that these are areas and organizations you can look to, depending on what area of oncology and navigation you're working in, to support some of the projects you might be working on.

But as navigators, we really want to start to look at the link to evidence as we develop the dashboards and processes we want to pursue individually. [inaudible 31:46] really looked at and brought the evidence from the highest quality of evidence down to the national and local levels for us, so that we can get a perspective.

High‑quality evidence is what's guiding the standard of care, and that would be the NCCN guidelines. The National Quality Measures help clinicians and practices consistently apply the high‑quality standard of care, and then they allow that for benchmarking. That's where you get the National Quality Forum information.

Then, at the local level, processes help to examine the reasons for variance and implement effective local changes that will meet standards. This is where, as navigators, we really have the inclination and the opportunity to impact local quality improvement going forward. Remember, again: "No man is an island," and we can't do this alone in silos. You need to remember to pull in all the people within your institution who really can help you do this.

How do you construct a dashboard? That's everyone's question. When we're looking at the elements of a dashboard, it makes [inaudible 33:59], because you're going to look at the type of measure, which, let's say, could be a process measure. You're going to look at the improvement noted.

I'll use an example that would probably cross all of navigation: a decrease in wait time. A numerator statement might be that a newly diagnosed breast cancer patient receives an appointment with an oncologist within 48 hours of a phone call. The data element would be the scheduled appointment.

The denominator statement would be newly diagnosed breast cancer patients at this facility for this episode of care. The data collection tools could be a couple of things: retrospective data sources, such as administrative coding and billing, or medical record abstracting.

It could also be a navigation tracker. Data accuracy then comes from the pretreatment period, the dates from when the patient is diagnosed with breast cancer until that first appointment, using a tracking tool, the date of diagnosis from a pathology report, or an abstract from the cancer registry. Then there are any kind of selected references. There's a great deal of information that goes into creating a quality measure.
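Purely as an illustration, here is a minimal sketch of how a simple navigation tracker could compute that 48‑hour measure, assuming you have the phone call date and time and the scheduled appointment for each newly diagnosed patient. The field names, dates, and patients are all hypothetical.

```python
from datetime import datetime

# Hypothetical tracker rows: each newly diagnosed breast cancer patient with the
# date/time of the initial phone call and the scheduled oncologist appointment.
patients = [
    {"mrn": "A001", "call": datetime(2014, 3, 3, 9, 0),  "appt": datetime(2014, 3, 4, 13, 0)},
    {"mrn": "A002", "call": datetime(2014, 3, 5, 14, 0), "appt": datetime(2014, 3, 10, 9, 0)},
    {"mrn": "A003", "call": datetime(2014, 3, 6, 8, 30), "appt": datetime(2014, 3, 7, 15, 0)},
]

# Numerator: patients whose appointment fell within 48 hours of the phone call.
# Denominator: all newly diagnosed breast cancer patients in the tracker.
within_48h = [p for p in patients if (p["appt"] - p["call"]).total_seconds() <= 48 * 3600]

rate = len(within_48h) / len(patients) * 100
print(f"Seen within 48 hours: {len(within_48h)}/{len(patients)} = {rate:.0f}%")
```

The same dates could just as easily come from an Excel export or a cancer registry abstract; the calculation does not change.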

When we're actually putting these things together, everyone just thinks they need a numerator and they need a denominator and then they're going to get the report. You really have to have all the pieces of the puzzle in place so that when you're looking at measures, you're looking at apples to apples.

I tried to create an example for you of how you would go about looking at a measure in order to put it into a dashboard. I wanted to pick something that would be fairly easy to explain: a measure of patients whose date of needle biopsy precedes the date of her surgery. The steward, who actually developed that measure and monitors it, is the American College of Surgeons, because it's one of their quality measures.

The endorsement, if this measure was actually endorsed, would be by the National Quality Forum. The numerator is patients whose date of needle biopsy precedes the date of surgery, and the denominator is patients presenting with stage 0, 1, 2, or 3 disease who undergo a surgical excision or resection of a primary breast tumor.

How do you collect it? Who's going to collect it? The collector would be the cancer registrar, and it could also be a disease‑specific breast navigator as a cross‑reference. The result would be that 75 percent of the patient population met this standard.

Another way of looking at it, especially depending on what your metric is, is to break it down not into a percentage but, depending on the volume you're looking at, into the actual number.

Sometimes that can be meaningful if this is 75 out of 100 cases, but depending on your metric it may have been 4 out of 25, or 18 out of 20, something like that. Depending on the metric, it can be meaningful to have the number in place.

We found that to be very helpful in a lot of the metrics that we come up with. The benchmark, of course, is 90 percent, and the variance would be that excisional biopsies were done in 15 of the cases when a drill down was done on the patient chart.

Then the next thing, as a working document, is: what will be your process improvement? In this instance, the process improvement would be that all newly diagnosed breast cancer patients would be presented at a breast conference where NCCN and NAPBC standards are reviewed.

This is a way of collecting the data and putting it into a framework for you. A dashboard report that could be presented at an administrative meeting or to the cancer committee would show the measure, the benchmark, and how you present your information.

It shows the first, second, and third quarters, what the variances were, and what process improvement you may have put into place as a result of that variance to improve the process. It's easy and clean, and it's a way of getting started.
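As a rough sketch only, assuming the 90 percent benchmark from the example above, here is how such a quarterly dashboard row could be tabulated in code. The quarterly figures and variance notes are placeholders, not data from any actual program.

```python
# Minimal sketch of a quarterly dashboard row for one measure, compared
# against its benchmark. Quarterly figures and notes are hypothetical.

BENCHMARK = 90  # percent

quarters = [
    {"quarter": "Q1", "numerator": 75, "denominator": 100, "variance_note": "15 excisional biopsies found on chart review"},
    {"quarter": "Q2", "numerator": 82, "denominator": 98,  "variance_note": "process improvement: all new cases to breast conference"},
    {"quarter": "Q3", "numerator": 91, "denominator": 99,  "variance_note": ""},
]

print(f"{'Qtr':<4}{'Result':>10}{'Benchmark':>11}  Variance / action")
for q in quarters:
    pct = q["numerator"] / q["denominator"] * 100
    flag = "meets" if pct >= BENCHMARK else "below"
    print(f"{q['quarter']:<4}{pct:>9.0f}%{BENCHMARK:>10}%  {flag}: {q['variance_note']}")
```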

When measuring the impact of navigation, it's really best to collect outcomes tied to evidence‑based care, but try to keep them simple. When you're starting something like this, it's best to choose one or two data points that prove beyond a doubt that your navigation program is responsible for the improvement. You always need baseline data of where things were before you started in order to be able to report on outcomes going forward.

As part of this webinar, we did ask you to share any current tools that people are using, just so that members can see that people are tracking and starting to monitor things in their own way. Other institutions have developed their own trackers and dashboards that will give you a feel for what is actually being done.

I'd like to just share some of these with you, and I thank members of our quality outcomes and performance improvement committee for actually sharing some of the tools that they're actually using.

Memorial Hospital and the University of Colorado Health System shared several trackers with us. This is one that was easy for me to share with you; it shows their surgical tracker.

The different types of things that could actually be pulled out of the surgical tracker would be the turnaround time from biopsy to surgery, the number of cases that they were able to present at a case conference, and the number of navigator referrals so that they can see how their volumes are increasing on a regular basis.

UMC Southwest Cancer Center really has done a wonderful job building out an acuity workflow to show the value of their role, so that they were able to increase their FTEs for navigation. It looks at direct patient contact and indirect patient contact, quantifying the number of patients, the number of hours spent, the total number of hours, and the actual notes that were presented.

This is one aspect of it, and this is the second one, which breaks it down a little bit more into coordination of care, patient care meetings, documentation, and other aspects. You can actually see the total number of hours that the navigator has in place.

They really have done a wonderful job putting this together, and this entire project, with additional tools, has been submitted to the "Journal of Oncology Navigation and Survivorship" for publication. You will probably be able to see more information related to that going forward.

The Commission on Cancer also has a best‑practice repository online that anyone can access. That link is at the end of the presentation in my resources. This is Billings Clinic, which developed a tracker that shows the demographics and the outcomes they were looking for. That, again, was tied to care‑type information.

What are you working on now? What kind of patient issues could you as a navigator be looking at, just through the different examples I've shown today? Any kind of communication issues, any kind of disease team enhancements, process improvements, or research studies that you yourself might be in the midst of working on.

We at the AONN+ Quality Outcomes and Performance Improvement Committee really would like to help, and we can help. We just recently added to the resource section on our website the four steps in research: how you yourself can begin to collect, check, create, and communicate information about projects that you're working on.

It's also a way to develop a poster that you might be able to present at an AONN+ conference or any conference, one that would really show the value of the work you're doing and the process improvements, whether they're strictly process improvements or actually research‑based projects.

The direct link at the bottom of this page will take you directly to the page on our website that explains each of these aspects of the four steps of research, as well as all the resources that go along with how to collect, how to check, how to create, and how to communicate.

These are some resources that are available through the National Quality Clearinghouse, the Commission on Cancer's best‑practice repository, and the Association of Community Cancer Centers. There are multiple resources out there.

We have a wealth of information and resources also available on the AONN website. I encourage everyone to take a little bit of time to actually look into these, to follow up if there's any questions.

I would like to end with this: not everything that's advertised on [inaudible 42:13] is the best. Not everything that counts can be counted, and not everything that can be counted counts. But in reality, you cannot improve what you do not measure. We all have to get started somewhere.

As a member of the [Evidence into Practice Subcommittee], I really would like to encourage you to utilize our committee and participate as much as possible. We have a community forum where you can ask questions. It's very easy to access. We're here to help in any way that we can.

At this time, I'd like to just open it to any questions. Anything that I have available that we could send to you, feel free. My email address is here. I'm happy to follow up with anyone afterwards with any kind of specific questions you might have directly related to individual projects. But please, if there are any questions, feel free. Thank you so much. I hope you enjoyed today's presentation.

Trisha:  Thank you, Elaine. At this time we will conduct the live question and answer session. If you'd like to ask a question, please type your question in the chat box located on the task bar at the top of your screen, and Elaine will answer them in the order in which they are received. Elaine, are you there?

Elaine:  I'm here. You'll have to repeat the questions, though. I don't have that tab on my computer. I still just have the slides, Trisha.

Trisha:  OK. We do have one question. You have some compliments coming in from your listeners. Our first question comes from Mary and she asks, "As a solo navigator in a small community hospital, how would you suggest I start to develop a quality dashboard related to my work as a navigator?"

Elaine:  Good question. Basically, my thought is that the first step is not to try to do it alone, but to work with your administrator or whoever you report to directly as a navigator. You should not have to create a dashboard without the guidance of a program administrator who has experience creating meaningful reports related to the cancer program.

The second thing I would think about is: what are your touch points with your patient population? At diagnosis? Do you work as a diagnostic navigator, or do you work with patients through the entire treatment process?

Where do you think you have the most impact? Where would you like to be able to quantify what you're doing? What issues have you uncovered that you believe your role as a navigator has improved, whether it's time to care or a decrease in EMR migration? Then just choose one or two to really start with. A navigation‑specific patient satisfaction survey is always a good start.

Trisha:  Thank you, Elaine. Our next question comes from Joanne. She asks, "Do navigation software programs provide support toward developing quality metrics, and can they provide reports that are helpful to prove the value of a program?"

Elaine:  Yes, they can. I just touched on that during the webinar, that if you create a homegrown database as a team with your administrator, the navigators, and the IT staff, you can build a program that will look at metrics that are the most meaningful to your own program. That's the value of being able to develop something internally, because you're able to do it really specifically.

Commercial software programs all offer various components that help to track navigators' workload, and some of them may be able to build specific smart sets of data to look at the quality measure that your particular institution is interested in.

Then simple Excel databases are always a good start, usually for those who are not very techie. I would always make sure to find someone with really good IT expertise in your institution to assist you from the start of any kind of database development, even if it is only working with Excel or Access, so that right from the start you don't create any unnecessary stressors for yourself. Everyday navigation work is hard enough without making it more complicated, so simplify what you're doing any way you can.

If navigation software helps you, I'm all for it. A lot of that has to do with the dollar price tag that comes along with the commercial software products.

Trisha:  Just to remind everybody, if you would like to ask a question, please type your question into the chat box located on the taskbar at the top of your screen, and Elaine will answer them in the order in which they are received. We do have another question from Amy. She asks, "Would it be useful to use elements of an individual job description as part of the quality outcome?"

Elaine:  Absolutely. Picking out pieces of your specific job description that impact the patient experience will quantify your role for your administrator, but it will also show the actual patient outcomes that come from having a navigation process in place.

Trisha:  Thank you. It looks like our last question, at least for now, comes from Teresa. She asks, "Do you know of any grants available for small hospitals that need to purchase navigation software?"

Elaine:  There are many types of small grants that you can usually get. Depending on what kind of program you're speaking of, if you're looking for disease‑specific ones, I know that the American Cancer Society has small grants, the Making Strides Against Breast Cancer grants, for breast cancer.

The Avon Foundation probably has some grants that have to do with breast cancer. I would check into all the advocacy organizations to see if there is any grant money for navigation software.

If you have a grants office within your own institution or you have access to a grants office as part of a university setting, I would check in with them for the specific type of organizations that would provide those type of grants. That's probably something that would be a good thing for us to do some research on from our quality outcomes committee and add to our resources on the website, so thank you for the question.

Trisha:  Elaine, it looks like you were very thorough and you don't have any more questions. I think they were very happy with the presentation. You are getting compliments across the chat line. [laughs] Thank you so much. I wanted to ask, Elaine, do you have any...

Oh, we did just get one more question in. Teresa asks...Oh, she adds on. She says, "Try your state's cancer alliance for funding." She just chimed in there.

Elaine:  Very good.

Trisha:  At this time would you like to have some final thoughts?

Elaine:  I'd like to thank everyone for your participation today and I hope it was helpful in your practice. In order to continue to support your needs regarding quality measures, you'll be receiving a short survey following this webinar that will really help to assist our quality outcomes and performance improvement committee in planning future education programs.

Please feel free to contact me by email with any additional questions. I'm happy to help in any way that I can. Everyone, thank you so much.
