Urology Times Journal

Vol 48 No 8

How to use data to measure adherence to clinical guidelines in prostate cancer

Against the backdrop of a new guideline for advanced prostate cancer, Robert A. Dowling, MD, describes how you might measure adherence to guidelines in your own practice.

Robert A. Dowling, MD

"Clinicians are challenged to remain up-to-date and informed…” begins the new clinical guideline on advanced prostate cancer released by the American Urological Association, American Society for Radiation Oncology, and Society of Urologic Oncology in June 2020.1 However, significant burnout affecting urologists,2 time pressure, bureaucratic tasks, rapidly evolving diagnostic and treatment advances, and the inherent resistance to the rationalization of skills combine to make it even harder to incorporate new guidelines into clinical practice.

Clinical guidelines are formulated and graded on scientific evidence (or expert opinion) derived from research involving populations, whereas clinicians treat individual patients based on personal experience in addition to the science. These competing perspectives may result in a lack of adherence to clinical guidelines, sometimes written off as "the art of medicine." Some guidelines lead to the development of "clinical pathways" used by decision support tools, payers, and public health experts. No guideline is intended to be a rigid rule for 100% of situations, but measuring adherence can sometimes be leveraged to identify outliers for closer examination. In this article, against the backdrop of a new guideline for advanced prostate cancer, I will discuss how you might measure adherence to guidelines in your own practice using some simple examples.

Opportunities for measurement

Advanced prostate cancer is a segment of the typical urology practice that lends itself to a focus on adherence to guidelines: The disease is common in a urology practice, the diagnostic testing choices and results are discrete, the clinical status is well defined (symptom status, tumor marker status, metastatic status, castrate sensitivity status), and the therapeutic options have clear indications. Many guideline statements are unambiguous, and there is a well-developed algorithm for quick reminders. The examples below each provide an opportunity for measurement:

• “Clinicians should not offer first-generation antiandrogens (bicalutamide, flutamide, nilutamide) in combination with luteinizing hormone-releasing hormone (LHRH) agonists in patients with metastatic hormone-sensitive PC (mHSPC), except to block testosterone flare.”

• “In metastatic castrate-resistant PC (mCRPC) patients, clinicians should obtain baseline labs (eg, prostate-specific antigen [PSA], testosterone, lactate dehydrogenase [LDH], hemoglobin, alkaline phosphatase level) and review location of metastatic disease (bone, lymph node, visceral)...”

• “In patients with mCRPC, clinicians should offer germline and somatic tumor genetic testing to identify DNA repair deficiency mutations and microsatellite instability status that may inform prognosis and counseling regarding family risk as well as potential targeted therapies.”

• “Clinicians should discuss the risk of osteoporosis associated with androgen deprivation therapy (ADT) and should assess the risk of fragility fracture in patients with APC.”

• “Clinicians should prescribe a bone-protective agent (denosumab or zoledronic acid) for mCRPC patients with bony metastases to prevent skeletal-related events.”

How do you measure adherence to guidelines in your practice? Most urology practices have 2 sources of information: their practice management (PM) (billing) system and their electronic health record (EHR) system. The former was not designed for measuring clinical performance, but it does have the capability, when properly used, to do just that. For example, the guideline statement "Clinicians should prescribe a bone-protective agent (denosumab or zoledronic acid) for mCRPC patients with bony metastases to prevent skeletal-related events" has 3 discrete elements that could contribute to a straightforward analysis: (1) a clinical status of CRPC, (2) presence of bone metastases, and (3) administration of an injectable drug typically acquired and billed by the practice. If these discrete elements could be reliably identified in the PM data, then that system could be leveraged to measure adherence to guidelines: Find all patients with CRPC, metastatic to bone, who have received denosumab or zoledronic acid (sum A); find all patients with CRPC, metastatic to bone (sum B); calculate adherence (A divided by B).
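To make the arithmetic concrete, here is a minimal sketch of that A/B calculation in Python, assuming a hypothetical billing export (billing.csv) with columns patient_id, dx_codes (semicolon-separated ICD-10 codes), and drug; your PM system's export format and field names will differ.

# A minimal sketch, not a PM-system integration: assumes a hypothetical
# billing.csv export with columns patient_id, dx_codes, drug.
import csv
from collections import defaultdict

patients = defaultdict(lambda: {"codes": set(), "drugs": set()})
with open("billing.csv", newline="") as f:
    for row in csv.DictReader(f):
        p = patients[row["patient_id"]]
        p["codes"].update(row["dx_codes"].split(";"))
        if row["drug"]:
            p["drugs"].add(row["drug"].lower())

# B: CRPC (Z19.2) with bone metastases (C79.51); A: those who received a bone-protective agent
eligible = {pid for pid, p in patients.items() if {"Z19.2", "C79.51"} <= p["codes"]}
treated = {pid for pid in eligible if patients[pid]["drugs"] & {"denosumab", "zoledronic acid"}}

print(f"B (CRPC with bone metastases): {len(eligible)}")
print(f"A (received denosumab or zoledronic acid): {len(treated)}")
if eligible:
    print(f"Adherence: {len(treated) / len(eligible):.0%}")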

In my experience, many urologists are not disciplined about coding for CRPC (International Classification of Diseases, Tenth Revision, code Z19.2) or bone metastases (C79.51), probably because there was never an incentive to code beyond prostate cancer (C61). However, measuring adherence to guidelines presents an excellent incentive to do so. Certified coders in your office can be trained to enter this information. Once the data are being reliably populated, measuring adherence to this guideline will be easy in the PM system using commonly available stock reports (billing records) that can be dimensioned by physician, location, time period, and even insurance payer.

Measuring adherence to other guidelines may require reporting out of your EHR system, and some EHR vendors make that easier than others. Fortunately, the data needed for this exercise are generally discrete, not dependent on physician documentation style, and easily available in many EHRs: prescriptions, lab results, x-ray results. For example, measuring adherence to “Clinicians should not offer first-generation antiandrogens (bicalutamide, flutamide, nilutamide) in combination with LHRH agonists in patients with mHSPC, except to block testosterone flare” could begin with a simple prescription utilization report for these drugs; this report could be enhanced to identify mHSPC if physicians coded for hormone status and metastatic status in the EHR (see above). While there are no reliable benchmarks for this kind of analysis, a peer comparison in a group practice can identify outliers for further chart review. Chart review of average or above-average physicians is generally not needed.
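As an illustration of that kind of prescription utilization report, here is a minimal sketch in Python, assuming a hypothetical EHR medication export (meds.csv) with columns patient_id, physician, and drug; patients flagged this way still require chart review to confirm mHSPC status and to exclude intentional flare blockade.

# A minimal sketch of a peer-comparison report, assuming a hypothetical
# meds.csv export with columns patient_id, physician, drug.
import csv
from collections import defaultdict

FIRST_GEN = {"bicalutamide", "flutamide", "nilutamide"}
LHRH_AGONISTS = {"leuprolide", "goserelin", "triptorelin", "histrelin"}

patients = defaultdict(lambda: {"drugs": set(), "physician": ""})
with open("meds.csv", newline="") as f:
    for row in csv.DictReader(f):
        p = patients[row["patient_id"]]
        p["drugs"].add(row["drug"].lower())
        p["physician"] = row["physician"]

# Count, per physician, patients with both a first-generation antiandrogen
# and an LHRH agonist on the medication list
flagged = defaultdict(int)
for p in patients.values():
    if p["drugs"] & FIRST_GEN and p["drugs"] & LHRH_AGONISTS:
        flagged[p["physician"]] += 1

for physician, n in sorted(flagged.items(), key=lambda kv: -kv[1]):
    print(f"{physician}: {n} patients on combination therapy (review charts)")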

Leveraging lab results may be slightly more complicated, but it can be simplified. The guideline statement “In mCRPC patients, clinicians should obtain baseline labs (eg, PSA, testosterone, LDH, hemoglobin, alkaline phosphatase level)” obviously requires the ability to measure the presence or absence of these labs. All EHRs offer the ability to store, and theoretically report on, discrete lab results. One challenge, though, is that some lab results may return more discretely to the EHR than others, depending upon the performing location and interface; nondiscrete results may not be visible in the EHR unless the physician manually enters the information. The best way to avoid this common pitfall is to measure orders, not results; orders are the best reflection of the clinician’s intent and adherence to guidelines, and they do not depend on discrete results. Some smaller, niche, or older EHR systems do not offer reporting on discrete orders, but most do.

Measuring order utilization for testosterone in mCRPC patients

To simplify measuring adherence to this guideline, I recommend measuring order utilization for testosterone in mCRPC patients. You can define a time window (testosterone order date relative to the mCRPC first diagnosis date) that reflects your own definition of "baseline," keeping in mind that a castrate level of testosterone is technically needed to even establish the diagnosis, so the order date may precede the mCRPC first diagnosis date. Keep it simple at first.
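Here is a minimal sketch of that time-window query, assuming hypothetical exports orders.csv (patient_id, order_name, order_date) and diagnoses.csv (patient_id, mcrpc_first_dx_date) with ISO-formatted dates; the 90-days-before to 30-days-after window is illustrative, not part of the guideline.

# A minimal sketch: baseline testosterone orders around the first mCRPC diagnosis date.
# Assumes hypothetical orders.csv and diagnoses.csv exports with ISO dates (YYYY-MM-DD).
import csv
from datetime import datetime, timedelta

def parse(d):
    return datetime.strptime(d, "%Y-%m-%d")

mcrpc_dx = {}
with open("diagnoses.csv", newline="") as f:
    for row in csv.DictReader(f):
        mcrpc_dx[row["patient_id"]] = parse(row["mcrpc_first_dx_date"])

has_baseline = set()
with open("orders.csv", newline="") as f:
    for row in csv.DictReader(f):
        pid = row["patient_id"]
        if pid in mcrpc_dx and "testosterone" in row["order_name"].lower():
            delta = parse(row["order_date"]) - mcrpc_dx[pid]
            # illustrative "baseline" window: 90 days before to 30 days after first mCRPC diagnosis
            if timedelta(days=-90) <= delta <= timedelta(days=30):
                has_baseline.add(pid)

print(f"mCRPC patients: {len(mcrpc_dx)}; with baseline testosterone order: {len(has_baseline)}")
if mcrpc_dx:
    print(f"Adherence: {len(has_baseline) / len(mcrpc_dx):.0%}")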

Leveraging radiology results is a more challenging exercise in reporting on clinical data. Only the most sophisticated EHR systems store a result discretely; most results come in as documents in different formats or are scanned with different labels. Standardization in labeling scan documents can help, but it is difficult to monitor and enforce. Again, the best answer is to leverage orders. The guideline statement “Clinicians should discuss the risk of osteoporosis associated with ADT and should assess the risk of fragility fracture in patients with APC” requires a definition of risk assessment for measuring adherence. The gold standard to assess this risk is the Fracture Risk Assessment (FRAX) tool, US version, which includes a measurement of bone mineral density using dual-energy x-ray absorptiometry (DEXA) scan (https://www.sheffield.ac.uk/FRAX/tool.aspx?country=9). So, to simplify measurement of adherence to this guideline, measure order utilization of DEXA scan in a population of patients on ADT using the time window of your choice. There is no firm consensus on when, or how often, to repeat DEXA scan in this guideline; keep it simple, measure consistently across physicians, and identify and drill down on outliers only.
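The DEXA measurement follows the same pattern; here is a minimal sketch, assuming hypothetical exports adt_patients.csv (patient_id, adt_start_date) and orders.csv (patient_id, order_name, order_date), with a 12-month window around ADT start chosen purely for illustration.

# A minimal sketch: DEXA order utilization among patients on ADT.
# Assumes hypothetical adt_patients.csv and orders.csv exports with ISO dates.
import csv
from datetime import datetime, timedelta

def parse(d):
    return datetime.strptime(d, "%Y-%m-%d")

adt_start = {}
with open("adt_patients.csv", newline="") as f:
    for row in csv.DictReader(f):
        adt_start[row["patient_id"]] = parse(row["adt_start_date"])

assessed = set()
with open("orders.csv", newline="") as f:
    for row in csv.DictReader(f):
        pid = row["patient_id"]
        if pid in adt_start and "dexa" in row["order_name"].lower():
            # illustrative window: within 12 months of ADT start (before or after)
            if abs(parse(row["order_date"]) - adt_start[pid]) <= timedelta(days=365):
                assessed.add(pid)

print(f"Patients on ADT: {len(adt_start)}; with DEXA order in window: {len(assessed)}")
if adt_start:
    print(f"Assessment rate: {len(assessed) / len(adt_start):.0%}")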

Bottom line: Clinical guidelines are not rigid standards, but they draw on the available evidence to support best practices. Tools exist to simplify the measurement of adherence to clinical guidelines in your practice using your existing systems and reporting capabilities at nominal cost. Start simple, measure consistently, validate the results, leverage peer comparisons (“apples to apples”), accept that information based on data is imperfect and that adherence will rarely be 100%, focus on outliers only, and provide feedback to improve data collection. Be transparent and patient with those you are trying to educate or influence.

References

1. Advanced prostate cancer: AUA/ASTRO/SUO guideline. American Urological Association. 2020. Accessed June 27, 2020. https://www.auanet.org/guidelines/advanced-prostate-cancer

2. Franc-Guimond J, McNeil B, Schlossberg S, et al. Urologist burnout: frequency, causes, and potential solutions to an unspoken entity. Can Urol Assoc J. 2018;12(4):137-142. doi:10.5489/cuaj.4668

Dowling is the president of Dowling Medical Director Services, a private health care consulting firm specializing in quality improvement, clinical informatics, and health care policy affecting specialty care. He is the former medical director of a large, metropolitan single-specialty urology group in Fort Worth, Texas.
