American College of Surgeons - Inspiring Quality: Highest Standards, Better Outcomes

December 2016 CoC Source

(HTML Version)

Happy Holidays!

As 2016 comes to a close, we wish you all a happy holiday season. Thank you for your continued interest in and readership of our publications.

Stay tuned to see what's in store next year, as we plan to make some changes.

CoC News

CoC Recognizes Outgoing Members and Leaders

The Commission on Cancer (CoC) recognizes the following departing members for their distinguished contributions to the work of the Commission:

Members

  • Lisa Bailey, MD, FACS (Ex-officio)*
  • Al B. Benson III, MD, FACP—American College of Physicians
  • David Bentrem, MD, MS, FACS*
  • Stephen B. Edge, MD, FACS*
  • Kirsten Edmiston, MD, FACS*
  • Benjamin E. Greer, MD—American College of Obstetricians and Gynecologists
  • James J. Hamilton, Jr., MD, FACS*
  • Kathryn Keifer Hamilton, MA, RD, CSO, CDN—Academy of Nutrition and Dietetics, Oncology Nutrition Dietetic Practice Group
  • Daniel Kollmorgen, MD, FACS*
  • Michael Lanuti, MD, FACS—Society of Thoracic Surgeons
  • Charles L. McGarvey III, PT, MS, DPT, FAPTA—American Physical Therapy Association
  • Carol Murtaugh, RN, OCN—Community Oncology Alliance
  • Lynn Penberthy, MD, MPH—National Cancer Institute: Surveillance, Epidemiology, and End Results (SEER) Program
  • Joe B. Putnam, Jr., MD, FACS*
  • Richard Reznichek, MD, MHA, FACS*
  • Kim Thiboldeaux—Cancer Support Community
  • Joan L. Warren, RN, PhD—National Cancer Institute: Healthcare Delivery Research Program
  • David J. Winchester, MD, FACS (Ex-officio)*
  • Douglas Wood, MD, FACS*

Leadership

  • Stephen B. Edge, MD, FACS—Chair, Nominating Committee
  • James J. Hamilton, Jr., MD, FACS—Chair, Advocacy Committee

* Representing the Fellowship

CoC Announces New Members and Leaders

The CoC welcomes the following individuals who were appointed to membership and leadership positions at the 2016 annual meeting held on October 16 at the Clinical Congress in Washington, DC.

Representing the Fellowship for a Three-Year Term:

  • George J. Chang, MD, MS, FACS, FASCRS—UT MD Anderson Cancer Center, Houston, TX
  • David W. Dietz, MD, FACS, FASCRS—Seidman Cancer Center, Cleveland, OH
  • Laura S. Dominici, MD, FACS—Dana-Farber Cancer Institute, Boston, MA
  • James B. Harris, MD, FACS—Western Surgical Group, Reno, NV
  • Benjamin D. Li, MD, FACS—Case Western Reserve University, Cleveland, OH
  • Sharon S. Lum, MD, FACS—Loma Linda University School of Medicine, Loma Linda, CA
  • Timothy W. Mullett, MD, FACS—Markey Cancer Center, Lexington, KY
  • David G. Sheldon, MD, FACS—Kalispell Regional Healthcare, Kalispell, MT

Representing Member Organizations for a Three-Year Term

  • Academy of Nutrition and Dietetics, Oncology Nutrition Group (AND)—Barbara L. Grant, MS, RDN, CSO, FAND, Saint Alphonsus Cancer Care Center, Boise, ID
  • American Cancer Society (ACS) Corporate Center—Otis Brawley, MD, MACP, American Cancer Society, Inc., Atlanta, GA
  • American College of Obstetricians and Gynecologists (ACOG)—Daniel Lyle Clarke-Pearson, MD, University of North Carolina, Chapel Hill, NC
  • American College of Physicians (ACP)—Lee Hartner, MD, University of Pennsylvania School of Medicine, Philadelphia, PA
  • American Physical Therapy Association (APTA)—Mary Lou Galantino, PT, MS, PhD, MSCE, Stockton University, University of Pennsylvania, Galloway, NJ
  • Cancer Support Community (CSC)—Linda House, MSM, BSN, RN, Cancer Support Community, Washington, DC
  • Community Oncology Alliance (COA)—Robert “Bo” Gamble, Community Oncology Alliance, Washington, DC
  • National Cancer Institute Healthcare Delivery Research Program (NCI HDRP)—Dolly Penn, MD, MSCR, National Cancer Institute, Bethesda, MD
  • National Cancer Institute: Surveillance, Epidemiology, and End Results Program (NCI SEER)—Serban Negoita, MD, DrPH, National Cancer Institute, NIH, DHHS, Rockville, MD
  • Society of Thoracic Surgeons (STS)—Leslie J. Kohman, MD, Upstate Medical University, Syracuse, NY

Representing the Leadership

  • Lawrence N. Shulman, MD, FACP—Abramson Cancer Center, University of Pennsylvania, Philadelphia, PA—Chair, Commission on Cancer and Executive Committee
  • Alan G. Thorson, MD, FACS—Nebraska Methodist Hospital, Omaha, NE—Chair, Advocacy Committee
  • Michael S. Bouton, MD, MA—Sanford Health, Roger Maris Cancer Center, Fargo, ND—Vice-Chair, Advocacy Committee
  • Daniel McKellar, MD, FACS—Wayne Healthcare, Greenville, OH—Chair, Nominating Committee
  • Matthew A. Facktor, MD, FACS—Geisinger Medical Center, Danville, PA—Chair, Quality Integration Committee
  • Ted James, MD, FACS—Beth Israel Deaconess Medical Center, Burlington, VT—Vice-Chair, Quality Integration Committee

Accreditation Corner

Information Regarding 2017 Surveys and Improvements to the SAR

Commission on Cancer (CoC) surveys conducted in 2017 will review cancer program activity for the complete years of 2014, 2015, and 2016 (not 2017). Activity for 2014 and 2015 will be assessed according to the Cancer Program Standards 2012: Ensuring Patient-Centered Care (Version 1.2.1). Activity for the complete year of 2016 will be assessed according to the Cancer Program Standards: Ensuring Patient-Centered Care (2016 edition).

The Survey Application Record (SAR), located within CoC Datalinks, is used by cancer programs to record cancer committee activity and documentation that demonstrates compliance with the CoC Standards and Eligibility Requirements. Any information entered into the 2016 Program Activity Record (PAR) will be transferred automatically to the 2017 SAR. Please note: Some data fields and tables have changed or been added, which may prevent some data from transferring completely from the PAR. Please review the SAR for completeness and update it accordingly.

Enhancements to the 2017 SAR have been finalized based on feedback from surveyors, cancer program users, and CoC internal users to align with the 2016 CoC standards and compliance criteria and to improve usability.

Highlights of the SAR enhancements and additions include, but are not limited to:

  • The SAR has a cleaner aesthetic to improve readability.
  • Each standard's page is easier to use and more intuitive.
  • Facility Self-Assessment Ratings have been removed from the SAR, but the Facility Comments and rating criteria remain.
    • Cancer programs will still be able to view the ratings entered by the National Cancer Database (NCDB) and/or CoC staff prior to survey for Standards 5.2, 5.5, 5.6, and 5.7.
  • The Hospital Locator link has been renamed Find an Accredited Program to be consistent with the American College of Surgeons website.
  • Only one cancer committee meeting date can be entered into each Eligibility Requirement (ER) page. It should be the most current date the policy was reviewed and approved by the cancer committee (required on an annual basis).
  • ER10, ER12, and Standard 2.4 services tables have been updated to reflect current services and terminology.
  • The Five Major Cancer Sites table is now only available on the Standard 1.7 screen.
  • Standard 1.3: Cancer committee meeting attendance has been put into an accordion format by year to shrink the size of the table and reduce the amount of page scrolling.
  • Standard 1.9: The Edit table now matches the Summary table on the standard pages to reflect the eligible studies from the CoC Standards Manual.
  • A Cancer Committee Review table with date field, as well as the ability to upload policies and procedures, has been added to Standards 2.3 and 2.4.
  • Standard 3.1: The Barrier Category table now includes a subcategory for each main category for programs to better define the barriers they are addressing.
  • Standards 4.1, 4.2, and 4.6: A Cancer Site dropdown box has been added to specify the cancer sites being addressed for compliance.
  • Standard 4.2: Fields for Date Screening Activity Held and Type of Screening Activity have been added for more specificity.
  • Standards 4.4 and 4.5: The 2017 CoC Survey Expected EPR text is now a live link that will take you to the NCDB measures website.
  • Standards 5.2, 5.5, 5.6, and 5.7: Submission information now appears in an accordion format based on years to shrink the size of the table and reduce the amount of page scrolling.

The CoC's goal is for cancer programs to find the 2017 SAR a more user-friendly tool that assists them as they prepare for survey. The News tab, located at the top of each page in CoC Datalinks, brings you the latest information about the standards, survey, NCDB, educational events, and more. Don't forget to use the clickable FAQ icons within each standard to help you complete the SAR.

Please plan your work accordingly! The SAR must be completed at least 30 calendar days prior to the confirmed survey date and will be closed for edits 14 days prior to the survey date. Any missing information or incomplete SAR tables after the 14-day window are subject to a deficiency.
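For cancer programs that like to track these dates programmatically, the following is a minimal sketch of the two SAR cutoffs described above. Python is used only for illustration; the function name and the example survey date are hypothetical, while the 30- and 14-day windows come from the paragraph above.

    from datetime import date, timedelta

    def sar_cutoffs(survey_date: date) -> dict:
        """Return the SAR deadlines implied by a confirmed survey date:
        the SAR must be complete 30 calendar days before the survey and
        is closed for edits 14 days before it."""
        return {
            "complete_by": survey_date - timedelta(days=30),
            "edits_close": survey_date - timedelta(days=14),
        }

    # Hypothetical example: a survey confirmed for June 15, 2017.
    cutoffs = sar_cutoffs(date(2017, 6, 15))
    print(cutoffs["complete_by"])  # 2017-05-16 (complete the SAR by this date)
    print(cutoffs["edits_close"])  # 2017-06-01 (SAR closed for edits)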

PLEASE REMEMBER to remove all protected health information (PHI) from documents before they are uploaded to the SAR/PAR. Check all quality studies, cancer conference grids, and any other documents that may include PHI. For questions regarding PHI or HIPAA compliance, please contact your facility's privacy officer.

If you have questions:

For CoC or accreditation questions, please use the Contact CoC form.

For questions about the CoC Standards, please read and submit questions to the CAnswer Forum.

We appreciate your cancer program’s commitment to high-quality cancer care and participation in the CoC Accreditation Program!

CoC Standards 4.4 and 4.5 and Implementation for Surveys in 2017

The updated 2014 Cancer Program Practice Profile Reports (CP3R) data became available on October 26. Accredited cancer programs are required to review the data with the cancer committee during each calendar year for compliance with Standards 4.4 and 4.5.

We have received several questions regarding the availability of the updated CP3R data for the fourth-quarter meeting of 2016. Cancer programs whose last cancer committee meeting of 2016 took place in October will be required to review their 2014 CP3R data in the first quarter (January, February, or March) of 2017 to remain compliant. This does not replace the CP3R measures review required for compliance with Standards 4.4 and 4.5 in 2017. Cancer programs whose fourth-quarter cancer committee meetings take place in November or December 2016 must review the 2014 CP3R data during that meeting to be in compliance for 2016.

Meeting dates and minutes will be reflected in the program’s SAR/PAR and verified by the surveyor at the time of survey.

The expected Estimated Performance Rates (EPRs) for the accountability and quality improvement measures assessed under CoC Standards 4.4 and 4.5 for programs being surveyed in 2017 are shown below. These standards require that performance levels be met annually for the accountability and quality improvement measures defined by the CoC.

Evaluation Criteria of Measures

To be compliant with Standards 4.4 and 4.5, cancer programs must do one of the following:

  • Meet the expected performance rates (Table 1, below), either with the program's EPR in CP3R or with the upper bound of its 95 percent confidence interval.
  • If neither the EPR nor the upper bound of the confidence interval meets the expected rate, establish and implement an action plan that reviews and addresses improving performance.

Expected Performance Rates

Expected EPRs have been established based on a review of current performance on these measures by CoC-accredited cancer centers. Each of the following measures will be assessed and rated for 2017 surveys. To rate this standard, programs must assess their attainment of the expected EPRs for each of the years listed below. EPRs remain the same as previously released for cases diagnosed in 2012–2013.

Table 1. 2017 CoC Survey’s Assessed Quality Measures and Expected EPRs


Note: Assessment against the expected EPR considers both the program's EPR and the upper limit of the 95 percent confidence interval for the EPR.

How to Interpret Confidence Intervals

The following tables provide examples of how to interpret the 95 percent Confidence Intervals (CIs) for compliance with Standards 4.4 and 4.5.

Example 1: Compliance with Standard 4.4 Based on the Upper Limit of the Confidence Interval


In Example 1, the program's actual performance rate for the HT measure is 86.4 percent, and the upper bound of the 95 percent CI is 100 percent, which is above the 90 percent expected EPR. The program will be assessed as meeting the performance criteria for the HT measure, because the CI indicates that the rate is not significantly different from the EPR. All of the accountability measures in Example 1 meet the evaluation criteria.

Example 2: Noncompliance with Expected EPR Based on the Upper Limit of the Confidence Interval


In Example 2, the program's calculated performance rate for BCS/RT is 80.2 percent, and for MASTRT it is 79.6 percent. Neither rate meets the 90 percent EPR, nor do the upper bounds of their CIs (85 percent and 85.9 percent, respectively) meet or exceed it. To be compliant with Standard 4.4, this program would need to develop an action plan for these measures.

Programs should apply this assessment for each diagnosis year and quality measure being assessed.
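As a concrete restatement of the evaluation criteria above, the following sketch applies the rule (a measure is met if either the program's EPR or the upper bound of its 95 percent CI reaches the expected EPR) to the figures from Examples 1 and 2. Python is used purely for illustration; the function and variable names are not part of any CoC or CP3R tool.

    def meets_measure(epr: float, ci_upper: float, expected: float) -> bool:
        """A measure is met if the program's EPR, or the upper bound of its
        95% confidence interval, meets or exceeds the expected EPR."""
        return epr >= expected or ci_upper >= expected

    # Example 1: HT measure, EPR 86.4%, CI upper bound 100%, expected EPR 90%.
    print(meets_measure(86.4, 100.0, 90.0))  # True -> performance criteria met

    # Example 2: BCS/RT 80.2% (CI upper bound 85.0%) and MASTRT 79.6%
    # (CI upper bound 85.9%), each against a 90% expected EPR.
    print(meets_measure(80.2, 85.0, 90.0))   # False -> action plan required
    print(meets_measure(79.6, 85.9, 90.0))   # False -> action plan required

A program that returns False for a measure would document an action plan for that measure, as described under the evaluation criteria.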

NCDB News

NCDB Announces 2017 Call for Data

The NCDB is pleased to announce the 2017 Call for Data, an integral part of CoC accreditation. The information generated from the NCDB enables cancer programs to compare treatment and outcomes with regional, state, and national patterns. This year, all analytic cases diagnosed in 2015 must be submitted. In addition, analytic cases diagnosed between the cancer program's Reference Year (or 1985, if the Reference Year is earlier than 1985) and 2014, inclusive, that have been added or changed by the hospital registrar since December 1, 2015, must also be submitted.

Sometimes registries inadvertently fail to submit all required cases, resulting in a deficiency rating for Standards 5.5 and 5.6. See Determining Compliance and Commendation for helpful details about how survey scores are assigned for these two Program Standards.

To facilitate the submission of high-quality data to the NCDB and reduce time pressure on vendors and registrars, this year's initial submission of all required cases is due by 11:59 pm Central time on February 28, 2017. Cases from any diagnosis year that are "rejected" have technically not been submitted and must be corrected and resubmitted by April 1, 2017, for compliance with Standard 5.5. Registries are strongly encouraged to submit their data early in January to allow time to address any issues that may arise before the submission deadline.

For more information on the Call for Data, visit the CoC website.

Cancer Quality Improvement Program Coming Soon

The Cancer Quality Improvement Program (CQIP) 2016 report will be released in January 2017! CoC-accredited programs will be able to access the report through CoC Datalinks.

This annual report is unique in providing not only short-term quality and outcome data but also long-term outcome data, including five-year survival rates for commonly treated malignancies stratified by stage. The report allows programs to assess their quality and outcomes based on the data they submit to the NCDB.

The 2016 CQIP report includes slides on:

  • Cancer program volume
  • Cancer program in/out migration
  • CoC Quality Measure reports
  • Volume of selected complex cancer operations
  • 30- and 90-day mortality after selected cancer operations
  • Unadjusted Survival Reports by stage for breast, colon, and non-small cell lung cancer (NSCLC)
  • Risk-adjusted survival hazard ratios by stage for breast, colon, and NSCLC
  • Site-specific reports for breast, colon, NSCLC, prostate, and melanoma of the skin

A complete CQIP Slide Directory can be found under the About CQIP section of the reporting application in CoC Datalinks.

If you have any questions, please contact NCDBCQIP@facs.org.

NCDB to Accept PUF Applications in January 2017

The NCDB currently accepts applications for Participant User Files (PUFs) on a semi-annual basis. Look for a communication in January announcing the next application period for organ-site specific files, including cases diagnosed through 2014.

The NCDB PUF is a Health Insurance Portability and Accountability Act (HIPAA)-compliant data file containing cases submitted to the CoC's NCDB; it complies with the terms of the Business Associate Agreement between the American College of Surgeons and CoC-accredited cancer programs. The PUF contains de-identified patient-level data and is designed to provide investigators at CoC-accredited cancer programs with a data resource they can use to review and advance the quality of care delivered to cancer patients through analyses of cases reported to the NCDB. Prospective applicants can find more information on the PUF website.

Additional questions regarding the NCDB PUFs or the application process for a PUF may be directed to NCDB technical staff at NCDB_PUF@facs.org.

Resources for Cancer Liaison Physicians

Complete the CLP Activity Report by December 31

After December 31, 2016, the Cancer Liaison Physician (CLP) Activity Report for this year will be closed and no longer accessible.

The CLP must complete the CLP Activity Report. The report is located in CoC Datalinks; it is part of your Survey Application Record (SAR) during your survey year and is accessible in the Program Activity Record (PAR) during nonsurvey years. Not only is it important to complete this report at the time of your program's scheduled survey, it is also beneficial to complete it during nonsurvey years to track activity annually.

If you are appointing a new CLP, please make sure that your current CLP has completed the CLP Activity Report before the change occurs. The current CLP will not have access once the new appointment is processed.

If you have any questions or concerns, please e-mail clp@facs.org.

Expiring CLP Term—January 1, 2017

CLPs serve a three-year term and are eligible to serve an unlimited number of terms based on performance and evaluation data collected at the time of survey. More than 100 CLP terms will expire on January 1, 2017, and each cancer committee must determine whether its CLP is serving appropriately in this role or whether another candidate would be better suited to the position.

Notification and instructions will be e-mailed in December to cancer programs with a CLP whose term is expiring.* The facility must either reappoint the CLP for another three-year term or recommend a replacement.

Please do not panic if January 1 has already passed. Although the CLP will not have access to his or her program's CoC Datalinks menu, the cancer committee may extend the current CLP's term until a decision is made to reappoint or replace him or her. Your accreditation will not be affected. If the cancer committee decides to replace the CLP, the new appointment can be processed at any time with no consequences for accreditation.

If you are replacing your CLP, make sure the former CLP completes the CLP Activity Report before he or she is removed from the cancer committee contacts.

Please update and confirm your CLP's contact information in CoC Datalinks. This is also a good time to choose your designated CLP alternate. If you have questions or concerns, please e-mail us at clp@facs.org.

Please note: If your accredited facility does not have a CLP in place, you are in jeopardy of noncompliance with CoC Standards 1.3 and 4.3. It is important that you make an appointment as soon as possible and that you designate an individual as the CLP in CoC Datalinks. If a CLP is not appointed for your facility and recorded in CoC Datalinks, the CoC recommends that the cancer committee chair be listed as the interim CLP until an official appointment is made.

*If the notification was sent to someone in error, please make sure your staff contacts are updated through CoC Datalinks. Notifications are sent electronically based on what is entered through your program’s “Manage Staff Contacts” page.

Educational Programs and Resources

Cancer Programs to Hold Annual Conference September 2017

Plan to attend the 2017 Cancer Programs Annual Conference, September 7–9, in Rosemont, IL (near Chicago). The conference will cover content from all areas of the American College of Surgeons Cancer Programs: the American Joint Committee on Cancer (AJCC), the Clinical Research Program, the Commission on Cancer (CoC), the National Accreditation Program for Breast Centers (NAPBC), and the National Cancer Database (NCDB). Further details on this event will be available soon. Click here to be added to the mailing list for the annual conference. (Please note: The 2017 Cancer Programs Annual Conference replaces the educational event traditionally held in June.)

News from Cancer Programs

AJCC News

Implementation of AJCC 8th Edition Cancer Staging System

The American Joint Committee on Cancer (AJCC) has been working closely with all of its member organizations throughout the development of the recently published 8th Edition Cancer Staging Manual. Coordinating the implementation of a new staging system is critically important to ensure that all partners in patient care and cancer data collection are working in synchrony.

In order to ensure that the cancer care community has the necessary infrastructure in place for documenting 8th Edition stage, the AJCC Executive Committee, in dialogue with the National Cancer Institute (NCI-SEER), the Centers for Disease Control and Prevention (CDC), the College of American Pathologists (CAP), the National Comprehensive Cancer Network (NCCN), the National Cancer Database (NCDB), and the Commission on Cancer (CoC), made the decision to delay the implementation of the 8th Edition Cancer Staging System to January 1, 2018.

Clinicians will continue to use the latest information for patient care, including the scientific content of the 8th Edition Manual. All cases newly diagnosed through December 31, 2017, should be staged with the 7th Edition. The time extension will allow all partners to develop and update protocols and guidelines, and will give software vendors time to develop, test, and deploy their products for data collection under the 8th Edition in 2018.

The AJCC is working with all of its members, as well as software vendors, to make this transition as smooth as possible for the oncology community. More communication will follow from the AJCC and the member organizations over the coming weeks.

The latest information regarding the AJCC 8th Edition Cancer Staging System can be found at www.cancerstaging.org.

Click here to purchase.

For information regarding the API, click here.

FAQs on the 8th Edition Cancer Staging Manual are now available on the AJCC website.

ACS Clinical Research Program News

Recent Results from the 2015 Special Study

Thanks to the hard work and dedication of the registrars at CoC-accredited institutions across the country, results are being published from the 2015 Special Study on posttreatment recurrence and surveillance in breast, colorectal, and lung cancers. Included below are details from two articles that have been accepted for publication and one abstract that was recently presented.

An article entitled "Utility of Clinical Breast Examinations in Detecting Local-Regional Breast Events after Breast-Conservation in Women with a Personal History of High-Risk Breast Cancer" was recently published (Ann Surg Oncol. 2016 Oct;23(10):3385-91). The effect of clinical breast examinations in detecting local-regional recurrence was studied in 11,099 breast cancer survivors. Results showed that only a minority of patients (10 percent) had their local-regional breast event detected by clinical exam; most events were detected by breast imaging (48 percent) or by the patients themselves (29 percent). The authors found that clinical examinations as an adjunct to screening mammography have only a modest effect on detection of local-regional recurrence.

The abstract entitled "Improving Recurrence Capture in National Cancer Registries" was recently presented (American College of Surgeons Clinical Congress, October 2016). Recurrence rates in the Special Study were compared with rates in the National Cancer Database (NCDB) for breast, colorectal, and lung cancer. The authors found that using detailed abstraction instructions and mandating the collection of recurrence markedly increased recurrence capture. While registrars estimated that collecting these data for all patients would require additional work, most reported it would be less intensive to do prospectively, and 80 percent indicated that improving recurrence capture was important or critical to the mission of the CoC.

The article entitled "Impact of Age and Comorbidity on Treatment of Non-Small Cell Lung Cancer Recurrence Following Complete Resection: A Nationally Representative Cohort Study" will be published in the journal Lung Cancer (in press). A total of 9,001 patients with surgically resected stage I–III non-small cell lung cancer were studied. Results showed that older patients (independent of comorbidity), patients with substance abuse, and women were less likely to receive active treatment for postoperative non-small cell lung cancer recurrence.

We appreciate your dedication to this important work. We will continue to inform the registry community of future results from the 2015 Special Study. For questions, please contact Amanda Francescatti at afrancescatti@facs.org.