Medicare and Medicaid Programs; CY 2016 Home Health Prospective Payment System Rate Update; Home Health Value-Based Purchasing Model; and Home Health Quality Reporting Requirements

Federal Register Volume 80, Number 214 (Thursday, November 5, 2015)

Rules and Regulations

Pages 68623-68719

From the Federal Register Online via the Government Publishing Office www.gpo.gov

FR Doc No: 2015-27931

Page 68623

Vol. 80, No. 214, Thursday, November 5, 2015

Part II

Department of Health and Human Services

-----------------------------------------------------------------------

Centers for Medicare & Medicaid Services

-----------------------------------------------------------------------

42 CFR Parts 409, 424, and 484

Medicare and Medicaid Programs; CY 2016 Home Health Prospective Payment System Rate Update; Home Health Value-Based Purchasing Model; and Home Health Quality Reporting Requirements; Final Rule

Page 68624

-----------------------------------------------------------------------

DEPARTMENT OF HEALTH AND HUMAN SERVICES

Centers for Medicare & Medicaid Services

42 CFR Parts 409, 424, and 484

CMS-1625-F

RIN 0938-AS46

Medicare and Medicaid Programs; CY 2016 Home Health Prospective Payment System Rate Update; Home Health Value-Based Purchasing Model; and Home Health Quality Reporting Requirements

AGENCY: Centers for Medicare & Medicaid Services (CMS), HHS.

ACTION: Final rule.

-----------------------------------------------------------------------

SUMMARY: This final rule will update Home Health Prospective Payment System (HH PPS) rates, including the national, standardized 60-day episode payment rates, the national per-visit rates, and the non-routine medical supply (NRS) conversion factor under the Medicare prospective payment system for home health agencies (HHAs), effective for episodes ending on or after January 1, 2016. As required by the Affordable Care Act, this rule implements the 3rd year of the 4-year phase-in of the rebasing adjustments to the HH PPS payment rates. This rule updates the HH PPS case-mix weights using the most current, complete data available at the time of rulemaking and provides a clarification regarding the use of the ``initial encounter'' seventh character applicable to certain ICD-10-CM code categories. This final rule will also finalize reductions to the national, standardized 60-day episode payment rate in CY 2016, CY 2017, and CY 2018 of 0.97 percent in each year to account for estimated case-mix growth unrelated to increases in patient acuity (nominal case-mix growth) between CY 2012 and CY 2014. In addition, this rule implements a HH value-based purchasing (HHVBP) model, beginning January 1, 2016, in which all Medicare-certified HHAs in selected states will be required to participate. Finally, this rule finalizes minor changes to the home health quality reporting program and minor technical regulations text changes.

DATES: Effective Date: These regulations are effective on January 1, 2016.

FOR FURTHER INFORMATION CONTACT: For general information about the HH PPS, please send your inquiry via email to: HomehealthPolicy@cms.hhs.gov. Michelle Brazil, (410) 786-1648, or Theresa White, (410) 786-2394, for information about the HH quality reporting program. Lori Teichman, (410) 786-6684, for information about HHCAHPS. Robert Flemming, (844) 280-5628, or send your inquiry via email to HHVBPquestions@cms.hhs.gov, for information about the HHVBP Model.

SUPPLEMENTARY INFORMATION:

Table of Contents

  I. Executive Summary
    A. Purpose
    B. Summary of the Major Provisions
    C. Summary of Costs and Benefits
  II. Background
    A. Statutory Background
    B. System for Payment of Home Health Services
    C. Updates to the Home Health Prospective Payment System
    D. Advancing Health Information Exchange
  III. Provisions of the Proposed Rule and Response to Comments
    A. Monitoring for Potential Impacts--Affordable Care Act Rebasing Adjustments
    B. CY 2016 HH PPS Case-Mix Weights and Reduction to the National, Standardized 60-day Episode Payment Rate to Account for Nominal Case-Mix Growth
      1. CY 2016 HH PPS Case-Mix Weights
      2. Reduction to the National, Standardized 60-day Episode Payment Rate to Account for Nominal Case-Mix Growth
      3. Clarification Regarding the Use of the ``Initial Encounter'' Seventh Character, Applicable to Certain ICD-10-CM Code Categories, under the HH PPS
    C. CY 2016 Home Health Rate Update
      1. CY 2016 Home Health Market Basket Update
      2. CY 2016 Home Health Wage Index
      3. CY 2016 Annual Payment Update
    D. Payments for High-Cost Outliers Under the HH PPS
    E. Report to the Congress on the Home Health Study Required by Section 3131(d) of the Affordable Care Act and an Update on Subsequent Research and Analysis
    F. Technical Regulations Text Changes
  IV. Provisions of the Home Health Value-Based Purchasing (HHVBP) Model and Response to Comments
    A. Background
    B. Overview
    C. Selection Methodology
      1. Identifying a Geographic Demarcation Area
      2. Overview of the Randomized Selection Methodology for States
    D. Performance Assessment and Payment Periods
      1. Performance Reports
      2. Payment Adjustment Timeline
    E. Quality Measures
      1. Objectives
      2. Methodology for Selection of Quality Measures
      3. Selected Measures
      4. Additional Information on HHCAHPS
      5. New Measures
      6. HHVBP Model's Four Classifications
      7. Weighting
    F. Performance Scoring Methodology
      1. Performance Calculation Parameters
      2. Considerations for Calculating the Total Performance Score
      3. Additional Considerations for the HHVBP Total Performance Scores
      4. Setting Performance Benchmarks and Thresholds
      5. Calculating Achievement and Improvement Points
      6. Scoring Methodology for New Measures
      7. Minimum Number of Cases for Outcome and Clinical Quality Measures
    G. The Payment Adjustment Methodology
    H. Preview and Period To Request Recalculation
    I. Evaluation
  V. Provisions of the Home Health Care Quality Reporting Program (HH QRP) and Response to Comments
    A. Background and Statutory Authority
    B. General Considerations Used for the Selection of Quality Measures for the HH QRP
    C. HH QRP Quality Measures and Measures Under Consideration for Future Years
    D. Form, Manner, and Timing of OASIS Data Submission and OASIS Data for Annual Payment Update
      1. Statutory Authority
      2. Home Health Quality Reporting Program Requirements for CY 2016 Payment and Subsequent Years
      3. Previously Established Pay-for-Reporting Performance Requirement for Submission of OASIS Quality Data
    E. Home Health Care CAHPS Survey (HHCAHPS)
      1. Background and Description of HHCAHPS
      2. HHCAHPS Oversight Activities
      3. HHCAHPS Requirements for the CY 2016 APU
      4. HHCAHPS Requirements for the CY 2017 APU
      5. HHCAHPS Requirements for the CY 2018 APU
      6. HHCAHPS Reconsideration and Appeals Process
      7. Summary
    F. Public Display of Home Health Quality Data for the HH QRP
  VI. Collection of Information Requirements
  VII. Regulatory Impact Analysis
  VIII. Federalism Analysis

    Acronyms

    In addition, because of the many terms to which we refer by abbreviation in this final rule, we are listing these abbreviations and their corresponding terms in alphabetical order below:

    ACH LOS Acute Care Hospital Length of Stay

    ADL Activities of Daily Living

    APU Annual Payment Update

    BBA Balanced Budget Act of 1997, Pub. L. 105-33

    BBRA Medicare, Medicaid, and SCHIP Balanced Budget Refinement Act of 1999, Pub. L. 106-113

    CAD Coronary Artery Disease

    CAH Critical Access Hospital

    CBSA Core-Based Statistical Area

    CASPER Certification and Survey Provider Enhanced Reports

    CHF Congestive Heart Failure

    CMI Case-Mix Index

    CMP Civil Money Penalty

    CMS Centers for Medicare & Medicaid Services

    CoPs Conditions of Participation

    COPD Chronic Obstructive Pulmonary Disease

    CVD Cardiovascular Disease

    CY Calendar Year

    DM Diabetes Mellitus

    DRA Deficit Reduction Act of 2005, Pub. L. 109-171, enacted February 8, 2006

    FDL Fixed Dollar Loss

    FI Fiscal Intermediaries

    FR Federal Register

    FY Fiscal Year

    HAVEN Home Assessment Validation and Entry System

    HCC Hierarchical Condition Categories

    HCIS Health Care Information System

    HH Home Health

    HHA Home Health Agency

    HHCAHPS Home Health Care Consumer Assessment of Healthcare Providers and Systems Survey

    HH PPS Home Health Prospective Payment System

    HHRG Home Health Resource Group

    HHVBP Home Health Value-Based Purchasing

    HIPPS Health Insurance Prospective Payment System

    HVBP Hospital Value-Based Purchasing

    ICD-9-CM International Classification of Diseases, Ninth Revision, Clinical Modification

    ICD-10-CM International Classification of Diseases, Tenth Revision, Clinical Modification

    IH Inpatient Hospitalization

    IMPACT Act Improving Medicare Post-Acute Care Transformation Act of 2014 (Pub. L. 113-185)

    IRF Inpatient Rehabilitation Facility

    LEF Linear Exchange Function

    LTCH Long-Term Care Hospital

    LUPA Low-Utilization Payment Adjustment

    MEPS Medical Expenditures Panel Survey

    MMA Medicare Prescription Drug, Improvement, and Modernization Act of 2003, Pub. L. 108-173, enacted December 8, 2003

    MSA Metropolitan Statistical Area

    MSS Medical Social Services

    NQF National Quality Forum

    NQS National Quality Strategy

    NRS Non-Routine Supplies

    OASIS Outcome and Assessment Information Set

    OBRA Omnibus Budget Reconciliation Act of 1987, Pub. L. 100-203, enacted December 22, 1987

    OCESAA Omnibus Consolidated and Emergency Supplemental Appropriations Act, Pub. L. 105-277, enacted October 21, 1998

    OES Occupational Employment Statistics

    OIG Office of Inspector General

    OT Occupational Therapy

    OMB Office of Management and Budget

    MFP Multifactor productivity

    PAMA Protecting Access to Medicare Act of 2014

    PAC-PRD Post-Acute Care Payment Reform Demonstration

    PEP Partial Episode Payment Adjustment

    PT Physical Therapy

    PY Performance Year

    PRRB Provider Reimbursement Review Board

    QAP Quality Assurance Plan

    RAP Request for Anticipated Payment

    RF Renal Failure

    RFA Regulatory Flexibility Act, Pub. L. 96-354

    RHHIs Regional Home Health Intermediaries

    RIA Regulatory Impact Analysis

    SAF Standard Analytic File

    SLP Speech-Language Pathology

    SN Skilled Nursing

    SNF Skilled Nursing Facility

    TPS Total Performance Score

    UMRA Unfunded Mandates Reform Act of 1995.

    VBP Value-Based Purchasing

  I. Executive Summary

    A. Purpose

      This final rule will update the payment rates for HHAs for calendar year (CY) 2016, as required under section 1895(b) of the Social Security Act (the Act). This reflects the 3rd year of the 4-year phase-in of the rebasing adjustments to the national, standardized 60-day episode payment rate, the national per-visit rates, and the NRS conversion factor finalized in the CY 2014 HH PPS final rule (78 FR 72256), as required under section 3131(a) of the Patient Protection and Affordable Care Act of 2010 (Pub. L. 111-148), as amended by the Health Care and Education Reconciliation Act of 2010 (Pub. L. 111-152) (collectively referred to as the ``Affordable Care Act'').

      This rule will update the case-mix weights under section 1895(b)(4)(A)(i) and (b)(4)(B) of the Act and provides a clarification regarding the use of the ``initial encounter'' seventh character applicable to certain ICD-10-CM code categories. This final rule will finalize reductions to the national, standardized 60-day episode payment rate in CY 2016, CY 2017, and CY 2018 of 0.97 percent in each year to account for case-mix growth unrelated to increases in patient acuity (nominal case-mix growth) between CY 2012 and CY 2014 under the authority of section 1895(b)(3)(B)(iv) of the Act. In addition, this rule finalizes our proposal to implement an HH Value-Based Purchasing (VBP) model, in which certain Medicare-certified HHAs are required to participate, beginning January 1, 2016 under the authority of section 1115A of the Act. Finally, this rule will finalize changes to the home health quality reporting program requirements under section 1895(b)(3)(B)(v)(II) of the Act and will finalize minor technical regulations text changes in 42 CFR parts 409, 424, and 484 to better align the payment requirements with recent statutory and regulatory changes for home health services.

    B. Summary of the Major Provisions

      As required by section 3131(a) of the Affordable Care Act, and finalized in the CY 2014 HH final rule, ``Medicare and Medicaid Programs; Home Health Prospective Payment System Rate Update for 2014, Home Health Quality Reporting Requirements, and Cost Allocation of Home Health Survey Expenses'' (78 FR 72256, December 2, 2013), we are implementing the 3rd year of the 4-year phase-in of the rebasing adjustments to the national, standardized 60-day episode payment amount, the national per-visit rates, and the NRS conversion factor in section III.C.3. The rebasing adjustments for CY 2016 will reduce the national, standardized 60-day episode payment amount by $80.95, increase the national per-visit payment amounts by 3.5 percent of the national per-visit payment amounts in CY 2010 with the increases ranging from $1.79 for home health aide services to $6.34 for medical social services, and reduce the NRS conversion factor by 2.82 percent.

      In the CY 2015 HH PPS final rule (79 FR 66072), we finalized our proposal to recalibrate the case-mix weights every year with more current data. In section III.B.1 of this rule, we are recalibrating the HH PPS case-mix weights, using the most current cost and utilization data available, in a budget neutral manner. In addition, in section III.B.2 of this rule, we are finalizing reductions to the national, standardized 60-day episode payment rate in CY 2016, CY 2017, and CY 2018 of 0.97 percent in each year to account for estimated case-mix growth unrelated to increases in patient acuity (nominal case-mix growth) between CY 2012 and CY 2014. In section III.B.3 of this rule we are providing a clarification regarding the use of the ``initial encounter'' seventh character, applicable to certain ICD-10-CM code categories, under the HH PPS. In section III.C.1 of this rule, we are updating the payment rates under the HH PPS by the home health payment update percentage of 1.9 percent (using the 2010-based Home Health Agency (HHA) market basket update of 2.3 percent, minus 0.4 percentage point for productivity as required by section 1895(b)(3)(B)(vi)(I) of the Act). In the CY 2015 final rule (79 FR 66083 through 66087), we incorporated new geographic area designations, set out in a February 28, 2013 Office of Management and Budget (OMB) bulletin, into the home health wage index. For CY 2015, we implemented a wage index transition policy consisting of a 50/50 blend of the old geographic area delineations and the new geographic area delineations. In section III.C.2 of this rule, we will update the CY 2016 home health wage index using solely the new geographic area designations. In section III.D of this final rule, we discuss payments for high cost outliers. In section III.E, we are finalizing several technical corrections in 42 CFR parts 409, 424, and 484, to better align the payment requirements with recent statutory and regulatory changes for home health services. The sections include Sec. Sec. 409.43(e), 424.22(a), 484.205(d), 484.205(e), 484.220, 484.225, 484.230, 484.240(b), 484.240(e), 484.240(f), 484.245.
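
      For illustration only, the short Python sketch below restates two computations described above: the CY 2016 home health payment update percentage and the CY 2015 one-year 50/50 wage index blend. The variable names and the sample wage index values are hypothetical.

        # CY 2016 HH payment update: market basket update less the productivity adjustment.
        market_basket_update = 0.023        # 2010-based HHA market basket update (2.3 percent)
        productivity_adjustment = 0.004     # multifactor productivity reduction (0.4 percentage point)
        hh_payment_update = market_basket_update - productivity_adjustment
        print(f"{hh_payment_update:.1%}")   # 1.9%

        # CY 2015 transition wage index: 50/50 blend of the old and new geographic
        # delineations; CY 2016 uses the new delineations alone. Index values are made up.
        old_delineation_index, new_delineation_index = 0.9500, 0.9700
        cy2015_wage_index = 0.5 * old_delineation_index + 0.5 * new_delineation_index
        cy2016_wage_index = new_delineation_index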

      In section IV of this rule, we are finalizing our proposal to implement a HHVBP model that will begin on January 1, 2016. Medicare-certified HHAs selected for inclusion in the HHVBP model will be required to compete for payment adjustments to their current PPS reimbursements based on quality performance. A competing HHA is defined as an agency that has a current Medicare certification and that is being paid by CMS for home health care delivered within any of the states selected in accordance with the HHVBP Model's selection methodology.

      Finally, section V of this rule includes changes to the home health quality reporting program, including one new quality measure, the establishment of a minimum threshold for submission of Outcome and Assessment Information Set (OASIS) assessments for purposes of quality reporting compliance, and submission dates for Home Health Care Consumer Assessment of Healthcare Providers and Systems Survey (HHCAHPS) Survey through CY 2018.

    C. Summary of Costs and Transfers

      Table 1--Summary of Costs and Transfers

      Provision description: CY 2016 HH PPS Payment Rate Update
      Transfers: The overall economic impact of the HH PPS payment rate update is an estimated -$260 million (-1.4 percent) in payments to HHAs.

      Provision description: CY 2016 HHVBP Model
      Transfers: The overall economic impact of the HHVBP model provision for CY 2018 through 2022 is an estimated $380 million in total savings from a reduction in unnecessary hospitalizations and SNF usage as a result of greater quality improvements in the HH industry. As for payments to HHAs, there are no aggregate increases or decreases to the HHAs competing in the model.

  II. Background

    A. Statutory Background

      The Balanced Budget Act of 1997 (BBA) (Pub. L. 105-33, enacted August 5, 1997), significantly changed the way Medicare pays for Medicare HH services. Section 4603 of the BBA mandated the development of the HH PPS. Until the implementation of the HH PPS on October 1, 2000, HHAs received payment under a retrospective reimbursement system.

      Section 4603(a) of the BBA mandated the development of a HH PPS for all Medicare-covered HH services provided under a plan of care (POC) that were paid on a reasonable cost basis by adding section 1895 of the Social Security Act (the Act), entitled ``Prospective Payment For Home Health Services.'' Section 1895(b)(1) of the Act requires the Secretary to establish a HH PPS for all costs of HH services paid under Medicare.

      Section 1895(b)(3)(A) of the Act requires the following: (1) The computation of a standard prospective payment amount include all costs for HH services covered and paid for on a reasonable cost basis and that such amounts be initially based on the most recent audited cost report data available to the Secretary; and (2) the standardized prospective payment amount be adjusted to account for the effects of case-mix and wage levels among HHAs.

      Section 1895(b)(3)(B) of the Act addresses the annual update to the standard prospective payment amounts by the HH applicable percentage increase. Section 1895(b)(4) of the Act governs the payment computation. Sections 1895(b)(4)(A)(i) and (b)(4)(A)(ii) of the Act require the standard prospective payment amount to be adjusted for case-mix and geographic differences in wage levels. Section 1895(b)(4)(B) of the Act requires the establishment of an appropriate case-mix change adjustment factor for significant variation in costs among different units of services.

      Similarly, section 1895(b)(4)(C) of the Act requires the establishment of wage adjustment factors that reflect the relative level of wages, and wage-related costs applicable to HH services furnished in a geographic area compared to the applicable national average level. Under section 1895(b)(4)(C) of the Act, the wage-adjustment factors used by the Secretary may be the factors used under section 1886(d)(3)(E) of the Act.

      Section 1895(b)(5) of the Act gives the Secretary the option to make additions or adjustments to the payment amount otherwise paid in the case of outliers due to unusual variations in the type or amount of medically necessary care. Section 3131(b)(2) of the Patient Protection and Affordable Care Act of 2010 (the Affordable Care Act) (Pub. L. 111-148, enacted March 23, 2010) revised section 1895(b)(5) of the Act so that total outlier payments in a given year would not exceed 2.5 percent of total payments projected or estimated. The provision also made permanent a 10 percent agency-level outlier payment cap.

      In accordance with the statute, as amended by the BBA, we published a final rule in the July 3, 2000 Federal Register (65 FR 41128) to implement the HH PPS legislation. The July 2000 final rule established requirements for the new HH PPS for HH services as required by section 4603 of the BBA, as subsequently amended by section 5101 of the Omnibus Consolidated and Emergency Supplemental Appropriations Act (OCESAA) for Fiscal Year 1999, (Pub. L. 105-277, enacted October 21, 1998); and by sections 302, 305, and 306 of the Medicare, Medicaid, and SCHIP Balanced Budget Refinement Act (BBRA) of 1999, (Pub. L. 106-113, enacted November 29, 1999). The requirements include the implementation of a HH PPS for HH services, consolidated billing requirements, and a number of other related changes. The HH PPS described in that rule replaced the retrospective reasonable cost-based system that was used by Medicare for the payment of HH services under Part A and Part B. For a complete and full description of the HH PPS as required by the BBA, see the July 2000 HH PPS final rule (65 FR 41128 through 41214).

      Page 68627

      Section 5201(c) of the Deficit Reduction Act of 2005 (DRA) (Pub. L. 109-171, enacted February 8, 2006) added new section 1895(b)(3)(B)(v) to the Act, requiring HHAs to submit data for purposes of measuring health care quality, and links the quality data submission to the annual applicable percentage increase. This data submission requirement is applicable for CY 2007 and each subsequent year. If an HHA does not submit quality data, the HH market basket percentage increase is reduced by 2 percentage points. In the November 9, 2006 Federal Register (71 FR 65884, 65935), we published a final rule to implement the pay-for-reporting requirement of the DRA, which was codified at Sec. 484.225(h) and (i) in accordance with the statute. The pay-for-reporting requirement was implemented on January 1, 2007.

      The Affordable Care Act made additional changes to the HH PPS. One of the changes in section 3131 of the Affordable Care Act is the amendment to section 421(a) of the Medicare Prescription Drug, Improvement, and Modernization Act of 2003 (MMA) (Pub. L. 108-173, enacted on December 8, 2003) as amended by section 5201(b) of the DRA. Section 421(a) of the MMA, as amended by section 3131 of the Affordable Care Act, requires that the Secretary increase, by 3 percent, the payment amount otherwise made under section 1895 of the Act, for HH services furnished in a rural area (as defined in section 1886(d)(2)(D) of the Act) with respect to episodes and visits ending on or after April 1, 2010, and before January 1, 2016. Section 210 of the Medicare Access and CHIP Reauthorization Act of 2015 (MACRA) (Public Law 114-10) amended section 421(a) of the MMA to extend the rural add-on for two more years. Section 421(a) of the MMA, as amended by section 210 of the MACRA, requires that the Secretary increase, by 3 percent, the payment amount otherwise made under section 1895 of the Act, for HH services provided in a rural area (as defined in section 1886(d)(2)(D) of the Act) with respect to episodes and visits ending on or after April 1, 2010, and before January 1, 2018.

    B. System for Payment of Home Health Services

      Generally, Medicare makes payment under the HH PPS on the basis of a national standardized 60-day episode payment rate that is adjusted for the applicable case-mix and wage index. The national standardized 60-day episode rate includes the six HH disciplines (skilled nursing, HH aide, physical therapy, speech-language pathology, occupational therapy, and medical social services). Payment for non-routine supplies (NRS) is no longer part of the national standardized 60-day episode rate and is computed by multiplying the relative weight for a particular NRS severity level by the NRS conversion factor (See section II.D.4.e). Payment for durable medical equipment covered under the HH benefit is made outside the HH PPS payment system. To adjust for case-mix, the HH PPS uses a 153-category case-mix classification system to assign patients to a home health resource group (HHRG). The clinical severity level, functional severity level, and service utilization are computed from responses to selected data elements in the OASIS assessment instrument and are used to place the patient in a particular HHRG. Each HHRG has an associated case-mix weight which is used in calculating the payment for an episode.

      For episodes with four or fewer visits, Medicare pays national per-visit rates based on the discipline(s) providing the services. An episode consisting of four or fewer visits within a 60-day period receives what is referred to as a low-utilization payment adjustment (LUPA). Medicare also adjusts the national standardized 60-day episode payment rate for certain intervening events that are subject to a partial episode payment adjustment (PEP adjustment). For certain cases that exceed a specific cost threshold, an outlier adjustment may also be available.
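
      The payment adjustments described in this section can be summarized schematically. The sketch below is a minimal illustration of the decision flow only, not the HH PPS Pricer; the function, its parameters, and any rates passed to it are hypothetical.

        # Schematic of the episode-level adjustments described above (illustrative only).
        def episode_payment(visit_disciplines, case_mix_weight, wage_adjusted_episode_rate,
                            per_visit_rates, pep_fraction=None, outlier_addon=0.0):
            # LUPA: an episode with four or fewer visits is paid national per-visit rates.
            if len(visit_disciplines) <= 4:
                return sum(per_visit_rates[d] for d in visit_disciplines)

            # Otherwise, start from the case-mix- and wage-adjusted 60-day episode rate.
            payment = case_mix_weight * wage_adjusted_episode_rate

            # PEP: certain intervening events prorate the episode payment.
            if pep_fraction is not None:
                payment *= pep_fraction

            # Outlier: cases exceeding a cost threshold may receive an additional amount.
            return payment + outlier_addon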

    C. Updates to the Home Health Prospective Payment System

      As required by section 1895(b)(3)(B) of the Act, we have historically updated the HH PPS rates annually in the Federal Register. The August 29, 2007 final rule with comment period set forth an update to the 60-day national episode rates and the national per-visit rates under the HH PPS for CY 2008. The CY 2008 HH PPS final rule included an analysis performed on CY 2005 HH claims data, which indicated a 12.78 percent increase in the observed case-mix since 2000. Case-mix represents the variations in conditions of the patient population served by the HHAs. Subsequently, a more detailed analysis was performed on the 2005 case-mix data to evaluate if any portion of the 12.78 percent increase was associated with a change in the actual clinical condition of HH patients. We examined data on demographics, family severity, and non-HH Part A Medicare expenditures to predict the average case-mix weight for 2005. We identified 8.03 percent of the total case-mix change as real, and therefore, decreased the 12.78 percent of total case-mix change by 8.03 percent to get a final nominal case-mix increase measure of 11.75 percent (0.1278*(1-0.0803)=0.1175).
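
      The nominal case-mix calculation in the paragraph above can be restated as a one-line formula; the short sketch below simply reproduces that arithmetic.

        # Nominal case-mix growth = total observed growth x (1 - share of the growth judged real).
        total_case_mix_growth = 0.1278      # 12.78 percent observed increase, 2000-2005
        share_judged_real = 0.0803          # 8.03 percent of the change judged real
        nominal_growth = total_case_mix_growth * (1 - share_judged_real)
        print(round(nominal_growth, 4))     # 0.1175, i.e., 11.75 percent
        # The same construction yields 19.03 percent for 2000-2009 (22.59 percent total,
        # 15.76 percent real) and 20.08 percent for 2000-2010 (23.90 percent total,
        # 15.97 percent real), as discussed below.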

      To account for the changes in case-mix that were not related to an underlying change in patient health status, we implemented a reduction, over 4 years, to the national, standardized 60-day episode payment rates. That reduction was to be 2.75 percent per year for 3 years beginning in CY 2008 and 2.71 percent for the fourth year in CY 2011. In the CY 2011 HH PPS final rule (76 FR 68532), we updated our analyses of case-mix change and finalized a reduction of 3.79 percent, instead of 2.71 percent, for CY 2011 and deferred finalizing a payment reduction for CY 2012 until further study of the case-mix change data and methodology was completed.

      In the CY 2012 HH PPS final rule (76 FR 68526), we updated the 60-day national episode rates and the national per-visit rates. In addition, as discussed in the CY 2012 HH PPS final rule (76 FR 68528), our analysis indicated that there was a 22.59 percent increase in overall case-mix from 2000 to 2009 and that only 15.76 percent of that overall observed case-mix percentage increase was due to real case-mix change. As a result of our analysis, we identified a 19.03 percent nominal increase in case-mix. At that time, to fully account for the 19.03 percent nominal case-mix growth identified from 2000 to 2009, we finalized a 3.79 percent payment reduction in CY 2012 and a 1.32 percent payment reduction for CY 2013.

      In the CY 2013 HH PPS final rule (77 FR 67078), we implemented a 1.32 percent reduction to the payment rates for CY 2013 to account for nominal case-mix growth from 2000 through 2010. When taking into account the total measure of case-mix change (23.90 percent) and the 15.97 percent of total case-mix change estimated as real from 2000 to 2010, we obtained a final nominal case-mix change measure of 20.08 percent from 2000 to 2010 (0.2390*(1-0.1597)=0.2008). To fully account for the remainder of the 20.08 percent increase in nominal case-mix beyond that which was accounted for in previous payment reductions, we estimated that the percentage reduction to the national, standardized 60-day episode rates for nominal case-mix change would be 2.18 percent. Although we considered proposing a 2.18 percent reduction to account for the remaining increase in measured nominal case-mix, we finalized the 1.32 percent payment reduction to the national, standardized 60-day episode rates in the CY 2012 HH PPS final rule (76 FR 68532).

      Section 3131(a) of the Affordable Care Act requires that, beginning in CY 2014, we apply an adjustment to the national, standardized 60-day episode rate and other amounts that reflect factors such as changes in the number of visits in an episode, the mix of services in an episode, the level of intensity of services in an episode, the average cost of providing care per episode, and other relevant factors. Additionally, we must phase in any adjustment over a 4-year period in equal increments, not to exceed 3.5 percent of the amount (or amounts) as of the date of enactment of the Affordable Care Act, and fully implement the rebasing adjustments by CY 2017. The statute specifies that the maximum rebasing adjustment is to be no more than 3.5 percent per year of the CY 2010 rates. Therefore, in the CY 2014 HH PPS final rule (78 FR 72256) for each year, CY 2014 through CY 2017, we finalized a fixed-dollar reduction to the national, standardized 60-day episode payment rate of $80.95 per year, increases to the national per-visit payment rates per year as reflected in Table 2, and a decrease to the NRS conversion factor of 2.82 percent per year. We also finalized three separate LUPA add-on factors for skilled nursing, physical therapy, and speech-language pathology and removed 170 diagnosis codes from assignment to diagnosis groups in the HH PPS Grouper. In the CY 2015 HH PPS final rule (79 FR 66032), we implemented the 2nd year of the 4-year phase-in of the rebasing adjustments to the HH PPS payment rates and made changes to the HH PPS case-mix weights. In addition, we simplified the face-to-face encounter regulatory requirements and the therapy reassessment timeframes.

      Table 2--Maximum Adjustments to the National Per-Visit Payment Rates
      [Not to exceed 3.5 percent of the amount(s) in CY 2010]

                                        2010 National per-visit    Maximum adjustments per year
                                        payment rates              (CY 2014 through CY 2017)
      Skilled Nursing.................  $113.01                    $3.96
      Home Health Aide................  51.18                      1.79
      Physical Therapy................  123.57                     4.32
      Occupational Therapy............  124.40                     4.35
      Speech-Language Pathology.......  134.27                     4.70
      Medical Social Services.........  181.16                     6.34
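
      The maximum adjustments shown in Table 2 are 3.5 percent of the CY 2010 national per-visit rates. The sketch below reproduces that arithmetic; rounding to the nearest cent is assumed.

        # 3.5 percent of each CY 2010 national per-visit rate (values from Table 2).
        cy2010_per_visit_rates = {
            "Skilled Nursing": 113.01,
            "Home Health Aide": 51.18,
            "Physical Therapy": 123.57,
            "Occupational Therapy": 124.40,
            "Speech-Language Pathology": 134.27,
            "Medical Social Services": 181.16,
        }
        max_annual_adjustments = {discipline: round(0.035 * rate, 2)
                                  for discipline, rate in cy2010_per_visit_rates.items()}
        # -> 3.96, 1.79, 4.32, 4.35, 4.70, and 6.34, matching the right-hand column of Table 2.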

    D. Advancing Health Information Exchange

      HHS has a number of initiatives designed to encourage and support the adoption of health information technology and to promote nationwide health information exchange to improve health care. As discussed in the August 2013 Statement ``Principles and Strategies for Accelerating Health Information Exchange'' (available at http://www.healthit.gov/sites/default/files/acceleratinghieprinciples_strategy.pdf), HHS believes that all individuals, their families, their healthcare and social service providers, and payers should have consistent and timely access to health information in a standardized format that can be securely exchanged between the patient, providers, and others involved in the individual's care. Health IT that facilitates the secure, efficient, and effective sharing and use of health-related information when and where it is needed is an important tool for settings across the continuum of care, including home health. While home health providers are not eligible for the Medicare and Medicaid EHR Incentive Programs, effective adoption and use of health information exchange and health IT tools will be essential as these settings seek to improve quality and lower costs through initiatives such as value-based purchasing.

      The Office of the National Coordinator for Health Information Technology (ONC) has released a document entitled ``Connecting Health and Care for the Nation: A Shared Nationwide Interoperability Roadmap'' (available at https://www.healthit.gov/sites/default/files/hie-interoperability/nationwide-interoperability-roadmap-final-version-1.0.pdf). In the near term, the Roadmap focuses on actions that will enable individuals and providers across the care continuum to send, receive, find, and use a common set of electronic clinical information at the nationwide level by the end of 2017. The Roadmap's goals also align with the Improving Medicare Post-Acute Care Transformation Act of 2014 (Pub. L. 113-185) (IMPACT Act), which requires assessment data to be standardized and interoperable to allow for exchange of the data. Moreover, the vision described in the draft Roadmap significantly expands the types of electronic health information, information sources, and information users well beyond clinical information derived from electronic health records (EHRs). The Roadmap identifies four critical pathways that health IT stakeholders should focus on now in order to create a foundation for long-term success: (1) Improve technical standards and implementation guidance for priority data domains and associated elements; (2) rapidly shift and align federal, state, and commercial payment policies from fee-for-service to value-based models to stimulate the demand for interoperability; (3) clarify and align federal and state privacy and security requirements that enable interoperability; and (4) align and promote the use of consistent policies and business practices that support interoperability, in coordination with stakeholders. In addition, ONC has released the draft version of the 2016 Interoperability Standards Advisory (available at https://www.healthit.gov/standards-advisory/2016), which provides a list of the best available standards and implementation specifications to enable priority health information exchange functions. Providers, payers, and vendors are encouraged to take these ``best available standards'' into account as they implement interoperable health information exchange across the continuum of care, including care settings such as behavioral health, long-term and post-acute care, and home and community-based service providers.

      We encourage stakeholders to utilize health information exchange and certified health IT to effectively and efficiently help providers improve internal care delivery practices, engage patients in their care, support management of care across the continuum, enable the reporting of electronically specified clinical quality measures (eCQMs), and improve efficiencies and reduce unnecessary costs. As adoption of certified health IT increases and interoperability standards continue to mature, HHS will seek to reinforce standards through relevant policies and programs.

  III. Provisions of the Proposed Rule and Responses to Comments

    We received 118 timely comments from the public. The following sections, arranged by subject area, include a summary of the public comments received, and our responses.

    A. Monitoring for Potential Impacts--Affordable Care Act Rebasing Adjustments

      In the CY 2016 HH PPS proposed rule (80 FR 39840), we provided a summary of analysis conducted on FY 2013 HHA cost report data and how such data, if used, would impact our estimate of the percentage difference between Medicare payments and HHA costs. In addition, we also provided a summary of MedPAC's Report to the Congress on home health payment rebasing and presented information on Medicare home health utilization using CY 2014 HHA claims data (the 1st year of the 4-year phase-in of the rebasing adjustments mandated by section 3131(a) of the Affordable Care Act). We will continue to monitor the impact of future payment and policy changes and will provide the industry with periodic updates on our analysis in future rulemaking and/or announcements on the HHA Center Web page at: https://www.cms.gov/Center/Provider-Type/Home-Health-Agency-HHA-Center.html.

    B. CY 2016 HH PPS Case-Mix Weights and Reduction to the National, Standardized 60-day Episode Payment Rate to Account for Nominal Case-Mix Growth

      1. CY 2016 HH PPS Case-Mix Weights

      For CY 2014, as part of the rebasing effort mandated by the Affordable Care Act, we reset the HH PPS case-mix weights, lowering the average case-mix weight to 1.0000. To lower the HH PPS case-mix weights to 1.0000, each HH PPS case-mix weight was decreased by the same factor (1.3464), thereby maintaining the same relative values between the weights. This ``resetting'' of the HH PPS case-mix weights was done in a budget neutral manner by inflating the national, standardized 60-day episode rate by the same factor (1.3464) that was used to decrease the weights. For CY 2015, we finalized a policy to annually recalibrate the HH PPS case-mix weights--adjusting the weights relative to one another--using the most current, complete data available. To recalibrate the HH PPS case-mix weights for CY 2016, we proposed to use the same methodology finalized in the CY 2008 HH PPS final rule (72 FR 49762), the CY 2012 HH PPS final rule (76 FR 68526), and the CY 2015 HH PPS final rule (79 FR 66032). Annual recalibration of the HH PPS case-mix weights ensures that the case-mix weights reflect, as accurately as possible, current home health resource use and changes in utilization patterns.

      To generate the proposed CY 2016 HH PPS case-mix weights, we used CY 2014 home health claims data (as of December 31, 2014) with linked OASIS data. For this CY 2016 HH PPS final rule, we used CY 2014 home health claims data (as of June 30, 2015) with linked OASIS data to generate the final CY 2016 HH PPS case-mix weights. These data are the most current and complete data available at this time. The tables below have been revised to reflect the results using the updated data. The process we used to calculate the HH PPS case-mix weights is outlined below.

      Step 1: Re-estimate the four-equation model to determine the clinical and functional points for an episode using wage-weighted minutes of care as our dependent variable for resource use. The wage-weighted minutes of care are determined using the Bureau of Labor Statistics national hourly wage (covering May 2014) plus fringe rates (covering December 2014) for the six home health disciplines and the minutes per visit from the claim. The points for each of the variables for each leg of the model, updated with CY 2014 data, are shown in Table 3. The points for the clinical variables are added together to determine an episode's clinical score. The points for the functional variables are added together to determine an episode's functional score.

      [Table 3--Point values for the variables in each equation of the four-equation model, updated with CY 2014 data: graphics TR05NO15.000 through TR05NO15.003 (FR pages 68630 through 68633) are omitted from this text version.]
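
      One plausible reading of the wage-weighting in Step 1 is sketched below; the hourly cost figures are hypothetical placeholders, not the BLS wage and fringe values actually used.

        # Illustrative wage-weighted minutes of care: each visit's minutes are weighted
        # by an assumed per-minute labor cost (hourly wage plus fringe, divided by 60).
        hourly_cost_with_fringe = {          # dollars per hour, hypothetical
            "skilled_nursing": 45.00,
            "physical_therapy": 48.00,
            "home_health_aide": 15.00,
        }

        def wage_weighted_minutes(visits):
            """visits: list of (discipline, minutes) pairs taken from the claim."""
            return sum(minutes * hourly_cost_with_fringe[discipline] / 60.0
                       for discipline, minutes in visits)

        episode_resource_use = wage_weighted_minutes(
            [("skilled_nursing", 45), ("physical_therapy", 60), ("home_health_aide", 120)])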

      In updating the four-equation model for CY 2016 using 2014 data (the last update to the four-equation model for CY 2015 used 2013 data), there were few changes to the point values for the variables in the four-equation model. These relatively minor changes reflect the change in the relationship between the grouper variables and resource use between 2013 and 2014. The CY 2016 four-equation model resulted in 124 point-giving variables being used in the model (as compared to the 120 point-giving variables for the 2015 recalibration). There were eight variables that were added to the model and four variables that were dropped from the model due to the absence of additional resources associated with the variable. The points for 24 variables increased in the CY 2016 four-equation model and the points for 38 variables decreased in the CY 2016 4-equation model. There were 54 variables with the same point values.

      Step 2: Re-define the clinical and functional thresholds so they are reflective of the new points associated with the CY 2016 four-equation model. After estimating the points for each of the variables and summing the clinical and functional points for each episode, we look at the distribution of the clinical score and functional score, breaking the episodes into different steps. The categorizations for the steps are as follows:

      Step 1: First and second episodes, 0-13 therapy visits.

      Step 2.1: First and second episodes, 14-19 therapy visits.

      Step 2.2: Third episodes and beyond, 14-19 therapy visits.

      Step 3: Third episodes and beyond, 0-13 therapy visits.

      Step 4: Episodes with 20+ therapy visits.

      We then divide the distribution of the clinical score for episodes within a step such that a third of episodes are classified as low clinical score, a third of episodes are classified as medium clinical score, and a third of episodes are classified as high clinical score. The same approach is then applied to the functional score. It was not always possible to evenly divide the episodes within each step into thirds due to many episodes being clustered around one particular score.\1\ Also, we looked at the average resource use associated with each clinical and functional score and used that to guide where we placed our thresholds. We tried to group scores with similar average resource use within the same level (even if it meant that more or fewer than a third of episodes were placed within a level). The new thresholds, based on the CY 2016 four-equation model points, are shown in Table 4.

      ---------------------------------------------------------------------------

      \1\ For Step 1, 54% of episodes were in the medium functional level (All with score 15). For Step 2.1, 77.2% of episodes were in the low functional level (Most with score 2 and 4). For Step 2.2, 67.1% of episodes were in the low functional level (All with score 0). For Step 3, 60.9% of episodes were in the medium functional level (Most with score 10). For Step 4, 49.8% of episodes were in the low functional level (Most with score 2).
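
      The threshold-setting idea can be illustrated with a small sketch; the sample scores are made up, and it assumes the cut points start near the one-third and two-thirds points of the within-step score distribution, with episodes sharing a score kept in the same level.

        # Candidate low/high cut points for a step's score distribution (illustrative only;
        # CMS also shifts cut points so scores with similar average resource use share a level).
        def candidate_cutpoints(scores):
            ordered = sorted(scores)
            n = len(ordered)
            return ordered[n // 3], ordered[(2 * n) // 3]

        low_cut, high_cut = candidate_cutpoints([0, 1, 2, 2, 2, 3, 4, 5, 7, 9, 12, 15])
        # low_cut = 2, high_cut = 7: in this toy data, scores of 2 or less would be "low",
        # 3 through 7 "medium", and anything higher "high".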

      Page 68634

      Table 4--CY 2016 Clinical and Functional Thresholds

                                           1st and 2nd Episodes               3rd+ Episodes                      All episodes
                                           0 to 13         14 to 19           0 to 13         14 to 19           20+ therapy
                                           therapy visits  therapy visits     therapy visits  therapy visits     visits
      Grouping step.....................   1               2.1                3               2.2                4
      Equation(s) used to calculate
        points (see Table 3)............   1               2                  3               4                  (2&4)
      Clinical severity level:
        C1..............................   0 to 1          0 to 1             0               0 to 3             0 to 3
        C2..............................   2 to 3          2 to 7             1               4 to 12            4 to 16
        C3..............................   4+              8+                 2+              13+                17+
      Functional severity level:
        F1..............................   0 to 14         0 to 6             0 to 6          0                  0 to 2
        F2..............................   15              7 to 13            7 to 10         1 to 7             3 to 6
        F3..............................   16+             14+                11+             8+                 7+

      Step 3: Once the clinical and functional thresholds are determined and each episode is assigned a clinical and functional level, the payment regression is estimated with an episode's wage-weighted minutes of care as the dependent variable. Independent variables in the model are indicators for the step of the episode as well as the clinical and functional levels within each step of the episode. Like the four-equation model, the payment regression model is also estimated with robust standard errors that are clustered at the beneficiary level. Table 5 shows the regression coefficients for the variables in the payment regression model updated with CY 2014 data. The R-squared value for the payment regression model is 0.4822 (an increase from 0.4680 for the CY 2015 recalibration).

      Table 5--Payment Regression Model

      ------------------------------------------------------------------------

      Variable description                                     New payment regression coefficients

      ------------------------------------------------------------------------

      Step 1, Clinical Score Medium........................... $24.69

      Step 1, Clinical Score High............................. $59.72

      Step 1, Functional Score Medium......................... $76.46

      Step 1, Functional Score High........................... $114.89

      Step 2.1, Clinical Score Medium......................... $68.55

      Step 2.1, Clinical Score High........................... $156.28

      Step 2.1, Functional Score Medium....................... $34.15

      Step 2.1, Functional Score High......................... $87.13

      Step 2.2, Clinical Score Medium......................... $61.06

      Step 2.2, Clinical Score High........................... $211.40

      Step 2.2, Functional Score Medium....................... $10.90

      Step 2.2, Functional Score High......................... $70.39

      Step 3, Clinical Score Medium........................... $10.27

      Step 3, Clinical Score High............................. $91.72

      Step 3, Functional Score Medium......................... $56.53

      Step 3, Functional Score High........................... $87.94

      Step 4, Clinical Score Medium........................... $72.66

      Step 4, Clinical Score High............................. $238.69

      Step 4, Functional Score Medium......................... $15.65

      Step 4, Functional Score High........................... $65.68

      Step 2.1, 1st and 2nd Episodes, 14 to 19 Therapy Visits. $479.21

      Step 2.2, 3rd+ Episodes, 14 to 19 Therapy Visits........ $505.35

      Step 3, 3rd+ Episodes, 0-13 Therapy Visits.............. -$76.20

      Step 4, All Episodes, 20+ Therapy Visits................ $930.06

      Intercept............................................... $391.33

      ------------------------------------------------------------------------

      Source: CY 2014 Medicare claims data for episodes ending on or before December 31, 2014 (as of June 30, 2015) for which we had a linked OASIS assessment.

      Step 4: We use the coefficients from the payment regression model to predict each episode's wage-weighted minutes of care (resource use). We then divide these predicted values by the mean of the dependent variable (that is, the average wage-weighted minutes of care across all episodes used in the payment regression). This division constructs the weight for each episode, which is simply the ratio of the episode's predicted wage-weighted minutes of care divided by the average wage-weighted minutes of care in the sample. Each episode is then aggregated into one of the 153 home health resource groups (HHRGs) and the ``raw'' weight for each HHRG was calculated as the average of the episode weights within the HHRG.
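
      Steps 3 and 4 can be illustrated with the Table 5 coefficients. In the sketch below, the Step 1 episode takes only the intercept plus its clinical and functional coefficients (Step 1 appears to be the reference category, since Table 5 lists step indicators only for Steps 2.1, 2.2, 3, and 4); the sample mean is a hypothetical placeholder.

        # Predicted resource use for a first or second episode with 0-13 therapy visits
        # (Step 1), medium clinical score, high functional score, using Table 5 values.
        predicted_resource_use = 391.33 + 24.69 + 114.89   # intercept + clinical + functional = 530.91

        # Step 4: the raw episode weight is the prediction divided by the sample mean
        # wage-weighted minutes of care (the mean value below is made up).
        mean_resource_use = 500.00
        raw_episode_weight = predicted_resource_use / mean_resource_use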

      Step 5: The weights associated with 0 to 5 therapy visits are then increased by 3.75 percent, the weights associated with 14-15 therapy visits are decreased by 2.5 percent, and the weights associated with 20+ therapy visits are decreased by 5 percent. These adjustments to the case-mix weights were finalized in the CY 2012 HH PPS final rule (76 FR 68557) and were done to address MedPAC's concerns that the HH PPS overvalues therapy episodes and undervalues non-therapy episodes and to better align the case-mix weights with episode costs estimated from cost report data.\2\

      ---------------------------------------------------------------------------

      \2\ Medicare Payment Advisory Commission (MedPAC), Report to the Congress: Medicare Payment Policy. March 2011, P. 176.

      ---------------------------------------------------------------------------

      Step 6: After the adjustments in step 5 are applied to the raw weights, the weights are further adjusted to create an increase in the payment weights for the therapy visit steps between the therapy thresholds. Weights with the same clinical severity level, functional severity level, and early/later episode status were grouped together. Then within those groups, the weights for each therapy step between thresholds are gradually increased. We do this by interpolating between the main thresholds on the model (from 0-5 to 14-15 therapy visits, and from 14-15 to 20+ therapy visits). We use a linear model to implement the interpolation so that the payment weight increase for each step between the thresholds (such as the increase between 0-5 therapy visits and 6 therapy visits and the increase between 6 therapy visits and 7-9 therapy visits) is constant. This interpolation is identical to the process finalized in the CY 2012 HH PPS final rule (76 FR 68555).
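
      The Step 5 adjustments and the Step 6 interpolation can be sketched as follows; the weights passed to these functions are placeholders, and only the percentages and the equal-increment idea come from the text.

        # Step 5: adjust the raw weights at the therapy thresholds.
        def apply_step5(weight, therapy_visits):
            if therapy_visits <= 5:
                return weight * 1.0375    # 0-5 therapy visits: +3.75 percent
            if 14 <= therapy_visits <= 15:
                return weight * 0.975     # 14-15 therapy visits: -2.5 percent
            if therapy_visits >= 20:
                return weight * 0.95      # 20+ therapy visits: -5 percent
            return weight

        # Step 6: weights for the therapy steps between two thresholds rise in equal increments.
        def interpolate(lower_weight, upper_weight, steps_between):
            increment = (upper_weight - lower_weight) / (steps_between + 1)
            return [lower_weight + increment * (i + 1) for i in range(steps_between)]

        # interpolate(0.5908, 1.2351, 4) roughly reproduces the intermediate C1F1
        # first/second-episode weights in Table 6 (0.7197, 0.8485, 0.9774, 1.1063),
        # with small differences due to rounding.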

      Step 7: The interpolated weights are then adjusted so that the average case-mix for the weights is equal to 1.0000.\3\ This last step creates the CY 2016 case-mix weights shown in Table 6.

      ---------------------------------------------------------------------------

      \3\ When computing the average, we compute a weighted average, assigning a value of one to each normal episode and a value equal to the episode length divided by 60 for PEPs.
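
      A minimal sketch of the Step 7 normalization, following footnote 3's weighting of episodes (normal episodes count once, PEP episodes count as episode length divided by 60); the data structures are assumptions for illustration.

        # Scale the interpolated weights so the episode-weighted average equals 1.0000.
        def normalize_weights(hhrg_weights, episodes):
            """hhrg_weights: dict mapping HHRG -> interpolated weight.
            episodes: list of (hhrg, is_pep, episode_length_days) tuples."""
            def count(is_pep, length_days):
                return length_days / 60.0 if is_pep else 1.0
            total = sum(count(p, d) for _, p, d in episodes)
            average = sum(hhrg_weights[g] * count(p, d) for g, p, d in episodes) / total
            return {g: w / average for g, w in hhrg_weights.items()}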

      Table 6: Final CY 2016 Case-Mix Payment Weights

      Payment group   Step (episode and/or therapy visit ranges)   Clinical and functional levels (1 = Low; 2 = Medium; 3 = High)   Final CY 2016 case-mix weights
      10111   1st and 2nd Episodes, 0 to 5 Therapy Visits     C1F1S1   0.5908
      10112   1st and 2nd Episodes, 6 Therapy Visits          C1F1S2   0.7197
      10113   1st and 2nd Episodes, 7 to 9 Therapy Visits     C1F1S3   0.8485
      10114   1st and 2nd Episodes, 10 Therapy Visits         C1F1S4   0.9774
      10115   1st and 2nd Episodes, 11 to 13 Therapy Visits   C1F1S5   1.1063
      10121   1st and 2nd Episodes, 0 to 5 Therapy Visits     C1F2S1   0.7062
      10122   1st and 2nd Episodes, 6 Therapy Visits          C1F2S2   0.8217
      10123   1st and 2nd Episodes, 7 to 9 Therapy Visits     C1F2S3   0.9372
      10124   1st and 2nd Episodes, 10 Therapy Visits         C1F2S4   1.0527
      10125   1st and 2nd Episodes, 11 to 13 Therapy Visits   C1F2S5   1.1681
      10131   1st and 2nd Episodes, 0 to 5 Therapy Visits     C1F3S1   0.7643
      10132   1st and 2nd Episodes, 6 Therapy Visits          C1F3S2   0.8832
      10133   1st and 2nd Episodes, 7 to 9 Therapy Visits     C1F3S3   1.0021
      10134   1st and 2nd Episodes, 10 Therapy Visits         C1F3S4   1.1210
      10135   1st and 2nd Episodes, 11 to 13 Therapy Visits   C1F3S5   1.2399
      10211   1st and 2nd Episodes, 0 to 5 Therapy Visits     C2F1S1   0.6281
      10212   1st and 2nd Episodes, 6 Therapy Visits          C2F1S2   0.7690
      10213   1st and 2nd Episodes, 7 to 9 Therapy Visits     C2F1S3   0.9098
      10214   1st and 2nd Episodes, 10 Therapy Visits         C2F1S4   1.0507
      10215   1st and 2nd Episodes, 11 to 13 Therapy Visits   C2F1S5   1.1915
      10221   1st and 2nd Episodes, 0 to 5 Therapy Visits     C2F2S1   0.7435
      10222   1st and 2nd Episodes, 6 Therapy Visits          C2F2S2   0.8710
      10223   1st and 2nd Episodes, 7 to 9 Therapy Visits     C2F2S3   0.9985
      10224   1st and 2nd Episodes, 10 Therapy Visits         C2F2S4   1.1259
      10225   1st and 2nd Episodes, 11 to 13 Therapy Visits   C2F2S5   1.2534
      10231   1st and 2nd Episodes, 0 to 5 Therapy Visits     C2F3S1   0.8016
      10232   1st and 2nd Episodes, 6 Therapy Visits          C2F3S2   0.9325
      10233   1st and 2nd Episodes, 7 to 9 Therapy Visits     C2F3S3   1.0633
      10234   1st and 2nd Episodes, 10 Therapy Visits         C2F3S4   1.1942
      10235   1st and 2nd Episodes, 11 to 13 Therapy Visits   C2F3S5   1.3251
      10311   1st and 2nd Episodes, 0 to 5 Therapy Visits     C3F1S1   0.6810
      10312   1st and 2nd Episodes, 6 Therapy Visits          C3F1S2   0.8362
      10313   1st and 2nd Episodes, 7 to 9 Therapy Visits     C3F1S3   0.9913
      10314   1st and 2nd Episodes, 10 Therapy Visits         C3F1S4   1.1465
      10315   1st and 2nd Episodes, 11 to 13 Therapy Visits   C3F1S5   1.3017
      10321   1st and 2nd Episodes, 0 to 5 Therapy Visits     C3F2S1   0.7964
      10322   1st and 2nd Episodes, 6 Therapy Visits          C3F2S2   0.9382
      10323   1st and 2nd Episodes, 7 to 9 Therapy Visits     C3F2S3   1.0800
      10324   1st and 2nd Episodes, 10 Therapy Visits         C3F2S4   1.2218
      10325   1st and 2nd Episodes, 11 to 13 Therapy Visits   C3F2S5   1.3635
      10331   1st and 2nd Episodes, 0 to 5 Therapy Visits     C3F3S1   0.8544
      10332   1st and 2nd Episodes, 6 Therapy Visits          C3F3S2   0.9996
      10333   1st and 2nd Episodes, 7 to 9 Therapy Visits     C3F3S3   1.1449
      10334   1st and 2nd Episodes, 10 Therapy Visits         C3F3S4   1.2901
      10335   1st and 2nd Episodes, 11 to 13 Therapy Visits   C3F3S5   1.4353
      21111   1st and 2nd Episodes, 14 to 15 Therapy Visits   C1F1S1   1.2351
      21112   1st and 2nd Episodes, 16 to 17 Therapy Visits   C1F1S2   1.4323
      21113   1st and 2nd Episodes, 18 to 19 Therapy Visits   C1F1S3   1.6296
      21121   1st and 2nd Episodes, 14 to 15 Therapy Visits   C1F2S1   1.2836
      21122   1st and 2nd Episodes, 16 to 17 Therapy Visits   C1F2S2   1.4719
      21123   1st and 2nd Episodes, 18 to 19 Therapy Visits   C1F2S3   1.6601
      21131   1st and 2nd Episodes, 14 to 15 Therapy Visits   C1F3S1   1.3588
      21132   1st and 2nd Episodes, 16 to 17 Therapy Visits   C1F3S2   1.5450
      21133   1st and 2nd Episodes, 18 to 19 Therapy Visits   C1F3S3   1.7313
      21211   1st and 2nd Episodes, 14 to 15 Therapy Visits   C2F1S1   1.3324
      21212   1st and 2nd Episodes, 16 to 17 Therapy Visits   C2F1S2   1.5307
      21213   1st and 2nd Episodes, 18 to 19 Therapy Visits   C2F1S3   1.7289
      21221   1st and 2nd Episodes, 14 to 15 Therapy Visits   C2F2S1   1.3809
      21222   1st and 2nd Episodes, 16 to 17 Therapy Visits   C2F2S2   1.5702
      21223   1st and 2nd Episodes, 18 to 19 Therapy Visits   C2F2S3   1.7595
      21231   1st and 2nd Episodes, 14 to 15 Therapy Visits   C2F3S1   1.4560
      21232   1st and 2nd Episodes, 16 to 17 Therapy Visits   C2F3S2   1.6434
      21233   1st and 2nd Episodes, 18 to 19 Therapy Visits   C2F3S3   1.8307
      21311   1st and 2nd Episodes, 14 to 15 Therapy Visits   C3F1S1   1.4569
      21312   1st and 2nd Episodes, 16 to 17 Therapy Visits   C3F1S2   1.6902
      21313   1st and 2nd Episodes, 18 to 19 Therapy Visits   C3F1S3   1.9234
      21321   1st and 2nd Episodes, 14 to 15 Therapy Visits   C3F2S1   1.5053

      15 Therapy Visits.

      21322........................ 1st and 2nd Episodes, 16 to C3F2S2 1.7297

      17 Therapy Visits.

      21323........................ 1st and 2nd Episodes, 18 to C3F2S3 1.9540

      19 Therapy Visits.

      21331........................ 1st and 2nd Episodes, 14 to C3F3S1 1.5805

      15 Therapy Visits.

      21332........................ 1st and 2nd Episodes, 16 to C3F3S2 1.8028

      17 Therapy Visits.

      21333........................ 1st and 2nd Episodes, 18 to C3F3S3 2.0252

      19 Therapy Visits.

      22111........................ 3rd+ Episodes, 14 to 15 C1F1S1 1.2722

      Therapy Visits.

      22112........................ 3rd+ Episodes, 16 to 17 C1F1S2 1.4571

      Therapy Visits.

      22113........................ 3rd+ Episodes, 18 to 19 C1F1S3 1.6419

      Therapy Visits.

      22121........................ 3rd+ Episodes, 14 to 15 C1F2S1 1.2877

      Therapy Visits.

      22122........................ 3rd+ Episodes, 16 to 17 C1F2S2 1.4746

      Therapy Visits.

      22123........................ 3rd+ Episodes, 18 to 19 C1F2S3 1.6615

      Therapy Visits.

      22131........................ 3rd+ Episodes, 14 to 15 C1F3S1 1.3721

      Therapy Visits.

      22132........................ 3rd+ Episodes, 16 to 17 C1F3S2 1.5539

      Therapy Visits.

      22133........................ 3rd+ Episodes, 18 to 19 C1F3S3 1.7357

      Therapy Visits.

      22211........................ 3rd+ Episodes, 14 to 15 C2F1S1 1.3589

      Therapy Visits.

      22212........................ 3rd+ Episodes, 16 to 17 C2F1S2 1.5483

      Therapy Visits.

      22213........................ 3rd+ Episodes, 18 to 19 C2F1S3 1.7378

      Therapy Visits.

      22221........................ 3rd+ Episodes, 14 to 15 C2F2S1 1.3743

      Therapy Visits.

      22222........................ 3rd+ Episodes, 16 to 17 C2F2S2 1.5658

      Therapy Visits.

      22223........................ 3rd+ Episodes, 18 to 19 C2F2S3 1.7573

      Therapy Visits.

      22231........................ 3rd+ Episodes, 14 to 15 C2F3S1 1.4587

      Therapy Visits.

      22232........................ 3rd+ Episodes, 16 to 17 C2F3S2 1.6452

      Therapy Visits.

      22233........................ 3rd+ Episodes, 18 to 19 C2F3S3 1.8316

      Therapy Visits.

      22311........................ 3rd+ Episodes, 14 to 15 C3F1S1 1.5722

      Therapy Visits.

      22312........................ 3rd+ Episodes, 16 to 17 C3F1S2 1.7670

      Therapy Visits.

      22313........................ 3rd+ Episodes, 18 to 19 C3F1S3 1.9619

      Therapy Visits.

      22321........................ 3rd+ Episodes, 14 to 15 C3F2S1 1.5876

      Therapy Visits.

      22322........................ 3rd+ Episodes, 16 to 17 C3F2S2 1.7845

      Therapy Visits.

      22323........................ 3rd+ Episodes, 18 to 19 C3F2S3 1.9815

      Therapy Visits.

      22331........................ 3rd+ Episodes, 14 to 15 C3F3S1 1.6721

      Therapy Visits.

      22332........................ 3rd+ Episodes, 16 to 17 C3F3S2 1.8639

      Therapy Visits.

      22333........................ 3rd+ Episodes, 18 to 19 C3F3S3 2.0557

      Therapy Visits.

      30111........................ 3rd+ Episodes, 0 to 5 Therapy C1F1S1 0.4758

      Visits.

      30112........................ 3rd+ Episodes, 6 Therapy C1F1S2 0.6351

      Visits.

      30113........................ 3rd+ Episodes, 7 to 9 Therapy C1F1S3 0.7944

      Visits.

      30114........................ 3rd+ Episodes, 10 Therapy C1F1S4 0.9536

      Visits.

      30115........................ 3rd+ Episodes, 11 to 13 C1F1S5 1.1129

      Therapy Visits.

      30121........................ 3rd+ Episodes, 0 to 5 Therapy C1F2S1 0.5611

      Visits.

      30122........................ 3rd+ Episodes, 6 Therapy C1F2S2 0.7064

      Visits.

      30123........................ 3rd+ Episodes, 7 to 9 Therapy C1F2S3 0.8518

      Visits.

      30124........................ 3rd+ Episodes, 10 Therapy C1F2S4 0.9971

      Visits.

      30125........................ 3rd+ Episodes, 11 to 13 C1F2S5 1.1424

      Therapy Visits.

      30131........................ 3rd+ Episodes, 0 to 5 Therapy C1F3S1 0.6085

      Visits.

      30132........................ 3rd+ Episodes, 6 Therapy C1F3S2 0.7613

      Visits.

      30133........................ 3rd+ Episodes, 7 to 9 Therapy C1F3S3 0.9140

      Visits.

      30134........................ 3rd+ Episodes, 10 Therapy C1F3S4 1.0667

      Visits.

      30135........................ 3rd+ Episodes, 11 to 13 C1F3S5 1.2194

      Therapy Visits.

      30211........................ 3rd+ Episodes, 0 to 5 Therapy C2F1S1 0.4913

      Visits.

      30212........................ 3rd+ Episodes, 6 Therapy C2F1S2 0.6648

      Visits.

      30213........................ 3rd+ Episodes, 7 to 9 Therapy C2F1S3 0.8383

      Visits.

      30214........................ 3rd+ Episodes, 10 Therapy C2F1S4 1.0118

      Visits.

      30215........................ 3rd+ Episodes, 11 to 13 C2F1S5 1.1854

      Therapy Visits.

      Page 68637

      30221........................ 3rd+ Episodes, 0 to 5 Therapy C2F2S1 0.5766

      Visits.

      30222........................ 3rd+ Episodes, 6 Therapy C2F2S2 0.7362

      Visits.

      30223........................ 3rd+ Episodes, 7 to 9 Therapy C2F2S3 0.8957

      Visits.

      30224........................ 3rd+ Episodes, 10 Therapy C2F2S4 1.0553

      Visits.

      30225........................ 3rd+ Episodes, 11 to 13 C2F2S5 1.2148

      Therapy Visits.

      30231........................ 3rd+ Episodes, 0 to 5 Therapy C2F3S1 0.6241

      Visits.

      30232........................ 3rd+ Episodes, 6 Therapy C2F3S2 0.7910

      Visits.

      30233........................ 3rd+ Episodes, 7 to 9 Therapy C2F3S3 0.9579

      Visits.

      30234........................ 3rd+ Episodes, 10 Therapy C2F3S4 1.1249

      Visits.

      30235........................ 3rd+ Episodes, 11 to 13 C2F3S5 1.2918

      Therapy Visits.

      30311........................ 3rd+ Episodes, 0 to 5 Therapy C3F1S1 0.6143

      Visits.

      30312........................ 3rd+ Episodes, 6 Therapy C3F1S2 0.8058

      Visits.

      30313........................ 3rd+ Episodes, 7 to 9 Therapy C3F1S3 0.9974

      Visits.

      30314........................ 3rd+ Episodes, 10 Therapy C3F1S4 1.1890

      Visits.

      30315........................ 3rd+ Episodes, 11 to 13 C3F1S5 1.3806

      Therapy Visits.

      30321........................ 3rd+ Episodes, 0 to 5 Therapy C3F2S1 0.6996

      Visits.

      30322........................ 3rd+ Episodes, 6 Therapy C3F2S2 0.8772

      Visits.

      30323........................ 3rd+ Episodes, 7 to 9 Therapy C3F2S3 1.0548

      Visits.

      30324........................ 3rd+ Episodes, 10 Therapy C3F2S4 1.2324

      Visits.

      30325........................ 3rd+ Episodes, 11 to 13 C3F2S5 1.4100

      Therapy Visits.

      30331........................ 3rd+ Episodes, 0 to 5 Therapy C3F3S1 0.7470

      Visits.

      30332........................ 3rd+ Episodes, 6 Therapy C3F3S2 0.9320

      Visits.

      30333........................ 3rd+ Episodes, 7 to 9 Therapy C3F3S3 1.1170

      Visits.

      30334........................ 3rd+ Episodes, 10 Therapy C3F3S4 1.3020

      Visits.

      30335........................ 3rd+ Episodes, 11 to 13 C3F3S5 1.4870

      Therapy Visits.

      40111........................ All Episodes, 20+ Therapy C1F1S1 1.8268

      Visits.

      40121........................ All Episodes, 20+ Therapy C1F2S1 1.8484

      Visits.

      40131........................ All Episodes, 20+ Therapy C1F3S1 1.9176

      Visits.

      40211........................ All Episodes, 20+ Therapy C2F1S1 1.9272

      Visits.

      40221........................ All Episodes, 20+ Therapy C2F2S1 1.9488

      Visits.

      40231........................ All Episodes, 20+ Therapy C2F3S1 2.0180

      Visits.

      40311........................ All Episodes, 20+ Therapy C3F1S1 2.1567

      Visits.

      40321........................ All Episodes, 20+ Therapy C3F2S1 2.1784

      Visits.

      40331........................ All Episodes, 20+ Therapy C3F3S1 2.2475

      Visits.

      ----------------------------------------------------------------------------------------------------------------

      To ensure the changes to the HH PPS case-mix weights are implemented in a budget neutral manner, we apply a case-mix budget neutrality factor to the CY 2016 national, standardized 60-day episode payment rate (see section III.C.3. of this final rule). The case-mix budget neutrality factor is calculated as the ratio of total payments when the CY 2016 HH PPS grouper and case-mix weights (developed using CY 2014 claims data) are applied to CY 2014 utilization (claims) data to total payments when the CY 2015 HH PPS grouper and case-mix weights (developed using CY 2013 claims data) are applied to CY 2014 utilization data. Using CY 2014 claims data as of December 31, 2014, we calculated the case-mix budget neutrality factor for CY 2016 to be 1.0141. Updating our analysis with 2014 claims data as of June 30, 2015, we calculated a final case-mix budget neutrality factor for CY 2016 of 1.0187.
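
      The calculation of the case-mix budget neutrality factor can be traced with the short, illustrative Python sketch below. The payment simulation itself (grouper assignment, wage adjustment, outlier policy) is not shown; the aggregate payment totals used here are hypothetical placeholder values chosen only so that the ratio reproduces the final factor of 1.0187.

def case_mix_budget_neutrality_factor(payments_new_weights, payments_old_weights):
    # Ratio of simulated total payments when the CY 2016 grouper and case-mix
    # weights are applied to CY 2014 claims to simulated total payments when
    # the CY 2015 grouper and case-mix weights are applied to the same claims.
    return payments_new_weights / payments_old_weights

# Hypothetical aggregate payment totals (dollars); not values from this rule.
payments_cy2016_weights = 10_187_000_000
payments_cy2015_weights = 10_000_000_000

print(round(case_mix_budget_neutrality_factor(payments_cy2016_weights,
                                              payments_cy2015_weights), 4))  # 1.0187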

      The following is a summary of the comments and our responses to comments on the CY 2016 case-mix weights.

      Comment: One commenter noted that the case-mix weights were increased 3.75 percent for 0-5 therapy visits, decreased by 2.5 percent for 14-15 therapy visits, and decreased 5 percent for 20+ therapy visits to address MedPAC's concerns that the therapy episodes are overvalued and non-therapy episodes are undervalued, but stated that a therapist's salary and benefits costs are higher than those same costs for nursing, due to the overall market for therapists and the greater difficulty in retaining them in the home health environment versus other health care settings. Additionally, the commenter noted that patients requiring 20+ therapy visits typically have functional deficits in multiple domains, requiring the expertise of multiple therapy disciplines (PT/OT/ST) to address, justifying the higher case mix.

      Response: As we noted in the CY 2015 HH PPS final rule, these adjustments to the case-mix weights are the same adjustments finalized in the CY 2012 HH PPS final rule (76 FR 68557). As the commenter correctly noted, these adjustments were made, in part, to address MedPAC's concerns that the HH PPS overvalues therapy episodes and undervalues non-therapy episodes (March 2011 MedPAC Report to the Congress: Medicare Payment Policy, p.176). However, we further note that these adjustments also better aligned the case-mix weights with episode costs estimated from cost report data (79 FR 66061).

      Comment: One commenter stated that they are pleased that CMS used updated claims and cost data to recalibrate all of the case-mix weights. However, the commenter went on to state that they were somewhat confused that high-therapy episodes tend to get increased case-mix weights, even though CMS has stated its intention that therapy visit volume should have less impact on the weights. One commenter noted that CMS did not provide sufficient transparency of the details and methods used to recalibrate the HH PPS case-mix weights in its discussion in the proposed rule. In addition, the commenter stated that CMS provided little justification for recalibrating the case-mix weights just 1 year following the recalibration of case-mix weights in CY 2015 and a mere 3 years since the recalibration for the CY 2012 HH PPS final rule. The commenter noted that this proposed recalibration reduces the case-mix weights for 117 HHRGs, or 76 percent of the 153 HHRGs. Another commenter stated that analysis of the case-mix weight changes from 2014 through 2016 indicates an average decrease of 1.52 percent in each HIPPS code weight. The commenter stated that they believe that these changes alone have produced an overall decrease in the case-mix scoring of episodes since 2013. Specifically, applying the 2016 case-mix weights to the HHA's 2014 episodes would produce a decrease in overall case-mix weight of 4.7 percent, and from 2014-2016, the overall case-mix weight was reduced by 7.2 percent for certain HIPPS codes.

      Page 68638

      Response: As stated in the CY 2015 HH PPS final rule, the methodology used to recalibrate the weights is identical to the methodology used in the CY 2012 recalibration, with the minor exceptions noted in the CY 2015 HH PPS proposed and final rules (79 FR 38366 and 79 FR 66032). We encourage commenters to refer to the CY 2012 HH PPS proposed and final rules (76 FR 40988 and 76 FR 68526) and the CY 2012 technical report on our home page at: https://www.cms.gov/Center/Provider-Type/Home-Health-Agency-HHA-Center.html for additional information about the recalibration methodology.

      As we noted in the CY 2015 HH PPS final rule (79 FR 66067), the decreases in the case-mix weights for the low therapy case-mix groups and the increases in the case-mix weights for the high therapy case-mix groups are generally attributable to a shift away from the use of home health aides toward either more nursing or more therapy care across all therapy groups. While some of the low therapy groups did add more skilled nursing visits, most of the high therapy groups added more occupational therapy (OT) and speech-language pathology (SLP), which have substantially higher Bureau of Labor Statistics (BLS) average hourly wage values compared to skilled nursing. In addition, while the average number of total visits per episode has decreased overall, it decreased disproportionately more for the no/low therapy case-mix groups. These utilization changes result in the changes to the weights observed by the commenter, specifically, the decreases in the case-mix weights for the low or no therapy groups and the increases in the case-mix weights for the high therapy groups.

      Comparing the final CY 2016 HH PPS case-mix weights (Table 5) to the final CY 2015 HH PPS case-mix weights (79 FR 66062), the case-mix weights change very little, with most case-mix weights either increasing or decreasing by 1 to 2 percent and no case-mix weights increasing by more than 3 percent or decreasing by more than 4 percent. The aggregate decreases in the case-mix weights are offset by the case-mix budget neutrality factor, which is applied to the national, standardized 60-day episode payment rate. In other words, although the case-mix weights themselves may increase or decrease from year to year, we correspondingly offset any estimated decreases in total payments under the HH PPS, as a result of the case-mix recalibration, by applying a budget neutrality factor to the national, standardized 60-day episode payment rate. For CY 2016, the case-mix budget neutrality factor results in a 1.87 percent increase, as described above. For CY 2015, the case-mix budget neutrality factor resulted in a 3.66 percent increase (79 FR 66088). In addition, when the CY 2014 case-mix weights were reset to 1.0000 by decreasing the case-mix weights by 1.3464, we correspondingly increased the national, standardized 60-day episode payment rate by the same factor (1.3464) as part of the rebasing of the HH PPS payment rates required by the Affordable Care Act (78 FR 72273). The recalibration of the case-mix weights is not intended to increase or decrease overall HH PPS payments, but rather is used to update the relative differences in resource use amongst the 153 groups in the HH PPS case-mix system and maintain the level of aggregate payments before application of any other adjustments.
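
      As an illustration of the offset described above, the following sketch uses made-up numbers (the episode rate, the average weights, and the resulting factor are all hypothetical) to show that raising the episode rate by the budget neutrality factor leaves the average payment per episode unchanged when the average case-mix weight falls.

# Hypothetical illustration of how a case-mix budget neutrality factor offsets
# a recalibration: if the recalibrated weights are lower on average, the
# episode rate is raised by the same proportion so aggregate payments are
# unchanged. All numbers below are made up for illustration.

episode_rate = 2_900.00      # hypothetical national, standardized 60-day episode rate
avg_weight_before = 1.0000   # hypothetical average case-mix weight before recalibration
avg_weight_after = 0.9817    # hypothetical average weight after recalibration

budget_neutrality_factor = avg_weight_before / avg_weight_after  # about 1.0186
adjusted_rate = episode_rate * budget_neutrality_factor

# Average payment per episode is the same before and after the recalibration:
print(round(episode_rate * avg_weight_before, 2))
print(round(adjusted_rate * avg_weight_after, 2))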

      Final Decision: We will finalize the recalibration of the HH PPS case-mix weights as proposed. The CY 2016 scores for the case-mix variables, the clinical and functional thresholds, and the case-mix weights were developed using complete CY 2014 claims data as of June 30, 2015. We note that we finalized the recalibration methodology and the proposal to annually recalibrate the HH PPS case-mix weights in the CY 2015 HH PPS final rule (79 FR 66072). No additional proposals were made with regard to the recalibration methodology in the CY 2016 HH PPS proposed rule.

      2. Reduction to the National, Standardized 60-day Episode Payment Rate to Account for Nominal Case-Mix Growth

      Section 1895(b)(3)(B)(iv) of the Act gives the Secretary the authority to implement payment reductions for nominal case-mix growth (that is, case-mix growth unrelated to changes in patient acuity). Previously, we accounted for nominal case-mix growth through case-mix reductions implemented from 2008 through 2013 (76 FR 68528-68543). As stated in the 2013 final rule, the goal of the reductions for nominal case-mix growth is to better align payments with real changes in patient severity (77 FR 67077). Our analysis of data from CY 2000 through CY 2010 found that only 15.97 percent of the total case-mix change was real and 84.03 percent of total case-mix change was nominal (77 FR 41553). In the CY 2015 HH PPS final rule (79 FR 66032), we estimated that total case-mix increased by 2.76 percent between CY 2012 and CY 2013 and, in applying the 15.97 percent estimate of real case-mix growth to the estimate of total case-mix growth, we estimated nominal case-mix growth to be 2.32 percent (2.76 - (2.76 x 0.1597)). However, for 2015, we did not implement a reduction to the 2015 national, standardized 60-day episode payment amount to account for nominal case-mix growth, but stated that we would continue to monitor case-mix growth and may consider proposing nominal case-mix reductions in the future. Since the publication of the CY 2015 HH PPS final rule (79 FR 66032), MedPAC reported on their assessment of the impact of the mandated rebasing adjustments on quality of and beneficiary access to home health care as required by section 3131(a) of the Affordable Care Act. As noted in section III.A.2 of the proposed rule, MedPAC concluded that quality of care and beneficiary access to care are unlikely to be negatively affected by the rebasing adjustments. For the proposed rule, we further estimated that case-mix increased by 1.41 percent between CY 2013 and CY 2014 using preliminary CY 2014 home health claims data (as of December 31, 2014) with linked OASIS data. In applying the 15.97 percent estimate of real case-mix growth to the total estimated case-mix growth from CY 2013 to CY 2014 (1.41 percent), we estimated nominal case-mix growth to be 1.18 percent (1.41 - (1.41 x 0.1597)). Given the observed nominal case-mix growth of 2.32 percent in 2013 and 1.18 percent in 2014, we estimated that the reduction to offset the nominal case-mix growth for these 2 years would be 3.41 percent (1 - 1/(1.0232 x 1.0118) = 0.0341).

      We proposed to implement this 3.41 percent reduction in equal increments over 2 years. Specifically, we proposed to apply a 1.72 percent reduction (1 - 1/(1.0232 x 1.0118)^(1/2) = 0.0172) to the national, standardized 60-day episode payment rate each year for 2 years, CY 2016 and CY 2017, under the ongoing authority of section 1895(b)(3)(B)(iv) of the Act. In the proposed rule, we noted that the proposed reductions to the national, standardized 60-day episode payment rate in CY 2016 and in CY 2017 to account for nominal case-mix growth are separate from the rebasing adjustments finalized in CY 2014 under section 1895(b)(3)(A)(iii) of the Act, which were calculated using CY 2012 claims and CY 2011 HHA cost report data (the most current, complete data at the time of the CY 2014 HH PPS proposed and final rules).

      Page 68639
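
      For readers who wish to trace the proposed-rule arithmetic, the short sketch below reproduces the figures quoted above; the variable names are ours, and the only inputs are the rounded percentages stated in the text.

REAL_SHARE = 0.1597  # share of total case-mix growth estimated to be real

total_2012_2013 = 2.76  # total case-mix growth, CY 2012 to CY 2013 (percent)
total_2013_2014 = 1.41  # total case-mix growth, CY 2013 to CY 2014 (percent)

nominal_2013 = total_2012_2013 * (1 - REAL_SHARE)  # 2.32 percent
nominal_2014 = total_2013_2014 * (1 - REAL_SHARE)  # 1.18 percent

combined = (1 + nominal_2013 / 100) * (1 + nominal_2014 / 100)
full_reduction = 1 - 1 / combined                     # 0.0341 (3.41 percent total)
per_year_over_2_years = 1 - 1 / combined ** (1 / 2)   # 0.0172 (1.72 percent per year)

print(round(nominal_2013, 2), round(nominal_2014, 2))             # 2.32 1.18
print(round(full_reduction, 4), round(per_year_over_2_years, 4))  # 0.0341 0.0172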

      In updating our analysis for the final rule and in reassessing our methodology in response to comments, as discussed further below in this section, we used a more familiar methodology (one used in the past) to measure case-mix growth. We first calculated the average case-mix index for 2012, 2013, and 2014, and then compared the CY 2012 average to the CY 2013 average and the CY 2013 average to the CY 2014 average to calculate the total case-mix growth between those years. To make the comparison between the 2013 average case-mix index and the 2014 average case-mix index, we inflated the 2014 average case-mix index (multiplied it by 1.3464) to offset the decrease by that same factor when the CY 2014 case-mix weights were reset to 1.0000 in the CY 2014 HH PPS final rule (78 FR 72256). This approach is more familiar than the one used for the CY 2015 HH PPS final rule and the CY 2016 HH PPS proposed rule, in which we instead simulated total payments using case-mix weights from 2 consecutive years (the simulation used to calculate the case-mix budget neutrality factor when recalibrating the case-mix weights) and isolated the portion of the budget neutrality factor that was due to changes in case-mix. Calculating the average case-mix index in a given year, and comparing indices across years, better aligns with how CMS historically measured case-mix growth and is a methodology that was thoroughly vetted in previous rulemaking. In addition, we believe that this more familiar methodology results in a more straightforward measure of case-mix growth between 2012 and 2014, given that annual recalibration of the case-mix weights did not begin until CY 2015.

      Using this methodology, we estimate that the average case-mix for 2012 was 1.3610 and that the average case-mix for 2013 was 1.3900.\4\ Dividing the average case-mix for 2013 by the average case-mix for 2012, we obtain a total case-mix growth estimate from 2012 to 2013 of 2.13 percent (1.3900/1.3610 = 1.0213), compared to 2.76 percent in the proposed rule. We estimate that the average case-mix for 2014 was 1.0465. We note that in 2014, we decreased all of the case-mix weights uniformly by 1.3464. Therefore, in order to make a comparison between the 2014 average case-mix weight and the 2013 average case-mix weight, we multiplied the 1.0465 estimate by 1.3464 (1.0465 x 1.3464 = 1.4090). We then divided the average case-mix for 2014 by the average case-mix for 2013 to obtain a total case-mix growth estimate from 2013 to 2014 of 1.37 percent (1.4090/1.3900 = 1.0137), compared to 1.41 percent in the proposed rule.

      ---------------------------------------------------------------------------

      \4\ We include outlier episodes in the calculation along with normal episodes and PEPs. We note that the case-mix for PEP episodes is weighted downward based on the length of the home health episode.

      ---------------------------------------------------------------------------

      Using the 2.13 percent estimate of total case-mix growth between CY 2012 and CY 2013, we estimate nominal case-mix growth to be 1.79 percent (2.13 - (2.13 x 0.1597) = 1.79). Similarly, using the 1.37 percent estimate of total case-mix growth between CY 2013 and CY 2014, we estimate nominal case-mix growth to be 1.15 percent (1.37 - (1.37 x 0.1597) = 1.15). Using the updated estimates of case-mix growth between 2012 and 2013 and between 2013 and 2014, we estimate that the reduction to the national, standardized 60-day episode payment rate needed to offset the nominal case-mix growth from 2012 through 2014 would be 2.88 percent (1 - 1/(1.0179 x 1.0115) = 0.0288). If we finalized the 2-year phase-in described in the proposed rule, we would need to implement a reduction of 1.45 percent to the national, standardized 60-day episode payment rate each year for 2 years, CY 2016 and CY 2017, to account for nominal case-mix growth from 2012 through 2014 (1 - 1/(1.0179 x 1.0115)^(1/2) = 0.0145).
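
      The final-rule arithmetic in the two preceding paragraphs can be reproduced with the following minimal sketch, assuming only the rounded average case-mix indices and the 15.97 percent real-growth share quoted above.

REAL_SHARE = 0.1597
RESET_FACTOR = 1.3464  # factor by which the CY 2014 weights were reset to average 1.0000

avg_cmi_2012 = 1.3610
avg_cmi_2013 = 1.3900
avg_cmi_2014 = 1.0465 * RESET_FACTOR  # about 1.4090, restated on the pre-reset scale

total_2012_2013 = avg_cmi_2013 / avg_cmi_2012 - 1  # about 0.0213 (2.13 percent)
total_2013_2014 = avg_cmi_2014 / avg_cmi_2013 - 1  # about 0.0137 (1.37 percent)

nominal_2013 = round(total_2012_2013 * (1 - REAL_SHARE), 4)  # 0.0179 (1.79 percent)
nominal_2014 = round(total_2013_2014 * (1 - REAL_SHARE), 4)  # 0.0115 (1.15 percent)

combined = (1 + nominal_2013) * (1 + nominal_2014)  # 1.0179 x 1.0115
print(round(1 - 1 / combined, 4))             # 0.0288 total reduction
print(round(1 - 1 / combined ** (1 / 2), 4))  # 0.0145 per year over two years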

      In the CY 2016 HH PPS proposed rule, we solicited comments on the proposed reduction to the national, standardized 60-day episode payment amount in CY 2016 and in CY 2017 to account for nominal case-mix growth from CY 2012 through CY 2014 and the associated changes in the regulations text at Sec. 484.220 in section VII. The following is a summary of the comments and our responses.

      Comment: MedPAC supported the proposed case-mix reductions and stated that the Commission has long held that it is necessary for CMS to make adjustments to account for nominal case-mix growth to prevent overpayments.

      Response: We thank MedPAC for their support.

      Comment: Several commenters expressed concern with the methodology used to determine case-mix growth from CY 2012 to CY 2014 and the portion of such growth that is nominal versus real. Specifically, commenters stated that the percent change in real case-mix used to calculate the proposed nominal case-mix reductions is not reflective of the real case-mix growth between 2012 and 2014. Commenters stated that patients are entering into home health at a much higher acuity level than in previous years and cited a number of statistics to support their statements. Commenters also disagreed with the use of the percent change in real case-mix used in the case-mix reduction calculations as it was based on data from 2000-2010 and applied to the total case-mix growth from 2012 to 2014. They stated that no adjustments should be considered until CMS conducts a thorough analysis of real and nominal changes in case mix through evaluation of changes that occurred during the actual years of concern (2012-2014) with respect to the proposed adjustment and any adjustments that might be considered in future years. They further stated that CMS should have the data and tools to perform an updated analysis of the percentage of real versus nominal case-mix growth between 2012 and 2014 and they noted that the historical analyses conducted by CMS demonstrate that the level of ``nominal'' case-mix weight change is not consistent from year to year. While some commenters urged CMS to update its analysis to determine the percentage of real versus nominal case-mix growth for CY 2012 through CY 2014, other commenters stated that out of the 921 variables used in such analyses, there are only four drivers of real case-mix growth and implied that CMS' analysis was not reliable or comprehensive enough. Some commenters stated that the adjustments to payments should be based on current data informed by clinical evaluation. Finally, one commenter stated that CMS should not implement the proposed case-mix reductions and not propose any additional case-mix reductions in the future.

      Page 68640

      Response: We believe the percent change in real case-mix used in the case-mix reduction calculations, which is based on analysis of 2000 through 2010 data, is a stable proxy for the real case-mix growth between 2012 and 2014. Our analysis of data has not indicated that real case-mix change between 2012 and 2014 is greater than the change in real case-mix between 2000 and 2010. In fact, our analysis of claims data has shown a decrease in the number of total visits per episode between 2012 and 2014. Furthermore, our analysis of 2012 and 2013 cost report data showed that the cost per episode has decreased each year.

      In addition, we note that there is prior precedent for applying historical estimates of real case-mix growth to more current data to set payment rates. In the rate year (RY) 2008 and the RY 2009 LTCH final rules, an estimate of the percentage of real case-mix growth from a prior time period was applied to the total case-mix growth from FY 2004 to FY 2005 and from FY 2005 to FY 2006 in determining the RY 2008 and RY 2009 federal rate updates (72 FR 26889 and 73 FR 26805).

      With regard to the recommendation that the estimates should be informed by clinical evaluation, we note that CMS' case-mix change model, developed by Abt Associates, only includes a few variables that are derived from OASIS assessments (measures of patient living arrangement) because the OASIS items can be affected by changes in coding practices. It is not practical to consider other types of home health clinical data (for example, from medical charts) in the model given the resources available.

      We note that as a result of the comments we received expressing concerns about our methodology and questioning the case-mix growth estimates we presented in the proposed rule, we did re-evaluate the methodology to determine total case-mix growth and are moving forward with a more familiar, and slightly more accurate, methodology (one used in the past) to measure case-mix growth (as described above). The methodology results in the calculation of a 1.45 percent reduction each year in CY 2016 and CY 2017 to account for nominal case-mix growth from 2012 to 2014 (instead of the 1.72 percent reduction described in the CY 2016 proposed rule).

      Comment: A commenter stated that their analyses suggest that all of the historical increases have been driven by increased therapy utilization that is, in turn, based on real needs of the patients. A commenter stated that the technical analyses used to conclude that case-mix increases are generally ``not real'' have been based on the non-case-mix variables and that those non-case-mix variables were found to have a lower explanatory value. The commenter expressed concerns with CMS' exclusion of the therapy variables in the model to assess real case-mix, stating that those have the highest explanatory power. The commenter asked that CMS address this question in the final rule to better inform their understanding of its conclusions as to how ``real'' versus ``nominal'' determinations are made.

      Response: The models to assess real and nominal case-mix growth were intended to analyze changes in case-mix over time and do not distinguish whether these changes are due to increases in therapy use or other factors. We do not believe that it would be appropriate to include utilization-related variables, such as the number of therapy visits, as predictors in the model, as such variables are provider-determined. In addition, the goal of these analyses was to examine changes in measures of patient acuity that are not affected by any changes in provider coding practices. For example, the models do incorporate information about change in the types of patients more likely to use therapy, such as post-acute joint replacement patients. We encourage commenters to review the Analysis of 2000-2009 Home Health Case-Mix Change Report, available on the HHA center page at: https://www.cms.gov/Center/Provider-Type/Home-Health-Agency-HHA-Center.html, in order to better understand the models used to assess real and nominal case-mix growth.

      Comment: A number of commenters encouraged CMS to seek payment system reforms that are value-based rather than implementing payment reductions.

      Response: The Home Health Value-Based Purchasing (HHVBP) model will be implemented January 1, 2016, as described in section IV of this final rule. However, the reductions to account for nominal case-mix growth are necessary to prevent overpayments due to coding practices that led to increases in payment that are not related to real increases in patient acuity.

      Comment: Commenters referenced section 1895(b)(3)(B)(iv) of the Act, stating that there has not been an increase in aggregate payments that would justify the proposed reductions, and that CMS should withdraw its proposal. Commenters stated that there was a decrease in spending from 2010 through 2013 and questioned how nominal case-mix growth could have increased during the time period. Another commenter stated that Medicare data for 2012 to 2014 appear to indicate that the per-episode payment during this period actually fell below the level that would have occurred as a result of any up-coding, even though CMS estimates that case-mix up-coding occurred. Commenters stated that no payment reductions should be implemented unless CMS could demonstrate that Medicare spending on home health services exceeded the Congressional Budget Office's (CBO) forecasted spending.

      Response: We have no statutory authority to consider the relationship of CBO projections to home health outlays when setting the HH PPS payment rates. The Secretary's authority to respond to nominal coding change is set out at section 1895(b)(3)(B)(iv) of the Act. In addition, the reference to ``a change in aggregate payments'' in that provision does not mean that overall expenditures under the HH PPS need to increase in order to implement reductions for nominal case-mix growth. We would also like to note that a decrease in expenditures does not mean that there has been no case-mix growth. The case-mix growth during this time period may have offset the decrease in expenditures that might have otherwise occurred.

      Comment: Commenters stated that the recent recalibrations have eliminated the nominal case-mix growth observed from 2012 through 2014. Furthermore, commenters stated that the removal of certain ICD-9-CM codes included in the HH PPS Grouper for CY 2014 addressed, in part, nominal case-mix growth from 2012 through 2014. Commenters stated that CMS should fully evaluate the impact of the recalibration on case-mix growth and publicly disclose the information.

      Response: While the recent recalibrations (starting in CY 2015) may help to reduce future nominal case-mix growth, the proposed reductions address the nominal case-mix growth from 2012 through 2014, prior to recent efforts to annually recalibrate the HH PPS case-mix weights. The reductions to account for nominal case-mix growth ensure that payments are not inflated by case-mix changes unrelated to patient severity that occurred from 2012 through 2014. This remains important even in years when we are annually recalibrating the case-mix weights. When CMS recalibrates the case-mix weights, a budget neutrality factor is applied to the national, standardized 60-day episode payment rate to ensure that the recalibration results in the same aggregate expenditures as the current payment weights. For the recalibration of the weights in this rule, the budget neutrality factor is applied to the CY 2016 national, standardized 60-day episode payment rate to ensure that the recalibrated case-mix weights result in the same aggregate expenditures as the current CY 2015 payment weights (simulating payments using CY 2014 utilization data, the most current and complete data available at this time). If there is nominal case-mix growth in the data used to recalibrate the case-mix weights, that nominal case-mix growth is built into the national, standardized 60-day episode rate through the budget neutrality factor. Thus, nominal case-mix growth in a given year could result in increases to the national, standardized 60-day episode payment rate that would otherwise not have occurred, and future adjustments may be needed to better align payment with patient severity.

      Page 68641

      In measuring case-mix growth, we factored the removal of the ICD-9-CM codes from the CY 2014 HH PPS Grouper into our assessment of case-mix growth from 2013 to 2014. We used the 2013 grouper and 2013 case-mix weights to calculate the average case-mix index for 2013. Then we used the 2014 grouper, which excluded ICD-9-CM codes found to be rarely used and/or not associated with resource use increases, and the 2014 case-mix weights, to calculate the average case-mix index for 2014. Comparing the 2013 average case-mix index to the 2014 average case-mix index (multiplied by 1.3464 in order to make the comparison), we obtained an estimate of case-mix growth which factors in the removal of the ICD-9 codes. We estimated 1.37 percent growth in total case-mix even after taking out the ICD-9-CM codes in 2014. We will continue to monitor case-mix growth and may examine the effects of the annual recalibrations on future case-mix growth.

      Comment: Some commenters questioned why the 2012 recalibration did not have a budget neutrality adjustment.

      Response: The 2012 recalibration was implemented in a budget neutral manner. While a budget neutrality factor was not applied to the national, standardized 60-day episode payment rate, we did apply a budget neutrality factor to the weights to ensure that the recalibration was implemented in a budget neutral manner (76 FR 68555).

      Comment: A few commenters stated that CMS did not take into consideration any probable coding effect in the transition from ICD-9-CM to ICD-10-CM. The commenters stated that it is highly likely that a decrease in productivity will occur due to the implementation of ICD-10-CM. Commenters also stated that it is highly likely that ICD-10-CM will result in coding inaccuracies, which, in turn, will lower average case mix. The commenters encouraged CMS to reconsider this large negative adjustment and at least postpone it until additional information and study results are available. A commenter stated that, in addition to ICD-10-CM implementation, HHAs are simultaneously facing increased costs due to the implementation of the new Department of Labor (DOL) rule on minimum wage and overtime for companionship providers.

      Response: We note that providers have been aware of the transition from ICD-9-CM to ICD-10-CM for some time. The original implementation date for ICD-10-CM was October 1, 2013 (74 FR 3328). Therefore, the increase in costs due to the ICD-10-CM transition should be reflected in the latest cost report data we examined for the rebasing monitoring analyses in the proposed rule (that is, CY 2013 cost report data). In that analysis we found that an even greater reduction to HHA payments would need to occur to better align payments with costs than is currently allowed under section 1895(b)(3)(A)(iii) of the Act (80 FR 39845). We will continue to analyze HHA Medicare cost report data and monitor case-mix growth in future rulemaking and may consider revising payments accordingly.

      Comment: Many commenters stated that their individual home health agencies have consistently had case-mix that was below the national average and, therefore, would be disproportionately impacted. Commenters suggested that CMS develop program integrity measures to address provider-specific up-coding rather than implementing the across-the-board reductions. A commenter suggested the program integrity efforts could be performed through the Recovery Audit Contractors (RACs). Another commenter suggested that CMS re-introduce the Medicare review procedures of the past in both the clinical and financial operations of home health, with monetary penalties and/or recoupments based on those reviews. A third commenter stated that CMS should continue utilizing the existing fraud and abuse prevention processes to identify and target specific agencies that have excessive profit margins rather than impose the across-the-board reductions for all agencies, and that CMS should use its enforcement authority to conduct targeted claims reviews and deny payment for claims where the case-mix weight is not supported by the plan of care rather than cut the national, standardized episode rate for all agencies.

      One commenter stated that the Medicare Administrative Contractors (MACs) are tasked with finding instances of inappropriate coding and that the industry should not be penalized for inappropriate coding that the MACs were unable to find. The commenter also stated that the proposed reductions are a ``double whammy'' because the claims that were identified as erroneously billed have already been adjusted and any identified overpayments have been recovered and that CMS is attempting to recover even more than what was in error through the proposed reductions. In addition, the commenter questioned why there have not been more denials if there has been widespread up-coding, as suggested by CMS' analysis.

      Response: For a variety of reasons, as we have noted in previous regulations, we have not proposed targeted reductions for nominal case-mix change. The foremost reason is that we believe changes and improvements in coding have been widespread, so that such targeting would likely not separate agencies clearly into high and low coding-change groups. When performing an independent review of our case-mix measurement methodology, Dr. David Grabowski, Ph.D., a professor of health care policy at Harvard Medical School, and his team agreed with our reasons for not proposing targeted reductions, stating their concerns about the small sample size of many agencies and their findings of significant nominal case-mix growth across different classes of agencies (please see the ``Home Health Study Report--Independent Review of the Models to Assess Nominal Case-Mix Growth'', dated June 21, 2011, located at: https://www.cms.gov/Center/Provider-Type/Home-Health-Agency-HHA-Center.html).

      While certain commenters seem to assume that CMS can precisely identify those agencies practicing abusive coding, we do not agree that agency-specific case-mix levels can precisely distinguish the agencies that engage in abusive coding from all others. System-wide, case-mix levels have risen over time throughout the country, while patient characteristics data indicate little real change in patient severity over time. That is, the main problem is not the level of case-mix billed by any specific HHA over a period of time, but the amount of change in the billed case-mix weights not attributable to underlying changes in actual patient severity. We note that we have taken various measures to reduce payment vulnerabilities and the federal government has launched actions to directly identify fraudulent and abusive activities. Commenters should be aware of tip lines available that can help support investigative efforts of the federal government. The Office of the Inspector General, Department of Health and Human Services Web site at: http://oig.hhs.gov/fraud/report-fraud/index.asp, provides information about how to report fraud. Another Web site, http://www.stopmedicarefraud.gov/index.html, is oriented to Medicare patients and their families and provides information about recognizing fraud.

      Page 68642

      In terms of recoupments that correspond to claims denied after review, such recoupments would typically be reflected in the claims data we used in our case-mix analysis. Where a paid-claim dispute is still active, the volume is so low that the data would likely have little to no effect on our determination of nominal case-mix growth. In addition, while we appreciate the commenters' suggestion, targeted claim review on the scale that would be required to counteract the broad-based uptrend in case-mix weights would be resource-intensive and not feasible.

      Comment: Some commenters stated that the additional payment reductions for nominal case-mix growth are based on a subset of the same factors used to determine the rebasing adjustment, such as the ``intensity of services'' factor. The commenters stated that the use of an earlier legislative authority to justify an additional type of reduction above the legislative cap on rebasing adjustments is contrary to congressional intent. The commenters urged CMS to adhere to the limits on home health rate rebasing established by Congress and recommended that CMS evaluate the impact of the rebasing adjustments and consult with Congress before considering additional reductions. Other commenters stated that CMS should provide a comprehensive explanation as to why it has not determined that the 2014 rate rebasing effectively eliminated the impact of any alleged nominal case mix weight change that may have occurred in 2013 and 2014. Commenters recommended that CMS should hold off on imposing the adjustments until the completion of the rebasing in 2017. Alternatively, the commenters recommended phasing-in the proposed reductions over more years. A commenter stated that this approach would be more consistent with approaches used by the agency to implement similar rate reductions in the IPPS and would soften the impact for those agencies whose case-mix growth was due to changes in patient acuity. Another commenter stated that CMS should do further analysis including validation that no element of the proposed coding cut would duplicate reductions already accounted for in the rebasing adjustments. Another commenter requested that CMS provide a discussion of the interaction of the rebasing adjustments and the recalibration of case weights on the purported nominal case mix growth, stating that they believed that the rebasing and recalibration of case weights addressed any nominal case mix growth at that time.

      Response: The rebasing adjustments proposed and finalized for CY 2014 through CY 2017 were based on 2011 cost report data and 2012 claims data. We compared payment and costs using 2011 cost data and 2012 claims data and therefore, we did not account for any nominal case-mix growth from 2012 to 2014 in the methodology. Specifically, using the 2011 cost data, we estimated a 2013 60-day episode cost by increasing the 2011 60-day episode cost by the change in the visit data between 2011 and 2012 and the full 2012 and 2013 market baskets. We calculated payments by taking the 2012 national, standardized 60-day payment amount and updating it by the average case-mix weight for 2012 as well as updating the estimate based on the payment policies implemented in CY 2013 to estimate average payments in 2013. In the rebasing methodology, we did not factor in future projections of nominal case-mix growth from 2012 to 2014 in our analysis. As stated previously, the nominal case-mix reductions would allow us to account for nominal case-mix growth from 2012 through 2014 and mitigate structural overpayments.

      While resetting the weights to 1.0000 and doing annual recalibrations may potentially reduce future nominal case-mix growth, it does not offset the nominal case-mix growth previously unaccounted for, particularly for those last few years before annual recalibrations began. We note that there is a two-year lag between the data used to recalibrate the case-mix weights and the year that the weights are implemented, and we use the same claims data when comparing payments and developing the budget neutrality factor. If the utilization in the claims data is too high, it is built into the payments for both the future year's case-mix weights and the previous year's case-mix weights on which the recalibration is based, and so that increased utilization ends up being carried forward. In other words, the recalibration adjusts for the next year's case-mix change as compared to the previous one, but, barring additional action, will not (even in future years) adjust for unaccounted nominal case-mix growth already built into the system.

      With regard to the commenters' concerns about congressional intent, we do not believe that application of the case-mix adjustment is contrary to congressional intent. We have received input from stakeholders and appreciate their comments but believe our final policy is within the authority under the statute and is consistent with congressional intent. Moreover, this policy reflects our goal to better align Medicare reimbursement with real changes in patient severity. With regard to the comment about phasing-in the reductions over more years, we note that in response to comments, we are phasing-in the case-mix reductions over 3 years (CY 2016, CY 2017, and CY 2018) rather than the 2 years (CY 2016 and CY 2017) described in the proposed rule. Specifically, we will be finalizing a 0.97 percent reduction each year in CY 2016, CY 2017, and CY 2018 to account for nominal case-mix growth from CY 2012 through CY 2014 (1 - 1/(1.0179 x 1.0115)^(1/3) = 0.0097). Iteratively implementing the case-mix reduction over three years gives home health agencies more time to adjust to the intended reduction of 2.88 percent than would be the case were we to account for the nominal case-mix growth in two years.
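
      The arithmetic behind the finalized three-year phase-in can be checked with the brief sketch below; the only inputs are the rounded nominal growth factors stated above.

combined_nominal_growth = 1.0179 * 1.0115  # CY 2012-2013 and CY 2013-2014 nominal case-mix growth

full_offset = 1 - 1 / combined_nominal_growth                        # about 0.0288 (2.88 percent)
per_year_over_3_years = 1 - 1 / combined_nominal_growth ** (1 / 3)   # about 0.0097 (0.97 percent)

print(round(full_offset, 4), round(per_year_over_3_years, 4))        # 0.0288 0.0097

# A 0.97 percent reduction taken in each of CY 2016, CY 2017, and CY 2018
# compounds to the same total offset as a single 2.88 percent reduction:
print(round(1 - (1 - per_year_over_3_years) ** 3, 4))                # 0.0288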

      Comment: Commenters stated that the proposed case-mix reductions would disproportionately affect hospital-based agencies and that hospital-based HHAs' Medicare margins have been negative for the past few years. A commenter stated that hospital-based HHAs treat more severe patients than freestanding HHAs. Another commenter recommended that CMS consider the differences in case-mix across the types of HHAs and regions.

      Response: Hospital-based HHAs comprise less than 10 percent of all home health agencies in our impact analysis (see section VII of this final rule). As stated in their March 2011 Report to Congress, MedPAC focuses on freestanding agencies because they are the majority of providers and because their costs do not reflect the sort of allocation of overhead costs seen in facility-based providers' Medicare cost reports, such as hospital-based HHAs' Medicare cost reports. MedPAC explains that in the case of hospitals, which often provide services that are paid for by multiple Medicare payment systems, measures of payments and costs for an individual sector could become distorted because of the allocation of overhead costs or complementarities of services. In addition, MedPAC has reported negative Medicare margins for hospital-based HHAs since at least 2005,\5\ even though freestanding HHA Medicare margins have been around or over 15 percent. We question how hospital-based HHAs can still be operating after several years with negative Medicare margins and whether those HHAs have incentives to report negative Medicare margins (such as cost shifting/allocation by hospitals amongst their various units).

      Page 68643

      ---------------------------------------------------------------------------

      \5\ Medicare Payment Advisory Commission (MedPAC), Report to the Congress: Medicare Payment Policy. March 2007, P. 194.

      ---------------------------------------------------------------------------

      In their March 2009 Report to the Congress, MedPAC stated that hospital-based providers have a lower case-mix index, which suggests that they serve less costly patients.\6\ Similarly, we also examined the average case-mix index for freestanding versus facility-based HHAs in CY 2014 and found that hospital-based HHAs had an average case-mix index that was approximately 6 percent lower than freestanding HHAs. However, the report on the independent review of the model used to assess real case-mix growth, performed by Dr. David Grabowski from Harvard University, stated ``. . . when we re-ran the Abt model by ownership type (non-profit, government, for-profit), agency type (facility-based, freestanding), region of the country (north, south, Midwest, west), agency size (large vs. small; based on number of initial episodes) and agency focus (post-acute versus community-dwelling), the results suggest that--although there is some variation--a consistent percentage of the growth in case-mix is nominal growth. As such, these results do not provide much support for adjusting payments by classes of agencies.'' The ``Home Health Study Report--Independent Review of the Models to Assess Nominal Case-Mix Growth'', dated June 21, 2011, is located on our homepage at: https://www.cms.gov/Center/Provider-Type/Home-Health-Agency-HHA-Center.html.

      ---------------------------------------------------------------------------

      \6\ Medicare Payment Advisory Commission (MedPAC), Report to the Congress: Medicare Payment Policy. March 2009, P. 196.

      ---------------------------------------------------------------------------

      Comment: Commenters expressed concerns with the impact of the proposed reductions on HHA margins and the financial viability of HHAs. Commenters stated that CMS estimated that 43 percent of all HHAs would face negative margins by 2017 with the impact of rebasing and the annual productivity adjustment and provided other information on margins. Commenters stated that a recent analysis by NAHC indicates that the percentage of impacted HHAs is now forecasted at 53.71 percent by 2017 and that, with the addition of the case mix weight adjustment proposed by CMS, some states will be impacted to a much higher degree. Some other commenters stated that analysis conducted by Avalere Health determined that 45.3 percent of all HHAs nationwide will operate at a loss by the end of 2017. A commenter stated the MedPAC Medicare Margin estimate is not intended to serve as a measure of home health agencies' profit/loss, but is often interpreted as such, and an HHA's overall margin (rather than just the Medicare margin) is a standard measure of a home health company's bottom line/profit (or loss, as applicable). A few commenters stated that policymakers may want to consider providers' overall margins, as well as the MedPAC Medicare margin, when contemplating changes to home health reimbursement. A commenter stated that CMS should accurately account for the current costs of providing HH services to Medicare beneficiaries and to offer HH agencies a fair opportunity to generate a margin needed to make the ongoing investments that are necessary to maintain and improve patient care.

      Response: In the CY 2014 final rule, we estimated that approximately 40 percent of providers would have negative margins in CY 2017 and that of the 40 percent of providers predicted to have negative margins, 83 percent of these providers already reported negative margins in 2011. In their March 2015 Report to the Congress, MedPAC estimates that the Medicare margins for freestanding agencies averaged 12.7 percent in 2013 and averaged 17 percent between 2001 and 2013. The Commission estimates that the Medicare margin for 2015 will be 10.3 percent. In addition, as mandated in section 3131(a) of the Affordable Care Act, MedPAC conducted a study on the rebasing implementation, which included an impact analysis on access to care, and submitted a Report to Congress on their findings. MedPAC's Report to Congress noted that the rebasing adjustments are partially offset by the payment update each year and across all four years of the phase-in of the rebasing adjustments the cumulative net reduction would equal about 2 percent. MedPAC concluded that, as a result of the payment update offsets to the rebasing adjustments, HHA margins are likely to remain high under the current rebasing policy and quality of care and beneficiary access to care are unlikely to be negatively affected.

      Furthermore, in their 2013 Report to Congress, MedPAC stated ``low cost growth or no cost growth has been typical for home health care, and in some years we have observed a decline in cost per episode. The ability of HHAs to keep costs low has contributed to the high margins under the Medicare PPS.'' Our analysis of 2012 and 2013 cost report data supports MedPAC's statement about low or no cost growth and suggests that the cost of 60-day home health episodes has decreased since 2011. In the CY 2014 final rule, we estimated the cost of a 60-day episode in 2011 to be $2,453.71 using CY 2011 Medicare claims data and 2011 Medicare cost report data (78 FR 72277). In the CY 2015 proposed rule, we estimated the cost of a 60-day episode in 2012 to be $2,413.82 using CY 2012 Medicare claims data and FY 2012 Medicare cost report data (79 FR 38371). In the CY 2016 proposed rule, we estimated the cost of a 60-day episode in 2013 to be $2,402.11 using CY 2013 Medicare claims data and FY 2013 Medicare cost report data (80 FR 39846).

      In addition, we note that in their 2013 Report to Congress, MedPAC stated that during the interim payment system (1997-2000), when payments dropped by about 50 percent in two years, many agencies exited the program. However, new agencies entered the program (about 200 new agencies a year) and existing agencies expanded their service areas to enter markets left by exiting agencies. This is due in part to the low capital requirements for home health care services that allow the industry to react rapidly when the supply of agencies changes or contracts. Reviews of access found that access to care remained adequate during this period despite a substantial decline in the number of agencies (Liu et al. 2003). In summary, MedPAC's past reviews of access to home health care found that access generally remained adequate during periods of substantial decline in the number of agencies. As described in section III.A.3 of the CY 2016 proposed rule, the number of HHAs billing Medicare for home health services in CY 2013 was 11,889, or over 80 percent higher than the 6,511 HHAs billing Medicare for home health services in 2001. Even if some HHAs were to exit

      Page 68644

      the program due to possible reimbursement concerns, we would expect the home health market to remain robust (80 FR 39846).

      With regard to the comments about the overall margin, we note that as stated in the CY 2014 final rule, Medicare has never set payments so as to cross-subsidize other payers. Indeed, section 1861(v)(1)(A) of the Act states ``under the methods of determining costs, the necessary costs of efficiently delivering covered services to individuals covered by the insurance programs established by this title will not be borne by individuals not so covered, and the costs with respect to individuals not so covered will not be borne by such insurance programs.'' As MedPAC stated in its March 2011 Report to Congress, cross-subsidization is not advisable for two significant reasons: ``Raising Medicare rates to supplement low Medicaid payments would result in poorly targeted subsidies. Facilities with high shares of Medicare payments--presumably the facilities that need revenues the least--would receive the most in subsidies from the higher Medicare payments, while facilities with low Medicare shares--presumably the facilities with the greatest need--would receive the smallest subsidies. Finally, increased Medicare payment rates could encourage states to further reduce their Medicaid payments and, in turn, create pressure to raise Medicare rates'' (78 FR 72284).

      Comment: A commenter stated that the proposed payment rate reductions will create job losses, particularly for people in education and quality positions. Commenters expressed concerns that the proposed rate reductions may create instability within the industry and impact access to care, particularly in underserved communities or for patients with higher cost or more complex care needs. Commenters also stated that the proposed rate reductions will have a significant impact on those home health agencies that serve as the safety-net providers for their communities. Another commenter stated that the proposed cuts will threaten access to care in rural areas because patients in rural areas tend to be sicker, older, and poorer, and to require more complex care than their urban counterparts. A commenter urged CMS to eliminate the proposed case-mix cut pending a detailed analysis utilizing current data and incorporating an assessment of the impact of such an additional cut on Medicare beneficiaries as well as the rural, small, and other HHAs who serve them.

      Response: We do not expect the payment reductions for nominal case-mix growth to have a significant impact, particularly given MedPAC's projected margins for 2015; however, we will continue to monitor for unintended consequences. As noted above, we are phasing in the reductions over three years, rather than two years as described in the proposed rule. Iteratively implementing the case-mix reduction over three years gives home health agencies more time to adjust to the intended reduction of 2.88 percent than would be the case were we to account for the nominal case-mix growth in two years.
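
      As an illustrative check of this arithmetic (a sketch only, not part of the payment methodology itself), three successive 0.97 percent reductions compound to approximately the intended 2.88 percent total reduction:

        # Illustrative check: three successive 0.97 percent reductions compound
        # to roughly the intended 2.88 percent total reduction.
        cumulative_factor = (1 - 0.0097) ** 3           # 0.9903 ** 3
        print(round((1 - cumulative_factor) * 100, 2))  # ~2.88 (percent)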

      In addition, as described in the CY 2016 proposed rule, CMS has awarded a follow-on contract to Abt Associates to further explore margin differences across patient characteristics and possible payment methodology changes suggested by the results of the home health study. We presented several model options under development in the CY 2016 proposed rule and may consider implementing payment reform to address the margin differences across patient characteristics in future rulemaking (80 FR 39865). With regard to the comment about patients in rural areas, we note that episodes provided in rural areas will continue to receive a three percent add-on payment in CY 2016.

      Comment: A commenter stated that the proposed reductions will limit services to the homebound population and will lead to increased re-hospitalization and costs. Another commenter stated that the proposed reductions would threaten the efficiency of the health care system and will likely increase the likelihood of unnecessary institutional care episodes and that this improper utilization may lead to higher costs. The commenter urged CMS to consider the role and value of home health care in the overall health care system as it makes changes to the home health prospective payment system. The commenter asked CMS to consider the most vulnerable populations and the demographics of home health users when implementing payment adjustments. The commenter urged CMS to consider the potential impact of payment adjustments on a generally older, sicker, poorer, and more vulnerable population, and mitigate these risks where possible. Commenters also expressed concerns that the proposed cuts may impact quality of care.

      Response: We note that we believe the commenter is referring to both the rebasing reductions as well as the proposed reductions to account for nominal case-mix growth. As described in the CY 2016 proposed rule, section 3131(a) of the Affordable Care Act required the Medicare Payment Advisory Commission (MedPAC) to assess, by January 1, 2015, the impact of the mandated rebasing adjustments on quality of and beneficiary access to home health care. As part of this assessment, the statute required MedPAC to consider the impact on care delivered by rural, urban, nonprofit, and for-profit home health agencies. MedPAC's Report to Congress noted that the rebasing adjustments are partially offset by the payment update each year and across all four years of the phase-in of the rebasing adjustments the cumulative net reduction would equal about 2 percent. MedPAC concluded that, as a result of the payment update offsets to the rebasing adjustments, HHA margins are likely to remain high under the current rebasing policy and quality of care and beneficiary access to care are unlikely to be negatively affected \7\ (80 FR 39846). In addition, the overall impact of this rule as discussed in section VII of this final rule is smaller than the overall impact of previous rules in which reductions for nominal case-mix growth have been implemented. For instance, we estimated that the overall impact of the CY 2011 HH PPS final rule would be -4.89 percent and the overall impact of the CY 2012 HH PPS final rule would be -2.31 percent.

      ---------------------------------------------------------------------------

      \7\ Medicare Payment Advisory Commission (MedPAC), ``Report to the Congress: Impact of Home Health Payment Rebasing on Beneficiary Access to and Quality of Care''. December 2014. Washington, DC. Accessed on 5/05/15 at: http://www.medpac.gov/documents/reports/december-2014-report-to-the-congress-impact-of-home-health-payment-rebasing-on-beneficiary-access-to-and-quality-of-care.pdf?sfvrsn=0.

      ---------------------------------------------------------------------------

      Commenters did not provide specific information about why they believe payment reductions would reduce the quality of care. MedPAC estimates that the Medicare margin for 2015 will be 10.3 percent, which should support current levels of quality. We also believe that policymaking in the quality improvement area should help to ensure quality advances. The HHVBP Model described in this final rule will be implemented on January 1, 2016, further enhancing quality-related incentives. While we do not anticipate significant negative impacts of this rule, we will continue to closely monitor the effects of the payment adjustments on HHAs, as well as on beneficiaries' access to and quality of care.

      Comment: Commenters stated that the proposed reductions will limit home health providers' ability to continue participating in broader payment and

      Page 68645

      delivery system reform efforts and in the HHVBP program. Commenters stated that the proposal fails to account for significant new cost burdens placed on agencies since 2010 and fails to take into account the current and future healthcare environment, such as the reform initiatives underway. Another commenter stated that the payment cuts should be delayed until their impact on HHAs can be more fully understood in light of the dynamics that the Bundled Payment for Care Improvement Initiative (BPCI), the proposed Comprehensive Care for Joint Replacement (CCJR) model, Accountable Care Organizations (ACOs), and various other healthcare delivery and payment reform initiatives are creating for the home health sector, including shifting more medically complex, functionally impaired patients into HHAs.

      Response: While there may be increased costs associated with implementing the broader payment and delivery system reform initiatives, we expect that providers will be rewarded for efficient care or higher quality of care and will receive a return on their investments in the payment reform efforts. The initiatives cited by the commenters offer financial rewards for high quality of care and/or efficient care.

      Comment: A commenter stated that the proposed reductions will threaten the ability of home health agencies to reduce re-hospitalization rates and requested that CMS reconsider the reductions, given the current reductions due to sequestration and rebasing. Another commenter stated that they disagree with the rationale used to justify the proposed case-mix reductions. The commenter stated that the logic is ill-conceived and implies that Medicare home health services have increased due to overutilization. Another commenter stated that the proposed reductions assume that providers ``gamed the system.'' A commenter stated that the proposed reductions are based on CMS' belief that the industry has profit margins that are too high and has inflated the case-mix of the patients served.

      Response: The goal of the reductions for nominal case-mix growth is to better align payment with real changes in patient severity. The reductions would adjust the national, standardized 60-day episode payment rate to account for nominal case-mix growth between CY 2012 and CY 2014 and mitigate overpayments. As we have stated in previous regulations, we believe nominal coding change results mostly from changed coding practices, including improved understanding of the ICD-9 coding system, more comprehensive coding, changes in the interpretation of various items on the OASIS and in formal OASIS definitions, and other evolving measurement issues. Our view of the causes of nominal coding change does not emphasize the idea that HHAs or clinicians in general ``gamed the system'' or over-provided services or the idea that HHAs have high profit margins. However, since our goal is to pay only for increased costs associated with real changes in patient severity, and because nominal coding change does not demonstrate that underlying changes in patient severity occurred, we believe it is necessary to exclude nominal case-mix effects that are unrelated to changes in patient severity. We note that we will continue to monitor for any unintended consequences of the payment reductions.

      Comment: One commenter stated that the starting point in the real and nominal case-mix growth analysis should have been 2002 or 2003, not 2000. Another commenter stated that the original baseline of a case-mix weight of 1.000 in 2000 was incorrect and that the analysis is flawed because the foundation or baseline is incorrect. Commenters cited multiple examples to support their statements that 2000 should not have been used as a baseline. For instance, they stated that in the first couple of years of the HH PPS, many industry participants were struggling with the transition to the new payment system and the submission of OASIS data. They also stated that the OASIS document has changed over time and that staff in 2000 had inadequate training on the OASIS. A commenter stated that the OASIS does not adequately capture the level of illness of the population being served.

      Response: We followed the Administrative Procedure Act (APA) in implementing the HH PPS under the mandate in the Balanced Budget Act of 1997. Under the APA, we solicited public comments in 1999 on the then proposed system. OASIS itself was developed with industry participation for the purpose of measuring home health outcomes (see GAO-01-205, January 2001, Appendix II). A version of OASIS was used in the original case-mix research that led to the design of the HH PPS case-mix system. The research results indicated that adequate case-mix adjustment of payments could be achieved using OASIS variables. We have noted in previous regulations that the average case-mix weight nationally, as estimated from OASIS assessments in the 12 months leading up to October 1, 2000, was about 13 percent higher than the average in the sample of agencies whose data were used for the case-mix research. We used the estimate from the 12 months leading up to October 1, 2000 as our baseline for measuring case-mix change because it represented a very large, broad-based set of episodes. It did not reflect the earliest days of OASIS use. Given that coding practices continually evolved subsequent to the last 12 months ending October 1, 2000, and that agencies were not subject to the HH PPS incentives during the 12 months ending October 1, 2000, the selected baseline period is the most appropriate one to use to begin measuring coding change that occurred in relation to the introduction of the HH PPS. Any other period subsequent to our baseline builds in impacts on coding of the HH PPS and is questionable to use from the point of view of responsible fiscal stewardship.

      We note that comments referencing coding improvements, such as increasing accuracy, do not recognize that such improvements are an inappropriate basis for increased payment. We believe that measurable changes in patient severity and patient need are appropriate bases for changes in payment. Our analysis found only small changes in patient severity and need.

      With regard to the comments about the baseline, we note that in our May 2007 proposed rule and our August 2007 final rule, we described the IPS samples and PPS samples that were used to calculate case-mix change. We remind the commenters that 313,447 observations is an extremely large sample by statistical standards, and that agencies began collecting OASIS data in 1999, following issuance of a series of regulations beginning on January 25, 1999 (64 FR 3764). Most of the data we used for the baseline period come from the first 3 quarters of the year 2000--months after collection was mandated to begin in August 1999. By 2000 the vast majority of agencies were complying with the reporting requirements. Indirect evidence that the data from the early years of the HH PPS were sufficiently reliable comes from model validation analysis we conducted during that period. Validation of the 80-group model on a large 19-month claims sample ending June 2002 (N = 469,010 claims linked to OASIS) showed that the goodness-of-fit of the model was comparable to the fit statistic from the original Abt Associates case-mix sample (0.33 vs. 0.34), notwithstanding that average total resources per episode declined by 20 percent. That analysis

      Page 68646

      also showed that all but three variables in the scoring system remained statistically significant.

      Comment: A commenter questioned CMS' ability to statistically infer the difference between real changes in case-mix and nominal case-mix growth to the degree that the estimate was used in developing the proposed reductions, i.e., a hundredth of a percentage point. Some commenters stated that the home health payment system itself is flawed and cited the Report to Congress on the home health study on access to care for vulnerable populations. The commenter implied that since the payment system is flawed, the analysis to assess real and nominal case-mix is also flawed. Commenters stated that the proposed rule relies heavily on a case-mix methodology that CMS itself found requires ``additional analysis'' and ``potential modifications''. A commenter stated that the proposed case-mix creep adjustments should be suspended pending the development of a new case-mix model.

      Response: As described in the CY 2012 final rule and discussed above, we procured an independent review of our methodology by a team at Harvard University led by Dr. David Grabowski (``Home Health Study Report--Independent Review of the Models to Assess Nominal Case-Mix Growth'', dated June 21, 2011). When reviewing the model, the Harvard team found that overall, our models were robust. As stated previously, we would like to account for nominal case-mix growth from 2012 through 2014 and mitigate overpayments. We note that, as described in the CY 2016 proposed rule, we have several model options under development and may implement payment reform in the future. However, while we are currently in the process of developing payment reform options to the case-mix methodology, we think it is appropriate to account for the nominal case-mix growth from 2012 to 2014.

      Final Decision: After considering the comments received in response to the CY 2016 HH PPS proposed rule (80 FR 39840) and for the reasons discussed above, we are finalizing a 0.97 percent reduction to the national, standardized 60-day episode payment rate each year in CY 2016, CY 2017, and CY 2018 to account for nominal case-mix growth from 2012 to 2014.

      3. Clarification Regarding the Use of the ``Initial Encounter'' Seventh Character, Applicable to Certain ICD-10-CM Code Categories, under the HH PPS

      The ICD-10-CM coding guidelines regarding the seventh character assignment for diagnosis codes under Chapter 19, Injury, poisoning, and certain other consequences of external causes (S00-T88), were revised in the Draft 2015 ICD-10-CM, The Completed Official Draft Code Set. Based upon the 2015 revised coding guidance above, certain initial encounters are appropriate when the patient is receiving active treatment during a home health episode.

      Comment: A commenter requested clarification on the use of the seventh character for ``initial encounters'' in the home health setting. The commenter agrees that it seems reasonable that traumatic injury codes with the initial encounter extension may not be appropriate. However, the commenter contends that certain initial encounter extensions may be appropriate if the patient is still receiving active treatment. The commenter provided an example of active treatment whereby the patient is receiving active treatment with the continuation of antibiotics for treatment of a postoperative infection. Based upon this example of active treatment, the commenter recommends that CMS revise the home health grouper to allow the reporting of the initial encounter seventh character for the ICD-10-CM codes for those conditions that could reasonably continue to receive active treatment in the home health setting. A couple of other commenters noted similar concerns regarding initial encounters.

      Response: While this comment is outside the scope of this rule, we recognize that in the CY 2014 HH PPS final rule (78 FR 72271), we discussed the decision to eliminate codes with initial encounter extensions, listed in the GEMs translation for ICD-10-CM codes, that began with S and T that are used for reporting traumatic injuries (e.g., fractures and burns) as part of our ICD-10 grouper conversion effort. Codes beginning with S and T have a seventh character that indicates whether the treatment is for an initial encounter, subsequent encounter or a sequela (a residual effect (condition produced) after the acute phase of an illness or injury has terminated).

      The decision to eliminate the seventh character initial encounter for the S and T ICD-10-CM codes from the HH PPS ICD-10-CM translation list was based, not only on the most current coding conventions and guidelines that were available at that time, but also in collaboration with the cooperating parties of the ICD-10 Coding Committee (the American Health Information Management Association, the American Hospital Association, the Centers for Disease Control and Prevention's National Center for Health Statistics, and CMS) who confirmed that initial encounter extensions were not appropriate for care in the home health setting. Code extensions D, E, F, G, H, J, K, M, N, P, Q and R indicate the patient is being treated for a subsequent encounter (care for the injury during the healing or recovery phase) and were included in the translation list in place of the initial encounter extensions. CMS provided the draft translation list to the public on the CMS Web site at https://www.cms.gov/Center/Provider-Type/Home-Health-Agency-HHA-Center.html?redirect=/center/hha.asp. We did not receive any comments on the ICD-10-CM draft translation list and the elimination of initial encounter seventh character extension.

      Since the publication of the CY 2014 HH PPS final rule, the ICD-10-CM coding guidelines regarding the use of the seventh character assignment for diagnosis codes under Chapter 19, Injury, poisoning, and certain other consequences of external causes (S00-T88), were revised in the Draft 2015 ICD-10-CM, The Completed Official Draft Code Set. Specifically, in March of 2015, the coding guidelines were revised to clarify that the designation of an initial encounter is based on whether a patient is receiving active treatment for the condition that the code describes. Initial encounters are not based on chronology of care or whether the patient is seeing the same or a new provider for the same condition. Examples of active treatment are: Surgical treatment, emergency department encounter, and evaluation and continuing treatment by the same or a different physician. Based on these revisions, it is possible for a home health agency to use a diagnosis code with a seventh character ``A'' (an initial encounter) for certain conditions. A clinical example of this could include a patient who was in the acute care hospital for IV antibiotics for a post-surgical wound infection and who is discharged to home health on IV antibiotics for ongoing treatment of the surgical wound infection. This would be considered active treatment as the surgical wound infection requires continued IV antibiotics.

      The coding guidelines direct that the seventh character ``D'', indicating a subsequent encounter, be assigned for encounters after the patient has received active treatment of the condition and is receiving routine care for the condition during the healing or recovery phase. Examples of subsequent care include: cast change or removal, an x-ray to check healing status of a fracture, removal of an external or internal fixation device,

      Page 68647

      medication adjustment, other aftercare and follow up visits following treatment of the injury or condition. Therefore, it is also possible for home health encounters to be designated as subsequent encounters based on services that are provided during healing and recovery, after treatment of the condition described by the code is completed. A clinical example of this could include a patient who was in the acute care hospital for a traumatic hip fracture that was surgically repaired and the patient is discharged to home health for rehabilitation services. This would be considered a subsequent encounter as the hip fracture has been repaired and the patient is now in the healing and recovery phase.

      We recognize that this revision may have caused some confusion among home health providers and that there may be subtle clinical differences between what is considered active treatment of a condition versus routine care during the healing and recovery phase of a condition in the home health setting. The assignment of the seventh character should be based on clinical information from the physician and depends on whether the individual is receiving active treatment for the condition that the code describes or is receiving ongoing care for that condition during the healing and recovery stage. In determining which diagnosis codes would be appropriate for an HHA to indicate that the care is for an initial encounter, CMS developed and shared a draft list of codes with the cooperating parties. Agreement was reached between CMS and the cooperating parties, and a revised translation list effective January 1, 2016 will be posted on the CMS Web site. Also effective January 1, 2016, the Home Health Prospective Payment System Grouper logic will be revised to award points for certain initial encounter codes based upon the revised ICD-10-CM coding guidelines for M0090 dates on or after October 1, 2015.
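
      The distinction can be summarized schematically as follows (a simplified, hypothetical sketch only; actual seventh-character assignment must follow the official ICD-10-CM guidelines and the clinical documentation):

        # Simplified sketch of the seventh-character distinction described above.
        # Actual code assignment must follow the ICD-10-CM guidelines and the
        # clinical documentation; this function is illustrative only.
        def seventh_character(receiving_active_treatment: bool) -> str:
            """Return 'A' for active treatment of the condition the code describes,
            or 'D' for routine care during the healing or recovery phase."""
            return "A" if receiving_active_treatment else "D"

        print(seventh_character(True))   # 'A': e.g., ongoing IV antibiotics for a wound infection
        print(seventh_character(False))  # 'D': e.g., rehabilitation after a repaired hip fracture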

    C. CY 2016 Home Health Rate Update

      1. CY 2016 Home Health Market Basket Update

      Section 1895(b)(3)(B) of the Act requires that the standard prospective payment amounts for CY 2016 be increased by a factor equal to the applicable HH market basket update for those HHAs that submit quality data as required by the Secretary. The HH market basket was rebased and revised in CY 2013. A detailed description of how we derive the HHA market basket is available in the CY 2013 HH PPS final rule (77 FR 67080-67090). The HH market basket percentage increase for CY 2016 is based on IHS Global Insight Inc.'s (IGI) third quarter 2015 forecast with historical data through the second quarter of 2015. The HH market basket percentage increase for CY 2016 is 2.3 percent.

      Section 3401(e) of the Affordable Care Act, adding new section 1895(b)(3)(B)(vi) to the Act, requires that the market basket percentage under the HHA prospective payment system as described in section 1895(b)(3)(B) of the Act be annually adjusted by changes in economy-wide productivity for CY 2015 and each subsequent calendar year. The statute defines the productivity adjustment, described in section 1886(b)(3)(B)(xi)(II) of the Act, to be equal to the 10-year moving average of change in annual economy-wide private nonfarm business multifactor productivity (MFP) (as projected by the Secretary for the 10-year period ending with the applicable fiscal year, calendar year, cost reporting period, or other annual period) (the ``MFP adjustment''). The Bureau of Labor Statistics (BLS) is the agency that publishes the official measure of private nonfarm business MFP. Please see http://www.bls.gov/mfp to obtain the BLS historical published MFP data.

      Multifactor productivity is derived by subtracting the contribution of labor and capital input growth from output growth. The projections of the components of MFP are currently produced by IGI, a nationally recognized economic forecasting firm with which CMS contracts to forecast the components of the market basket and MFP. As described in the CY 2015 HH PPS proposed rule (79 FR 38384 through 38386), in order to generate a forecast of MFP, IGI replicated the MFP measure calculated by the BLS using a series of proxy variables derived from IGI's U.S. macroeconomic models. In the CY 2015 HH PPS proposed rule, we identified each of the major MFP component series employed by the BLS to measure MFP as well as provided the corresponding concepts determined to be the best available proxies for the BLS series.

      Beginning with the CY 2016 rulemaking cycle, the MFP adjustment is calculated using a revised series developed by IGI to proxy the aggregate capital inputs. Specifically, IGI has replaced the Real Effective Capital Stock used for Full Employment GDP with a forecast of BLS aggregate capital inputs recently developed by IGI using a regression model. This series provides a better fit to the BLS capital inputs as measured by the differences between the actual BLS capital input growth rates and the estimated model growth rates over the historical time period. Therefore, we are using IGI's most recent forecast of the BLS capital inputs series in the MFP calculations beginning with the CY 2016 rulemaking cycle. A complete description of the MFP projection methodology is available on our Web site at http://www.cms.gov/Research-Statistics-Data-and-Systems/Statistics-Trends-and-Reports/MedicareProgramRatesStats/MarketBasketResearch.html. In the future, when IGI makes changes to the MFP methodology, we will announce them on our Web site rather than in the annual rulemaking.

      Using IGI's third quarter 2015 forecast, the MFP adjustment for CY 2016 (the 10-year moving average of MFP for the period ending CY 2016) is 0.4 percent. The CY 2016 HH market basket percentage of 2.3 percent will be reduced by the MFP adjustment of 0.4 percent. The resulting HH payment update percentage is equal to 1.9 percent, or 2.3 percent less 0.4 percentage point.

      Section 1895(b)(3)(B) of the Act requires that the HH update be decreased by 2 percentage points for those HHAs that do not submit quality data as required by the Secretary. For HHAs that do not submit the required quality data for CY 2016, the HH payment update will be -0.1 percent (1.9 percent minus 2 percentage points).
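
      The update arithmetic described above can be illustrated with the following sketch (illustrative only; the percentages are those stated in this section):

        # Illustrative computation of the CY 2016 HH payment update described above.
        market_basket_update = 2.3   # CY 2016 HH market basket percentage increase
        mfp_adjustment = 0.4         # 10-year moving average of MFP
        payment_update = market_basket_update - mfp_adjustment
        print(round(payment_update, 1))        # 1.9 percent
        # HHAs that do not submit the required quality data: update minus 2 points.
        print(round(payment_update - 2.0, 1))  # -0.1 percent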

      2. CY 2016 Home Health Wage Index

      a. Background

        Sections 1895(b)(4)(A)(ii) and (b)(4)(C) of the Act require the Secretary to provide appropriate adjustments to the proportion of the payment amount under the HH PPS that account for area wage differences, using adjustment factors that reflect the relative level of wages and wage-related costs applicable to the furnishing of HH services. Since the inception of the HH PPS, we have used inpatient hospital wage data in developing a wage index to be applied to HH payments.

        We will apply the appropriate wage index value to the labor portion of the HH PPS rates based on the site of service for the beneficiary (defined by section 1861(m) of the Act as the beneficiary's place of residence).

        We will continue to use the same methodology discussed in the CY 2007 HH PPS final rule (71 FR 65884) to address those geographic areas in which there are no inpatient hospitals, and thus, no hospital wage data on which to base the calculation of the CY 2016 HH PPS wage index. For rural areas that do

        Page 68648

        not have inpatient hospitals, we will use the average wage index from all contiguous CBSAs as a reasonable proxy. For CY 2016, there are no rural geographic areas without hospitals for which we would apply this policy. For rural Puerto Rico, we will not apply this methodology due to the distinct economic circumstances that exist there (for example, due to the close proximity to one another of almost all of Puerto Rico's various urban and non-urban areas, this methodology would produce a wage index for rural Puerto Rico that is higher than that in half of its urban areas). Instead, we will use the most recent wage index previously available for that area. For urban areas without inpatient hospitals, we use the average wage index of all urban areas within the state as a reasonable proxy for the wage index for that CBSA. For CY 2016, the only urban area without inpatient hospital wage data is Hinesville, GA (CBSA 25980).

      b. Update

        On February 28, 2013, OMB issued Bulletin No. 13-01, announcing revisions to the delineations of MSAs, Micropolitan Statistical Areas, and CBSAs, and guidance on uses of the delineation of these areas. This bulletin is available online at http://www.whitehouse.gov/sites/default/files/omb/bulletins/2013/b-13-01.pdf. This bulletin states that it ``provides the delineations of all Metropolitan Statistical Areas, Metropolitan Divisions, Micropolitan Statistical Areas, Combined Statistical Areas, and New England City and Town Areas in the United States and Puerto Rico based on the standards published on June 28, 2010, in the Federal Register (75 FR 37246-37252) and Census Bureau data.''

        In the CY 2015 HH PPS final rule (79 FR 66085 through 66087), we finalized changes to the HH PPS wage index based on the newest OMB delineations, as described in OMB Bulletin No. 13-01, including a 1-year transition with a blended wage index for CY 2015. Because the 1-year transition period expires at the end of CY 2015, the final HH PPS wage index for CY 2016 will be fully based on the revised OMB delineations adopted in CY 2015. The final CY 2016 wage index is available on the CMS Web site at http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/HomeHealthPPS/Home-Health-Prospective-Payment-System-Regulations-and-Notices.html.

        3. CY 2016 Annual Payment Update

      a. Background

        The Medicare HH PPS has been in effect since October 1, 2000. As set forth in the July 3, 2000 final rule (65 FR 41128), the base unit of payment under the Medicare HH PPS is a national, standardized 60-day episode payment rate. As set forth in 42 CFR 484.220, we adjust the national, standardized 60-day episode payment rate by a case-mix relative weight and a wage index value based on the site of service for the beneficiary.

        To provide appropriate adjustments to the proportion of the payment amount under the HH PPS to account for area wage differences, we apply the appropriate wage index value to the labor portion of the HH PPS rates. The labor-related share of the case-mix adjusted 60-day episode rate is 78.535 percent and the non-labor-related share is 21.465 percent as set out in the CY 2013 HH PPS final rule (77 FR 67068). The CY 2016 HH PPS rates will use the same case-mix methodology as set forth in the CY 2008 HH PPS final rule with comment period (72 FR 49762) and will be adjusted as described in section III.C. of this rule. The following are the steps we take to compute the case-mix and wage-adjusted 60-day episode rate:

        1. Multiply the national 60-day episode rate by the patient's applicable case-mix weight.

        2. Divide the case-mix adjusted amount into a labor (78.535 percent) and a non-labor portion (21.465 percent).

        3. Multiply the labor portion by the applicable wage index based on the site of service of the beneficiary.

        4. Add the wage-adjusted portion to the non-labor portion, yielding the case-mix and wage adjusted 60-day episode rate, subject to any additional applicable adjustments.
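
        A minimal sketch of these four steps follows; the case-mix weight and wage index values used in the example are hypothetical placeholders, not published figures:

          # Minimal sketch of the four steps above. The case-mix weight and wage
          # index values are hypothetical placeholders for illustration only.
          LABOR_SHARE = 0.78535
          NONLABOR_SHARE = 0.21465

          def episode_payment(national_rate, case_mix_weight, wage_index):
              # Step 1: apply the patient's case-mix weight.
              case_mix_adjusted = national_rate * case_mix_weight
              # Step 2: split into labor and non-labor portions.
              labor = case_mix_adjusted * LABOR_SHARE
              nonlabor = case_mix_adjusted * NONLABOR_SHARE
              # Step 3: wage-adjust the labor portion.
              labor_adjusted = labor * wage_index
              # Step 4: recombine (before any additional applicable adjustments).
              return labor_adjusted + nonlabor

          # Example with the CY 2016 rate and hypothetical weight/wage index values.
          print(round(episode_payment(2965.12, 1.0000, 0.9500), 2))  # ~2848.69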

        In accordance with section 1895(b)(3)(B) of the Act, this document constitutes the annual update of the HH PPS rates. Section 484.225 sets forth the specific annual percentage update methodology. In accordance with Sec. 484.225(i), for a HHA that does not submit HH quality data, as specified by the Secretary, the unadjusted national prospective 60-day episode rate is equal to the rate for the previous calendar year increased by the applicable HH market basket index amount minus two percentage points. Any reduction of the percentage change will apply only to the calendar year involved and would not be considered in computing the prospective payment amount for a subsequent calendar year.

        Medicare pays the national, standardized 60-day case-mix and wage-adjusted episode payment on a split percentage payment approach. The split percentage payment approach includes an initial percentage payment and a final percentage payment as set forth in Sec. 484.205(b)(1) and (b)(2). We may base the initial percentage payment on the submission of a request for anticipated payment (RAP) and the final percentage payment on the submission of the claim for the episode, as discussed in Sec. 409.43. The claim for the episode that the HHA submits for the final percentage payment determines the total payment amount for the episode and whether we make an applicable adjustment to the 60-day case-mix and wage-adjusted episode payment. The end date of the 60-day episode as reported on the claim determines which calendar year rates Medicare would use to pay the claim.

        We may also adjust the 60-day case-mix and wage-adjusted episode payment based on the information submitted on the claim to reflect the following:

        A low-utilization payment adjustment (LUPA) is provided on a per-visit basis as set forth in Sec. 484.205(c) and Sec. 484.230.

        A partial episode payment (PEP) adjustment as set forth in Sec. 484.205(d) and Sec. 484.235.

        An outlier payment as set forth in Sec. 484.205(e) and Sec. 484.240.

      4. CY 2016 National, Standardized 60-Day Episode Payment Rate

        Section 1895(b)(3)(A)(i) of the Act required that the 60-day episode base rate and other applicable amounts be standardized in a manner that eliminates the effects of variations in relative case mix and area wage adjustments among different home health agencies in a budget neutral manner. To determine the CY 2016 national, standardized 60-day episode payment rate, we will apply a wage index standardization factor, a case-mix budget neutrality factor described in section III.B.1, a nominal case-mix growth adjustment described in section III.B.2, the rebasing adjustment described in section II.C, and the HH payment update as discussed in section III.C.1 of this final rule.

        To calculate the wage index standardization factor, henceforth referred to as the wage index budget neutrality factor, we simulated total payments for non-LUPA episodes using the 2016 wage index and compared it to our simulation of total payments for non-LUPA episodes using the 2015 wage index. By dividing the total payments for non-LUPA episodes using the 2016 wage index by the total payments for non-LUPA episodes using the 2015 wage index, we obtain a wage index budget neutrality factor of 1.0011. We will apply the wage index budget neutrality factor of 1.0011 to the CY

        Page 68649

        2016 national, standardized 60-day episode rate.

        As discussed in section III.B.1 of this final rule, to ensure the changes to the case-mix weights are implemented in a budget neutral manner, we will apply a case-mix weight budget neutrality factor to the CY 2016 national, standardized 60-day episode payment rate. The case-mix weight budget neutrality factor is calculated as the ratio of total payments when CY 2016 case-mix weights are applied to CY 2014 utilization (claims) data to total payments when CY 2015 case-mix weights are applied to CY 2014 utilization data. The case-mix budget neutrality factor for CY 2016 will be 1.0187 as described in section III.B.1 of this final rule.
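
        Schematically, such budget neutrality factors are ratios of simulated aggregate payments, as in the following sketch (the payment totals shown are hypothetical placeholders, not CMS simulation results):

          # Schematic sketch: a budget neutrality factor is a ratio of simulated
          # aggregate payments. The dollar totals below are hypothetical
          # placeholders, not CMS simulation results.
          def budget_neutrality_factor(simulated_total_numerator, simulated_total_denominator):
              return simulated_total_numerator / simulated_total_denominator

          # Following the case-mix description above: payments with CY 2016 weights
          # divided by payments with CY 2015 weights, both on CY 2014 claims.
          print(round(budget_neutrality_factor(18_150_000_000, 17_816_000_000), 4))  # ~1.0187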

        Next, as discussed in section III.B.2 of this final rule, we will apply a reduction of 0.97 percent to the national, standardized 60-day episode payment rate in CY 2016 to account for nominal case-mix growth between CY 2012 and CY 2014. Then, we will apply the -$80.95 rebasing adjustment finalized in the CY 2014 HH PPS final rule (78 FR 72256) and discussed in section II.C. Lastly, we will update the payment rates by the CY 2016 HH payment update of 1.9 percent (MFP-adjusted home health market basket update) as described in section III.C.1 of this final rule. The CY 2016 national, standardized 60-day episode payment rate is calculated in Table 7.

        Table 7--CY 2016 National, Standardized 60-Day Episode Payment Amount

        ----------------------------------------------------------------------------------
        CY 2015 national, standardized 60-day episode payment..............    $2,961.38
        Wage index budget neutrality factor.................................    x 1.0011
        Case-mix weights budget neutrality factor...........................    x 1.0187
        Nominal case-mix growth adjustment (1 - 0.0097).....................    x 0.9903
        CY 2016 rebasing adjustment.........................................     -$80.95
        CY 2016 HH payment update percentage................................     x 1.019
        CY 2016 national, standardized 60-day episode payment...............   $2,965.12
        ----------------------------------------------------------------------------------

        The CY 2016 national, standardized 60-day episode payment rate for an HHA that does not submit the required quality data is updated by the CY 2016 HH payment update (1.9 percent) minus 2 percentage points and is shown in Table 8.

        Table 8--For HHAs That Do Not Submit the Quality Data--CY 2016 National, Standardized 60-Day Episode Payment Amount

        ----------------------------------------------------------------------------------
        CY 2015 national, standardized 60-day episode payment..............    $2,961.38
        Wage index budget neutrality factor.................................    x 1.0011
        Case-mix weights budget neutrality factor...........................    x 1.0187
        Nominal case-mix growth adjustment (1 - 0.0097).....................    x 0.9903
        CY 2016 rebasing adjustment.........................................     -$80.95
        CY 2016 HH payment update percentage minus 2 percentage points......     x 0.999
        CY 2016 national, standardized 60-day episode payment...............   $2,906.92
        ----------------------------------------------------------------------------------
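
        As an illustrative cross-check of Tables 7 and 8 (intermediate rounding conventions may differ slightly from the official calculation):

          # Illustrative cross-check of Tables 7 and 8; intermediate rounding
          # conventions may differ slightly from the official calculation.
          rate = 2961.38                 # CY 2015 national, standardized 60-day rate
          rate *= 1.0011                 # wage index budget neutrality factor
          rate *= 1.0187                 # case-mix weights budget neutrality factor
          rate *= 0.9903                 # nominal case-mix growth adjustment (1 - 0.0097)
          rate -= 80.95                  # CY 2016 rebasing adjustment
          print(round(rate * 1.019, 2))  # ~2965.12 (Table 7, quality data submitted)
          print(round(rate * 0.999, 2))  # ~2906.92 (Table 8, quality data not submitted)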

      5. CY 2016 National Per-Visit Rates

        The national per-visit rates are used to pay LUPAs (episodes with four or fewer visits) and are also used to compute imputed costs in outlier calculations. The per-visit rates are paid by type of visit or HH discipline. The six HH disciplines are as follows:

        Home health aide (HH aide);

        Medical Social Services (MSS);

        Occupational therapy (OT);

        Physical therapy (PT);

        Skilled nursing (SN); and

        Speech-language pathology (SLP).

        To calculate the CY 2016 national per-visit rates, we start with the CY 2015 national per-visit rates. We then apply a wage index budget neutrality factor to ensure budget neutrality for LUPA per-visit payments and increase each of the six per-visit rates by the maximum rebasing adjustments described in section II.C. of this rule. We calculate the wage index budget neutrality factor by simulating total payments for LUPA episodes using the 2016 wage index and comparing it to simulated total payments for LUPA episodes using the 2015 wage index. By dividing the total payments for LUPA episodes using the 2016 wage index by the total payments for LUPA episodes using the 2015 wage index, we obtain a wage index budget neutrality factor of 1.0010. We will apply the wage index budget neutrality factor of 1.0010 to the CY 2016 national per-visit rates.

        The LUPA per-visit rates are not calculated using case-mix weights. Therefore, there is no case-mix weight budget neutrality factor needed to ensure budget neutrality for LUPA payments. Then, we apply the rebasing adjustments finalized in the CY 2014 HH PPS final rule (78 FR 72280) to the per-visit rates for each discipline. Finally, the per-visit rates are updated by the CY 2016 HH payment update of 1.9 percent. The national per-visit rates are adjusted by the wage index based on the site of service of the beneficiary. The per-visit payments for LUPAs are separate from the LUPA add-on payment amount, which is paid for episodes that occur as the only episode or initial episode in a sequence of adjacent episodes. The CY 2016 national per-visit rates are shown in Tables 9 and 10.
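
        As an illustrative cross-check of the per-visit arithmetic in Tables 9 and 10, using the skilled nursing rate (rounding conventions may differ slightly from the official calculation):

          # Illustrative cross-check of the skilled nursing per-visit rate in
          # Tables 9 and 10; rounding conventions may differ slightly.
          sn = 127.83                   # CY 2015 skilled nursing per-visit payment
          sn = sn * 1.0010 + 3.96       # wage index budget neutrality factor, then rebasing add
          print(round(sn * 1.019, 2))   # ~134.42 (Table 9, quality data submitted)
          print(round(sn * 0.999, 2))   # ~131.79 (Table 10, quality data not submitted)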

        Page 68650

        Table 9--CY 2016 National Per-Visit Payment Amounts for HHAs That DO Submit the Required Quality Data

        ----------------------------------------------------------------------------------------------
                                            CY 2015      Wage index     CY 2016       CY 2016 HH    CY 2016
                                            per-visit    budget         rebasing      payment       per-visit
        HH discipline type                  payment      neutrality     adjustment    update        payment
                                                         factor                       percentage
        ----------------------------------------------------------------------------------------------
        Home health aide..............        $57.89      x 1.0010       +$1.79        x 1.019       $60.87
        Medical Social Services.......        204.91      x 1.0010       +$6.34        x 1.019       215.47
        Occupational Therapy..........        140.70      x 1.0010       +$4.35        x 1.019       147.95
        Physical Therapy..............        139.75      x 1.0010       +$4.32        x 1.019       146.95
        Skilled Nursing...............        127.83      x 1.0010       +$3.96        x 1.019       134.42
        Speech-Language Pathology.....        151.88      x 1.0010       +$4.70        x 1.019       159.71
        ----------------------------------------------------------------------------------------------

        The CY 2016 per-visit payment rates for HHAs that do not submit the required quality data are updated by the CY 2016 HH payment update of 1.9 percent minus 2 percentage points (which is equal to -0.1 percent) and is shown in Table 10.

        Table 10--CY 2016 National Per-Visit Payment Amounts for HHAs That DO NOT Submit the Required Quality Data

        ----------------------------------------------------------------------------------------------
                                            CY 2015      Wage index     CY 2016       CY 2016 HH    CY 2016
                                            per-visit    budget         rebasing      payment       per-visit
        HH discipline type                  rates        neutrality     adjustment    update minus  rates
                                                         factor                       2 percentage
                                                                                       points
        ----------------------------------------------------------------------------------------------
        Home Health Aide..............        $57.89      x 1.0010       +$1.79        x 0.999       $59.68
        Medical Social Services.......        204.91      x 1.0010       +$6.34        x 0.999       211.24
        Occupational Therapy..........        140.70      x 1.0010       +$4.35        x 0.999       145.05
        Physical Therapy..............        139.75      x 1.0010       +$4.32        x 0.999       144.07
        Skilled Nursing...............        127.83      x 1.0010       +$3.96        x 0.999       131.79
        Speech-Language Pathology.....        151.88      x 1.0010       +$4.70        x 0.999       156.58
        ----------------------------------------------------------------------------------------------

      6. Low-Utilization Payment Adjustment (LUPA) Add-On Factors

        LUPA episodes that occur as the only episode or as an initial episode in a sequence of adjacent episodes are adjusted by applying an additional amount to the LUPA payment before adjusting for area wage differences. In the CY 2014 HH PPS final rule, we changed the methodology for calculating the LUPA add-on amount by finalizing the use of three LUPA add-on factors: 1.8451 for SN; 1.6700 for PT; and 1.6266 for SLP (78 FR 72306). We multiply the per-visit payment amount for the first SN, PT, or SLP visit in LUPA episodes that occur as the only episode or an initial episode in a sequence of adjacent episodes by the appropriate factor to determine the LUPA add-on payment amount. For example, for LUPA episodes that occur as the only episode or an initial episode in a sequence of adjacent episodes, if the first skilled visit is SN, the payment for that visit would be $248.02 (1.8451 multiplied by $134.42), subject to area wage adjustment.
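
        Expressed as a short illustrative calculation (before area wage adjustment):

          # Illustrative LUPA add-on calculation (before area wage adjustment),
          # using the LUPA add-on factors above and the CY 2016 per-visit amounts
          # from Table 9.
          lupa_add_on_factors = {"SN": 1.8451, "PT": 1.6700, "SLP": 1.6266}
          per_visit_2016 = {"SN": 134.42, "PT": 146.95, "SLP": 159.71}
          # First skilled visit is SN in an only or initial LUPA episode:
          print(round(lupa_add_on_factors["SN"] * per_visit_2016["SN"], 2))  # 248.02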

      7. CY 2016 Non-routine Medical Supply (NRS) Payment Rates

        Payments for NRS are computed by multiplying the relative weight for a particular severity level by the NRS conversion factor. To determine the CY 2016 NRS conversion factor, we start with the 2015 NRS conversion factor ($53.23) and apply the -2.82 percent rebasing adjustment described in section II.C. of this rule (1 - 0.0282 = 0.9718). We then update the conversion factor by the CY 2016 HH payment update of 1.9 percent. We do not apply a standardization factor as the NRS payment amount calculated from the conversion factor is not wage or case-mix adjusted when the final claim payment amount is computed. The NRS conversion factor for CY 2016 is shown in Table 11.
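
        As an illustrative cross-check of the NRS arithmetic shown in Tables 11 through 14 (rounding conventions may differ slightly from the official calculation):

          # Illustrative cross-check of the NRS conversion factors and a
          # severity-level payment; rounding conventions may differ slightly.
          nrs = 53.23 * 0.9718                  # CY 2015 factor with -2.82 percent rebasing
          cf_quality = round(nrs * 1.019, 2)    # HHAs submitting quality data
          cf_no_quality = round(nrs * 0.999, 2) # HHAs not submitting quality data
          print(cf_quality, cf_no_quality)      # ~52.71  ~51.68
          # Severity level 1 payment = relative weight x conversion factor.
          print(round(0.2698 * cf_quality, 2))  # ~14.22 (Table 12)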

        Table 11--CY 2016 NRS Conversion Factor for HHAs That DO Submit the Required Quality Data

        ----------------------------------------------------------------------
        CY 2015 NRS conversion factor..................................  $53.23
        CY 2016 rebasing adjustment....................................  x 0.9718
        CY 2016 HH payment update percentage...........................  x 1.019
        CY 2016 NRS conversion factor..................................  $52.71
        ----------------------------------------------------------------------

        Using the CY 2016 NRS conversion factor, the payment amounts for the six severity levels are shown in Table 12.

        Page 68651

        Table 12--CY 2016 NRS Payment Amounts for HHAs That DO Submit the Required Quality Data

        ----------------------------------------------------------------------
                                                                      CY 2016 NRS
        Severity level            Points (scoring)   Relative weight  payment amount
        ----------------------------------------------------------------------
        1......................   0                       0.2698          $14.22
        2......................   1 to 14                 0.9742           51.35
        3......................   15 to 27                2.6712          140.80
        4......................   28 to 48                3.9686          209.18
        5......................   49 to 98                6.1198          322.57
        6......................   99+                    10.5254          554.79
        ----------------------------------------------------------------------

        For HHAs that do not submit the required quality data, we again begin with the CY 2015 NRS conversion factor ($53.23) and apply the -2.82 percent rebasing adjustment as discussed in section II.C of this final rule (1 - 0.0282 = 0.9718). We then update the NRS conversion factor by the CY 2016 HH payment update of 1.9 percent minus 2 percentage points. The CY 2016 NRS conversion factor for HHAs that do not submit quality data is shown in Table 13.

        Table 13--CY 2016 NRS Conversion Factor for HHAs That DO NOT Submit the Required Quality Data

        ----------------------------------------------------------------------
        CY 2015 NRS conversion factor..................................  $53.23
        CY 2016 rebasing adjustment....................................  x 0.9718
        CY 2016 HH payment update percentage minus 2 percentage points  x 0.999
        CY 2016 NRS conversion factor..................................  $51.68
        ----------------------------------------------------------------------

        The payment amounts for the various severity levels based on the updated conversion factor for HHAs that do not submit quality data are calculated in Table 14.

        Table 14--CY 2016 NRS Payment Amounts for HHAs That DO NOT Submit the Required Quality Data

        ----------------------------------------------------------------------------------------------------------------
          Severity level      Points (scoring)      Relative weight      CY 2016 NRS payment amounts
        ----------------------------------------------------------------------------------------------------------------
          1                   0                            0.2698         $13.94
          2                   1 to 14                      0.9742          50.35
          3                   15 to 27                     2.6712         138.05
          4                   28 to 48                     3.9686         205.10
          5                   49 to 98                     6.1198         316.27
          6                   99+                         10.5254         543.95
        ----------------------------------------------------------------------------------------------------------------
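
        For illustration only, the following sketch (which is not part of the rule) reproduces the arithmetic behind Tables 12 through 14: the CY 2015 NRS conversion factor is multiplied by the rebasing adjustment and the applicable payment update factor, and the resulting conversion factor is multiplied by each severity level's relative weight. The 1.019 update factor for quality-data submitters is an assumption based on the 1.9 percent CY 2016 HH payment update described above; the function and variable names are illustrative.

# Illustrative sketch: CY 2016 NRS conversion factors and payment amounts (Tables 12-14).
# Assumes a 1.019 update factor for HHAs that submit quality data and 0.999
# (1.9 percent minus 2 percentage points) for HHAs that do not.

CY2015_NRS_CONVERSION_FACTOR = 53.23
REBASING_ADJUSTMENT = 1 - 0.0282          # -2.82 percent rebasing adjustment

RELATIVE_WEIGHTS = {1: 0.2698, 2: 0.9742, 3: 2.6712,
                    4: 3.9686, 5: 6.1198, 6: 10.5254}

def nrs_conversion_factor(submits_quality_data):
    # Apply the rebasing adjustment, then the payment update factor.
    update = 1.019 if submits_quality_data else 0.999
    return round(CY2015_NRS_CONVERSION_FACTOR * REBASING_ADJUSTMENT * update, 2)

def nrs_payment(severity_level, submits_quality_data):
    # Conversion factor times the severity level's relative weight.
    return round(nrs_conversion_factor(submits_quality_data)
                 * RELATIVE_WEIGHTS[severity_level], 2)

print(f"{nrs_conversion_factor(True):.2f}")    # 52.71
print(f"{nrs_conversion_factor(False):.2f}")   # 51.68 (Table 13)
print(f"{nrs_payment(3, True):.2f}")           # 140.80 (Table 12, severity level 3)
print(f"{nrs_payment(3, False):.2f}")          # 138.05 (Table 14, severity level 3)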

      8. Rural Add-On

        Section 421(a) of the MMA requires, for HH services furnished in a rural area (as defined in section 1886(d)(2)(D) of the Act), for episodes or visits ending on or after April 1, 2010, and before January 1, 2018, that the Secretary increase the payment amount that otherwise would have been made under section 1895 of the Act for the services by 3 percent. Section 421 of the MMA waives budget neutrality related to this provision, as the statute specifically states that the Secretary shall not reduce the standard prospective payment amount (or amounts) under section 1895 of the Act applicable to HH services furnished during a period to offset the increase in payments resulting from the application of this section of the statute.

        For CY 2016, home health payment rates for services provided to beneficiaries in areas that are defined as rural under the OMB delineations will be increased by 3 percent as mandated by section 421(a) of the MMA. The 3 percent rural add-on is applied to the national, standardized 60-day episode payment rate, national per visit rates, and NRS conversion factor when HH services are provided in rural (non-CBSA) areas. Refer to Tables 15 through 18 for these payment rates.

        Page 68652

        Table 15--CY 2016 Payment Amounts for 60-Day Episodes for Services Provided in a Rural Area

        --------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           CY 2016 national, standardized    Multiply by the 3 percent    CY 2016 rural national, standardized
                                                           60-day episode payment rate       rural add-on                 60-day episode payment rate
        --------------------------------------------------------------------------------------------------------------------------------------------------------
          For HHAs that DO submit quality data             $2,965.12                         x 1.03                       $3,054.07
          For HHAs that DO NOT submit quality data         $2,906.92                         x 1.03                       $2,994.13
        --------------------------------------------------------------------------------------------------------------------------------------------------------

        Table 16--CY 2016 Per-Visit Amounts for Services Provided in a Rural Area

        --------------------------------------------------------------------------------------------------------------------------------------------------------
                                  For HHAs that DO submit quality data                    For HHAs that DO NOT submit quality data
          HH Discipline type      CY 2016 per-    Multiply by the    CY 2016 rural        CY 2016 per-    Multiply by the    CY 2016 rural
                                  visit rate      3 percent rural    per-visit rates      visit rate      3 percent rural    per-visit rates
                                                  add-on                                                  add-on
        --------------------------------------------------------------------------------------------------------------------------------------------------------
          HH Aide                 $60.87          x 1.03             $62.70               $59.68          x 1.03             $61.47
          MSS                     215.47          x 1.03             221.93               211.24          x 1.03             217.58
          OT                      147.95          x 1.03             152.39               145.05          x 1.03             149.40
          PT                      146.95          x 1.03             151.36               144.07          x 1.03             148.39
          SN                      134.42          x 1.03             138.45               131.79          x 1.03             135.74
          SLP                     159.71          x 1.03             164.50               156.58          x 1.03             161.28
        --------------------------------------------------------------------------------------------------------------------------------------------------------

        Table 17--CY 2016 NRS Conversion Factor for Services Provided in Rural Areas

        --------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           CY 2016 NRS          Multiply by the 3 percent    CY 2016 rural NRS
                                                           conversion factor    rural add-on                 conversion factor
        --------------------------------------------------------------------------------------------------------------------------------------------------------
          For HHAs that DO submit quality data             $52.71               x 1.03                       $54.29
          For HHAs that DO NOT submit quality data         $51.68               x 1.03                       $53.23
        --------------------------------------------------------------------------------------------------------------------------------------------------------

        Table 18--CY 2016 NRS Payment Amounts for Services Provided in Rural Areas

        --------------------------------------------------------------------------------------------------------------------------------------------------------
                                                     For HHAs that DO submit quality data           For HHAs that DO NOT submit quality data
                                                     (CY 2016 NRS conversion factor = $54.29)       (CY 2016 NRS conversion factor = $53.23)
          Severity level      Points (scoring)       Relative weight    CY 2016 NRS payment         Relative weight    CY 2016 NRS payment
                                                                        amounts for rural areas                        amounts for rural areas
        --------------------------------------------------------------------------------------------------------------------------------------------------------
          1                   0                            0.2698        $14.65                          0.2698          $14.36
          2                   1 to 14                      0.9742         52.89                          0.9742           51.86
          3                   15 to 27                     2.6712        145.02                          2.6712          142.19
          4                   28 to 48                     3.9686        215.46                          3.9686          211.25
          5                   49 to 98                     6.1198        332.24                          6.1198          325.76
          6                   99+                         10.5254        571.42                         10.5254          560.27
        --------------------------------------------------------------------------------------------------------------------------------------------------------
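
        For illustration only, the following minimal sketch (which is not part of the rule) applies the 3 percent rural add-on reflected in Tables 15 through 18 to a national payment amount; the function name is illustrative.

RURAL_ADD_ON = 1.03   # 3 percent rural add-on under section 421(a) of the MMA

def rural_rate(national_amount):
    """Payment amount for HH services furnished in rural (non-CBSA) areas."""
    return round(national_amount * RURAL_ADD_ON, 2)

print(rural_rate(2965.12))   # 3054.07 -- 60-day episode rate, quality submitters (Table 15)
print(rural_rate(134.42))    # 138.45  -- SN per-visit rate, quality submitters (Table 16)
print(rural_rate(52.71))     # 54.29   -- NRS conversion factor, quality submitters (Table 17)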

        The following is a summary of comments we received regarding the CY 2016 home health rate update.

        Comment: A commenter objected to the proposed 0.6 percent productivity adjustment.

        Response: The productivity adjustment was mandated by section 3401(e) of the Affordable Care Act, which added section 1895(b)(3)(B)(vi) to the Act and requires that the market basket percentage under the HH PPS be adjusted annually by changes in economy-wide productivity for CY 2015 and subsequent calendar years. Since publication of the proposed rule, our forecast for the productivity adjustment has been revised to 0.4 percent based on an updated forecast with historical data through 2014.

        Comment: A commenter stated that because CAHs are located in rural areas, the absence of CAH wage data further compromises the accuracy of the hospital wage index in determining the labor costs of HHAs providing services in rural areas. In addition, the commenter stated that, pending development of an industry-specific wage index, CMS should add a population density adjustment to the labor portion of the payment to account for the increased costs of providing services in less densely populated areas.

        Response: Although the pre-floor, pre-reclassified hospital wage index does not include data from CAHs, we believe it reflects the relative level of wages and wage-related costs applicable to providing home health services. As we stated in the IPPS Final Rule published on August 1, 2003 (68 FR 45397), ``CAHs represent a substantial number of hospitals with significantly different labor costs in many labor market areas where they exist.'' We further noted that, ``. . . in 89 percent of all labor market areas with hospitals that

        Page 68653

        converted to CAH status sometime after FY 2000, the average hourly wage for CAHs is lower than the average hourly wage for other short-term hospitals in the area. In 79 percent of the labor market areas with CAHs, the average hourly wage for CAHs is lower than the average hourly wage for other short-term hospitals by 5 percent or greater. These results suggest that the wage data for CAHs, in general, are significantly different from other short-term hospitals.''

        At this time, we do not have evidence that a population density adjustment is appropriate. While rural HHAs cite the added cost of long distance travel to provide care for their patients, urban HHAs cite added costs associated with needed security measures and traffic congestion.

        Comment: A commenter urges CMS to review the wage index calculation for rural Massachusetts and to include Nantucket Cottage Hospital's data in the calculation. The commenter states that Nantucket Cottage Hospital had given up its critical access hospital (CAH) designation in 2014, yet CMS has apparently not used wage data from Nantucket Cottage Hospital in calculating the 2016 wage index for rural Massachusetts. The commenter urges CMS to include wage data from CAHs in calculating the wage index for HHAs and other non-hospital provider types. The commenter believes that including wage data from CAHs would make the wage index more reflective of actual local wage practices.

        Response: Data from Nantucket Cottage Hospital is included in the calculation of the 2016 wage index for rural Massachusetts. In fact, data from this hospital has been included in the calculation of the HH wage index for rural Massachusetts since CY 2012. It has been our longstanding practice to not include data from CAHs in the calculation of the HH wage index. We only include hospital data from acute IPPS hospitals in the calculation of the HH wage index.

        Comment: A commenter questions the validity of the wage index assigned to CBSA 22520, Florence-Muscle Shoals, AL, and requests that the underlying data used to determine this index be investigated. In addition, the commenter states that the wage index as assigned places this urban area below the rural wage index for the state, which the commenter believes cannot be correct.

        Response: The HH wage index values in urban areas are not necessarily higher than the HH wage index values in rural areas. The wage index values are based on data submitted on the inpatient hospital cost reports. We use established review processes to ensure the accuracy of the hospital cost report data and the resulting wage index. The home health wage index is derived from the pre-floor, pre-reclassified wage index, which is calculated based on cost report data from hospitals paid under the IPPS. All IPPS hospitals must complete the wage index survey (Worksheet S-3, Parts II and III) as part of their Medicare cost reports. Cost reports will be rejected if Worksheet S-3 is not completed. In addition, our intermediaries perform desk reviews on all hospitals' Worksheet S-3 wage data, and we run edits on the wage data to further ensure their accuracy and validity. We believe that our review processes result in an accurate reflection of the applicable wages for the areas given. The processes and procedures describing how the inpatient hospital wage index is developed are discussed in the Inpatient Prospective Payment System (IPPS) rule each year, with the most recent discussion provided in the FY 2016 IPPS final rule (80 FR 49488 through 49508). Any provider type may submit comments on the hospital wage index during the annual IPPS rulemaking cycle.

        Comment: Several commenters took issue with the fact that the HH wage index is based on pre-floor, pre-reclassified hospital wage data, but hospitals in the same geographic locations have the ability to apply for re-classification to another CBSA and may be eligible for the rural floor wage index. The commenters state that this inequity has created a competitive advantage for hospitals in recruiting and retaining scarce labor. Several commenters believe that the statute does give CMS authority to address and correct some of these inequities. One commenter believes that a correction to the manner in which the wage index is calculated is needed in order to recruit and retain staff necessary to provide home health care. The commenter continues to state that otherwise it may be difficult for HHAs to meet the increased demand for services, which may jeopardize the success of CMS' VBP initiatives. Another commenter recommends that CMS reform the HH wage index by instituting a proxy that allows HHAs to receive the same reclassification as hospitals if they provide services in the same service area.

        Response: We continue to believe that the regulations and statutes that govern the HH PPS do not provide a mechanism for allowing HHAs to seek geographic reclassification or to utilize the rural floor provisions that exist for IPPS hospitals. Section 4410(a) of the BBA provides that the area wage index applicable to any hospital that is located in an urban area of a State may not be less than the area wage index applicable to hospitals located in rural areas in that state. This is the rural floor provision, and it is specific to hospitals. The reclassification provision is found in section 1886(d)(10) of the Act. Section 1886(d)(10)(C)(i) of the Act states, ``The Board shall consider the application of any subsection (d) hospital requesting that the Secretary change the hospital's geographic classification . . .'' This provision is only applicable to hospitals as defined in section 1886(d) of the Act.

        In addition, we do not believe that using hospital reclassification data would be appropriate as these data are specific to the requesting hospitals and may or may not apply to a given HHA in a given instance. With regard to implementing a rural floor, we do not believe it would be prudent at this time to adopt such a policy. MedPAC has recommended eliminating the rural floor policy from the calculation of the IPPS wage index (see Chapter 3 of MedPAC's March 2013 Report to Congress on Medicare Payment Policy, available at http://medpac.gov/documents/reports/mar13_entirereport.pdf, which notes on page 65 that in 2007, MedPAC had ``. . . recommended eliminating these special wage index adjustments and adopting a new wage index system to avoid geographic inequities that can occur due to current wage index policies.'').

        We continue to believe that using the pre-floor, pre-reclassified hospital wage index as the wage adjustment to the labor portion of the HH PPS rates is appropriate and reasonable.

        Comment: A commenter requests that CMS explore wholesale revision and reform of the HH wage index. The commenter believes that existing law permits CMS flexibility in establishing area wage adjustment factors. Another commenter notes that CMS indicated that the entire wage index system was under review, and that a move to a Commuting-Based Wage Index (CBWI) was being considered. The commenter urges CMS to expedite that review and implement a system that not only recognizes variations between localities, but also treats all provider types within a local market equitably. Until such a system is in place, the commenter urges CMS to adjust the 2016 HHA wage index to reflect a policy to limit the wage index disparity between provider types within a given CBSA to no more than 10 percent.

        Response: CMS' ``Report to Congress: Plan to Reform the Medicare Wage Index'' was submitted by the Secretary on April 11, 2012 and is available on

        Page 68654

        our Wage Index Reform Web page at https://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/Wage-Index-Reform.html. This report states that other steps are necessary before we would be able to adopt a CBWI. In the meantime, regardless of whether it would be appropriate to do so, we do not believe it would be feasible to limit the differences in wage index values among provider types within a given CBSA to no more than 10 percent, due to timing issues. Some provider types are reimbursed on a calendar year basis and some are reimbursed on a fiscal year basis.

        Comment: A commenter opposes CMS' use of the hospital wage index to establish the HH wage index. The commenter states that differences in the occupational personnel pool and costs between hospitals and HHAs make the use of the hospital wage index inappropriate in the HH setting. The commenter further states that hospitals benefit from institutional efficiencies and that rural hospitals have a reclassification mechanism to avoid exposure to the drastic rural index rate in most states. The commenter believes that Congress has granted CMS discretion in establishing the HH wage index and that CMS should establish a HH-specific wage index. Another commenter believes that basing the wage index on hospital data is not reliable for home health. The commenter continues to state that home health workers' pay is typically much more than that of a hospital employee due to the demanding nature of the job. The commenter suggests that CMS complete a detailed study of this issue.

        Response: Our previous attempts at either proposing or developing a home health specific wage index were not well received by the home health industry. In a Federal Register Notice (53 FR 38476) published on September 30, 1988, the Health Care Financing Administration (HCFA), as we were then known, implemented an HHA-specific wage index based on data received from HHAs. Subsequently, HCFA and the Congress received numerous complaints from providers concerning the burden that the reporting requirements posed and the accuracy of the data. As a result, the Congress retroactively repealed its mandate in the Medicare Catastrophic Coverage Act of 1988 for use of an HHA wage index and referenced use of the hospital wage index (see section 1895(b)(4)(C) of the Act). This caused great confusion among both providers and fiscal intermediaries.

        Developing a wage index that utilizes data specific to HHAs would require us to engage resources in an audit process. In order to establish a home health specific wage index, we would need to collect data that is specific to home health services. Because of the volatility of the home health wage data and the significant amount of resources that would be required to improve the quality of those data, we do not expect to propose a home health specific wage index until we can demonstrate that a home health specific wage index would be more reflective of the wages and salaries paid in a specific area, be based upon stable data sources, significantly improve our ability to determine payment for HHAs, and that we can justify the resources required to collect the data, as well as the increased burden on providers. We believe that in the absence of home health specific wage data, using the pre-floor, pre-reclassified hospital wage data is appropriate and reasonable for the HH PPS.

        Comment: A commenter states that the wage index needs to reflect the growing difficulties of providing care in rural areas. The commenter states that paying lower wages to rural health care professionals who put as much time, skill, and intensity into their work as their urban counterparts exacerbates workforce shortages. The commenter continues to state that further reducing the wage index for rural providers will make recruiting and retaining medical professionals more difficult for rural America. The commenter states that using the wage index for the local area ignores important market forces and that many health professionals are recruited from a distance, making the local wage an insufficient financial incentive for practicing in rural America. Another commenter states that rural HHAs often function as the primary caregivers for elderly homebound patients, who have high resource needs, which also increases the cost of rural home health services.

        Response: The HH wage index values in rural areas are not necessarily lower than the HH wage index values in urban areas. The HH wage index reflects the wages that inpatient hospitals pay in their local geographic areas. In addition, HHAs receive rural add-on payments for services provided to beneficiaries in rural areas. Section 421(a) of the MMA, as amended by section 210 of the Medicare Access and CHIP Reauthorization Act of 2015 (MACRA) (Pub. L. 114-10), provides for a payment increase of 3 percent for HH services provided in rural areas for episodes or visits ending on or after April 1, 2010, and before January 1, 2018.

        Final Decision: After considering the comments received in response to the CY 2016 HH PPS proposed rule (80 FR 39840) and for the reasons discussed above, we are finalizing our proposal to use the pre-floor, pre-reclassified hospital inpatient wage index as the wage adjustment to the labor portion of the HH PPS rates. For CY 2016, the updated wage data are for hospital cost reporting periods beginning on or after October 1, 2011 and before October 1, 2012 (FY 2012 cost report data).
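
        For illustration only, the following sketch shows the general form of the wage adjustment applied to the labor portion of a HH PPS payment amount using the pre-floor, pre-reclassified hospital wage index. The labor-related share shown here is an assumed, illustrative value, and the function and variable names are not drawn from the rule; the applicable labor-related share is specified elsewhere in this rule.

ASSUMED_LABOR_SHARE = 0.78535   # illustrative labor-related share; see the rule for the applicable value

def wage_adjusted_payment(base_payment, wage_index):
    """Wage-adjust the labor portion of a payment amount; the non-labor portion is unadjusted."""
    labor_portion = base_payment * ASSUMED_LABOR_SHARE * wage_index
    nonlabor_portion = base_payment * (1 - ASSUMED_LABOR_SHARE)
    return round(labor_portion + nonlabor_portion, 2)

# Example: national, standardized 60-day episode rate in an area with a wage index of 0.9500
print(wage_adjusted_payment(2965.12, 0.9500))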

    4. Payments for High-Cost Outliers Under the HH PPS

      1. Background

      In the July 10, 2015 Medicare and Medicaid Programs; CY 2016 Home Health Prospective Payment System Rate Update; Home Health Value-Based Purchasing Model; and Home Health Quality Reporting Requirements; Proposed Rules (80 FR 39863 through 39864), we described the background and current method for determining outlier payments under the HH PPS. In that rule, we did not propose any changes to the current home health outlier payment policy for CY 2016.

      For this final rule, simulating payments using CY 2014 claims data (as of June 30, 2015) and the CY 2016 payment rates, without the rebasing and nominal case-mix growth adjustments as described in section III.C.3 of this rule, we estimate that outlier payments in CY 2016 would comprise 2.13 percent of total payments. Based on simulations using CY 2014 claims data and the CY 2016 payment rates, including the rebasing and nominal case-mix growth adjustments as described in section III.C.3 of this rule, we estimate that outlier payments would comprise approximately 2.30 percent of total HH PPS payments, a percent change of almost 8 percent. This increase is attributable to the increase in the national per-visit amounts through the rebasing adjustments and the decrease in the national, standardized 60-day episode payment amount as a result of the rebasing and nominal case-mix growth adjustments. Given that the same rebasing adjustments and case-mix growth reduction would also occur for CY 2017, and hence a similar anticipated increase in outlier payments, we estimate that for CY 2017 outlier payments as a percent of total HH PPS payments would be approximately 2.5 percent.

      We did not propose a change to the FDL ratio or loss-sharing ratio for CY

      Page 68655

      2016 as we believe that maintaining an FDL ratio of 0.45 and a loss-sharing ratio of 0.80 is appropriate given that the percentage of outlier payments is estimated to increase as a result of the increase in the national per-visit amounts through the rebasing adjustments and the decrease in the national, standardized 60-day episode payment amount as a result of the rebasing adjustment and nominal case-mix growth reduction. We will continue to monitor the percent of total HH PPS payments paid as outlier payments to determine if future adjustments to either the FDL ratio or loss-sharing ratio are warranted.
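
      For illustration only, the following sketch outlines the general shape of an outlier payment under the finalized parameters. It is not the rule's full methodology (the imputation of episode costs, the wage adjustment of the fixed dollar loss amount, and the 10 percent agency-level cap are described in the proposed rule and elsewhere in this section), and the variable names are illustrative.

FDL_RATIO = 0.45           # finalized fixed dollar loss (FDL) ratio
LOSS_SHARING_RATIO = 0.80  # finalized loss-sharing ratio

def outlier_payment(case_mix_adjusted_episode_payment, imputed_episode_cost,
                    wage_adjusted_standard_amount):
    """Outlier payment when an episode's imputed cost exceeds the outlier threshold."""
    fdl_amount = FDL_RATIO * wage_adjusted_standard_amount
    outlier_threshold = case_mix_adjusted_episode_payment + fdl_amount
    excess = max(0.0, imputed_episode_cost - outlier_threshold)
    return round(LOSS_SHARING_RATIO * excess, 2)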

      The following is a summary of comments we received regarding payments for high-cost outliers.

      Comment: One commenter expressed support of the continuation of the high cost outlier parameters as currently structured.

      Response: We appreciate the commenter's support of the current HH PPS outlier policy. We strive to maintain an approach that accounts for episodes that incur unusually high costs due to patient care needs.

      Comment: Several commenters recommended changes to the existing outlier policy, including the elimination of the outlier payment policy altogether as well as modifications to the FDL ratio and/or loss-sharing ratio in order to generate outlier payment levels approximating 2.5 percent.

      Response: We believe that section 1895(b)(5)(A) of the Act affords the Secretary the discretion as to whether or not to have an outlier policy under the HH PPS. We plan to continue investigating whether or not an outlier policy remains appropriate as well as ways to maintain an outlier policy for episodes that incur unusually high costs due to patient care needs without qualifying episodes of care that do not meet said criteria or are potentially fraudulent. We recently awarded a contract to Abt Associates to address any findings from the home health study required by section 3131(d) of the Affordable Care Act, monitor the potential impact of the rebasing adjustments and other recent payment changes, and develop payment options to ensure ongoing access to care for vulnerable populations. The work under this contract may include potential revisions to the outlier payment methodology to better reflect costs of treating Medicare beneficiaries with high levels of severity of illness.

      Comment: One commenter suggested that CMS's outlier policy and ten percent threshold cap are not appropriate fraud-fighting initiatives and suggested other mechanisms for oversight and monitoring, including a provider-specific floor (minimum) on the number or percent of episodes that result in LUPAs.

      Response: As we have noted in the past (74 FR 58085), we are committed to addressing potentially fraudulent activities, especially those in areas where we see suspicious outlier payments. As we noted above, we plan to examine potential revisions to the outlier payment methodology through ongoing studies and analysis of home health claims and other utilization data. Monitoring of potentially fraudulent activity will be captured in this analysis, and we will make policy and other adjustments as appropriate in light of the new data and outcomes.

      Final Decision: We are finalizing no change to the FDL ratio or loss-sharing ratio for CY 2016. However, we will continue to monitor outlier payments and continue to explore ways to maintain an outlier policy for episodes that incur unusually high costs due to patient care needs without qualifying episodes of care that do not meet those criteria.

    5. Report to the Congress on the Home Health Study Required by Section 3131(d) of the Affordable Care Act and an Update on Subsequent Research and Analysis

      In the CY 2016 HH PPS proposed rule (80 FR 39840), we included an informational summary of the Report to Congress on the home health study required by section 3131(d) of the Affordable Care Act and we provided an update on subsequent research and analysis completed to date. We will continue to provide the home health industry with periodic updates on the progress of our subsequent research, aimed at addressing the findings from the section 3131(d) of the Affordable Care Act home health study, in future rulemaking and/or announcements on the HHA Center Web page at: https://www.cms.gov/Center/Provider-Type/Home-Health-Agency-HHA-Center.html.

    6. Technical Regulations Text Changes

      We proposed to make several technical corrections in part 484 to better align the payment requirements with recent statutory and regulatory changes for home health services. We proposed to make changes to Sec. 484.205(e) to state that estimated total outlier payments for a given calendar year are limited to no more than 2.5 percent of total outlays under the HH PPS, as required by section 1895(b)(5)(A) of the Act as amended by section 3131(b)(2)(B) of the Affordable Care Act, rather than 5 percent of total outlays. Similarly, we also proposed to specify in Sec. 484.240(e) that the fixed dollar loss and the loss-sharing amounts are chosen so that the estimated total outlier payment is no more than 2.5 percent of total payments under the HH PPS. We also proposed to describe in Sec. 484.240(f) that the estimated total amount of outlier payments to an HHA in a given year may not exceed 10 percent of the estimated total payments to the specific agency under the HH PPS in a given year. This update aligns the regulations text at Sec. 484.240(f) with the statutory requirement. Finally, we proposed a minor editorial change in Sec. 484.240(b) to specify that the outlier threshold for each case-mix group is the episode payment amount for that group, or the PEP adjustment amount for the episode, plus a fixed dollar loss amount that is the same for all case-mix groups.
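
      For illustration only, the following sketch checks the two outlier limits described above: the 2.5 percent limit on estimated total outlier payments under the HH PPS and the 10 percent agency-level cap at Sec. 484.240(f). It is an assumed, simplified check for illustration, not regulation text, and the function names are illustrative.

TOTAL_OUTLIER_LIMIT = 0.025   # 2.5 percent of estimated total HH PPS payments
AGENCY_OUTLIER_CAP = 0.10     # 10 percent of estimated total payments to a specific HHA

def within_total_limit(estimated_total_outlier, estimated_total_hh_pps):
    """True if aggregate outlier payments stay within the 2.5 percent limit."""
    return estimated_total_outlier <= TOTAL_OUTLIER_LIMIT * estimated_total_hh_pps

def within_agency_cap(agency_outlier_payments, agency_total_payments):
    """True if an individual HHA's outlier payments stay within the 10 percent cap."""
    return agency_outlier_payments <= AGENCY_OUTLIER_CAP * agency_total_payments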

      In addition to the proposed changes to the regulations text pertaining to outlier payments under the HH PPS, we also proposed to amend Sec. 409.43(e)(iii) and to add language to Sec. 484.205(d) to clarify the frequency of review of the plan of care and the provision of Partial Episode Payments (PEP) under the HH PPS as a result of a regulations text change in Sec. 424.22(b) that was finalized in the CY 2015 HH PPS final rule (79 FR 66032). Specifically, we proposed to change the definition of an intervening event to include transfers and instances where a patient is discharged and returns to home health during a 60-day episode, rather than a discharge and return to the same HHA during a 60-day episode. In Sec. 484.220, we proposed to update the regulations text to reflect the downward adjustments to the 60-day episode payment rate due to changes in the coding or classification of different units of service that do not reflect real changes in case-mix (nominal case-mix growth) applied to calendar years 2012 and 2013, which were finalized in the CY 2012 HH PPS final rule (76 FR 68532), as well as updating the CY 2011 adjustment to 3.79 percent as finalized in the CY 2011 HH PPS final rule (75 FR 70461). In Sec. 484.225, we proposed to eliminate references to outdated market basket index factors by removing paragraphs (b), (c), (d), (e), (f), and (g). In Sec. 484.230, we proposed to delete the last sentence as a result of a change from a separate LUPA add-on amount to a LUPA add-on factor finalized in the CY 2014 HH PPS final rule (78 FR 72256). Finally, we proposed deleting and reserving Sec. 484.245 as we believe that this language is no longer applicable under the HH PPS, as it was meant to

      Page 68656

      facilitate the transition to the original PPS established in CY 2000.

      Lastly, we proposed to make one technical correction in Sec. 424.22 to re-designate paragraph (a)(1)(v)(B)(1) as (a)(2).

      We invited comments on these technical corrections and associated changes in the regulations in parts 409, 424, and 484. However, we did not receive any comments regarding the technical regulations text changes.

      Final Decision: We are finalizing the technical regulations text changes in parts 409, 424, and 484 as proposed.

  13. Provisions of the Home Health Value-Based Purchasing (HHVBP) Model and Response to Comments

    1. Background

      In the CY 2015 Home Health Prospective Payment System (HH PPS) final rule titled ``Medicare and Medicaid Programs; CY 2015 Home Health Prospective Payment System Rate Update; Home Health Quality Reporting Requirements; and Survey and Enforcement Requirements for Home Health Agencies'' (79 FR 66032-66118), we indicated that we were considering the development of a home health value-based purchasing (HHVBP) model. We sought comments on a future HHVBP model, including elements of the model; the size of the payment incentives and the percentage of payments that would need to be placed at risk in order to spur home health agencies (HHAs) to make the necessary investments to improve the quality of care for Medicare beneficiaries; the timing of the payment adjustments; and how performance payments should be distributed. We also sought comments on the best approach for selecting states for participation in this model. We noted that if the decision was made to move forward with the implementation of a HHVBP model in CY 2016, we would solicit additional comments on a more detailed model proposal to be included in future rulemaking.

      In the CY 2015 HH PPS final rule,\8\ we indicated that we received a number of comments related to the magnitude of the percentage payment adjustments; evaluation criteria; payment features; a beneficiary risk adjustment strategy; state selection methodology; and the approach to selecting Medicare-certified HHAs. A number of commenters supported the development of a value-based purchasing model in the home health industry in whole or in part with consideration of the design parameters provided. No commenters provided strong counterpoints or alternative design options that dissuaded CMS from moving forward with the general design and framework of the HHVBP model as discussed in the CY 2015 HH PPS proposed rule. All comments were considered in our decision to develop an HHVBP model for implementation beginning January 1, 2016. Therefore, in the CY 2016 HH PPS proposed rule, we proposed to implement a HHVBP model, which included a randomized state selection methodology; a reporting framework; a payment adjustment methodology; a payment adjustment schedule by performance year and payment adjustment percentage; a quality measures selection methodology, classifications and weighting, measures for performance year one, including the reporting of New Measures, and a framework for proposing to adopt measures for subsequent performance years; a performance scoring methodology, which includes performance based on achievement and improvement; a review and recalculation period; and an evaluation framework. As we discuss in more detail below, we are finalizing our proposal to implement the HHVBP Model beginning January 1, 2016. We respond to comments received on the proposed components of the model, and discuss our final policies with respect to each of these components, in the relevant sections below.

      ---------------------------------------------------------------------------

      \8\ Medicare and Medicaid Programs; CY 2015 Home Health Prospective Payment System Rate Update; Home Health Quality Reporting Requirements; and Survey and Enforcement Requirements for Home Health Agencies, 79 FR 66105-66106 (November 6, 2014).

      ---------------------------------------------------------------------------

      The basis for developing the proposed value-based purchasing (VBP) model, as described in the proposed regulations at Sec. 484.300 et seq., stems from several important areas of consideration. First, we expect that tying quality to payment through a system of value-based purchasing will improve the beneficiaries' experience and outcomes. In turn, we expect payment adjustments that both reward improved quality and penalize poor performance will incentivize quality improvement and encourage efficiency, leading to a more sustainable payment system.

      Second, section 3006(b) of the Affordable Care Act directed the Secretary of the Department of Health and Human Services (the Secretary) to develop a plan to implement a VBP program for payments under the Medicare Program for HHAs, and the Secretary issued an associated Report to Congress in March of 2012 (2012 Report).\9\ The 2012 Report included a roadmap for implementation of an HHVBP model and outlined the need to develop an HHVBP program that aligns with other Medicare programs and coordinates incentives to improve quality. The 2012 Report also indicated that a HHVBP program should build on and refine existing quality measurement tools and processes. In addition, the 2012 Report indicated that one of the ways that such a program could link payment to quality would be to tie payments to overall quality performance.

      ---------------------------------------------------------------------------

      \9\ CMS, ``Report to Congress: Plan to Implement a Medicare Home Health Agency Value-Based Purchasing Program'' (March 15, 2012) available at http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/HomeHealthPPS/downloads/stage-2-NPRM.PDF.

      ---------------------------------------------------------------------------

      Third, section 402(a)(1)(A) of the Social Security Amendments of 1967 (as amended) (42 U.S.C. 1395b-1(a)(1)(A)) provided authority for us to conduct the Home Health Pay-for-Performance (HHPFP) Demonstration that ran from 2008 to 2010. The results of that demonstration found modest quality improvement in certain measures after comparing the quality of care furnished by demonstration participants to the quality of care furnished by the control group. One important lesson learned from the HHPFP Demonstration was the need to link HHAs' quality improvement efforts to the incentives. HHAs in three of the four regions generated enough savings to have incentive payments in the first year of the demonstration, but the size of the payments was unknown until after the conclusion of the demonstration. Also, the time lag between quality performance and payment incentives was too long to provide sufficient motivation for HHAs to take the necessary steps to improve quality. The results of the demonstration, published in a comprehensive evaluation report,\10\ suggest that future models could benefit from ensuring that incentives are reliable enough, of sufficient magnitude, and paid in a timely fashion to encourage HHAs to be fully engaged in the quality of care initiative.

      ---------------------------------------------------------------------------

      \10\ ``CMS Report on Home Health Agency Value-Based Purchasing Program'' (February of 2012) available at https://www.cms.gov/Research-Statistics-Data-and-Systems/Statistics-Trends-and-Reports/Reports/Downloads/HHP4P_Demo_Eval_Final_Vol1.pdf.

      ---------------------------------------------------------------------------

      Furthermore, the President's FY 2015 and 2016 Budgets proposed that VBP should be extended to additional providers including skilled nursing facilities, home health agencies, ambulatory surgical centers, and hospital outpatient departments. The FY 2015 Budget called for at least 2-percent of payments to be tied to quality and efficiency of care on a budget neutral

      Page 68657

      basis. The FY 2016 Budget outlines a program which would tie at least 2-percent of Medicare payments to the quality and efficiency of care in the first 2 years of implementation beginning in 2017, and at least 5-percent beginning in 2019 without any impact to the budget. We proposed and are finalizing an HHVBP model that follows a graduated payment adjustment strategy within certain selected states beginning January 1, 2016.

      The Secretary has also set two overall delivery system reform goals for CMS. First, we seek to tie 30-percent of traditional, or fee-for-service, Medicare payments to quality or value-based payments through alternative payment models by the end of 2016, and to tie 50-percent of payments to these models by the end of 2018. Second, we seek to tie 85-percent of all traditional Medicare payments to quality or value by 2016 and 90-percent by 2018.\11\ To support these efforts, the Health Care Payment Learning and Action Network was recently launched to help advance the work being done across sectors to increase the adoption of value-based payments and alternative payment models. We believe that testing the HHVBP Model would support these goals.

      ---------------------------------------------------------------------------

      \11\ Content of this announcement can be found at http://www.hhs.gov/news/press/2015pres/01/20150126a.html.

      ---------------------------------------------------------------------------

      Finally, we have already successfully implemented the Hospital Value-Based Purchasing (HVBP) program, under which value-based incentive payments are made in a fiscal year to hospitals that meet performance standards established for a performance period with respect to measures for that fiscal year. The percentage of a participating hospital's base-operating DRG payment amount for FY 2016 discharges that is at risk, based on the hospital's performance under the program for that fiscal year, is 1.75 percent. That percentage will increase to 2.0 percent in FY 2017. We proposed and are finalizing in this rule an HHVBP Model that builds on the lessons learned and guidance from the HVBP program and other applicable demonstrations as discussed above, as well as from the evaluation report discussed earlier.

      As we stated in the CY 2016 HH PPS proposed rule, the HHVBP Model presents an opportunity to improve the quality of care furnished to Medicare beneficiaries and study what incentives are sufficiently significant to encourage HHAs to provide high quality care. The HHVBP Model will offer both a greater potential reward for high performing HHAs as well as a greater potential downside risk for low performing HHAs. We proposed, and are finalizing in this rule, that the model will begin on January 1, 2016, and include an array of measures that would capture the multiple dimensions of care that HHAs furnish.

      The HHVBP Model, as finalized, will be tested by CMS's Center for Medicare and Medicaid Innovation (CMMI) under section 1115A of the Act. Under section 1115A(d)(1) of the Act, the Secretary may waive such requirements of Titles XI and XVIII and of sections 1902(a)(1), 1902(a)(13), and 1903(m)(2)(A)(iii) as may be necessary solely for purposes of carrying out section 1115A with respect to testing models described in section 1115A(b). The Secretary is not issuing any waivers of the fraud and abuse provisions in sections 1128A, 1128B, and 1877 of the SSA or any other Medicare or Medicaid fraud and abuse laws for this model. Thus, notwithstanding any other provisions of this rule, all providers participating in the HHVBP Model must comply with all applicable fraud and abuse laws and regulations. Therefore, to clarify the scope of the Secretary's authority we have finalized Sec. 484.300 confirming authority to establish Part F under sections 1102, 1115A, and 1871 of the Act (42 U.S.C. 1315a), which authorizes the Secretary to issue regulations to operate the Medicare program and test innovative payment and service delivery models to improve coordination, quality, and efficiency of health care services furnished under Title XVIII.

      As we proposed, we are using section 1115A(d)(1) waiver authority to apply a reduction or increase of up to 8-percent to current Medicare payments to competing HHAs delivering care to beneficiaries in selected states, depending on the HHA's performance on specified quality measures relative to its peers. Specifically, the HHVBP Model will utilize the waiver authority to adjust Medicare payment rates under section 1895(b) of the Act.\12\ In accordance with the authority granted to the Secretary in section 1115A(d)(1) of the Act, we are waiving section 1895(b)(4) of the Act only to the extent necessary to adjust payment amounts to reflect the value-based payment adjustments under this model for Medicare-certified HHAs in specified states selected in accordance with CMS's selection methodology. We are not implementing this model under the authority granted by the Affordable Care Act under section 3131 (``Payment Adjustments for Home Health Care'').

      ---------------------------------------------------------------------------

      \12\ 42 U.S.C. 1395fff.

      ---------------------------------------------------------------------------

      We are finalizing in this rule, as we proposed, that the defined population includes all Medicare beneficiaries provided care by any Medicare-certified HHA delivering care within the selected states. Medicare-certified HHAs that are delivering care within selected states are considered `Competing Home Health Agencies' within the scope of this HHVBP Model. If care is delivered outside of selected states, or within a non-selected state that does not have a reciprocal agreement with a selected state, payments for those beneficiaries are not considered within the scope of the model because we are basing participation in the model on state-specific CMS Certification Numbers (CCNs). Payment adjustments for each year of the model will be calculated based on a comparison of how well each competing HHA performed during the performance period for that year (proposed, and finalized below, to be one year in length, starting in CY 2016) with its performance on the same measures in 2015 (proposed, and finalized below, to be the baseline data year).

      As we proposed, and are finalizing in this rule, the first performance year will be CY 2016, the second will be CY 2017, the third will be CY 2018, the fourth will be CY 2019, and the fifth will be CY 2020. Greater detail on performance periods is outlined in Section D--Performance Assessment and Payment Periods. This model will test whether being subject to significant payment adjustments to the Medicare payment amounts that would otherwise be made to competing Medicare-certified HHAs would result in statistically significant improvements in the quality of care being delivered to this specific population of Medicare beneficiaries.

      We proposed, and are finalizing in this rule, to identify Medicare-certified HHAs to compete in this model using state borders as boundaries. We do so under the authority granted in section 1115A(a)(5) of the Act to elect to limit testing of a model to certain geographic areas. This decision is influenced by the 2012 Report to Congress mandated under section 3006(b) of the Affordable Care Act. This Report stated that HHAs which participated in previous value-based purchasing demonstrations ``uniformly believed that all Medicare-certified HHAs should be required to participate in future VBP programs so all agencies experience the potential burdens and benefits of the program'' and some HHAs expressed concern that absent mandatory participation, ``low-performing agencies in areas with

      Page 68658

      limited competition may not choose to pursue quality improvement.'' \13\

      ---------------------------------------------------------------------------

      \13\ See the Recommendations section of the U.S. Department of Health and Human Services, ``Report to Congress: Plan to Implement a Medicare Home Health Agency Value-Based Purchasing Program'' (March 2012), p. 28.

      ---------------------------------------------------------------------------

      Section 1115A(b)(2)(A) of the Act requires that the Secretary select models to be tested where the Secretary determines that there is evidence that the model addresses a defined population for which there are deficits in care leading to poor clinical outcomes or potentially avoidable expenditures. The HHVBP Model was developed to improve care for Medicare patients receiving care from HHAs based on evidence in the March 2014 MedPAC Report to Congress citing quality and cost concerns in the home health sector. According to MedPAC, ``about 29-percent of post-hospital home health stays result in readmission, and there is tremendous variation in performance among providers within and across geographic regions.'' \14\ The same report cited limited improvement in quality based on existing measures, and noted that the data on quality ``are collected only for beneficiaries who do not have their home health care stays terminated by a hospitalization,'' skewing the results in favor of a healthier segment of the Medicare population.\15\ This model will test the use of adjustments to Medicare HH PPS rates by tying payment to quality performance with the goal of achieving the highest possible quality and efficiency.

      ---------------------------------------------------------------------------

      \14\ See full citation at note 11. MedPAC Report to Congress (March 2014) p. 215.

      \15\ MedPAC Report to Congress (March 2014) p. 226.

      ---------------------------------------------------------------------------

    2. Overview

      We proposed to include in Sec. 484.305 definitions for ``applicable percent'', ``applicable measure'', ``benchmark'', ``home health prospective payment system'', ``larger-volume cohort'', ``linear exchange function'', ``Medicare-certified home health agency'', ``New Measures'', ``payment adjustment'', ``performance period'', ``smaller-volume cohort'', ``selected states'', ``starter set'', ``Total Performance Score'', and ``value-based purchasing'' as they pertain to this subpart. Where we received comments on the proposed definitions or the substantive provisions of the model connected to the proposed definitions, we respond to comments in the relevant sections below. We are finalizing all the definitions as proposed in Sec. 484.305 except for two: we are revising ``applicable percent'' so that the final definition reflects the revised percentages of 3-percent for CY 2018, 5-percent for CY 2019, 6-percent for CY 2020, 7-percent for CY 2021, and 8-percent for CY 2022, as discussed in section G; and we are revising ``Medicare-certified home health agency'' to ``Competing home health agency'' for clarity, since all HHAs with CCNs are, by definition, Medicare-certified, and only those HHAs in selected states are competing in the model. As we proposed and are finalizing in this rule, the HHVBP Model will encompass 5 performance years, be implemented beginning January 1, 2016, and conclude on December 31, 2022.

      Payment and service delivery models are developed by CMMI in accordance with the requirements of section 1115A of the Act. During the development of new models, CMMI builds on the ideas received from internal and external stakeholders and consults with clinical and analytical experts.

      We are finalizing our proposal to implement a HHVBP Model that has an overall purpose of improving the quality and efficient delivery of home health care services to the Medicare population. The specific goals of the model are to:

      1. Incentivize HHAs to provide better quality care with greater efficiency;

      2. Study new potential quality and efficiency measures for appropriateness in the home health setting; and,

      3. Enhance current public reporting processes.

      We proposed that the HHVBP Model would adjust Medicare HHA payments over the course of the model by up to 8-percent depending on the applicable performance year and the degree of quality performance demonstrated by each competing HHA. As discussed in greater detail in section G, we are finalizing this proposal with modification. Under our final policy, the model will reduce the HH PPS final claim payment amount to an HHA for each episode in a calendar year by an amount up to the applicable percentage revised and defined in Sec. 484.305. The timeline of payment adjustments as they apply to each performance year is described in greater detail in the section D2 entitled ``Payment Adjustment Timeline.''
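
      For illustration only, the following sketch applies a bounded HHVBP payment adjustment to the HH PPS final claim payment amount for an episode, using the revised applicable percents defined in Sec. 484.305. The derivation of a competing HHA's adjustment percentage from its Total Performance Score is described later in this rule; the variable names and the sign convention here are illustrative assumptions, not the finalized claims-processing logic.

# Revised applicable percents, keyed to the calendar years listed in Sec. 484.305.
APPLICABLE_PERCENT = {2018: 0.03, 2019: 0.05, 2020: 0.06, 2021: 0.07, 2022: 0.08}

def adjusted_claim_payment(final_claim_amount, hha_adjustment, payment_year):
    """Apply an upward or downward adjustment, capped at the applicable percent for the year."""
    cap = APPLICABLE_PERCENT[payment_year]
    bounded_adjustment = max(-cap, min(cap, hha_adjustment))
    return round(final_claim_amount * (1 + bounded_adjustment), 2)

print(adjusted_claim_payment(2965.12, 0.04, 2018))   # capped at +3 percent: 3054.07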

      As we proposed, and are finalizing in this rule, the model will apply to all Medicare-certified HHAs in each of the selected states, which means that all HHAs in the selected states will be required to compete. We codify this policy at 42 CFR 484.310. Furthermore, a competing HHA will only be measured on performance for care delivered to Medicare beneficiaries within selected states (with rare exceptions given for care delivered when a reciprocal agreement exists between states). The distribution of payment adjustments will be based on quality performance, as measured by both achievement and improvement, across a set of quality measures rigorously constructed to minimize burden as much as possible and improve care. Competing HHAs that demonstrate they can deliver higher quality of care in comparison to their peers (as defined by the volume of services delivered within the selected state), or their own past performance, could have their payment for each episode of care adjusted higher than the amount that otherwise would be paid under section 1895 of the Act. Competing HHAs that do not perform as well as other competing HHAs of the same size in the same state might have their payments reduced and those competing HHAs that perform similarly to others of similar size in the same state might have no payment adjustment made. This operational concept is similar in practice to what is used in the HVBP program.

      We expect that the risk of having payments adjusted in this manner will provide an incentive among all competing HHAs delivering care within the boundaries of selected states to provide significantly better quality through improved planning, coordination, and management of care. The degree of the payment adjustment will be dependent on the level of quality achieved or improved from the baseline year, with the highest upward performance adjustments going to competing HHAs with the highest overall level of performance based on either achievement or improvement in quality. The size of a competing HHA's payment adjustment for each year under the model will be dependent upon that HHA's performance with respect to that calendar year relative to other competing HHAs of similar size in the same state and relative to its own performance during the baseline year.
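
      For illustration only, the following sketch expresses the general idea of scoring a measure on the higher of achievement (relative to a peer cohort's achievement threshold and benchmark) and improvement (relative to the HHA's own baseline-year performance), in the spirit of the HVBP-style approach referenced above. The thresholds, point scale, and formulas here are assumptions for illustration; the finalized scoring methodology is set out in the performance scoring section of this rule.

def measure_points(rate, baseline, achievement_threshold, benchmark, max_points=10.0):
    """Score one measure as the higher of achievement and improvement (illustrative only)."""
    # Achievement: no points below the achievement threshold, full points at or above the benchmark.
    if benchmark > achievement_threshold:
        achievement = (rate - achievement_threshold) / (benchmark - achievement_threshold)
    else:
        achievement = 1.0 if rate >= benchmark else 0.0
    # Improvement: scaled between the HHA's own baseline and the benchmark.
    improvement = (rate - baseline) / (benchmark - baseline) if benchmark > baseline else 0.0
    score = max(achievement, improvement)
    return round(max_points * min(1.0, max(0.0, score)), 2)

print(measure_points(rate=0.72, baseline=0.65, achievement_threshold=0.60, benchmark=0.80))  # 6.0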

      We proposed that states would be selected randomly from nine regional groupings for model participation. As discussed further in section IV.C. of this rule, we are finalizing this proposal. A competing HHA is only measured on performance for care delivered to Medicare beneficiaries within boundaries of selected states and only payments for HHA services provided to Medicare beneficiaries within boundaries of selected states will be subject to adjustment under this model unless a reciprocal agreement is in place. Requiring all Medicare-certified HHAs within the boundaries of selected

      Page 68659

      states to compete in the model ensures that: (1) There is no self-selection bias, (2) competing HHAs are representative of HHAs nationally, and (3) there is sufficient participation to generate meaningful results. We believe it is necessary to require all HHAs delivering care within boundaries of selected states to be included in the model because, in our experience, Medicare providers are generally reluctant to participate voluntarily in models in which their Medicare payments could be subject to possible reduction. This reluctance to participate in voluntary models has been shown to cause self-selection bias in statistical assessments and thus may present challenges to our ability to evaluate the model. In addition, state boundaries represent a natural demarcation in how quality is currently being assessed through Outcome and Assessment Information Set (OASIS) measures on Home Health Compare (HHC). Further, it is our intent to generate an appropriate selection of competitor types in this model as a means of yielding an optimal level of generalizability and representativeness of HHAs in the nation. Finally, having an appropriate number of competitors within the model should provide sufficient statistical power to detect the key effects we are testing in this model.

    3. Selection Methodology

      1. Identifying a Geographic Demarcation Area

      We proposed to adopt a methodology that uses state borders as boundaries for demarcating which Medicare-certified HHAs will be required to compete in the model and proposed to select nine states from nine geographically-defined groupings of five or six states. Groupings were also defined so that the successful implementation of the model would produce robust and generalizable results, as discussed later in this section. We are finalizing this approach here.

      We took into account five key factors when deciding to propose selection at the state-level for this model. First, if we required some, but not all, Medicare-certified HHAs that deliver care within the boundaries of a selected state to participate in the model, we believe the HHA market for the state could be disrupted because HHAs in the model would be competing against HHAs that are not included in the model (herein referenced `non-competing HHAs'). Second, we wanted to ensure that the distribution of payment adjustments based on performance under the model could be extrapolated to the entire country. Statistically, the larger the sample to which payment adjustments are applied, the smaller the variance of the sampling distribution and the greater the likelihood that the distribution accurately predicts what would transpire if the methodology were applied to the full population of HHAs. Third, we considered the need to align with other HHA quality program initiatives including HHC. The HHC Web site presently provides the public and HHAs a state- and national-level comparison of quality. We expect that aligning performance with the HHVBP benchmark and the achievement score will support how measures are currently being reported on HHC. Fourth, there is a need to align with CMS regulations which require that each HHA have a unique CMS Certification Number (CCN) for each state in which the HHA provides service. Fifth, we wanted to ensure sufficient sample size and the ability to meet the rigorous evaluation requirements for CMMI models. These five factors are important for the successful implementation and evaluation of this model.

      We expect that when there is a risk of a downward payment adjustment based on quality performance measures, the use of a self-contained, mandatory cohort of HHA participants will create a stronger incentive among competing HHAs to deliver greater quality. Specifically, it is possible the market would become distorted if non-model HHAs are delivering care within the same market as competing HHAs, because competition, on the whole, becomes unfair when payment is predicated on quality for one group and volume for the other group. In addition, we expect that evaluation efforts might be negatively impacted because some HHAs would be competing on quality and others on volume, within the same market.

      We proposed the use of state boundaries after careful consideration of several alternative selection approaches, including randomly selecting HHAs from all HHAs across the country, and requiring participation from smaller geographic regions including the county; the Combined Statistical Area (CSA); the Core-Based Statistical Area (CBSA); Metropolitan Statistical Area (MSA) rural provider level; and the Hospital Referral Region (HRR) level.

      A methodology using a national sample of HHAs that are randomly selected from all HHAs across the country could be designed to include enough HHAs to ensure a robust payment adjustment distribution and a sufficient sample size for the evaluation; however, this approach may present significant limitations when compared with the state boundaries selection methodology we proposed in this model. The primary concern with randomly selecting at the provider level across the nation is the market distortion created by having competing HHAs operate in the same market as non-model HHAs.

      Using smaller geographic areas than states, such as counties, CSAs, CBSAs, rural areas, and HRRs, could also present challenges for this model. These smaller geographic areas were considered as alternate selection options; however, their use could result in too small a sample size of potential competing HHAs. As a result, we expect the distribution of payment adjustments could become highly divergent among fewer HHA competitors. In addition, the evaluation of the model could become more complex, and its results may be less generalizable to the full population of Medicare-certified HHAs and the beneficiaries they serve across the nation. Further, the use of smaller geographic areas than states could increase the proportion of Medicare-certified HHAs that could fall into groupings with too few agencies to generate a stable distribution of payment adjustments. Thus, if we were to define geographic areas based on CSAs, CBSAs, counties, or HRRs, we would need to develop an approach for consolidating smaller regions into larger regions.

      Home health care is a unique type of health care service when compared to other Medicare provider types. In general, the HHA's care delivery setting is in beneficiaries' homes, as opposed to other provider types that traditionally deliver care at a brick-and-mortar institution within beneficiaries' respective communities. As a result, the HHVBP Model needs to be designed to account for the unique way that HHA care is provided in order for results to be generalizable to the population. HHAs are limited to providing care to beneficiaries in the state in which they have a CCN; however, HHAs are not restricted from providing services in a county, CSA, CBSA, or HRR in which they are not located (as long as the other county/CSA/CBSA/HRR is in the same state in which the HHA is certified). As a result, using smaller geographic areas (than state boundaries) could result in similar market distortion and evaluation confounders as selecting providers from a randomized national sampling. The reason is that HHAs in adjacent counties/CSAs/CBSAs/HRRs may not be in the model but would be directly competing for services in the same markets or geographic regions. Competing HHAs delivering care in the same market area as non-competing HHAs could generate a spillover effect in which non-model HHAs would be vying for the same beneficiaries as competing HHAs. This spillover effect presents several issues for evaluation, as the dependent variable (quality) becomes confounded by external influences created by these non-competing HHAs. These unintentional external influences on competing HHAs may be made apparent if non-competing HHAs become incentivized to generate greater volume at the expense of quality delivered to the beneficiaries they serve and at the expense of competing HHAs that are paid on quality instead of volume. Further, the ability to extrapolate these results to the full population of HHAs and the beneficiaries they serve becomes confounded by an artifact of the model design, and inferences would be limited by an inability to duplicate these results. While these concerns would decrease in magnitude as larger regions are considered, the only way to eliminate these concerns entirely is to define inclusion among Medicare-certified HHAs at the state level.

      In addition, home health quality data currently displayed on HHC allows users to compare HHA services furnished within a single state. Selecting HHAs using other geographic regions that are smaller and/or cross state lines could require the model to deviate from the established process for reporting quality. For these reasons, we stated in the proposed rule that we believe a selection methodology based on the use of Medicare-certified HHAs delivering care within state boundaries is the most appropriate for the successful implementation and evaluation of this model. In the proposed rule, we requested comments on this proposed state selection methodology as well as potential alternatives. We summarize and respond to comments received at the end of this section (section IV.C.). As we discuss below, we are finalizing the state selection model as proposed.

      2. Overview of the Randomized Selection Methodology for States

      We proposed the state selections listed in proposed Sec. 484.310 based on the described proposed randomized selection methodology. We proposed to group states by each state's geographic proximity to one another accounting for key evaluation characteristics (that is, proportionality of service utilization, proportionality of organizations with similar tax-exempt status and HHA size, and proportionality of beneficiaries that are dually-eligible for Medicare and Medicaid).

      Based on an analysis of OASIS quality data and Medicare claims data, we stated in the proposed rule that we believe the use of nine geographic groupings would account for the diversity of beneficiary demographics, rural and urban status, cost and quality variations, among other criteria. To provide for comparable and equitable selection probabilities, these separate geographic groupings each include a comparable number of states. Under our proposed methodology, groupings were based on states' geographic proximity to one another, having a comparable number of states if randomized for an equal opportunity of selection, and similarities in key characteristics that will be considered in the evaluation study because the attributes represent different types of HHAs, regulatory oversight, and types of beneficiaries served. This is necessary for the evaluation study to remain objective and unbiased and so that the results of this study best represent the entire population of Medicare-certified HHAs across the nation.

      Several of the key characteristics we used for grouping state boundaries into clusters for selection into the model are also used in the impact analysis of our annual HHA payment updates, a fact that reinforces their relevance for evaluation. The additional proposed standards for grouping (level of utilization and socioeconomic status of patients) are also important to consider when evaluating the program because of their current policy relevance. Large variations in the level of utilization of the home health benefit have received attention from policymakers concerned with achieving high-value health care and curbing fraud and abuse.\16\ Policymakers' concerns about the role of beneficiary-level characteristics as determinants of resource use and health care quality were highlighted in the Affordable Care Act, which mandated a study \17\ of access to home health care for vulnerable populations,\18\ and, more recently, in the Improving Medicare Post-acute Care Transformation (IMPACT) Act of 2014, which required the Secretary to study the relationship between individuals' socioeconomic status and resource use or quality.\19\ The parameters used to define each geographic grouping are further described in the next three sections.

      ---------------------------------------------------------------------------

      \16\ See MedPAC Report to Congress: Medicare Payment Policy (March 2014, Chapter 9) available at http://medpac.gov/documents/reports/mar14_entirereport.pdf. See also the Institute of Medicine Interim Report of the Committee on Geographic Variation in Health Care Spending and Promotion of High-Value Health Care: Preliminary Committee Observations (March 2013) available at http://iom.edu/Reports/2013/Geographic-Variation-in-Health-Care-Spending-and-Promotion-of-High-Care-Value-Interim-Report.aspx.

      \17\ This study can be accessed at http://www.cms.gov/Center/Provider-Type/Home-Health-Agency-HHA-Center.html.

      \18\ Section 3131(d) of the Affordable Care Act.

      \19\ Improving Medicare Post-acute Care Transformation (IMPACT) Act of 2014 (Public Law 113-185).

      ---------------------------------------------------------------------------

      1. Geographic Proximity

        We explained in the proposed rule that under this methodology, in order to ensure that the Medicare-certified HHAs that would be required to participate in the model are not all in one region of the country, the states in each grouping are adjacent to each other whenever possible while creating logical groupings of states based on common characteristics as described above. Specifically, analysis based on quality data and claims data found that HHAs in these neighboring states tend to hold certain characteristics in common. These include having similar patterns of utilization, proportionality of non-profit agencies, and types of beneficiaries served (for example, severity and number, type of co-morbidities, and socio-economic status). Therefore, the proposed groupings of states were delineated according to states' geographic proximity to one another and common characteristics as a means of permitting greater comparability. In addition, each of the groupings retains similar types of characteristics when compared to any other type of grouping of states.

      2. Comparable Number of States in Each Grouping

        Under the proposed randomized selection methodology, each geographic region, or grouping, has a similar number of states. As a result, all states had a 16.7-percent to 20-percent chance of being selected under our proposed methodology, and Medicare-certified HHAs had a similar likelihood of being required to compete in the model under this sampling design. We asserted in the proposed rule that this sampling design would ensure that no single entity is singled out for selection, since all states and Medicare-certified HHAs would have approximately the same chance of being selected. In addition, this sampling approach would mitigate the opportunity for HHAs to self-select into the model and thereby bias any results of the test.
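        For illustration only, the 16.7-percent and 20-percent figures above are simply the reciprocal of the grouping size (one state drawn from a grouping of six or five states, respectively). A minimal Python sketch of that arithmetic, assuming nothing beyond the grouping sizes stated above:

          # Per-state selection probability when one state is drawn from each grouping.
          for group_size in (5, 6):
              print(group_size, f"{1 / group_size:.1%}")  # 5 -> 20.0%, 6 -> 16.7%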

        Page 68661

      3. Characteristics of State Groupings

        Without sacrificing an equal opportunity for selection, we explained in the proposed rule that the proposed state groupings are intended to ensure that important characteristics of Medicare-certified HHAs that deliver care within state boundaries can be used to evaluate the primary intervention with greater generalizability and representativeness of the entire population of Medicare-certified HHAs in the nation. Data analysis of these characteristics employed the full data set of Medicare claims and OASIS quality data. Although some characteristics, such as beneficiary age and case-mix, yielded only minor variations from one state to another, other important characteristics do vary substantially and could influence how HHAs respond to the incentives of the model. Specifically, home health services utilization rates, tax-exemption status of the provider, the socioeconomic status of beneficiaries (as measured by the proportion of dually-eligible beneficiaries), and agency size (as measured by average number of episodes of care per HHA) are important characteristics that could influence outcomes of the model. Accordingly, we intend to study the impacts of these characteristics for purposes of designing future value-based purchasing models and programs. These characteristics and expected variations must be considered in the evaluation study to enable us to avoid erroneous inferences about how different types of HHAs will respond to HHVBP incentives.

        Under our proposed state selection methodology, state groupings reflect regional variations that enhance the generalizability of the model. In line with this methodology, each grouping includes states that are similar in at least one important aforementioned characteristic while being geographically located in close proximity to one another. Using the criteria described above, the following geographic groupings were identified using Medicare claims-based data from calendar years 2013-2014. Each of the 50 states was assigned to one of the following geographic groups:

        Group #1: (VT, MA, ME, CT, RI, NH)

        States in this group tend to have larger HHAs and have average utilization relative to other states.

        Group #2: (DE, NJ, MD, PA, NY)

        States in this group tend to have larger HHAs, have lower utilization, and provide care to an average number of dually-eligible beneficiaries relative to other states.

        Group #3: (AL, GA, SC, NC, VA)

        States in this group tend to have larger HHAs, have average utilization rates, and provide care to a high proportion of minorities relative to other states.

        Group #4: (TX, FL, OK, LA, MS)

        States in this group have HHAs that tend to be for-profit, have very high utilization rates, and have a higher proportion of dually-eligible beneficiaries relative to other states.

        Group #5: (WA, OR, AK, HI, WY, ID)

        States in this group tend to have smaller HHAs, have average utilization rates, and are more rural relative to other states.

        Group #6: (NM, CA, NV, UT, CO, AZ)

        States in this group tend to have smaller HHAs, have average utilization rates, and provide care to a high proportion of minorities relative to other states.

        Group #7: (ND, SD, MT, WI, MN, IA)

        States in this group tend to have smaller HHAs, have very low utilization rates, and are more rural relative to other states.

        Group #8: (OH, WV, IN, MO, NE, KS)

        States in this group tend to have HHAs that are of average size, have average utilization rates, and provide care to a higher proportion of dually-eligible beneficiaries relative to other states.

        Group #9: (IL, KY, AR, MI, TN)

        States in this group tend to have HHAs with higher utilization rates relative to other states.

      4. Randomized Selection of States

        We stated in the proposed rule that upon the careful consideration of the alternative selection methodologies discussed in that rule, including selecting states on a non-random basis, we proposed to use a selection methodology based on a randomized sampling of states within each of the nine regional groupings described above. We examined data on the evaluation elements listed in this section of the proposed rule and this final rule to determine if specific states could be identified in order to fulfill the needs of the evaluation. After careful review, we determined that each evaluation element could be measured by more than one state. As a result, we determined that it was necessary to apply a fair method of selection where each state would have a comparable opportunity of being selected and which would fulfill the need for a robust evaluation. The proposed nine groupings of states, as described in this section of the proposed rule and this final rule, permit the model to capture the essential elements of the evaluation including demographic, geographic, and market factors.

        We explained in the proposed rule that the randomized sampling of states is without bias toward any characteristics of any single state within any specific regional grouping; no states are excluded, and no state appears more than once across any of the groupings. The randomized selection of states was completed using a scientifically-accepted computer algorithm designed for randomized sampling. The selection was run on each of the previously described regional groupings using exactly the same process and, therefore, reflects a commonly accepted method of randomized sampling. This computer algorithm employs the aforementioned sampling parameters necessary to define randomized sampling and omits any human interaction once it runs.

        Based on this sampling methodology, SAS Enterprise Guide (SAS EG) 5.1 software was used to run a computer algorithm designed to randomly select states from each grouping. SAS EG 5.1 represents an industry standard for generating advanced analytics and provided a rigorous, standardized tool with which to satisfy the requirements of randomized selection. The key SAS commands employed include a ``PROC SURVEYSELECT'' statement coupled with the ``METHOD=SRS'' option, which specifies simple random sampling as the sample selection method. The random number seed was generated using the time of day from the computer's clock. Note that no stratification was used within any of the nine geographically-diverse groupings, ensuring an equal probability of selection within each grouping. For more information on this procedure and the underlying statistical methodology, please reference SAS support documentation at: http://support.sas.com/documentation/cdl/en/statug/63033/HTML/default/viewer.htm#statug_surveyselect_sect003.htm/.
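        As an illustration only (the selection itself was performed in SAS with PROC SURVEYSELECT as described above, not with the code below), the following Python sketch mirrors the described logic: one state is drawn by simple random sampling from each of the nine groupings listed earlier, with the seed taken from the computer's clock when none is supplied. The GROUPINGS dictionary and the select_states function are hypothetical names used only for this sketch.

          # Illustrative sketch only; not the procedure CMS ran.
          import random
          import time

          GROUPINGS = {
              1: ["VT", "MA", "ME", "CT", "RI", "NH"],
              2: ["DE", "NJ", "MD", "PA", "NY"],
              3: ["AL", "GA", "SC", "NC", "VA"],
              4: ["TX", "FL", "OK", "LA", "MS"],
              5: ["WA", "OR", "AK", "HI", "WY", "ID"],
              6: ["NM", "CA", "NV", "UT", "CO", "AZ"],
              7: ["ND", "SD", "MT", "WI", "MN", "IA"],
              8: ["OH", "WV", "IN", "MO", "NE", "KS"],
              9: ["IL", "KY", "AR", "MI", "TN"],
          }

          def select_states(groupings, seed=None):
              """Draw one state per grouping by simple random sampling."""
              # Seed from the time of day when no seed is supplied, echoing the
              # clock-based seed described in the preamble.
              rng = random.Random(seed if seed is not None else time.time())
              return {group: rng.choice(states) for group, states in groupings.items()}

          for group, state in sorted(select_states(GROUPINGS).items()):
              print(f"Group #{group}: {state}")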

        Based on consideration of the comments received and for the reasons discussed, we believe this state selection methodology provides the strongest evidence of producing meaningful results representative of the national population of competing Medicare-certified HHAs and, in turn, meets the evaluation requirements of section 1115A(b)(4) of the Act.

        In Sec. 484.310, we proposed to codify the names of the states selected utilizing this proposed methodology, where one state from each of the nine groupings was selected. For each of these groupings, we proposed to use state borders to demarcate which Medicare-certified HHAs would be required to compete in this model: Massachusetts was randomly selected from Group 1, Maryland was randomly selected from Group 2, North Carolina was randomly selected from Group 3, Florida was randomly selected from Group 4, Washington was randomly selected from Group 5, Arizona was randomly selected from Group 6, Iowa was randomly selected from Group 7, Nebraska was randomly selected from Group 8, and Tennessee was randomly selected from Group 9. Thus, we explained in the proposed rule that if our methodology is finalized as proposed, all Medicare-certified HHAs that provide services in Massachusetts, Maryland, North Carolina, Florida, Washington, Arizona, Iowa, Nebraska, and Tennessee will be required to compete in this model. We invited comments on this proposed randomized selection methodology.

        We summarize and respond to these comments at the end of this section. As discussed, we are finalizing the state selection methodology as proposed without modification, as well as finalizing the states that were selected utilizing this methodology as codified in Sec. 484.310.

      5. Use of CMS Certification Numbers (CCNs)

        We proposed that Total Performance Scores (TPS) and payment adjustments would be calculated based on an HHA's CCN \20\ and, therefore, based only on services provided in the selected states. The exception to this methodology is where an HHA provides services in a state that has a reciprocal agreement with another state. Services provided by the HHA to beneficiaries who reside in the other state would be included in the TPS and subject to payment adjustments.\21\ A reciprocal agreement between states allows an HHA to provide services to a beneficiary across state lines using its original CCN. Reciprocal agreements are rare; based on the most recent Medicare claims data from 2014, less than 0.1 percent of beneficiaries receiving services were served by CCNs with reciprocal agreements across state lines. Due to the very low number of beneficiaries served across state borders as a result of these agreements, we stated in the proposed rule that we expect there to be an inconsequential impact from including these beneficiaries in the model.

        ---------------------------------------------------------------------------

        \20\ HHAs are required to report OASIS data and any other quality measures by their own unique CMS Certification Number (CCN), as defined under title 42, chapter IV, subchapter G, part 484, Sec. 484.20. Available at http://www.ecfr.gov/cgi-bin/text-idx?tpl=/ecfrbrowse/Title42/42cfr484_main_02.tpl.

        \21\ See Chapter 2 of the State Operations Manual (SOM), Section 2184--Operation of HHAs Cross State Lines, stating ``When an HHA provides services across State lines, it must be certified by the State in which its CCN is based, and its personnel must be qualified in all States in which they provide services. The appropriate SA completes the certification activities. The involved States must have a written reciprocal agreement permitting the HHA to provide services in this manner.''

        ---------------------------------------------------------------------------

        We received the following comments on the proposed selection methodology. As discussed, we are finalizing the selection methodology as proposed.

        Comment: A few commenters expressed concern that participating HHAs will receive payment adjustment incentives based on quality of care, while non-participating HHAs in the same geographic area might be incentivized to generate greater volume at the expense of quality. Some commenters recommended expanding the model to allow more states to participate in each succeeding year of the model to prevent non-participating states from falling behind, and some commenters also recommended CMS shorten the duration of the model to three (3) years to expedite the implementation of VBP nationally.

        Response: Competing HHAs within the selected states will not be compared with non-competing HHAs within the same geographic area. HHAs will not compete across state borders, other than those HHAs that may provide services in a state that has a reciprocal agreement with another state. Specifically, the model is designed to have HHAs compete only within their state and within their size cohort, as discussed further in section F. Competing HHAs will not compete for payment adjustment incentives outside of their state or size cohort. The decision to utilize states to select HHAs for inclusion in the model was based on a range of factors related to implementation and evaluation and weighed against other selection alternatives. Specifically, we considered how the competing HHA's CCN is operationalized at the state level and how evaluation will determine whether the payment adjustment incentive has an effect on quality within each competing HHA's state and size cohort. In response to comments suggesting that non-competing HHAs in non-selected states might `fall behind,' we again reference the design of the payment methodology, which precludes non-competitors from competing outside of selected states and size cohorts. The purpose of this model is to test the effect of high incentives on quality. Performance measurement is based on a linear exchange function that only includes competing HHAs. If the model yields early positive results within these states and competing cohorts, expansion may be considered if the requirements of the statute are met. Section 1115A(c) of the Act authorizes the Secretary to expand the scope and duration of a model being tested through rulemaking, including implementation on a nationwide basis. In addition, we do not expect that HHAs in non-selected states would fall significantly behind in improving quality because of their interest in attracting beneficiaries and improving performance on quality metrics in other programs, such as the HHQRP. Further, we believe testing the model over 5 years will provide more data with which to evaluate the effects of high incentives with greater certainty.

        Comment: Several commenters expressed concern regarding how HHAs are selected to participate in the HHVBP Model. Commenters expressed concerns centered on leaving behind innovative HHAs in non-participating states. Many commenters recommended including voluntary participation by interested innovative HHAs in non-participating states and carefully documenting characteristics of selected agencies. Commenters also stated that mandatory participation may potentially put agencies with fewer resources in selected states at risk for closure.

        Response: We appreciate the comments and input on the state selection methodology. The selection methodology was based on lessons learned, industry stakeholder perspectives, and an analysis of Medicare data. For the reasons discussed above, we believe that application of this methodology will result in participation by HHAs that represent an accurate reflection of the entire population of Medicare-certified HHAs, both in terms of size and in terms of quality. In general, providers do not voluntarily participate in alternate payment models when payments are at risk of being lowered. This reluctance to participate in voluntary models has been shown to cause self-selection bias in statistical assessments and thus, we believe that allowing voluntary participation by interested HHAs in non-participating states could present challenges to our ability to evaluate the model. In reference to concerns that some HHAs with fewer resources may be at greater risk for closure, CMS will continue to monitor for direct associations between HHAs that exhibit poor performance and the effect of the payment adjustment incentive.

        Comment: Commenters questioned the fairness of being required to participate in both the proposed HHVBP Model and the proposed Comprehensive Care for Joint Replacement (CJR) Model.

        Response: HHAs located in the MSAs included in the proposed CJR Model will not be excluded from the HHVBP Model. HHAs are not participants in the proposed CJR Model; as proposed, hospitals are the participants. Home health payments for beneficiaries participating in the proposed CJR Model are not subject to alteration under that model. As proposed, only the hospital payments are at risk. HHAs will continue to be paid for the services they provide to, and bill for, Medicare beneficiaries that are participating in the proposed CJR Model.

        Comment: Some commenters expressed concern that state selection will not sufficiently represent the Medicare population at large and will impact a disproportionate portion of the Medicare population. Another commenter recommended CMS consider a hardship exemption for HHAs with a high percentage of Medicaid services or that serve a high percentage of dual-eligible patients. Commenters also expressed concern on various topics around state selection, including lack of complex urban areas and corresponding utilization patterns; peer cohorts based simply on size and state; and consideration of profit or non-profit status, hospital-based or free-standing HHAs, and rural and urban status, all related to either under-representation or potential bias in the selected competing HHAs, or over-representation of certain sub-populations of Medicare beneficiaries included in the model. One commenter also recommended excluding states with populations under a certain threshold, such as 2.5 million, to ensure a large population and make the model more robust.

        Response: We have taken into consideration the level of utilization and socioeconomic status of patients in grouping states for random selection, and will evaluate the model with sensitivity to these differences. The alternative methodologies proposed by stakeholders did not fulfill the requirements to be generalizable and representative of the entire population of Medicare-certified HHAs in the nation. Our mechanisms, including tracking quality improvement through performance measures and conducting comparative analysis based on variations in HHA size, geographic location, organizational structure, and other HHA demographic information, will be utilized for evaluating the model. We have conducted extensive analysis on the population of HHAs included in the model and are confident we will be able to effectively extrapolate model results to the general population. In part, this analysis is referenced in Table 24 and finds an association between a higher proportion of dually-eligible beneficiaries served and better performance. The performance and subsequent payment distributions are consistent with respect to the four described categories (that is, dual eligibility, level of acuity, percent rural, and organization type). In addition, CMS conducted a statistical analysis of the sample size of HHAs provided by the nine selected states and determined it was sufficient to effectively detect the model's impact.

        Comment: One commenter stated that Maryland should not be included in the selected states for HHVBP because Maryland is already participating in the Maryland All Payer Model. Another commenter suggested that Florida not be included in both HHVBP and ACO bundling models because it is difficult for HHAs to track compliance with all relevant policy and regulatory requirements.

        Response: We understand the variances in state demographics, state regulatory structures, and the interplay with other federal initiatives, and intend to evaluate how the HHVBP Model performs in the selected states, including interactions with existing policies, models, and programs operating in the specific states selected. For example, the Maryland All-Payer Model does not directly intersect with HHVBP because it is a hospital-based model, so we do not believe this is a compelling reason to exclude this state. In addition, concerns that Florida Medicare-certified HHAs would also be included in ACO models are not a compelling reason to exclude this state because other states have HHAs participating within ACO models. We do, however, recognize the need to evaluate the impact of the model in the context of the various policies and programs operating in those states where participating HHAs serve patients. As discussed, after consideration of the public comments received, we are finalizing our proposal to include the nine selected states as stated in Section 2. In comparison to other alternatives for selection, we believe the proposed randomized state-selection method provides an equitable process of selection and a sufficient number of HHAs to provide the statistical power to detect the effects of the payment adjustment incentive, as well as non-financial incentives, on quality. The nine selected states finalized here will participate for the full duration of the model.

        Comment: One commenter suggested that selected states be more homogenous in having no prior experience in VBP and that CMS exclude any states that participated in the 2008-2010 Home Health Pay for Performance (HHP4P) demonstration.

        Response: We understand concerns about previous program and model participation in that some competitors may be more prepared for VBP in comparison to others. While we are not convinced that we can attribute the level of preparedness for VBP to the HHA's experience with the HHP4P Demonstration or any other VBP initiative, we intentionally developed a methodology for randomized selection of states to prevent bias toward any characteristics of any single state within any specific grouping. As a result of this randomness of selection, the design permits an equitable opportunity for selection and provides a greater capacity to generalize results to the entire population of Medicare-certified HHAs in the U.S.

        Final Decision: For the reasons stated and in consideration of the comments received, we are finalizing the state selection methodology as proposed, including the nine states selected under this methodology as codified at Sec. 484.310. All Medicare-certified HHAs that provide services in Massachusetts, Maryland, North Carolina, Florida, Washington, Arizona, Iowa, Nebraska, and Tennessee will be required to compete in the HHVBP Model.

    4. Performance Assessment and Payment Periods

      1. Performance Reports

      We proposed to use quarterly performance reports, annual payment adjustment reports, and annual publicly-available performance reports as a means of developing greater transparency of Medicare data on quality and aligning the competitive forces within the market to deliver care based on value over volume, and are finalizing this reporting structure here. The publicly-available reports will inform home health industry stakeholders (consumers, physicians, hospitals) as well as all competing HHAs delivering care to Medicare beneficiaries within selected state boundaries on their level of quality relative to both their peers and their own past performance.

      We proposed that competing HHAs would be scored for the quality of care delivered under the model based on their performance on measures compared to both the performance of their peers, defined by the same size cohort (either smaller- or larger-volume cohorts as defined in Sec. 484.305), and their own past performance on the measures. We also proposed in Sec. 484.305 to define larger-volume cohort to mean the group of competing HHAs within the boundaries of a selected state that are participating in Home Health Care Consumer Assessment of Healthcare Providers and Systems (HHCAHPS) in accordance with Sec. 484.250 and to define smaller-volume cohort to mean the group of HHAs within the boundaries of a selected state that are exempt from participation in HHCAHPS in accordance with Sec. 484.250. We also proposed that where there are too few HHAs in the smaller-volume cohort in a state to compete in a fair manner (that is, when there are only one or two HHAs competing within the smaller-volume cohort in a given state), these HHAs would be included in the larger-volume cohort for purposes of calculating the total performance score and payment adjustment, without being measured on HHCAHPS. We requested comments on this proposed methodology.
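      For illustration only, the cohort rules described above can be summarized in a short sketch. This is not CMS's implementation; the record fields (ccn, state, hhcahps_exempt) are hypothetical names used for the example, which simply reflects that HHCAHPS-exempt HHAs form the smaller-volume cohort and that a smaller-volume cohort of only one or two HHAs is folded into the larger-volume cohort for scoring purposes.

        # Hypothetical sketch of the cohort assignment described above.
        from collections import defaultdict

        def assign_cohorts(hhas):
            """Split HHAs in each state into larger- and smaller-volume cohorts."""
            by_state = defaultdict(lambda: {"larger": [], "smaller": []})
            for hha in hhas:
                cohort = "smaller" if hha["hhcahps_exempt"] else "larger"
                by_state[hha["state"]][cohort].append(hha["ccn"])
            for state, cohorts in by_state.items():
                if 0 < len(cohorts["smaller"]) <= 2:
                    # Too few smaller-volume HHAs to compete fairly: include them
                    # in the larger-volume cohort (not measured on HHCAHPS).
                    cohorts["larger"].extend(cohorts["smaller"])
                    cohorts["smaller"] = []
            return dict(by_state)

        example = [
            {"ccn": "007001", "state": "IA", "hhcahps_exempt": False},
            {"ccn": "007002", "state": "IA", "hhcahps_exempt": True},
            {"ccn": "007003", "state": "IA", "hhcahps_exempt": False},
        ]
        print(assign_cohorts(example))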

      Comment: A few commenters mentioned the cohort methodology in their submissions. One commenter offered support to CMS's decision to measure each HHA against a comparable cohort by size of agency and agreed that large HHAs with multiple locations have a scale that smaller agencies do not, rendering outcomes difficult to measure by comparison. Conversely, other commenters did not support CMS's proposal to base performance payments on relative performance within HHA peer cohorts, with one commenter recommending payments should be based solely on comparisons to prior year performance and another suggesting using national data for all HHAs, taking into account socio-demographic factors.

      Response: Analysis of existing HHA data (see 80 FR 39910, Table 26--HHA Cohort Payment Adjustment Distributions by State) indicates dividing HHAs into large and small cohorts results in a higher likelihood of fair and accurate performance comparisons and the subsequent payment adjustments. We intend to closely evaluate model outcomes across a range of demographic factors within the small and large cohorts, and may modify the model if warranted in subsequent years.

      Final Decision: After considering the comments received, we are finalizing the large and small cohort structure as proposed.

      We proposed that quality performance scores and relative peer rankings would be determined through the use of a baseline year (calendar year 2015) and subsequent performance periods for each competing HHA. Further, these reports will provide competing HHAs with an opportunity to track their quality performance relative to their peers and their own past performance. Using these reports provides a convenient and timely means for competing HHAs to assess and track their own respective performance as capacity is developed to improve or sustain quality over time.

      Beginning with the data collected during the first quarter of CY 2016 (that is, data for the period January 1, 2016 to March 31, 2016), and for every quarter of the model thereafter, we proposed to provide each Medicare-certified HHA with a quarterly report that contains information on their performance during the quarter. We stated that we expect to make the first quarterly report available in July 2016, and make performance reports for subsequent quarters available in October, January and April. The final quarterly report would be made available in April 2021. We proposed that the quarterly reports would include a competing HHA's model-specific performance results with a comparison to other competing HHAs within its cohort (larger- or smaller-volume) within the state boundary. These model-specific performance results will complement all quality data sources already being provided through the QIES system and any other quality tracking system possibly being employed by HHAs. We note that all performance measures that competing HHAs will report through the QIES system are also already made available in the CASPER Reporting application. The primary difference between the two reports (CASPER reports and the model-specific performance report) is that the model-specific performance report we proposed consolidates the applicable performance measures used in the HHVBP Model and provides a peer-ranking to other competing HHAs within the same state and size-cohort. In addition, CASPER reports will provide quality data earlier than model-specific performance reports because CASPER reports are not limited by a quarterly run-out of data and a calculation of competing peer-rankings. For more information on the accessibility and functionality of the CASPER system, please reference the CASPER Provider Reporting Guide.\22\

      ---------------------------------------------------------------------------

      \22\ The CASPER Reporting Guide is available at http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HomeHealthQualityInits/downloads/HHQICASPER.pdf.

      ---------------------------------------------------------------------------
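      For illustration only, the quarterly cadence described above (first-quarter CY 2016 data reported in July 2016, with subsequent reports in October, January, and April, and a final report in April 2021) amounts to releasing each report in the fourth month after the close of the quarter being reported. The helper below is a hypothetical sketch that simply reproduces that pattern and is not a CMS tool.

        # Hypothetical sketch: quarterly report released in the fourth month
        # after the quarter ends (Q1 ends March -> July report, ...,
        # Q4 2020 ends December -> final report April 2021).
        from datetime import date

        def report_release(quarter_end: date) -> date:
            month = quarter_end.month + 4
            year = quarter_end.year + (month - 1) // 12
            return date(year, (month - 1) % 12 + 1, 1)

        for q in (date(2016, 3, 31), date(2016, 6, 30), date(2016, 9, 30), date(2020, 12, 31)):
            print(q, "->", report_release(q).strftime("%B %Y"))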

      We proposed that the model-specific quarterly performance report will be made available to each HHA through a dedicated CMMI model-specific platform for data dissemination and include each HHA's relative ranking amongst its peers along with measurement scores and overall performance rankings.

      We also proposed that a separate payment adjustment report would be provided once a year to each of the competing HHAs. This annual report will focus primarily on the payment adjustment percentage and include an explanation of when the adjustment will be applied and how this adjustment was determined relative to performance scores. Each competing HHA will receive its own annual payment adjustment report viewable only to that HHA.

      We also proposed a separate, annual, publicly available quality report that would provide home health industry stakeholders, including providers and suppliers that refer their patients to HHAs, with an opportunity to confirm that the beneficiaries they are referring for home health services are being provided the best possible quality of care available.

      We invited comments on the proposed reporting framework.

      Comment: Some commenters expressed support for the proposed HHVBP reporting framework of quarterly/annual reports and public reporting. Specifically, one commenter supported CMS in its efforts to provide agencies with performance reports and notices of payment change prior to the imposition of any payment penalty. One commenter suggested that CMS employ a continuous improvement cycle with industry stakeholders to maximize the value of the annual publicly available quality reports so that information does not mislead beneficiaries. Another commenter supported the proposed timeliness with which quarterly reports would be made available to HHAs after agency data submission, but expressed doubts about CMS's ability to comply with its own proposed timeline for releasing quarterly reports. Conversely, a few commenters suggested that challenges related to providing updated quarterly reports on performance should be considered more fully before implementation. Some commenters also suggested that CMS should include in future rulemaking how quarterly reconciliation will be implemented. Another commenter posited that current reporting timeframes, even if complied with, do not give small and rural HHAs enough lead time to improve quality.

      Response: We thank the commenters for their overall support for the inclusion of performance reports for all competing HHAs and industry stakeholders. In reference to concerns with the timelines for delivery of reports, we intend to meet all performance report timeline expectations. However, in this final rule, we are revising the timelines for notification and preview of the annual payment adjustment to remove the references to specific days of the month set forth in the proposed rule. This will allow for greater flexibility for the industry and CMS to meet these expectations and to account for the possibility of a specific day falling on a weekend or holiday. Through technical assistance efforts, we will continuously work with all competing HHAs and stakeholders on how these reports are interpreted and reconciled and how they may be used to support transformational efforts to deliver care within the HHVBP system of incentives.

      Comment: Some commenters offered their general support of the HHVBP public reporting of performance data because it will inform industry stakeholders of quality improvements, and noted several areas of value in performance data. Specifically, commenters suggested public reports would permit providers to steer patients to high-performing HHAs based on quality reports. Commenters offered that, to the extent possible, accurate comparable data will provide HHAs the ability to improve care delivery and patient outcomes, while better predicting and managing quality performance and payment updates. These same commenters urged CMS to consider the HHA information technology infrastructure needed to support complex performance tracking connected with a VBP program. Overall, commenters generally encouraged the transparency of data pertaining to the HHVBP Model.

      Response: As part of the HHVBP Model, we will provide technical assistance and other tools for HHAs in selected states to encourage best practices when making changes to improve quality. We anticipate that the HHVBP learning network will be an integral part of data monitoring and performance related discussion and feedback. As indicated in the proposed rule (see 80 FR 39873) we also intend to make public competing HHAs' Total Performance Scores with the intention of encouraging providers and other stakeholders to utilize quality ranking when selecting an HHA.

      Final Decision: For the reasons discussed and in consideration of the comments received, we are finalizing the reporting framework for the HHVBP Model as proposed without modification.

      2. Payment Adjustment Timeline

      We proposed to codify in Sec. 484.325 that competing HHAs will be subject to upward or downward payment adjustments based on the agency's Total Performance Score. We proposed that the model would consist of 5 performance years, where each performance year would link performance to the opportunity and risk for payment adjustment up to an applicable percent as defined in proposed Sec. 484.305. The first performance year would transpire from January 1, 2016 through December 31, 2016, and subsequently, all other performance years would be assessed on an annual basis through 2020 unless modified through rulemaking. We proposed that the first payment adjustment would begin January 1, 2018, and would be applied to that calendar year based on 2016 performance data. Subsequently, all other payment adjustments would be made on an annual basis through the conclusion of the model. We proposed that payment adjustments would be increased incrementally over the course of the model with a maximum payment adjustment of 5-percent (upward or downward) in 2018 and 2019, a maximum payment adjustment of 6-percent (upward or downward) in 2020, and a maximum payment adjustment of 8-percent (upward or downward) in 2021 and 2022. We proposed to implement this model over a total of seven (7) years beginning on January 1, 2016, and ending on December 31, 2022.

      After consideration of comments received, we are modifying the final payment adjustment percentages as discussed in Section G and finalized in Sec. 484.305.

      We proposed that the baseline year would run from January 1, 2015 through December 31, 2015 and provide a basis from which each respective HHA's performance will be measured in each of the performance years. Data related to performance on quality measures will continue to be provided from the baseline year through the model's tenure using a dedicated HHVBP web-based platform specifically designed to disseminate data in this model (this ``portal'' will present and archive the previously described quarterly and annual quality reports). Further, HHAs will provide performance data on the three new quality measures discussed in section E5 through this platform as well. Any additional measures added through the model's tenure and proposed through future rulemaking, will use data from the previous calendar year as the baseline.

      We proposed that new market entries (specifically, new competing HHAs delivering care in the boundaries of selected states) would also be measured from their first full calendar year of services in the state, which would be treated as baseline data for subsequent performance years under this model. The delivery of services would be measured by the number of episodes of care for Medicare beneficiaries and used to determine whether an HHA falls into the smaller- or larger-volume cohort. Furthermore, these new market entries would be competing under the HHVBP Model in the first full calendar year following the full calendar year baseline period.

      We proposed that HHAs would be notified in advance of their first performance level and payment adjustment being finalized, based on the 2016 performance period (January 1, 2016 to December 31, 2016), with their first payment adjustment to be applied January 1, 2018 through December 31, 2018. We proposed that each competing HHA would be notified of this first pending payment adjustment on August 1, 2017 and that a preview period would run for 10 days through August 11, 2017. This preview period would provide each competing HHA an opportunity to reconcile any performance assessment issues relating to the calculation of scores prior to the payment adjustment taking effect, in accordance with the process in Section H--Preview and Period to Request Recalculation. Once the preview period ends, any changes would be reconciled and a report finalized no later than November 1, 2017 (or 60 days prior to the payment adjustment taking effect). As discussed further in section H, we are finalizing this proposal with modification, to allow for a longer preview period of quarterly performance reports and annual payment adjustment reports for all competing HHAs. Specifically, we are extending the preview period such that each HHA will be notified of the first pending payment adjustment in August 2017, followed by a 30-day preview period.

      We proposed that subsequent payment adjustments would be calculated based on the applicable full calendar year of performance data from the quarterly reports, with competing HHAs notified and payments adjusted, respectively, every year thereafter. As a sequential example, the second payment adjustment will occur January 1, 2019 based on a full 12 months of the CY 2017 performance period. Notification of the second adjustment will occur in August of 2018, followed by a 30-day preview period (under our modifications to the proposed notification and preview timeline, as discussed previously) and followed by reconciliation prior to November 1, 2018. Subsequent payment adjustments will continue to follow a similar timeline and process.

      Beginning in CY 2019, we may consider revising this payment adjustment schedule and updating the payment adjustment more frequently than once each year if it is determined that a more timely application of the adjustment as it relates to performance improvement efforts that have transpired over the course of a calendar year would generate increased improvement in quality measures. Specifically, we would expect that having payment adjustments transpire closer together through more frequent performance periods would accelerate improvement in quality measures because HHAs would be able to justify earlier investments in quality efforts and be incentivized for improvements. In effect, this concept may be operationalized to create a smoothing effect where payment adjustments are based on overlapping 12-month performance periods that occur every 6 months rather than annually. As an example, the normal 12-month performance period occurring from January 1, 2020 to December 31, 2020 might have an overlapping 12-month performance period occurring from July 1, 2020 to June 30, 2021. Following the regularly scheduled January 1, 2022 payment adjustments, the next adjustments could be applied to payments beginning on July 1, 2022 through December 31, 2022. Depending on whether and when more frequent payment adjustments would be applied, performance would be calculated based on the applicable 12 months of performance data, HHAs notified, and payments adjusted, respectively, every six months thereafter, until the conclusion of the model. As a result, separate performance periods would have a 6-month overlap through the conclusion of the model. HHAs would be notified through rulemaking and be given the opportunity to comment on any proposed changes to the frequency of payment adjustments.
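      Purely as an illustration of the overlapping-period concept described above (which would be adopted, if at all, only through future rulemaking), the hypothetical sketch below generates 12-month performance windows beginning every 6 months, matching the example of January 1, 2020 through December 31, 2020 overlapping with July 1, 2020 through June 30, 2021.

        # Hypothetical sketch of overlapping 12-month performance periods
        # that begin every 6 months; the finalized model uses annual periods.
        from datetime import date, timedelta

        def overlapping_periods(first_start: date, count: int):
            """Yield (start, end) pairs for 12-month windows starting every 6 months."""
            start = first_start
            for _ in range(count):
                end = date(start.year + 1, start.month, start.day) - timedelta(days=1)
                yield start, end
                month = start.month + 6
                start = date(start.year + (month - 1) // 12, (month - 1) % 12 + 1, start.day)

        for s, e in overlapping_periods(date(2020, 1, 1), 3):
            print(s, "to", e)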

      We received the following comments on this proposed payment adjustment schedule.

      Comment: Many commenters recommended a delay in the payment adjustment schedule. One commenter recommended that CMS collect and report quality data for 2016 as an educational exercise only, and use 2017 data as the basis to adjust payment rates beginning in October 2018. This same commenter also recommended CMS delay the first year of rate adjustments by nine months to October 1, 2018. Another commenter supported the importance of HHAs in the VBP program not experiencing payment adjustments until two years after the performance year in an effort to minimize the programmatic impact and allow agencies the ability to plan ahead. Several commenters suggested a one year delay in implementing the model, citing the timeline as too aggressive. A few commenters posited that it is difficult for HHAs in the HHVBP Model to begin preparing for the model now without a final rule to guide them, and noted concern that the final rule will publish so close to the beginning of the model. Some commenters specifically supported payment adjustment on an annual basis, positing adjustments made more frequently than once each year may jeopardize the financial viability of smaller volume providers, causing further disruption, as multiple adjustments throughout a fiscal year would be difficult to manage. Further, due to the delay in data collection and reporting used in these programs, significant change in performance in shorter increments would be unlikely, as quality improvement initiatives take time to fully implement and for results to be realized. Another commenter offered that any move to increase the payment adjustment to every 6 months would not offer HHAs sufficient time to improve clinician practice patterns and evaluate the effectiveness of the changes made.

      Response: We are finalizing the proposed payment adjustment timeline for model implementation on an annual basis. Any changes to the frequency of payment adjustments under the model would be implemented through future rulemaking. In response to concerns with having the first performance year tied to an annual payment adjustment in 2018, we expect that competing HHAs will begin transforming delivery patterns as soon as this model is implemented. Delaying the payment adjustment, which is the primary intervention in this model, limits the ability to understand the intervention's associated effect on quality. We expect that the model-specific technical assistance, which will be made available to all competing HHAs, will provide the appropriate information and tools needed to transform how care is delivered within the HHVBP Model.

      Comment: Several commenters expressed concern about the time lag between the performance year and the year in which payment adjustments would be applied and strongly recommended less time lapse between performance measurement and payment adjustment. One commenter recommended CMS revise the HHVBP Model so that rewards and penalties are imposed within 6 months of the end of the measurement period, rather than a full year later, and consider imposing the rewards and penalties for 6 months at a time, allowing the rates to return to normal for the first 6 months of the subsequent year. Another commenter offered that this expedited timeframe would allow agencies working towards improvement to have the resources available to do so more immediately.

      Response: We agree that there may be merit in closing the gap between performance measurement and payment adjustments in order to more effectively connect improvements in quality care with financial incentives. We will closely evaluate the efficacy of the model, and may consider whether shorter performance assessment cycles (and by extension, shorter payment adjustment cycles) are warranted. Any such changes will be implemented through future rulemaking.

      Final Decision: For the reasons discussed, we are finalizing the payment adjustment timeline as proposed with modification. Specifically, we are finalizing that payment adjustments will be increased incrementally over the course of the model with a maximum payment adjustment of 3-percent (upward or downward) in 2018, a maximum payment adjustment of 5-percent (upward or downward) in 2019, a maximum payment adjustment of 6-percent (upward or downward) in 2020, a maximum payment adjustment of 7-percent (upward or downward) in 2021, and a maximum payment adjustment of 8-percent (upward or downward) in 2022. We are also modifying the timeline for notification and preview of the pending payment adjustment to allow for greater flexibility and to account for the possibility of a specific day falling on a weekend or holiday, and also to provide a longer preview period for HHAs. Specifically, we are extending the preview period such that each HHA will be notified of each pending payment adjustment in August of the year prior to the payment adjustment being applied and the preview period will run for 30 days of that year. We also removed specific days of the month previously referenced in the proposed rule to allow for greater flexibility.
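      The finalized maximum adjustment percentages amount to a simple cap on whatever adjustment the scoring methodology would otherwise produce. The sketch below is illustrative only; the function name and inputs are assumptions and are not part of the regulation text.

```python
# Finalized maximum payment adjustments (upward or downward) by calendar year,
# expressed as fractions; the clamping function and its inputs are illustrative
# assumptions, not regulatory text.
MAX_ADJUSTMENT_BY_YEAR = {2018: 0.03, 2019: 0.05, 2020: 0.06, 2021: 0.07, 2022: 0.08}


def capped_adjustment(raw_adjustment: float, payment_year: int) -> float:
    """Clamp a computed adjustment (e.g., +0.045 for +4.5 percent) to the year's maximum."""
    cap = MAX_ADJUSTMENT_BY_YEAR[payment_year]
    return max(-cap, min(cap, raw_adjustment))


# A hypothetical +4.5 percent computed adjustment is capped at +3 percent in 2018
# but applied in full in 2019.
assert capped_adjustment(0.045, 2018) == 0.03
assert capped_adjustment(0.045, 2019) == 0.045
```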

    5. Quality Measures

      1. Objectives

      We proposed that initially, the measures for the HHVBP Model would be predominantly drawn from the current OASIS,\23\ which is familiar to the home health industry and readily available for utilization by the model. In addition, the HHVBP Model provides us with an opportunity to examine a broad array of quality measures that address critical gaps in care. A recent comprehensive review of the VBP experience over the past decade, sponsored by the Office of the Assistant Secretary for Planning and Evaluation (ASPE), identified several near- and long-term objectives for HHVBP measures.\24\ The recommended objectives emphasize measuring patient outcomes and functional status; appropriateness of care; and incentives for providers to build infrastructure to facilitate measurement within the quality framework.\25\ The following seven objectives derived from this study served as guiding principles for the selection of the proposed measures for the HHVBP Model:

      ---------------------------------------------------------------------------

      \23\ For detailed information on OASIS see the official CMS OASIS Web resource available at http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/OASIS/index.html?redirect=/oasis. See also industry resource available at http://www.oasisanswers.com/index.htm, specifically updated OASIS component information available at www.oasisanswers.com/LiteratureRetrieve.aspx?ID=215074).

      \24\ U.S. Department of Health and Human Services. Office of the Assistant Secretary for Planning and Evaluation (ASPE) (2014) Measuring Success in Health Care Value-Based Purchasing Programs. Cheryl L. Damberg et al. on behalf of RAND Health.

      \25\ Id.

      ---------------------------------------------------------------------------

      1. Use a broad measure set that captures the complexity of the HHA service provided;

      2. Incorporate the flexibility to include Improving Medicare Post-Acute Care Transformation (IMPACT) Act of 2014 measures that are cross-cutting amongst post-acute care settings;

      3. Develop second-generation measures of patient outcomes, health and functional status, shared decision making, and patient activation;

      4. Include a balance of process, outcome, and patient experience measures;

      5. Advance the ability to measure cost and value;

      6. Add measures for appropriateness or overuse; and,

      7. Promote infrastructure investments.

      2. Methodology for Selection of Quality Measures

      1. Direct Alignment With National Quality Strategy Priorities

        A central driver of the proposed measure selection process was incorporating innovative thinking from the field while simultaneously drawing on the most current evidence-based literature and documented best practices. Broadly, we proposed measures that have a high impact on care delivery and support the combined priorities of HHS and CMS to improve health outcomes, quality, safety, efficiency, and experience of care for patients. To frame the selection process, we utilized the domains described in the CMS Quality Strategy that map to the six National Quality Strategy (NQS) priority areas (see Figure 3 for CMS domains).\26\

        ---------------------------------------------------------------------------

        \26\ The CMS Quality Strategy is discussed in broad terms at URL http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/QualityInitiativesGenInfo/CMS-Quality-Strategy.html. CMS Domains appear in presentations by CMS and ONC (available at http://www.cms.gov/eHealth/downloads/Webinar_eHealth_March25_eCQM101.pdf) and a CMS discussion of the NQS Domains can be found at URL http://www.cms.gov/Regulations-and-Guidance/Legislation/EHRIncentivePrograms/2014_ClinicalQualityMeasures.html.

        ---------------------------------------------------------------------------

        Page 68668

        GRAPHIC TIFF OMITTED TR05NO15.004

      2. Referenced Quality Measure Authorities

        We proposed at Sec. 484.315 that Medicare-certified HHAs will be evaluated using a starter set of quality measures (``starter set'' refers to the quality measures for the first year of this model) designed to encompass multiple NQS domains, and provide future flexibility to incorporate and study newly developed measures over time. New and evolving measures will be considered for inclusion in subsequent years of this model and proposed through future rulemaking.

        To create the proposed starter set, we began by researching the current set of OASIS measures being used within the home health environment.\27\ Following that, we searched for endorsed quality measures using the National Quality Forum (NQF) Quality Positioning System (QPS),\28\ selecting measures that address all possible NQS domains. We further examined measures on the CMS-generated Measures Under Consideration (MUC) list,\29\ and reviewed other relevant measures used within the health care industry but not currently used in the home health setting, as well as measures required by the IMPACT Act of 2014. Finally, we searched the National Quality Measures Clearinghouse (NQMC) to identify evidence-based measures and measure sets.

        ---------------------------------------------------------------------------

        \27\ All data for the starter set measures, not including New Measures, is currently collected from HHAs under Sec. Sec. 484.20 and 484.210.

        \28\ The NQF Quality Positioning System is available at http://www.qualityforum.org/QPS.

        \29\ To review the MUC List see https://www.qualityforum.org/Setting_Priorities/Partnership/Measures_Under_Consideration_List_2014.aspx.

        ---------------------------------------------------------------------------

      3. Key Policy Considerations and Data Sources

        So that measures for the HHVBP Model take a more holistic view of the patient beyond a particular disease state or care setting, we proposed, and are finalizing in this rule, measures (both outcome measures and process measures) that have the potential to follow patients across multiple settings, reflect a multi-faceted approach, and foster the intersection of health care delivery and population health. A key consideration behind this approach is to use, in performance year one (PY1) of the model, proven measures that are readily available and meet a high-impact need, and, in subsequent model years, to augment this starter set with innovative measures that have the potential to be impactful and fill critical measure gap areas. All substantive changes or additions to the starter set or new measures would be proposed in future rulemaking. This approach to quality measure selection aims to balance the burden of collecting data with the inclusion of new and important measures. We carefully considered the potential burden on HHAs to report the measure data when developing the starter set, and prioritized measures that will draw both from claims data and data already collected in OASIS.

        The majority of the measures proposed, as well as the majority of measures being finalized, in this model will use OASIS data currently being reported to CMS and linked to state-specific CCNs for selected states in order to promote consistency and to reduce the data collection burden for providers. Utilizing primarily OASIS data will allow the model to leverage reporting structures already in place to evaluate performance and identify weaknesses in care delivery. This model will also afford the opportunity to study measures developed in other care settings and new to the home health industry (hereinafter referred to as ``New Measures''). Many of the New Measures have been used in other health care settings and are readily applicable to the home health environment (for example, influenza vaccination coverage for health care personnel). The final New Measures for PY1 are described in detail below. We proposed, and are finalizing with modification, to collect data in PY1 on these New Measures, which have already been tested for validity, reliability, usability/feasibility, and sensitivity in other health care settings but have not yet been validated within the home health setting. As discussed in further detail under ``E5. New Measures,'' we are finalizing three of the four proposed New Measures for reporting under this model. The HHVBP Model will study whether their use in the home health setting meets criteria for validity, reliability, usability/feasibility, and sensitivity to statistical variation. For PY1, we proposed that HHAs could earn points to be included in the Total Performance Score (TPS) simply for reporting data on New Measures (see Section--Performance Scoring Methodology). To the extent we determine that one or more of the New Measures is valid and reliable for the home health setting, we will consider, in future rulemaking, scoring Medicare-certified HHAs on their actual performance on the measure.

        3. Selected Measures

        The initial set of measures proposed for PY1 of the model utilizes data collected via OASIS, Medicare claims, HHCAHPS survey data, and data reported directly from the HHAs to CMS. We proposed, in total, 10 process measures and 15 outcome measures (see Figure 4a of the proposed rule) plus four New Measures (see Figure 4b of the proposed rule). As discussed below, we are finalizing the proposed starter set of measures with modification; specifically, under our final policy, there are in total six process measures and 15 outcome measures (see Figure 4a of this final rule) and three New Measures (see Figure 4b of this final rule). Process measures evaluate the rate of HHA use of specific evidence-based processes of care. Outcome measures illustrate the end result of care delivered to HHA patients. When available, NQF-endorsed measures will be used. This set of measures will be subject to change or retirement during subsequent model years and revised through the rulemaking process. For example, we may propose in future rulemaking to remove one or more of these measures if, based on the evidence, we conclude that it is no longer appropriate for the model due to its performance being topped-out. We will also consider proposing to update the measure set if new measures that address gaps within the NQS domains become available. We will also consider proposing adjustments to the measure set based on lessons learned during the course of the model. For instance, in light of the passage of the IMPACT Act of 2014, which mandates the collection and use of standardized post-acute care assessment data, we will consider proposing in future rulemaking to adopt measures that meet the requirements of the IMPACT Act as soon as they become available. Provisions of the IMPACT Act applicable to HHAs will take effect beginning in CY 2017. Currently, IMPACT measures for home health are in the development stage and not available for inclusion in the starter set of measures. We requested public comment on the methodology for constructing the proposed starter set of quality measures and on the proposed selected measures.

        Comment: Many commenters expressed concern at the number of measures proposed for use in the model, with the primary concern related to the burden placed on HHAs to focus on so many different areas at once, as well as the effort required to track and report New Measures at the same time. Many commenters suggested decreasing the number of measures, particularly process measures, in the starter set and expressed the opinion that fewer measures would allow for greater targeting of quality improvement.

        Response: We have considered the commenters' suggestions and agree that more narrowly focusing the starter set of measures being tested in the HHVBP Model may increase the likelihood of HHA success in their quality improvement and transformation efforts. In addition, we were encouraged by commenters to re-evaluate the proposed starter set of measures and specifically include fewer process measures in the final starter set. After consideration of these comments, we are reducing the number of measures in the final starter set. We proposed that the starter set would include 25 measures that are currently reported through existing systems (in addition to the proposed New Measures). Twenty of these proposed measures were process/outcome measures collected through the OASIS or through claims data, and five were HHCAHPS measures. We agree with commenters that placing an emphasis on outcome measures over process measures assesses performance in a way that is most meaningful to patients. For each process measure in the proposed starter set, we analyzed what specific metrics were being assessed in relation to the entire starter set and how close the measure was to being `topped-out' based on the most recent available data. Based on these comments and for the reasons stated, we are reducing the number of process measures by four, resulting in a final starter set with six process measures, 10 outcome measures, and five HHCAHPS measures. In addition, we have decreased the New Measures from four to three (as discussed later in this section). We are not including the following proposed measures in the final starter set: Timely Initiation of Care (NQF0526), Pressure Ulcer Prevention and Care (NQF0538), Multifactor Fall Risk Assessment Conducted for All Patients who can Ambulate (NQF0537), Depression assessment conducted (NQF0518), and Adverse Event for Improper Medication Administration and/or Side Effects (New Measure).

        Comment: We received some public comments expressing concern that all measures in the starter set are not endorsed by NQF.

        Response: We agree that, wherever possible, NQF-endorsed measures should be utilized. When creating the proposed starter set, it was our policy to utilize an NQF-endorsed measure whenever one was available to address a known quality improvement issue in home health. For other measures included in the finalized starter set, we are utilizing long-standing OASIS data components to track quality. As an innovation model, it is our intention to closely monitor the quality measures and to address any needed adjustments through future rulemaking. In addition, the information we learn during this model may, where appropriate, be utilized to assist effective measures in gaining endorsement within the HH service line.

        Comment: We received a number of public comments citing the settlement agreement in Jimmo v. Sebelius and expressing concern with the inclusion of five measures related to improvement and articulating the importance of including measures related to patient stabilization and maintenance.

        Response: We appreciate the feedback on the measures methodology and acknowledge that skilled care may be necessary to improve a patient's current condition, to maintain the patient's current condition, or to prevent or slow further deterioration of the patient's condition, as was clarified through the Jimmo settlement. The Jimmo settlement agreement, however, pertains only to the clarification of CMS's manual guidance on coverage standards, not payment measures, and expressly does not pertain to or prevent the implementation of new regulations, including new regulations pertaining to the HHVBP Model. While we considered using some of the stabilization measures for this model, we found that, in contrast to the average HHA improvement measure scores, which ranged from 56- to 65-percent, the average HHA stabilization measure scores ranged from 94- to 96-percent. Using measures where the average rates are nearly 100-percent would not allow for meaningful comparisons between competing HHAs on the quality of care delivered. In addition, we performed analyses on whether the proportion of an individual HHA's episodes of care relating to ``low therapy'' episodes (episodes with 0-5 therapy visits) and the proportion of an individual HHA's total therapy visits relating to maintenance therapy would have an impact on the measures related to improvement used in the model. HHAs that have a higher proportion of patients that require maintenance therapy or patients that receive little to no therapy at all would not be expected to perform well on the measures related to improvement. Although the functional measures related to improvement are expected to be sensitive to the provision of therapy, our analysis did not determine that HHAs' performance on the measures related to improvement was negatively impacted by whether they had a higher proportion of maintenance therapy patients or a higher proportion of patients that had little to no therapy.

        Based on these two analyses, CMS expects that, at this time, HHAs that provide care to more beneficiaries that are maintenance-oriented will not be at a disadvantage in the model. We also do not expect any access issues for beneficiaries that have more maintenance needs because HHAs would not know whether the beneficiary has restorative or maintenance needs until the HHA initiates the episode of care and conducts the necessary assessments. Once the initial OASIS assessment is complete, the beneficiary will be included in measure calculation.

        We are finalizing the measures related to improvement as proposed in the proposed rule; however, we are sensitive to this issue and will closely monitor whether HHVBP Model-specific measures have the potential to impact beneficiaries who require skilled care to maintain their current condition, or to prevent or slow further deterioration of their condition. We will use future rulemaking to address this issue if we determine that it has a meaningful detrimental effect on payments to those HHAs that provide more maintenance care. In addition, we are currently working on the development of valid and reliable stabilization measures that may be incorporated into the HHVBP Model in the future. One stabilization measure is referenced in Table 20, `Future Setting-specific Measure Constructs under Consideration'. The HHVBP Model is designed such that any measures determined to be good indicators of quality will be considered for use in the HHVBP Model in future years and may be added through the rulemaking process.

        Comment: Although CMS received general support for the use of OASIS data, some commenters expressed concern with OASIS issues related to data validation or with the use of certain OASIS data elements as the basis for measuring quality.

        Response: We appreciate the comments on this issue and are committed to balancing concerns related to provider burden with concerns related to data validation and accurate reporting of information to CMS via OASIS. In designing the HHVBP Model, we intentionally crafted a starter set of measures to minimize burden. Specifically, the majority of measures rely on OASIS data already reported by HHAs. In response to a 2012 report issued by the Office of the Inspector General,\30\ CMS affirmed a series of monitoring activities related to OASIS education and training, and also updated the HHA surveyor worksheet related to HHA OASIS compliance. As part of the monitoring and evaluation of this model, CMMI will utilize CMS best practices for determining the validity of OASIS data and detecting fraud related to data submission. Should validation concerns arise, CMMI may consider implementing data validation processes. The model will closely monitor reported measures for indications of fraud, and CMS will propose any changes to the model as needed in future rulemaking.

        ---------------------------------------------------------------------------

        \30\ Cite for OIG report here.

        ---------------------------------------------------------------------------

        Comment: A few commenters expressed specific concern that measures in the starter set will be duplicative of, or will not take into account the future measures implemented under the IMPACT Act, and suggested consciously aligning the HHVBP starter set with the IMPACT Act as it is implemented.

        Response: We agree the HHVBP measure set should be in alignment with the IMPACT Act. As stated in the HHVBP proposed rule and finalized here, as soon as new IMPACT Act measures are finalized and approved, we will consider how best to incorporate and align IMPACT Act measures with the HHVBP measure set in future rulemaking. As an example, once baseline data are available for NQF #0678 `pressure ulcers', which will be implemented in CY 2016, we will consider using this measure in future years through rulemaking.

        Comment: One commenter recommended eliminating all vaccine-related measures, as vaccines are not the primary focus of home health care. The commenter stated that the use of vaccine-related measures creates misalignment between patient centered principles and HHA financial incentives.

        Response: We have included two immunization measures in the starter set that are NQF-endorsed as preventive services measures and already collected by home health agencies: the pneumococcal vaccine and influenza vaccine measures for HHA beneficiaries. The immunization measures that are New Measures under the final HHVBP Model, the shingles vaccine for beneficiaries and the influenza vaccine for HHA staff, serve important public health functions. The New Measure for influenza vaccination for HHA staff rests on the well-established scientific principle that vaccinating staff is a sound mechanism for protecting vulnerable patient populations from avoidable disease transmission. In addition, this New Measure is utilized in every care setting except home health, and is intended to close that gap in protection. The shingles vaccination is the other New Measure utilizing immunizations, and its efficacy in either preventing shingles entirely or reducing the pain symptoms associated with shingles is directly related to improvement of patient quality of life. The measurements related to vaccination are not connected to whether a patient does or does not receive the vaccinations. Patients are free to decline vaccinations, and competing HHAs are not financially penalized for the patient's choice.

        Final Decision: For the reasons discussed and in consideration of the comments received we are not finalizing the following proposed measures:

        Timely Initiation of Care (NQF0526)

        Pressure Ulcer Prevention and Care (NQF0538)

        Multifactor Fall Risk Assessment Conducted for All Patients Who Can Ambulate (NQF0537)

        Depression assessment conducted (NQF0518)

        Adverse Event for Improper Medication Administration and/or Side Effects (New Measure)

        We are finalizing the remaining quality measures as proposed. The final starter set includes six process measures, 10 outcome measures, and five HHCAHPS measures, plus three New Measures.

        The final PY1 measures are presented in the following figures.

        Page 68671

        Figure 4a: Final PY1 Measures \31\

        Measure title: Improvement in Ambulation-Locomotion
        NQS Domain: Clinical Quality of Care
        Measure type: Outcome
        Identifier: NQF0167
        Data source: OASIS (M1860)
        Numerator: Number of home health episodes of care where the value recorded on the discharge assessment indicates less impairment in ambulation/locomotion at discharge than at the start (or resumption) of care.
        Denominator: Number of home health episodes of care ending with a discharge during the reporting period, other than those covered by generic or measure-specific exclusions.

        Measure title: Improvement in Bed Transferring
        NQS Domain: Clinical Quality of Care
        Measure type: Outcome
        Identifier: NQF0175
        Data source: OASIS (M1850)
        Numerator: Number of home health episodes of care where the value recorded on the discharge assessment indicates less impairment in bed transferring at discharge than at the start (or resumption) of care.
        Denominator: Number of home health episodes of care ending with a discharge during the reporting period, other than those covered by generic or measure-specific exclusions.

        Measure title: Improvement in Bathing
        NQS Domain: Clinical Quality of Care
        Measure type: Outcome
        Identifier: NQF0174
        Data source: OASIS (M1830)
        Numerator: Number of home health episodes of care where the value recorded on the discharge assessment indicates less impairment in bathing at discharge than at the start (or resumption) of care.
        Denominator: Number of home health episodes of care ending with a discharge during the reporting period, other than those covered by generic or measure-specific exclusions.

        Measure title: Improvement in Dyspnea
        NQS Domain: Clinical Quality of Care
        Measure type: Outcome
        Identifier: NA
        Data source: OASIS (M1400)
        Numerator: Number of home health episodes of care where the discharge assessment indicates less dyspnea at discharge than at start (or resumption) of care.
        Denominator: Number of home health episodes of care ending with a discharge during the reporting period, other than those covered by generic or measure-specific exclusions.

        Measure title: Discharged to Community
        NQS Domain: Communication & Care Coordination
        Measure type: Outcome
        Identifier: NA
        Data source: OASIS (M2420)
        Numerator: Number of home health episodes where the assessment completed at the discharge indicates the patient remained in the community after discharge.
        Denominator: Number of home health episodes of care ending with discharge or transfer to inpatient facility during the reporting period, other than those covered by generic or measure-specific exclusions.

        Measure title: Care Management: Types and Sources of Assistance
        NQS Domain: Communication & Care Coordination
        Measure type: Process
        Identifier: NA
        Data source: OASIS (M2102)
        Numerator: Multiple data elements.
        Denominator: Multiple data elements.

        Measure title: Acute Care Hospitalization: Unplanned Hospitalization during first 60 days of Home Health
        NQS Domain: Efficiency & Cost Reduction
        Measure type: Outcome
        Identifier: NQF0171
        Data source: CCW (Claims)
        Numerator: Number of home health stays for patients who have a Medicare claim for an admission to an acute care hospital in the 60 days following the start of the home health stay.
        Denominator: Number of home health stays that begin during the 12-month observation period. A home health stay is a sequence of home health payment episodes separated from other home health payment episodes by at least 60 days.

        Measure title: Emergency Department Use without Hospitalization
        NQS Domain: Efficiency & Cost Reduction
        Measure type: Outcome
        Identifier: NQF0173
        Data source: CCW (Claims)
        Numerator: Number of home health stays for patients who have a Medicare claim for outpatient emergency department use and no claims for acute care hospitalization in the 60 days following the start of the home health stay.
        Denominator: Number of home health stays that begin during the 12-month observation period. A home health stay is a sequence of home health payment episodes separated from other home health payment episodes by at least 60 days.

        Measure title: Improvement in Pain Interfering with Activity
        NQS Domain: Patient Safety
        Measure type: Outcome
        Identifier: NQF0177
        Data source: OASIS (M1242)
        Numerator: Number of home health episodes of care where the value recorded on the discharge assessment indicates less frequent pain at discharge than at the start (or resumption) of care.
        Denominator: Number of home health episodes of care ending with a discharge during the reporting period, other than those covered by generic or measure-specific exclusions.

        Measure title: Improvement in Management of Oral Medications
        NQS Domain: Patient Safety
        Measure type: Outcome
        Identifier: NQF0176
        Data source: OASIS (M2020)
        Numerator: Number of home health episodes of care where the value recorded on the discharge assessment indicates less impairment in taking oral medications correctly at discharge than at start (or resumption) of care.
        Denominator: Number of home health episodes of care ending with a discharge during the reporting period, other than those covered by generic or measure-specific exclusions.

        Measure title: Prior Functioning ADL/IADL
        NQS Domain: Patient Safety
        Measure type: Outcome
        Identifier: NQF0430
        Data source: OASIS (M1900)
        Numerator: The number (or proportion) of a clinician's patients in a particular risk adjusted diagnostic category who meet a target threshold of improvement in Daily Activity (that is, ADL and IADL) functioning.
        Denominator: All patients in a risk adjusted diagnostic category with a Daily Activity goal for an episode of care. Cases to be included in the denominator could be identified based on ICD-9 codes or, alternatively, based on CPT codes relevant to treatment goals focused on Daily Activity function.

        Measure title: Influenza Vaccine Data Collection Period: Does this episode of care include any dates on or between October 1 and March 31?
        NQS Domain: Population/Community Health
        Measure type: Process
        Identifier: NA
        Data source: OASIS (M1041)
        Numerator: NA
        Denominator: NA

        Measure title: Influenza Immunization Received for Current Flu Season
        NQS Domain: Population/Community Health
        Measure type: Process
        Identifier: NQF0522
        Data source: OASIS (M1046)
        Numerator: Number of home health episodes during which patients (a) received vaccination from the HHA, or (b) had received vaccination from the HHA during an earlier episode of care, or (c) were determined to have received vaccination from another provider.
        Denominator: Number of home health episodes of care ending with discharge, or transfer to inpatient facility, during the reporting period, other than those covered by generic or measure-specific exclusions.

        Measure title: Pneumococcal Polysaccharide Vaccine Ever Received
        NQS Domain: Population/Community Health
        Measure type: Process
        Identifier: NQF0525
        Data source: OASIS (M1051)
        Numerator: Number of home health episodes during which patients were determined to have ever received Pneumococcal Polysaccharide Vaccine (PPV).
        Denominator: Number of home health episodes of care ending with discharge or transfer to inpatient facility during the reporting period, other than those covered by generic or measure-specific exclusions.

        Measure title: Reason Pneumococcal Vaccine Not Received
        NQS Domain: Population/Community Health
        Measure type: Process
        Identifier: NA
        Data source: OASIS (M1056)
        Numerator: NA
        Denominator: NA

        Measure title: Drug Education on All Medications Provided to Patient/Caregiver during all Episodes of Care
        NQS Domain: Clinical Quality of Care
        Measure type: Process
        Identifier: NA
        Data source: OASIS (M2015)
        Numerator: Number of home health episodes of care during which the patient/caregiver was instructed on how to monitor the effectiveness of drug therapy, how to recognize potential adverse effects, and how and when to report problems (since the previous OASIS assessment).
        Denominator: Number of home health episodes of care ending with a discharge or transfer to inpatient facility during the reporting period, other than those covered by generic or measure-specific exclusions.

        Home Health CAHPS: Satisfaction Survey Measures

        Measure title: Care of Patients
        NQS Domain: Patient & Caregiver-Centered Experience
        Measure type: Outcome
        Data source: CAHPS
        Numerator: NA
        Denominator: NA

        Measure title: Communications between Providers and Patients
        NQS Domain: Patient & Caregiver-Centered Experience
        Measure type: Outcome
        Data source: CAHPS
        Numerator: NA
        Denominator: NA

        Measure title: Specific Care Issues
        NQS Domain: Patient & Caregiver-Centered Experience
        Measure type: Outcome
        Data source: CAHPS
        Numerator: NA
        Denominator: NA

        Measure title: Overall rating of home health care
        NQS Domain: Patient & Caregiver-Centered Experience
        Measure type: Outcome
        Data source: CAHPS
        Numerator: NA
        Denominator: NA

        Measure title: Willingness to recommend the agency
        NQS Domain: Patient & Caregiver-Centered Experience
        Measure type: Outcome
        Data source: CAHPS
        Numerator: NA
        Denominator: NA

        ---------------------------------------------------------------------------

        \31\ For more detailed information on the proposed measures utilizing OASIS refer to the OASIS-C1/ICD-9, Changed Items & Data Collection Resources dated September 3, 2014 available at www.oasisanswers.com/LiteratureRetrieve.aspx?ID=215074. For NQF endorsed measures see the NQF Quality Positioning System available at http://www.qualityforum.org/QPS. For non-NQF measures using OASIS see links for data tables related to OASIS measures at http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HomeHealthQualityInits/HHQIQualityMeasures.html. For information on HHCAHPS measures see https://homehealthcahps.org/SurveyandProtocols/SurveyMaterials.aspx.
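        To make the construction of the OASIS-based improvement measures in Figure 4a concrete, the following sketch computes a simplified improvement rate (for example, Improvement in Ambulation-Locomotion). The record fields, the assumption that a lower OASIS item value indicates less impairment, and the handling of exclusions are simplifications for illustration and do not reflect the official measure specifications.

```python
# Simplified, illustrative calculation of an OASIS-based improvement outcome
# measure; not the official measure specification.
from dataclasses import dataclass
from typing import Iterable, Optional


@dataclass
class Episode:
    """One home health episode of care (illustrative fields only)."""
    discharged_in_period: bool      # ended with a discharge in the reporting period
    excluded: bool                  # covered by a generic or measure-specific exclusion
    soc_value: Optional[int]        # OASIS item value at start (or resumption) of care
    discharge_value: Optional[int]  # OASIS item value at discharge


def improvement_rate(episodes: Iterable[Episode]) -> Optional[float]:
    """Return numerator/denominator for a simplified improvement measure."""
    denominator = [e for e in episodes if e.discharged_in_period and not e.excluded]
    if not denominator:
        return None
    numerator = [
        e for e in denominator
        if e.soc_value is not None
        and e.discharge_value is not None
        and e.discharge_value < e.soc_value  # assumed: lower value = less impairment
    ]
    return len(numerator) / len(denominator)
```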

        Page 68673

        Figure 4b--Final PY1 New Measures

        Measure title: Influenza Vaccination Coverage for Home Health Care Personnel
        NQS Domain: Population/Community Health
        Measure type: Process
        Identifier: NQF0431 (Used in other care settings, not Home Health)
        Data source: Reported by HHAs through Web Portal
        Numerator: Healthcare personnel in the denominator population who, during the time from October 1 (or when the vaccine became available) through March 31 of the following year: (a) received an influenza vaccination administered at the healthcare facility, or reported in writing or provided documentation that influenza vaccination was received elsewhere; or (b) were determined to have a medical contraindication/condition of severe allergic reaction to eggs or to other components of the vaccine or history of Guillain-Barre Syndrome within 6 weeks after a previous influenza vaccination; or (c) declined influenza vaccination; or (d) persons with unknown vaccination status or who do not otherwise meet any of the definitions of the above-mentioned numerator categories.
        Denominator: Number of healthcare personnel who are working in the healthcare facility for at least 1 working day between October 1 and March 31 of the following year, regardless of clinical responsibility or patient contact.

        Measure title: Herpes zoster (Shingles) vaccination: Has the patient ever received the shingles vaccination?
        NQS Domain: Population/Community Health
        Measure type: Process
        Identifier: NA
        Data source: Reported by HHAs through Web Portal
        Numerator: Total number of Medicare beneficiaries aged 60 years and over who report having ever received zoster vaccine (shingles vaccine).
        Denominator: Total number of Medicare beneficiaries aged 60 years and over receiving services from the HHA.

        Measure title: Advance Care Plan
        NQS Domain: Communication & Care Coordination
        Measure type: Process
        Identifier: NQF0326
        Data source: Reported by HHAs through Web Portal
        Numerator: Patients who have an advance care plan or surrogate decision maker documented in the medical record, or documentation in the medical record that an advance care plan was discussed but the patient did not wish or was not able to name a surrogate decision maker or provide an advance care plan.
        Denominator: All patients aged 65 years and older.

        4. Additional Information on HHCAHPS

        Figure 5 provides details on the elements of the Home Health Care Consumer Assessment of Healthcare Providers and Systems Survey (HHCAHPS) that we proposed, and are finalizing, to include in the PY1 starter set. The HHVBP Model will not alter the current HHCAHPS scoring methodology or the participation requirements in any way. Details on participation requirements for HHCAHPS can be found at 42 CFR 484.250 \32\ and details on HHCAHPS scoring methodology are available at https://homehealthcahps.org/SurveyandProtocols/SurveyMaterials.aspx.\33\

        ---------------------------------------------------------------------------

        \32\ 76 FR 68606, Nov. 4, 2011, as amended at 77 FR 67164, Nov. 8, 2012; 79 FR 66118, Nov. 6, 2014.

        \33\ Detailed scoring information is contained in the Protocols and Guidelines manual posted on the HHCAHPS Web site and available at https://homehealthcahps.org/Portals/0/PandGManual_NOAPPS.pdf.

        Figure 5--Home Health Care Consumer Assessment of Healthcare Providers and Systems Survey (HHCAHPS) Composites

        Care of Patients

        Q9. In the last 2 months of care, how often did home health providers from this agency seem informed and up-to-date about all the care or treatment you got at home? Response categories: Never, Sometimes, Usually, Always.
        Q16. In the last 2 months of care, how often did home health providers from this agency treat you as gently as possible? Response categories: Never, Sometimes, Usually, Always.
        Q19. In the last 2 months of care, how often did home health providers from this agency treat you with courtesy and respect? Response categories: Never, Sometimes, Usually, Always.
        Q24. In the last 2 months of care, did you have any problems with the care you got through this agency? Response categories: Yes, No.

        Communications Between Providers & Patients

        Q2. When you first started getting home health care from this agency, did someone from the agency tell you what care and services you would get? Response categories: Yes, No.
        Q15. In the past 2 months of care, how often did home health providers from this agency keep you informed about when they would arrive at your home? Response categories: Never, Sometimes, Usually, Always.
        Q17. In the past 2 months of care, how often did home health providers from this agency explain things in a way that was easy to understand? Response categories: Never, Sometimes, Usually, Always.
        Q18. In the past 2 months of care, how often did home health providers from this agency listen carefully to you? Response categories: Never, Sometimes, Usually, Always.
        Q22. In the past 2 months of care, when you contacted this agency's office did you get the help or advice you needed? Response categories: Yes, No.
        Q23. When you contacted this agency's office, how long did it take for you to get the help or advice you needed? Response categories: Same day; 1 to 5 days; 6 to 14 days; More than 14 days.

        Specific Care Issues

        Q3. When you first started getting home health care from this agency, did someone from the agency talk with you about how to set up your home so you can move around safely? Response categories: Yes, No.
        Q4. When you started getting home health care from this agency, did someone from the agency talk with you about all the prescription medicines you are taking? Response categories: Yes, No.
        Q5. When you started getting home health care from this agency, did someone from the agency ask to see all the prescription medicines you were taking? Response categories: Yes, No.
        Q10. In the past 2 months of care, did you and a home health provider from this agency talk about pain? Response categories: Yes, No.
        Q12. In the past 2 months of care, did home health providers from this agency talk with you about the purpose for taking your new or changed prescription medicines? Response categories: Yes, No.
        Q13. In the last 2 months of care, did home health providers from this agency talk with you about when to take these medicines? Response categories: Yes, No.
        Q14. In the last 2 months of care, did home health providers from this agency talk with you about the important side effects of these medicines? Response categories: Yes, No.

        Global Type Measures

        Q20. What number would you use to rate your care from this agency's home health providers? Response categories: Use a rating scale (0-10) (0 is worst, 10 is best).
        Q25. Would you recommend this agency to your family or friends if they needed home health care? Response categories: Definitely no; Probably no; Probably yes; Definitely yes.

        5. New Measures

        As discussed in the proposed rule and the previous section of this final rule, the New Measures we proposed are not currently reported by Medicare-certified HHAs to CMS, but we believe they fill gaps in the NQS Domains not completely covered by existing measures in the home health setting. We proposed that all competing HHAs in selected states, regardless of cohort size or number of episodes, will be required to submit data on the New Measures for all Medicare beneficiaries to whom they provide home health services within the state (unless an exception applies). We proposed at Sec. 484.315(b) that competing HHAs would be required to report data on these New Measures. Competing HHAs will submit New Measure data through a dedicated HHVBP web-based platform. This web-based platform will function as a means to collect and distribute information from and to competing HHAs. Also, for those HHAs with a sufficient number of episodes of care to be subject to a payment adjustment, New Measure scores included in the final TPS for PY1 are based only on whether the HHA has submitted data to the HHVBP web-based platform. We proposed the following New Measures for competing HHAs:

        Advance Care Planning;

        Adverse Event for Improper Medication Administration and/or Side Effects;

        Influenza Vaccination Coverage for Home Health Care Personnel; and,

        Herpes Zoster (Shingles) Vaccination received by HHA patients.

        For the reasons explained below and in consideration of the comments received, we are not including the proposed ``Adverse Event for Improper Medication Administration and/or Side Effects'' as one of the final New Measures. We are finalizing the other three proposed New Measures without modification.

      4. Advance Care Planning

        Advance Care Planning is an NQF-endorsed process measure in the NQS domain of Person- and Caregiver-centered experience and outcomes (see Figure 3). This measure is currently endorsed at the group practice/individual clinician level of analysis. We believe its adoption under the HHVBP Model represents an opportunity to study this measure in the home health setting. This is an especially pertinent measure for home health care to confirm that the wishes of the patient regarding their medical, emotional, or social needs are met across care settings. The Advance Care Planning measure will focus on Medicare beneficiaries, including dually-eligible beneficiaries.

        We proposed that the measure would be numerically expressed by a ratio whose numerator and denominator are as follows:

        Numerator: The measure would calculate the percentage of patients age 65 years and older served by the HHA that have an advance care plan or surrogate decision maker \34\ documented in the clinical record or documentation in the clinical record that an advance care plan was discussed, but the patient did not wish or was not able to name a surrogate decision maker or provide an advance care plan.

        ---------------------------------------------------------------------------

        \34\ A surrogate decision maker, also known as a health care proxy or agent, advocates for a patient who is unable to make decisions or speak for himself or herself about personal health care, such that someone else must provide direction in decision-making on the patient's behalf.

        ---------------------------------------------------------------------------

        Denominator: All patients aged 65 years and older admitted to the HHA.
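        As a simple illustration of the ratio described above, the following sketch computes an Advance Care Planning rate from per-patient documentation flags; the field names and data representation are assumptions for illustration only and are not part of the measure specification.

```python
# Illustrative calculation of the Advance Care Planning measure rate;
# field names are assumptions, not the official measure specification.
from dataclasses import dataclass
from typing import Iterable, Optional


@dataclass
class Patient:
    """One HHA patient (illustrative fields only)."""
    age: int
    plan_or_surrogate_documented: bool        # advance care plan or surrogate in the record
    discussion_documented_but_declined: bool  # discussed; patient declined or was unable


def advance_care_planning_rate(patients: Iterable[Patient]) -> Optional[float]:
    """Fraction of patients aged 65 and older meeting the numerator definition."""
    denominator = [p for p in patients if p.age >= 65]
    if not denominator:
        return None
    numerator = [p for p in denominator
                 if p.plan_or_surrogate_documented or p.discussion_documented_but_declined]
    return len(numerator) / len(denominator)
```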

        Advance care planning helps ensure that the health care plan is consistent with the patient's wishes and preferences. Therefore, studying this measure within the HHA environment allows for further analysis of planning for the ``what ifs'' that may occur during the patient's lifetime. In addition, the use of this measure is expected to result in an increase in the number of patients with advance care plans. Increased advance care planning among the elderly is expected to result in enhanced patient autonomy and reduced hospitalizations and in-hospital deaths.\35\

        ---------------------------------------------------------------------------

        \35\ Lauren Hersch Nicholas, Ph.D., MPP et al. Regional Variation in the Association Between Advance Directives and End-of-Life Medicare Expenditures. JAMA. 2011;306(13):1447-1453. doi:10.1001/jama.2011.1410.

        ---------------------------------------------------------------------------

        We invited comments on this proposed measure.

        Comment: Some commenters expressed support for the inclusion of the advance care directive quality measure in the HHVBP Model as an important step towards advancing the needs and wishes of Medicare beneficiaries and improving care near the end of life. One commenter suggested CMS should collect data separately for advance care plans and for surrogate decision makers, since they should not be considered to be alternatives to each other and suggested breaking this one measure into two new separate measures. Another commenter recommended that information collected for Advanced Care Planning be compliant with the standard at Sec. 484.10(c)(ii), in which the HHA must inform and distribute written information to the patient, in advance, concerning its policies on advance directives, including a description of applicable state law.

        Response: HHAs are already required to comply with the Conditions of Participation as codified in Sec. 484.10(c)(1)(ii) regarding patient rights, and participation in this model in no way alters those regulatory obligations for participating HHAs. We will analyze the data collected for this New Measure and, based on this analysis, determine if we need to modify the measure in future rulemaking. We also note that standard practices for developing advance care plans integrate the selection of a surrogate decision maker into the plan, so that if and when a surrogate is needed, they are readily made aware of the patient's wishes as articulated in the care plan.

        Comment: One commenter did not support adoption of an Advance Care Planning measure and stated that an HHA should not be given an incentive to make the patient acquire an advanced directive. The commenter also asserted that Advance Care Planning is better suited for long-term care relationships and that advance directive compliance is already assessed at the HHA level. The commenter expressed concern that the Advance Care Planning measure shows a preference for living wills instead of working through a process to create an advance care plan.

        Response: Advance Care Plans are fundamentally different from advance directives (also referred to as living wills). The basis for an Advance Care Plan is ongoing communication with health providers, family members, and potential surrogate decision makers; Advance Care Plans are not focused exclusively on end of life or life-threatening conditions. Advance Care Plans ensure patient-centered care by providing an opportunity for health care providers and patients to identify how a patient would like to be cared for when a medical crisis makes it difficult or impossible for the patient to make his or her own health care decisions.

        Comment: Commenters suggested that this metric, and the reporting on all New Measures, be delayed until CY 2017, and that it be included within OASIS for data collection due to the complexity of the question and its multiple parts.

        Response: Based on the comments we received from HHAs to delay the reporting requirement for New Measures, including Advance Care Planning, we are modifying our proposal to require HHAs to submit the first round of data on this and the other New Measures no later than October 7, 2016 for the period July 2016 through September 2016. In response to the recommendation that we incorporate this measure into OASIS before including it in the Model, we note that part of the purpose of testing this measure in the HH setting is to make informed decisions, based on newly available data analysis, before recommending that this measure be incorporated into the measures that all HHAs are required to report.

        Comment: Some commenters expressed concern that the Advance Care Planning Measure does not clearly state that the patient does not have to complete the advance care plan. In addition, some commenters wrote that the measure creates an incentive to pressure patients to do so. A few commenters requested CMS make regulations and policy guidance on the Advance Care Planning measure to more strongly clarify that the well-being and autonomy of the individual patient is the primary concern, not cost savings for the government.

        Response: Beneficiaries are free to make their own decisions related to their participation in their care, and this measure assesses whether providers give patients the information and opportunity to engage in planning their own care. The intent of the measure is to provide education and guidance to beneficiaries, not to pressure them. We will provide robust technical assistance for HHAs related to this New Measure, including the tools and information necessary for ensuring autonomous decision making on the part of the patient.

        Final Decision: For the reasons discussed and in consideration of the comments received, we are finalizing this New Measure as proposed, with the modification that HHAs will be required to begin reporting data no later than October 7, 2016 for the period July 2016 through September 2016 and quarterly thereafter. As a result, the first quarterly performance report in July 2016 will not account for any of the New Measures.

      5. Adverse Event for Improper Medication Administration and/or Side Effects

        We proposed an Adverse Event for Improper Medication Administration and/or Side Effects measure that aligns with the NQS domain of Safety (specifically ``medication safety''--see Figure 3) with the goal of making care safer by reducing harm caused in the delivery of care. The National Quality Forum included adverse drug events (ADEs) as a Serious Reportable Event (SRE) in the category of Care Management, defining said event as a ``patient death or serious injury associated with a medication error (for example, errors involving the wrong drug, wrong dose, wrong patient, wrong time, wrong rate, wrong preparation, or wrong route of administration),'' noting that ``. . . the high rate of medication errors resulting in injury and death makes this event important to endorse again.'' \36\ We refer readers to the CY 2016 HH PPS proposed rule for more detail on this proposed measure (80 FR 39883 through 39884).

        ---------------------------------------------------------------------------

        \36\ National Quality Forum, Serious Reportable Events in Healthcare-2011, at 9. (2011), available at: http://www.qualityforum.org/Publications/2011/12/Serious_Reportable_Events_in_Healthcare_2011.aspx.

        ---------------------------------------------------------------------------

        We invited comments on the Adverse Drug Events measure.

        Comment: Many commenters noted the duplication between this proposed New Measure and an existing OASIS adverse event outcome measure, ``Emergent Care for Improper Medication Administration, Medication Side Effects''. A commenter recommended substituting the proposed New Measure, Adverse Event for Improper Medication Administration and/or Side Effects, with the current Potentially Avoidable Event Outcome measure, ``Emergent Care for Improper Medication Administration, Medication Side Effects'', generated using OASIS data. In addition, commenters generally did not support inclusion of the ADE metric as part of HHVBP because: HHA staff are not typically trained to positively identify ADEs, which are often complex; ADEs often only become apparent after further care; the complexity of ADEs means they are often not identified on discharge paperwork, so more effort would be required to identify them and less vigilant HHAs would be rewarded for not inputting information; and drug education metrics are already part of Home Health Compare and of OASIS data. One commenter expressed concern that the ADE measure could create a disincentive for HHAs to accept patients with complex medication regimens.

        Response: We agree with the comments suggesting that Adverse Drug Event data would be duplicative and are not finalizing this measure for PY1 of the model. We will evaluate whether there is a more narrowly tailored approach for measuring quality performance related to medication management. We will continue to analyze ways to address the issue of adverse drug events in the home health setting and will seek input from stakeholders on including an alternative measure in future model years.

        Final Decision: In consideration of comments received we are not finalizing this measure.

      6. Influenza Vaccination Coverage for Home Health Care Personnel

        Staff Immunizations (Influenza Vaccination Coverage among Health Care Personnel) (NQF #0431) is an NQF-endorsed measure that addresses the NQS domain of Population Health (see Figure 3). The measure is currently endorsed in the following settings: Ambulatory Care: Ambulatory Surgery Center (ASC); Ambulatory Care: Clinician Office/Clinic; Dialysis Facility; Hospital/Acute Care Facility; Post-Acute/Long Term Care Facility: Inpatient Rehabilitation Facility; Post-Acute/Long Term Care Facility: Long Term Acute Care Hospital; and Post-Acute/Long Term Care Facility: Nursing Home/Skilled Nursing Facility. Home health care is one of the few remaining settings for which the measure has not been endorsed. We stated in the proposed rule that we believe the HHVBP Model presents an opportunity to study this measure in the home health setting. This measure is currently reported in multiple CMS quality reporting programs, including Ambulatory Surgical Center Quality Reporting, Hospital Inpatient Quality Reporting, and Long-Term Care Hospital Quality Reporting; we believe its adoption under the HHVBP Model presents an opportunity for alignment in our quality reporting programs. The documentation of staff immunizations is also a standard required by many HHA accrediting organizations. We believe that this measure would be appropriate for HHVBP because it addresses total population health across settings of care by reducing the exposure of individuals to a potentially avoidable virus.

        We proposed that the measure would be numerically expressed by a ratio whose numerator and denominator are as follows:

        Numerator: The measure would calculate the percentage of home health care personnel who receive the influenza vaccine and would document those who do not receive the vaccine, using the categories articulated below:

        (1) Received an influenza vaccination administered at the health care agency, or reported in writing (paper or electronic) or provided documentation that influenza vaccination was received elsewhere; or

        (2) Were determined to have a medical contraindication/condition of severe allergic reaction to eggs or to other component(s) of the vaccine, or history of Guillain-Barré Syndrome within 6 weeks after a previous influenza vaccination; or

        (3) Declined influenza vaccination; or

        (4) Persons with unknown vaccination status or who do not otherwise meet any of the definitions of the above-mentioned numerator categories.

        We proposed that each of the above groups would be divided by the number of health care personnel who are working in the HHA for at least one working day between October 1 and March 31 of the following year, regardless of clinical responsibility or patient contact.

        Denominator: This measure collects the number of home health care personnel who work in the HHA during the flu season: \37\ Denominators are to be calculated separately for the following three (3) groups:

        ---------------------------------------------------------------------------

        \37\ Flu season is generally October 1 (or when the vaccine became available) through March 31 of the following year. See URL http://www.cdc.gov/flu/about/season/flu-season.htm for detailed information.

        ---------------------------------------------------------------------------

        1. Employees: all persons who receive a direct paycheck from the reporting HHA (that is, on the agency's payroll);

        2. Licensed independent practitioners: include only physicians (MD, DO), advanced practice nurses, and physician assistants who are affiliated with the reporting agency and who do not receive a direct paycheck from the reporting HHA; and

        3. Adult students/trainees and volunteers: include all adult students/trainees and volunteers who do not receive a direct paycheck from the reporting HHA.
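        To make the arithmetic of the proposed numerator categories and denominator groups concrete, the following is a minimal illustrative sketch in Python; the category labels, group labels, and data structure are assumptions for illustration only and are not prescribed by this rule.

    from collections import Counter

    # Numerator categories described above (labels are illustrative).
    CATEGORIES = ("vaccinated", "contraindicated", "declined", "unknown")
    # Denominator groups: employees, licensed independent practitioners,
    # and adult students/trainees and volunteers.
    GROUPS = ("employees", "licensed_practitioners", "students_volunteers")

    def coverage_rates(personnel):
        """personnel: iterable of (group, category) pairs, one per person who
        worked at least one day in the HHA between October 1 and March 31."""
        personnel = list(personnel)
        denominators = Counter(group for group, _ in personnel)
        numerators = Counter(personnel)  # counts each (group, category) pair
        return {
            group: {cat: numerators[(group, cat)] / denominators[group]
                    for cat in CATEGORIES}
            for group in GROUPS if denominators[group]
        }

    # Example: three employees (two vaccinated, one declined) and one
    # volunteer with unknown vaccination status.
    staff = [("employees", "vaccinated"), ("employees", "vaccinated"),
             ("employees", "declined"), ("students_volunteers", "unknown")]
    print(coverage_rates(staff))
    # employees: vaccinated 2/3, declined 1/3; students_volunteers: unknown 1/1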

        We stated in the proposed rule that this measure for the HHVBP Model is expected to result in increased influenza vaccination among home health professionals. Reporting health care personnel influenza vaccination status would allow HHAs to better identify and target unvaccinated personnel. Increased influenza vaccination coverage among HHA personnel would be expected to result in reduced morbidity and mortality related to influenza virus infection among patients, especially elderly and vulnerable populations.\38\

        ---------------------------------------------------------------------------

        \38\ Carman WF, Elder AG, Wallace LA, et al. Effects of influenza vaccination of health-care workers on mortality of elderly people in long-term care: a randomized controlled trial. Lancet 2000; 355:93-97.

        ---------------------------------------------------------------------------

        We proposed, and are finalizing in this rule, that information on the above numerator and denominator will be reported by HHAs through the HHVBP Web-based platform, in addition to other information related to this measure as the Secretary deems appropriate.

        We invited comments on the proposed Staff Influenza Vaccination measure.

        Comment: A few commenters asserted that HHVBP is not the correct avenue for improving population health and that extending the measure to all allied staff is too broad a reach for the program, especially considering that the HHA has no mandate allowing it to require allied staff to comply. Commenters recommended modifying the proposed influenza measure to include in the numerator HHA staff who decline the vaccination yet wear protective masks

        Page 68677

        or to limit the measure to HHA staff who have contact with the patient. Commenters also noted that staff data is already collected through licensure and certification requirements, and recommended that CMS promote staff influenza immunization through the upcoming Conditions of Participation in Medicare and Medicaid for Home Health Agencies rule.

        Response: Home health care is one of the few remaining settings for which the measure has not been endorsed. Mandatory health worker vaccinations are widely endorsed by national professional associations \39\ because public health data has conclusively demonstrated that immunizing health care staff against influenza improves population health.\40\ We also note that state certification and documentation requirements for licensure are not consistent from state to state and that the requirement for staff vaccination is not part of the CoPs.

        ---------------------------------------------------------------------------

        \39\ For a complete list of professional organizations that endorse mandatory flu vaccinations for health workers see URL http://www.immunize.org/honor-roll/influenza-mandates.

        \40\ Carman WF, Elder AG, Wallace LA, et al. Effects of influenza vaccination of health-care workers on mortality of elderly people in long-term care: a randomized controlled trial. Lancet 2000; 355:93-97.

        ---------------------------------------------------------------------------

        Comment: Some commenters suggested CMS develop state-specific or regional time frames for when this measure applies, noting the proposed October-March timeframe may not be sufficiently protective for states in the Northeast.

        Response: We are following flu season guidelines from the Centers for Disease Control and Prevention (CDC), which indicate that peak flu season runs from October through March. We defer to CDC expertise and will not be amending the flu time frame for the purposes of the HHVBP model at this time.

        Comment: One commenter did not support the inclusion of the metric for Influenza Vaccination Coverage for Home Health Care Personnel because, as proposed, the metric does not include consideration of the overall availability of the flu vaccine at the local/state level. The commenter asserted that regardless of known national declared shortages, regional availability limits should be reflected within the measure so as not to unduly penalize home health agencies.

        Response: In PY1, HHAs will not be scored on immunization rates for health personnel and will receive credit for reporting data related to immunizing healthcare staff.

        Comment: Some commenters expressed concern that the resources and time commitment required to be able to reliably report on this metric would create undue hardship for January 1, 2016 implementation and suggested delayed implementation.

        Response: We acknowledge the concerns expressed related to the timeline for reporting data on New Measures and agree with commenters that additional time for HHAs to prepare for data reporting is merited. We are finalizing that competing HHAs will be required to report data on this measure, as well as the other New Measures, no later than October 7, 2016.

        Final Decision: For the reasons discussed and in consideration of the comments received, we are finalizing this New Measure as proposed, with the modification that HHAs will be required to begin reporting data no later than October 7, 2016 for the period July 2016 through September 2016 and quarterly thereafter. As a result, the first quarterly performance report in July 2016 will not account for any of the New Measures.

      7. Herpes Zoster Vaccine (Shingles Vaccine) for Patients

        We proposed to adopt this measure for the HHVBP Model because it aligns with the NQS goal to Promote Effective Prevention & Treatment of Chronic Disease. Currently this measure is not endorsed by NQF or collected in OASIS. However, due to the severe physical consequences of the symptoms associated with shingles,\41\ we view its adoption under the HHVBP Model as an opportunity to perform further study on this measure. The results of this analysis could provide the necessary data to meet NQF endorsement criteria. We proposed that the measure would calculate the percentage of home health patients who receive the Shingles vaccine, and collect the number of patients who did not receive the vaccine.

        ---------------------------------------------------------------------------

        \41\ For detailed information on Shingles incidences and known complications associated with this condition see CDC information available at http://www.cdc.gov/shingles/about/overview.html.

        ---------------------------------------------------------------------------

        Numerator: Equals the total number of Medicare beneficiaries aged 60 years and over who report having ever received herpes zoster vaccine (shingles vaccine) during the home health episode of care.

        Denominator: Equals the total number of Medicare beneficiaries aged 60 years and over receiving services from the HHA.
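        As an illustration of how this percentage might be computed from an HHA's records, the following is a minimal sketch in Python; the field names and record structure are assumptions for illustration only and are not specified by this rule.

    def shingles_vaccination_rate(patients):
        """patients: iterable of dicts with illustrative keys 'age' and
        'reported_zoster_vaccine' (True if the beneficiary reported ever
        having received the shingles vaccine during the episode of care)."""
        eligible = [p for p in patients if p["age"] >= 60]
        if not eligible:
            return None  # measure is not calculable with no eligible beneficiaries
        vaccinated = sum(1 for p in eligible if p["reported_zoster_vaccine"])
        return 100.0 * vaccinated / len(eligible)

    # Example: 3 of 4 beneficiaries are aged 60 or over; 2 of those 3 report
    # having received the vaccine, so the rate is roughly 66.7 percent.
    records = [
        {"age": 72, "reported_zoster_vaccine": True},
        {"age": 65, "reported_zoster_vaccine": False},
        {"age": 81, "reported_zoster_vaccine": True},
        {"age": 55, "reported_zoster_vaccine": False},
    ]
    print(shingles_vaccination_rate(records))  # 66.66...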

        The Food and Drug Administration (FDA) has approved the use of herpes zoster vaccine in adults age 50 and older. In addition, the Advisory Committee on Immunization Practices (ACIP) currently recommends that herpes zoster vaccine be routinely administered to adults, age 60 years and older.\42\ In 2013, 24.2 percent of adults 60 years and older reported receiving herpes zoster vaccine to prevent shingles, an increase from the 20.1 percent in 2012,\43\ yet below the targets recommended in the HHS Healthy People 2020 initiative.\44\

        ---------------------------------------------------------------------------

        \42\ CDC. Morbidity and Mortality Weekly Report 2011; 60(44):1528.

        \43\ CDC. Morbidity and Mortality Weekly Report 2015; 64(04):95-102.

        \44\ Healthy People 2020: Objectives and targets for immunization and infectious diseases. Available at https://www.healthypeople.gov/2020/topics-objectives/topic/immunization-and-infectious-diseases/objectives.

        ---------------------------------------------------------------------------

        The incidence of herpes zoster increases as people age, with a significant increase after age 50. Older people are more likely to experience the severe nerve pain known as post-herpetic neuralgia (PHN),\45\ the primary acute symptom of shingles infection, as well as non-pain complications, hospitalizations,\46\ and interference with activities of daily living.\47\ Studies have shown that, for adults aged 60 years or older, the vaccine's efficacy is 51.3 percent for the prevention of herpes zoster and 66.5 percent for the prevention of PHN for up to 4.9 years after vaccination.\48\ The Short-Term Persistence Substudy (STPS) followed patients 4 to 7 years after vaccination and found a vaccine efficacy of 39.6 percent for the prevention of herpes zoster and 60.1 percent for the prevention of PHN.\49\ The majority of patients reporting PHN are over age 70; vaccination of this older population would prevent the most cases, followed by vaccination at age 60 and then at age 50.

        ---------------------------------------------------------------------------

        \45\ Yawn BP, Saddier P, Wollen PC, St Sauvier JL, Kurland MJ, Sy LS. A population-based study of the incidence and complication rate of herpes zoster before zoster vaccine introduction. Mayo Clinic Proc 2007; 82:1341-9.

        \46\ Lin F, Hadler JL. Epidemiology of primary varicella and herpes zoster hospitalizations: the pre-varicella vaccine era. J Infect Dis 2000; 181:1897-905.

        \47\ Schmader KE, Johnson GR, Saddier P, et al. Effect of a zoster vaccine on herpes zoster-related interference with functional status and health-related quality-of-life measures in older adults. J Am Geriatr Soc 2010; 58:1634-41.

        \48\ Schmader KE, Johnson GR, Saddier P, et al. Effect of a zoster vaccine on herpes zoster-related interference with functional status and health-related quality-of-life measures in older adults. J Am Geriatr Soc 2010; 58:1634-41.

        \49\ Schmader KE, Oxman MN, Levin MJ, Johnson G, Zhang JH, Betts R, Morrison VA, Gelb L, Guatelli JC, Harbecke R, Pachucki C, Keay S, Menzies B, Griffin MR, Kauffman C, Marques A, Toney J, Keller PM, Li X, Chan LSF, Annunziato P. Persistence of the Efficacy of Zoster Vaccine in the Shingles Prevention Study and the Short-Term Persistence Substudy. Clinical Infectious Diseases 2012; 55:1320-8.

        ---------------------------------------------------------------------------

        We stated in the proposed rule that studying this measure in the home

        Page 68678

        health setting presents an ideal opportunity to address an at-risk population that would benefit greatly from this vaccination strategy. For example, receiving the vaccine will often reduce the course and severity of the disease and reduce the risk of post-herpetic neuralgia.

        We proposed, and are finalizing in this rule, that information on the above numerator and denominator will be reported by HHAs through the HHVBP web-based platform, in addition to other information related to this measure as the Secretary deems appropriate.

        We invited public comment on the proposed Herpes Zoster Vaccine measure.

        Comment: A number of commenters expressed concern that patients may refuse the Shingles vaccination because the vaccine is costly and is paid for only through Medicare Part D. A few commenters also expressed concern that patients in home health may not have ready knowledge of their vaccination status, and that tracking this information down could be burdensome for HHAs. Some commenters also raised the concern that a desire to comply with the measure presents the potential for unnecessary repeat vaccinations.

        Response: We appreciate public comment on this issue. CMS recognizes there are payment and access issues related to the Shingles vaccination. As a New Measure, competing HHAs will have the opportunity to report on implementation challenges related to patients accessing the Shingles vaccination, and we will be evaluating feedback from HHAs provided through data reporting on the measure. However, we believe inclusion of this New Measure is connected to quality care for patients because the Shingles vaccination has been demonstrated to either reduce the incidence of Shingles or significantly mitigate the pain and discomfort associated with Shingles. Including the measure is intended to increase patients' awareness of, and access to, the vaccine if they choose to receive it.

        Comment: One commenter recommended development of additional vaccine measures to align with ACIP policies.

        Response: We thank the commenter and note that we intend to evaluate the measures in the HHVBP Model on an annual basis and implement any changes to the measure set in future rulemaking. In PY1 we have included the ACIP recommendation to utilize the Shingles vaccination, and we will refer to ACIP recommendations when analyzing additional measures in subsequent years of the model.

        Comment: Commenters expressed concern about collecting Herpes Zoster vaccination data because they asserted that modifications to their EMR systems would be required. Commenters also asserted that the resources and time commitment required to reliably report on this metric would create undue hardship for a January 1, 2016 implementation. Commenters recommended moving the timeline out 6-12 months for collecting this data.

        Response: We appreciate commenters' concerns regarding the timeline for data collection and agree that in some instances competing HHAs may need additional preparation time, including those HHAs that may have to modify their clinical record systems. We are finalizing that competing HHAs will be required to report data on this measure, as well as the other New Measures, no later than October 7, 2016 for the period July 2016 through September 2016.

        Final Decision: For the reasons discussed and in consideration of the comments received, we are finalizing this New Measure as proposed, with the modification that HHAs will be required to begin reporting data no later than October 7, 2016 for the period July 2016 through September 2016 and quarterly thereafter. As a result, the first quarterly performance report in July 2016 will not account for any of the New Measures.

        6. HHVBP Model's Four Classifications

        As previously stated, the quality measures that we proposed to use in the performance years, as well as the quality measures that we are finalizing in this final rule, are aligned with the six NQS domains: Patient and Caregiver-Centered Experience and Outcomes; Clinical Quality of Care; Care Coordination; Population Health; Efficiency and Cost Reduction; and, Safety (see Figure 6).

        We proposed to filter these NQS domains and the HHVBP quality measures into four classifications to align directly with the measure weighting utilized in calculating payment adjustments. The four HHVBP classifications we proposed are: Clinical Quality of Care, Outcome and Efficiency, Person- and Caregiver-Centered Experience, and New Measures reported by the HHAs.

        We did not receive any public comments on our proposed measure classifications for the HHVBP Model and are finalizing these classifications with one modification. Specifically, we are revising Classification II from ``Outcome and Efficiency'' to ``Care Coordination and Efficiency.'' The definition of this classification is unchanged from the proposed rule. We are making this change to make the classification designation more inclusive of the measures and NQS domains relating to care coordination.

        These final four classifications capture the multi-dimensional nature of health care provided by the HHA. These classifications are further defined as:

        Classification I--Clinical Quality of Care: Measures the quality of health care services provided by eligible professionals and paraprofessionals within the home health environment.

        Classification II--Care Coordination and Efficiency: Outcome measures capture the end result of care, including coordination of the care provided to the beneficiary. Efficiency measures capture maximizing quality while minimizing the use of resources.

        Classification III--Person- and Caregiver-Centered Experience: Measures the experience of care reported by beneficiaries and their caregivers.

        Classification IV--New Measures: Measures not currently reported by Medicare-certified HHAs to CMS, but that may fill gaps in the NQS Domains not completely covered by existing measures in the home health setting.

        Page 68679

        GRAPHIC TIFF OMITTED TR05NO15.005

        7. Weighting

        We proposed that measures within each classification would be weighted the same for the purposes of payment adjustment. We are weighting at the individual measure level and not the classification level. Classifications are for organizational purposes only. We proposed this approach because we did not want any one measure within a classification to be more important than another measure. Under this approach, a measure's weight will remain the same even if some of the measures within a classification group have no available data. We stated in the proposed rule that weighting will be re-examined in subsequent years of the model and be subject to the rulemaking process. We invited comments on the proposed weighting methodology for the HHVBP Model.

        Comment: We received a few comments on the weighting of measures in the starter set. Some commenters recommended that certain measures be weighted more than others, with one comment specifying that the re-hospitalization measure should have greater weight and some other commenters suggesting that measures not based on self-reported data should have greater weight. One commenter expressed concern that, by weighting measures equally, HHAs will have little opportunity to make significant improvements because each measure will represent only a small fraction of the agency's score; therefore, agencies would need to make large improvements in many measures to see a meaningful difference in their overall score. All comments related to weighting indicated a preference for moving away from each measure receiving equal weight.

        Response: The quality measures that were selected for the HHVBP Model capture the multiple dimensions of care that HHAs provide to their beneficiaries. We are finalizing this proposed policy because equally weighted measures will encourage HHAs to approach quality improvement initiatives more broadly in an effort to capture the multidimensional aspects of care that HHAs provide. In addition, weighting the measures equally addresses concerns where HHAs may be providing services to beneficiaries with different needs. If particular measures were weighted more than others, HHAs might only make the investment to improve their quality in those areas where measures have a higher weight, potentially allowing other aspects of care to be neglected. We will monitor the impact of equally weighting the individual measures and may consider changes to the weighting methodology after analysis and through rulemaking.

        Final Decision: For the reasons discussed, we are finalizing the weighting methodology as proposed without modification.

    6. Performance Scoring Methodology

      1. Performance Calculation Parameters

      The methodology we proposed, and are finalizing in this final rule for the reasons discussed herein, for assessing each HHA's total annual performance is based on a score calculated using the starter set of quality measures that apply to the HHA (based on a minimum number of cases, as discussed herein). The methodology will provide an assessment on a quarterly basis for each HHA and will result in an annual distribution of value-based payment adjustments among HHAs so that HHAs achieving the highest performance scores will receive the largest upward payment adjustment. The methodology includes three primary features:

      The HHA's Total Performance Score (TPS) will be determined using the higher of an HHA's achievement or improvement score for each measure;

      All measures within the Clinical Quality of Care, Care Coordination and Efficiency, and Person and Caregiver-Centered Experience classifications will have equal weight and will account for 90-percent of the TPS (see Section 2 below) regardless of the number of measures in the three classifications.

      Page 68680

      Points for New Measures are awarded for submission of data on the New Measures via the HHVBP web-based platform, and withheld if data is not submitted. Data reporting for each New Measure will have equal weight and will account for 10-percent of the TPS for the first performance year; and,

      The HHA performance score would reflect all of the measures that apply to the HHA based on a minimum number of cases defined below.

      For the reasons discussed in more detail later in this section, we are finalizing our proposed performance scoring methodology with one modification related to the rounding up or down of achievement and improvement scoring used in the calculation of the Total Performance Score.

      2. Considerations for Calculating the Total Performance Score

      We proposed, and are finalizing in this final rule, in Sec. 484.320 to calculate the TPS by adding together points awarded to Medicare-certified HHAs on the starter set of measures, including the New Measures. As explained in the proposed rule, we considered several factors when developing the performance scoring methodology for the HHVBP Model. First, it is important that the performance scoring methodology be straightforward and transparent to HHAs, patients, and other stakeholders. HHAs must be able to clearly understand performance scoring methods and performance expectations to maximize quality improvement efforts. The public must understand performance score methods to utilize publicly-reported information when choosing HHAs.

      Second, we believe the performance scoring methodology for the HHVBP Model should be aligned appropriately with the quality measurements adopted for other Medicare value-based purchasing programs including those introduced in the hospital and skilled nursing home settings. This alignment will facilitate the public's understanding of quality measurement information disseminated in these programs and foster more informed consumer decision-making about their health care choices.

      Third, we believe that differences in performance scores must reflect true differences in quality performance. To make sure that this point is addressed in the performance scoring methodology for the HHVBP Model, we assessed quantitative characteristics of the measures, including the current state of measure development, number of measures, and the number and grouping of measure classifications.

      Fourth, we believe that both quality achievement and improvement must be measured appropriately in the performance scoring methodology for the HHVBP Model. The methodology specifies that performance scores under the HHVBP Model are calculated utilizing the higher of achievement or improvement scores for each measure. The impact of performance scores utilizing achievement and improvement on HHAs' behavior and the resulting payment implications was also considered. Using the higher of achievement or improvement scores allows the model to recognize HHAs that have made great improvements, though their measured performance score may still be relatively lower in comparison to other HHAs.

      Fifth, through careful measure selection we intend to eliminate, or at least control for, unintended consequences such as undermining better outcomes for patients or rewarding inappropriate care. As discussed above, when available, NQF-endorsed measures will be used. In addition, we are adopting measures that we believe are closely associated with better outcomes in the HHA setting in order to incentivize genuine improvements and sustain positive achievement while retaining the integrity of the model.

      Sixth, we intend that the model will utilize the most currently available data to assess HHA performance. We recognize that these data would not be available instantaneously due to the time required to process quality measurement information accurately; however, we intend to make every effort to process data in the timeliest fashion. Using more current data will result in a more accurate performance score while recognizing that HHAs need time to report measure data.

      3. Additional Considerations for the HHVBP Total Performance Scores

      Many of the key elements of the HHVBP Model performance scoring methodology that we proposed, and are finalizing in this final rule for the reasons described herein, are aligned with the scoring methodology of the Hospital Value-Based Purchasing Program (HVBP) in order to leverage the rigorous analysis and review underpinning that Program's approach to value-based purchasing in the hospital sector. The HVBP Program includes as one of its core elements the scoring methodology included in the 2007 Report to Congress ``Plan to Implement a Medicare Hospital Value-Based Purchasing Program'' (hereinafter referred to as ``The 2007 HVBP Report'').\50\ The 2007 HVBP Report describes a Performance Assessment Model with core elements that can easily be replicated for other value-based purchasing programs or models, including the HHVBP Model.

      ---------------------------------------------------------------------------

      \50\ The 2007 HVBP Report is available at the CMS Web site at https://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/downloads/HospitalVBPPlanRTCFINALSUBMITTED2007.pdf.

      ---------------------------------------------------------------------------

      In the HVBP Program, the Performance Assessment Model aggregates points on the individual quality measures across different quality measurement domains to calculate a hospital's TPS. Similarly, the proposed HHVBP Model would aggregate points on individual measures across four measure classifications derived from the 6 CMS/NQS domains as described above (see Figure 3) to calculate the HHA's TPS. In addition, the proposed HHVBP payment methodology is also aligned with the HVBP Program with respect to evaluating an HHA's performance on each quality measure based on the higher of an achievement or improvement score in the performance period. The model is not only designed to provide incentives for HHAs to provide the highest level of quality, but also to provide incentives for HHAs to improve the care they provide to Medicare beneficiaries. By rewarding HHAs that provide high quality and/or high improvement, we believe the HHVBP Model will ensure that all HHAs will be incentivized to commit the resources necessary to make the organizational changes that will result in better quality.

      We proposed, and are finalizing for the reasons described herein, that under the model, an HHA will be awarded points only for ``applicable measures.'' An ``applicable measure'' is one for which the HHA has provided a minimum of 20 home health episodes of care per year. Points awarded for each applicable measure will be aggregated to generate a TPS. As described in the benchmark section below, HHAs will have the opportunity to receive 0 to 10 points for each measure in the Clinical Quality of Care, Care Coordination and Efficiency, and Person and Caregiver-Centered Experience classifications. Each measure will have equal weight regardless of the total number of measures in each of the first three classifications. In contrast, we proposed, and are finalizing in this rule, to score the New Measures in a different way. For each New Measure, HHAs will receive 10 points if they report the New Measure or 0 points if they do not report the measure during the performance

      Page 68681

      year. In total, the New Measures will account for 10-percent of the TPS regardless of the number of measures applied to an HHA in the other three classifications.

      We proposed, and are finalizing in this rule, to calculate the TPS for the HHVBP methodology similarly to the TPS calculation that has been finalized under the HVBP program. The performance scoring methodology for the HHVBP Model will include determining performance standards (benchmarks and thresholds) using quality measure data from the 2015 baseline period, scoring HHAs based on their achievement and/or improvement with respect to those performance standards, and weighting each of the classifications by the number of measures employed, as presented in further detail in Section G below.

      4. Setting Performance Benchmarks and Thresholds

      For scoring HHAs' performance on measures in the Clinical Quality of Care, Care Coordination and Efficiency, and Person and Caregiver-Centered Experience classifications, we proposed, and are finalizing in this rule, to adopt an approach using several key elements from the scoring methodology set forth in the 2007 HVBP Report and the successfully implemented HVBP Program \51\ including allocating points based on achievement or improvement, and calculating those points based on industry benchmarks and thresholds.

      ---------------------------------------------------------------------------

      \51\ For detailed information on HVBP scoring see http://www.medicare.gov/hospitalcompare/data/hospital-vbp.html.

      ---------------------------------------------------------------------------

      In determining the achievement points for each measure, HHAs will receive points along an achievement range, which is a scale between the achievement threshold and a benchmark. We proposed, and are finalizing in this rule, that the achievement threshold will be calculated as the median of all HHAs' performance on the specified quality measure during the baseline period and that the benchmark will be calculated as the mean of the top decile of all HHAs' performance on the specified quality measure during the baseline period. Unlike the HVBP Program, which uses a national sample, this model will calculate both the achievement threshold and the benchmark separately for each selected state and for HHA cohort size. Under this methodology, we will have benchmarks and achievement thresholds for both the larger-volume cohort and the smaller-volume cohort of HHAs (defined in each state based on a baseline period that runs from January 1, 2015 through December 31, 2015). Another way the HHVBP Model differs from the Hospital VBP is that this model uses only 2015 as the baseline year for the measures included in the starter set. For the starter set used in the model, 2015 will consistently be used as the baseline period in order to evaluate the degree of change that may occur over the multiple years of the model. In determining improvement points for each measure, we proposed, and are finalizing in this rule, that HHAs will receive points along an improvement range, which is a scale indicating change between an HHA's performance during the performance period and the baseline period. In addition, as in the achievement calculation, the benchmark and threshold will be calculated separately for each state and for HHA cohort size so that HHAs will only be competing with those HHAs in their state and their size cohort.
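      To illustrate how the achievement threshold (the median) and the benchmark (the mean of the top decile) could be computed for a single measure within one state and size cohort, the following is a minimal sketch in Python; the cohort values are invented for illustration, and the simple top-10-percent slice used for the decile is an assumption, since the rule does not spell out tie or small-cohort handling.

    import statistics

    def achievement_threshold(baseline_scores):
        """Median of all HHAs' baseline-period performance on one measure."""
        return statistics.median(baseline_scores)

    def benchmark(baseline_scores):
        """Mean of the top decile of baseline-period performance, approximated
        here as the highest 10 percent of scores (at least one score)."""
        ranked = sorted(baseline_scores, reverse=True)
        top_n = max(1, len(ranked) // 10)
        return statistics.mean(ranked[:top_n])

    # Illustrative baseline scores for one measure, one state, one size cohort.
    cohort = [0.21, 0.35, 0.40, 0.47, 0.52, 0.58, 0.63, 0.71, 0.80, 0.88]
    print(achievement_threshold(cohort))  # 0.55 (median of the cohort)
    print(benchmark(cohort))              # 0.88 (mean of the single top-decile score)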

      5. Calculating Achievement and Improvement Points

      1. Achievement Scoring

        We proposed that the achievement scoring under the HHVBP Model be based on the Performance Assessment Model set forth in the 2007 HVBP Report and as implemented under the HVBP Program. An HHA could earn 0-10 points for achievement for each measure in the Clinical Quality of Care, Care Coordination and Efficiency, and Person and Caregiver-Centered Experience classifications based on where its performance during the performance period falls relative to the achievement threshold and the benchmark, according to the following formula:

        GRAPHIC TIFF OMITTED TR05NO15.006

        We proposed that all achievement points would be rounded up or down to the nearest point (for example, an achievement score of 4.55 would be rounded to 5). After considering the potential skewing of HHA ranking that would occur with rounding to the nearest point, we are finalizing that all achievement points will be rounded up or down to the third decimal point (for example, an achievement score of 4.5555 would be rounded to 4.556). This will ensure greater precision in scoring and ranking HHAs within their cohorts.

        HHAs could receive an achievement score as follows:

        An HHA with performance equal to or higher than the benchmark could receive the maximum of 10 points for achievement.

        An HHA with performance equal to or greater than the achievement threshold (but below the benchmark) could receive 1-9 points for achievement, by applying the formula above.

        An HHA with performance less than the achievement threshold could receive 0 points for achievement.
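        The achievement formula itself appears above only as an omitted graphic; the following is a minimal sketch in Python that reconstructs it from the HHA B worked example later in this section (9 * ((0.703 - 0.474)/(0.875 - 0.474)) + 0.5 = 5.640), so it should be read as an inference from that example rather than a restatement of the regulatory formula.

    def achievement_points(performance, threshold, benchmark):
        """0-10 achievement points for one measure, rounded to the third
        decimal point as finalized in this rule; reconstructed from the
        HHA B worked example later in this section."""
        if performance >= benchmark:
            return 10.0
        if performance < threshold:
            return 0.0
        raw = 9 * (performance - threshold) / (benchmark - threshold) + 0.5
        return round(raw, 3)

    # HHA B (Figure 7): performance 0.703, threshold 0.474, benchmark 0.875.
    print(achievement_points(0.703, 0.474, 0.875))  # 5.64, matching the 5.640 in the text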

        We invited comments on the proposed methodology for scoring HHAs on achievement.

        Comment: Some commenters expressed concern that HHAs will not know what benchmark is needed to avoid penalty until the end of the 2015 performance year, and several commenters recommended that CMS establish benchmarks based on historical performance so it is clear to HHAs the level of achievement necessary to avoid penalties. Commenters voiced concern that agencies may not invest in quality improvement activities if the potential financial return is difficult to determine. Commenters also recommended that CMS set benchmarks at a level such that most providers have a reasonable expectation of achieving them. A few commenters suggested keeping 2015 as the base year, and suggested providing HHAs with mid-course snapshots of their performance against the benchmarks.

        Response: The HHVBP Model is using the 2015 quality data as the baseline for the model because it is the most recent data available. As indicated in the payment methodology, the achievement threshold for each measure used in the

        Page 68682

        model will be based on the median of Medicare-certified HHA performance on the specified quality measure during the baseline period (2015). The benchmark refers to the mean of the top decile of Medicare-certified HHA performance on the specified quality measure during the baseline period (2015). Benchmarks and achievement thresholds are calculated separately for the larger-volume and smaller-volume cohorts within each state. HHAs will receive points if they achieve performance equal to or above the achievement threshold (the median of 2015). We believe that awarding points to HHAs that provide better quality than the median sets an achievable level of performance and will incentivize HHAs to make the investments necessary to improve their quality. Benchmarks and achievement thresholds for each measure will be available on each respective HHA's quarterly report. The 2015 base year achievement threshold and the benchmarks for each cohort will be provided to the HHAs in April 2016. We believe that this will provide sufficient notice to HHAs of the level of performance necessary to receive points for each given measure. In addition, baseline values will be included in all quarterly reports for all measures.

        Final Decision: For the reasons discussed and in consideration of the comments received, we are finalizing the proposed methodology for scoring HHAs on achievement under the HHVBP Model, with one modification. Specifically, as noted above, under our final policy all achievement points will be rounded up or down to the third decimal point (for example, an achievement score of 4.5555 would be rounded to 4.556).

      2. Improvement Scoring

        In keeping with the approach used by the HVBP Program, we proposed that an HHA could earn 0-10 points based on how much its performance on each measure in the Clinical Quality of Care, Care Coordination and Efficiency, and Person and Caregiver-Centered Experience classifications improved during the performance period compared with the baseline period. A unique improvement range for each measure will be established for each HHA, defined as the difference between the HHA's baseline period score and the same state- and size-level benchmark for the measure used in the achievement scoring calculation described previously, according to the following formula:

        GRAPHIC TIFF OMITTED TR05NO15.007

        We proposed that all improvement points will be rounded to the nearest point and are now finalizing that improvement points will be rounded up or down to the third decimal point (see example above). If an HHA's performance on the measure during the performance period was:

        Equal to or higher than the benchmark score, the HHA could receive an improvement score of 10 points;

        Greater than its baseline period score but below the benchmark (within the improvement range), the HHA could receive an improvement score of 0-10, based on the formula above; or

        Equal to or lower than its baseline period score on the measure, the HHA could receive 0 points for improvement.
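        As with achievement, the improvement formula appears above only as an omitted graphic; the sketch below reconstructs it in Python from the HHA B worked example later in this section (10 * ((0.703 - 0.212)/(0.875 - 0.212)) - 0.5 = 6.906) and then takes the higher of the improvement score and the achievement score from the earlier sketch, as the rule specifies. It is an inference from that example, not a restatement of the regulatory formula.

    def improvement_points(performance, baseline, benchmark):
        """0-10 improvement points for one measure, rounded to the third
        decimal point; reconstructed from the HHA B worked example."""
        if performance >= benchmark:
            return 10.0
        if performance <= baseline:
            return 0.0
        raw = 10 * (performance - baseline) / (benchmark - baseline) - 0.5
        return round(max(raw, 0.0), 3)

    # HHA B (Figure 7): baseline 0.212, performance 0.703, benchmark 0.875.
    improvement = improvement_points(0.703, 0.212, 0.875)  # 6.906
    achievement = 5.640  # from the achievement sketch above (HHA B)
    print(max(achievement, improvement))  # 6.906 -- the higher of the two scores is used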

        We invited comments on the proposed methodology for scoring HHAs on improvement.

        Comment: There were many comments directed at the proposed methodology for improvement scoring under the HHVBP Model. Some commenters opposed awarding credit for improvement, and noted their concern that, by using the greater of an HHA's achievement or improvement score, the methodology could reward an HHA with low performance but high improvement, because that HHA could receive higher payments than a high-performing agency. These commenters encouraged CMS to focus on rewarding the achievement of specified quality scores, and to reduce its emphasis on improvement scores after the initial three years of the HHVBP Model, given that what matters most to beneficiaries is an agency's actual performance. Additionally, commenters recommended that HHA achievement scores be weighted more heavily than improvement scores, noting that some HHAs may have little or no room for improvement in their current quality performance scores. Some commenters suggested measuring performance primarily on the basis of achievement of specified quality scores, with a declining emphasis over time on improvement versus achievement.

        Response: We appreciate the commenters raising these concerns. The model is designed to improve and to ensure the highest quality of care for all Medicare beneficiaries. If the model only focused on rewarding those HHAs that already provide the highest quality of care, only the beneficiaries that receive care from those HHAs would benefit from the model. Therefore, we believe that providing the opportunity to earn points for both achievement and improvement provides the greatest opportunity for the quality of care to rise for all beneficiaries who receive services from competing HHAs. We will, however, monitor and evaluate the impact of awarding an equal amount of points for both achievement and improvement and may consider changes to the weight of the improvement score relative to the achievement score in future years through rulemaking.

        Final Decision: For the reasons discussed, we are finalizing the improvement scoring methodology as proposed.

        Comment: Several commenters expressed concern that the proposed HHVBP structure requires that some HHAs be penalized each year, regardless of their performance or improvement, noting that each year some HHAs will end up in the bottom decile even if the difference between the highest and lowest scoring HHAs is only a few points. These commenters were concerned that if the lowest scoring HHAs do not have the resources to rise from the bottom, they are at risk of going out of business by the end of the model. If low scoring HHAs leave the market, then higher scoring HHAs will move into the bottom decile in the next year of the model. These HHAs could experience a downward payment adjustment even though their performance, in actuality, is not significantly different from that of HHAs ranked higher. These commenters are concerned that this limits value-based performance improvement.

        Response: We understand commenters' concerns, but the purpose of the model is to improve quality across the HH sector. As is the case currently, the market will not remain static, and HHAs of all calibers will leave and enter the market. In many instances, if a small number of low performing HHAs do drop out of the market, the next group

        Page 68683

        of low scoring HHAs will include HHAs whose performance equals or exceeds the average baseline performance and that will likely have received bonus payments in previous years. We have done financial modeling based on recent HHA performance (see chart I2 for further explanation), and the results support our understanding of how the scoring will work. In addition, we have analyzed available data and lessons learned from the Hospital VBP program and the previous home health demonstration to support our findings. As indicated in the proposed rule,\52\ HHAs may end up in the bottom decile in relation to other HHAs in their cohort in later years of the model, even after they improve their quality, if all the HHAs in the model improve at the same rate. However, in the HHVBP model their downward payment adjustment, if any, could be substantially reduced because all performance scoring is anchored to the 2015 benchmark.

        ---------------------------------------------------------------------------

        \52\ 80 FR 39910 (July 10, 2015). See Table 25.

        ---------------------------------------------------------------------------

        Final Decision: For the reasons discussed and in consideration of the comments received, we are finalizing the proposed methodology for scoring HHAs under the HHVBP Model, with one modification to decimal scoring, where we are finalizing that all achievement and improvement points will be rounded up or down to the third decimal point (for example, an achievement score of 4.5555 would be rounded to 4.556).

      3. Examples of Calculating Achievement and Improvement Scores

        For illustrative purposes, we present the following examples of how the performance scoring methodology will be applied in the context of the measures in the Clinical Quality of Care, Care Coordination and Efficiency, and Person and Caregiver-Centered Experience classifications. These HHA examples were selected from an empirical database created from 2013/2014 Home Health Compare archived data, claims data, and enrollment data to support the development of the HHVBP permutation of the Performance Assessment Model, and all performance scores are calculated for the pneumonia measure, with respect to the number of individuals assessed and administered the pneumococcal vaccine. We note that the figures and examples below are the same figures and examples set forth in the proposed rule, updated to reflect our final policy on rounding of these scores, as discussed previously.

        Figure 7 shows the scoring for HHA `A', as an example. The benchmark calculated for the pneumonia measure in this case was 0.875 (the mean value of the top decile in 2013), and the achievement threshold was 0.474 (the performance of the median or the 50th percentile among HHAs in 2013). HHA A's 2014 performance rate of 0.910 during the performance period for this measure exceeds the benchmark, so HHA A would earn 10 (the maximum) points for its achievement score. The HHA's performance rate on a measure is expressed as a decimal. In the illustration, HHA A's performance rate of 0.910 means that 91-percent of the applicable patients that were assessed were given the pneumococcal vaccine. In this case, HHA A has earned the maximum number of 10 possible achievement points for this measure and thus, its improvement score is irrelevant in the calculation.

        Figure 7 also shows the scoring for HHA `B'. As referenced below, HHA B's performance on this measure went from 0.212 (which was below the achievement threshold) in the baseline period to 0.703 (which is above the achievement threshold) in the performance period. Applying the achievement scale, HHA B would earn 5.640 points for achievement, calculated as follows: 9 * ((0.703 - 0.474)/(0.875 - 0.474)) + 0.5 = 5.640.

        Checking HHA B's improvement score yields the following result: Based on HHA B's period-to-period improvement, from 0.212 in the baseline year to 0.703 in the performance year, HHA B would earn 6.906 points, calculated as follows: 10 * ((0.703 - 0.212)/(0.875 - 0.212)) - 0.5 = 6.906. Because the higher of the achievement and improvement scores is used, HHA B would receive 6.906 points for this measure.

        Page 68684

        GRAPHIC TIFF OMITTED TR05NO15.008

        In Figure 8, HHA `C' yielded a decline in performance on the pneumonia measure, falling from 0.571 to 0.462 (a decline of 0.11 points). HHA C's performance during the performance period is lower than the achievement threshold of 0.472 and, as a result, receives 0 points based on achievement. It also receives 0 points for improvement, because its performance during the performance period is lower than its performance during the baseline period.

        Page 68685

        GRAPHIC TIFF OMITTED TR05NO15.009

        6. Scoring Methodology for New Measures

        The HHVBP Model provides us with the opportunity to study new quality measures. We proposed that the New Measures for PY1 would be reported directly by the HHA and would account for 10-percent of the TPS regardless of the number of measures in the other three classifications (we refer the reader to 80 FR 39890 for further discussion of our proposed scoring methodology for New Measures). For the reasons set forth in the proposed rule and in response to comments below, we are finalizing our proposed scoring methodology for New Measures, revised only to reflect that the final starter set will include three, rather than four, New Measures, as discussed in section E5. Under our final methodology, the final three New Measures that we are adopting for PY1 will be reported directly by the HHA and will account for 10-percent of the TPS regardless of the number of measures in the other three classifications. HHAs that report on these measures will receive 10 points out of a maximum of 10 points for each of the 3 measures in the New Measure classification. Hence, an HHA that reports on all 3 measures will receive 30 points out of a maximum of 30. An HHA will receive 0 points for each measure that it fails to report on. If an HHA reports on all 3 measures, it will receive 30 points for the classification and 10 points (30/30 * 10 points) will be added to its TPS because the New Measure classification has a maximum weight of 10 percent. If an HHA reports on 2 of 3 measures, it will receive 20 of the 30 points available for the classification and have 6.667 points (20/30 * 10 points) added to its TPS. If an HHA reports on 1 of 3 measures, it will receive 10 of the 30 points available for the classification and have 3.333 points (10/30 * 10 points) added to its TPS. If an HHA reports on 0 of 3 measures, it will receive 0 points and have no points added to its TPS. We intend to update these measures through future rulemaking to allow us to study newer, leading-edge measures as well as retire measures that no longer require such analysis.
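        The arithmetic in the preceding paragraph can be condensed into a few lines of Python; the function name and the default of three New Measures for PY1 are illustrative, and the sketch simply reproduces the 0, 3.333, 6.667, and 10 point figures above.

    def new_measure_tps_points(reported_count, total_new_measures=3):
        """TPS points contributed by the New Measures classification in PY1:
        10 points per reported measure, scaled to the classification's
        10-percent share of the TPS."""
        earned = 10 * reported_count
        possible = 10 * total_new_measures
        return round(earned / possible * 10, 3)

    for reported in range(4):
        print(reported, new_measure_tps_points(reported))
    # 0 -> 0.0, 1 -> 3.333, 2 -> 6.667, 3 -> 10.0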

        We invited comments on the proposed scoring methodology for New Measures.

        Comment: Several commenters expressed support for CMS limiting the burden on HHAs by allowing them to gain full credit toward their TPS on the New Measures just for reporting data to CMS.

        Response: We appreciate the commenters' support for our proposal. In order to reduce the burden of introducing innovative measures not previously endorsed for home health, and to allow HHAs to acclimate to reporting the New Measures, we are finalizing our proposed scoring methodology that awards HHAs full credit for data reporting on New Measures.

        Final Decision: For the reasons discussed and in consideration of the comments received, we are finalizing our proposed scoring methodology for New Measures, modified to reflect the removal of one New Measure resulting in a total of three New Measures for PY1.

        7. Minimum Number of Cases for Outcome and Clinical Quality Measures

        We proposed that while no HHA in a selected state would be exempt from the HHVBP Model, there may be periods when an HHA does not receive a payment adjustment because there are not enough episodes of care to generate sufficient quality measure data. We proposed, and are finalizing in this rule, that the minimum threshold for an HHA to receive a score on a given measure will be 20 home health episodes of care per year for HHAs that have been certified for at least 6 months. If a competing HHA does not meet this threshold to generate scores on five or more of the Clinical Quality of Care, Care Coordination and Efficiency, and Person and Caregiver-Centered Experience measures, no payment adjustment will be made, and

        Page 68686

        the HHA will be paid for HHA services in an amount equivalent to the amount it would have been paid under section 1895 of the Act.\53\

        ---------------------------------------------------------------------------

        \53\ HHVBP would follow the Home Health Compare Web site policy not to report measures on HHAs that have less than 20 observations for statistical reasons concerning the power to detect reliable differences in the quality of care.

        ---------------------------------------------------------------------------
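        As a minimal illustration of this threshold in Python (the measure names and episode counts below are invented, and the helper function is not part of the rule), an HHA receives a payment adjustment only if at least five measures each reach the 20-episode minimum:

    MIN_EPISODES = 20      # annual episodes of care needed for a measure to apply
    MIN_APPLICABLE = 5     # applicable measures needed for a payment adjustment

    def payment_adjustment_applies(episode_counts):
        """episode_counts: mapping of measure name to annual episode count for
        an HHA that has been certified for at least 6 months."""
        applicable = [m for m, n in episode_counts.items() if n >= MIN_EPISODES]
        return len(applicable) >= MIN_APPLICABLE

    # Example: only four measures reach 20 episodes, so no adjustment is made
    # and the HHA is paid as it would have been under section 1895 of the Act.
    counts = {"m1": 25, "m2": 21, "m3": 40, "m4": 20, "m5": 12, "m6": 7}
    print(payment_adjustment_applies(counts))  # False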

        We explained in the proposed rule that HHAs with very low case volumes will either increase their volume in later performance years, and be subject to future payment adjustment, or the HHAs' volume will remain very low and the HHAs would continue to not have their payment adjusted in future years. Based on the most recent data available at this time, a very small number of HHAs are reporting on fewer than five of the total number of measures included in the Clinical Quality of Care, Care Coordination and Efficiency, and Person and Caregiver-Centered Experience classifications, and these HHAs account for less than 0.5 percent of the claims made by the more than 1,900 HHAs delivering care within the nine selected states. We stated that we expect very low-volume HHAs to have very little impact on the model because there are so few of them and because it is unlikely that an HHA would reduce the amount of service it provides to such a low level to avoid a payment adjustment. Although these HHAs will not be subject to payment adjustments, they will remain in the model, will have access to the same technical assistance as all other HHAs in the model, will receive quality reports on any measures for which they do have 20 episodes of care, and will have a future opportunity to compete for payment adjustments.
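        To illustrate the minimum-case rule described above, the short sketch below (in Python; the function and variable names are hypothetical and not drawn from the rule) checks whether a competing HHA would receive a payment adjustment: a measure is scored only when the HHA has at least 20 episodes of care for it in the year, and an adjustment applies only when at least five such measures are scored.

            # Illustrative sketch only; names are hypothetical, not from the rule.
            MIN_EPISODES_PER_MEASURE = 20
            MIN_SCORED_MEASURES = 5

            def receives_payment_adjustment(episodes_by_measure: dict) -> bool:
                """episodes_by_measure maps each Clinical Quality of Care, Care
                Coordination and Efficiency, or Person and Caregiver-Centered
                Experience measure to the HHA's annual episode count for it."""
                scored = [m for m, n in episodes_by_measure.items()
                          if n >= MIN_EPISODES_PER_MEASURE]
                return len(scored) >= MIN_SCORED_MEASURES

            # An HHA with 20 or more episodes on only four measures would not be
            # adjusted and would be paid as it otherwise would be under section
            # 1895 of the Act.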

        We invited comments on the proposed minimum number of cases to receive a score on outcome and clinical quality measures.

        Comment: One commenter expressed concern that some HHAs would artificially suppress the number of cases open in OASIS to below 20 in order to be excluded from a particular measure, or be excluded from a sufficient number of measures to be excluded from payment adjustments entirely.

        Response: All Medicare-certified HHAs in selected states are included in the HHVBP Model, even when a particular HHA does not meet the minimum number of cases to generate scores on a sufficient number of quality measures. During a period when an HHA does not receive a payment adjustment the HHA remains in the model, performance is still monitored, and the agency is eligible for technical assistance. HHAs with small patient loads are expected to access technical assistance and engage in quality improvement activities in anticipation of earning scores on all quality measures in the future. HHAs with small patient populations are also expected to enter data on the New Measures via the CMS portal. In addition, HHAs must submit OASIS data in order to receive payment for their services. We do not anticipate HHAs suppressing the number of patients they serve in order to avoid payment adjustments because there are very few HHAs that provide care to such a small number of beneficiaries and the financial losses associated with restricting the volume of care provided would far outweigh the losses associated with the downward payment adjustment.

        Final Decision: For these reasons and in consideration of the comments received, we are finalizing our proposal on the minimum number of cases for outcome and clinical quality measures without modification.

        We provide below an example of the payment methodology. We note that this is the same example provided in the proposed rule (see 80 FR 39891), modified only to reflect our final policy to include 21 (rather than 25) measures in the Clinical Quality of Care, Care Coordination and Efficiency, and Person and Caregiver-Centered Experience classifications and three (rather than four) New Measures in the final starter set for PY1.

        HHA ``A'' has at least 20 episodes of care in a 12-month period for only nine (9) quality measures out of a possible 21 measures from three of the four classifications (that is, excluding the New Measures). Under the final scoring methodology outlined above, HHA A would be awarded 0, 0, 3, 4, 5, 7, 7, 9, and 10 points, respectively, for these measures. HHA A's total earned points for the three classifications would be calculated by adding together all the points awarded to HHA A, resulting in a total of 45 points. HHA A's total possible points would be calculated by multiplying the total number of measures for which the HHA reported on at least 20 episodes (nine) by the maximum number of points for those measures (10), yielding a total of 90 possible points. HHA A's score for the three classifications would be the total earned points (45) divided by the total possible points (90), multiplied by 90 because, as mentioned in section E7, the Clinical Quality of Care, Care Coordination and Efficiency, and Person and Caregiver-Centered Experience classifications account for 90-percent of the TPS and the New Measures classification accounts for 10-percent of the TPS, which yields a result of 45. In this example, HHA A also reported on all 3 New Measures and would receive the full 10 points for the New Measures classification. As a result, the TPS for HHA A would be 55 (45 plus 10). In addition, as specified in section E7 (Weighting), all measures have equal weights regardless of their classification (except for New Measures), and the total earned points for the three classifications can be calculated by adding the points awarded for each such measure together.
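        The HHA ``A'' example can be traced with a brief calculation. The sketch below (in Python, offered only as an illustration; the variable names are ours, not CMS's) reproduces the 90-percent-weighted classification score and the New Measure contribution that together yield the TPS of 55.

            # Illustrative sketch reproducing the HHA "A" example; not part of the rule.
            measure_points = [0, 0, 3, 4, 5, 7, 7, 9, 10]    # measures with >= 20 episodes
            earned = sum(measure_points)                     # 45
            possible = len(measure_points) * 10              # 90

            # The three classifications account for 90 percent of the TPS.
            classification_score = earned / possible * 90    # 45.0

            # All three New Measures reported -> full 10 points (see sketch above).
            new_measure_points = 3 / 3 * 10                  # 10.0

            total_performance_score = classification_score + new_measure_points
            print(total_performance_score)                   # 55.0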

    7. The Payment Adjustment Methodology

      We proposed to codify at 42 CFR 484.330 a methodology for applying value-based payment adjustments to home health services under the HHVBP Model. We proposed that payment adjustments would be made to the HH PPS final claim payment amount as calculated in accordance with Sec. 484.205 using a linear exchange function (LEF) similar to the methodology utilized by the HVBP Program. The LEF is used to translate an HHA's TPS into a percentage of the value-based payment adjustment earned by each HHA under the HHVBP Model. The LEF was identified by the HVBP Program as the simplest and most straightforward option to provide the same marginal incentives to all hospitals, and we believe the same to be true for HHAs. We proposed to set the function's intercept at zero percent, meaning that HHAs with a TPS that is average in relation to other HHAs in their cohort would receive a zero-percent payment adjustment (that is, no adjustment). Payment adjustments for each HHA with a score above zero percent would be determined by the slope of the LEF. In addition, we proposed to set the slope of the LEF for the first performance year, CY 2016, so that the estimated aggregate value-based payment adjustments for CY 2016 are equal to 5-percent of the estimated aggregate base operating episode payment amount for CY 2018. The estimated aggregate base operating episode payment amount is the total amount of episode payments made to all the HHAs by Medicare in each individual state in the larger- and smaller-volume cohorts, respectively.

      We provided in Figure 9 of the proposed rule an example of how the LEF is calculated and how it would be applied to calculate the percentage payment adjustment to a HHA's TPS (we refer the reader to 80 FR 39891 through 39892 for further discussion of our proposal). For this example, we applied the 8-percent payment adjustment level that was proposed to be used in the final 2 years of the HHVBP Model, and noted that the rate

      Page 68687

      for the payment adjustments for other years would be proportionally less.

      We invited comments on this proposed payment adjustment methodology.

      Comment: While offering support for the concept of value-based purchasing, the majority of commenters expressed concern that the magnitude of an 8-percent maximum payment risk might reduce access to care for vulnerable patients. Commenters suggested that payment adjustments could be deferred to later years of the model to provide HHAs with adequate time to ensure readiness to comply with model requirements and to allow CMS more time to study the initial model results. Many commenters also remarked on the differences between the Hospital Value-Based Purchasing (HVBP) Program and HHVBP Model maximum risk corridors and suggested lowering the HHVBP payment adjustments to align with the 2-percent maximum established in the HVBP Program.

      Response: We thank commenters for their input. As discussed in the proposed rule, based on lessons learned from the Hospital VBP Program, the 2008 Home Health pay-for-performance demonstration, and the MedPAC report, we believe that testing high financial incentives is necessary to motivate improvements in quality and patient satisfaction. However, we agree with commenters that providing some additional leeway for HHAs to ensure compliance with the model is important, and would also address concerns associated with moving competing HHAs from FFS incentives to VBP financial incentives tied to quality measures. Accordingly, under our final policy, we are reducing the payment adjustment percentage in CY 2018 from 5-percent to 3-percent. Further, responding to these practical concerns leaves the conceptual model intact, with the capacity to test the effect of higher incentives on quality.

      We believe this will provide HHAs more time to become familiar with the operation of the model before applying the higher percentage payment adjustments in later years. Additionally, under our final policy, we are reducing the payment adjustment for CY 2021 from 8-percent to 7-percent to establish a more gradual payment adjustment incentive schedule of 3-percent (in 2018), 5-percent (in 2019), 6-percent (in 2020), 7-percent (in 2021), and 8-percent (in 2022).

      Comment: Several commenters raised concerns that the magnitude of an 8-percent maximum payment risk might reduce access to care for vulnerable patients and threaten the financial viability of HHAs, including their ability to reinvest in infrastructure, care coordination, and financial preparations to participate in the HHVBP Model.

      Response: We have conducted financial modeling based on the proposed model and posit that the finalized maximum upward and downward payment adjustments (ranging from 3- to 8-percent) are sufficiently significant to improve quality of care and will not have a negative impact on beneficiary access. The model does not reduce the overall payments to HHAs and, as a result, the aggregate average margins of all competing HHAs will be unaffected by the model. Competing HHAs that provide the highest quality of care and that receive the maximum upward adjustment will improve their financial viability, which could help ensure that the vulnerable populations they serve have access to high-quality care. Only HHAs that provide very poor quality of care, relative to the cohort they compete within, would be subject to the highest downward payment adjustments.

      Final Decision: For the reasons discussed and in consideration of the comments received, we are finalizing the proposed payment adjustment methodology with modification. As noted, we are finalizing the following maximum payment adjustment percentage for each payment year: in CY 2018, 3-percent; in CY 2019, 5-percent; in CY 2020, 6-percent; in CY 2021, 7-percent; and in CY 2022, 8-percent. Consistent with this final policy, under our final payment adjustment methodology, we set the slope of the LEF for the first performance year, CY 2016, so that the estimated aggregate value-based payment adjustments for CY 2016 are equal to 3-percent of the estimated aggregate base operating episode payment amount for CY 2018, rather than 5-percent as proposed.

      Figure 9 provides an example of how the LEF is calculated and how it is applied to calculate the percentage payment adjustment to an HHA's TPS under our final policy. For this example, we applied the 8-percent payment adjustment level that will be used in the final year of the HHVBP Model (CY 2022) under our final policy. The rate for the payment adjustments for other years would be proportionally less.

      Step #1 involves the calculation of the `Prior Year Aggregate HHA Payment Amount' (See C2 in Figure 9) that each HHA was paid in the prior year. From claims data, all payments are summed together for each HHA for CY 2015, the year prior to the HHVBP Model.

      Step #2 involves the calculation of the `8-percent Payment Reduction Amount' (C3 of Figure 9) for each HHA. The `Prior Year Aggregate HHA Payment Amount' is multiplied by the `8-percent Payment Reduction Rate'. The aggregate of the `8-percent Payment Reduction Amount' is the numerator of the LEF.

      Step #3 involves the calculation of the `TPS Adjusted Reduction Amount' (C4 of Figure 9) by multiplying the `8-percent Payment Reduction Amount' from Step #2 by the TPS (C1) divided by 100. The aggregate of the `TPS Adjusted Reduction Amount' is the denominator of the LEF.

      Step #4 involves calculating the LEF (C5 of Figure 9) by dividing the aggregate `8-percent Payment Reduction Amount' by the aggregate `TPS Adjusted Reduction Amount'.

      Step #5 involves the calculation of the `Final TPS Adjusted Payment Amount' (C6 of Figure 9) by multiplying the `TPS Adjusted Reduction Amount' (C4) by the LEF (C5). This is an intermediary value used to calculate the `Quality Adjusted Payment Rate'.

      Step #6 involves the calculation of the `Quality Adjusted Payment Rate' (C7 of Figure 9) that the HHA will receive instead of the 8-percent reduction in payment. This is an intermediary step to determining the payment adjustment rate. For CY 2022, the payment adjustment in this column will range from 0-percent to 16-percent depending on the quality of care provided.

      Step #7 involves the calculation of the `Final Percent Payment Adjustment' (C8 of Figure 9) that will be applied to the HHA payments after the performance period. It involves subtracting the CY payment adjustment percent (as finalized: in 2018, 3-percent; in 2019, 5-percent; in 2020, 6-percent; in 2021, 7-percent; and in 2022, 8-percent) from the `Quality Adjusted Payment Rate'. In this example, we use the maximum eight-percent (8-percent) subtraction. Note that the payment adjustment percentage is capped at no more than plus or minus 8-percent for each respective performance period, and the payment adjustment will occur on the final claim payment amount.

      Page 68688

      Figure 9--8-Percent Reduction Sample

      Columns C2 through C8 correspond to Steps 1 through 7 described above.

        C1  TPS
        C2  Prior year aggregate HHA payment amount
        C3  8-percent payment reduction amount (C2 x 8%)
        C4  TPS adjusted reduction amount ((C1/100) x C3)
        C5  Linear exchange function (LEF) (Sum of C3 / Sum of C4)
        C6  Final TPS adjusted payment amount (C4 x C5)
        C7  Quality adjusted payment rate ((C6/C2) x 100), percent
        C8  Final percent payment adjustment (C7 - 8%), percent

        HHA *    C1     C2           C3         C4         C5      C6         C7      C8
        HHA1     38     $100,000     $8,000     $3,040     1.93    $5,867     5.9     -2.1
        HHA2     55     145,000      11,600     6,380      1.93    12,313     8.5     0.5
        HHA3     22     800,000      64,000     14,080     1.93    27,174     3.4     -4.6
        HHA4     85     653,222      52,258     44,419     1.93    85,729     13.1    5.1
        HHA5     50     190,000      15,200     7,600      1.93    14,668     7.7     -0.3
        HHA6     63     340,000      27,200     17,136     1.93    33,072     9.7     1.7
        HHA7     74     660,000      52,800     39,072     1.93    75,409     11.4    3.4
        HHA8     25     564,000      45,120     11,280     1.93    21,770     3.9     -4.1
        Sum      ...    ........     276,178    143,007    ...     276,002    ...     ...

      * Example cases.
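      As a convenience, the seven steps above can be traced in a short calculation. The following sketch (in Python, offered purely as an illustration of the arithmetic and not as part of the rule; the variable names are our own) approximately reproduces the figures in the table using the 8-percent example level; small differences arise because the table rounds the LEF to 1.93.

          # Illustrative sketch only; approximately reproduces Figure 9.
          MAX_ADJUSTMENT = 0.08   # the 8-percent level used for the CY 2022 example

          # (TPS, prior-year aggregate HHA payment amount) for the example cases
          hhas = [(38, 100_000), (55, 145_000), (22, 800_000), (85, 653_222),
                  (50, 190_000), (63, 340_000), (74, 660_000), (25, 564_000)]

          # Steps 2 and 3: per-HHA reduction amounts (C3) and TPS-adjusted
          # reduction amounts (C4)
          reduction = [payment * MAX_ADJUSTMENT for _, payment in hhas]
          tps_adjusted = [r * tps / 100 for (tps, _), r in zip(hhas, reduction)]

          # Step 4: the linear exchange function (C5), roughly 1.93 here
          lef = sum(reduction) / sum(tps_adjusted)

          # Steps 5 through 7: final TPS adjusted payment amount (C6), quality
          # adjusted payment rate (C7), and final percent payment adjustment (C8)
          for (tps, payment), adjusted in zip(hhas, tps_adjusted):
              final_payment_amount = adjusted * lef
              quality_adjusted_rate = final_payment_amount / payment * 100
              final_pct_adjustment = quality_adjusted_rate - MAX_ADJUSTMENT * 100
              print(tps, round(quality_adjusted_rate, 1), round(final_pct_adjustment, 1))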

    8. Preview and Period to Request Recalculation

      We proposed that Medicare-certified HHAs be provided two separate opportunities to review scoring information under the HHVBP Model. First, HHAs will have the opportunity to review their quarterly quality reports following each quarterly posting; second, competing HHAs will have the opportunity to review their TPS and payment adjustment calculations, and request a recalculation if a discrepancy is identified due to a CMS error as described in this section. These processes would help educate and inform each competing Medicare-certified HHA on the direct relation between the payment adjustment and performance measure scores.

      We proposed to inform HHAs quarterly of their performance on each of the individual quality measures used to calculate the TPS. We proposed that an HHA would have ten days after the quarterly reports are provided to request a recalculation of measure scores if it believes there is evidence of a discrepancy. We stated that we will adjust the score if it is determined that the discrepancy in the calculated measure scores was the result of our failure to follow measurement calculation protocols.

      In addition, we proposed to inform each competing HHA of the TPS and payment adjustment amount in an annual report. We proposed that these annual reports would be provided to competing HHAs each August 1st prior to the calendar year for which the payment adjustment would be applied. Similar to quarterly reports, we proposed that HHAs will have ten days to request a recalculation of their TPS and payment adjustment amount from the date information is made available. For both the quarterly reports and the annual report containing the TPS and payment adjustments, competing HHAs will only be permitted to request scoring recalculations, and must include a specific basis for the requested recalculation. We will not be responsible for providing HHAs with the underlying source data utilized to generate performance measure scores. Each HHA has access to this data via the QIES system. The final TPS and payment adjustment will then be provided to competing Medicare-certified HHAs in a final report no later than 60 days in advance of the payment adjustment taking effect.

      The TPS from the annual performance report will be calculated based on the calculation of performance measures contained in the quarterly reports that have already been provided and reviewed by the HHAs. As a result, we stated in the proposed rule that we believe that quarterly reviews will provide substantial opportunity to identify and correct errors and resolve discrepancies, thereby minimizing the challenges to the annual performance scores linked to payment adjustment.

      As described above, a quarterly performance report will be provided to all competing HHAs within the selected states beginning with the first quarter of CY 2016 being reported in July 2016. We proposed that HHAs would submit recalculation requests for both quarterly quality performance measure reports and for the TPS and payment adjustment reports via an email link provided on the model-specific Web page. We proposed that the request form would be entered by a person who has authority to sign on behalf of the HHA and be submitted within 10 days of receiving the quarterly data report or the annual TPS and payment adjustment report.

      We proposed that requests for both quarterly report measure score recalculations or TPS and payment adjustment recalculations would contain the following information:

      The provider's name, address associated with the services delivered, and CMS Certification Number (CCN);

      The basis for requesting recalculation to include the specific quality measure data that the HHA believes is inaccurate or the calculation the HHA believes is incorrect;

      Contact information for a person at the HHA with whom CMS or its agent can communicate about this request, including name, email address, telephone number, and mailing address (must include physical address, not just a post office box); and,

      A copy of any supporting documentation the HHA wishes to submit in electronic form via the model-specific Web page.

      Following receipt of a request for quarterly report measure score recalculations or a request for TPS and payment adjustment recalculation, we proposed that CMS or its agent would:

      Provide an email acknowledgement, using the contact information provided in the recalculation request, to the HHA contact notifying the HHA that the request has been received;

      Review the request to determine validity, and determine whether the requested recalculation results in a score change altering performance measure scores or the HHA's TPS;

      If recalculation results in a performance measure score or TPS

      Page 68689

      change, conduct a review of quality data and, if an error is found, recalculate the TPS using the corrected performance data; and,

      Provide a formal response to the HHA contact, using the contact information provided in the recalculation request, notifying the HHA of the outcome of the review and recalculation process.

      We proposed that recalculation and subsequent communication of the results of these determinations would occur as soon as administratively feasible following the submission of requests. Additionally, we stated that we will develop and adopt an appeals mechanism under the model through future rulemaking in advance of the application of any payment adjustments.

      The following is a summary of comments we received on the proposed quarterly quality measure reports and annual TPS preview periods.

      Comment: Several commenters suggested that the HHVBP Model provide 30 days, instead of 10 days, after quarterly and annual reports are provided to request a recalculation of the measure scores if the HHA believes there is evidence of discrepancy. In addition to allowing more time to challenge report contents, one commenter recommended another level of appeal be added with an independent entity to perform the calculation to determine if the discrepancy is valid.

      Response: We agree the review period for performance scores should be greater than 10 days to allow a more complete opportunity for HHAs to review, and are extending the time period for HHAs to preview their quarterly performance reports and annual payment adjustment reports (with requests for recalculations) from 10 days to 30 days. As noted in the proposed rule, CMS intends to propose an appeals mechanism in future rulemaking prior to the application of the first payment adjustments scheduled for 2018.

      Final Decision: For the reasons stated and in consideration of the comments received, we are finalizing the processes described above with modification. Specifically, under our final policy, the recalculation request form must be submitted within 30 days, rather than 10 days, of posting the quarterly data report or the annual TPS and payment adjustment reports on the model-specific Web site. We are not making any other changes to the proposed policies as described in this section.

  14. Evaluation

    We proposed, and are finalizing in this rule, to codify at Sec. 484.315(c) that competing HHAs in selected states will be required to collect and report information to CMS necessary for the purposes of monitoring and evaluating this model as required by statute.\54\ An evaluation of the HHVBP Model will be conducted in accordance with section 1115A(b)(4) of the Act, which requires the Secretary to evaluate each model tested by CMMI. We consider an independent evaluation of the model to be necessary to understand its impacts on care quality in the home health setting. The evaluation will be focused primarily on understanding how successful the model is in achieving quality improvement as evidenced by HHAs' performance on clinical care process measures, clinical outcome measures (for example, functional status), utilization/outcome measures (for example, hospital readmission rates, emergency room visits), access to care, patients' experience of care, and Medicare costs. We also intend to examine the likelihood of unintended consequences. We intend to select an independent evaluation contractor to perform this evaluation. The procurement for the selection of the evaluation contractor is in progress; thus, we cannot provide a detailed description of the evaluation methodology here.

    ---------------------------------------------------------------------------

    \54\ See section 1115A(b)(4) of the Act (42 U.S.C. 1315a).

    ---------------------------------------------------------------------------

    We intend to use a multilevel approach to evaluation, conducting analyses at the state, HHA, and patient levels. Based on the state groupings discussed in the section on selection of competing HHAs, we believe there are several ways in which we can draw comparison groups, and we remain open to scientifically sound, rigorous methods for evaluating the effect of the model intervention.

    The evaluation effort may require HHAs participating in the model to provide additional data specifically for evaluation purposes. Such requirements for additional data to carry out model evaluation will be in compliance with 42 CFR 403.1105, which, as of January 1, 2015, requires entities participating in the testing of a model under section 1115A to collect and report such information, including protected health information (as defined at 45 CFR 160.103), as the Secretary determines is necessary to monitor and evaluate the model. We will consider all Medicare-certified HHAs providing services within a state selected for the model to be participating in the testing of this model because the competing HHAs will be receiving payment from CMS under the model.\55\

    ---------------------------------------------------------------------------

    \55\ 79 FR 67751 through 67755.

    ---------------------------------------------------------------------------

    We invited comments on the proposed evaluation plan.

    Comment: Several commenters highlighted the importance of closely monitoring and evaluating Medicare beneficiary access to home healthcare to ensure the model does not inadvertently negatively impact beneficiary access to necessary and appropriate care. In addition, some commenters suggested the model may cause some HHAs in selected states to leave the market, thereby creating insufficient HHA supply. Other commenters specifically raised the concern that some HHAs may attempt to avoid treating beneficiaries they fear will have a negative impact on performance scores. These commenters suggested that CMS monitor whether Medicare beneficiaries experience problems with access to care, and if they do, immediately address issues to ensure beneficiaries receive needed services. One commenter specifically suggested surveying Medicare beneficiaries to help measure access and ensure proactive monitoring.

    Response: Beneficiary access to care is of paramount concern to us, and as indicated in the proposed rule, we will observe the progress of the model to guard against unintended consequences. Our monitoring and evaluation designs will be able to detect the types of concerns mentioned above. Adjustments to the monitoring and evaluation plans will be made as needed. As part of the development of this model, we have identified counties with low HHA market penetration, high dually-eligible populations, proportions of beneficiaries with high levels of acuity (as measured by hierarchical condition categories or HCCs), and organizational types. Future monitoring activities will include a continuous review of beneficiary-level claims data, Medicare cost reports, and beneficiary enrollment data to understand whether any unintended consequences arise across all competing HHAs in the Model.

    Comment: Several commenters suggested that CMS employ a process to continuously monitor quality improvement and evaluate other aspects of the model in conjunction with all stakeholders, including home health agencies. Commenters also recommended sharing lessons learned from the model to inform, educate, and engage beneficiaries and the general public. Several commenters specifically recommended that CMS establish a HHVBP learning

    Page 68690

    network to foster smoother post-pilot implementation of VBP in home health.

    Response: We agree that wherever possible, competing HHAs should have every opportunity to share lessons learned from the model. We appreciate all suggestions related to learning from the HHVBP Model, both for competing HHAs and the public. The model contains multiple mechanisms for sharing information, including the use of a model-specific Web site, a collaboration Web site, and model-specific technical assistance efforts.

    Comment: Several commenters specifically requested subsequent revisions to the HHVBP Model following initial evaluation in order to ensure that payment reflects a broad range of patients and does not incentivize under- or over-provision of services. These commenters recommended independent evaluation that includes state-specific data on changes in home health quality outcomes, changes in home health utilization, and access to home health for patients with specific diagnoses and functional status, with breakdowns by geographic location of patients (for example, rural, urban).

    Response: We appreciate the recommendations provided. An independent evaluation is planned. As discussed in the proposed rule, we intend to use a multilevel approach to evaluation. We intend to conduct analyses at the state, HHA, and patient levels. The evaluation will be conducted in accordance with section 1115A(b)(4) of the Act and will include analysis of quality improvement as evidenced by HHAs' performance on clinical care process measures, clinical outcome measures (for example, functional status), utilization/outcome measures (for example, hospital readmission rates, emergency room visits), access to care, patients' experience of care, and changes in Medicare costs. We also intend to examine the likelihood of unintended consequences. The evaluation will use a scientifically rigorous approach for evaluating the model intervention and making necessary alterations to the model as needed.

    Final Decision: For these reasons and in consideration of the comments received, we are finalizing the evaluation plan as proposed.

  15. Provisions of the Home Health Care Quality Reporting Program (HHQRP) and Response to Comments

    1. Background and Statutory Authority

      Section 1895(b)(3)(B)(v)(II) of the Act requires that for 2007 and subsequent years, each HHA submit to the Secretary in a form and manner, and at a time, specified by the Secretary, such data that the Secretary determines are appropriate for the measurement of health care quality. To the extent that an HHA does not submit data in accordance with this clause, the Secretary is directed to reduce the home health market basket percentage increase applicable to the HHA for such year by 2 percentage points. As provided at section 1895(b)(3)(B)(vi) of the Act, depending on the market basket percentage for a particular year, the 2 percentage point reduction under section 1895(b)(3)(B)(v)(I) of the Act may result in this percentage increase, after application of the productivity adjustment under section 1895(b)(3)(B)(vi)(I) of the Act, being less than 0.0 percent for a year, and may result in payment rates under the Home Health PPS for a year being less than payment rates for the preceding year.

      Section 2(a) of the Improving Medicare Post-Acute Care Transformation Act of 2014 (the IMPACT Act) (Pub. L. 113-185, enacted on Oct. 6, 2014) amended Title XVIII of the Act, in part, by adding a new section 1899B, which imposes new data reporting requirements for certain post-acute care (PAC) providers, including HHAs. New section 1899B of the Act is titled, ``Standardized Post-Acute Care (PAC) Assessment Data for Quality, Payment, and Discharge Planning''. Under section 1899B(a)(1) of the Act, certain post-acute care (PAC) providers (defined in section 1899B(a)(2)(A) of the Act to include HHAs, SNFs, IRFs, and LTCHs) must submit standardized patient assessment data in accordance with section 1899B(b) of the Act, data on quality measures required under section 1899B(c)(1) of the Act, and data on resource use, and other measures required under section 1899B(d)(1) of the Act. The Act also sets out specified application dates for each of the measures. The Secretary must specify the quality, resource use, and other measures no later than the applicable specified application date defined in section 1899B(a)(2)(E) of the Act.

      Section 1899B(b) of the Act describes the standardized patient assessment data that PAC providers are required to submit in accordance with section 1899B(b)(1) of the Act; requires the Secretary, to the extent practicable, to match claims data with standardized patient assessment data in accordance with section 1899B(b)(2) of the Act; and requires the Secretary, as soon as practicable, to revise or replace existing patient assessment data to the extent that such data duplicate or overlap with standardized patient assessment data, in accordance with section 1899B(b)(3) of the Act.

      Sections 1899B(c)(1) and (d)(1) of the Act direct the Secretary to specify measures that relate to at least five stated quality domains and three stated resource use and other measure domains. Section 1899B(c)(1) of the Act provides that the quality measures on which PAC providers, including HHAs, are required to submit standardized patient assessment data and other necessary data specified by the Secretary must be in accordance with, at least, the following domains:

      Functional status, cognitive function, and changes in function and cognitive function;

      Skin integrity and changes in skin integrity;

      Medication reconciliation;

      Incidence of major falls; and

      Accurately communicating the existence of and providing for the transfer of health information and care preferences of an individual to the individual, family caregiver of the individual, and providers of services furnishing items and services to the individual when the individual transitions (1) from a hospital or Critical Access Hospital (CAH) to another applicable setting, including a PAC provider or the home of the individual, or (2) from a PAC provider to another applicable setting, including a different PAC provider, hospital, CAH, or the home of the individual.

      Section 1899B(c)(2)(A) provides that, to the extent possible, the Secretary must require such reporting through the use of a PAC assessment instrument and modify the instrument as necessary to enable such use.

      Section 1899B(d)(1) of the Act provides that the resource use and other measures on which PAC providers, including HHAs, are required to submit any necessary data specified by the Secretary, which may include standardized assessment data in addition to claims data, must be in accordance with, at least, the following domains:

      Resource use measures, including total estimated Medicare spending per beneficiary;

      Discharge to community; and

      Measures to reflect all-condition risk-adjusted potentially preventable hospital readmission rates.

      Sections 1899B(c) and (d) of the Act indicate that data satisfying the eight measure domains in the IMPACT Act is the minimum data reporting requirement. The Secretary may specify additional measures and additional domains.

      Page 68691

      Section 1899B(e)(1) of the Act requires that the Secretary implement the quality, resource use, and other measures required under sections 1899B(c)(1) and (d)(1) of the Act in phases consisting of measure specification, data collection, and data analysis; the provision of feedback reports to PAC providers in accordance with section 1899B(f) of the Act; and public reporting of PAC providers' performance on such measures in accordance with section 1899B(g) of the Act. Section 1899B(e)(2) of the Act generally requires that each measure specified by the Secretary under section 1899B of the Act be National Quality Forum (NQF)-endorsed, but authorizes an exception under which the Secretary may select non-NQF-endorsed quality measures in the case of specified areas or medical topics determined appropriate by the Secretary for which a feasible or practical measure has not been endorsed by the NQF, as long as due consideration is given to measures that have been endorsed or adopted by a consensus organization identified by the Secretary. Section 1899B(e)(3) of the Act provides that the pre-rulemaking process required by section 1890A of the Act applies to quality, resource use, and other measures specified under sections 1899B(c)(1) and (d)(1) of the Act, but authorizes exceptions under which the Secretary may (1) use expedited procedures, such as ad hoc reviews, as necessary in the case of a measure required for data submissions during the 1-year period before the applicable specified application date, or (2) alternatively, waive section 1890A of the Act in the case of such a measure if applying section 1890A of the Act (including through the use of expedited procedures) would result in the inability of the Secretary to satisfy any deadline specified under section 1899B of the Act for the measure.

      Section 1899B(f)(1) of the Act requires the Secretary to provide confidential feedback reports to PAC providers on the performance of such PAC providers for quality, resource use, and other measures required under sections 1899B(c)(1) and (d)(1) of the Act beginning 1 year after the applicable specified application date.

      Section 1899B(g) of the Act requires the Secretary to establish procedures for making available to the public information regarding the performance of individual PAC providers for quality, resource use, and other measures required under sections 1899B(c)(1) and (d)(1) beginning not later than 2 years after the applicable specified application date. The procedures must ensure, including through a process consistent with the process applied under section 1886(b)(3)(B)(viii)(VII) for similar purposes, that each PAC provider has the opportunity to review and submit corrections to the data and information that are to be made public for the PAC provider prior to such data being made public.

      Section 1899B(h) of the Act sets out requirements for removing, suspending, or adding quality, resource use, and other measures required under sections 1899B(c)(1) and (d)(1) of the Act. In addition, section 1899B(j) of the Act requires the Secretary to allow for stakeholder input, such as through town halls, open door forums, and mailbox submissions, before the initial rulemaking process to implement section 1899B of the Act.

      Section 2(c)(1) of the IMPACT Act amended section 1895 of the Act to address the payment consequences for HHAs for the additional data which HHAs are required to submit under section 1899B of the Act. These changes include the addition of a new section 1895(b)(3)(B)(v)(IV), which requires HHAs to submit the following additional data: (1) For the year beginning on the specified application date and each subsequent year, data on the quality, resource use, and other measures required under sections 1899B(c)(1) and (d)(1) of the Act; and (2) for 2019 and subsequent years, the standardized patient assessment data required under section 1899B(b)(1) of the Act. Such data must be submitted in the form and manner, and at the time, specified by the Secretary.

      As noted, the IMPACT Act adds a new section 1899B of the Act that imposes new data reporting requirements for certain post-acute care (PAC) providers, including HHAs. Sections 1899B(c)(1) and 1899B(d)(1) of the Act collectively require that the Secretary specify quality measures and resource use and other measures with respect to certain domains not later than the specified application date that applies to each measure domain and PAC provider setting. Section 1899B(a)(2)(E) of the Act delineates the specified application dates for each measure domain and PAC provider. The IMPACT Act also amends other sections of the Act, including section 1895(b)(3)(B)(v), to require the Secretary to reduce the otherwise applicable PPS payment to a PAC provider that does not report the new data in a form and manner, and at a time, specified by the Secretary. For HHAs, amended section 1895(b)(3)(B)(v) of the Act will require the Secretary to reduce the payment update for any HHA that does not satisfactorily submit the newly required data.

      Under the current HH QRP, the general timeline and sequencing of measure implementation occurs as follows: Specification of measures; proposal and finalization of measures through notice-and-comment rulemaking; HHA submission of data on the adopted measures; analysis and processing of the submitted data; notification to HHAs regarding their quality reporting compliance for a particular year; consideration of any reconsideration requests; and imposition of a payment reduction in a particular year for failure to satisfactorily submit data for that year. Any payment reductions that are taken for a year begin approximately 1 year after the end of the data submission period for that year and approximately 2 years after we first adopt the measure.

      To the extent that the IMPACT Act could be interpreted to shorten this timeline, so as to require us to reduce HH PPS payment for failure to satisfactorily submit data on a measure specified under section 1899B(c)(1) or (d)(1) of the IMPACT Act beginning with the same year as the specified application date for that measure, such a timeline would not be feasible. The current timeline discussed above reflects operational and other practical constraints, including the time needed to specify and adopt valid and reliable measures, collect the data, and determine whether a HHA has complied with our quality reporting requirements. It also takes into consideration our desire to give HHAs enough notice of new data reporting obligations so that they are prepared to timely start reporting data. Therefore, we intend to follow the same timing and sequence of events for measures specified under sections 1899B(c)(1) and (d)(1) of the Act that we currently follow for other measures specified under the HH QRP. We intend to specify each of these measures no later than the specified application dates set forth in section 1899B(a)(2)(E) of the Act and will adopt them consistent with the requirements in the Act and Administrative Procedure Act. To the extent that we finalize a proposal to adopt a measure for the HH QRP that satisfies an IMPACT Act measure domain, we intend to require HHAs to report data on the measure for the year that begins 2 years after the specified application date for that measure. Likewise, we intend to require HHAs to begin reporting any other data specifically required under the IMPACT Act for the year that begins 2 years after we adopt requirements that

      Page 68692

      would govern the submission of that data.

      Lastly, on April 1, 2014, the Congress passed the Protecting Access to Medicare Act of 2014 (PAMA) (Pub. L. 113-93), which stated the Secretary may not adopt ICD-10 prior to October 1, 2015. On August 4, 2014, HHS published a final rule titled ``Administrative Simplification: Change to the Compliance Date for the International Classification of Diseases, 10th Revision (ICD-10-CM and ICD-10-PCS) Medical Data Code Sets'' (79 FR 45128), which announced October 1, 2015 as the new compliance date. The OASIS-C1 data item set had been previously approved by the Office of Management and Budget (OMB) on February 6, 2014 and scheduled for implementation on October 1, 2014. We intended to use the OASIS-C1 to coincide with the original implementation date of ICD-10. The approved OASIS-C1 included changes to accommodate coding of diagnoses using the ICD-10-CM coding set and to address other important stakeholder concerns, such as updating clinical concepts and revising item wording and response categories to improve item clarity. This version included five (5) data items that required the use of ICD-10 codes.

      Since OASIS-C1 was revised to incorporate ICD-10 coding, it was not feasible to implement the OASIS-C1/ICD-10 version prior to October 1, 2015, when ICD-10 was scheduled to be implemented. Due to this delay, we had to ensure that the collection and submission of OASIS data continued until ICD-10 was implemented. Therefore, we made interim changes to the OASIS-C1 data item set to allow use with ICD-9 until ICD-10 was adopted. The OASIS-C1/ICD-9 version was submitted to OMB for approval until the OASIS-C1/ICD-10 version could be implemented. A 6-month emergency approval was granted on October 7, 2014, and CMS subsequently applied for an extension. The extension of the OASIS-C1/ICD-9 version was reapproved under OMB control number 0938-0760 with a current expiration date of March 31, 2018. It is important to note that this version of the OASIS will be discontinued once the OASIS-C1/ICD-10 version is approved and implemented. In addition, to facilitate the reporting of OASIS data as it relates to the implementation of ICD-10 on October 1, 2015, we submitted a new request for approval to OMB for the OASIS-C1/ICD-10 version under the Paperwork Reduction Act (PRA) process. We requested a new OMB control number for the proposed revised OASIS item set as announced in the 30-day Federal Register notice (80 FR 15796). The new information collection request for the OASIS-C1/ICD-10 version was approved under OMB control number 0938-1279 with a current expiration date of May 31, 2018. Information regarding the OASIS-C1 can be located on the OASIS-C1 Data Sets Web page at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HomeHealthQualityInits/OASIS-C1.html. Additional information regarding the adoption of ICD-10 can be located on the ICD-10 Web page at: http://www.cms.gov/Medicare/Coding/ICD10/index.html?redirect=/icd10.

      We received multiple public comments pertaining to the general timeline and plan for implementation of the IMPACT Act, sequencing of measure implementation, and standardization of PAC assessment tools. The following is a summary of the comments we received on this topic and our responses.

      Comment: We received several comments requesting the development of a comprehensive implementation plan for all settings covered by the IMPACT Act. Commenters stated that a comprehensive implementation plan would give home health providers an opportunity to plan for the potential impact on their operations, and enable all stakeholders to understand CMS's approach to implementing the IMPACT Act across care settings. Some commenters requested that CMS plans be communicated as soon as possible and that CMS develop setting-specific communications to facilitate understanding of the IMPACT Act requirements. Another commenter urged CMS to provide clear and transparent explanations of each measure's specifications, providing as much information as possible to the public about the measures proposed. This commenter added that the detailed information submitted for the NQF consensus development process would be helpful to stakeholders, and offered to work with CMS on measure development and specifications. One commenter specifically expressed the importance of a transparent process in relation to measure development, noting that the Act calls for informing the public of the measure's numerator, denominator, exclusions, and any other aspects the Secretary determines necessary. Another commenter requested that CMS abide by certain principles, such as: provide implementation timelines for data collection and reporting requirements in a timely manner; implement measures that are reliable, feasible, and setting-appropriate and that are endorsed as well as included in the pre-rulemaking Measure Applications Partnership (MAP) process; minimize unnecessary provider burden; and, finally, ensure the standardization of measures and data collection across post-acute care settings as feasible.

      Response: We appreciate and agree with the commenters' requests for a comprehensive and transparent plan for implementation of the IMPACT Act, as well as the need for timely stakeholder input, the development of reliable, accurate measures that are endorsed and have undergone the pre-rulemaking MAP process, clarity on the level of standardization of items and measures, the importance of feasibility and standardization, and the avoidance of unnecessary burden on PAC providers. Our intent has been to comply with these principles in the implementation and rollout of QRPs in the various care settings, and we will continue to adhere to these principles as the agency moves forward with implementing IMPACT Act requirements.

      In addition to implementing the IMPACT Act requirements, we will follow the strategy for identifying cross-cutting measures, timelines for data collection, and timelines for reporting as outlined in the IMPACT Act. As described more fully above, the IMPACT Act requires CMS to specify measures that relate to at least five stated quality domains and three stated resource use and other measure domains. The IMPACT Act also outlines timelines for data collection and timelines for reporting. We intend to adopt measures that comply with the IMPACT Act in a manner that is consistent with the sequence we follow in other quality reporting programs. We intend to follow all processes in place for adoption of measures including the MAP review and the notice and comment rulemaking process. In the selection and specification of measures, we employ a transparent process in which we seek input from stakeholders and national experts and engage in a process that allows for pre-rulemaking input on each measure, as required by section 1890A of the Act. This process is based on a private-public partnership, and it occurs via the MAP. The MAP is composed of multi-stakeholder groups convened by the NQF, our current contractor under section 1890 of the Act, to provide input on the selection of quality and efficiency measures described in section 1890(b)(7)(B). The NQF must convene these stakeholders and provide us with the stakeholders' input on the selection of such measures. We, in turn, must take this input into consideration in selecting such

      Page 68693

      measures. In addition, the Secretary must make available to the public by December 1 of each year a list of such measures that the Secretary is considering under Title XVIII of the Act. Additionally, proposed measures and specifications are to be announced through the Notice of Proposed Rulemaking (NPRM) process in which proposed rules are published in the Federal Register and are available for public view and comment.

      We further note that we are committed to the principles surrounding public input as part of our measure development that occurs prior to rulemaking. As part of this measure development process, we seek input from the public on the measure specifications under development by CMS and our measure contractors. We have a designated Web page where we solicit public comment on measure constructs during measure development. This is a key component of how we develop and maintain quality measures, as outlined in the CMS Blueprint for Measures Management System. You can find more information about the CMS Blueprint for Measures Management System on the CMS Measure Management System Web page at: https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/MMS/index.html. The CMS Quality Measures Public Comment page is located at: https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/MMS/CallforPublicComment.html.

      Comment: Several commenters requested that CMS continue in its public engagement with stakeholders. They stated their appreciation for the opportunity to work with CMS during the implementation phases of the IMPACT Act. These commenters noted a need for more opportunities for stakeholder input into various aspects of the measure and assessment instrument development process. Commenters requested opportunities to provide ongoing input into measure and assessment instrument development and modifications.

      Response: We appreciate the commenters' feedback and the continued involvement of stakeholders in all phases of measure development and implementation, as we see the value in strong public-private partnerships. We also believe that ongoing stakeholder input is important to the success of the IMPACT Act and look forward to continued and regular input from the provider communities as we continue to implement the IMPACT Act. It is our intent to move forward with IMPACT Act implementation in a manner in which the measure and assessment instrument development process continues to be transparent, and includes input and collaboration from experts, the PAC provider community, and the public. It is of the utmost importance to CMS to continue to engage stakeholders, including patients and their families, throughout the measure and assessment instrument development lifecycle through our measure development public comment periods, the pre-rulemaking activities, participation in the Technical Expert Panels (TEPs) convened by our measure development contractors, as well as open door forums and other opportunities. We have already provided multiple opportunities for stakeholder input, including the following activities: Our measure development contractor(s) convened TEPs for many of the measures in development under the IMPACT Act, such as the functional assessment TEP, Discharge to Community TEP, Potentially Preventable Readmissions TEP, and the Drug Regimen Review TEP. We intend to continue this form of stakeholder engagement with future TEPs that will assess data standardization and Medicare Spending per Beneficiary measure concepts, among other topics. We also convened two separate listening sessions on February 10, 2015 and March 24, 2015 in order to receive stakeholder input on IMPACT Act implementation. In addition, we heard stakeholder input during the February 9, 2015 ad hoc MAP meeting provided for the sole purpose of reviewing the measures proposed in response to the IMPACT Act. We also implemented a public mail box for the submission of comments in January 2015, PACQualityInitiative@cms.hhs.gov, which is listed on our IMPACT Act of 2014 & Cross-Setting Measures Web site at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/Post-Acute-Care-Quality-Initiatives/IMPACT-Act-of-2014-and-Cross-Setting-Measures.html, and we held a Special Open Door Forum to seek input on the measures on February 25, 2015. The slides from the Special Open Door Forum are available at http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/Post-Acute-Care-Quality-Initiatives/IMPACT-Act-of-2014-and-Cross-Setting-Measures.html.

      Comment: We received several comments requesting that CMS ensure that the data used to satisfy the IMPACT Act measure domains be aligned across PAC settings to maximize the reliability and validity of such data and to enable data comparability. Commenters noted the importance of standardized patient assessment data for cross-setting comparisons of patient outcomes. Another commenter expressed concern about the level of standardization of data collection instruments across PAC settings, specifically the importance of assessment item alignment for items selected for use in the various PAC settings, and urged CMS to consider such data alignment issues. One commenter recommended CMS move as quickly as possible to collect interoperable and standardized data, and one commenter recommended that CMS conduct testing to evaluate comparability across settings. One commenter expressed concern related to the inconsistencies in the measures proposed, suggesting that there was significant variance in relation to their numerator, denominator and exclusions.

      We received a few comments requesting details pertaining to the timing of the development and implementation of the standardized patient assessment data, measures, data collection, and reporting. Commenters requested a detailed timeline and schedule that specifies planned changes to standardize assessment data, including dates and sequencing of changes. Specifically, one commenter stated that although the sequencing for the quality measures and specified application dates were provided in the proposed rule, the detail related to the timing of the standardized data appeared to have been left out. The commenter requested that this final rule provide such timeline and sequencing.

      Response: We agree that standardization is important for data comparability and outcome analysis. We will work to ensure that items pertaining to measures required under the IMPACT Act that are included in assessment instruments are standardized and aligned across the assessment instruments. In addition, we will ensure that the data used to satisfy the IMPACT Act measure domains will be aligned across PAC settings to maximize the reliability and validity of such data and to enable data comparability. We recognize the need for transparency as we move forward to implement the IMPACT Act and we intend to continue to engage stakeholders and ensure that our approach to implementation and timing is communicated in an open and informative manner. We will continue this communication through various means, such as open door forums, national provider calls, email blasts, and announcements. We intend to provide

      Page 68694

      ongoing information pertaining to the implementation and development of standardized patient assessment data, measures, data collection, and reporting to the public. We will also continue to provide information about development and implementation of the IMPACT Act on the IMPACT Act of 2014 & Cross-Setting Measures Web page at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/Post-Acute-Care-Quality-Initiatives/IMPACT-Act-of-2014-and-Cross-Setting-Measures.html. In addition to the Web site updates and provider calls, we intend to provide information about development and implementation through pre-rulemaking activities surrounding the development of quality measures, which includes public input as part of our process. We intend to engage stakeholders and experts in developing the assessment instrument modifications necessary to meet data standardization requirements of the IMPACT Act. We also will use the rulemaking process to communicate timelines for implementation, including timelines for the replacement of items in PAC assessment tools, timelines for implementation of new or revised quality measures, and timelines for public reporting.

      Regarding the timeline and sequencing surrounding the standardized patient assessment data, we interpret the commenters' concern to refer to the standardized data assessment domains listed within the Act under section 2(b) ``Standardized patient assessment data''. As stated in the preamble to the CY 2016 HH PPS proposed rule, we intend to require HHAs to begin reporting data on the quality measures required under the IMPACT Act for the year that begins 2 years after we adopt requirements that govern the submission of that data.

      Comment: We received a few comments supporting and encouraging the use of NQF-endorsed measures and recommending that measures be NQF-

      endorsed prior to implementation. Specifically, commenters urged CMS to seek and receive NQF endorsement for measures in each PAC setting, noting that quality measure endorsement in one setting, such as a skilled nursing facility, may not mean a measure is appropriate, reliable, or valid for use in the home health setting.

      Response: We will propose appropriate measures that meet the requirements of the IMPACT Act measure domains and that have been endorsed or adopted by a consensus organization whenever possible. However, when this is not feasible because there is no NQF-endorsed measure that meets all the requirements for a specified IMPACT Act measure domain, we intend to rely on the exception authority given to the Secretary in section 1899B(e)(2)(B) of the Act. This statutory exception allows the Secretary to specify a measure for the HH QRP setting that is not NQF-endorsed where, as here, we have not been able to identify other measures on the topic that are endorsed or adopted by a consensus organization. For all quality measures for the HH QRP, we seek MAP review, as well as expert opinion on the validity and reliability of those measures in the HH setting. For the proposed quality measure, the Percent of Residents/Patients/Persons with Pressure Ulcers That Are New or Worsened, the MAP PAC LTC Off-Cycle Workgroup conditionally supported the quality measure for HH QRP. We wish to note that we intend to seek consensus endorsement for the IMPACT Act measures in each PAC setting.

      Comment: We received several comments about the burden on PAC providers of meeting new requirements imposed as a result of the implementation of the IMPACT Act. Specifically, commenters requested that CMS consider minimizing the burden for PAC providers when possible and avoiding duplication in data collection.

      Response: We appreciate the importance of avoiding undue burden and will continue to evaluate and consider any burden the IMPACT Act and the HH QRP place on home health providers. In implementing the IMPACT Act thus far, we have taken into consideration any new burden that our requirements might place on PAC providers. In this respect, we note that many assessment items used to calculate the measure proposed for use in the HH QRP, the Percent of Residents or Patients with Pressure Ulcers That Are New or Worsened, are currently being collected in the OASIS instrument.

      Comment: We received one comment requesting that, in the future, cross-setting measures and assessment data changes related to the IMPACT Act be addressed in one stand-alone notice and rule that applies to all four post-acute care settings.

      Response: We will take this request under consideration.

      Comment: We received one comment expressing interest in learning about any proposed changes to the OASIS assessment instrument in the next version of the item set and when these changes might occur.

      Response: We are committed to transparent communication about updates to the PAC assessment instruments required to support the IMPACT Act measures, as well as any new measures for the HH QRP. We wish to clarify that the draft revisions to the integumentary portion of the OASIS were posted along with the proposed rule on the Home Health Quality Measures Web page at: https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HomeHealthQualityInits/HHQIQualityMeasures.html. We intend to make publicly available, in the coming months, the final item set with its revisions as well as the submission specifications, in a manner consistent with our previous postings of such information.

      Comment: We received one comment expressing concern that data used in reformulating the payment model and assessing quality in PAC settings be gathered by qualified clinicians. Specifically, the commenter emphasized the unique contributions of occupational therapists to support the intent of the IMPACT Act.

      Response: We appreciate the feedback and concur on the important role played by qualified clinicians in collecting the data needed to support the requirements of the IMPACT Act.

      Comment: One commenter recommended that CMS invest in training clinicians for any new data collection requirements that address the quality measures, the assessment items, and how the measures and the items are developed to meet the mandate of the IMPACT Act objectives. This commenter additionally noted that the training should address different settings of care and how patient populations differ across PAC settings, to support consistency in data collection.

      Response: We agree that training is critical to assure both provider accuracy and understanding of the assessment and data collection requirements. We intend to provide training on updates to the OASIS assessment instrument as suggested, and intend to ensure that such training includes the information necessary to ensure consistent data collection.

      Comment: One commenter underscored cognitive function as an important aspect of the IMPACT Act, because of its significant relationship to Medicare resource use, length of stay, and patients' long term outcomes. The commenter recommended that assessment of functional cognition be incorporated as part of CMS's efforts to meet the requirements of the IMPACT Act and added that providers need more training around appropriate functional activities for patients with cognitive impairments. This commenter also

      Page 68695

      offered to provide research studies and related materials to support CMS in this area.

      Response: We concur on the importance of cognitive function and its relationship to quality outcomes for PAC patients. We are working toward developing quality measures that assess areas of cognition, recognizing that this quality topic is intrinsically linked to the function domain. We appreciate the commenter's offer of assistance and encourage the submission of comments and measure specification details to our comment email PACQualityInitiative@cms.hhs.gov.

      Comment: One commenter supported the inclusion of new standardized self-care and mobility functional items in PAC assessment tools that utilize the data source of the CARE Tool. The commenter anticipated that functional measures based on CARE items that are being implemented in other PAC settings will be eventually added to the HH QRP. This commenter noted that use of these new items would facilitate accurate representation of patient function across the spectrum of PAC settings.

      Response: We appreciate the commenter's feedback and support of the self-care and functional items that utilize data elements derived from the CARE Tool item set source. We believe that standardization of assessment items and measures, such as measures of functional status, across post-acute care settings is an important goal.

      Comment: One commenter expressed concern regarding harmonization of measures across settings and outcomes measurement when multiple populations are included. This commenter urged that proposed IMPACT Act measures be limited to Medicare FFS beneficiaries, noting that to include other populations (Medicaid, Medicare Advantage, and MCO Medicaid) will complicate the interpretation of outcome results. The commenter expressed support of the construct of the Total Cost per Beneficiary. The commenter also suggested that a measure such as the Percent of Patients Discharged to a Higher Level of Care versus Community, which the commenter suggested could be used across all patients receiving home care, be included in future measure development. In addition, the commenter expressed support for measures related to falls and nutritional assessment, and hospitalizations, but requested clarification about the population that would be measured and recommended that all of these measures be limited to Medicare FFS patients only. The commenter additionally recommended that the uniqueness of home health care be considered when developing a standardized falls measure, noting that home health staff are not present 24 hours a day, seven days a week and are reliant on patients and caregivers in reporting and preventing falls.

      Response: We appreciate the commenter's feedback about comparison of outcomes across different payer populations and appreciate the commenter's support for quality measure standardization as mandated by the IMPACT Act. The following cross-setting measures are currently under development for all four PAC settings: (1) Payment Standardized Medicare Spending Per Beneficiary (MSPB); (2) Percentage of residents/patients discharged to a higher level of care versus to the community, based on the discharge assessment (Application of NQF #2510); (3) Skilled Nursing Facility 30-Day All-Cause Readmission Measure (SNFRM); and (4) Application of the LTCH/IRF All-Cause Unplanned Readmission Measure for 30 Days Post Discharge from LTCHs/IRFs. These quality measures are being developed using Medicare claims data; thus, the denominators for these measure constructs are limited to the Medicare FFS population. We intend to standardize denominator and numerator definitions across PAC settings in order to standardize quality measures as required by the IMPACT Act.

      We acknowledge the unique constraints home health agencies face in monitoring patient falls. We are in the process of standardizing a quality measure that assesses one or more falls with a major injury, rather than just a measure assessing if a fall occurred. In the FY 2016 IPPS/LTCH PPS final rule, FY 2016 IRF PPS final rule and FY 2016 SNF PPS final rule, we finalized an application of the quality measure, the Percent of Residents Experiencing One or More Falls with Major Injury (Long Stay) measure (NQF #0674). This application of the quality measure assesses falls resulting in major injuries only, satisfying the domain in the IMPACT Act, the Incidence of Major Falls. A TEP convened by our measure development contractor provided input on the technical specifications of the application of the quality measure, the Percent of Residents Experiencing One or More Falls with Major Injury (Long Stay) (NQF #0674), including the feasibility of implementing the measure across PAC settings, including home health care. The TEP was supportive of the implementation of this measure across PAC settings and was also supportive of our efforts to standardize this measure for cross-setting development. We have taken steps to standardize the numerator, denominator, and other facets of the quality measure across all PAC settings. As part of best clinical practice, the HHA should take steps to mitigate falls with major injury, especially since such falls are considered to be ``never events'' as they relate to healthcare acquired conditions.

      Finally, we appreciate the commenter's concern that home health staff are not present 24 hours, 7 days a week and may not be able to track falls as they occur.

    2. General Considerations Used for the Selection of Quality Measures for the HH QRP

      We strive to promote high quality and efficiency in the delivery of health care to the beneficiaries we serve. Performance improvement leading to the highest quality health care requires continuous evaluation to identify and address performance gaps and reduce the unintended consequences that may arise in treating a large, vulnerable, and aging population. Quality reporting programs, coupled with public reporting of quality information, are critical to the advancement of health care quality improvement efforts.

      We seek to adopt measures for the HH QRP that promote better, safer, and more efficient care. Valid, reliable, relevant quality measures are fundamental to the effectiveness of our quality reporting programs. Therefore, selection of quality measures is a priority for CMS in all of its quality reporting programs.

      The measures selected will address the measure domains as specified in the IMPACT Act and align with the CMS Quality Strategy, which is framed using the three broad aims of the National Quality Strategy:

      Better Care: Improve the overall quality of care by making healthcare more patient-centered, reliable, accessible, and safe.

      Healthy People, Healthy Communities: Improve the health of the U.S. population by supporting proven interventions to address behavioral, social, and environmental determinants of health in addition to delivering higher-quality care.

      Affordable Care: Reduce the cost of quality healthcare for individuals, families, employers, and government.

      In addition, our measure selection activities for the HH QRP take into consideration input we receive from the MAP. Input from the MAP is located on the MAP PAC LTC Programmatic Deliverable--Final Report Web page at:

      Page 68696

      http://www.qualityforum.org/Publications/2015/02/MAP_PAC-LTC_Programmatic_Deliverable_-_Final_Report.aspx. We also take into account national priorities, such as those established by the National Priorities Partnership at http://www.qualityforum.org/npp/, and the HHS Strategic Plan at: http://www.hhs.gov/secretary/about/priorities/priorities.html.

      We initiated an ad hoc MAP process to review the measures under consideration, in preparation for the adoption into the HH QRP of the measures that we proposed through this fiscal year's rule, in order to begin implementing such measures by 2017. We included under the List of Measures under Consideration (MUC List) measures that the Secretary must make available to the public, as part of the pre-rulemaking process, as described in section 1890A(a)(2) of the Act. The MAP Off-Cycle Measures under Consideration for PAC-LTC Settings can be accessed on the National Quality Forum Web site at: http://www.qualityforum.org/Publications/2015/03/MAP_Off-Cycle_Deliberations_2015_-_Final_Report.aspx. The NQF MAP met in February 2015 and provided input to us as required under section 1890A(a)(3) of the Act. The MAP issued a pre-rulemaking report on March 6, 2015 entitled MAP Off-Cycle Deliberations 2015: Measures under Consideration to Implement Provisions of the IMPACT Act--Final Report, which is available for download at: http://www.qualityforum.org/Publications/2015/03/MAP_Off-Cycle_Deliberations_2015_-_Final_Report.aspx. The MAP's input for the proposed measure is discussed in this section.

      To meet the first specified application date applicable to HHAs under section 1899B(a)(2)(E) of the Act, which is January 1, 2017, we focused on measures that:

      Correspond to a measure domain in sections 1899B(c)(1) or (d)(1) of the Act and are setting-agnostic: for example, falls with major injury and the incidence of pressure ulcers;

      Are currently adopted for 1 or more of our PAC quality reporting programs, are already either NQF-endorsed and in use or finalized for use, or already previewed by the Measure Applications Partnership (MAP) with support;

      Minimize added burden on HHAs;

      Minimize or avoid, to the extent feasible, revisions to the existing items in assessment tools currently in use (for example, the OASIS); and

      Where possible, avoid duplication of existing assessment items.

      As discussed in section V.A. of this final rule, section 1899B(j) of the Act requires that we allow for stakeholder input, such as through town halls, open door forums, and mailbox submissions, before the initial rulemaking process to implement section 1899B. To meet this requirement, we provided the following opportunities for stakeholder input: (1) We convened a Technical Expert Panel (TEP) that included stakeholder experts and patient representatives on February 3, 2015; (2) we provided two separate listening sessions on February 10 and March 24, 2015; (3) we sought public input during the February 2015 ad hoc MAP process regarding the measures under consideration for IMPACT Act domains; (4) we sought public comment as part of our measure maintenance work; and (5) we implemented a public mail box for the submission of comments in January 2015 located at PACQualityInitiative@cms.hhs.gov. The CMS public mailbox can be accessed on our IMPACT Act of 2014 & Cross-Setting Measures Web page at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/Post-Acute-Care-Quality-Initiatives/IMPACT-Act-of-2014-and-Cross-Setting-Measures.html. Lastly, we held a National Stakeholder Special Open Door Forum to seek input on the measures on February 25, 2015.

      In the absence of NQF endorsement on measures for the home health (HH) setting, or measures that are not fully supported by the MAP for the HH QRP, we intend to propose for adoption measures that most closely align with the national priorities discussed above and for which the MAP supports the measure concept. Further discussion as to the importance and high-priority status of these measures in the HH setting is included under each quality measure in this final rule. In addition, for measures not endorsed by the NQF, we have sought, to the extent practicable, to adopt measures that have been endorsed or adopted by a national consensus organization, recommended by multi-

      stakeholder organizations, and/or developed with the input of providers, purchasers/payers, and other stakeholders.

    3. HH QRP Quality Measures and Measures Under Consideration for Future Years

      In the CY 2014 HH PPS final rule, (78 FR 72256-72320), we finalized a proposal to add two claims-based measures to the HH QRP, and stated that we would begin reporting the data from these measures to HHAs beginning in CY 2014. These claims based measures are: (1) Rehospitalization during the first 30 days of HH; and (2) Emergency Department Use without Hospital Readmission during the first 30 days of HH. In an effort to align with other updates to Home Health Compare, including the transition to quarterly provider preview reports, we made the decision to delay the reporting of data from these measures until July 2015 (http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HomeHealthQualityInits/HHQISpotlight.html). Also in that rule, we finalized our proposal to reduce the number of process measures reported on the Certification and Survey Provider Enhanced Reporting (CASPER) reports by eliminating the stratification by episode length for nine (9) process measures. The removal of these measures from the CASPER folders occurred in October 2014. The CMS Home Health Quality Initiative Web site identifies the current HH QRP measures located on the Quality Measures Web page at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HomeHealthQualityInits/HHQIQualityMeasures.html. In addition, as stated in the CY 2012 and CY 2013 HH PPS final rules (76 FR 68575 and 77 FR 67093, respectively), we finalized that we will also use measures derived from Medicare claims data to measure home health quality. This effort ensures that providers do not have an additional burden of reporting quality of care measures through a separate mechanism, and that the costs associated with the development and testing of a new reporting mechanism are avoided.

      (a) We proposed one standardized cross-setting new measure for CY 2016 to meet the requirements of the IMPACT Act. The proposed quality measure addressing the domain of skin integrity and changes in skin integrity is the National Quality Forum (NQF)-endorsed measure: Percent of Residents or Patients with Pressure Ulcers That Are New or Worsened (Short Stay) (NQF #0678) (http://www.qualityforum.org/QPS/0678).

      The IMPACT Act requires the specification of a quality measure to address skin integrity and changes in skin integrity in the home health setting by January 1, 2017. We proposed the implementation of quality measure NQF #0678, Percent of Residents or Patients with Pressure Ulcers that are New or Worsened (Short Stay) in the HH QRP as a cross-

      setting quality measure to meet the requirements of the IMPACT Act for the CY 2018 payment determination and subsequent years. This measure reports the percent of patients with Stage 2 through 4 pressure

      Page 68697

      ulcers that are new or worsened since the beginning of the episode of care.
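      Conceptually, the measure is a simple proportion: among the episodes that qualify for the denominator, the share whose discharge assessment shows at least one Stage 2 through 4 pressure ulcer that is new or has worsened since the start or resumption of care. The short sketch below illustrates only that arithmetic; the field names and the single exclusion shown are illustrative assumptions rather than the official NQF #0678 specifications.

```python
# Illustrative sketch only: the percent of episodes with at least one new or
# worsened Stage 2-4 pressure ulcer. Field names and the single exclusion
# shown here are assumptions, not the official NQF #0678 specifications.

def percent_new_or_worsened(episodes):
    """Return the unadjusted percent of qualifying episodes with one or
    more new or worsened Stage 2-4 pressure ulcers at discharge."""
    denominator = [e for e in episodes if not e.get("died", False)]
    if not denominator:
        return None
    numerator = [e for e in denominator if e.get("new_or_worsened_stage2_4", 0) >= 1]
    return 100.0 * len(numerator) / len(denominator)


episodes = [
    {"new_or_worsened_stage2_4": 0},
    {"new_or_worsened_stage2_4": 1},
    {"new_or_worsened_stage2_4": 0, "died": True},  # excluded from the denominator
]
print(percent_new_or_worsened(episodes))  # 50.0
```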

      Pressure ulcers are high-volume, high-cost adverse events in post-acute care settings. According to the 2014 Prevention and Treatment Guidelines published by the National Pressure Ulcer Advisory Panel, European Pressure Ulcer Advisory Panel, and Pan Pacific Pressure Injury Alliance, pressure ulcer care is estimated to cost approximately $11 billion annually, and between $500 and $70,000 per individual pressure ulcer.\56\ Pressure ulcers are a serious medical condition that results in pain, decreased quality of life, and increased mortality in aging populations.\57\ \58\ \59\ \60\ Pressure ulcers typically are the result of prolonged periods of uninterrupted pressure on the skin, soft tissue, muscle, and bone.\61\ \62\ \63\ Elderly individuals are prone to a wide range of medical conditions that increase their risk of developing pressure ulcers. These include impaired mobility or sensation, malnutrition or undernutrition, obesity, stroke, diabetes, dementia, cognitive impairments, circulatory diseases, dehydration, bowel or bladder incontinence, the use of wheelchairs, the use of medical devices, polypharmacy, and a history of pressure ulcers or a pressure ulcer at admission.\64\ \65\ \66\ \67\ \68\ \69\ \70\ \71\ \72\ \73\ \74\

      ---------------------------------------------------------------------------

      \56\ National Pressure Ulcer Advisory Panel, European Pressure Ulcer Advisory Panel and Pan Pacific Pressure Injury Alliance. Prevention and Treatment of Pressure Ulcers: Clinical Practice Guideline. Emily Haesler (Ed.) Cambridge Media; Osborne Park, Western Australia; 2014.

      \57\ Casey, G. (2013). ``Pressure ulcers reflect quality of nursing care.'' Nurs N Z 19(10): 20-24.

      \58\ Gorzoni, M. L., and S. L. Pires (2011). ``Deaths in nursing homes.'' Rev Assoc Med Bras 57(3): 327-331.

      \59\ Thomas, J. M., et al. (2013). ``Systematic review: health-

      related characteristics of elderly hospitalized adults and nursing home residents associated with short-term mortality.'' J Am Geriatr Soc 61(6): 902-911.

      \60\ White-Chu, E. F., et al. (2011). ``Pressure ulcers in long-

      term care.'' Clin Geriatr Med 27(2): 241-258.

      \61\ Bates-Jensen BM. Quality indicators for prevention and management of pressure ulcers in vulnerable elders. Ann Int Med. 2001;135 (8 Part 2), 744-51.

      \62\ Institute for Healthcare Improvement (IHI). Relieve the pressure and reduce harm. May 21, 2007. Available from http://www.ihi.org/IHI/Topics/PatientSafety/SafetyGeneral/ImprovementStories/FSRelievethePressureandReduceHarm.htm.

      \63\ Russo CA, Steiner C, Spector W. Hospitalizations related to pressure ulcers among adults 18 years and older, 2006 (Healthcare Cost and Utilization Project Statistical Brief No. 64). December 2008. Available from http://www.hcupus.ahrq.gov/reports/statbriefs/sb64.pdf.

      \64\ Agency for Healthcare Research and Quality (AHRQ). Agency news and notes: pressure ulcers are increasing among hospital patients. January 2009. Available from http://www.ahrq.gov/research/jan09/0109RA22.htm.

      \65\ Bates-Jensen BM. Quality indicators for prevention and management of pressure ulcers in vulnerable elders. Ann Int Med. 2001;135 (8 Part 2), 744-51.

      \66\ Cai, S., et al. (2013). ``Obesity and pressure ulcers among nursing home residents.'' Med Care 51(6): 478-486.

      \67\ Casey, G. (2013). ``Pressure ulcers reflect quality of nursing care.'' Nurs N Z 19(10): 20-24.

      \68\ Hurd D, Moore T, Radley D, Williams C. Pressure ulcer prevalence and incidence across post-acute care settings. Home Health Quality Measures & Data Analysis Project, Report of Findings, prepared for CMS/OCSQ, Baltimore, MD, under Contract No. 500-2005-

      000181 TO 0002. 2010.

      \69\ MacLean DS. Preventing & managing pressure sores. Caring for the Ages. March 2003;4(3):34-7. Available from http://www.amda.com/publications/caring/march2003/policies.cfm.

      \70\ Michel, J. M., et al. (2012). ``As of 2012, what are the key predictive risk factors for pressure ulcers? Developing French guidelines for clinical practice.'' Ann Phys Rehabil Med 55(7): 454-

      465

      \71\ National Pressure Ulcer Advisory Panel (NPUAP) Board of Directors; Cuddigan J, Berlowitz DR, Ayello EA (Eds). Pressure ulcers in America: prevalence, incidence, and implications for the future. An executive summary of the National Pressure Ulcer Advisory Panel Monograph. Adv Skin Wound Care. 2001;14(4):208-15

      \72\ Park-Lee E, Caffrey C. Pressure ulcers among nursing home residents: United States, 2004 (NCHS Data Brief No. 14). Hyattsville, MD: National Center for Health Statistics, 2009. Available from http://www.cdc.gov/nchs/data/databriefs/db14.htm.

      \73\ Reddy, M. (2011). ``Pressure ulcers.'' Clin Evid (Online) 2011.

      \74\ Teno, J. M., et al. (2012). ``Feeding tubes and the prevention or healing of pressure ulcers.'' Arch Intern Med 172(9): 697-701.

      ---------------------------------------------------------------------------

      The IMPACT Act requires the specification of quality measures that are harmonized across PAC settings. This requirement is consistent with the NQF Steering Committee report, which stated that to understand the impact of pressure ulcers across settings, quality measures addressing prevention, incidence, and prevalence of pressure ulcers must be harmonized and aligned.\75\ NQF #0678, Percent of Residents or Patients with Pressure Ulcers That Are New or Worsened (Short Stay) is NQF-

      endorsed and has been successfully implemented using a harmonized set of data elements in IRF, LTCH, and SNF settings. A new item, M1309, was previously added to the OASIS-C1/ICD-9 version to collect data on new and worsened pressure ulcers in home health patients to support harmonization with NQF #0678, and data collection for this item began January 1, 2015. A new measure, based on this item, was included in the 2014 MUC list and received conditional endorsement from the National Quality Forum. That measure was harmonized with NQF #0678, but differed in the consideration of unstageable pressure ulcers. In this rule, we proposed a HH measure that is fully standardized with NQF #0678.

      ---------------------------------------------------------------------------

      \75\ National Quality Forum. National voluntary consensus standards for developing a framework for measuring quality for prevention and management of pressure ulcers. April 2008. Available from http://www.qualityforum.org/Projects/Pressure_Ulcers.aspx.

      ---------------------------------------------------------------------------

      A TEP convened by our measure development contractor provided input on the technical specifications of this quality measure, including the feasibility of implementing the measure across PAC settings. The TEP was supportive of the implementation of this measure across PAC settings and supported CMS's efforts to standardize this measure for cross-setting development. Additionally, the NQF MAP met on February 9, 2015 and February 27, 2015 and provided input to CMS. The MAP supported the use of NQF #0678, Percent of Residents or Patients with Pressure Ulcers that are New or Worsened (Short Stay) in the HH QRP as a cross-

      setting quality measure implemented under the IMPACT Act. More information about the MAP's recommendations for this measure is available on the National Quality Forum Web site at: http://www.qualityforum.org/Publications/2015/02/MAP_PAC-LTC_Programmatic_Deliverable_-_Final_Report.aspx.

      We proposed that data for the standardized quality measure would be collected using the OASIS-C1 with submission through the Quality Improvement and Evaluation System (QIES) Assessment Submission and Processing (ASAP) system. HHAs began submitting data for the OASIS items used to calculate NQF #0678, the Percent of Residents or Patients with Pressure Ulcers That Are New or Worsened (Short Stay), as part of the Home Health Quality Initiative to assess the number of new or worsened pressure ulcers in January 2015. By building on the existing reporting and submission infrastructure for HHAs, we intend to minimize the administrative burden related to data collection and submission for this measure under the HH QRP. For more information on HH reporting using the QIES ASAP system, refer to OASIS User Manual Web page at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HomeHealthQualityInits/HHQIOASISUserManual.html and http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/OASIS/index.html?redirect=/oasis/.

      Data collected through the OASIS-C1 would be used to calculate this quality measure. Data items in the OASIS-C1 include M1308 (Current Number of Unhealed Pressure Ulcers at Each Stage or Unstageable) and M1309 (Worsening in Pressure Ulcer Status Since SOC/ROC). Data collected through the OASIS-C1 would be used for risk adjustment of this measure. We

      Page 68698

      anticipate that risk adjustment items will include, but not be limited to, M1850 (Activities of Daily Living Assistance, Transferring) and M1620 (Bowel Incontinence Frequency). OASIS-C1 items M1016 (Diagnoses Requiring Medical or Treatment Change Within Past 14 Days), M1020 (Primary Diagnoses), and M1022 (Other Diagnoses) would be used to identify patients with a diagnosis of peripheral vascular disease, diabetes, or malnutrition. More information about the OASIS items is available in the downloads section of the Home Health Quality Measures Web page at: https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HomeHealthQualityInits/HHQIQualityMeasures.html.
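      To make the preceding description concrete, the sketch below shows one plausible way that yes/no covariate flags might be derived from a single OASIS-C1 assessment record. The item numbers are those named above; the response-value thresholds, dictionary keys, and ICD-10 prefixes are assumptions for illustration only and are not CMS's published specifications.

```python
# Illustrative derivation of risk-adjustment flags from OASIS-C1 items.
# The item numbers follow the rule text (M1850, M1620, M1020/M1022); the
# response values, thresholds, and ICD-10 prefixes are assumptions only.

ASSUMED_DIABETES_PREFIXES = {"E10", "E11"}   # hypothetical ICD-10 prefixes
ASSUMED_PVD_PREFIXES = {"I73"}               # hypothetical ICD-10 prefix

def risk_covariates(assessment):
    """Build a dictionary of 0/1 covariates from one OASIS-C1 assessment."""
    diagnoses = assessment.get("M1020", []) + assessment.get("M1022", [])
    return {
        "transfer_dependence": int(assessment.get("M1850", 0) >= 2),
        "bowel_incontinence": int(assessment.get("M1620", 0) >= 1),
        "diabetes": int(any(code[:3] in ASSUMED_DIABETES_PREFIXES for code in diagnoses)),
        "peripheral_vascular_disease": int(any(code[:3] in ASSUMED_PVD_PREFIXES for code in diagnoses)),
    }


example = {"M1850": 3, "M1620": 0, "M1020": ["E11.9"], "M1022": ["I10"]}
print(risk_covariates(example))
```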

      The specifications and data items for NQF #0678, the Percent of Residents or Patients with Pressure Ulcers that are New or Worsened (Short Stay), are available in the downloads section of the Home Health Quality Measures Web page at: https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HomeHealthQualityInits/HHQIQualityMeasures.html.

      As part of our ongoing measure development efforts, we considered a future update to the numerator of the quality measure NQF #0678, Percent of Residents or Patients with Pressure Ulcers that are New or Worsened (Short Stay). This update would hold providers accountable for the development of unstageable pressure ulcers and suspected deep tissue injuries (sDTIs). Under this proposed change, the numerator of the quality measure would be updated to include unstageable pressure ulcers, including sDTIs, that are new or developed while the patient is receiving home health care, as well as Stage 1 or 2 pressure ulcers that become unstageable due to slough or eschar (indicating progression to a full thickness wound, that is, a Stage 3 or 4 pressure ulcer) after admission. This would be consistent with the specifications of the ``New and Worsened Pressure Ulcer'' measure for HH patients presented to the MAP on the 2014 MUC list. We did not propose the implementation of this change (that is, including sDTIs and unstageable pressure ulcers in the numerator) in the HH QRP, but solicited public feedback on this potential area of measure development.
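      The considered numerator change can be summarized as a small piece of logic: a patient counts under the current numerator with a new or worsened Stage 2 through 4 pressure ulcer, and would additionally count under the expanded numerator with a new unstageable pressure ulcer or sDTI, or a Stage 1 or 2 ulcer that became unstageable due to slough or eschar. The sketch below restates that logic; the field names are hypothetical, and this is not an adopted specification.

```python
# Illustration of the numerator logic described above. The boolean field
# names are hypothetical; this is not a proposed or adopted specification.

def in_current_numerator(patient):
    # Current measure: new or worsened Stage 2-4 pressure ulcer.
    return bool(patient.get("new_or_worsened_stage2_4", False))

def in_expanded_numerator(patient):
    # Considered update: also count new unstageable ulcers (including sDTIs)
    # and Stage 1/2 ulcers that became unstageable due to slough or eschar.
    return (
        in_current_numerator(patient)
        or bool(patient.get("new_unstageable_or_sdti", False))
        or bool(patient.get("stage1_2_became_unstageable", False))
    )


patient = {"new_or_worsened_stage2_4": False, "new_unstageable_or_sdti": True}
print(in_current_numerator(patient), in_expanded_numerator(patient))  # False True
```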

      Our measure development contractor convened a cross-setting pressure ulcer TEP that strongly recommended that CMS hold providers accountable for the development of new unstageable pressure ulcers and sDTIs by including these pressure ulcers in the numerator of the quality measure. Although the TEP acknowledged that unstageable pressure ulcers and sDTIs cannot and should not be assigned a numeric stage, panel members recommended that these be included in the numerator of NQF #0678, the Percent of Residents or Patients with Pressure Ulcers That Are New or Worsened (Short Stay), as new pressure ulcers if developed during a home health episode. The TEP also recommended that a Stage 1 or 2 pressure ulcer that becomes unstageable due to slough or eschar should be considered worsened because the presence of slough or eschar indicates a full thickness (equivalent to Stage 3 or 4) wound.\76\ \77\ These recommendations were supported by technical and clinical advisors and the National Pressure Ulcer Advisory Panel.\78\ Additionally, exploratory data analysis conducted by our measure development contractor suggested that the addition of unstageable pressure ulcers, including sDTIs, would increase the observed incidence of new or worsened pressure ulcers at the agency level and may improve the ability of the quality measure to discriminate between poor- and high-performing facilities.

      ---------------------------------------------------------------------------

      \76\ Schwartz, M., Nguyen, K.H., Swinson Evans, T.M., Ignaczak, M.K., Thaker, S., and Bernard, S.L.: Development of a Cross-Setting Quality Measure for Pressure Ulcers: OY2 Information Gathering, Final Report. Centers for Medicare & Medicaid Services, November 2013. Available: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/Post-Acute-Care-Quality-Initiatives/Downloads/Development-of-a-Cross-Setting-Quality-Measure-for-Pressure-Ulcers-Information-Gathering-Final-Report.pdf.

      \77\ Schwartz, M., Ignaczak, M.K., Swinson Evans, T.M., Thaker, S., and Smith, L.: The Development of a Cross-Setting Pressure Ulcer Quality Measure: Summary Report on November 15, 2013, Technical Expert Panel Follow-Up Webinar. Centers for Medicare & Medicaid Services, January 2014. Available: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/Post-Acute-Care-Quality-Initiatives/Downloads/Development-of-a-Cross-Setting-Pressure-Ulcer-Quality-Measure-Summary-Report-on-November-15-2013-Technical-Expert-Pa.pdf.

      \78\ Schwartz, M., Nguyen, K.H., Swinson Evans, T.M., Ignaczak, M.K., Thaker, S., and Bernard, S.L.: Development of a Cross-Setting Quality Measure for Pressure Ulcers: OY2 Information Gathering, Final Report. Centers for Medicare & Medicaid Services, November 2013. Available: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/Post-Acute-Care-Quality-Initiatives/Downloads/Development-of-a-Cross-Setting-Quality-Measure-for-Pressure-Ulcers-Information-Gathering-Final-Report.pdf.

      ---------------------------------------------------------------------------

      In addition, we also considered whether body mass index (BMI) should be used as a covariate for risk-adjusting NQF #0678 in the home health setting, as is done in other post-acute care settings. We invited public feedback to inform our direction to include unstageable pressure ulcers and sDTIs in the numerator of the quality measure NQF #0678 Percent of Residents or Patients with Pressure Ulcers that are New or Worsened (Short Stay), as well as on the possible collection of height and weight data for risk-adjustment, as part of our future measure development efforts.

      We invited public comment on our proposal to adopt NQF #0678 Percent of Residents or Patients with Pressure Ulcers that are New or Worsened (Short Stay) for the HH QRP to fulfill the requirements of the IMPACT Act for CY 2018 HH payment determination and subsequent years. The following is a summary of the comments received and our responses.

      Comment: The majority of commenters supported the addition of the proposed quality measure, the Percent of Residents or Patients with Pressure Ulcers That Are New or Worsened (NQF #0678) to the Home Health Quality Reporting Program. Commenters appreciated that CMS chose a measure that uses data home health agencies already collect.

      Response: We appreciate the commenters' support for implementing the proposed quality measure, the Percent of Residents or Patients with Pressure Ulcers That Are New or Worsened (NQF #0678).

      Comment: A few commenters raised concerns about the fairness of using NQF #0678 to compare performance within home health and across PAC providers. One commenter noted that pressure ulcer improvement is challenging to measure in limited timeframes and disadvantages providers serving frailer populations and requested CMS consider risk adjustment based on sociodemographic, diagnostic, and care coordination factors. Commenters also recommended that CMS take into account the discrepancy in the control providers have over patient care in home health, relative to institutional settings. Another commenter additionally raised concerns about the reliability of the implementation of the Wound, Ostomy, and Continence Nurses (WOCN) Society guidelines used in staging pressure ulcers, and the lack of information about the status of the wound beyond staging while the patient is in the care of the provider. In addition, one commenter recommended that CMS conduct ongoing evaluation of the risk adjustment methodology for this proposed quality measure.

      Response: We appreciate the commenters' concerns about ensuring fair comparisons within and across PAC settings. We also recognize that such comparisons should take into account the discrepancy in the control that providers have over patient care in home health,

      Page 68699

      relative to institutional settings. We are committed to developing risk models that take into account differences in patient characteristics, including chronic conditions and frailty. We believe that as with provider services within institutional settings, home health agencies aim to provide high quality care and therefore assess for and put into place care planning and services that mitigate poor quality outcomes. However, we will also take into account potential variation that may exist in relation to home based services as opposed to institutional services. Therefore, as part of measure maintenance, we intend to continue to evaluate for risk factors associated with pressure ulcers including those unique to the individuals receiving home health services. We intend to provide specific guidance through the OASIS manual and provider trainings to support clinicians in appropriately coding the stages of the pressure ulcers. In addition, we plan to conduct field testing on all the new and revised OASIS items that support the IMPACT Act measures, to assess inter-rater reliability and to further refine guidance and training.

      This proposed quality measure recently underwent review by CMS's measure development contractor as part of its measure maintenance. Under Technical Expert Panel review, which included national experts and members of various professional wound organizations such as the National Pressure Ulcer Advisory Panel (NPUAP), the current staging was not adjusted. We confirm our commitment to ongoing monitoring and re-

      evaluation of the risk models for all applicable outcome measures.

      While we appreciate these comments and the importance of the role that sociodemographic status plays in the care of patients, we continue to have concerns about holding providers to different standards for the outcomes of their patients of low sociodemographic status because we do not want to mask potential disparities or minimize incentives to improve the outcomes of disadvantaged populations. We routinely monitor the impact of sociodemographic status on facilities' results on our measures.

      NQF is currently undertaking a 2-year trial period in which new measures and measures undergoing maintenance review will be assessed to determine if risk-adjusting for sociodemographic factors is appropriate for each measure. For 2 years, NQF will conduct a trial of a temporary policy change that will allow inclusion of sociodemographic factors in the risk-adjustment approach for some performance measures. At the conclusion of the trial, NQF will determine whether to make this policy change permanent. Measure developers must submit information such as analyses and interpretations as well as performance scores with and without sociodemographic factors in the risk adjustment model.

      Furthermore, the Office of the Assistant Secretary for Planning and Evaluation (ASPE) is conducting research to examine the impact of socioeconomic status on quality measures, resource use, and other measures under the Medicare program as directed by the IMPACT Act. We will closely examine the findings of these reports and related Secretarial recommendations and consider how they apply to our quality programs at such time as they are available.

      Comment: A commenter expressed concern that the proposed implementation of NQF #0678 did not include risk adjustment, just exclusion of patients who die.

      Response: The Percent of Residents or Patients with Pressure Ulcers That Are New or Worsened (NQF #0678) is risk-adjusted based on an evaluation of covariates that predict the outcome, including low body mass, diabetes, arterial and peripheral vascular disease, bed mobility, and bowel incontinence. As stated in the CY 2016 HH PPS proposed rule, a discussion pertaining to risk adjustment for this measure can be found in the downloads section on the Home Health Quality Measures Web page at: https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HomeHealthQualityInits/HHQIQualityMeasures.html.
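      Risk adjustment of this kind is generally applied by estimating, for each patient, a predicted probability of the outcome from the covariates and then comparing an agency's observed count with the sum of those predictions. The rule does not publish a model form or coefficients here; the sketch below assumes a simple logistic model with invented coefficients purely to illustrate the observed-to-expected comparison.

```python
import math

# Purely illustrative logistic risk model: the coefficients and covariate
# names are invented and do not reflect CMS's published risk-adjustment model.
ASSUMED_COEFFICIENTS = {
    "intercept": -3.0,
    "low_body_mass": 0.8,
    "diabetes": 0.4,
    "peripheral_vascular_disease": 0.5,
    "bowel_incontinence": 0.6,
}

def predicted_probability(covariates):
    """Logistic prediction of a new or worsened pressure ulcer for one patient."""
    z = ASSUMED_COEFFICIENTS["intercept"] + sum(
        ASSUMED_COEFFICIENTS[name] * value
        for name, value in covariates.items()
        if name in ASSUMED_COEFFICIENTS
    )
    return 1.0 / (1.0 + math.exp(-z))

def observed_to_expected(observed_events, patient_covariates):
    """Compare an agency's observed event count with its risk-model expectation."""
    expected = sum(predicted_probability(c) for c in patient_covariates)
    return observed_events / expected if expected else None


patients = [{"low_body_mass": 1, "diabetes": 1}, {"bowel_incontinence": 1}, {}]
print(round(observed_to_expected(1, patients), 2))
```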

      Comment: One commenter appreciated the revision in the organization of the pressure ulcer items in section M1308 that makes the section easier to understand and suggested similar revisions to other items. The commenter also questioned why data on the number and stage of pressure ulcers was collected on both M1309 and M1308, noting that this might confuse clinicians. This commenter suggested deleting M1309 and making additional revisions to M1308 to capture the number of new or worsened pressure ulcers since the most recent SOC/ROC, and further suggested adding M1308 at recertification. Another commenter noted that OASIS Item M1309 is complex and recommended CMS develop an algorithm to assist HHAs with completing this item, adding that this complexity may lead to a wide variation of responses from HHAs and affect data reliability. This commenter further noted that home health agencies might be reliant on caregivers and patients to follow instructions related to pressure ulcer prevention in order to achieve quality outcomes for pressure ulcers.

      Response: We appreciate the commenters' positive feedback on item M1308 and suggestions related to M1309 in the current OASIS-C1 item set, which we will take into consideration. We wish to clarify that M1308 would be collected at recertification. We also wish to clarify that the revised version of M1309 builds upon the current version of this item in the OASIS instrument and has been adjusted to be standardized to ensure comparable data capture of these items across the PAC settings. We appreciate the potential for confusion between items M1308 and M1309. The items used in the skin assessment that inform this measure were tested during the development of the Minimum Data Set version 3.0. The inter-rater reliability and validity of these items were very strong, suggesting that there was little confusion in the coding of these items by clinicians. We believe that training is important in assuring accurate assessments and OASIS coding. Therefore, we plan to issue new guidance on these items, as part of the update to the OASIS manual, well in advance of their implementation, and to provide further support through training and other education materials. We appreciate the unique role of patients and caregivers in achieving quality health outcomes in the home setting, where skilled care is intermittent in nature. We believe that as part of home health services, the provider ensures that adequate person- and family-centered education is provided to help in the avoidance and mitigation of pressure ulcers or other events. Thus, CMS has implemented several process measures in the HH QRP that assess whether care plans and other best practices have been implemented to help patients achieve the best possible outcomes.

      Comment: A commenter noted strong support for assessing and considering other wounds in addition to pressure ulcers when determining the clinical and functional status of the patient. This commenter additionally recommended that CMS expand the list of active diagnoses that are typically barriers to good outcomes and clarify whether these are diagnoses or symptomology.

      Response: We appreciate the comment supporting assessment and monitoring all wounds, as well as the recommendation to expand the list of active diagnoses. We believe that as part of providing quality care, home health

      Page 68700

      agencies assess, care for, document, and ensure surveillance of all wound types. We will consider this feedback in future refinements of this proposed quality measure. In addition, we will consider expanding the items referencing active diagnoses and better clarifying whether items are referencing new diagnoses or symptomology of a disease.

      Comment: Several commenters addressed the collection of a patient's height and weight in the OASIS in order to calculate body mass index (BMI) as a risk adjustor for this proposed quality measure. CMS received several comments in support of the proposal of this quality measure. One commenter supported the efforts to standardize data to improve data accuracy and to help facilitate best practices for the prevention of pressure ulcers, while assuring appropriate care for pressure ulcers is given in all settings. The commenter noted the relevance of low BMI to the incidence of pressure ulcers and recommended that CMS consider evaluating high BMI as a risk factor for developing new or worsened pressure ulcers. One commenter believed that CMS should not use BMI obtained in the home health setting, suggesting that physician offices and care centers obtain such information. One commenter did not support the use of BMI as a covariate for the New or Worsened Pressure Ulcer proposed quality measure without additional evidence of its relevance in the home care setting.

      Several commenters expressed concerns about the situations in which providers are unable to collect accurate height and weight data in the home care setting safely, including situations such as, but not limited to, bedbound patients who are unable to stand on scales or whose self-

      reported height may be invalid due to memory deficits. Commenters additionally cited the lack of appropriate equipment to obtain this information in the home, including scales and Hoyer lifts for patients who cannot transfer. An additional commenter recommended that CMS add an option box to the new OASIS items to allow coding for those patients who cannot be weighed. Finally, one commenter requested clarification of ``base weight'' and the expectation for recording a weight that is measured during the visit versus a weight which could be reported by the patient when they are weighed in their home or based on a recent healthcare provider appointment or hospitalization.

      Response: We appreciate the comments received pertaining to the relevance of low BMI as a risk factor for developing pressure ulcers, the inclusion of low BMI in the measure and the suggestion that we evaluate the inclusion of high BMI as a risk factor for pressure ulcers. We further appreciate the comments regarding the challenge of obtaining height and weight data in the home for home health patients. This information is collected in order to standardize risk adjustment for measuring the incidence of new and worsened pressure ulcers to facilitate the comparison of quality data within and across post-acute care settings for this outcome measure.

      Low body mass index, which is derived from a patient's height and weight, is a known correlate of developing pressure ulcers. We recognize that there will be instances in which obtaining height and weight cannot occur, and coding response options will be available in order to indicate when such data cannot be obtained. We intend to issue specific guidance through the OASIS manual on obtaining these data, including a definition of ``base weight''. We will also offer support through training, Open Door Forums, and other communication mechanisms.
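      Because the discussion turns on deriving BMI from height and weight, the arithmetic is worth restating: BMI is weight in kilograms divided by the square of height in meters. The cutoff used in the sketch below to flag low BMI is an assumption for illustration; the measure specifications, not this example, define any threshold actually used.

```python
# BMI = weight (kg) / height (m) squared. The 19.0 low-BMI cutoff below is an
# illustrative assumption, not a threshold defined in the measure specifications.

def body_mass_index(weight_kg, height_m):
    return weight_kg / (height_m ** 2)

def is_low_bmi(weight_kg, height_m, cutoff=19.0):
    return body_mass_index(weight_kg, height_m) < cutoff


print(round(body_mass_index(50.0, 1.65), 1))  # 18.4
print(is_low_bmi(50.0, 1.65))                 # True
```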

      In response to the commenter who suggested that physician offices and wound care centers obtain information related to height and weight, we will take this feedback into consideration in our ongoing maintenance of this proposed quality measure. In the cross-setting Technical Expert Panel held by our measure contractor, it was advised that we continue to use BMI, as collected, to indicate low body mass. We appreciate those comments that suggest enhancements to the measure's risk adjustment and we will take into consideration revisions to the measure and risk adjustment model in our ongoing maintenance of the measure.

      Comment: One commenter expressed support for the integration of unstageable pressure ulcers and sDTIs into the measure, and stressed the importance of education on the additional options prior to implementing this change, citing the challenges to correct staging and the importance of inter-rater reliability across PAC settings.

      Response: We appreciate the feedback on future integration of unstageable pressure ulcers and sDTIs into this measure, and will consider it when undertaking any revisions. We also appreciate the commenter's emphasis on the importance of education and training as the OASIS is revised and the quality measures are developed. We historically have provided, and will continue to provide, comprehensive training each time the assessment items change. In addition to the manual and training sessions, we will provide training materials through the CMS webinars, open door forums, and help desk support. As noted previously, item testing revealed very strong inter-rater reliability. Additionally, through the measure development and maintenance process, we will continue to test this proposed measure's reliability and validity across settings.

      Final Action: After consideration of the comments received, we are finalizing as proposed the adoption of NQF #0678 Percent of Residents or Patients with Pressure Ulcers that are New or Worsened (Short Stay) for use in the HH QRP for CY 2018 HH payment determination and subsequent years.

      Page 68701

      Table 19--Future Cross-Setting Measure Constructs Under Consideration To Meet IMPACT Act Requirements

      Home Health Timeline for Implementation--January 1, 2017

      IMPACT Act Domain: Measures to reflect all-condition risk-adjusted potentially preventable hospital readmission rates.
      Measures: Application of (NQF #2510): Skilled Nursing Facility 30-Day All-Cause Readmission Measure (SNFRM). CMS is the steward. Application of the LTCH/IRF All-Cause Unplanned Readmission Measure for 30 Days Post Discharge from LTCHs/IRFs.

      IMPACT Act Domain: Resource Use, including total estimated Medicare spending per beneficiary.
      Measure: Payment Standardized Medicare Spending Per Beneficiary (MSPB).

      IMPACT Act Domain: Discharge to community.
      Measure: Percentage residents/patients at discharge assessment, who discharged to a higher level of care versus to the community.

      IMPACT Act Domain: Medication Reconciliation.
      Measure: Percent of patients for whom any needed medication review actions were completed.

      We also identified four future cross-setting measure constructs to potentially meet requirements of the IMPACT Act domains of: (1) All-

      condition risk-adjusted potentially preventable hospital readmission rates; (2) resource use, including total estimated Medicare spending per beneficiary; (3) discharge to community; and (4) medication reconciliation. These are shown in Table 19; we solicited public feedback to inform future measure development of these constructs as it relates to meeting the IMPACT Act requirements in these areas. These measures will be proposed in future rulemaking. The comments we received on this topic, with our responses, are summarized below.

      Comment: One commenter encouraged CMS to include clinical experts in the development of measures for cognition, expressive and receptive language, and swallowing, stressing that without clinical expertise, substandard data, barriers to data collection, and risks to improving patient outcomes could result. The commenter asked that these suggested measures be considered as items of function and not exclusively as risk adjustors. This commenter supported the risk adjustment of all outcome measures based on key case-mix variables due to the variability of patients treated in PAC settings.

      Response: We intend to incorporate clinical expertise in our ongoing measure refinement activities to better inform the development of these quality measures. One way we incorporate this form of clinical input is through the inclusion of Technical Expert Panels supported by the quality measurement development contractor. We also encourage public input on our measure development, and comments may be submitted to our quality reporting program email at HomeHealthQualityQuestions@cms.hhs.gov.

We are working toward developing quality measures that assess areas of cognition and expression, recognizing that these quality topic domains are intrinsically linked to the domains of function and cognitive function. In this measure development, we will take into consideration the variability of the PAC population and the appropriate risk adjustment based on case-mix. In addition, we will take into consideration the suggestion that these measures operate as items of function and not exclusively as risk adjustors.

      Comment: One commenter requested that CMS consider the CARE-C and CARE-F items based on the National Outcomes Measurement System (NOMS) to capture communication, cognition, and swallowing as additional measures to be adopted in post-acute care settings for future measures.

      Response: We appreciate the suggestion that we consider refinements to functional items such as communication, cognition, and swallowing, which may provide a more meaningful picture of patients with impairments in these areas. We will consider these recommendations in our item, measure, and testing efforts for both measure development as well as standardized assessment domain development.

Comment: One commenter expressed concern regarding the cross-setting all-cause potentially preventable hospital readmissions measure. The commenter suggested that additional research on the effectiveness of this measure be pursued. The commenter proposed that the measure include rewards for sustained achievement as well as for improvement, and that actions outside of the agency's control (for example, timely physician signatures on orders) be taken into consideration in the application of the all-cause readmission measure. In addition, the commenter recommended that CMS consider risk adjustment to address family-requested hospitalizations and increased risk of hospitalization due to select diagnoses and comorbidities.

One commenter noted difficulty in providing meaningful comment on specific measures and measure constructs without further information. Regarding the measure ``Percent of patients for whom any needed medication review actions were completed,'' the commenter stated that it is unclear from the table how one would determine whether a medication review action is needed for purposes of the measure. One commenter stated that they needed additional time to review more thoroughly and planned to provide further feedback in the future.

      Finally, one commenter recommended the inclusion of nurse practitioners in both the development and implementation of care plans based on quality indicators.

Response: We appreciate the commenters' feedback and suggestions regarding the cross-setting all-cause potentially preventable hospital readmissions measure, and will consider them in future revisions. We intend to risk adjust this outcome measure, based on evaluation of statistically significant covariates, including diagnoses and comorbidities.

We appreciate the comments pertaining to the quality measure, the percent of patients for whom any needed medication review actions were completed. As we continue to develop and test this measure construct, we will make information about the measure specifications available by posting the specifications on our Web site and through public comment periods. We recognize the need for transparency as we move forward to implement the IMPACT Act and will continue to engage stakeholders to ensure that our approach to measure development and implementation is communicated in an open and informative manner. We

      Page 68702

      would like to note that anyone can submit feedback on the measures by means of our mailbox PACQualityInitiative@cms.hhs.gov.

      Finally, we appreciate the important role played by nurse practitioners in patient health and home care outcomes, and encourage their participation through the variety of modes of stakeholder engagement noted above.

      We will take all comments into consideration when developing and modifying assessment items and quality measures.

      Table 20--Future Setting-Specific Measure Constructs Under Consideration

------------------------------------------------------------------------
National Quality Strategy Domain: Safety.
Measure Constructs:
  Falls risk composite process measure: Percentage of home health patients who were assessed for falls risk and whose care plan reflects the assessment, and which was implemented appropriately.
  Nutrition assessment composite measure: Percentage of home health patients who were assessed for nutrition risk with a validated tool and whose care plan reflects the assessment, and which was implemented appropriately.

National Quality Strategy Domain: Effective Prevention and Treatment.
Measure Constructs:
  Improvement in Dyspnea in Patients with a Primary Diagnosis of Congestive Heart Failure (CHF), Chronic Obstructive Pulmonary Disease (COPD), and/or Asthma: Percentage of home health episodes of care during which a patient with a primary diagnosis of CHF, asthma, and/or COPD became less short of breath or dyspneic.
  Improvement in Patient-Reported Interference due to Pain: Percent of home health patients whose self-reported level of pain interference on the Patient-Reported Outcomes Measurement Information System (PROMIS) tool improved.
  Improvement in Patient-Reported Pain Intensity: Percent of home health patients whose self-reported level of pain severity on the PROMIS tool improved.
  Improvement in Patient-Reported Fatigue: Percent of home health patients whose self-reported level of fatigue on the PROMIS tool improved.
  Stabilization in 3 or more Activities of Daily Living (ADLs): Percent of home health patients whose functional scores remain the same between admission and discharge for at least 3 ADLs.
------------------------------------------------------------------------

(b) We worked with our measure development and maintenance contractor to identify setting-specific measure concepts for future implementation in the HH QRP that align with or complement current measures and new measures to meet domains specified in the IMPACT Act. In identifying priority areas for future measure enhancement and development, we took into consideration the results of environmental scans and the resulting gap analysis for relevant home health quality measure constructs, along with input from numerous stakeholders, including the Measures Application Partnership (MAP), the Medicare Payment Advisory Commission (MedPAC), and Technical Expert Panels, as well as national priorities, such as those established by the National Priorities Partnership, the HHS Strategic Plan, the National Strategy for Quality Improvement in Healthcare, and the CMS Quality Strategy. Based on this input, CMS identified several high priority concept areas for future measure development, which are listed in Table 20.

      These measure concepts are under development, and details regarding measure definitions, data sources, data collection approaches, and timeline for implementation will be communicated in future rulemaking. We invited feedback about these seven high priority concept areas for future measure development. Public comments and our responses to comments are summarized below.

      Comment: Multiple commenters expressed support for the potential constructs for future development, and especially cited stabilization in function. One commenter expressed appreciation that the basic timeline for implementation of future measures is consistent with the IMPACT Act's requirements.

      One commenter recommended four new quality measure constructs related to family caregivers. These included: Home health agency documentation of whether the beneficiary has a family caregiver; whether the care or discharge plan relies on the family caregiver to provide assistance; whether the family caregiver was provided supports they need as part of the plan after determining the need for such supports; and family caregiver experience of care. A few commenters recommended that CMS ensure new measures provide meaningful information and minimize burden.

One commenter urged CMS to provide clear and transparent explanations of measure specifications, and to provide as much information as possible about the measures proposed. One commenter recommended that CMS only use measures after they have been tested in the home health setting and proven to have meaningful risk adjustment, as well as to be person-centered and realistic for patients' disease states. Two commenters recommended that CMS consider consolidating or removing measures prior to expanding the current set of measures to minimize administrative burden. One commenter additionally noted that some existing measures could prove to be redundant or unnecessary when the IMPACT Act measures are implemented. A few commenters encouraged CMS to employ a transparent process for measure development that allows for multiple avenues for stakeholder input. One commenter welcomed the opportunity to work with CMS in the development of these measures and their specifications.

In response to the specific constructs listed in the notice of proposed rulemaking, one commenter said that a nutrition assessment conducted in the home setting, to support a nutritional assessment process measure, must comprise data elements that would not be included in a facility assessment, such as access to, and resources for, food shopping. This commenter additionally recommended that new measures take into account patient-centered decisions and goals, including refusal of care, and balance these against provider accountability.

      MedPAC expressed concern about the number of quality measures in the Medicare Program, specifically the number currently used in the HH QRP. MedPAC suggested that prior to expanding the current set of measures in the HH QRP, CMS should consider

      Page 68703

      whether any of the current measures can be consolidated or removed, recognizing that some measures are proposed in response to legislation. MedPAC further suggested that CMS consider whether any of its measures are unnecessary or redundant for the HH QRP, once the IMPACT Act measures are implemented.

      Response: We appreciate the feedback on potential constructs for future measure development and concur with the importance of valid and reliable stabilization measures for home health patients. Additionally, we agree that caregiver constructs are high priority areas to consider for future measure development.

      With all new measure development, we are committed to assessing the burden and utility of proposed measures, through Technical Expert Panels, public comment periods and other opportunities for stakeholder input. In addition, we are planning to conduct field testing of new and existing OASIS items to assess their reliability, validity and relevance in the home health setting. This field testing will inform new measure development.

We agree with MedPAC, as well as other commenters, regarding the importance of a modest set of measures for the HH QRP and are re-evaluating the entire set to determine which measures are candidates for revision or retirement. CMS's measure contractor has convened a Technical Expert Panel of providers, caregiver representatives, and other clinical experts to aid in the re-evaluation process. This process has included: (1) Analysis of historical measure trends, as well as reliability, validity and variability; (2) a review of the scientific basis for the measure construct in the literature and guidelines; and (3) feedback on the value of the measures to providers and patients for assessing and improving quality. Ongoing evaluation of measures used in HH QRP will continue as measures intended to satisfy the IMPACT Act's specified domains are made operational.

In the current HH QRP, outcome measures are risk-adjusted for a wide array of covariates, and these risk models undergo periodic review and updating. We would extend this practice to new outcome measures as appropriate.

We recognize the unique circumstances of home health patients, who have greater control over, and potentially greater barriers to, maintaining good nutritional status. Additionally, we recognize that home health patients may make decisions that align with their personal choice but may be at odds with high quality outcomes.

      Comment: One commenter recommended that the OASIS capture information on cerebral palsy, traumatic brain injury, and cognitive impairment for long-term home health patients.

      Response: We appreciate the commenter's recommendation to capture information on the OASIS for all individuals with cerebral palsy, traumatic brain injury, and cognitive impairment and will take these comments into consideration when developing and modifying assessment items and quality measures.

    4. Form, Manner, and Timing of OASIS Data Submission and OASIS Data for Annual Payment Update

      1. Regulatory Authority

The HH conditions of participation (CoPs) at Sec. 484.55(d) require that the comprehensive assessment must be updated and revised (including the administration of the OASIS) no less frequently than: (1) The last 5 days of every 60 days beginning with the start of care date, unless there is a beneficiary-elected transfer, significant change in condition, or discharge and return to the same HHA during the 60-day episode; (2) within 48 hours of the patient's return to the home from a hospital admission of 24 hours or more for any reason other than diagnostic tests; and (3) at discharge.

      It is important to note that to calculate quality measures from OASIS data, there must be a complete quality episode, which requires both a Start of Care (initial assessment) or Resumption of Care OASIS assessment and a Transfer or Discharge OASIS assessment. Failure to submit sufficient OASIS assessments to allow calculation of quality measures, including transfer and discharge assessments, is a failure to comply with the CoPs.

      HHAs do not need to submit OASIS data for those patients who are excluded from the OASIS submission requirements. As described in the December 23, 2005 Medicare and Medicaid Programs: Reporting Outcome and Assessment Information Set Data as Part of the Conditions of Participation for Home Health Agencies final rule (70 FR 76202), we defined the exclusion as those patients:

      Receiving only non-skilled services;

For whom neither Medicare nor Medicaid is paying for HH care (patients receiving care under a Medicare or Medicaid Managed Care Plan are not excluded from the OASIS reporting requirement);

      Receiving pre- or post-partum services; or

      Under the age of 18 years.

      As set forth in the CY 2008 HH PPS final rule (72 FR 49863), HHAs that become Medicare certified on or after May 31 of the preceding year are not subject to the OASIS quality reporting requirement nor any payment penalty for quality reporting purposes for the following year. For example, HHAs certified on or after May 31, 2014 are not subject to the 2 percentage point reduction to their market basket update for CY 2015. These exclusions only affect quality reporting requirements and do not affect the HHA's reporting responsibilities as announced in the December 23, 2005 final rule.
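For illustration only, the date logic of that new-certification exclusion can be sketched as follows (the function name and signature are ours, not CMS's; the sketch simply encodes the May 31 cutoff described above):

```python
from datetime import date

def exempt_from_oasis_quality_reporting(certification_date: date, apu_year: int) -> bool:
    """Illustrative check of the exclusion described above: an HHA certified on or
    after May 31 of the year preceding the APU year is not subject to the OASIS
    quality reporting requirement, or the associated payment penalty, for that year."""
    return certification_date >= date(apu_year - 1, 5, 31)

# Example from the text: an HHA certified on or after May 31, 2014 is not subject
# to the 2 percentage point reduction for CY 2015.
print(exempt_from_oasis_quality_reporting(date(2014, 6, 15), 2015))  # True
```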

      2. Home Health Quality Reporting Program Requirements for CY 2016 Payment and Subsequent Years

      In the CY 2014 HH PPS Final rule (78 FR 72297), we finalized a proposal to consider OASIS assessments submitted by HHAs to CMS in compliance with HH CoPs and Conditions for Payment for episodes beginning on or after July 1, 2012, and before July 1, 2013 as fulfilling one portion of the quality reporting requirement for CY 2014.

In addition, we finalized a proposal to continue this pattern for each subsequent year beyond CY 2014. OASIS assessments submitted for episodes beginning on or after July 1 of the calendar year 2 years prior to the calendar year of the Annual Payment Update (APU) effective date, and before July 1 of the calendar year one year prior to the calendar year of the APU effective date, fulfill the OASIS portion of the HH QRP requirement.
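To make the rolling reporting window concrete, a minimal sketch (the function name is ours; it simply restates the July 1/June 30 pattern above) could be:

```python
from datetime import date

def oasis_reporting_window(apu_year: int) -> tuple[date, date]:
    """First and last episode start dates whose OASIS assessments count toward the
    HH QRP requirement for the given APU effective year: July 1 two calendar years
    prior through June 30 one calendar year prior."""
    return date(apu_year - 2, 7, 1), date(apu_year - 1, 6, 30)

# Example from the text: the CY 2014 APU is tied to episodes beginning on or after
# July 1, 2012 and before July 1, 2013.
print(oasis_reporting_window(2014))  # (datetime.date(2012, 7, 1), datetime.date(2013, 6, 30))
```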

      3. Previously Established Pay-for-Reporting Performance Requirement for Submission of OASIS Quality Data

Section 1895(b)(3)(B)(v)(I) of the Act states that for 2007 and each subsequent year, the home health market basket percentage increase applicable under such clause for such year shall be reduced by 2 percentage points if a home health agency does not submit data to the Secretary in accordance with subclause (II) for such a year. This pay-for-reporting requirement was implemented on January 1, 2007. In the CY 2015 HH PPS Final rule (79 FR 38387), we finalized a proposal to define the quantity of OASIS assessments each HHA must submit to meet the pay-for-reporting requirement.

      We believe that defining a more explicit performance requirement for the submission of OASIS data by HHAs would better meet the intent of the statutory requirement.

      In the CY 2015 HH PPS Final rule (79 FR 38387), we reported information on a study performed by the Department of

      Page 68704

Health & Human Services, Office of the Inspector General (OIG) in February 2012 to: (1) Determine the extent to which HHAs met federal reporting requirements for the OASIS data; (2) determine the extent to which states met federal reporting requirements for OASIS data; and (3) determine the extent to which CMS was overseeing the accuracy and completeness of OASIS data submitted by HHAs. Based on the OIG report, we proposed a performance requirement for submission of OASIS quality data, which would be responsive to the recommendations of the OIG.

      In response to these requirements and the OIG report, we designed a pay-for-reporting performance system model that could accurately measure the level of an HHA's submission of OASIS data. The performance system is based on the principle that each HHA is expected to submit a minimum set of two matching assessments for each patient admitted to their agency. These matching assessments together create what is considered a quality episode of care, consisting ideally of a Start of Care (SOC) or Resumption of Care (ROC) assessment and a matching End of Care (EOC) assessment. However, it was determined that there are several scenarios that could meet this matching assessment requirement of the new pay-for-reporting performance requirement. These scenarios or quality assessments are defined as assessments that create a quality episode of care during the reporting period or could create a quality episode if the reporting period were expanded to an earlier reporting period or into the next reporting period.

      Seven types of assessments submitted by an HHA fit this definition of a quality assessment. These are:

1. A Start of Care (SOC; M0100 = `01') or Resumption of Care (ROC; M0100 = `03') assessment that can be matched to an End of Care (EOC; M0100 = `06', `07', `08', or `09') assessment. These SOC/ROC assessments are the first assessment in the pair of assessments that create a standard quality of care episode described in the previous paragraph.

2. An End of Care (EOC) assessment that can be matched to a Start of Care (SOC) or Resumption of Care (ROC) assessment. These EOC assessments are the second assessment in the pair of assessments that create a standard quality of care episode described in the previous paragraph.

      3. A SOC/ROC assessment that could begin an episode of care, but the assessment occurs in the last 60 days of the performance period. This is labeled as a Late SOC/ROC quality assessment. The assumption is that the EOC assessment will occur in the next reporting period.

4. An EOC assessment that could end an episode of care that began in the previous reporting period (that is, an EOC that occurs in the first 60 days of the performance period). This is labeled as an Early EOC quality assessment. The assumption is that the matching SOC/ROC assessment occurred in the previous reporting period.

      5. A SOC/ROC assessment that is followed by one or more follow-up assessments, the last of which occurs in the last 60 days of the performance period. This is labeled as an SOC/ROC Pseudo Episode quality assessment.

6. An EOC assessment that is preceded by one or more follow-up assessments, the first of which occurs in the first 60 days of the performance period. This is labeled as an EOC Pseudo Episode quality assessment.

      7. A SOC/ROC assessment that is part of a known one-visit episode. This is labeled as a One-Visit episode quality assessment. This determination is made by consulting HH claims data.

      SOC, ROC, and EOC assessments that do not meet any of these definitions are labeled as Non-Quality assessments. Follow-up assessments (that is, where the M0100 Reason for Assessment = `04' or `05') are considered Neutral assessments and do not count toward or against the pay-for-reporting performance requirement.
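The M0100-based grouping above can be summarized in a simplified sketch (illustrative only; the pairing of SOC/ROC and EOC assessments into episodes, the 60-day window checks, and the claims lookup for one-visit episodes are deliberately omitted):

```python
SOC_ROC_CODES = {"01", "03"}          # Start of Care, Resumption of Care
EOC_CODES = {"06", "07", "08", "09"}  # End of Care assessments
FOLLOW_UP_CODES = {"04", "05"}        # Follow-up assessments (neutral)

def qao_role(m0100: str) -> str:
    """Broad role an assessment can play in the QAO computation, keyed on its
    M0100 Reason for Assessment code. Whether a SOC/ROC or EOC assessment is
    ultimately counted as a quality or non-quality assessment depends on the
    matching rules in items 1 through 7 above, which are not reproduced here."""
    if m0100 in SOC_ROC_CODES:
        return "SOC/ROC (potential quality assessment)"
    if m0100 in EOC_CODES:
        return "EOC (potential quality assessment)"
    if m0100 in FOLLOW_UP_CODES:
        return "neutral (excluded from the QAO computation)"
    return "unrecognized M0100 value"
```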

Compliance with this performance requirement can be measured through the use of an uncomplicated mathematical formula. This pay-for-reporting performance requirement metric has been titled the ``Quality Assessments Only'' (QAO) formula because only those OASIS assessments that contribute, or could contribute, to creating a quality episode of care are included in the computation.

      The formula based on this definition is as follows:

      GRAPHIC TIFF OMITTED TR05NO15.010
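The referenced formula graphic does not render in this text version. Based on the definitions above (quality assessments in the numerator, non-quality assessments counted against the agency, and neutral follow-up assessments excluded from the computation altogether), the QAO metric can reasonably be reconstructed as:

```latex
% A plausible reconstruction of the omitted QAO formula; neutral follow-up
% assessments appear in neither the numerator nor the denominator.
\[
  \text{QAO (\%)} =
  \frac{\text{Number of Quality Assessments}}
       {\text{Number of Quality Assessments} + \text{Number of Non-Quality Assessments}}
  \times 100
\]
```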

Our ultimate goal is to require all HHAs to achieve a pay-for-reporting performance requirement compliance rate of 90 percent or more, as calculated using the QAO metric illustrated above. In the CY 2015 HH PPS final rule (79 FR 66074), we proposed implementing a pay-for-reporting performance requirement over a 3-year period. After consideration of the public comments received, we adopted as final our proposal to establish a pay-for-reporting performance requirement: for assessments submitted on or after July 1, 2015 and before June 30, 2016 with appropriate start of care dates, HHAs must score at least 70 percent on the QAO metric of the pay-for-reporting performance requirement or be subject to a 2 percentage point reduction to their market basket update for CY 2017.

      HHAs have been statutorily required to report OASIS for a number of years and therefore should have many years of experience with the collection of OASIS data and transmission of this data to CMS. Given the length of time that HHAs have been mandated to report OASIS data and based on preliminary analyses that indicate that the majority of HHAs are already achieving the target goal of 90 percent on the QAO metric, we believe that HHAs would adapt quickly to the implementation of the pay-for-reporting performance requirement, if phased in over a 3-year period.

      In the CY 2015 rule, we did not finalize a proposal to increase the reporting requirement in 10 percent increments over a 2-year period beginning July 1, 2016 until the maximum rate of 90 percent is reached. Instead, we proposed to analyze historical data to set the reporting requirements. To set the threshold for the 2nd year, we analyzed the most recently available data, from 2013 and 2014, to make a determination about what the pay-for-reporting performance requirement should be. Specifically, we reviewed OASIS data from this time period simulating the pay-for-reporting performance 70 percent submission requirement to determine the hypothetical performance of each HHA as if the pay-for-reporting performance requirement were in effect during the reporting period preceding its implementation. This analysis indicated a nominal increase of 10 percent each year would provide the greatest opportunity for successful implementation versus an increase of 20 percent from year 1 to year 2.

      Page 68705

      Based on this analysis, we proposed to set the performance threshold at 80 percent for the reporting period from July 1, 2016 through June 30, 2017. For the reporting period from July 1, 2017 through June 30, 2018 and thereafter, we proposed the performance threshold would be 90 percent.
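The phase-in finalized above can be summarized as data; the sketch below (the period labels, names, and compliance check are ours) simply restates the 70/80/90 percent schedule and the 2 percentage point consequence:

```python
# Minimum QAO score (percent) by reporting period of assessments with appropriate
# start of care dates. Falling below the threshold subjects the HHA to a
# 2 percentage point reduction to its market basket update (for the first period,
# the CY 2017 update, as stated above).
QAO_THRESHOLDS = [
    ("2015-07-01", "2016-06-30", 70.0),
    ("2016-07-01", "2017-06-30", 80.0),
    ("2017-07-01", "2018-06-30 and each period thereafter", 90.0),
]

def meets_qao_requirement(qao_score: float, threshold: float) -> bool:
    """True if the agency's QAO score meets or exceeds the period's threshold."""
    return qao_score >= threshold
```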

We provided a report to each HHA of their hypothetical performance under the pay-for-reporting performance requirement during the 2014-2015 pre-implementation reporting period in June 2015. On January 1, 2015, the data submission process for OASIS converted from the former state-based OASIS submission system to a new national OASIS submission system known as the Assessment Submission and Processing (ASAP) System. On July 1, 2015, when the pay-for-reporting performance requirement of 70 percent went into effect, providers were required to submit their OASIS assessment data into the ASAP system. Successful submission of an OASIS assessment consists of the submission of the data into the ASAP system with a receipt of no ``fatal error'' messages. Error messages received during submission can be an indication of a problem that occurred during the submission process and could also be an indication that the OASIS assessment was rejected. Successful submission can be verified by ascertaining that the submitted assessment data resides in the national database after the assessment has met all of the quality standards for completeness and accuracy during the submission process. Should one or more OASIS assessments submitted by a HHA be rejected due to an IT/server issue caused by CMS, we may, at our discretion, excuse the non-submission of OASIS data. We anticipate that such a scenario would rarely, if ever, occur. In the event that a HHA believes that they were unable to submit OASIS assessments due to an IT/server issue on the part of CMS, the HHA should be prepared to provide any documentation or proof available, which could demonstrate that no fault on their part contributed to the failure of the OASIS records to transmit to CMS.

The initial performance period for the pay-for-reporting performance requirement is July 1, 2015 through June 30, 2016. Prior to and during this performance period, we have scheduled Open Door Forums and webinars to educate HHA personnel as needed about the pay-for-reporting performance requirement program and the pay-for-reporting performance QAO metric, and distributed individual provider preview reports. Additionally, OASIS Education Coordinators (OECs) have been trained to provide state-level instruction on this program and metric. We have posted a report, which provides a detailed explanation of this pay-for-reporting QAO methodology. To view this report, go to the downloads section at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HomeHealthQualityInits/Home-Health-Quality-Reporting-Requirements.html. Training announcements and additional educational information related to the pay-for-reporting performance requirement have been provided on the HH Quality Initiatives Web page.

      We invited public comment on our proposal to implement an 80 percent Pay-for-Reporting Performance Requirement for Submission of OASIS Quality Data for Year 2 reporting period July 1, 2016 to June 30, 2017 as described previously, for the HH QRP. Public comments and our responses to comments are summarized below.

Comment: Several commenters supported CMS' proposed phased-in approach for the ``Quality Assessments Only'' (QAO) reporting requirements and the submission of OASIS data; one additionally noted appreciation for the added clarity about the QAO benchmarks for the next two assessment periods. A few commenters noted that the proposed increase to 80 percent for the 2016-2017 reporting period was acceptable, but encouraged CMS to defer subsequent increases, pending evaluation. One of these commenters additionally requested that CMS provide continuing status updates on the progress toward these goals so that HHAs could make changes to their processes in order to be compliant.

Response: We appreciate the feedback and support for the QAO reporting thresholds and intend to conduct ongoing monitoring of the effect of increasing the QAO threshold on the percent of agencies that are compliant with this pay-for-reporting requirement. We do not intend to defer the increase to 90 percent beyond the schedule included in the rule; this threshold was chosen based on analysis indicating compliance was already at this level for the vast majority of agencies. We designed the pay-for-reporting performance system model in response to federal reporting requirements for the OASIS data and the recommendation in the OIG report entitled, ``Limited Oversight of Home Health Agency OASIS Data,'' that we ``identify all HHAs that failed to submit OASIS data and apply the 2 percent payment reduction to them''. As the OASIS reporting requirements have been in existence for 16 years, HHAs should already possess knowledge of these requirements and know what they need to do to bring their agency into compliance. We provided a report to each HHA of their hypothetical performance under the pay-for-reporting performance requirement during the 2014-2015 pre-implementation reporting period in June 2015; additionally, we are considering options for ongoing communication with agencies about their compliance levels.

      Comment: One commenter requested CMS provide additional clarification about the definition of ``OASIS submission'' and whether it required acceptance of the submission by the state agency, as well as whether the QAO calculation included Medicare Advantage and Medicaid patients, in addition to traditional Medicare. This commenter recommended the standard be applied only to assessments completed for traditional Medicare patients and requested CMS provide comprehensive education on the new standard at least six months before it is effective.

Response: On January 1, 2015, the data submission process for OASIS converted from the former state-based OASIS submission system to a new national OASIS submission system known as the Assessment Submission and Processing (ASAP) System. Therefore, the commenter's question about whether successful submission requires both submission and acceptance of OASIS data by the state agency is not applicable because the state-based OASIS submission system is no longer in existence.

      Providers are required to submit their OASIS assessment data into the ASAP system. Successful submission of an OASIS assessment consists of the submission of the data into the ASAP system with a receipt of no fatal error messages. Error messages received during submission can be an indication of a problem that occurred during the submission process and could also be an indication that the OASIS assessment was rejected. Successful submission can be verified by ascertaining that the submitted assessment data resides in the national database after the assessment has met all of the quality standards for completeness and accuracy during the submission process.

      As noted previously, should one or more OASIS assessments submitted by a HHA be rejected due to an IT/server issue caused by CMS, we may at our discretion, excuse the non-submission of OASIS data. We anticipate that such a scenario would rarely, if ever, occur. In the event that a HHA believes they were unable to submit OASIS

      Page 68706

      assessments due to an IT/server issue on the part of CMS, the HHA should be prepared to provide any documentation or proof available which demonstrates no fault on their part contributed to the failure of the OASIS transmission to CMS.

      Patients receiving care under a Medicare or Medicaid managed care plan are not excluded from the OASIS reporting requirements, and HHAs are required to submit OASIS assessments for these patients. OASIS reporting is mandated for all Medicare beneficiaries (under 42 CFR 484.250(a), 484.225(i), and 484.55). The HH CoPs require that the HH Registered Nurse (RN) or qualified therapist perform an initial assessment within 48 hours of referral, within 48 hours of the patient's return home, or on the physician-ordered start of care date. The HH RN or qualified therapist must also complete a comprehensive assessment within 5 days from the start of care. During these assessments, the HH RN or qualified therapist must determine the patient's eligibility for the Medicare HH benefit, including homebound status (42 CFR 484.55(a)(1) and (b)). In addition, the requirement for OASIS reporting on Medicare and Medicaid Managed Care patients was established in a final rule titled ``Medicare and Medicaid Programs: Reporting Outcome and Assessment Information Set Data as Part of the Conditions of Participation for Home Health Agencies Final Rule'' dated December 23, 2005 (70 FR 76200), which stated the following:

      ``In the January 25, 1999, interim final rule with comment period (64 FR 3749), we generally mandated that all HHAs participating in Medicare and Medicaid (including managed care organizations providing home health services to Medicare and Medicaid beneficiaries) report their OASIS data to the database we established within each State via electronic transmission.''

We do not believe that there is more burden associated with the collection of OASIS assessment data for a Medicare Managed Care patient than there is for a HH patient that receives traditional Medicare fee-for-service (FFS) benefits. The requirements for the HH RN or qualified therapist to perform an initial and comprehensive assessment and complete all required OASIS assessments are the same for all Medicare patients regardless of the type of Medicare or Medicaid benefits they receive. The completion of these activities is a condition of payment of both Medicare FFS and managed care claims.

      We are committed to stakeholder education and as such conducted a Special Open Door forum on the QAO methodology and compliance rates on June 2, 2015; materials from this Special Open Door Forum, along with additional educational information, are available in the downloads section at: https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HomeHealthQualityInits/Home-Health-Quality-Reporting-Requirements.html. CMS anticipates communicating ongoing educational opportunities through the regular HH QRP communication channels, including Open Door Forums, webinars, listening sessions, memos, email notification, and web postings.

Final Action: After consideration of the comments received, we are adopting as final our proposal to implement an 80 percent Pay-for-Reporting Performance Requirement for Submission of OASIS Quality Data for the Year 2 reporting period July 1, 2016 to June 30, 2017, and a 90 percent Pay-for-Reporting Performance Requirement for Submission of OASIS Quality Data for the reporting period July 1, 2017 to June 30, 2018 and thereafter.

1. Home Health Care CAHPS® Survey (HHCAHPS)

In the CY 2015 HH PPS final rule (79 FR 66031), we stated that the home health quality measures reporting requirements for Medicare-certified agencies include the Home Health Care CAHPS® (HHCAHPS) Survey for the CY 2015 Annual Payment Update (APU). We are continuing to maintain for CY 2016 the HHCAHPS data requirements stated in the CY 2015 and previous rules for the continuous monthly data collection and quarterly data submission of HHCAHPS data.

      1. Background and Description of HHCAHPS

As part of the HHS Transparency Initiative, we implemented a process to measure and publicly report patient experiences with home health care, using a survey developed by the Agency for Healthcare Research and Quality's (AHRQ's) Consumer Assessment of Healthcare Providers and Systems (CAHPS®) program, endorsed by the NQF in March 2009 (NQF #0517), and re-endorsed by the NQF in 2015. The HHCAHPS Survey is approved under OMB Control Number 0938-1066 through May 31, 2017. The HHCAHPS survey is part of a family of CAHPS® surveys that ask patients to report on and rate their experiences with health care. The Home Health Care CAHPS® (HHCAHPS) survey presents home health patients with a set of standardized questions about their home health care providers and about the quality of their home health care.

Prior to this survey, there was no national standard for collecting information about patient experiences that enabled valid comparisons across all HHAs. The history and development process for HHCAHPS has been described in previous rules and is also available on the official HHCAHPS Web site at: https://homehealthcahps.org and in the annually-updated HHCAHPS Protocols and Guidelines Manual, which is downloadable from https://homehealthcahps.org.

      Since April 2012, for public reporting purposes, we report five measures from the HHCAHPS Survey--three composite measures and two global ratings of care that are derived from the questions on the HHCAHPS survey. The publicly reported data are adjusted for differences in patient mix across HHAs. We update the HHCAHPS data on Home Health Compare on www.medicare.gov quarterly. Each HHCAHPS composite measure consists of four or more individual survey items regarding one of the following related topics:

      Patient care (Q9, Q16, Q19, and Q24);

      Communications between providers and patients (Q2, Q15, Q17, Q18, Q22, and Q23); and

      Specific care issues on medications, home safety, and pain (Q3, Q4, Q5, Q10, Q12, Q13, and Q14).

      The two global ratings are the overall rating of care given by the HHA's care providers (Q20), and the patient's willingness to recommend the HHA to family and friends (Q25).
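For reference, the groupings above can be written out as a simple mapping (the dictionary names are ours; the question numbers are those listed in the text):

```python
# Publicly reported HHCAHPS measures: three composites and two global ratings.
HHCAHPS_COMPOSITES = {
    "Patient care": ["Q9", "Q16", "Q19", "Q24"],
    "Communications between providers and patients":
        ["Q2", "Q15", "Q17", "Q18", "Q22", "Q23"],
    "Specific care issues (medications, home safety, pain)":
        ["Q3", "Q4", "Q5", "Q10", "Q12", "Q13", "Q14"],
}
HHCAHPS_GLOBAL_RATINGS = {
    "Overall rating of care from the HHA's care providers": "Q20",
    "Willingness to recommend the HHA to family and friends": "Q25",
}
```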

The HHCAHPS survey is currently available in English, Spanish, Chinese, Russian, and Vietnamese. The OMB number on these surveys is the same (0938-1066). All of these surveys are on the Home Health Care CAHPS® Web site, https://homehealthcahps.org. We continue to consider additional language translations of the HHCAHPS in response to the needs of the home health patient population.

All of the requirements about which home health patients are eligible for the HHCAHPS survey, and conversely, which home health patients are ineligible for the HHCAHPS survey, are delineated and detailed in the HHCAHPS Protocols and Guidelines Manual, which is downloadable at https://homehealthcahps.org. Home health patients are eligible for HHCAHPS if they received at least two skilled home health visits in the past 2 months, which are paid for by Medicare or Medicaid.

      Page 68707

      Home health patients are ineligible for inclusion in HHCAHPS surveys if one of these conditions pertains to them:

      Are under the age of 18;

      Are deceased prior to the date the sample is pulled;

      Receive hospice care;

      Receive routine maternity care only;

      Are not considered survey eligible because the state in which the patient lives restricts release of patient information for a specific condition or illness that the patient has; or

      No Publicity patients, defined as patients who on their own initiative at their first encounter with the HHAs make it very clear that no one outside of the agencies can be advised of their patient status, and no one outside of the HHAs can contact them for any reason.
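Putting the eligibility rule and the exclusion list together, a simplified screening sketch (the field names are illustrative only, not OASIS or HHCAHPS data element names) could look like this:

```python
def hhcahps_survey_eligible(patient: dict) -> bool:
    """Simplified HHCAHPS sampling screen based on the criteria above: at least two
    skilled visits in the past 2 months paid by Medicare or Medicaid, and none of
    the listed exclusions."""
    if patient["skilled_visits_past_2_months"] < 2:
        return False
    if not (patient["medicare_paid"] or patient["medicaid_paid"]):
        return False
    exclusions = (
        patient["age"] < 18,
        patient["deceased_before_sample"],
        patient["receiving_hospice"],
        patient["routine_maternity_only"],
        patient["state_restricts_release"],
        patient["no_publicity_request"],
    )
    return not any(exclusions)

# Illustrative use:
example = {
    "skilled_visits_past_2_months": 3, "medicare_paid": True, "medicaid_paid": False,
    "age": 72, "deceased_before_sample": False, "receiving_hospice": False,
    "routine_maternity_only": False, "state_restricts_release": False,
    "no_publicity_request": False,
}
print(hhcahps_survey_eligible(example))  # True
```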

      We stated in previous rules that Medicare-certified HHAs are required to contract with an approved HHCAHPS survey vendor. This requirement continues, and Medicare-certified agencies also must provide on a monthly basis a list of their patients served to their respective HHCAHPS survey vendors. Agencies are not allowed to influence at all how their patients respond to the HHCAHPS survey.

      As previously required, HHCAHPS survey vendors are required to attend introductory and all update trainings conducted by CMS and the HHCAHPS Survey Coordination Team, as well as to pass a post-training certification test. We have approximately 30 approved HHCAHPS survey vendors. The list of approved HHCAHPS survey vendors is available at: https://homehealthcahps.org.

      2. HHCAHPS Oversight Activities

      We stated in prior final rules that all approved HHCAHPS survey vendors are required to participate in HHCAHPS oversight activities to ensure compliance with HHCAHPS protocols, guidelines, and survey requirements. The purpose of the oversight activities is to ensure that approved HHCAHPS survey vendors follow the HHCAHPS Protocols and Guidelines Manual. As stated in previous HH PPS final rules, all HHCAHPS approved survey vendors must develop a Quality Assurance Plan (QAP) for survey administration in accordance with the HHCAHPS Protocols and Guidelines Manual. An HHCAHPS survey vendor's first QAP must be submitted within 6 weeks of the data submission deadline date after the vendor's first quarterly data submission. The QAP must be updated and submitted annually thereafter and at any time that changes occur in staff or vendor capabilities or systems. A model QAP is included in the HHCAHPS Protocols and Guidelines Manual. The QAP must include the following:

      Organizational Background and Staff Experience;

      Work Plan;

      Sampling Plan;

      Survey Implementation Plan;

      Data Security, Confidentiality and Privacy Plan; and

Questionnaire Attachments.

      As part of the oversight activities, the HHCAHPS Survey Coordination Team conducts on-site visits to all approved HHCAHPS survey vendors. The purpose of the site visits is to allow the HHCAHPS Survey Coordination Team to observe the entire HHCAHPS Survey implementation process, from the sampling stage through file preparation and submission, as well as to assess data security and storage. The HHCAHPS Survey Coordination Team reviews the HHCAHPS survey vendor's survey systems, and assesses administration protocols based on the HHCAHPS Protocols and Guidelines Manual posted at: https://homehealthcahps.org. The systems and program site visit review includes, but is not limited to the following:

      Survey management and data systems;

      Printing and mailing materials and facilities;

      Telephone call center facilities;

      Data receipt, entry and storage facilities; and

      Written documentation of survey processes.

      After the site visits, HHCAHPS survey vendors are given a defined time period in which to correct any identified issues and provide follow-up documentation of corrections for review. HHCAHPS survey vendors are subject to follow-up site visits on an as-needed basis.

      In the CY 2013 HH PPS final rule (77 FR 67094, 67164), we codified the current guideline that all approved HHCAHPS survey vendors fully comply with all HHCAHPS oversight activities. We included this survey requirement at Sec. 484.250(c)(3).

      3. HHCAHPS Requirements for the CY 2016 APU

      In the CY 2015 HH PPS final rule (79 FR 66031), we stated that for the CY 2016 APU, we would require continued monthly HHCAHPS data collection and reporting for four quarters. The data collection period for the CY 2016 APU includes the second quarter 2014 through the first quarter 2015 (the months of April 2014 through March 2015). Although these dates are past, we wished to state them in this rule so that HHAs are again reminded of what months constituted the requirements for the CY 2016 APU.

For the CY 2016 APU, all HHAs that had fewer than 60 HHCAHPS-eligible unduplicated or unique patients in the period of April 1, 2013 through March 31, 2014 were exempted from the HHCAHPS data collection and submission requirements, upon completion of the CY 2016 HHCAHPS Participation Exemption Request form and upon CMS verification of the HHA patient counts. Agencies with fewer than 60 HHCAHPS-eligible, unduplicated or unique patients in the period of April 1, 2013, through March 31, 2014, were required to submit their patient counts on the HHCAHPS Participation Exemption Request form for the CY 2016 APU posted on https://homehealthcahps.org from April 1, 2014, to 11:59 p.m., EST on March 31, 2015. This deadline for the exemption form is firm, as are all of the quarterly data submission deadlines for the HHAs that participate in HHCAHPS.

We automatically exempt HHAs receiving Medicare certification after the period in which HHAs do their patient counts. HHAs receiving Medicare certification on or after April 1, 2014 were exempt from the HHCAHPS reporting requirement for the CY 2016 APU. These newly-certified HHAs did not need to complete the HHCAHPS Participation Exemption Form for the CY 2016 APU.

      4. HHCAHPS Requirements for the CY 2017 APU

      For the CY 2017 APU, we require continued monthly HHCAHPS data collection and reporting for four quarters. The data collection period for the CY 2017 APU includes the second quarter 2015 through the first quarter 2016 (the months of April 2015 through March 2016). HHAs are required to submit their HHCAHPS data files to the HHCAHPS Data Center for the second quarter 2015 by 11:59 p.m., EST on October 15, 2015; for the third quarter 2015 by 11:59 p.m., EST on January 21, 2016; for the fourth quarter 2015 by 11:59 p.m., EST on April 21, 2016; and for the first quarter 2016 by 11:59 p.m., EST on July 21, 2016. These deadlines are firm; no exceptions are permitted.

      For the CY 2017 APU, we require that all HHAs with fewer than 60 HHCAHPS-eligible unduplicated or unique patients in the period of April 1, 2014, through March 31, 2015 are exempt from the HHCAHPS data collection and submission requirements for the CY 2017 APU, upon completion of the CY 2017 HHCAHPS Participation Exemption Request form, and upon

      Page 68708

CMS verification of the HHA patient counts. Agencies with fewer than 60 HHCAHPS-eligible, unduplicated or unique patients in the period of April 1, 2014, through March 31, 2015, are required to submit their patient counts on the CY 2017 HHCAHPS Participation Exemption Request form posted on https://homehealthcahps.org from April 1, 2015, to 11:59 p.m., EST on March 31, 2016. This deadline is firm, as are all of the quarterly data submission deadlines for the HHAs that participate in HHCAHPS.

We automatically exempt HHAs receiving Medicare certification after the period in which HHAs do their patient count. HHAs receiving Medicare certification on or after April 1, 2015 are exempt from the HHCAHPS reporting requirement for the CY 2017 APU. These newly-certified HHAs do not need to complete the HHCAHPS Participation Exemption Request Form for the CY 2017 APU.

      5. HHCAHPS Requirements for the CY 2018 APU

      For the CY 2018 APU, we require continued monthly HHCAHPS data collection and reporting for four quarters. The data collection period for the CY 2018 APU includes the second quarter 2016 through the first quarter 2017 (the months of April 2016 through March 2017). HHAs will be required to submit their HHCAHPS data files to the HHCAHPS Data Center for the second quarter 2016 by 11:59 p.m., EST on October 20, 2016; for the third quarter 2016 by 11:59 p.m., EST on January 19, 2017; for the fourth quarter 2016 by 11:59 p.m., EST on April 20, 2017; and for the first quarter 2017 by 11:59 p.m., EST on July 20, 2017. These deadlines are firm; no exceptions will be permitted.
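For convenience, the CY 2018 APU collection and submission schedule stated above can be tabulated as data (the structure is ours; the dates are those stated in the text):

```python
# HHCAHPS quarterly submission deadlines for the CY 2018 APU
# (data collected April 2016 through March 2017; deadlines are 11:59 p.m. EST).
HHCAHPS_CY2018_APU_DEADLINES = {
    "Q2 2016": "2016-10-20",
    "Q3 2016": "2017-01-19",
    "Q4 2016": "2017-04-20",
    "Q1 2017": "2017-07-20",
}
```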

For the CY 2018 APU, we require that all HHAs with fewer than 60 HHCAHPS-eligible unduplicated or unique patients in the period of April 1, 2015 through March 31, 2016 are exempt from the HHCAHPS data collection and submission requirements for the CY 2018 APU, upon completion of the CY 2018 HHCAHPS Participation Exemption Request form, and upon CMS verification of the HHA patient counts. Agencies with fewer than 60 HHCAHPS-eligible, unduplicated or unique patients in the period of April 1, 2015 through March 31, 2016 are required to submit their patient counts on the CY 2018 HHCAHPS Participation Exemption Request form posted on https://homehealthcahps.org from April 1, 2016 to 11:59 p.m., EST on March 31, 2017. This deadline is firm, as are all of the quarterly data submission deadlines for the HHAs that participate in HHCAHPS.

We automatically exempt HHAs receiving Medicare certification after the period in which HHAs do their patient count. HHAs receiving Medicare certification on or after April 1, 2016 are exempt from the HHCAHPS reporting requirement for the CY 2018 APU. These newly-certified HHAs do not need to complete the HHCAHPS Participation Exemption Request Form for the CY 2018 APU.

      6. HHCAHPS Reconsiderations and Appeals Process

      HHAs should monitor their respective HHCAHPS survey vendors to ensure that vendors submit their HHCAHPS data on time, by accessing their HHCAHPS Data Submission Reports on https://homehealthcahps.org. This helps HHAs ensure that their data are submitted in the proper format for data processing to the HHCAHPS Data Center.

We continue HHCAHPS oversight activities as finalized in the previous rules. In the CY 2013 HH PPS final rule (77 FR 67094, 67164), we codified the current guideline that all approved HHCAHPS survey vendors must fully comply with all HHCAHPS oversight activities. We included this survey requirement at Sec. 484.250(c)(3).

We continue the OASIS and HHCAHPS reconsiderations and appeals process that we finalized and used for all prior periods cited in the previous rules and utilized in the CY 2012 to CY 2016 APU determinations. We have described the HHCAHPS reconsiderations and appeals process requirements in the APU Notification Letter that we send to the affected HHAs annually in September. HHAs have 30 days from their receipt of the letter informing them that they did not meet the HHCAHPS requirements to reply to CMS with documentation that supports their requests for reconsideration of the annual payment update. It is important that the affected HHAs send in comprehensive information in their reconsideration letter/package because CMS will not contact the affected HHAs to request additional information or to clarify incomplete or inconclusive information. If clear evidence to support a finding of compliance is not present, then the 2 percent reduction in the annual payment update will be upheld. If clear evidence of compliance is present, then the 2 percent reduction for the APU will be reversed. CMS notifies affected HHAs by December 31 of the decisions that affect payments in the year beginning on January 1. If CMS determines to uphold the 2 percent reduction for the annual payment update, the affected HHA may further appeal the 2 percent reduction via the Provider Reimbursement Review Board (PRRB) appeals process, which is described in the December letter.

      The following is a summary of the comments that we received regarding HHCAHPS:

      Comment: We received one comment that HHCAHPS is an unfunded administrative mandate that entails financial and resource burdens to HHAs.

Response: The collection of patients' perspectives of care data for similar CAHPS surveys, such as Hospital CAHPS (HCAHPS), follows the same model: providers pay the approved survey vendors for the data collection and implementation of the survey, while CMS pays for the HHCAHPS survey administration and technical assistance processes; vendor approval, training, and oversight activities; technical support for the home health agencies and the vendors; and the data compilation, data analysis, and public reporting of the findings on www.Medicare.gov. HHAs are strongly encouraged to report their HHCAHPS costs on their respective annual cost reports, but HHAs should note that HHCAHPS costs are not reimbursable under the HH PPS. We post the list of the approved HHCAHPS vendors on https://homehealthcahps.org, and we encourage HHAs to contact the vendors for cost and service information pertaining to HHCAHPS, since HHAs may find differences among the vendors and will very likely find a vendor that is well suited to their particular cost and administrative needs for HHCAHPS.

      Comment: We received a comment of concern regarding the fact that in the CY 2013 HH PPS final rule we codified the current guideline that all approved HHCAHPS survey vendors must fully comply with all HHCAHPS oversight activities. We included this survey requirement at Sec. 484.250(c)(3).

      Response: We appreciate this commenter's continuing concern about the policy set forth in the regulation several years ago. The implementation of the policy in the past 3 years has worked out very well and it is working as intended.

      Comment: We received a comment that the HHCAHPS Star Rating methodology does not include Q25, ``Would you recommend this agency to your family or friends if they needed home health care?'' with the answer

      Page 68709

choices of ``Definitely no, Probably no, Probably yes, and Definitely yes''. The commenter recommended that we include a Star Rating that is the average of two questions on the HHCAHPS survey, Q25 (the question above, ``Would you recommend this agency to your family or friends'') and Q20 (``Using a number from 0 to 10, where 0 is the worst home health care possible and 10 is the best home health care possible, what number would you use to rate your care from this agency's home health providers?''), or remove Q25 from the composite measure.

      Response: We thank the commenter for the comments, but will continue to retain Q20 and Q25 because they are standalone questions and they are not part of an HHCAHPS composite (which is a measure combining several survey questions).

      Comment: We received one comment that CMS should establish a minimum number of completed HHCAHPS surveys (at 50 surveys) per agency if the data are going to be used in HHVBP or any other quality assessment program.

Response: We are going to start publicly reporting Star Ratings in January 2016. We introduced the methodology in several CMS Open Door Forums in spring 2015 and in announcements on our Web sites. After extensive data testing, our statisticians established that at least 40 surveys are needed in order to report Star Ratings for a home health agency. The commenter was correct that a minimum number of surveys is needed to have Star Ratings. In testing, it was found that there is no statistically significant difference between 40 surveys and 50 surveys as a minimum number for the HHCAHPS data.

Comment: We received one comment in support of the continuation of the Home Health CAHPS® requirements that are in line with previous years' requirements.

      Response: We thank this commenter for their support.

      Final Decision: We are not recommending any changes to the HHCAHPS requirements as a result of comments received.

      7. Summary

We did not propose any changes to the participation requirements, or to the requirements pertaining to the implementation of the Home Health CAHPS® Survey (HHCAHPS). We only updated the information to reflect the dates in the future APU years. We again strongly encourage HHAs to keep up-to-date about the HHCAHPS by regularly viewing the official Web site for the HHCAHPS at https://homehealthcahps.org. HHAs can also send an email to the HHCAHPS Survey Coordination Team at HHCAHPS@rti.org, or telephone toll-free (1-866-354-0985) for more information about HHCAHPS.

    5. Public Display of Home Health Quality Data for the HH QRP

Section 1895(b)(3)(B)(v)(III) of the Act and section 1899B(f) of the IMPACT Act state that the Secretary shall establish procedures for making data submitted under subclause (II) available to the public. Such procedures shall ensure that a home health agency has the opportunity to review the data that is to be made public for the agency prior to such data being made public. We recognize that public reporting of quality data is a vital component of a robust quality reporting program and are fully committed to ensuring that the data made available to the public are meaningful; comparing performance across home health agencies requires that measures be constructed from data collected in a standardized and uniform manner. We also recognize the need to ensure that each home health agency has the opportunity to review its data before publication. Medicare home health regulations, as codified at Sec. 484.250(a), require HHAs to submit OASIS assessments and Home Health Care Consumer Assessment of Healthcare Providers and Systems Survey® (HHCAHPS) data to meet the quality reporting requirements of section 1895(b)(3)(B)(v) of the Act.

      In addition, beginning April 1, 2015, HHAs began to receive Provider Preview Reports (for all Process Measures and Outcome Measures) on a quarterly, rather than annual, basis. The opportunity for providers to review their data and to submit corrections prior to public reporting aligns with the other quality reporting programs and the requirement for provider review under the IMPACT Act. We provide quality measure data to HHAs via the Certification and Survey Provider Enhanced Reports (CASPER reports), which are available through the CMS Health Care Quality Improvement and Evaluation System (QIES).

      As part of our ongoing efforts to make healthcare more transparent, affordable, and accountable, the HH QRP has developed a CMS Compare Web site for home health agencies, which identifies home health providers based on the areas they serve. Consumers can search for all Medicare-certified home health providers that serve their city or ZIP code and then find the agencies offering the types of services they need. A subset of the HH quality measures has been publicly reported on the Home Health Compare (HH Compare) Web site since 2003. The selected measures that are made available to the public can be viewed on the HH Compare Web site located at http://www.medicare.gov/HHCompare/Home.asp.

      The Affordable Care Act calls for transparent, easily understood information on provider quality to be publicly reported and made widely available. To provide home health care consumers with a summary of existing quality measures in an accessible format, we published a star rating based on the quality of care measures for home health agencies on Home Health Compare starting in July 2015. This is part of our plan to adopt star ratings across all Medicare.gov Compare Web sites. Star ratings are currently publicly displayed on Nursing Home Compare, Physician Compare, Hospital Compare, Dialysis Facility Compare, and the Medicare Advantage Plan Finder.

      The Quality of Patient Care star rating methodology assigns each home health agency a rating between one (1) and five (5) stars, using half stars for adjustment and reporting. All Medicare-certified home health agencies are eligible to receive a Quality of Patient Care star rating, provided that they have quality data reported on at least 5 out of the 9 quality measures that are included in the calculation.
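
      To illustrate the two rules described above, the following is a minimal, hypothetical sketch, not the CMS scoring methodology: it assumes only that an agency must report at least 5 of the 9 component measures to be eligible and that a continuous composite score is reported on a one-to-five scale in half-star increments. The function names, measure names, and example values are illustrative.

```python
# Illustrative sketch only -- not the CMS Quality of Patient Care star rating methodology.
# Assumptions: eligibility requires data on at least 5 of the 9 component measures, and a
# continuous composite score is reported between 1 and 5 stars in half-star increments.

def is_eligible(reported_measures, minimum_reported=5):
    """True if the agency has quality data reported on enough component measures."""
    return len(reported_measures) >= minimum_reported

def to_half_star(score, low=1.0, high=5.0):
    """Clamp a continuous score to the 1-5 star range and round to the nearest half star."""
    clamped = max(low, min(high, score))
    return round(clamped * 2) / 2

# Hypothetical agency reporting 6 of the 9 measures, with a composite score of 3.62.
reported = ["timely_initiation", "improvement_in_ambulation", "improvement_in_bathing",
            "improvement_in_bed_transfer", "drug_education", "acute_care_hospitalization"]
if is_eligible(reported):
    print(to_half_star(3.62))   # 3.5 stars
```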

      Home health agencies will continue to have prepublication access to their agency's quality data, which enables each agency to know how it is performing before public posting of the data on the Compare Web site. Starting in April 2015, HHAs are receiving quarterly preview reports showing their Quality of Patient Care star rating and how it was derived, well before public posting. HHAs have several weeks to review and provide feedback.

      The Quality of Patient Care star ratings methodology was developed through a transparent process that included multiple opportunities for stakeholder input, which was subsequently the basis for refinements to the methodology. An initial proposed methodology for calculating the Quality of Patient Care star ratings was posted on the CMS.gov Web site in December 2014. CMS then held two Special Open Door Forums (SODFs) on December 17, 2014 and February 5, 2015 to present the proposed methodology and solicit input. At each SODF, stakeholders provided immediate input, and were invited to submit additional comments via the Quality of Patient Care star ratings Help Desk mailbox HHC_Star_Ratings_Helpdesk@cms.hhs.gov.

      Page 68710

      CMS refined the methodology, based on comments received and additional analysis. The final methodology report is posted on the new star ratings Web page http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HomeHealthQualityInits/HHQIHomeHealthStarRatings.html. A Frequently-Asked-Questions (FAQ) document is also posted on the same Web page, addressing the issues raised in the comments that were received. We tested the Web site language used to present the Quality of Patient Care star ratings with Medicare beneficiaries to assure that it allowed them to accurately understand the significance of the various star ratings.

      Additional information regarding the Quality of Patient Care star rating is posted on the star ratings Web page at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HomeHealthQualityInits/HHQIHomeHealthStarRatings.html. Additional communications regarding the Quality of Patient Care star ratings will be announced via regular HH QRP communication channels.

      Summaries of public comments and our responses to comments regarding the Public Display of Home Health Quality Data for the HH QRP are provided below:

      Comment: A commenter recommended that CMS include stabilization measures in the Quality of Patient Care star ratings algorithm.

      Response: We appreciate the feedback on the Quality of Patient Care star ratings methodology, and agree that stabilization is an important goal for some home health patients. CMS is committed to ongoing evaluation and improvement of the algorithm to calculate the star rating, including potential inclusion of new measures that meet the inclusion criteria for variability, reportability, and clinical relevance.

  16. Collection of Information Requirements

    While this rule contains information collection requirements, it does not add new information collection requirements, nor does it revise any of the existing information collection requirements or burden estimates. The information collection requirements discussed in this rule for the OASIS-C1 data item set had been previously approved by the Office of Management and Budget (OMB) on February 6, 2014 and scheduled for implementation on October 1, 2014. The extension of the OASIS-C1/ICD-9 version was reapproved under OMB control number 0938-0760 with a current expiration date of March 31, 2018. This version of the OASIS will be discontinued once the OASIS-C1/ICD-10 version is approved and implemented. In addition, to facilitate the reporting of OASIS data as it relates to the implementation of ICD-10 on October 1, 2015, CMS submitted a new request for approval to OMB for the OASIS-C1/ICD-10 version under the Paperwork Reduction Act (PRA) process. The proposed revised OASIS item set was announced in the 30-day Federal Register notice (80 FR 15797), received OMB approval, and was assigned OMB control number 0938-1279.

  17. Regulatory Impact Analysis

    1. Statement of Need

      Section 1895(b)(1) of the Act requires the Secretary to establish a HH PPS for all costs of HH services paid under Medicare. In addition, section 1895(b)(3)(A) of the Act requires that (1) the computation of a standard prospective payment amount include all costs for HH services covered and paid for on a reasonable cost basis and that such amounts be initially based on the most recent audited cost report data available to the Secretary, and (2) the standardized prospective payment amount be adjusted to account for the effects of case-mix and wage levels among HHAs. Section 1895(b)(3)(B) of the Act addresses the annual update to the standard prospective payment amounts by the HH applicable percentage increase. Section 1895(b)(4) of the Act governs the payment computation. Sections 1895(b)(4)(A)(i) and (b)(4)(A)(ii) of the Act require the standard prospective payment amount to be adjusted for case-mix and geographic differences in wage levels. Section 1895(b)(4)(B) of the Act requires the establishment of appropriate case-mix adjustment factors for significant variation in costs among different units of services. Lastly, section 1895(b)(4)(C) of the Act requires the establishment of wage adjustment factors that reflect the relative level of wages, and wage-related costs applicable to HH services furnished in a geographic area compared to the applicable national average level.

      Section 1895(b)(3)(B)(iv) of the Act provides the Secretary with the authority to implement adjustments to the standard prospective payment amount (or amounts) for subsequent years to eliminate the effect of changes in aggregate payments during a previous year or years that was the result of changes in the coding or classification of different units of services that do not reflect real changes in case-mix. Section 1895(b)(5) of the Act provides the Secretary with the option to make changes to the payment amount otherwise paid in the case of outliers because of unusual variations in the type or amount of medically necessary care. Section 1895(b)(3)(B)(v) of the Act requires HHAs to submit data for purposes of measuring health care quality, and links the quality data submission to the annual applicable percentage increase.

      Section 421(a) of the MMA requires that HH services furnished in a rural area, for episodes and visits ending on or after April 1, 2010, and before January 1, 2016, receive an increase of 3 percent of the payment amount otherwise made under section 1895 of the Act. Section 210 of the MACRA amended section 421(a) of the MMA to extend the 3 percent increase to the payment amounts for services furnished in rural areas for episodes and visits ending before January 1, 2018.

      Section 3131(a) of the Affordable Care Act mandates that starting in CY 2014, the Secretary must apply an adjustment to the national, standardized 60-day episode payment rate and other amounts applicable under section 1895(b)(3)(A)(i)(III) of the Act to reflect factors such as changes in the number of visits in an episode, the mix of services in an episode, the level of intensity of services in an episode, the average cost of providing care per episode, and other relevant factors. In addition, section 3131(a) of the Affordable Care Act mandates that rebasing must be phased-in over a 4-year period in equal increments, not to exceed 3.5 percent of the amount (or amounts) as of the date of enactment (2010) under section 1895(b)(3)(A)(i)(III) of the Act, and be fully implemented in CY 2017.

      The HHVBP Model will apply a payment adjustment based on an HHA's performance on quality measures to test the effects on quality and costs of care. This HHVBP Model was developed based on the experiences we gained from the implementation of the Home Health Pay-for-Performance (HHPP) demonstration as well as the successful implementation of the HVBP program. The model design was also developed from the public comments received on the discussion of an HHVBP model being considered in the CY 2015 HH PPS proposed and final rules. Value-based purchasing programs have also been included in the President's budget for most provider types, including Home Health.

    2. Overall Impact

      Page 68711

      We have examined the impacts of this rule as required by Executive Order 12866 on Regulatory Planning and Review (September 30, 1993), Executive Order 13563 on Improving Regulation and Regulatory Review (January 18, 2011), the Regulatory Flexibility Act (RFA) (September 19, 1980, Pub. L. 96-354), section 1102(b) of the Act, section 202 of the Unfunded Mandates Reform Act of 1995 (UMRA, March 22, 1995; Pub. L. 104-4), Executive Order 13132 on Federalism (August 4, 1999), and the Congressional Review Act (5 U.S.C. 804(2)).

      Executive Orders 12866 and 13563 direct agencies to assess all costs and benefits of available regulatory alternatives and, if regulation is necessary, to select regulatory approaches that maximize net benefits (including potential economic, environmental, public health and safety effects, distributive impacts, and equity). Executive Order 13563 emphasizes the importance of quantifying both costs and benefits, of reducing costs, of harmonizing rules, and of promoting flexibility. The net transfer impacts related to the changes in payments under the HH PPS for CY 2016 are estimated to be -$260 million. The savings impacts related to the HHVBP model are estimated at a total projected 5-year gross savings of $380 million assuming a very conservative savings estimate of a 6 percent annual reduction in hospitalizations and a 1.0 percent annual reduction in SNF admissions. In accordance with the provisions of Executive Order 12866, this regulation was reviewed by the Office of Management and Budget.

      1. HH PPS

      The update set forth in this rule applies to Medicare payments under HH PPS in CY 2016. Accordingly, the following analysis describes the impact in CY 2016 only. We estimate that the net impact of the policies in this rule is approximately $260 million in decreased payments to HHAs in CY 2016. We applied a wage index budget neutrality factor and a case-mix weights budget neutrality factor to the rates as discussed in section III.C.3 of this final rule. Therefore, the estimated impact of the 2016 wage index and the recalibration of the case-mix weights for 2016 is zero. The -$260 million impact reflects the distributional effects of the 1.9 percent HH payment update percentage ($345 million increase), the effects of the third year of the four-year phase-in of the rebasing adjustments to the national, standardized 60-day episode payment amount, the national per-visit payment rates, and the NRS conversion factor for an impact of -2.4 percent ($440 million decrease), and the effects of the -0.97 percent adjustment to the national, standardized 60-day episode payment rate to account for nominal case-mix growth ($165 million decrease). The $260 million in decreased payments is reflected in the last column of the first row in Table 21 as a 1.4 percent decrease in expenditures when comparing CY 2015 payments to estimated CY 2016 payments.
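
      As an arithmetic cross-check of the dollar figures cited above (all rounded estimates), the three component effects sum to the stated net impact; the sketch below simply restates that addition.

```python
# Cross-check of the CY 2016 net impact estimate quoted above
# (amounts in millions of dollars, rounded as stated in this rule).
payment_update   = 345    # +1.9 percent HH payment update percentage
rebasing         = -440   # third year of the rebasing phase-in (about -2.4 percent)
nominal_case_mix = -165   # -0.97 percent nominal case-mix growth adjustment

print(payment_update + rebasing + nominal_case_mix)   # -260, i.e. the estimated -$260 million
```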

      The RFA requires agencies to analyze options for regulatory relief of small entities, if a rule has a significant impact on a substantial number of small entities. For purposes of the RFA, small entities include small businesses, nonprofit organizations, and small governmental jurisdictions. Most hospitals and most other providers and suppliers are small entities, either by nonprofit status or by having revenues of less than $7.5 million to $38.5 million in any one year. For the purposes of the RFA, we estimate that almost all HHAs are small entities as that term is used in the RFA. Individuals and states are not included in the definition of a small entity. The economic impact assessment is based on estimated Medicare payments (revenues). HHS's practice in interpreting the RFA is to consider effects economically ``significant'' only if greater than 5 percent of providers reach a threshold of 3 to 5 percent or more of total revenue or total costs. The majority of HHAs' visits are Medicare-paid visits and therefore the majority of HHAs' revenue consists of Medicare payments. Based on our analysis, we conclude that the policies finalized in this rule will result in an estimated total impact of 3 to 5 percent or more on Medicare revenue for greater than 5 percent of HHAs. Therefore, the Secretary has determined that this HH PPS final rule will have a significant economic impact on a substantial number of small entities. Further detail is presented in Table 21, by HHA type and location.
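
      The RFA screen described above is, in effect, a two-part threshold test. Below is a minimal sketch of that test, assuming ``significant'' means more than 5 percent of providers experiencing a revenue impact of at least 3 percent (the lower bound of the 3 to 5 percent range stated above); the impact values shown are hypothetical, not agency-level CMS estimates.

```python
# Illustrative sketch of the RFA significance screen (hypothetical inputs, not CMS estimates).
# Assumption: "significant" = more than 5 percent of providers with an estimated impact of
# 3 percent or more of total revenue.

def rfa_significant(revenue_impacts, provider_share=0.05, impact_threshold=0.03):
    affected = sum(1 for impact in revenue_impacts if abs(impact) >= impact_threshold)
    return affected / len(revenue_impacts) > provider_share

# Toy example: three of five hypothetical HHAs exceed the 3 percent revenue-impact threshold.
print(rfa_significant([-0.014, -0.031, -0.042, -0.008, -0.035]))   # True
```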

      With regard to options for regulatory relief, we note that in the CY 2014 HH PPS final rule we finalized rebasing adjustments to the national, standardized 60-day episode rate, non-routine supplies (NRS) conversion factor, and the national per-visit payment rates for each year, 2014 through 2017, as described in sections II.C and III.C.3 of this final rule. Since the rebasing adjustments are mandated by section 3131(a) of the Affordable Care Act, we cannot offer HHAs relief from the rebasing adjustments for CY 2016. For the 1.4 percent reduction to the national, standardized 60-day episode payment amount for CY 2016 described in section III.B.2 of this final rule, we believe it is appropriate to reduce the national, standardized 60-day episode payment amount to account for the estimated increase in nominal case-mix in order to move towards more accurate payment for the delivery of home health services, where payments better align with the costs of providing such services. In the alternatives considered section for the CY 2016 HH PPS proposed rule (80 FR 39839), we note that we considered reducing the 60-day episode rate in CY 2016 only to account for nominal case-mix growth between CY 2012 and CY 2014. However, we instead proposed to reduce the 60-day episode rate over a two-year period (CY 2016 and CY 2017) to account for estimated nominal case-mix growth between CY 2012 and CY 2014 in order to lessen the impact on HHAs in a given year. As discussed in section III.B.2 of this final rule, we are implementing a reduction of 0.97 percent to the 60-day episode rate in each of the next three calendar years (CY 2016 through CY 2018).
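
      For readers checking the arithmetic of the finalized schedule, three successive 0.97 percent reductions compound to roughly a 2.9 percent cumulative reduction relative to the unadjusted rate; a minimal sketch:

```python
# Cumulative effect of the finalized -0.97 percent adjustment applied in CY 2016, CY 2017,
# and CY 2018 (compounded, relative to an unadjusted 60-day episode rate).
annual_reduction = 0.0097
multiplier = (1 - annual_reduction) ** 3
print(round(multiplier, 4))                 # ~0.9712 of the unadjusted rate
print(round((1 - multiplier) * 100, 2))     # ~2.88 percent cumulative reduction
```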

      Executive Order 13563 specifies that, to the extent practicable, agencies should assess the costs of cumulative regulations. However, given potential utilization pattern changes, wage index changes, changes to the market basket forecasts, and unknowns regarding future policy changes, we believe it is neither practicable nor appropriate to forecast the cumulative impact of the rebasing adjustments on Medicare payments to HHAs for future years at this time. Changes to the Medicare program may continue to be made as a result of the Affordable Care Act, or new statutory provisions. Although these changes may not be specific to the HH PPS, the nature of the Medicare program is such that the changes may interact, and the complexity of the interaction of these changes will make it difficult to predict accurately the full scope of the impact upon HHAs for future years beyond CY 2016. We note that the rebasing adjustments to the national, standardized 60-day episode payment rate and the national per-visit rates are capped at the statutory limit of 3.5 percent of the CY 2010 amounts (as described in the preamble in section II.C. of this final rule) for each year, 2014 through 2017. The NRS rebasing adjustment will be -2.82 percent in each year, 2014 through 2017.
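
      A minimal sketch of the statutory cap noted above, using a hypothetical CY 2010 episode amount (the actual CY 2010 amounts are set out elsewhere in this rule and are not reproduced here):

```python
# Sketch of the section 3131(a) limit: each annual rebasing adjustment (2014 through 2017)
# may not exceed 3.5 percent of the applicable CY 2010 amount.
cy2010_episode_amount = 2300.00        # hypothetical placeholder, not the actual CY 2010 rate
annual_cap = 0.035 * cy2010_episode_amount
print(round(annual_cap, 2))            # maximum dollar reduction allowed in any single year

nrs_adjustment = -0.0282               # NRS rebasing adjustment, each year 2014 through 2017
print(f"{nrs_adjustment:.2%}")         # -2.82%
```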

      In addition, section 1102(b) of the Act requires us to prepare a RIA if a rule may have a significant impact on the operations of a substantial number of small rural hospitals. This analysis must conform to the provisions of section 604 of the RFA. For purposes of section 1102(b) of the Act, we define a small rural hospital as a hospital that is located outside of a metropolitan statistical area and has fewer than 100 beds. This final rule is applicable exclusively to HHAs. Therefore, the Secretary has determined this rule will not have a significant economic impact on the operations of small rural hospitals.

      Page 68712

      2. HHVBP Model

      To test the impact of upside and downside value-based payment adjustments, beginning in calendar year 2018 and in each succeeding calendar year through calendar year 2022, the HHVBP Model will adjust the final claim payment amount for a home health agency for each episode in a calendar year by an amount equal to the applicable percent. For purposes of this final rule, we have limited our analysis of the economic impacts to the value-based incentive payment adjustments. Under the model design, the incentive payment adjustments will be limited to the total payment reductions to home health agencies included in the model and would be no less than the total amount available for value-based incentive payment adjustment. Overall, the distributive impact of this rule is estimated at $380 million for CY 2018-2022. Therefore, this rule is economically significant and thus a major rule under the Congressional Review Act. The model will test the effect on quality and costs of care by applying payment adjustments based on HHAs' performance on quality measures. This rule was developed based on extensive research and experience with value-based purchasing models.
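
      To make the adjustment mechanics concrete, the following is a minimal sketch, assuming only what is stated in this rule: an HHA's final claim payment amount for each episode is adjusted by an applicable percent that is bounded by the maximum adjustment for the calendar year (for example, 8 percent in CY 2022, as noted below). The function name, claim amount, and agency-level percents are illustrative.

```python
# Illustrative sketch of applying a capped value-based payment adjustment to an episode claim.
# The 8 percent cap corresponds to the CY 2022 maximum referenced in this rule; the claim
# amount and the agency's applicable percents are hypothetical.

def adjust_claim(claim_amount, applicable_percent, max_adjustment=0.08):
    """Apply an upward or downward adjustment, bounded by the year's maximum percent."""
    capped = max(-max_adjustment, min(max_adjustment, applicable_percent))
    return round(claim_amount * (1 + capped), 2)

print(adjust_claim(2500.00, -0.05))   # 2375.0 -- a 5 percent downward adjustment
print(adjust_claim(2500.00, 0.12))    # 2700.0 -- capped at the +8 percent maximum
```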

      Guidance issued by the Department of Health and Human Services interpreting the Regulatory Flexibility Act considers effects economically ``significant'' only if greater than 5 percent of providers reach a threshold of 3 to 5 percent or more of total revenue or total costs. Among the over 1,900 HHAs in the selected states that would be expected to be included in the HHVBP Model, we estimate that the maximum percent payment adjustment resulting from this rule will only be greater than minus 3 percent for 10 percent of the HHAs included in the model (using the 8 percent maximum payment adjustment threshold to be applied in CY 2022). As a result, only 2 percent of all HHA providers nationally would be significantly impacted, falling well below the RFA threshold. In addition, the only HHAs that would be impacted with lower payments are those providers that provide the poorest quality of care, which is the main tenet of the model. This falls well below the threshold for economic significance established by HHS for requiring a more detailed impact assessment under the RFA. Thus, we are not preparing an analysis under the RFA because the Secretary has determined that this final rule would not have a significant economic impact on a substantial number of small entities.
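
      The ``only 2 percent of all HHA providers nationally'' statement can be roughly reproduced from figures already in this rule (over 1,900 HHAs in the selected states and 11,609 agencies nationally in Table 21); a minimal arithmetic sketch:

```python
# Rough reproduction of the national share of significantly impacted HHAs under the HHVBP Model.
hhas_in_model     = 1900      # "over 1,900" HHAs expected in the selected states
share_beyond_3pct = 0.10      # estimated share of model HHAs with adjustments beyond -3 percent
hhas_nationally   = 11609     # all agencies shown in Table 21

impacted = hhas_in_model * share_beyond_3pct
print(round(impacted))                               # ~190 agencies
print(round(100 * impacted / hhas_nationally, 1))    # ~1.6, i.e. roughly 2 percent nationally
```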

      In addition, section 1102(b) of the Act requires us to prepare a regulatory impact analysis if a rule may have a significant impact on the operations of a substantial number of small rural HHAs. This analysis must conform to the provisions of section 604 of the RFA. For purposes of section 1102(b) of the Act, we have identified less than 5 percent of HHAs included in the selected states that primarily serve beneficiaries that reside in rural areas (greater than 50 percent of beneficiaries served). We are not preparing an analysis under section 1102(b) of the Act because the Secretary has determined that the HHVBP Model would not have a significant impact on the operations of a substantial number of small rural HHAs.

      Section 202 of the Unfunded Mandates Reform Act of 1995 also requires that agencies assess anticipated costs and benefits before issuing any rule whose mandates require spending in any 1 year of $100 million in 1995 dollars, updated annually for inflation. In 2015, that threshold is approximately $144 million. This rule will have no consequential effect on state, local, or tribal governments or on the private sector.

      Executive Order 13132 establishes certain requirements that an agency must meet when it promulgates a proposed rule (and subsequent final rule) that imposes substantial direct requirement costs on State and local governments, preempts state law, or otherwise has Federalism implications. Since this regulation does not impose any costs on state or local governments, the requirements of Executive Order 13132 are not applicable.

      In accordance with the provisions of Executive Order 12866, this regulation was reviewed by the Office of Management and Budget.

    3. Detailed Economic Analysis

      1. HH PPS

      This final rule sets forth updates for CY 2016 to the HH PPS rates contained in the CY 2015 HH PPS final rule (79 FR 66032 through 66118). The impact analysis of this final rule presents the estimated expenditure effects of policy changes finalized in this rule. We use the latest data and best analysis available, but we do not make adjustments for future changes in such variables as number of visits or case-mix.

      This analysis incorporates the latest estimates of growth in service use and payments under the Medicare HH benefit, based primarily on Medicare claims data from 2014. We note that certain events may combine to limit the scope or accuracy of our impact analysis, because such an analysis is future-oriented and, thus, susceptible to errors resulting from other changes in the impact time period assessed. Some examples of such possible events are newly-legislated general Medicare program funding changes made by the Congress, or changes specifically related to HHAs. In addition, changes to the Medicare program may continue to be made as a result of the Affordable Care Act, or new statutory provisions. Although these changes may not be specific to the HH PPS, the nature of the Medicare program is such that the changes may interact, and the complexity of the interaction of these changes could make it difficult to predict accurately the full scope of the impact upon HHAs.

      Table 21 represents how HHA revenues are likely to be affected by the policy changes finalized in this rule. For this analysis, we used an analytic file with linked CY 2014 OASIS assessments and HH claims data for dates of service that ended on or before December 31, 2014 (as of June 30, 2015). The first column of Table 21 classifies HHAs according to a number of characteristics including provider type, geographic region, and urban and rural locations. The second column shows the number of facilities in the impact analysis. The third column shows the payment effects of the CY 2016 wage index. The fourth column shows the payment effects of the CY 2016 case-mix weights. The fifth column shows the effects of the 0.97 percent reduction to the national, standardized 60-day episode payment amount to account for nominal case-mix growth. The sixth column shows the effects of the rebasing adjustments to the national, standardized 60-day episode payment rate, the national per-visit payment rates, and NRS conversion factor. For CY 2016, the average impact for all HHAs due to the effects of rebasing is an estimated 2.4 percent decrease in payments. The seventh column shows the effects of the CY 2016 home health payment update percentage (i.e., the home health market basket update adjusted for multifactor productivity as discussed in section III.C.1. of this final rule).

      Page 68713

      The last column shows the combined effects of all the policies finalized in this rule. Overall, it is projected that aggregate payments in CY 2016 will decrease by 1.4 percent. As illustrated in Table 24, the combined effects of all of the changes vary by specific types of providers and by location. We note that some individual HHAs within the same group may experience different impacts on payments than others due to the distributional impact of the CY 2016 wage index, the extent to which HHAs had episodes in case-mix groups where the case-mix weight decreased for CY 2016 relative to CY 2015, the percentage of total HH PPS payments that were subject to the low-utilization payment adjustment (LUPA) or paid as outlier payments, and the degree of Medicare utilization.
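
      For readers tracing a row of Table 21, the Total column is approximately the combination of the individual column effects; the sketch below uses the All Agencies row (combined either additively or multiplicatively, the result rounds to the same -1.4 percent).

```python
# Combine the per-column effects for the "All Agencies" row of Table 21.
effects = [0.000, 0.000, -0.009, -0.024, 0.019]   # wage index, case-mix weights,
                                                  # nominal case-mix reduction, rebasing,
                                                  # HH payment update percentage
additive = sum(effects)
multiplicative = 1.0
for e in effects:
    multiplicative *= (1 + e)

print(round(100 * additive, 1))                   # -1.4
print(round(100 * (multiplicative - 1), 1))       # -1.4
```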

      Table 21--Estimated Home Health Agency Impacts by Facility Type and Area of the Country, CY 2016
      ------------------------------------------------------------------------------------------------------------------------------------------
                                                                                                          60-day episode
                                                                    Number of    CY 2016      CY 2016     rate nominal                  HH payment
                                                                    agencies     wage index   case-mix    case-mix        Rebasing      update
                                                                       \1\          \2\       weights     reduction \3\      \4\        percentage \5\    Total
      ------------------------------------------------------------------------------------------------------------------------------------------
      All Agencies..............................................      11,609       0.0%        0.0%        -0.9%           -2.4%         1.9%            -1.4%
      Facility Type and Control:
        Free-Standing/Other Vol/NP..............................       1,094       0.0%        0.0%        -0.9%           -2.3%         1.9%            -1.3%
        Free-Standing/Other Proprietary.........................       9,076       0.0%       -0.1%        -0.9%           -2.4%         1.9%            -1.5%
        Free-Standing/Other Government..........................         382      -0.1%        0.2%        -0.9%           -2.3%         1.9%            -1.2%
        Facility-Based Vol/NP...................................         718       0.1%        0.2%        -0.9%           -2.3%         1.9%            -1.0%
        Facility-Based Proprietary..............................         117      -0.3%        0.1%        -0.9%           -2.3%         1.9%            -1.5%
        Facility-Based Government...............................         222      -0.3%        0.3%        -0.9%           -2.3%         1.9%            -1.3%
          Subtotal: Freestanding................................      10,552       0.0%        0.0%        -0.9%           -2.4%         1.9%            -1.4%
          Subtotal: Facility-based..............................       1,057       0.0%        0.2%        -0.9%           -2.3%         1.9%            -1.1%
          Subtotal: Vol/NP......................................       1,812       0.1%        0.1%        -0.9%           -2.3%         1.9%            -1.1%
          Subtotal: Proprietary.................................       9,193       0.0%       -0.1%        -0.9%           -2.4%         1.9%            -1.5%
          Subtotal: Government..................................         604      -0.2%        0.3%        -0.9%           -2.3%         1.9%            -1.2%
      Facility Type and Control: Rural:
        Free-Standing/Other Vol/NP..............................         191      -0.9%        0.3%        -0.9%           -2.3%         1.9%            -1.9%
        Free-Standing/Other Proprietary.........................         149      -0.4%        0.1%        -0.9%           -2.3%         1.9%            -1.6%
        Free-Standing/Other Government..........................         448      -0.6%        0.0%        -0.9%           -2.3%         1.9%            -1.9%
        Facility-Based Vol/NP...................................         218      -0.7%        0.3%        -0.9%           -2.4%         1.9%            -1.8%
        Facility-Based Proprietary..............................          27      -0.1%        0.1%        -0.9%           -2.3%         1.9%            -1.3%
        Facility-Based Government...............................         131      -0.5%        0.5%        -0.9%           -2.3%         1.9%            -1.3%
      Facility Type and Control: Urban:
        Free-Standing/Other Vol/NP..............................         942       0.1%        0.0%        -0.9%           -2.3%         1.9%            -1.2%
        Free-Standing/Other Proprietary.........................       8,760       0.0%       -0.1%        -0.9%           -2.4%         1.9%            -1.5%
        Free-Standing/Other Government..........................         154      -0.3%        0.1%        -0.9%           -2.4%         1.9%            -1.6%
        Facility-Based Vol/NP...................................         500       0.2%        0.2%        -0.9%           -2.3%         1.9%            -0.9%
        Facility-Based Proprietary..............................          90      -0.4%        0.1%        -0.9%           -2.2%         1.9%            -1.5%
        Facility-Based Government...............................          91      -0.2%        0.2%        -0.9%           -2.4%         1.9%            -1.4%
      Facility Location: Urban or Rural:
        Rural...................................................       1,072      -0.6%        0.1%        -0.9%           -2.3%         1.9%            -1.8%
        Urban...................................................      10,537       0.0%        0.0%        -0.9%           -2.4%         1.9%            -1.4%
      Facility Location: Region of the Country:
        Northeast...............................................         837       0.0%        0.0%        -0.9%           -2.2%         1.9%            -1.2%
        Midwest.................................................       3,078       0.0%        0.1%        -0.9%           -2.4%         1.9%            -1.3%
        South...................................................       5,713      -0.2%       -0.1%        -0.9%           -2.4%         1.9%            -1.7%
        West....................................................       1,885       0.5%        0.0%        -0.9%           -2.3%         1.9%            -0.8%
        Other...................................................          96      -0.2%        0.0%        -0.9%           -2.4%         1.9%            -1.6%
      Facility Location: Region of the Country (Census Region):
        New England.............................................         294      -0.2%        0.0%        -0.9%           -2.1%         1.9%            -1.3%
        Mid Atlantic............................................         543       0.1%        0.0%        -0.9%           -2.3%         1.9%            -1.2%
        East North Central......................................       2,447       0.0%        0.0%        -0.9%           -2.4%         1.9%            -1.4%
        West North Central......................................         631      -0.2%        0.2%        -0.9%           -2.4%         1.9%            -1.4%
        South Atlantic..........................................       1,883       0.0%        0.0%        -0.9%           -2.4%         1.9%            -1.4%
        East South Central......................................         432      -0.3%       -0.1%        -0.9%           -2.5%         1.9%            -1.9%
        West South Central......................................       3,398      -0.3%       -0.2%        -0.9%           -2.4%         1.9%            -1.9%
        Mountain................................................         621       0.0%        0.1%        -0.9%           -2.3%         1.9%            -1.2%
        Pacific.................................................       1,264       0.7%        0.0%        -0.9%           -2.4%         1.9%            -0.7%
      Facility Size (Number of 1st Episodes):
