1 We use the term “national evaluation” to distinguish our work from the activities undertaken by evaluators who are under contract with many of the demonstration grantees to assess the implementation and outcomes of State-level projects. The word “national” should not be interpreted to mean that our findings are representative of the United States as a whole.
2 A detailed description of our evaluation goals and methods can be found in the design plan submitted to AHRQ in April 2014, available at http://www.ahrq.gov/sites/default/files/wysiwyg/policymakers/chipra/demoeval/what-we-learned/finaldesignplan.pdf.
3 A complete listing of these products and their relevance to the research questions can be found in Appendix B.
4 The 10 grantee States are Colorado, Florida, Maine, Maryland, Massachusetts, North Carolina, Oregon, Pennsylvania, South Carolina, and Utah.
5 Burwell SM. 2014 annual report on the quality of care for children in Medicaid and CHIP. November 2014. Available at https://www.medicaid.gov/medicaid/quality-of-care/downloads/2014-child-sec-rept.pdf. Accessed August 7, 2015.
6 CMS. Electronic specifications for clinical quality measures. Available at http://www.cms.gov/Regulations-and-Guidance/Legislation/EHRIncentivePrograms/Electronic_Reporting_Spec.html. Accessed June 3, 2015.
7 CMS. Guide for reading eligible professional (EP) and eligible hospital (EH) eMeasures, version 4. May 2013. Available at https://www.cms.gov/Regulations-and-Guidance/Legislation/EHRIncentivePrograms/Downloads/Guide_Reading_EP_Hospital_eCQMs.pdf. Accessed September 2, 2015.
8 Colorado and New Mexico implemented the electronic screening questionnaire as a Category E project. We include them in this section because their screening program is conceptually similar to the Pennsylvania Category B project; both projects are health IT applications.
9 See Electronic Student Health Questionnaire (eSHQ) enhances risk assessment for adolescents (available at http://www.ahrq.gov/policymakers/chipra/demoeval/what-we-learned/co-nm-specialinnovation.html) and Introducing electronic screening tools for developmental delay and autism into pediatric primary care (available at http://www.ahrq.gov/policymakers/chipra/demoeval/what-we-learned/pa-specialinnovation.html).
10 The 12 States are Alaska, Florida, Idaho, Illinois, Maine, Massachusetts, North Carolina, Oregon, South Carolina, Utah, Vermont, and West Virginia.
11 Illinois, Maine, Massachusetts, North Carolina, South Carolina, and West Virginia collected medical home survey data from more than 80 child-serving “comparison” practices that did not participate in CHIPRA practice transformation activities.
12 The 12 States were Alaska, Florida, Idaho, Illinois, Maine, Massachusetts, North Carolina, Oregon, South Carolina, Utah, Vermont, and West Virginia.
13 Information on the medical home assessment method used by Vermont is not available.
14 The MHI-RSF instrument can be found at https://www.ahrq.gov/policymakers/chipra/demoeval/index.html.
15 The Abridged Children’s EHR Format can be found at https://ushik.ahrq.gov/mdr/lists/administeredItems/Requirements?filterColumn_8=yes&system=cehrf&enableAsynchronousLoading=true.
16 Oshiro BT, Kowalewski L, Sappenfield W. A multistate quality improvement program to decrease elective deliveries before 39 weeks of gestation. Obstet Gynecol 2013 May;121(5):1025-31.
17 See the Massachusetts Child Health Quality Coalition (CHQC) at http://www.masschildhealthquality.org/.
18 The goals of State-level improvement partnerships and NIPN are to facilitate collaboration and the translation of knowledge across programs, so that States can learn from each other about strategies that work (and do not work) to improve quality of care for children under Medicaid and CHIP.
19 Evaluation Highlight 4 focuses on how the demonstration helped to elevate children on State health policy agendas.
20 Evaluation Highlight 6 addresses the issue of partnerships among the States in multistate grants. We review findings from this Evaluation Highlight in Section 3 because this topic pertains to the structure of the grant program.
21 Children’s Health Insurance Program Reauthorization Act of 2009, P.L. 111-3.
22 During our baseline analysis, Massachusetts was unable to link managed care provider identification numbers to participating CHIPRA practices, a connection necessary for us to attribute children to intervention and comparison practices. Although the State sent us managed care encounter data, we could not identify which managed care patients were cared for in CHIPRA practices. By the time we determined that Massachusetts had solved the linkage problem and that the managed care data could be used, the State was already conducting its own evaluation.
23 Christensen AL, Zickafoose JS, Natzke B, et al. Associations between practice-reported medical homeness and health care utilization among publicly insured children. Acad Pediatr 2015 May-June;15(3):267-74. Available at http://www.academicpedsjnl.net/article/S1876-2859(14)00429-X/abstract. Accessed October 27, 2015.
24 Details regarding our dissemination methods can be found in the original dissemination plan submitted to AHRQ in August 2011 and in the updates submitted in May 2012, August 2013, and May 2014.