Dr. Ben Chan is President and CEO of Health Quality Ontario.
The Health Council of Canada's 2012 Progress Report describes numerous advances over the last decade in public reporting, at both the provincial and pan-Canadian levels. But are we maximizing our ability to compare health system performance across the country? The answer, it appears, is no. The Council's report notes, for example, that many jurisdictions feel the indicators agreed on through the First Ministers' Accord aren't sufficient for their own measurement and reporting needs. Why, then, is it so difficult to create a robust set of meaningful indicators for interprovincial comparisons?
One reason offered by the report is that provinces appear more interested in creating measurement systems for their own needs. There is certainly evidence of that in Ontario, where the province has invested heavily in reporting on surgical wait times, hospital-acquired infections and other patient safety measures. For most of these measures, there are no equivalents elsewhere in the country. Provinces with a firm agenda for quality cannot afford to wait for national consensus on indicator definitions before pushing for improvement.
But does the “narrow provincial focus” argument really explain the aspirations of individual provinces? Not quite. Here in Ontario, one of the strongest messages I’ve been hearing from local CEOs is a huge interest in benchmarks. This thirst for comparable data is in no small part due to the quality improvement plans legislated for hospitals under the Excellent Care for All Act, 2010 (ECFAA). Hospitals are required by law to set numeric targets for improvement for each fiscal year. As a result, hospitals are looking for guidance as to what their evidence-based targets should be, and want to incorporate the strategies employed by high performers elsewhere. This demand will only grow as the requirements of the ECFAA are extended to other parts of the health system, such as primary care.
The thirst for comparisons goes further. Already, in our annual Quality Monitor report, we ask tough questions as to why we’re not matching the best results achieved elsewhere, in areas where good interprovincial comparisons exist. Why, for example, does Colchester-Hants, Nova Scotia, have a flu vaccination rate of 82%, when Ontario’s average is 68%? What are they doing that we’re not, but should be? We’re also asking ourselves why the overall rate of hospitalization for ambulatory care-sensitive conditions is 275 per 100,000 in Ontario, compared to a mere 162 in Richmond, B.C.
To achieve robust pan-Canadian reporting, we can build on our current successes in creating nationally standardized data sets and surveys. CIHI and Statistics Canada have made huge contributions to the country by making data sets like the Discharge Abstract Database and Canadian Community Health Survey available. Some obvious next steps could be common definitions for reporting on wait times, and a minimum core set of standardized patient experience questions for primary, acute, long-term and community-based care.
Electronic medical records (EMRs) also represent a treasure trove of information on what should be one of our top priorities: providing better chronic disease management (CDM). EMRs are proliferating across the country, and many EMR vendors now operate in multiple provinces. Health Quality Ontario is currently advocating for more stringent EMR vendor specifications to ensure that standardized data on CDM are collected. Our close partnership with CIHI can help ensure that these specifications evolve into national standards.
All of us want to improve and to be among the top performers, but the only way we can truly know how we are performing is through comparison, both to ourselves over time and to others. What we need now is a paradigm shift that recognizes nationally standardized data sharing as one of our most powerful tools to drive change. Indeed, it is in the self-interest of each province to do whatever it can to advance the sharing of comparable indicators.