Tuesday 20 October 2015

eHealth Pilotitis

"We cannot afford to remain a country of perpetual pilot projects, wasting the learning we need."
- Monique Bégin, Federal Minister of Health when the Canada Health Act was enacted

I once asked Greg Reed, former CEO of eHealth Ontario, why none of the most successful pilots in Ontario were ever rolled out across the province.  He explained that most pilots are tightly bound to the system they were built on because they were implemented as simple customizations of an existing system.  For example, the much-vaunted Sault Ste. Marie ePrescribing pilot consisted of handing out EMR access to local pharmacists, so the pharmacists were literally logged onto the same EMR as the physicians.  It was a quick way to get scripts from doctors to pharmacists, but a far cry from a solution that could be rolled out across the province.

The Naylor Report goes into considerable detail on Canada's serial, abusive relationships with pilot projects.  We love them, but in the end we are left feeling abandoned, frustrated and wanting.

Here's an idea: how about we institute an annual contest for the Best Pilot?  The prize would be a roll-out of the winning solution across the entire province.  Selection criteria could include scalability, to address the lock-in problem Mr. Reed described.  Nothing would make a pilot team prouder than seeing their innovative solution rolled out across the entire province!

Wednesday 26 August 2015

AHA Report on Interoperability: Implications for Canada

The American Hospital Association released an insightful report on health data interoperability this week.  You can download it here.

Practically all of the hospital systems we run in Canada are American, so we would be wise to watch interoperability developments south of the border.

Here are a couple of sections of the report that seemed relevant to the Canadian interoperability landscape:

1. Pilot-test standards before making them official:

To date, the federal government’s selection of standards for inclusion in the certification of EHRs has yielded mixed results. Future activities must be more grounded in whether standards are ready for use and accompanied by sufficient implementation guidance. The federal government should support the maturation of standards through pilots or demonstrations to determine the viability of the standard before inclusion in a federal regulation that mandates use. For example, ONC could support voluntary certification of draft standards that are being matured in pilots or demonstrations projects, which would signal federal support for innovation without imposing an immature standard on the entire provider community. With evidence from real-world pilots that a draft standard can be scaled for ubiquitous use and has moved to become a mature standard, the federal government can then consider whether regulations are needed to advance use of the standard.
Wow, just imagine if Canada had done this for HL7v3 *before* we sunk $1B into that standard...

2. States and provinces need to get out of the business of standards development:
State governments should be discouraged from establishing unique requirements that increase variation in standards and policies because variability diminishes the ability to share information across state lines. State policies also may play a role in establishing the business case for information sharing and infrastructure development.
The main benefit of developing national standards isn't the ability to move data across state or provincial boundaries.  It is the huge savings in software procurement and system integration.

3. CCDA challenges:
The current iteration of content standards, such as the Consolidated Clinical Document Architecture (CCDA), do not meet the needs of clinicians for relevant clinical data. The CCDAs shared for meaningful use include large amounts of patient data, making it hard for clinicians to easily identify the information that is important
Having worked with CCDA from multiple systems in the U.S., I was surprised by this comment.  The problem I saw with CCDA was not that it carried too much data, but rather that there are huge gaps in the CCDA produced by the current generation of hospital systems, and little consistency between the CCDA produced by different systems.



Friday 26 June 2015

ConnectingGTA: Finally a Shared EHR in Ontario

Ontario’s been struggling for years to get some form of shared Electronic Health Record (EHR) off the ground.  The closest things they’ve had so far are South-Western Ontario’s ClinicalConnect (not so much a shared EHR as a hub into the various systems in the region) and the Integrated Assessment Repository (used mostly for mental health assessments across the province).

ConnectingGTA is really the first major centralized Clinical Data Repository (CDR) that Ontario has built, thanks to some strong leadership from UHN and the ConnectingGTA Clinical Working Group.  I predict that the ConnectingGTA CDR will become the dominant EHR in the province, the one that all other shared health record initiatives end up integrating with.  This is a big deal.

Last week, I had my first meeting with the ConnectingGTA team.  Here are some things I learned:
  • The CDR contains all patient medical records from 17 sites dating back to May 2013: 27 Terabytes.
  • 2 million encounters from 17 sites are added to the CDR every week.
  • OLIS lab results are integrated at the field level, so graphs of vitals are available.
  • Most other data are stored as unstructured documents.
  • The main gap in the data is medication and primary care.
  • There is a project underway to integrate ODB data into the system and to import ClinicalConnect data (which should bring with it some primary care data).
  • In the initial pilot, 40% of the 1200 pilot users logged in, well above the roughly 20% participation a typical pilot program sees.
  • The system is onboarding thousands of new users every week.  Currently over 8000 users are signed up.
  • Main usage is in hospitals and CCACs.  It has revolutionized the transition of care from CCAC to Long-term care.
  • Most are logging in using federated access (i.e. they log in to their source system and then click through to ConnectingGTA without needing to log in again).  When they click through from a patient in their local system, they are automatically directed to that patient’s page in ConnectingGTA.
  • Some users log in directly through ConnectingGTA because the system-to-system click-through option can be slow.  In this case, they need to manually search for the patient by MRN, OHIP number, or demographics.
  • The system currently uses UHN’s in-house client registry (its list of patients).  The ConnectingGTA team would like to use the provincial registry, but the current mainframe-based one (a rudimentary extension of the old OHIP system) won’t meet ConnectingGTA’s requirements.  The team recommends that the province upgrade to a more modern Client Registry before switching ConnectingGTA over to it.
  • ConnectingGTA is on a fixed release schedule of 2 releases/year.
  • Communities outside the GTA will only be granted access to ConnectingGTA if they first contribute data.  This is for two reasons:
    • It’s an incentive to get more data into the system.
    • If ConnectingGTA is introduced into an organization with none of its patient data in it, most patient searches will come up empty and clinicians will quickly stop using it.
  • Currently there are no plans for “secondary use” of the data.  (E.g. analysis of outcome measurement or development of clinical guidelines.)
My main takeaway from the presentation is that the viewers would be far more useful if the data coming in were structured.  If a physician needs to review a patient, are they really going to sift through huge stacks of electronic documents?  That being said, perfection is the enemy of good, and having a stack of documents to read is better than having no documents at all.  Physicians are used to working this way; the new system just moves the stack of documents from a clipboard to a computer screen.  I just wonder if increasing the stack from 10 documents to 100 might dampen their interest in reading any of them…

The other thing that caught my eye was the comment about the Client Registry.  Having a shared understanding of which patient you’re talking about is essential to any EHR initiative in the province, whether it be drug systems, lab systems, online appointment booking or electronic referrals.  Ontario has taken far too long to get a functioning client registry.  So far, the shared hospital Diagnostic Imaging repositories are the only piece of provincial infrastructure using the provincial client registry.  That’s just embarrassing.

If I were running eHealth Ontario, I would have a poster on the wall with two numbers on it:
  • Number of projects waiting for access to the Client Registry
  • Average length of time to onboard a project onto the Client Registry (hint: this should be measured in days, not years)
And I’d bonus the execs on how low those two numbers are.

Wednesday 10 June 2015

The Cost of Health IT Sovereignty

Whenever I tell my friends I’m an eHealth consultant, I get the same two questions:
  1. Why don’t we just take someone else’s eHealth system that works and run it here?
  2. If finance, supply chains, and practically every other industry can move data around easily, why can’t healthcare do it?
Canadians spend far more on their healthcare IT software than they should.  A big reason for this is our provinces’ insistence on going it alone on all their IT projects: building their own networks and software systems, and doing all the certification work that goes into approving that infrastructure.

Why do our provinces feel compelled to do everything on their own?  Is it a sense of provincial pride?  “We Albertans know better than Manitobans how to run a healthcare system.”  The Bureaucratic Mandate?  “My province needs its own independent Standards, Architecture, Privacy and Security healthcare offices.”  Asserting sovereignty can feel politically rewarding, but it introduces two unnecessary costs when expanding a successful eHealth solution from one province into another:
  1. Re-certifying for Privacy and Security.
  2. Re-tooling for interoperability.
These costs could be avoided if the provinces set aside their differences and agreed to relinquish their sovereignty for certifying interoperability, privacy and security of healthcare solutions to a federal agency.

Canada Health Infoway (CHI) has been eager to take on this role and will certify that a healthcare system meets national interoperability, privacy, and security standards.  As a vendor, I welcome the opportunity to certify my system once with a national agency and be done with it.  However, before I sign up for this, I first need to be damn sure this certification will be honoured by most of the provinces.  It’s currently not mandated by any.

Without the explicit agreement of the provinces to relinquish this responsibility to CHI, CHI’s certification is meaningless.  There are two things CHI could do to fix this:
  1. Co-operation.  Persuade provinces to relinquish responsibility for this certification work to Canada Health Infoway.  Vendors want it (see the published ITAC Health position).  Citizens want it (because they want eHealth systems sooner, at a lower cost.)  All that remains is persuading provincial governments to do the right thing.
  2. Incentive.  Secure billions of dollars of funding from the Federal Government and provide it to organizations only if they purchase certified systems.  This is the approach the U.S. took with its Centers for Medicare & Medicaid Services Meaningful Use Incentive program.
If Canada doesn’t figure this out, our Health IT sector will eventually just get replaced by American solutions.  We still have time to get our act together, but we will need to act fast.  Solving this should be a top priority for Infoway.

Friday 22 May 2015

Obstacles to CDA adoption in Ontario

I'm really starting to wonder now if CDA will ever take hold in Ontario.

There was a time when I admired the adoption of CDA in the U.S. as part of their impressive "Meaningful Use" initiative.  I have worked firsthand with CDA documents from various EMR systems in the U.S. and have seen many successful Health Information Exchanges launch there, based on CCDA XDS repositories.

Despite the flurry of CDA activity south of the border, I see serious obstacles to adoption of CDA here in Ontario:
  1. The strongest case for CDA in Ontario is the abundance of CDA support and tooling in the U.S.  What's important to recognize, however, is that CDA encodes country-specific data types, such as postal codes and units of measure, that differ between the U.S. and Canada.  So even if we wanted to take advantage of American CDA tools here, we would first need to modify them to handle Canadian data types (a toy sketch after this list illustrates the point), and the cost of doing so will in many cases be prohibitive.
  2. The next case for CDA in Ontario is how naturally it would support continuity-of-care scenarios like eReferral, eConsult, hospital discharge, admission, etc.  The problem is that Ontario EMR vendors have already achieved OntarioMD 4.1 certification, which requires supporting the import and export of patient data in the OntarioMD "Core Data Set" format.  In hindsight, it's clear that Ontario should never have invented its own proprietary EMR data exchange format.  But now that we have it, the EMR vendors are going to prefer that we build on that capability rather than adding support for a completely new CDA format.
  3. Lastly, many people I speak with about CDA are quick to point out that despite all the HIEs and EMR support developed in the U.S., CDA has not come close to living up to its promise there.  In fact, the EMR backlash against CDA has prompted the formation of an industry association called the CommonWell Health Alliance that is promoting FHIR as the way forward for health data interoperability.  Every technical person I've spoken with who has seen both the CDA and FHIR specs has emphatically preferred FHIR.  Support for FHIR is snowballing everywhere.
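As a toy illustration of the data-type point in item 1 (purely hypothetical code, not taken from any real CDA tool): a validation rule hard-wired to U.S. ZIP codes will reject perfectly valid Canadian postal codes, so rules like this have to be reworked before an American tool is usable here.

```java
import java.util.regex.Pattern;

public class PostalCodeCheck {
    // The kind of rule a U.S.-built tool might hard-code: 5-digit ZIP,
    // optionally ZIP+4.
    private static final Pattern US_ZIP = Pattern.compile("\\d{5}(-\\d{4})?");

    // What a Canadian deployment actually needs: the A1A 1A1 format.
    private static final Pattern CA_POSTAL =
            Pattern.compile("[A-Za-z]\\d[A-Za-z] ?\\d[A-Za-z]\\d");

    public static void main(String[] args) {
        String torontoPostalCode = "M5G 2C4";
        System.out.println("US rule accepts it:       "
                + US_ZIP.matcher(torontoPostalCode).matches());    // false
        System.out.println("Canadian rule accepts it: "
                + CA_POSTAL.matcher(torontoPostalCode).matches()); // true
    }
}
```
Multiply that by every other country-specific detail (units of measure, identifiers, terminologies) and the cost of porting American tooling adds up quickly.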

So it now feels like we're in an awkward in-between time for EMR interoperability in Ontario.  Support for CDA is waning, but the FHIR spec is still only half-baked, and it will be years before FHIR is released as a normative standard.

I will be watching with interest how EMR interoperability unfolds south of the border.  Momentum will eventually settle on either CDA or FHIR, and it will be in Ontario's long-term best interest to follow whichever interoperability standard wins in the gigantic market to our south.


Thursday 23 October 2014

Healthcare Interoperability in Canada: Perfection is the Enemy of Good

Yesterday, as co-chair of the ITAC Interoperability and Standards Committee, I presented opening comments for an ITAC Health workshop on Interoperability.  Details of the event can be found here.  Below is the text of my opening comments.

The Problem

I’m a software developer who got into healthcare about 10 years ago.  When I joined healthcare, I was surprised by a number of things I saw.  Things like:

  • Records are stored on paper and exchanged using paper fax.
  • The software behind the desk looks like it was written in the 1980s or 1990s.
  • The endless transcribing and repeated oral communication at every encounter is reminiscent of medieval monasteries:  In a week a patient can repeat their entire medical history to multiple clinicians and dump out their bag of drugs for each and every one of them.
  • Data exchange, if it happens at all, is often extracted directly from the EMR database (bad practice) and looks like lines of custom pipe-delimited text from my Dad’s generation.

In short:  Why hasn’t technology revolutionized healthcare like it has every other industry?  It feels like Canadian Healthcare is still stuck back in the last century.  Not much has changed in the last 10 years.

Healthcare IT in Canada is behind the rest of the world by most measures.  Even the U.S., which is committed to doing everything the hard way, is years ahead of Canada when it comes to Healthcare IT.  How did we get here?  How can we fix it?

How did we get here?


You can’t blame Canada for lack of trying.  We have invested billions of dollars into major eHealth initiatives right across the country.  There has been a decade-long project to introduce new healthcare interoperability standards across Canada, organized under a Pan-Canadian EHR Blueprint to get everyone connected into centralized EHR repositories.  We were promised that everyone would have a shared electronic health record accessible by all providers by 2015.  We’re not going to make it.  What happened?

If I were to pick one overarching theme it would be this: Perfection is the enemy of Good.
I’ve seen numerous projects get derailed by intricate Privacy and Security tentacles that grow out of monstrous consent models.  Time and time again we have held up perfectly secure and functional eHealth initiatives because we’re pursuing an absolutely comprehensive and airtight privacy and security model around it.  These delays cost lives.  It’s too easy to indefinitely postpone a project over privacy and security hand waving.

Another issue I’ve seen hold Canada back is our fantasy that each province is a unique flower, requiring completely different infrastructure, software, and its own independent standards committees and EHR programs.  Get OVER yourselves.  We will all save a heck of a lot of money when the provinces just get together and present Canada as a single market to the international Healthcare vendor community, rather than as a balkanized collection of misfits.

From a software developer’s perspective, I can tell you that one issue that contributed to delaying Canada’s eHealth agenda is the quality of our interoperability standards.  I’ve heard people say, “I don’t care what message standard you use to move your data around—the technology is irrelevant—the interoperability standard isn’t the problem.”  To this, I say “hogwash!”  I’ve seen good APIs and I’ve seen bad APIs.  The “P” in “API” stands for “Programming,” and if you want to know whether a proposed API is any good, you have to ask an experienced programmer.  If you take a look at the HL7v3 standard, it looks to me like that step was skipped.  If it costs 10 times as much effort to implement one API as another, that’s a sign there is probably a problem with your API.
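
To make the good-API-versus-bad-API point concrete, here is a deliberately exaggerated toy sketch (every type below is hypothetical, not taken from any real HL7v3 or FHIR implementation, and it assumes a recent JDK with record support).  It retrieves the same fact, a patient's family name, through a simple API and through an over-modelled one:

```java
// Toy illustration only: all types here are hypothetical and exaggerated.
public class ApiErgonomics {

    // A simple, programmer-friendly shape: one call, obvious intent.
    record Patient(String familyName) {}
    interface PatientService { Patient getPatient(String id); }

    // An over-modelled shape: the same fact buried under layers of wrappers.
    record Value(String value) {}
    record Name(Value family) {}
    record IdentifiedPerson(Name name) {}
    record Subject1(IdentifiedPerson identifiedPerson) {}
    record RegistrationEvent(Subject1 subject1) {}
    record Subject(RegistrationEvent registrationEvent) {}
    record ControlActProcess(Subject subject) {}
    record Response(ControlActProcess controlActProcess) {}

    static String simple(PatientService service) {
        return service.getPatient("12345").familyName();
    }

    static String overModelled(Response response) {
        return response.controlActProcess().subject().registrationEvent()
                .subject1().identifiedPerson().name().family().value();
    }

    public static void main(String[] args) {
        PatientService service = id -> new Patient("Doe");
        Response response = new Response(new ControlActProcess(new Subject(
                new RegistrationEvent(new Subject1(new IdentifiedPerson(
                        new Name(new Value("Doe"))))))));
        System.out.println(simple(service) + " / " + overModelled(response));
    }
}
```
Both methods return the same value; the difference is how much code every integrator has to write, debug and maintain to get at it, and that difference is what multiplies implementation effort by a factor of 10.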

I think when the whole Canadian HL7v3 thing started out, there were a number of vendors involved in the process.  But one by one they dropped out, and the torch was left to be carried by committees of well-intentioned, but ultimately misguided information modellers.

We in the Canadian vendor community need to take some responsibility for letting this happen.
Smaller vendors didn’t get involved because they couldn’t afford to—many were just struggling to survive in the consolidating landscape.  The tragedy here is they will be the ones most affected by lack of interoperability standards.

Larger vendors arguably stand to benefit the most from a Wild West devoid of easy-to-use interoperability standards, where their Walled Fortress can be presented as the only fully interconnected show in town!

But simply falling into the arms of a handful of large vendors will have a cost for all of us in the long run.  That cost is innovation.  It’s in our best interest to start seriously thinking about supporting a manageable collection of simple, proven interoperability standards.

How can we fix it?

Vendors are the custodians of the most experienced technical minds in Canada.  We need to bring these minds together and take on this problem.  We can’t afford to continue complaining, wiping our hands of responsibility and expecting government to figure it out for us.  We need serious software engineers at the table, rolling up our sleeves, and getting this job done.

Now it’s easy to say that.  But what can we practically do to move this forward?  I recommend 3 things.
  1. We need something in Canada akin to the IHE working groups they have in the U.S.: a focal point for vendor input on the direction interoperability standards will take in Canada.  This needs to happen at the national level.
  2. We need to leverage infrastructure already deployed and we need to leverage standards that have already been successfully implemented in other parts of the world.  This will mean moving forward with a plurality of standards, such as IHE XDS, CDA, HL7v2 and HL7v3, and potentially even FHIR. 
  3. We need to strive for simple, clear and unambiguous interoperability standards.  It’s not enough to say you broadly support a standard like HL7v2.  You need to have very specific conformance processes to go along with it that ensure my HL7v2 messages have exactly the same Z segments and use exactly the same vocabulary as your HL7v2 messages.
A bit more on the last point.  Along with each standard, you need, at a minimum, content specifications and vocabulary bindings.  And by this I don’t mean a 400-page Word document that system integrators are expected to read through and implement.  I mean MACHINE-READABLE software artifacts that completely specify how the data will be represented in bytes over the wire and how field values will be unambiguously interpreted.  Representing your specs in a machine-readable format accelerates interoperability tooling by a considerable factor.  It’s the difference between building robots, and building robots that are able to build other robots.
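
As a small illustration of what machine-readable structure buys you, here is a minimal sketch using the open-source HAPI HL7v2 library (assuming a recent 2.x release on the classpath; the message content and field paths are my own toy example, not from any real site):

```java
import ca.uhn.hl7v2.HL7Exception;
import ca.uhn.hl7v2.model.Message;
import ca.uhn.hl7v2.parser.PipeParser;
import ca.uhn.hl7v2.util.Terser;

public class Hl7v2TerserDemo {
    public static void main(String[] args) throws HL7Exception {
        // Toy ADT^A01 message; all content below is made up for illustration.
        String er7 =
            "MSH|^~\\&|LAB|SITE-A|EMR|SITE-B|20141023103000||ADT^A01|MSG00001|P|2.3\r" +
            "PID|1||1234567^^^ON^JHN||DOE^JANE||19700101|F\r";

        // The parser builds a typed message model from HL7's machine-readable
        // segment and datatype definitions; no hand-rolled string splitting.
        Message message = new PipeParser().parse(er7);

        // Terser navigates that model using concise, spec-derived paths.
        Terser terser = new Terser(message);
        System.out.println("Family name: " + terser.get("/PID-5-1"));
        System.out.println("Patient ID:  " + terser.get("/PID-3-1"));
    }
}
```
The same idea extends to conformance: when the profile itself is machine readable, a validator can flag a missing field or an unexpected Z segment automatically, instead of a human discovering it during integration testing.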

For different standards, the machine-readable artifacts look different:
  • HL7v2: conformance profiles and vocabulary dictionaries.  UHN has done some great work here with its machine-readable HAPI conformance profiles.
  • HL7v3: MIFs with vocabulary constraints.  That said, I don’t see much of a future for HL7v3 in Canada outside of pharmacy, and even there it’s not clear it will win in the long run.
  • CDA: templates with terminology constraints.  I think the jury is still out on Level 3 CDA.  The Lantana Group has made a good start at organizing CDA templates, but this space still has a long way to go; I think it suffers from some of the same challenges as HL7v3.
  • IHE: Integration Profiles.  Diagnostic Imaging is the poster child for how an initiative like this can succeed.  DI is way ahead of other domains in Canada, and we can credit IHE for much of that progress; we should consider building on this approach in other domains.
  • FHIR: resource schemas.  Given how new the FHIR standard is, it’s impressive how many online conformance test sandboxes are already publicly available; that’s a testament to how committed FHIR is to machine readability, openness and simplicity.  Read Intelliware's assessment of FHIR here.
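
To give a flavour of why that machine readability matters, here is a minimal sketch using the open-source HAPI FHIR library (assuming its DSTU2 resource structures are on the classpath; the sample Patient resource is my own toy example).  Because the resource definitions are themselves machine readable, a few lines of code can parse a resource from JSON and re-serialize it as XML with no custom mapping:

```java
import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.model.dstu2.resource.Patient;

public class FhirRoundTripDemo {
    public static void main(String[] args) {
        // Toy DSTU2 Patient resource, JSON encoded; the content is made up.
        String json = "{\"resourceType\":\"Patient\",\"id\":\"example\","
                + "\"name\":[{\"family\":[\"Doe\"],\"given\":[\"Jane\"]}]}";

        FhirContext ctx = FhirContext.forDstu2();
        Patient patient = ctx.newJsonParser().parseResource(Patient.class, json);

        // Because the resource schemas are machine readable, the same object
        // model re-serializes to XML without any custom mapping code.
        System.out.println(ctx.newXmlParser().setPrettyPrint(true)
                .encodeResourceToString(patient));
    }
}
```
The same round-trip works for any resource type, which is exactly the kind of tooling leverage a 400-page Word specification can never give you.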


In closing, I’m asking the vendors: give us your best engineers, and let’s work together to get serious about establishing some simple, functioning interoperability standards and get our healthcare data moving!

Friday 27 June 2014

When off-shoring software to India, include code quality metrics as a part of the contract

I understand the appeal of off-shoring software development to India: low rates, scalable team size, and a process that has really matured over the years.  India is a serious and credible competitor for software development services.

I have personally been asked to maintain software written by large Indian off-shore companies.  While the software usually meets the functional requirements and passes manual QA testing, in my experience the quality of the code written overseas is often poor.  Specifically, the resulting code is not extensible and it is expensive to maintain.  I am not exaggerating when I say I have seen 2,000-line methods inside 6,000-line classes that were copy/pasted multiple times.

Setting aside for a moment the implicit conflict of interest in writing code that is expensive to maintain, and in fairness to the Indian offshore developers: when customers complain that it's expensive to change features and add new ones to the delivered system, the developers innocently respond, "well, you never told us you were going to need those changes..."

There is a simple answer to this: ask for it up front.  And I don't mean asking for the system to be extensible and maintainable; that's too vague.  I mean requiring the developer to run a Continuous Integration server (such as Jenkins) with a code quality platform such as SonarQube, and measuring the specific code quality metrics that matter.

In my experience, measuring the following 4 metrics goes a long way towards ensuring the code you get back is extensible and maintainable.

  1. Package Tangle Index = 0 cycles.  This ensures the software is properly layered, essential for extensibility.
  2. Code Coverage between 60% and 80%.  This is essential for keeping maintenance costs low.  This metric is about automated testing: the unit tests quickly catch side-effects of future feature changes, allowing you to change how the system behaves and get those changes into production with a minimum of manual regression testing.
  3. Duplication < 2%.  Any competent developer will maintain low code duplication as a basic pride of craft.  But I have been astonished at the amount of copy/paste code I've seen come back from India.  If you don't measure it, unscrupulous coders will take this shortcut and produce a system whose maintenance costs quickly spiral out of control.
  4. Complexity: < 2.0 / method and < 6.0 / class.  This metric plays a huge role in extensibility.  Giant classes with giant methods make a system brittle and resistant to change.  Imagine a building made out of a few giant Lego blocks versus the same building made out of 10 times as many smaller Lego blocks: the latter is far easier to reshape as business needs change.  The sketch below shows what this looks like in code.
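To make the complexity metric concrete, here is a small, contrived Java sketch (hypothetical discount logic, not from any real system).  The same calculation is written once as a single branching method and once decomposed into small, single-purpose methods; the second version keeps the per-method cyclomatic complexity low and is far easier to change safely:

```java
// Contrived example: the same discount calculation written two ways.
public class ComplexityExample {

    // One method carrying all the branching: the cyclomatic complexity piles
    // up here, and every new rule makes the method harder to change safely.
    static double discountMonolithic(String tier, int years, double amount) {
        int discountPercent = 0;
        if (tier.equals("GOLD")) {
            if (years > 5) { discountPercent = 20; } else { discountPercent = 15; }
        } else if (tier.equals("SILVER")) {
            if (years > 5) { discountPercent = 10; } else { discountPercent = 5; }
        }
        if (amount > 10_000) {
            discountPercent += 2;
        }
        return amount * (100 - discountPercent) / 100;
    }

    // The same logic split into small methods: each piece is trivial to read,
    // test and change, so average complexity per method stays low.
    static double discountComposed(String tier, int years, double amount) {
        int percent = tierDiscountPercent(tier, years) + volumeBonusPercent(amount);
        return amount * (100 - percent) / 100;
    }

    static int tierDiscountPercent(String tier, int years) {
        int base = tier.equals("GOLD") ? 15 : tier.equals("SILVER") ? 5 : 0;
        return base + loyaltyBonusPercent(tier, years);
    }

    static int loyaltyBonusPercent(String tier, int years) {
        boolean eligible = tier.equals("GOLD") || tier.equals("SILVER");
        return (eligible && years > 5) ? 5 : 0;
    }

    static int volumeBonusPercent(double amount) {
        return amount > 10_000 ? 2 : 0;
    }

    public static void main(String[] args) {
        System.out.println(discountMonolithic("GOLD", 6, 12_000)); // 9360.0
        System.out.println(discountComposed("GOLD", 6, 12_000));   // 9360.0
    }
}
```
SonarQube reports complexity averages at the method and class level, so a threshold written into the contract makes the second style the path of least resistance.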
A word of caution about using SonarQube.  Some developers, particularly those with a perfectionist bent, can get lost in a rabbit hole of trying to improve their code's "score" on many of the other metrics offered by the tool.  Violations, Rules Compliance, Technical Debt Score and LCOM4 are particularly tempting to undisciplined developers.  But in my experience, these metrics provide limited return on investment, and if you do decide to measure your code quality, I urge you to ignore them.  While it can be a lot of fun spending weeks making your code "squeaky clean," the business value of those other metrics pales in comparison to what you get out of the 4 metrics I recommended.

So the next time you outsource a development project to India, protect yourself from getting back junk by requiring code quality metrics right in the contract.  It might add an extra 10% to the initial cost of the system, but that cost will be more than offset by the resulting extensibility and maintainability of the code you get back.