Friday, 26 June 2015

ConnectingGTA: Finally a Shared EHR in Ontario

Ontario’s been struggling for years to get some form of shared Electronic Health Record (EHR) off the ground.  The closest things it has had so far are South-Western Ontario’s ClinicalConnect (not so much a shared EHR as a hub into the various systems in the region) and the Integrated Assessment Repository (used mostly for mental health assessments across the province).

ConnectingGTA is really the first major centralized Clinical Data Repository (CDR) that Ontario has built, thanks to some strong leadership from UHN and the ConnectingGTA Clinical Working Group.  I predict that the ConnectingGTA CDR will become the dominant EHR in the province that all other shared health records initiatives will end up integrating into.  This is a big deal.

Last week, I had my first meeting with the ConnectingGTA team.  Here are some things I learned:
  • The CDR contains all patient medical records from 17 sites dating back to May 2013: 27 Terabytes.
  • 2 million encounters from 17 sites are added to the CDR every week.
  • OLIS lab results are integrated at the field level, so graphs of vitals are available.
  • Most other data are stored as unstructured documents.
  • The main gaps in the data are medications and primary care.
  • There is a project underway to integrate ODB data into the system and to import ClinicalConnect data (which should bring with it some primary care data).
  • In the initial pilot, 40% of the 1,200 pilot users logged in, double the 20% participation a typical pilot program sees.
  • The system is onboarding thousands of new users every week.  Currently over 8000 users are signed up.
  • Main usage is in hospitals and CCACs.  It has revolutionized the transition of care from CCAC to long-term care.
  • Most users log in through federated access (i.e., they log in to their source system and then click through to ConnectingGTA without needing to log in again).  When they click through from a patient in their local system, they are automatically directed to that patient’s page in ConnectingGTA.  (A rough sketch of this kind of context-passing link appears after this list.)
  • Some users log in directly through ConnectingGTA because the system-to-system click-through option can be slow.  In that case, they need to search for the patient manually by MRN, OHIP number, or demographics.
  • The system currently uses UHN’s in-house client registry (its master list of patients).  The ConnectingGTA team would like to use the provincial registry, but the current mainframe-based one (a rudimentary extension of the old OHIP system) won’t meet ConnectingGTA’s requirements.  The team recommends that the province upgrade to a more modern client registry before switching ConnectingGTA over to it.
  • ConnectingGTA is on a fixed release schedule of 2 releases/year.
  • Communities outside the GTA will only be granted access to ConnectingGTA if they first contribute data.  This is for two reasons:
    • It’s an incentive to get more data into the system.
    • If ConnectingGTA is introduced into an organization with none of its patient data in it, most patient searches will come up empty and clinicians will stop using it.
  • Currently there are no plans for “secondary use” of the data (e.g., analysis of outcome measures or development of clinical guidelines).
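
As a rough illustration of the click-through link mentioned above: the viewer URL and parameter names below are hypothetical, and a real federated launch would also carry a signed security assertion; this only sketches the general shape of a context-passing deep link.

    import java.net.URLEncoder;

    public class ContextLaunchLink {
        public static void main(String[] args) throws Exception {
            // Hypothetical viewer base URL and parameter names.
            String base = "https://viewer.example.org/patient";
            String url = base
                    + "?mrn=" + URLEncoder.encode("7005728", "UTF-8")
                    + "&site=" + URLEncoder.encode("UHN", "UTF-8");
            // The user lands directly on this patient's page, already
            // authenticated by the assertion their source system attached.
            System.out.println(url);
        }
    }
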
My main takeaway from the presentation is that the viewers would be far more useful if the data coming in were structured.  If a physician needs to review a patient, are they really going to sift through huge stacks of electronic documents?  That being said, perfection is the enemy of good, and having a stack of documents to read is better than having no documents at all.  Physicians are used to working this way; this new system just moves the stack of documents from a clipboard to a computer screen.  I just wonder if increasing the stack from 10 documents to 100 might dampen their interest in reading any of them…

The other thing that caught my eye was the comment about the Client Registry.  Having a shared understanding of which patient you’re talking about is essential to any EHR initiative in the province, whether for drug systems, lab systems, online appointment booking, or electronic referrals.  Ontario has taken far too long to get a functioning client registry.  So far, hospital shared Diagnostic Imaging is the only provincial infrastructure using the provincial client registry.  That’s just embarrassing.

If I were running eHealth Ontario, I would have a poster on the wall with two numbers on it:
  • Number of projects waiting for access to the Client Registry
  • Average length of time to onboard a project onto the Client Registry (hint: this should be measured in days, not years)
And I’d bonus the execs on how low those two numbers are.

Wednesday, 10 June 2015

The Cost of Health IT Sovereignty

Whenever I tell my friends I’m an eHealth consultant, I get the same two questions:
  1. Why don’t we just take someone else’s eHealth system that works and run it here?
  2. If finance, supply chains, and practically every other industry can move data around easily, why can’t healthcare do it?
Canadians spend far more on their healthcare IT software than they should.  A big reason for this is our provinces’ insistence on going it alone on all their IT projects: building their own networks and software systems, and doing all the certification work that goes into approving that infrastructure.

Why do our provinces feel compelled to do everything on their own?  Is it a sense of provincial pride?  “We Albertans know better than Manitobans how to run a healthcare system.”  The bureaucratic mandate?  “My province needs its own independent Standards, Architecture, Privacy and Security healthcare offices.”  Asserting sovereignty can feel politically rewarding, but it introduces two unnecessary costs to expanding a successful eHealth solution from one province into another:
  1. Re-certifying for Privacy and Security.
  2. Re-tooling for interoperability.
These costs could be avoided if the provinces set aside their differences and agreed to relinquish their sovereignty for certifying interoperability, privacy and security of healthcare solutions to a federal agency.

Canada Health Infoway (CHI) has been eager to take on this role and will certify that a healthcare system meets national interoperability, privacy, and security standards.  As a vendor, I welcome the opportunity to certify my system once with a national agency and be done with it.  However, before I sign up for this, I first need to be damn sure this certification will be honoured by most of the provinces.  It’s currently not mandated by any.

Without the explicit agreement of the provinces to relinquish this responsibility to CHI, CHI’s certification is meaningless.  There are two things CHI could do to fix this:
  1. Co-operation.  Persuade provinces to relinquish responsibility for this certification work to Canada Health Infoway.  Vendors want it (see the published ITAC Health position).  Citizens want it (because they want eHealth systems sooner, at a lower cost).  All that remains is persuading provincial governments to do the right thing.
  2. Incentive.  Secure billions of dollars of funding from the Federal Government and provide it to organizations only if they purchase certified systems.  This is the approach the U.S. took with its Centers for Medicare & Medicaid Services Meaningful Use Incentive program.
If Canada doesn’t figure this out, our Health IT sector will eventually just get replaced by American solutions.  We still have time to get our act together, but we will need to act fast.  Solving this should be a top priority for Infoway.

Friday, 22 May 2015

Obstacles to CDA adoption in Ontario

I'm really starting to wonder now if CDA will ever take hold in Ontario.

There was a time when I admired the adoption of CDA in the U.S. as part of their impressive "Meaningful Use" initiative.  I have worked first-hand with CDA documents from various EMR systems in the U.S. and have seen many successful Health Information Exchanges launch there based on C-CDA XDS repositories.

Despite the flurry of CDA activity south of the border, I see serious obstacles to adoption of CDA here in Ontario:
  1. The strongest case for CDA in Ontario is the abundance of CDA support and tooling in the U.S.  What's important to recognize, however, is that CDA encodes country-specific data types, like postal codes and units of measure, that differ between the U.S. and Canada.  So even if we wanted to take advantage of American CDA tools here in Canada, we would first need to modify those tools to use Canadian data types.  The cost of this will in many cases be prohibitive.  (A trivial sketch of the data-type problem appears after this list.)
  2. The next case for CDA in Ontario is how naturally it would support continuity-of-care scenarios like eReferral, eConsult, hospital discharge, and admission.  The problem with this is that Ontario EMR vendors have already achieved OntarioMD 4.1 certification, which requires supporting the import and export of patient data in the OntarioMD "Core Data Set" format.  In hindsight, it's clear that Ontario should never have invented its own proprietary EMR data exchange format.  But now that we have it, the EMR vendors are going to prefer that we build on that capability rather than add support for a completely new CDA format.
  3. Lastly, many people I speak with about CDA are quick to point out that despite all the HIEs and EMR support developed in the U.S., CDA has not come close to living up to its promise there.  In fact, the EMR backlash against CDA has prompted the formation of an industry association called the CommonWell Health Alliance that is promoting FHIR as the way forward for health data interoperability.  Every technical person I've spoken with who has seen both the CDA and FHIR specs has emphatically preferred FHIR.  Support for FHIR is snowballing everywhere.
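
To illustrate the data-type obstacle from point 1: even something as small as a postal code needs different validation on each side of the border, so U.S. tooling that enforces U.S. formats has to be reworked before it can handle Canadian documents.  The patterns below are a deliberately trivial sketch of my own, not taken from any particular CDA toolkit.

    import java.util.regex.Pattern;

    public class PostalCodeCheck {
        // U.S. ZIP code: 5 digits, optionally followed by a 4-digit extension.
        private static final Pattern US_ZIP =
                Pattern.compile("\\d{5}(-\\d{4})?");
        // Canadian postal code: letter-digit-letter, space, digit-letter-digit.
        private static final Pattern CA_POSTAL =
                Pattern.compile("[A-Za-z]\\d[A-Za-z] ?\\d[A-Za-z]\\d");

        public static void main(String[] args) {
            System.out.println(US_ZIP.matcher("90210").matches());      // true
            System.out.println(CA_POSTAL.matcher("M5G 2C4").matches()); // true
            System.out.println(US_ZIP.matcher("M5G 2C4").matches());    // false: U.S.-only tooling rejects it
        }
    }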

So it now feels like we're in an awkward in-between time for EMR interoperability in Ontario.  Support for CDA is waning, but the FHIR spec is still only half-baked.  It will be years before FHIR is released as a normative standard.

I will be watching with interest how EMR interoperability unfolds south of the border.  Momentum will ultimately fall to either CDA or FHIR, and it will be in Ontario's long-term best interest to follow whichever interoperability standard wins in the gigantic market to our south.


Thursday, 23 October 2014

Healthcare Interoperability in Canada: Perfection is the Enemy of Good

Yesterday, as co-chair of the ITAC Interoperability and Standards Committee, I presented opening comments for an ITAC Health workshop on Interoperability.  Details of the event can be found here.  Below is the text of my opening comments.

The Problem

I’m a software developer who got into healthcare about 10 years ago.  When I joined healthcare, I was surprised by a number of things I saw.  Things like:

  • Records are stored on paper and exchanged using paper fax.
  • The software behind the desk looks like it was written in the 1980s or 1990s.
  • The endless transcribing and repeated oral communication at every encounter is reminiscent of medieval monasteries: in a week, a patient can repeat their entire medical history to multiple clinicians and dump out their bag of drugs for each and every one of them.
  • Data exchange, if it happens at all, often means extracting records directly from the EMR database (bad practice) and shipping them around as lines of custom pipe-delimited text from my Dad’s generation.

In short:  Why hasn’t technology revolutionized healthcare like it has every other industry?  It feels like Canadian Healthcare is still stuck back in the last century.  Not much has changed in the last 10 years.

Healthcare IT in Canada is behind the rest of the world by most measures.  Even the U.S., committed as it is to doing everything the hard way, is years ahead of Canada when it comes to Healthcare IT.  How did we get here?  How can we fix it?

How did we get here?


You can’t blame Canada for lack of trying.  We have invested billions of dollars into major eHealth initiatives right across the country.  There has been a decade-long project to introduce new healthcare interoperability standards across Canada, organized under a Pan-Canadian EHR Blueprint to get everyone connected into centralized EHR repositories.  We were promised that everyone would have a shared electronic health record accessible by all providers by 2015.  We’re not going to make it.  What happened?

If I were to pick one overarching theme it would be this: Perfection is the enemy of Good.
I’ve seen numerous projects get derailed by intricate Privacy and Security tentacles that grow out of monstrous consent models.  Time and time again we have held up perfectly secure and functional eHealth initiatives because we’re pursuing an absolutely comprehensive and airtight privacy and security model around them.  These delays cost lives.  It’s too easy to indefinitely postpone a project over privacy and security hand-waving.

Another issue I’ve seen hold Canada back is our fantasy that each province is a unique flower, requiring completely different infrastructure, software, and its own independent standards committees and EHR programs.  Get OVER yourselves.  We will all save a heck of a lot of money when the provinces just get together and present Canada as a single market to the international Healthcare vendor community, rather than as a balkanized collection of misfits.

From a software developer’s perspective, I can tell you one issue that contributed to delaying Canada’s eHealth agenda: the quality of our Interoperability Standards.  I’ve heard people say, “I don’t care what message standard you use to move your data around—the technology is irrelevant—the interoperability standard isn’t the problem.”  To this, I say “hogwash!”  I’ve seen good APIs and I’ve seen bad APIs.  The “P” in “API” stands for “Programming,” and if you want to know whether a proposed API is any good, you have to ask an experienced programmer.  If you take a look at the HL7v3 standard, it looks to me like they skipped this step.  If it costs 10 times as much effort to implement one API over another, that’s a sign there’s probably a problem with your API.

I think when the whole Canadian HL7v3 thing started out, there were a number of vendors involved in the process.  But one by one they dropped out, and the torch was left to be carried by committees of well-intentioned, but ultimately misguided information modellers.

We in the Canadian vendor community need to take some responsibility for letting this happen.
Smaller vendors didn’t get involved because they couldn’t afford to—many were just struggling to survive in the consolidating landscape.  The tragedy here is they will be the ones most affected by lack of interoperability standards.

Larger vendors arguably stand to benefit the most from a Wild West devoid of easy-to-use interoperability standards, where their Walled Fortress can be presented as the only fully interconnected show in town!

But simply falling into the arms of a handful of large vendors will have a cost for all of us in the long run.  That cost is innovation.  It’s in our best interest to start seriously thinking about supporting a manageable collection of simple, proven interoperability standards.

How can we fix it?

Vendors are the custodians of the most experienced technical minds in Canada.  We need to bring these minds together and take on this problem.  We can’t afford to continue complaining, wiping our hands of responsibility and expecting government to figure it out for us.  We need serious software engineers at the table, rolling up our sleeves, and getting this job done.

Now it’s easy to say that.  But what can we practically do to move this forward?  I recommend 3 things.
  1. We need something in Canada akin to the IHE working groups they have in the U.S.: a focal point for vendor input on the direction interoperability standards will take in Canada.  This needs to happen at the national level.
  2. We need to leverage infrastructure already deployed and we need to leverage standards that have already been successfully implemented in other parts of the world.  This will mean moving forward with a plurality of standards, such as IHE XDS, CDA, HL7v2 and HL7v3, and potentially even FHIR. 
  3. We need to strive for simple, clear and unambiguous interoperability standards.  It’s not enough to say you broadly support a standard like HL7v2.  You need to have very specific conformance processes to go along with it that ensure my HL7v2 messages have exactly the same Z segments and use exactly the same vocabulary as your HL7v2 messages.
A bit more on that last point.  Along with each standard, you need, at a minimum, content specifications and vocabulary bindings.  And by this I don’t mean a 400-page Word document that system integrators are expected to read through and implement.  I mean MACHINE READABLE software artifacts that completely specify how the data will be represented in bytes over the wire and how field values will be unambiguously interpreted.  Representing your specs in a machine readable format accelerates interoperability tooling by a considerable factor.  It’s the difference between building robots, and building robots that are able to build other robots.
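
To make this concrete, here is a minimal sketch of what a machine readable grammar buys a programmer in practice, using UHN’s open source HAPI library for HL7v2.  The sample message and identifiers are invented for illustration; a real deployment would also validate against a conformance profile.

    import ca.uhn.hl7v2.model.v24.message.ADT_A01;
    import ca.uhn.hl7v2.parser.PipeParser;

    public class Hl7v2ParseExample {
        public static void main(String[] args) throws Exception {
            // A made-up ADT A01 (admission) message, pipe-delimited on the wire.
            String msg =
                  "MSH|^~\\&|SEND_APP|SEND_FAC|RCV_APP|RCV_FAC|20141022120000||ADT^A01|MSG0001|P|2.4\r"
                + "PID|1||7005728^^^UHN^MR||SMITH^JOHN||19610615|M\r"
                + "PV1|1|I\r";

            // Because the message grammar is machine readable, the parser hands
            // back a typed object model instead of raw pipe-delimited text.
            PipeParser parser = new PipeParser();
            ADT_A01 adt = (ADT_A01) parser.parse(msg);
            String mrn = adt.getPID().getPatientIdentifierList(0).getID().getValue();
            String surname = adt.getPID().getPatientName(0).getFamilyName().getSurname().getValue();
            System.out.println(mrn + " " + surname);  // 7005728 SMITH
        }
    }

The same principle applies to vocabulary: a machine readable dictionary lets the parser reject a bad code at the gate instead of leaving it for a human integrator to find months later.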

For different standards this means different things.
  • HL7v2: conformance profiles and vocabulary dictionaries.  UHN has done some great work here with their machine readable HAPI conformance profiles.
  • HL7v3: MIFs with vocabulary constraints.  That said, I don’t see much of a future for HL7v3 in Canada outside of pharmacy, and even there it’s not clear it will win in the long run.
  • CDA: templates with terminology constraints.  I think the jury’s still out on level 3 CDA.  The Lantana Group has made a good start at organizing CDA templates, but this space still has a long way to go; it suffers from some of the same challenges that HL7v3 faces.
  • IHE: Integration Profiles.  Diagnostic Imaging is the poster child for how an initiative like this can be successful.  DI is way ahead of other domains in Canada, and we can credit IHE for much of that progress.  We need to consider building on the success of this approach in other domains.
  • FHIR: resource schemas.  Given how new the FHIR standard is, it’s impressive how many online conformance test sandboxes are already publicly available.  That’s a testament to how committed FHIR is to machine readability, openness and simplicity.  Read Intelliware's assessment of FHIR here.
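
One reason those FHIR sandboxes are so easy to stand up is that reading a FHIR resource is just an HTTP GET against a predictable URL.  A minimal sketch follows; the server address is hypothetical (the public test sandboxes follow the same pattern), and the MIME type shown is the one the current draft FHIR spec uses.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class FhirReadExample {
        public static void main(String[] args) throws Exception {
            // Hypothetical FHIR server; every FHIR endpoint exposes /Patient/{id}.
            URL url = new URL("https://fhir.example.org/Patient/123?_format=json");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("GET");
            conn.setRequestProperty("Accept", "application/json+fhir");
            try (BufferedReader reader = new BufferedReader(
                    new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    System.out.println(line);  // the Patient resource as JSON
                }
            }
        }
    }

Compare that with what it takes to retrieve the same record over HL7v3, and the vendor enthusiasm for FHIR starts to make sense.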


In closing, I’m asking the vendors: give us your best engineers, and let’s work together to get serious about establishing some simple, functioning interoperable standards to get our healthcare data moving!

Friday, 27 June 2014

When off-shoring software to India, include code quality metrics as a part of the contract

I understand the appeal of off-shoring software development to India: low rates, scalable team size, and a process that has really matured over the years.  India is a serious and credible competitor for software development services.

I have personally been asked to maintain software written by large Indian off-shore companies.  While the software usually meets the functional requirements and passes manual QA testing, in my experience the quality of the code written overseas is often poor.  Specifically, the resulting code is not extensible, and it is expensive to maintain.  I am not exaggerating when I say I have seen 2000-line methods inside 6000-line classes that were copy/pasted multiple times.

Setting aside for a moment the implicit conflict of interest in writing code that is expensive to maintain: in fairness to the Indian offshore developers, when customers complain that it's expensive to change features and add new ones in the delivered system, the developers innocently respond, "well, you never told us you were going to need those changes..."

There is a simple answer to this: ask for it up front.  And I don't mean ask for the system to be "extensible and maintainable."  That's vague.  I mean require the developer to run a Continuous Integration server (such as Jenkins) with a code quality tool such as SonarQube, and measure the specific code quality metrics that matter.

In my experience, measuring the following 4 metrics goes a long way towards ensuring the code you get back is extensible and maintainable.

  1. Package Tangle Index = 0 cycles.  This ensures the software is properly layered, which is essential for extensibility.
  2. Code Coverage between 60% and 80%.  This is essential for low maintenance costs.  This metric is about automated testing.  The automated unit tests quickly discover side-effects of future feature changes, allowing you to make changes to how the system behaves and get those changes into production with a minimum of manual regression testing.
  3. Duplication < 2%.  Any competent developer will maintain low code duplication as a basic pride of craft.  But I have been astonished at the amount of copy/paste code I've seen come back from India.  If you don't measure it, unscrupulous coders will take this shortcut and produce a system whose maintenance costs quickly spiral out of control.
  4. Complexity < 2.0 / method and < 6.0 / class.  This metric plays a huge role in extensibility.  Giant classes with giant methods make a system brittle and resistant to change.  Imagine a building made out of a few giant Lego blocks versus the same building made out of 10 times as many smaller Lego blocks.  The latter will be far more flexible to reshape as business needs change.  (A sketch of the kind of refactoring this metric encourages appears after this list.)
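
To make the complexity metric concrete, here is the kind of refactoring it pushes developers towards: one branchy method broken into small, single-purpose methods, each of which stays simple.  The invoice domain and all the names here are invented for illustration.

    public class InvoiceCalculator {
        // Folding validation, discounting, and tax into one method would pile
        // all the branching into a single place and blow past 2.0 complexity
        // per method; splitting the concerns keeps every method trivial.

        public double total(double subtotal, boolean preferredCustomer, String province) {
            validate(subtotal);
            double discounted = applyDiscount(subtotal, preferredCustomer);
            return discounted + tax(discounted, province);
        }

        private void validate(double subtotal) {
            if (subtotal < 0) {
                throw new IllegalArgumentException("subtotal must be non-negative");
            }
        }

        private double applyDiscount(double subtotal, boolean preferredCustomer) {
            return preferredCustomer ? subtotal * 0.95 : subtotal;
        }

        private double tax(double amount, String province) {
            return "ON".equals(province) ? amount * 0.13 : amount * 0.05;
        }

        public static void main(String[] args) {
            System.out.println(new InvoiceCalculator().total(100.0, true, "ON"));  // 107.35
        }
    }

Each small method is also far easier to cover with a unit test, so this one refactoring habit feeds the coverage metric as well.
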
A word of caution about using SonarQube.  Some developers, particularly those with a perfectionist bent, can get lost in a rabbit hole of trying to improve their code's "score" on many of the other metrics offered by the tool.  Violations, Rules Compliance, Technical Debt Score and LCOM4 are particularly tempting to undisciplined developers.  But in my experience, these metrics provide limited return on investment.  If you do decide to measure your code quality, I urge you to ignore them.  While it can be a lot of fun spending weeks making your code "squeaky clean," the business value of these other metrics pales in comparison to what you get out of the 4 metrics I recommended.

So the next time you outsource a development project to India, protect yourself from getting back junk by requiring code quality metrics right in the contract.  It might add an extra 10% to the initial cost of the system, but that cost will be more than offset by the resulting extensibility and maintainability of the code you get back.

Friday, 28 June 2013

What Ontario can learn from Northern Europe

Earlier this month, I participated in an event hosted by the Canadian Foundation for Healthcare Improvement.  The goal of the event was to bring together thought leaders from seven countries to discuss and debate Canada's Healthcare Strategy.  Paul Martin, Deb Matthews, Don Drummond and Michael Guerriere were all there and it was an excellent discussion.  Details of the event can be found here.  Below are some of the ideas that caught my attention.


Startling Facts from Ontario

15% of prescriptions are not filled in Ontario because the patient can't afford the medication.

20% of hospital beds in the province are occupied by someone who shouldn't be in a hospital.  (This is often called the "ALC" problem: Alternate Level of Care.)  Hospital beds are the most expensive beds in our health care system.

How Sweden fixed ALC

Sweden had the same chronic ALC problem as Ontario until a couple of years ago, when they introduced an innovative solution.  A problem they couldn't solve for decades suddenly disappeared within 3 months.  What Sweden did was split the jurisdictional responsibility for hospital care from long-term care: the county (Sweden's rough equivalent of a province) kept responsibility for acute care, but responsibility for long-term care, along with the funding, moved to the municipality.  And then, and here's the genius, the county charged the municipality a high daily hospital bed fee for every day a person was left waiting to be transferred from a hospital bed to a long-term care facility.  Since the cost of the long-term care bed was so much lower than the cost of the hospital bed, the problem resolved itself very quickly.  Now, this is easier for Sweden to do because municipalities have income tax revenue, but I thought the idea of splitting responsibility to force efficiency was brilliant.  (As an aside, in the Swedish tax model, 15% of income tax goes to the federal government, 10% to the county, and 20% to municipalities.  No wonder they have such great transit over there!)

How the Germans do it

Here in Ontario, OHIP is managed like a Big Government Program, with a heavy bureaucracy managing a lumbering public claims system funded by taxes.  In Germany, health insurance is managed more like a tightly efficient, regulated crown corporation.  Patients pay their health insurance premiums directly to the insurer.  The government subsidizes these premiums for low wage earners, while higher wage earners pay higher premiums on a salary-based sliding scale.  Because it's managed as a separate financial institution (and because it's German), there is a tireless focus on efficiency and effectiveness, managed by teams of heavyweight quants.  People are categorized into 38 different groups, with compensation to providers based on the representation of these groups in their roster.  (Compare this to Ontario's roster compensation, which has 2 categories: "normal" and "old.")  Treatment outcomes are measured, and a national drug formulary establishes best practices to manage costs.  The Germans approach health insurance like a multi-billion-dollar industry and run it like a bank.

What accounts for rising Healthcare costs?

What surprised me about rising healthcare costs was how little of the increase is due to the ageing population we hear so much about: only 10% of the increase can be accounted for by ageing.  The lion's share of the increased cost comes from increased volume of activity: more medications and more tests.  The consensus at the event was that the solution is to stop compensating providers for services and instead compensate them based on who's in their roster and reward outcomes; in other words, move to a capitation model.

What can Business Do?

Michael Guerriere (Telus Health) made a number of insightful observations about the role of business in improving Canada's healthcare landscape.

Different sectors respond to failure differently.  In the private sector, if a project is failing, the business will kill it quickly and decisively.  Whereas in the public sector, when a project is failing, governments have a tendency to, as Michael put it, "double down," throwing good money after bad.  His recommendation:  Rely more on private sector capital to solve healthcare problems.

The challenge with this in Canada, however, is that we have 14 little healthcare markets.  These little markets behave too differently from one another for a vendor to build a coherent national strategy, which explains why so few American healthcare vendors have much of a presence in Canada.  I'm painfully aware of this problem in my standards work: it astounds me that every Canadian province feels the need to define different message formats for exchanging healthcare data.  Yes, you read that right: Canadian provinces are each defining different, incompatible technical specifications for exchanging healthcare data.  It's insane.

Primary care EMRs need better communication with the rest of the care community.  This is a topic near and dear to my heart and I will be writing a separate blog post on this topic.

Monday, 10 June 2013

eHealth 2013 impressions: A thousand points of light

I've been attending Canada's eHealth conference for about 5 years now.  This year felt different from previous years.

In previous years, there was a strong presence of large national and provincial initiatives.  This year, it felt more like a "thousand points of light".  Major jurisdictional initiatives have shrunk out of the limelight.  We saw terrific presentations from grassroots pilots at various healthcare organizations across the country, but gone were the ambitious blueprints and grand proclamations of EHR 2015.

A big part of this has got to be the current eHealth Ontario crisis.  Ontario is the largest Healthcare market in Canada by far, but it feels like the wheels have fallen off the eHealth Ontario bus.  Greg Reed announced three priorities when he took the helm in 2010: Diabetes Registry, Medication Management, and OLIS.  The first two projects have been cancelled, and we've seen an unprecedented exodus of top leadership from that organization this spring.

Moving away from ambitious provincial initiatives back to grassroots projects is mostly a good thing.  Still, I continue to feel that every jurisdiction needs, at a minimum, a single patient registry, provider registry, and location registry to have any hope of ever achieving shared electronic health records.  Why do we still not have these in Ontario?

The main question I kept asking myself at this conference was: "Wow, what this surgeon accomplished in her hospital pilot was fantastic!  How do we roll her solution out to everyone else?"  That, I think, is the biggest gap in our current eHealth ecosystem.  Every year we should pick the three best eHealth pilots, scale those systems up, and roll them out to everyone.  We need a market for innovation in Healthcare.