<h2>eHealth Pilotitis</h2>
<p><i>Ken Stevens, eHealth in Canada — October 20, 2015</i></p>
"We cannot afford to remain a
country of perpetual pilot projects, wasting the learning we
need."<br />
<div style="text-align: right;">
- Monique Bégin, Federal Minister of Health when the Canada Health Act was enacted</div>
<br />
I once asked Greg Reed, former CEO of eHealth Ontario, why none of the most successful pilots in Ontario were ever rolled out across the province. He explained to me that most pilots are tightly bound to the system they were built on because they were implemented as a simple customization of an existing system. For example, the much-vaunted Sault Ste. Marie ePrescribing pilot consisted of handing out EMR access to local pharmacists, so the pharmacists were literally logged into the same EMR as the physicians. It was a quick way to get scripts from doctors to pharmacists, but a far cry from a solution that could be rolled out across the province.<br />
<br />
<a href="http://www.healthycanadians.gc.ca/publications/health-system-systeme-sante/report-healthcare-innovation-rapport-soins/index-eng.php">The Naylor Report</a> goes into considerable detail on Canada's abusive serial relationship with pilots. We love them, but in the end we are left abandoned, frustrated and wanting.<br />
<br />
Here's an idea: How about we institute an annual contest for the Best Pilot? The prize would be roll-out of the winning solution across the entire province. Selection criteria could include scalability, to address Mr. Reed's observation. Nothing would make a pilot team prouder than to see their innovative solution rolled out province-wide!<br />
<br />
<h2>AHA Report on Interoperability: Implications for Canada</h2>
<p><i>August 26, 2015</i></p>
The American Hospital Association released an insightful report on health data interoperability this week. You can download it <a href="http://www.aha.org/content/15/1507-iagreport.pdf">here</a>.<br />
<br />
Practically all of the hospital systems we run in Canada are American, so we would be wise to watch interoperability developments south of the border.<br />
<br />
Here are a couple of sections of the report that seemed relevant to the Canadian interoperability landscape:<br />
<br />
1. Pilot-test standards before making them official:<br />
<br />
<blockquote class="tr_bq">
To date, the federal government’s
selection of standards for inclusion in the certification
of EHRs has yielded mixed results. Future
activities must be more grounded in whether
standards are ready for use and accompanied
by sufficient implementation guidance. The federal
government should support the maturation
of standards through pilots or demonstrations
to determine the viability of the standard before
inclusion in a federal regulation that mandates
use. For example, ONC could support voluntary
certification of draft standards that are being
matured in pilots or demonstrations projects,
which would signal federal support for innovation
without imposing an immature standard on
the entire provider community. With evidence
from real-world pilots that a draft standard can
be scaled for ubiquitous use and has moved to
become a mature standard, the federal government
can then consider whether regulations are
needed to advance use of the standard.</blockquote>
Wow, just imagine if Canada had done this for HL7v3 <i>before</i> we sank $1B into that standard...<br />
<br />
2. States and provinces need to get out of the business of standards development:<br />
<blockquote class="tr_bq">
State governments should be discouraged
from establishing unique requirements that increase
variation in standards and policies because
variability diminishes the ability to share
information across state lines. State policies
also may play a role in establishing the business
case for information sharing and infrastructure
development.</blockquote>
The main benefit of developing national standards isn't the ability to move data across state/province boundaries; it is the huge savings in software procurement and system integration.<br />
<br />
3. CCDA challenges.<br />
<blockquote class="tr_bq">
The current iteration of content
standards, such as the Consolidated Clinical
Document Architecture (CCDA), do not meet
the needs of clinicians for relevant clinical data.
The CCDAs shared for meaningful use include
large amounts of patient data, making it hard for
clinicians to easily identify the information that is
important</blockquote>
Having worked with CCDAs from multiple systems in the U.S., I was surprised by this comment. The problem I saw with CCDA was not that it had too much data, but that there were huge gaps in the CCDAs produced by the current generation of hospital systems, and little consistency between the CCDAs produced by different systems.<br />
<br />
<br />
<br />
<h2>ConnectingGTA: Finally a Shared EHR in Ontario</h2>
<p><i>June 26, 2015</i></p>
<div>
Ontario’s been struggling for years to get some form of Shared Electronic Health Record (EHR) off the ground. The closest things it has had so far are South-Western Ontario’s ClinicalConnect (less a shared EHR than a hub into the various systems in the region) and the Integrated Assessment Repository (used mostly for Mental Health Assessments across the province).</div>
<div>
<br /></div>
<div>
ConnectingGTA is really the first major centralized Clinical Data Repository (CDR) that Ontario has built, thanks to some strong leadership from UHN and the ConnectingGTA Clinical Working Group. I predict that the ConnectingGTA CDR will become the dominant EHR in the province that all other shared health records initiatives will end up integrating into. This is a big deal.</div>
<div>
<br /></div>
<div>
Last week, I had my first meeting with the ConnectingGTA team. Here are some things I learned:</div>
<div>
<ul>
<li>The CDR contains all patient medical records from 17 sites dating back to May 2013: 27 Terabytes.</li>
<li>2 million encounters from 17 sites are added to the CDR every week.</li>
<li>OLIS lab results are integrated at the field level, so graphs of vitals are available.</li>
<li>Most other data are stored as unstructured documents.</li>
<li>The main gap in the data is medication and primary care.</li>
<li>There is a project underway to integrate ODB data into the system and to import ClinicalConnect data (which should bring with it some primary care data).</li>
<li>Of the 1,200 users in the initial pilot, 40% logged in, well above the 20% participation a typical pilot program sees.</li>
<li>The system is onboarding thousands of new users every week. Currently over 8000 users are signed up.</li>
<li>Main usage is in hospitals and CCACs. It has revolutionized the transition of care from CCAC to Long-term care.</li>
<li>Most are logging in using federated access (i.e. they login to their source system and then click-through to ConnectingGTA without needing to login again.) When they click through from a patient in their local system, they automatically get directed to that patient’s page in ConnectingGTA.</li>
<li>Some users are logging in directly through ConnectingGTA because the system-to-system click-through option can be slow. In this case, they need to manually search for the patient by MRN, OHIP, or demographics.</li>
<li>The system currently uses UHN’s in-house client registry (list of patients.) The ConnectingGTA team would like to use the provincial registry, but the current mainframe based one (that was a rudimentary extension of the old OHIP system) won’t meet ConnectingGTA’s requirements. The ConnectingGTA team recommends that the province upgrade to a more modern Client Registry before switching ConnectingGTA over to the provincial client registry.</li>
<li>ConnectingGTA is on a fixed release schedule of 2 releases/year.</li>
<li>Communities outside the GTA will only be granted access to ConnectingGTA if they first contribute data. This is for 2 reasons:</li>
<ul>
<li>It’s an incentive to get more data into the system.</li>
<li>If ConnectingGTA is introduced into an organization with none of its patient data in it, most patient searches will come up empty and clinicians will stop using it.</li>
</ul>
<li>Currently there are no plans for “secondary use” of the data. (E.g. analysis of outcome measurement or development of clinical guidelines.)</li>
</ul>
</div>
<div>
My main takeaway from the presentation is that the viewers would be far more useful if the data coming in were structured. If a physician needs to review a patient, are they really going to sift through huge stacks of electronic documents? That being said, perfection is the enemy of good, and having a stack of documents to read is better than having no documents at all; physicians are used to working this way. This new system just moves the stack of documents from a clipboard to a computer screen. I just wonder whether increasing the stack from 10 documents to 100 might dampen their interest in reading any of them…</div>
<div>
<br /></div>
<div>
The other thing that caught my eye was the comment about the Client Registry. Having a shared understanding of which patient you’re talking about is essential to any EHR initiative in the province, whether it's a drug system, lab system, online appointment booking or electronic referrals. Ontario has taken far too long to get a functioning client registry. So far, shared hospital Diagnostic Imaging is the only provincial infrastructure using the provincial client registry. That’s just embarrassing.</div>
<div>
<br /></div>
<div>
If I were running eHealth Ontario, I would have a poster on the wall with two numbers on it:</div>
<div>
<ul>
<li>Number of projects waiting for access to the Client Registry</li>
<li>Average length of time to onboard a project onto the Client Registry (hint: this should be measured in days, not years)</li>
</ul>
</div>
<div>
And I’d bonus the execs on how low those two numbers are.</div>
<div>
<br /></div>
<h2>The Cost of Health IT Sovereignty</h2>
<p><i>June 10, 2015</i></p>
Whenever I tell my friends I’m an eHealth consultant, I get the same two questions:<br />
<ol>
<li>Why don’t we just take someone else’s eHealth system that works and run it here?</li>
<li>If finance, supply chains, and practically every other industry can move data around easily, why can’t healthcare do it?</li>
</ol>
Canadians spend far more on healthcare IT software than they should. A big reason for this is our provinces’ insistence on going it alone on all their IT projects: building their own networks and software systems, and doing all the certification work that goes into approving that infrastructure.<br />
<br />
Why do our provinces feel compelled to do everything on their own? Is it a sense of provincial pride? “We Albertans know better than Manitobans how to run a healthcare system.” The Bureaucratic Mandate? “My province needs its own independent Standards, Architecture, Privacy and Security healthcare offices”. Asserting sovereignty can feel politically rewarding, but it introduces two unnecessary costs to expanding a successful eHealth solution from one Province into another:<br />
<ol>
<li>Re-certifying for Privacy and Security.</li>
<li>Re-tooling for interoperability.</li>
</ol>
These costs could be avoided if the provinces set aside their differences and agreed to relinquish their sovereignty for certifying interoperability, privacy and security of healthcare solutions to a federal agency.<br />
<br />
Canada Health Infoway (CHI) has been eager to take on this role and will certify that a healthcare system meets national interoperability, privacy, and security standards. As a vendor, I welcome the opportunity to certify my system once with a national agency and be done with it. However, before I sign up for this, I first need to be damn sure this certification will be honoured by most of the provinces. It’s currently not mandated by any.<br />
<br />
Without the explicit agreement of the provinces to relinquish this responsibility to CHI, CHI’s certification is meaningless. There are two things CHI could do to fix this:<br />
<ol>
<li><b>Co-operation</b>. Persuade provinces to relinquish responsibility for this certification work to Canada Health Infoway. Vendors want it (see the published ITAC Health position). Citizens want it (because they want eHealth systems sooner, at a lower cost.) All that remains is persuading provincial governments to do the right thing.</li>
<li><b>Incentive</b>. Secure billions of dollars of funding from the Federal Government and provide it to organizations only if they purchase certified systems. This is the approach the U.S. took with its Centers for Medicare & Medicaid Services Meaningful Use Incentive program.</li>
</ol>
If Canada doesn’t figure this out, our Health IT sector will eventually just get replaced by American solutions. We still have time to get our act together, but we will need to act fast. Solving this should be a top priority for Infoway.<br />
<div>
<br /></div>
<h2>Obstacles to CDA adoption in Ontario</h2>
<p><i>May 22, 2015</i></p>
I'm really starting to wonder now if CDA will ever take hold in Ontario.<br />
<div>
<br /></div>
<div>
There was a time when I admired the adoption of CDA in the U.S. as a part of their impressive "Meaningful Use" initiative. I have worked first hand with CDA documents from various EMR systems in the U.S. and have seen many successful Health Information Exchanges launch in the U.S. based on CCDA XDS repositories.</div>
<div>
<br /></div>
<div>
Despite the flurry of CDA activity south of the border, I see serious obstacles to adoption of CDA here in Ontario:</div>
<div>
<ol>
<li>The strongest case for CDA in Ontario is the abundance of CDA support and tooling in the U.S. What's important to recognize, however, is that CDA encodes country-specific data types like postal codes and units of measure that are different between the U.S. and Canada. So even if we wanted to take advantage of American CDA tools here in Canada, we would first need to modify those tools to use Canadian data types before we could use them up here. The cost of this will in many cases be prohibitive.</li>
<li>The next case for CDA in Ontario is how naturally it would support continuity of care scenarios like eReferral, eConsult, hospital discharge, admission, etc. The problem with this is that Ontario EMR vendors have already achieved OntarioMD 4.1 certification that requires supporting the import and export of patient data in the OntarioMD "Core Data Set" data format. In hindsight, it's clear that Ontario should never have invented its own proprietary EMR data exchange format. But now that we have it, the EMR vendors are going to prefer that we build on that capability rather than trying to add support for a completely new CDA format.</li>
<li>Lastly, many people I speak with about CDA are quick to point out that, despite all the HIEs and EMR support developed in the U.S., CDA has not even come close to living up to its promise there. In fact, the EMR backlash against CDA has prompted the formation of an industry association called the <a href="http://www.commonwellalliance.org/">CommonWell Health Alliance</a> that is promoting FHIR as the way forward for health data interoperability. Every technical person I've spoken with who's seen both the CDA and FHIR specs has emphatically preferred FHIR. Support for FHIR is snowballing everywhere.</li>
</ol>
<div>
<br /></div>
</div>
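The contrast programmers describe is easy to see in miniature. Below is a minimal, hand-written FHIR Patient resource (a DSTU-era sketch for illustration, not taken from any real system); it is plain JSON that any language can parse with its standard library, whereas saying the same thing in a CDA recordTarget takes dozens of lines of nested, namespaced XML:

```python
import json

# A minimal, hand-written FHIR Patient resource (DSTU-era shape,
# for illustration only). The equivalent CDA recordTarget requires
# dozens of lines of nested, namespaced XML to carry the same data.
patient_json = """
{
  "resourceType": "Patient",
  "name": [{"family": ["Smith"], "given": ["John"]}],
  "birthDate": "1970-01-01"
}
"""

# One standard-library call gets you a navigable data structure.
patient = json.loads(patient_json)
print(patient["name"][0]["given"][0])  # prints: John
```

This terseness, together with a REST API and conformance tooling, is a large part of why developers who have worked with both specs tend to prefer FHIR.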
<div>
So it now feels like we're in an awkward in-between time for EMR interoperability in Ontario. Support for CDA is waning, but the FHIR spec is still only half-baked. It will be years before FHIR is released as a normative standard.</div>
<div>
<br /></div>
<div>
I will be watching how EMR interoperability unfolds south of the border with interest. Momentum will either end up falling with CDA or FHIR, and it will be in Ontario's long-term best interest to follow whichever interoperability standard wins in the gigantic market to our south.</div>
<div>
<br /></div>
<div>
<br /></div>
<h2>Healthcare Interoperability in Canada: Perfection is the Enemy of Good</h2>
<p><i>October 23, 2014</i></p>
<i>Yesterday, as co-chair of the ITAC Interoperability and Standards Committee, I presented opening comments for an ITAC Health workshop on Interoperability. Details of the event can be found <a href="http://itac.ca/blog/event/itac-health-workshop-evidence-based-interoperability-achieving-results-at-scale/">here</a>. Below is the text of my opening comments.</i><br />
<br />
<h3>
The Problem</h3>
I’m a software developer who got into healthcare about 10 years ago. When I joined, I was surprised by a number of things I saw. Things like:<br />
<br />
<ul>
<li>Records are stored on paper and exchanged using paper fax.</li>
<li>The software behind the desk looks like it was written in the 1980s or 1990s.</li>
<li>The endless transcribing and repeated oral communication at every encounter is reminiscent of medieval monasteries: In a week a patient can repeat their entire medical history to multiple clinicians and dump out their bag of drugs for each and every one of them.</li>
<li>Data exchange, if it happens at all, is often extracted directly from the EMR database (bad practice) and looks like lines of custom pipe-delimited text from my Dad’s generation.</li>
</ul>
<br />
In short: Why hasn’t technology revolutionized healthcare like it has every other industry? It feels like Canadian Healthcare is still stuck back in the last century. Not much has changed in the last 10 years.<br />
<br />
Healthcare IT in Canada is behind the rest of the world by most measures. Even the U.S., which is committed to doing everything the hard way, is years ahead of Canada when it comes to healthcare IT. How did we get here? How can we fix it?<br />
<br />
<h3>
How did we get here?</h3>
<br />
You can’t blame Canada for lack of trying. We have invested billions of dollars into major eHealth initiatives right across the country. There has been a decade-long project to introduce new healthcare interoperability standards across Canada, organized under a Pan-Canadian EHR Blueprint to get everyone connected into centralized EHR repositories. We were promised that everyone would have a shared electronic health record accessible by all providers by 2015. We’re not going to make it. What happened?<br />
<br />
If I were to pick one overarching theme it would be this: Perfection is the enemy of Good.<br />
I’ve seen numerous projects get derailed by intricate Privacy and Security tentacles that grow out of monstrous consent models. Time and time again we have held up perfectly secure and functional eHealth initiatives because we’re pursuing an absolutely comprehensive and airtight privacy and security model around it. These delays cost lives. It’s too easy to indefinitely postpone a project over privacy and security hand waving.<br />
<br />
Another issue I’ve seen hold Canada back is our fantasy that each province is a unique flower, requiring completely different infrastructure, software, and its own independent standards committees and EHR programs. Get OVER yourselves. We will all save a heck of a lot of money when the provinces just get together and present Canada as a single market to the international Healthcare vendor community, rather than as a balkanized collection of misfits.<br />
<br />
From a software developer’s perspective, I can tell you one issue that contributed to delaying Canada’s eHealth agenda: the quality of our interoperability standards. I’ve heard people say, “I don’t care what message standard you use to move your data around—the technology is irrelevant—the interoperability standard isn’t the problem.” To this, I say “hogwash!” I’ve seen good APIs and I’ve seen bad APIs. The “P” in “API” stands for “Programming.” If you want to know whether a proposed API is any good, you have to ask an experienced programmer. If you take a look at the HL7v3 standard, it looks to me like they skipped this step. If it costs 10 times as much effort to implement one API over another, that’s a sign there’s probably a problem with your API.<br />
<br />
I think when the whole Canadian HL7v3 thing started out, there were a number of vendors involved in the process. But one by one they dropped out, and the torch was left to be carried by committees of well-intentioned, but ultimately misguided information modellers.<br />
<br />
We in the Canadian vendor community need to take some responsibility for letting this happen.<br />
Smaller vendors didn’t get involved because they couldn’t afford to—many were just struggling to survive in the consolidating landscape. The tragedy here is they will be the ones most affected by lack of interoperability standards.<br />
<br />
Larger vendors arguably stand to benefit the most from a Wild West, devoid of easy-to-use interoperability standards where their Walled Fortress can be presented as the only fully interconnected show in town!<br />
<br />
But simply falling into the arms of a handful of large vendors will have a cost for all of us in the long run. That cost is innovation. It’s in our best interest to start seriously thinking about supporting a manageable collection of simple, proven interoperability standards.<br />
<br />
<h3>
How can we fix it?</h3>
Vendors are the custodians of the most experienced technical minds in Canada. We need to bring these minds together and take on this problem. We can’t afford to continue complaining, wiping our hands of responsibility and expecting government to figure it out for us. We need serious software engineers at the table, rolling up our sleeves, and getting this job done.<br />
<br />
Now it’s easy to say that. But what can we practically do to move this forward? I recommend 3 things.<br />
<ol>
<li>We need something in Canada akin to the IHE working groups in the U.S.: a focal point for vendor input on the direction interoperability standards will take in Canada. This needs to happen at the national level.</li>
<li>We need to leverage infrastructure already deployed and we need to leverage standards that have already been successfully implemented in other parts of the world. This will mean moving forward with a plurality of standards, such as IHE XDS, CDA, HL7v2 and HL7v3, and potentially even FHIR. </li>
<li>We need to strive for simple, clear and unambiguous interoperability standards. It’s not enough to say you broadly support a standard like HL7v2. You need to have very specific conformance processes to go along with it that ensure my HL7v2 messages have exactly the same Z segments and use exactly the same vocabulary as your HL7v2 messages.</li>
</ol>
A bit more on that last point. Along with each standard, you need to have, at a minimum, content specifications and vocabulary bindings. And by this I don’t mean a 400-page Word document that system integrators are expected to read through and implement. I mean MACHINE-READABLE software artifacts that completely specify how the data will be represented in bytes over the wire and how field values will be unambiguously interpreted. Representing your specs in a machine-readable format accelerates interoperability tooling by a considerable factor. It’s the difference between building robots, and building robots that are able to build other robots.<br />
<br />
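As a toy illustration of the difference, here is a vocabulary binding expressed as data rather than prose. The field name and value set below are an illustrative subset of HL7 table 0001, not a real conformance artifact; the point is that a binding in this form can drive a validator directly, while a paragraph in a Word document cannot:

```python
# Toy machine-readable vocabulary binding (illustrative only):
# HL7v2 field PID-8 (administrative sex) bound to a subset of table 0001.
GENDER_BINDING = {"field": "PID-8", "allowed": {"M", "F", "O", "U"}}

def validate(field_value, binding):
    """Return True if the field value is drawn from the bound vocabulary."""
    return field_value in binding["allowed"]

print(validate("M", GENDER_BINDING))  # prints: True
print(validate("X", GENDER_BINDING))  # prints: False
```

Scale that idea up to full conformance profiles and terminology services and you get tooling that can generate validators, stubs and test harnesses automatically instead of by hand.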
For different standards this means different things.
<br />
<table border="1"><tr><th>Standard</th><th>Machine-readable artifacts</th><th>Recommendations</th></tr>
<tr><td>HL7v2</td><td>Conformance profiles, vocabulary dictionaries</td><td>UHN has done some great work here with its machine-readable HAPI conformance profiles.</td></tr>
<tr><td>HL7v3</td><td>MIFs with vocabulary constraints</td><td>That said, I don’t see much of a future for HL7v3 in Canada outside of pharmacy, and even there it’s not clear it will win in the long run.</td></tr>
<tr><td>CDA</td><td>Templates with terminology constraints</td><td>I think the jury’s still out for level 3 CDA. The Lantana Group has made a good start at organizing CDA templates, but this space still has a long way to go; I think it suffers from some of the same challenges that HL7v3 faces.</td></tr>
<tr><td>IHE</td><td>Integration Profiles</td><td>Diagnostic Imaging is the poster child for how an initiative like this can be successful. DI is way ahead of other domains in Canada and we can credit the IHE for much of that progress; we need to consider building on the success of this approach in other domains.</td></tr>
<tr><td>FHIR</td><td>Resource schemas</td><td>Given how new the FHIR standard is, it’s impressive how many online conformance test sandboxes are already publicly available; that’s a testimony to how committed FHIR is to machine readability, openness and simplicity. Read Intelliware's assessment of FHIR <a href="http://www.intelliware.com/hl7-games-catching-fhir/">here</a>.</td></tr></table> <br />
<br />
In closing, I’m asking the vendors: give us your best engineers, and let’s work together to get serious about establishing some simple, functioning interoperable standards to get our healthcare data moving!<br />
<br />Ken Stevenshttp://www.blogger.com/profile/11692130804936589526noreply@blogger.com0tag:blogger.com,1999:blog-2735278416344368326.post-26198565659335039022014-06-27T14:52:00.002-07:002014-07-14T09:29:05.120-07:00When off-shoring software to India, include code quality metrics as a part of the contractI understand the appeal of off-shoring software development to India: low rates, scalable team size, and a process that has really matured over the years. India is a serious and credible competitor for software development services.<br />
<br />
I have personally been asked to maintain software written by large Indian off-shore companies. While the software usually meets the functional requirements and passes manual QA testing, in my experience the quality of the code written overseas is often poor. Specifically, the resulting code is not extensible and is expensive to maintain. I am not exaggerating when I say I have seen 2000-line methods within 6000-line classes that were copy/pasted multiple times.<br />
<br />
Setting aside for a moment the implicit conflict-of-interest of writing code that is expensive to maintain, in fairness to the Indian offshore developers, when customers complain that it's expensive to change features and add new ones to the delivered system, the developers innocently respond, "well you never told us you were going to need those changes..."<br />
<br />
There is a simple answer to this. Ask for it up front. And I don't mean ask for the system to be extensible and maintainable. That's vague. I mean require the developer to run a Continuous Integration server (such as <a href="http://en.wikipedia.org/wiki/Jenkins_(software)">Jenkins</a>) with a code quality plugin such as <a href="http://en.wikipedia.org/wiki/SonarQube">SonarQube</a>, and measure the specific code quality metrics that matter.<br />
<br />
In my experience, measuring the following 4 metrics goes a long way towards ensuring the code you get back is extensible and maintainable.<br />
<br />
<ol>
<li>Package Tangle Index = 0 cycles. This ensures the software is properly layered, essential for extensibility.</li>
<li>Code Coverage between 60% and 80%. This is essential for low maintenance costs. This metric is about automated testing. The automated unit tests quickly discover side-effects of future feature changes, allowing you to make changes to how the system behaves and get those changes into production with a minimum of manual regression testing.</li>
<li>Duplication < 2%. Any competent developer will maintain low code duplication as a basic pride of craft. But I have been astonished at the amount of copy/paste code I've seen come back from India. If you don't measure it, unscrupulous coders will take this shortcut and produce a system whose maintenance costs quickly spiral out of control.</li>
<li>Complexity: < 2.0 / method and < 6.0 / class. This metric is a huge factor in extensibility. Giant classes with giant methods make a system brittle and resistant to change. Imagine a building made out of a few giant Lego blocks versus the same building made out of 10 times as many smaller Lego blocks. The latter building will be far more flexible to reshape as business needs change.</li>
</ol>
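To make those four thresholds contractual rather than aspirational, they can be enforced as a hard gate in the Continuous Integration build. The sketch below assumes the measures have already been fetched from SonarQube (for example via its <code>api/measures/component</code> web service); the metric key names are illustrative placeholders, since exact names vary across SonarQube versions:

```python
# Sketch of a CI quality gate for the contractual metrics.
# Assumes `measures` is a dict of metric name -> value already pulled
# from SonarQube; the metric key names are illustrative placeholders.

THRESHOLDS = {
    "package_tangle_index": lambda v: v == 0.0,     # 0 package cycles
    "coverage": lambda v: 60.0 <= v <= 80.0,        # unit-test coverage %
    "duplicated_lines_density": lambda v: v < 2.0,  # duplication %
    "function_complexity": lambda v: v < 2.0,       # avg complexity / method
    "class_complexity": lambda v: v < 6.0,          # avg complexity / class
}

def evaluate_gate(measures):
    """Return a list of failure messages; an empty list means the gate passes."""
    failures = []
    for metric, within_bounds in THRESHOLDS.items():
        value = measures.get(metric)
        if value is None:
            failures.append("%s: not measured" % metric)
        elif not within_bounds(float(value)):
            failures.append("%s: %s violates the contract threshold" % (metric, value))
    return failures
```

Failing the build (or withholding a milestone payment) whenever this list is non-empty turns the contract clause into something mechanically checkable rather than a matter of opinion.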
<div>
A word of caution about using SonarQube. Some developers, particularly those with a perfectionist bent, can get lost in a rabbit hole of trying to improve their code's "score" on many of the other metrics offered by the tool. Violations, Rules Compliance, Technical Debt Score and LCOM4 are particularly tempting to undisciplined developers. But in my experience, these metrics provide limited return on investment. If you do decide to measure your code quality, I urge you to ignore them. While it can be a hill of fun spending weeks making your code "squeaky clean," the business value of these other metrics pales in comparison to what you get out of the 4 metrics I recommended.</div>
<div>
<br /></div>
<div>
So the next time you outsource a development project to India, protect yourself from getting back junk by requiring code quality metrics right in the contract. It might add an extra 10% to the initial cost of the system, but that cost will be more than offset by the resulting extensibility and maintainability of the code you get back.</div>
<div>
<br /></div>
<h2>What Ontario can learn from Northern Europe</h2>
<p><i>June 28, 2013</i></p>
<i>Earlier this month, I participated in an event hosted by the Canadian Foundation for Healthcare Improvement. The goal of the event was to bring together thought leaders from seven countries to discuss and debate Canada's Healthcare Strategy. Paul Martin, Deb Matthews, Don Drummond and Michael Guerriere were all there and it was an excellent discussion. Details of the event can be found <a href="http://www.cfhi-fcass.ca/NewsAndEvents/Events/Event/13-03-19/2a874131-185e-4809-bab6-77b22a805bc0.aspx">here</a>. Below are some of the ideas that caught my attention.</i><br />
<br />
<br />
<h3>
Startling Facts from Ontario</h3>
15% of prescriptions are not filled in Ontario because the patient can't afford the medication.<br />
<div>
<br />
20% of hospital beds in the province are occupied by someone who shouldn't be in a hospital. (This problem is often called the "ALC" problem--Alternate Level of Care.) Hospital beds are the most expensive beds in our health care system.<br />
<br />
<h3>
How Sweden fixed ALC</h3>
<div>
Sweden had the same chronic ALC problem as Ontario until a couple of years ago, when it introduced an innovative solution. A problem it couldn't solve for decades disappeared within 3 months. What Sweden did was split jurisdictional responsibility for hospital care from long-term care: the province kept responsibility for acute care, but moved responsibility for long-term care (along with the funding) to the municipality. And here's the genius: the province charged the municipality a high daily hospital bed fee for every day a person was left waiting to be transferred from a hospital bed to a long-term care facility. Since the cost of a long-term care bed is so much lower than the cost of a hospital bed, the problem resolved itself very quickly. This is easier for Sweden to do because its municipalities have income tax revenue, but I thought the idea of splitting responsibility to force efficiency was brilliant. (As an aside, under the Swedish tax model, 15% of income tax goes to the federal government, 10% to the province, and 20% to municipalities. No wonder they have such great transit over there!)<br />
<br /></div>
<div>
<h3>
How the Germans do it</h3>
</div>
<div>
Here in Ontario, OHIP is managed like a Big Government Program, with a heavy bureaucracy managing a lumbering public claims system funded by taxes. In Germany, health insurance is managed more like a tightly efficient, regulated crown corporation. Patients pay their health insurance premiums directly to the insurer. The government subsidizes these premiums for low wage earners, while higher wage earners pay higher premiums on a salary-based sliding scale. Because it's managed as a separate financial institution (and because it's German), there is a tireless focus on efficiency and effectiveness, managed by teams of heavyweight quants. People are categorized into 38 different groups, with compensation to providers based on the representation of these groups in their roster. (Compare this to Ontario's roster compensation, which has 2 categories: "normal" and "old.") Treatment outcomes are measured, and a national drug formulary establishes best practices to manage costs. The Germans approach health insurance like a multi-billion-dollar industry and run it like a bank.<br />
<br />
<h3>
What accounts for rising Healthcare costs?</h3>
</div>
</div>
<div>
What surprised me about rising healthcare costs was how little of the increase is due to the ageing population we hear so much about: only about 10% of the increase can be accounted for by an ageing population. The lion's share comes from growth in the volume of activity: more medications and more tests. The consensus at the event was that the solution is to stop compensating providers for services and start compensating providers based on who's in their roster, and to reward outcomes; in other words, move to a <a href="http://en.wikipedia.org/wiki/Capitation_(healthcare)">capitation</a> model.</div>
<div>
<br /></div>
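The difference between the two payment models can be sketched in a few lines. All dollar amounts, group names, and risk weights below are invented purely for illustration; they are not drawn from the German or Ontario schemes.

```python
# Illustrative comparison of fee-for-service vs. risk-adjusted
# capitation. All numbers here are invented for illustration only.

def fee_for_service(visits, fee_per_visit):
    # Revenue grows with activity: more visits, tests, and
    # prescriptions mean more revenue.
    return visits * fee_per_visit

def capitation(roster, rate_per_patient, risk_weights):
    # Revenue depends on who is rostered, not on how many services
    # are delivered; sicker groups carry a higher weight (in the
    # spirit of Germany's risk groups).
    return sum(rate_per_patient * risk_weights[group] for group in roster)

# A hypothetical roster: mostly healthy patients, a few complex ones.
roster = ["healthy"] * 90 + ["complex"] * 10
weights = {"healthy": 1.0, "complex": 3.0}

ffs_revenue = fee_for_service(visits=500, fee_per_visit=35.0)
cap_revenue = capitation(roster, rate_per_patient=150.0, risk_weights=weights)
```

Under fee-for-service, the only way to grow revenue is to deliver more services; under capitation, revenue is fixed by the roster, so the provider's incentive shifts to keeping rostered patients healthy.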
<div>
<h3>
What can Business Do?</h3>
</div>
<div>
Michael Guerriere (Telus Health) made a number of insightful observations about the role of business in improving Canada's healthcare landscape.</div>
<div>
<br /></div>
<div>
Different sectors respond to failure differently. In the private sector, if a project is failing, the business will kill it quickly and decisively. In the public sector, by contrast, when a project is failing, governments have a tendency to, as Michael put it, "double down," throwing good money after bad. His recommendation: rely more on private sector capital to solve healthcare problems.</div>
<div>
<br /></div>
<div>
The challenge with this in Canada, however, is that we have 14 little healthcare markets. These little markets behave too differently from one another for a vendor to build a coherent national strategy, which explains why so few American healthcare vendors have much of a presence in Canada. I'm painfully aware of this problem in my standards work: it astounds me that every Canadian province feels the need to define its own message formats for exchanging healthcare data. Yes, you read that right: Canadian provinces are each defining different, incompatible technical specifications for exchanging healthcare data. It's insane.</div>
<div>
<br /></div>
<div>
Primary care EMRs need better communication with the rest of the care community. This is a topic near and dear to my heart and I will be writing a separate blog post on this topic.</div>
<div>
<br /></div>
Ken Stevenshttp://www.blogger.com/profile/11692130804936589526noreply@blogger.com0tag:blogger.com,1999:blog-2735278416344368326.post-39959299800548956452013-06-10T11:31:00.002-07:002013-06-10T11:34:17.037-07:00eHealth 2013 impressions: A thousand points of lightI've been attending Canada's eHealth conference for about 5 years now. This year felt different from previous years.<br />
<br />
In previous years, there was a strong presence of large national and provincial initiatives. This year, it felt more like a "thousand points of light." Major jurisdictional initiatives have receded from the limelight. We saw terrific presentations from grassroots pilots at various healthcare organizations across the country, but gone were the ambitious blueprints and grand proclamations of EHR 2015.<br />
<br />
A big part of this has got to be the current eHealth Ontario crisis. Ontario is the largest Healthcare market in Canada by far, but it feels like the wheels have fallen off the eHealth Ontario bus. Greg Reed announced three priorities when he took the helm in 2010: Diabetes Registry, Medication Management, and OLIS. The first two projects have been cancelled, and we've seen an unprecedented exodus of top leadership from that organization this spring.<br />
<br />
Moving away from ambitious provincial initiatives back to grassroots projects is mostly a good thing, though I continue to feel that every jurisdiction needs, at a minimum, a single patient registry, provider registry, and location registry to have any hope of ever achieving shared electronic health records. Why do we still not have these in Ontario?<br />
<br />
The main question I kept asking myself at this conference was: "Wow, what this surgeon accomplished in her hospital pilot was fantastic! How do we roll her solution out to everyone else?" That, I think, is the biggest gap in our current eHealth ecosystem. Every year we should pick the top three eHealth pilots, scale those systems up, and roll them out to everyone. We need a market for innovation in healthcare.<br />
<br />Ken Stevenshttp://www.blogger.com/profile/11692130804936589526noreply@blogger.com0tag:blogger.com,1999:blog-2735278416344368326.post-28018966291838754472013-03-26T11:22:00.004-07:002013-03-26T14:37:58.109-07:00Software Procurement in the Public Sector<span style="font-family: Times, Times New Roman, serif; font-weight: normal;">Last week, I was a member of the panel at a workshop hosted by ITAC Health. The goal of the workshop was to bring public sector buyers and sellers together to fix a broken procurement system. I was a member of the vendor panel, representing mid-sized Canadian software companies. The buyer side was well represented, including an Auditor General of Ontario and the Assistant Deputy Minister for Ontario Shared Services. Details of the event can be found <a href="http://itac.ca/event_details/2765">here</a>. Below is the talk I gave at the event and some suggestions vendors made for improving the public sector procurement process.</span><br />
<span style="font-family: Times, Times New Roman, serif; font-weight: normal;"><br /></span>
<br />
<h3>
Building a Software System is not like Building a Subway System</h3>
As a software engineer, the main barrier I see to successful software delivery in the public sector is a misunderstanding about the nature of software. Software projects in the public sector tend to be managed like major construction projects: like building a new subway system. How do you manage risk when building a new subway system? You do years of up-front planning, specifying all the details of the entire project long before the first shovel breaks ground. Why do you do this? Because re-routing a subway tunnel is very, very expensive. You have to get it right the first time.<br />
<br />
The mistake we make in the public sector is that we treat software systems the same way. Software is different. Unlike a major construction project, it is in fact relatively inexpensive to alter a software system after it has been built. Building a software system is more like building a successful political campaign platform. Successful political campaigns commit very little up front: they throw out teasers and then poll intensively to suss out the public mood--which parts of the new platform does the electorate hate, which parts get the public excited--then, based on this feedback, the direction of the campaign is altered. It is not a subway tunnel, planned years in advance. It is built incrementally, guided by constant feedback from the electorate.<br />
<br />
To give a specific example, consider Ontario's Medication Management RFP. Ontario started writing this RFP 6 years ago. The RFP was finally issued 3 years ago. Today it has still not been awarded. We've been planning this project for 6 years now, and all we have to show for it is a stack of paper. If a middle-aged person arrives at an ER today unconscious, we have *no* way of knowing what medications they are on. Why are we at this point? It's because the scope of the Ontario Medication Management system is huge. It's being procured like a new subway system: with massive scope and comprehensive specifications. The scope includes real-time transactional integration with pharmacy systems, real-time e-prescribing integration with physician systems, real-time patient lookup against patient registries, etc. These are ambitious plans.<br />
<br />
Imagine if, instead, 6 years ago we had decided to build the software incrementally, starting with a nightly batch upload of all prescriptions from all pharmacies to a central database, and then giving EMRs access to that database. Pharmacies already batch upload their prescriptions to other partners, so adding a provincial drug database to the feed would be a simple project for them. Had Ontario started with this scope, I believe we would have had a comprehensive province-wide prescription database within a year. Sure, the prescription data is a day old, but day-old data is a heck of a lot better than *no* data. More significant than that, having a real system out there, in the hands of users, allows you to start polling your users: which parts of the new platform do the users hate? Which parts get the users excited? Based on this feedback, you can steer the evolution of the system, often in ways you could never have anticipated at the start.<br />
<br />
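The incremental approach described above can be sketched in a few dozen lines. This is a toy illustration of the nightly batch pattern, not a real pharmacy integration: the file format, field names, and database schema are all hypothetical, and a production system would need patient matching, auditing, and privacy controls.

```python
# Minimal sketch of a nightly batch upload: each pharmacy's day-old
# prescription extract is loaded into a central database that EMRs
# can query. All field names and the schema are hypothetical.
import csv
import io
import sqlite3

def load_nightly_batch(conn, csv_text):
    """Upsert one pharmacy's daily prescription extract."""
    conn.execute("""CREATE TABLE IF NOT EXISTS prescriptions (
        rx_id      TEXT PRIMARY KEY,
        patient_id TEXT,
        drug       TEXT,
        filled_on  TEXT)""")
    for row in csv.DictReader(io.StringIO(csv_text)):
        # INSERT OR REPLACE makes re-running a failed batch safe.
        conn.execute(
            "INSERT OR REPLACE INTO prescriptions VALUES (?, ?, ?, ?)",
            (row["rx_id"], row["patient_id"], row["drug"], row["filled_on"]))
    conn.commit()

def medications_for(conn, patient_id):
    """What an EMR lookup against the central database might return."""
    cur = conn.execute(
        "SELECT drug FROM prescriptions WHERE patient_id = ?", (patient_id,))
    return sorted(drug for (drug,) in cur.fetchall())
```

The data an EMR sees this way is a day old, but the whole pipeline is simple enough to stand up quickly, and the real-time integrations can be layered on later, guided by user feedback.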
6 years is an eternity in the world of technology. 6 years ago, there were no iPhones. 3 years ago, when the Medication Management RFP hit the streets, there were no iPads. Subway trains don't change that much. But software changes a lot. When you're planning a software project, particularly something as important as a public health system, if you plan too big for too long, your system will be obsolete by the time it's launched. Start small, get feedback, and incrementally improve.<br />
<br />
<h3>
Fix the Q&A Process</h3>
I also wanted to touch on an aspect of that process that I think is particularly broken: the RFP Q&A process. The intent of the RFP Q&A rules is to level the playing field--to ensure all vendors have a fair chance at winning. Their effect is exactly the opposite: vendors on the inside already understand what the customer needs, and those on the outside have no way of finding out.<br />
<br />
The rules require that any question a vendor has about an RFP must be submitted on paper on the public record, with the answers to the questions being shared with all bidders. While this process looks good on paper, this is what actually happens:<br />
<br />
<ul>
<li>Meaningful clarification questions are rarely asked for fear of giving away your requirements analysis advantage to your competitors.</li>
<li>On the rare occasion when meaningful clarification questions are asked, 9 times out of 10 the answers don't help.</li>
</ul>
<div>
Software requirements cannot be understood through a public Q&A process. Delivering software is fundamentally different from delivering milk. What's required is a conversation, with only the vendor and the customer in the room, where the vendor can ask insightful questions and the customer can answer honestly. It is very, very hard to successfully bid on a project when you don't understand what the customer actually needs, and a public Q&A process has proven that it can't get us there.</div>
<div>
<br /></div>
<div>
To make this work, there would need to be some sort of short-list qualifying process so there is a manageable number of meetings. But oftentimes there are only 2 or 3 bidders. Even a confidential Q&A would be fairer than what we currently have.<br />
<br /></div>
<h3>
Contract Templates</h3>
We heard a lot at the workshop about how attaching a unique contract to each RFP adds delay to the bidding process, and how many clauses in these contracts are showstoppers for many vendors, especially clauses regarding unlimited liability, unlimited indemnification, and IP ownership.<br />
<br />
Why not publish, say, 5 contract templates to be used for all public sector procurements, and have each RFP simply refer to a contract template by name? This would allow vendors to pre-qualify which of those templates they can bid on and which they can't. Such pre-qualification could even be announced by vendors if they chose to, so that when selecting a contract template, governments would know in advance which vendors they were excluding.<br />
<br />
<h3>
RFP Star Rating</h3>
Those of us who regularly read RFPs know that there is huge variation in their quality. Vendors prefer RFPs that describe the business problem and leave the implementation details to the vendor. Vendors prefer RFPs with clear scope and deliverables.<br />
<br />
One of the things Marion Macdonald (ADM, Ontario Shared Services) mentioned at the meeting is that Ontario plans to change the system it uses to manage the RFP process. I would suggest to Marion that she consider allowing vendors to anonymously assign a star rating to posted RFPs. I imagine a future where budding RFP authors download all the 5-star RFPs as guides for how to write a really good RFP.<br />
<br />
<h3>
References vs. Innovation</h3>
<div>
My co-panelist Michael Martineau raised an important point: we make a lot of noise about fostering innovation, but when it comes time to issue an RFP, the RFP inevitably requires "three sites in Canada where the software has been installed for over a year" or some such reference requirement. You're not going to introduce innovative solutions into the marketplace, particularly from other countries, so long as RFPs contain language like that.</div>
<div>
<br />
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
Ken Stevenshttp://www.blogger.com/profile/11692130804936589526noreply@blogger.com4tag:blogger.com,1999:blog-2735278416344368326.post-38830965364071406322011-03-24T11:26:00.000-07:002011-03-24T15:19:52.893-07:00ePrescribing - Is "Safety Last" an option?I had the privilege of attending an ePrescribing workshop recently. The provinces were well represented, as were pharmacies and Drug Information Systems vendors.<div><br /></div><div>I particularly appreciated feedback from the physicians at the meeting who had actually used ePrescribing systems in various pilots across the country. One of these physicians gave ePrescribing in its current form a big thumbs down. The reason? He values spending time with his patients. It takes him 12 to 14 seconds to write a paper prescription. Completing an ePrescription, on the other hand, takes him on the order of 4 minutes. He calculated that this resulted in him seeing 3 to 4 fewer patients in a day.</div><div><br /></div><div>I've heard many an eHealth idealist bemoan the dinosaur physician who refuses to get with the times, or who jealously guards his or her patient's data. But this is not the case here. This physician just wants to see his patients!</div><div><br /></div><div>The bulk of this 4 minutes is spent responding to "Alerts" raised by the Drug Information System. The possibility of raising such alerts is the source of much excitement among eHealth proponents: drug interactions, drug allergies, adverse reactions, etc. could all be detected "at source". Who could say no to that?</div><div><br /></div><div>The reality, unfortunately, is that the signal-to-noise ratio on those alerts is so bad that by the end of their first week, most physicians were dismissing the endless stream of useless alerts without even looking at them. All three physicians at the meeting attested to this.</div><div><br /></div><div>It made me wonder: should we consider focusing on adoption first, and then add alerts later?
There are many benefits to ePrescribing beyond alerts: prescription accuracy, minimizing call-backs, etc. Alerts will be caught by the pharmacy system's DUR (Drug Utilization Review) anyway. Perhaps "Safety Last" is actually the best way forward for ePrescribing in Canada!</div><div><br /></div><div><br /></div><div><br /></div><div><br /></div>Ken Stevenshttp://www.blogger.com/profile/11692130804936589526noreply@blogger.com2