The Usability Problem

Part 1: What the EMR Market Can Learn From Twitter

As Meaningful Use marches on and providers deploy their newly-minted EMRs (electronic medical records), the ONC has been receiving complaints that the user interfaces of many EMRs are frustrating and poorly designed. Dr. Farzad Mostashari, the head of the ONC, put it well last October in an interview on HIStalk:

The only problem is that providers consistently say, “I didn’t know what I bought until three months after I bought it. I didn’t know what the usability of the system was really going to be, because all I saw was these demos I had from people who knew their way around the system and knew spots to avoid.”

To help prevent these horror stories from happening to other providers, the ONC and NIST are strongly encouraging EMR vendors to include formal usability testing results when submitting their products for certification. The thinking is that if usability scores are publicly available, providers will be able to comparatively shop around for the best experiences, thus encouraging all vendors to improve their products.

While I applaud the ONC and Dr. Mostashari for their efforts, I believe that this initiative misunderstands the systemic nature of the usability problem. Contrary to popular belief, the usability problem is not directly the fault of the EMR vendors. There are fundamental technological and market forces that disincentivize EMR vendors from creating truly great user experiences. These forces must be disrupted before the EMR industry will begin to produce satisfying end-user experiences. A change in certification criteria is woefully inadequate to the task.

Before going further, I should clarify my terms. What goes by the name “usability” in most EMR discussions is actually a blend of related concepts: safety, beauty, ease of use, etc. What these concepts all share is the notion that EMRs should be safe, fast, easy to use, and pleasant. Rather than get lost in the weeds of specialized terminology, I will follow the common practice of referring to all these domains under the umbrella term “usability.” I humbly beg usability experts to forgive me this injustice.

Because many “usability” complaints exceed the scope of the formal definition of usability, it is clear on this basis alone that the ONC’s initiative is not enough. Formal usability testing is a tool for measuring the safety and efficiency of a given task, but providers are frustrated with the entire EMR experience, with the way that all tasks are bound together into clumsy, dissatisfying products. Formal usability testing can refine an innovation, but it cannot inspire one. Innovation must be inspired by a profound shift in priorities.

The reason that EMR vendors have yet to produce innovative user experiences has nothing to do with a lack of money or talent: they have healthy revenues flowing in from one of the largest and fastest-growing markets; they have many bright employees; there’s no lack of technical or design expertise. If a vendor wanted to make an EMR with a show-stopping user interface, there would be few internal obstacles in its path. So why don’t they want to do so?

The answer is that the current EMR market doesn’t really prioritize usability. EMR purchasers are not end users; their purchasing decisions are driven by other priorities. They are responsible for ensuring that meaningful use criteria are met in a timely and cost-effective manner. Every purchase must be a part of a long-term strategy of going paperless even as future changes in reimbursement will potentially threaten their bottom line. The basis of competition for these customers is essentially an arms race of features and functions. Usability is one concern among many. The vendors that offer the most features and functions are valued the highest.

Before Meaningful Use, only a small fraction of institutions were using EMRs. Most of the ones that had EMRs were using them in a limited fashion: labs, eMARs, etc. Less than one percent were fully electronic. To go from this state to a fully-electronic system presents an enormous task for managers. It also presents an enormous task for vendors.

There are few technological standards for EMR integration. Even HL7 is just a messaging protocol, with dozens of specs, all of which are haphazardly implemented in practice. HL7 reserves so-called Z segments (custom segments whose names begin with Z) as wildcards for any data not covered by an official segment. They were intended to be used only rarely. An integration engineer I know estimates that 90% of HL7 messages now rely on Z segments.
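To make the Z-segment problem concrete, here is a minimal sketch in Python, using a made-up ADT message, of how an HL7 v2 message splits into pipe-delimited segments and how a custom Z segment (here, a hypothetical ZPV) rides along beside the standard ones:

```python
# A hypothetical HL7 v2 message. Segments are separated by carriage
# returns; fields within a segment are separated by pipes.
message = "\r".join([
    "MSH|^~\\&|SENDAPP|SENDFAC|RECVAPP|RECVFAC|201206050800||ADT^A01|123|P|2.3",
    "PID|1||555-44-3333||DOE^JOHN",
    "ZPV|homegrown|vendor-specific|data",  # a custom Z segment
])

def segment_ids(msg):
    """Return the three-letter ID of each segment in an HL7 v2 message."""
    return [line.split("|")[0] for line in msg.split("\r") if line]

def z_segments(msg):
    """Z segments (names starting with Z) carry locally defined data."""
    return [sid for sid in segment_ids(msg) if sid.startswith("Z")]

print(segment_ids(message))  # ['MSH', 'PID', 'ZPV']
print(z_segments(message))   # ['ZPV']
```

A receiving system that doesn’t know what this particular sender stuffed into ZPV can do nothing useful with it, which is exactly why widespread Z-segment use undermines the standard.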

When a major vendor released a disappointing iPhone app recently, I had the pleasure of speaking with the lead developer on the project about my disappointment. He acknowledged its flaws, but explained the difficulty they are having in integrating iOS applications with their core products:

This app is a small step as we continue to decouple complex logic embedded in the Windows platform and expose as services.

Translation: “We’re having a difficult time extending an API, even to ourselves. Our software was not architected for agnostic interoperability with other platforms.”

There are perfectly valid reasons for this difficulty. Since the industry has never produced true interoperability standards, EMR vendors have been left to themselves to create and maintain their own internal standards. Because of the scale and complexity of an EMR, these internal standards are full of proprietary fine-tuning and optimizations. Front-end applications were not designed to be hot-swappable.

Thus, in a market in which customers value feature lists and functionality, the vendors who offer the most features must also be the vendors who are the most technologically integrated. This is why Epic has been taking the lion’s share of the inpatient market. Its all-or-nothing approach with customers is backed by the technological capability to deliver on it. Epic can guarantee feature lists because it controls its technology stack, soup to nuts. Other vendors succeed to the extent that they, too, are integrated and feature-complete.

What does all this have to do with usability? Since innovation is the fruit of competition, innovation in usability will not occur until usability becomes the basis of competition. This cannot happen while front-end applications and back end systems are still tightly integrated. To understand why, let’s consider the example of Twitter and third-party Twitter apps.

When Twitter first launched in 2006, it was mainly a text-messaging service. Tweets were limited to 140 characters (well, 140 characters plus a 20-character username) because SMS messages are limited to 160 characters. You would send and receive tweets straight from the text-messaging screen on your mobile phone. Twitter also hosted a website which you could use to browse your timeline and send messages.

Behind the scenes, both the website and the SMS service ran on the same underlying back end system. Twitter engineers worked hard to maintain a clear boundary between the front end SMS/web experiences and the back end, linked together by a reliable and robust API.

What Twitter did next with this API was the key to its success and cultural impact: they extended this API to third-party developers. Developers loved that they could build and sell an app to suit their customers’ tastes and needs. As long as they followed the API, the sky was the limit to their creativity. The apps that were produced from this effort — Tweetie, Twitterrific, TweetDeck, and many others — have come to define the Twitter experience for their users. Customers could try new apps with ease. All that was needed was a username and a password. They could switch from one app to another without worrying about losing their data or being unable to send tweets. The Twitter API guaranteed interoperability.

This competition produced almost every innovation that we now consider essential components of the Twitter experience: hashtags, retweets, photos, shortened links, etc. Even the word “tweet” was coined not by Twitter employees, but by developers at a third-party company, The Iconfactory. Every Twitter client had its own unique user interface, and apps competed for the best experience. Some, like TweetDeck, competed with pro-level features, while others, like Twitterrific, competed with simplicity and ease of use. Client apps that were poorly designed or hard to use did not succeed in gaining users.

Compare this to Facebook: until relatively recently, if you wanted to use Facebook, you had to visit their official website. The Facebook experience was designed for the lowest common denominator, and it showed. Every major attempt to redesign the Facebook site resulted in large numbers of upset users. There is simply no way to satisfy all people with a single, monolithic user interface. Variety is a necessity if the goal is to create satisfying user experiences.

What if EMRs worked more like Twitter? What if there were a clear separation of concerns between reliable back end systems and user-facing client applications, linked together by a robust, universal API? If providers could pick and choose the component pieces of their front end software the same way that Twitter users can swap out Twitter clients, the basis of competition would radically shift towards usability. Smaller vendors would be able to compete on equal footing with the large vendors, at least on an app-by-app basis. This one makes the best CPOE module, this one the best stress-test documentation, this one the best eMAR. Providers could hire developers to write custom in-house apps at a fraction of the present expense and difficulty.
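To make the idea concrete, here is a minimal sketch in Python of the architecture this scenario imagines. Nothing here is a real EMR product or API; every class and method name is invented purely for illustration:

```python
class EMRBackend:
    """One reliable back end system of record, exposed through an API."""
    def __init__(self):
        self.orders = []

    def place_order(self, patient_id, order):
        self.orders.append((patient_id, order))

    def orders_for(self, patient_id):
        return [o for p, o in self.orders if p == patient_id]


class BigVendorCPOE:
    """A front-end CPOE app from a large vendor."""
    def __init__(self, api):
        self.api = api

    def order(self, patient_id, med):
        self.api.place_order(patient_id, med)


class SmallVendorCPOE:
    """A competing CPOE app from a small vendor, with its own niceties
    (here, tidying up free-text input). Because both apps speak the same
    API, a hospital can swap one for the other without losing data."""
    def __init__(self, api):
        self.api = api

    def order(self, patient_id, med):
        self.api.place_order(patient_id, med.lower().strip())


backend = EMRBackend()
BigVendorCPOE(backend).order("pt-1", "aspirin 81 mg")
SmallVendorCPOE(backend).order("pt-1", "  Heparin 5000 units  ")
print(backend.orders_for("pt-1"))  # ['aspirin 81 mg', 'heparin 5000 units']
```

The point of the sketch is the last line: no matter which front end placed the order, the data lives in the one system of record, so switching vendors costs the hospital nothing but the price of the new app.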

Is it even possible for the industry to change in this way? In Part Two of this series, I will discuss the changes that would need to take place for such a scenario to occur. I will also identify the institutions that are the most likely to be able to effect the necessary changes, and whether and how they would be motivated to do so.

|  5 Jun 2012




Dr. Rick on the ONC/NIST Usability Conference

Dr. Rick on the ONC/NIST Usability Conference:

I had the pleasure of meeting Dr. Rick Weinhaus at this year’s Creating Usable EHR conference in Bethesda, Maryland. I was also fortunate to assist with this, his latest article on HIStalk, in which he reflects on the lessons learned there.

|  4 Jun 2012




Why is Photo Sharing Still So Hard?

Why is Photo Sharing Still So Hard?:

Phil Freo asks:

I’ve got the latest iPhone with its 8MP camera and HD video camera, complete with iOS 5 and I pay for extra storage on iCloud. Apple’s supposed to be the best at designing simple user experiences across hardware and software – and I believe they are.

So when I want to take a bunch of photos and videos that I took from my iPhone and share those with some family members, it should be simple right?

Yet it’s not.

I couldn’t agree more.

|  4 Jun 2012




The Ongoing Confusion with iOS App Icons

Whenever Apple introduces an iOS device with a new user interface idiom or screen resolution, developers have to include additional app icons to match the expected dimensions and filenames. For example, before the iPhone 4 or the iPad, there were only a few app icons:

- Icon.png
- Icon-Small.png
- iTunesArtwork

Three icons: one for the home screen, one for Settings.app (and, later, search results), and one for the iTunes App Store. When the iPhone 4 and the iPad were added, the list got longer:

- Icon.png
- Icon-Small.png
- iTunesArtwork
- Icon@2x.png
- Icon-Small@2x.png
- iTunesArtwork@2x
- Icon-72.png
- Icon-Small-50.png

With the introduction of the 3rd generation iPad this year, the list is even longer:

- Icon.png
- Icon-Small.png
- iTunesArtwork
- Icon@2x.png
- Icon-Small@2x.png
- iTunesArtwork@2x
- Icon-72.png
- Icon-Small-50.png
- Icon-72@2x.png
- Icon-Small-50@2x.png

Along the way, much confusion has been created around iOS app icons. Questions that continually plague developers:

  1. Naming: What should each image be called? For apps that must support iOS 3.1.3 or earlier, icon files must adopt the fixed naming scheme listed above. Apple added the soundalike CFBundleIconFiles and CFBundleIcons keys in iOS 3.2 and 5.0 respectively. You use these keys when setting up the “Icon Files” array in the Info.plist for an app. When using the CFBundleIcons key, filenames can be anything you wish, as long as the retina-resolution files have the same root filenames (plus the @2x suffix) as their non-retina counterparts — unless, that is, you include the file extensions in the Info.plist. If you’re easily confused by all these changes and exceptions, you are not alone.
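For apps targeting iOS 5 or later, the CFBundleIcons structure described above ends up looking something like the Info.plist fragment below. The filenames are illustrative; when the extensions are omitted, the system locates the @2x variants automatically:

```xml
<key>CFBundleIcons</key>
<dict>
    <key>CFBundlePrimaryIcon</key>
    <dict>
        <key>CFBundleIconFiles</key>
        <array>
            <string>Icon</string>
            <string>Icon-Small</string>
            <string>Icon-72</string>
            <string>Icon-Small-50</string>
        </array>
    </dict>
</dict>
```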

  2. Info.plist Arrays: The Info.plist allows you to add an array of “Icon Files” in which you list the filenames for the included app icons in your app bundle. Apple has a technical document explaining how to set up this array here, but this document hasn’t been updated since July 2011. Since then, iOS 5 has been released (along with the CFBundleIcons key), the retina iPad has been on the market for months, and we’re two weeks away from the iOS 6 announcement. The document still offers no instructions for dealing with the new icons and plist key. Adding to the confusion, Xcode may sometimes add a duplicate Icon Files array called “Icon files (iOS 5)”, as per this StackOverflow question. It’s still not clear whether this duplicate array is intentional and should be preserved for forward compatibility, or whether it’s a bug in Xcode.

  3. Bundle Location: Apple’s technical documents state that app icons and iTunesArtwork files should be kept at the top level of the bundle directory, but neither Xcode nor iTunes Connect triggers an error if the files are buried in some other sub-directory. I only just discovered this requirement tonight. This may explain why Pillboxie’s iTunes Artwork on the retina iPad App Store is still showing the non-retina 512x512 version, even though I’ve included the 1024x1024 version.

  4. Poor Documentation: Developers have to consult way too many Apple documents just to answer the basic questions about app icons. When a blogger offers more helpful documentation than Apple, you know there’s a problem.

  5. Unexplained App Rejections: A few developers, myself included, have had apps that were built, archived, and submitted to iTunes Connect without any hiccups, only to receive a cryptic email ten or fifteen minutes after submission stating that an app icon file appears to be corrupt. Solutions I’ve found include disabling PNG compression in the build settings and making sure that no app icons (or launch images) were exported from Photoshop with interlacing enabled. See this StackOverflow post for more information.
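Since interlaced PNGs are one reported trigger for these rejections, you can check for them without reopening every file in Photoshop. Per the PNG specification, the interlace method is the last byte of the IHDR chunk, which always lands at byte offset 28 in a valid PNG; a value of 1 means Adam7 interlacing. A small Python sketch:

```python
import struct

PNG_SIGNATURE = b"\x89PNG\r\n\x1a\n"

def is_interlaced(png_bytes):
    """True if a PNG uses Adam7 interlacing (IHDR interlace byte == 1).

    IHDR is always the first chunk: 8-byte signature, 4-byte length,
    4-byte chunk type, then 13 data bytes ending with the interlace byte.
    """
    if not png_bytes.startswith(PNG_SIGNATURE):
        raise ValueError("not a PNG file")
    return png_bytes[28] == 1

# Build a minimal (headers-only) PNG in memory to demonstrate:
# 57x57 pixels, 8-bit depth, RGBA color, interlace method 1 (Adam7).
ihdr = struct.pack(">IIBBBBB", 57, 57, 8, 6, 0, 0, 1)
fake_png = PNG_SIGNATURE + struct.pack(">I", 13) + b"IHDR" + ihdr
print(is_interlaced(fake_png))  # True
```

In practice you would run `is_interlaced` over the bytes of every .png in the app bundle before archiving.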

  6. UPDATED: Inconsistent Border Radii: I forgot to mention the problems that Neven Mrgan has explained better than I ever could about the way iOS, iTunes, and Safari apply app icon border radii. Even if you or your designer submits all app icons without alpha-transparent corners (which Mrgan recommends), it is still very difficult to get edge highlights and shadows to appear exactly the way you wish. The worst offender is iTunes Connect’s app info page. Thankfully, it isn’t customer-facing, but it’s a dramatic illustration of the problem:

itunesConnect

WWDC is coming soon, so hopefully some of this confusion will be addressed this time around. Or maybe things will continue to get worse.

|  2 Jun 2012




Never Go Cheap on Business Cards

With business cards, there really is no middle ground; there are two general tiers. There’s digital (or offset) printing, which is inexpensive, and it shows: these cards look grainy and bland. Then there’s the old-fashioned way: letterpress, foil stamping, duplexing, quality stock, etc. The difference in cost is not trivial. In my experience, digital or offset cards can cost from 20 to 50 cents per card, whereas “real” cards can cost anywhere from 75 cents to 3 dollars per card, depending on the options. But the difference in quality is dramatic.

I spent $750 on one thousand cards for Splint, and I’ve never regretted it. I used Henry & Co in Atlanta (warning: Flash-only site). Everyone has their own priorities; some people view cards that expensive as a waste. My opinion is that if I’m giving something to a customer (a product, a meal, a business card), it had damn better be nice.

|  31 May 2012