Is ONC putting too much hope on the cloud?

The Office of the National Coordinator for Health Information Technology (ONC) is pinning its latest hopes for interoperability on the cloud, specifically on application programming interfaces, or APIs. But is that hope realistic?

Experts say it will be a close call to get the API standards ready in time to meet requirements in the April 27 release of the Medicare Access and CHIP Reauthorization Act of 2015 (MACRA) proposed rule. The rule makes achieving health data interoperability by December 31, 2018, a national objective.

A measure in the MACRA proposal, “View, Download, Transmit (VDT),” calls for eligible clinicians to help patients:

  • view, download or transmit their health information to a third party;
  • access their health information through the use of an API; or
  • do a combination of both.

In an April 27 blog, Andy Slavitt, acting administrator for the Centers for Medicare & Medicaid Services (CMS), gave an indication of ONC’s hopes for APIs, saying they will “open up the physician desktop to allow apps, analytic tools, and medical devices to plug and play.”

Setting apps on ‘fire’

There’s one problem with that. The APIs aren’t ready yet. HL7 (Health Level Seven International), a group that sets standards, formats and definitions for exchanging and developing electronic health records (EHRs), is working on standards for healthcare APIs called Fast Healthcare Interoperability Resources (FHIR). It’s pronounced “fire,” and there are plenty of references to FHIR setting the whole health app world on fire.

Gary Dickinson, co-chair of HL7’s EHR workgroup, believes ONC may be putting too much pressure on HL7 to get the APIs completed by 2018. He thinks FHIR will “probably” be ready in time for what the MACRA proposal requires, but not for broader use. Some companies in the industry may go ahead and use the FHIR standards before HL7 has vetted them, he said.

Developing FHIR standards for APIs is not something that can be rushed. FHIR divides data into “bite-sized chunks,” called resources, for transmission. More than 100 resources have been identified so far, and each must be vetted, a process involving more than 50 HL7 workgroups. Developers rate each resource on a maturity scale of zero through five; most resources today are rated zero or one, Dickinson says.
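
To make the idea of a resource concrete, here is a minimal sketch of what one of those “bite-sized chunks” looks like as FHIR JSON, expressed in Python; the field names follow the published FHIR Patient resource, but the identifier and demographic values are invented for illustration.

```python
# Illustrative only: a minimal FHIR Patient resource built as a Python dict.
# Field names follow the published FHIR Patient structure; the ID and
# demographic values are hypothetical.
import json

patient_resource = {
    "resourceType": "Patient",                        # every resource names its type
    "id": "example-001",                              # hypothetical logical ID
    "name": [{"family": "Rivera", "given": ["Ana"]}],
    "gender": "female",
    "birthDate": "1972-03-14",
}

# Serialized to JSON, this is the payload a FHIR API would send or receive.
print(json.dumps(patient_resource, indent=2))
```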

HL7 is shooting to have the third in a series of trial versions of FHIR ready by Dec. 31, but even that is likely “pushing it,” he says. The timeframe largely depends on the number and extent of updates required by comments received in the upcoming August FHIR ballot.

John Halamka, MD, chief information officer of the Beth Israel Deaconess Medical Center, a professor of medicine at Harvard Medical School and co-chair of the HIT Standards Committee, says FHIR is key to the future of query/response interoperability, which allows users to pull data from wherever it resides.

“We’re at a time in history when the private sector, driven by customer demand, is implementing novel solutions to healthcare information exchange,” Halamka says. “FHIR is already in production in many applications and every major HIT vendor will have FHIR application program interfaces in production this year.”
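
As a rough sketch of what query/response interoperability can look like in practice, the snippet below performs a standard FHIR search over HTTP; the server base URL and patient ID are hypothetical, and a production client would also present an OAuth2 access token.

```python
# Sketch of a FHIR search (query/response) against a hypothetical server;
# real deployments add authentication and error handling.
import requests

FHIR_BASE = "https://fhir.example-hospital.org/baseR4"   # hypothetical endpoint

def fetch_lab_results(patient_id: str) -> list:
    """Pull laboratory Observation resources for a patient, wherever they reside."""
    resp = requests.get(
        f"{FHIR_BASE}/Observation",
        params={"patient": patient_id, "category": "laboratory"},
        headers={"Accept": "application/fhir+json"},
        timeout=10,
    )
    resp.raise_for_status()
    bundle = resp.json()                      # FHIR searches return a Bundle
    return [entry["resource"] for entry in bundle.get("entry", [])]

# Example usage: print a short description of each returned result.
for observation in fetch_lab_results("example-001"):
    print(observation.get("code", {}).get("text"))
```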

Is ONC pushing too hard?

Justin Barnes, a health IT industry advisor and thought leader, doesn’t think ONC is necessarily trying to force the use of APIs in the new MACRA proposal. Barnes has been an advisor to the White House and has worked with Congress and regulatory agencies on health IT issues for more than a decade. His interpretation is that ONC officials are trying to mandate flexibility, usability and interoperability. “I don’t feel they’re pushing for regulatory granularity,” he says. “They want to allow creativity.”

Robert Tennant, director of health information technology policy at the Medical Group Management Association, agrees that ONC is not forcing the use of APIs in its MACRA proposal, but “it’s clearly in there.” For Tennant, it all comes down to one thing: “trying to find a balance between what patients want and what doctors can handle.”

“The government would say that interoperability is the seamless flow of information,” Tennant says. “But the question is, does every record need to be interoperable? Probably not.”

David Kibbe, MD, is president and CEO of DirectTrust, a collaborative non-profit association of 145 health IT and healthcare provider organizations that supports secure, interoperable health information exchange via the Direct message protocols. Kibbe is also a senior advisor to the American Academy of Family Physicians. He says that, with FHIR, APIs will help patients get a fuller picture of their own health information, because apps will help them access it and see it in new ways. But some aspects of developing FHIR are going to be difficult, especially cross-organizational use cases.

Of the MACRA proposed API requirements, Kibbe says, “it will be an enormous challenge for both providers and vendors to meet the new objectives and measures within the current time frames, with all the other additional objectives and measures required.”

Bipartisan optimism

At a May 11 House Ways and Means Committee hearing on MACRA, Slavitt said the proposal is just a starting place for the discussion. “It will take work and broad participation to get it right.”

MACRA was a bipartisan effort, and true to those roots, optimism about the proposal at the hearing was also bipartisan. Rep. Ron Kind (D-Wis.) said MACRA is “all about finding ways to care for patients.”

Rep. Peter Roskam (R-Ill.) said, “We’re on the verge of good things.”

Via HealthDataManagement.com »

What holds healthcare back from the cloud

Web designer Chris Watterston put it best when he created a sticker that went viral: “There is no cloud, it’s just someone else’s computer.”

It’s that very issue that makes the cloud both appealing and unappealing to healthcare providers. It’s appealing because it provides scalable, usable storage for the expanded needs of today’s healthcare market, including large genomic files and digital imagery. Few providers can store this kind of data in-house, so they turn to the cloud.

But the fear of the cloud being “out there” leaves the impression that data is vulnerable, and keeps some healthcare providers away.

Ed Cantwell, executive director of the Center for Medical Interoperability, says people get tripped up over who accesses the cloud, and how. “They think, if it’s in the cloud, it’s a free-for-all. But that’s not the case at all,” he says. “I’m not so sure if a hacker cares if you are in the cloud or locked in a vault. If you’re in the cloud, you’re still located somewhere physically.”

Security is definitely a theme behind cloud concerns. James Custer, director of product development at Yorktel, says when it comes to the cloud, fears about HIPAA compliance are front and center.

“There is always this huge hesitation when the cloud is discussed,” Custer says, which is why the paperwork and sign-off required to use the cloud can take a healthcare organization up to a year. Despite the difficulties, the cloud has been a boon for smaller hospital systems that can’t afford their own infrastructure. “The cloud has been huge for them,” he says.

Ray Derochers, executive vice president of HealthEdge, a cloud hosting company that serves mainly health insurers, says that despite any initial hesitancy, most large insurance companies are moving to the cloud.

Beyond security, there is also the need to decide what information to move to the cloud. Because of the confidentiality and complexities of the insurance business, there is no way all the data is going to the cloud, Derochers says. As a result, “people are afraid to take a bold position. They don’t comprehend all the moving parts.”

Tips for managing cloud technical and security issues

David Furnus, the site information security director at Stanford Health Care – ValleyCare, says, “the cloud isn’t impervious to attack; that’s a given.” But acknowledging that is what helps ensure protection.

Furnus suggests engineering resilience into systems and applications. This means “to expect we will be breached and to be prepared to detect, contain, and correct the breach as quickly and as effectively as we are able.”

The security of data transmission to and from the cloud is “a non-issue,” Furnus says, if the cryptographic controls in use meet or exceed the requirements of Federal Information Processing Standard Publication 140-2 (FIPS PUB 140-2), a federal computer security standard used to accredit cryptographic modules.
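
As a rough illustration of that point, the sketch below enforces a modern, certificate-verified TLS channel for an upload to a hypothetical cloud endpoint; FIPS 140-2 validation itself applies to the underlying cryptographic module (for example, a validated OpenSSL build), not to application code like this.

```python
# Minimal sketch: require TLS 1.2+ with certificate and hostname verification
# for data sent to a hypothetical cloud endpoint. FIPS 140-2 compliance is a
# property of the crypto module the interpreter is linked against.
import ssl
import urllib.request

UPLOAD_URL = "https://cloud.example-host.com/records"    # hypothetical endpoint

context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2         # reject older protocols
context.check_hostname = True                            # verify the server name
context.verify_mode = ssl.CERT_REQUIRED                  # reject unverified certs

def upload_record(payload: bytes) -> int:
    """Send an encrypted record to the cloud host and return the HTTP status."""
    request = urllib.request.Request(UPLOAD_URL, data=payload, method="PUT")
    with urllib.request.urlopen(request, context=context, timeout=10) as resp:
        return resp.status

# Example: upload_record(b'{"patient": "example-001"}')
```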

According to Furnus, providers should only consider using the cloud if the cloud host, at a minimum, complies with the Federal Risk and Authorization Management Program (FedRAMP), a government-wide program that provides a standardized approach to security assessment, authorization and continuous monitoring for cloud products and services. In addition, the cloud provider should “be subject to the successful negotiation of other client-specific security requirements.”

Lee Kim, director of privacy and security at the Healthcare Information and Management Systems Society (HIMSS), North America, says there are a number of things to look for when selecting a cloud provider.

First, make sure the cloud host offers on-demand access to the data, with few interruptions; this is critical in healthcare. Does the host have a good track record of making data available during business hours? Cloud hosts schedule downtime for maintenance, but do they also have frequent unscheduled downtime when physicians might need patient records? Ask colleagues who have used the cloud provider. “Don’t believe what marketing people say on the website; it’s so much more than that,” says Kim, who advises getting any promises or assurances about medical record hosting in writing. Chances are that if it’s not in writing, it’s not part of the agreement.

Get a copy of the cloud host’s most recent risk assessment to see how well it is doing with security, Kim advises, and check what security controls it uses. A good rule of thumb when it comes to cloud security: “sometimes you get what you pay for.”

Be wary of small start-up cloud services, she adds. Will they be around in a year? Many venture capital firms own cloud companies only temporarily, planning to sell them. With a large cloud provider that has been in business for 10 years or more, there is a little more assurance it will stay in business a while, Kim says.

Check out the company’s customer service capability. Sometimes it’s limited. “In this day and age, it can make a world of difference what the customer service is like. If the company isn’t responsive and keeps kicking the can down the road, that’s not good, especially, when it comes to caring for patients,” she says. “Physicians can’t fight with the technology and take care of patients at the same time.”

In terms of managing risk after you have legally bound yourself to a cloud company, Kim says, make sure someone in the organization keeps up with the vendor and serves as a liaison.

Via HealthDataManagement.com »

Power of the cloud spurs big push to boost interoperability

Health IT experts believe the cloud holds great potential for interoperable health data exchange, perhaps even supporting international standardization for the Internet of Things, and that it may help power data-intensive exchange initiatives.

Precision medicine based on genomics, with its huge amounts of complex digital information, will tie masses of information to a single electronic health record, says Lee Kim, director of privacy and security for the Healthcare Information and Management Systems Society (HIMSS) North America. Test results, summaries of imaging studies and genomic data will make relying on an in-house clinical server model impossible, Kim says. “It’s way too much data. I could definitely see the cloud playing a future role.”

“I’m not sure that cloud technologies by themselves necessarily enhance interoperability,” says John Halamka, MD, chief information officer of the Beth Israel Deaconess Medical Center and a professor of medicine at Harvard Medical School, who has co-chaired federal workgroups on the standards needed for interoperability. “However, there are cloud-based services that could reduce the burden of interoperability implementation.”

Still, the healthcare industry has some catching up to do when it comes to using the cloud, says Ed Cantwell, executive director of the Center for Medical Interoperability (CMI), a nonprofit with mainly health system members that is striving to accelerate the seamless exchange of information. Sectors like the financial industry have relied on the flow of data to survive, but something has blocked the healthcare industry from following suit, Cantwell says.

“You can walk into any hospital in this country and systems don’t talk to each other. They don’t work in a plug-and-play manner,” says Kerry McDermott, vice president of public policy and communications at CMI. “Health systems want change,” she says. “They are living and breathing the problem every day.”

CMI is currently selecting 100 engineers to help develop a blueprint for interoperability that will include cloud and non-cloud solutions. The blueprint will be used to certify healthcare products as capable of working in the cloud. Engineers from “some of the biggest players” in other industries are under consideration for the project, Cantwell says.

CMI’s membership represents $100 billion in procurement power, and it is this, plus the opportunity to expand into the healthcare sector, that has drawn interest. When the selected engineers are revealed in a few weeks, they will work in CMI’s centralized lab to tackle interoperability. It is “a game changer” to have providers, with their purchasing power, driving the change, McDermott says. CMI aims to include all devices across the continuum of care in the blueprint, and a pilot will be ready before the end of the year, she says.

In Massachusetts, Halamka says, such cloud-based services include a provider directory for fast, reliable lookup of any provider’s Direct address, along with a master patient index, consent registry and record locator service to support patient identification and the assembly of records from multiple sources. The state also has a cloud-based quality registry for aggregating quality data.

Interoperability issues have been exacerbated, not lessened, with the adoption of electronic health records. To advance healthcare through the use of data, the federal government sees the need to boost interoperability; for example, rules to enact the Medicare Access and CHIP Reauthorization Act of 2015 (MACRA) emphasize achieving interoperability through the use of application programming interfaces (APIs).

In January, Centers for Medicare and Medicaid Services Acting Administrator Andy Slavitt gave a hint at the government’s direction when he said the agency will promote innovation “by unlocking electronic health information through open APIs – technology tools that underpin many consumer applications.”

“Interoperability will be a top priority, through the implementation of federally recognized, national interoperability standards and a focus on real-world uses of technology,” he said of CMS’s plans to interpret MACRA.

Even as interoperability remains up for grabs, it’s clear that the cloud will provide a great deal of help toward other healthcare goals, such as simplifying the currently complex work of finding participants for clinical trials and holding the data needed to conduct them.

“The cloud is extremely important for clinical trials,” says Luca Emili, founder and CEO of Promeditec, an Italy-based company that develops technology to accelerate clinical trials. About 10 years ago, the quantity of data collected per patient was quite low. Now, with the addition of wearable devices, digital images and genomic data, hospitals need a new strategy for collecting this data, he says.

Promeditec recently chose Verizon Cloud to support the delivery of its AppClinical Trial solution, a software-as-a-service (SaaS) offering built on a collaborative platform that can be scaled to set up trials and to capture and manage trial data. Using the cloud with this platform has helped cut the expense and time of clinical trials, which can cost as much as $3 billion over the 10 to 15 years required to conduct a trial.

A patient’s genomic information often comprises 300 gigabytes of data, and hospitals that want to participate in clinical trials in the future will need to use the cloud because of the sheer volume of data that large trials could involve. The cloud also enables the use of data gathered worldwide, and hospitals can no longer store this quantity of data in-house, Emili says.

Jim Hollingshead, president of San Diego-based ResMed, a provider of connected healthcare solutions for remote monitoring of patients with respiratory illnesses, has found a way to use the cloud to save money and increase the flow of data. ResMed’s clients, mainly home medical equipment (HME) providers, are required by Medicare to show proof that patients are using VPAP and CPAP breathing devices. Previously, removable data cards were used for this purpose, but ResMed replaced them with cellular chips that send data straight to the cloud. Now an HME can go online and verify usage, which greatly reduces the labor required and saves the HMEs money.

A completely unexpected benefit of going to the cloud was the ability to identify patients who need intervention to stay compliant. Adherence levels jumped, as did consumer interest. Some 900 consumers visit the site each day to check their usage data, and several hundred thousand patients look at their data persistently. “We were shocked” that patients latched onto this, he said; there was unquestionably an underlying need.

The software platform has an API that enables hospitals to connect with the HMEs through the cloud, making the EHRs interoperable, which is especially important to providers in networks and accountable care organizations (ACOs). “We see the world going to an integrated place,” Hollingshead says.
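
For a sense of how such a cloud API might be consumed by an HME or a hospital system, here is a minimal sketch; the endpoint, authentication scheme, parameters and response fields are hypothetical and are not drawn from ResMed’s actual platform.

```python
# Hypothetical sketch of a device-usage query against a cloud platform;
# the API base URL, token, and field names are invented for illustration.
import requests

API_BASE = "https://api.example-device-cloud.com/v1"   # hypothetical endpoint
API_TOKEN = "replace-with-issued-token"                # credential issued to the HME

def nightly_usage_hours(patient_id: str, date: str) -> float:
    """Return hours of device use reported for a patient on a given date."""
    resp = requests.get(
        f"{API_BASE}/patients/{patient_id}/usage",
        params={"date": date},
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()
    return float(resp.json().get("hours_used", 0.0))

# An HME could use a call like this to back up a Medicare compliance report:
# print(nightly_usage_hours("patient-001", "2016-05-10"))
```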

The company launched its SaaS API in 2014, and it was quickly adopted. Berg Insight confirmed that, in a 16-month span, ResMed had become the world’s leader in remote patient monitoring, with more than 1 million connected devices. “The cloud is the next wave of creating value,” Hollingshead says.

The cloud fits many of healthcare’s needs to exchange information and will play a vital role, says CMI’s Cantwell. “It’s not an issue of whether or not healthcare is going to adopt the cloud; healthcare has already started to.”

But the speed of adoption depends on open data. “Once data becomes democratized, almost the only way you can allow that data to really be a source of true innovation is to open it up to more cloud-based services,” Cantwell says.

However, “it’s only a matter of time,” says Kim from HIMSS. Some hospitals are becoming more geographically diverse, and they’ll need data on demand. The cloud is scalable and can provide not only computing power and storage space, but also convenience, she says.

Via HealthDataManagement.com »