Do we need more or less healthcare IT regulation and legislation?

As I clarified last week in my post about certification, the answer to the question “do we need more or less healthcare IT regulation and legislation” is that we need the right amount of the right regulation/legislation.

Sometimes when clinicians prescribe medication, although it does therapeutic good, it creates side effects which need to be addressed by changing a dose or by adding additional medications.

Such is the case with HITECH. It was generally good medicine, but now that we’ve seen the side effects on workflow, clinician burden and efficiency, there needs to be a dose adjustment.

I was recently asked to review the “Improving Health Information Technology Act” introduced by Senator Lamar Alexander, R-Tennessee, in February 2016 and placed on the Senate Legislative Calendar in April 2016. Its intent is good – to refine existing healthcare IT legislation with fixes that enable the right amount of the right regulation.

A summary and the full text of the bill are available online.

Here’s my analysis, section by section:

“1)  Assisting Doctors and Hospitals in Improving Quality of Care for Patients

Reduces documentation burdens by convening public and private stakeholders to develop goals, a strategy, and recommendations to minimize the documentation burden on providers while maintaining quality.”

This is a good thing. It fixes the language in HITECH that required each stage of Meaningful Use to be more stringent than the last, which forced regulators to make every update more challenging. The Improving Health Information Technology Act enables regulators to better balance benefit and burden.

“Allows and encourages health professionals to practice at the top of their license, allowing non-physician members of the care team to document on behalf of physicians.”

This is a good thing. It encourages more team-based care and documentation. Using electronic systems effectively is a team sport and should leverage social networking/groupware ideas to capture electronic data.

“Encourages the certification of health information technology for specialty providers and sites of service, like pediatric care, where more specialized technology is needed.”

As long as the certification focuses on a few key important ideas, as noted in my previous post, this is a good thing. One set of required functionality does not make sense for diverse software supporting specific specialties.

“2) Transparent Ratings on Usability and Security to Transform Information Technology (TRUST IT)

Establishes an unbiased rating system for HIT products to help providers better choose HIT products.”

A government program to do this is unnecessary. The private sector has KLAS and other companies providing such information already.

“Allows HIT users to share feedback on the user experience of specific HIT products related to security, usability, and interoperability, among other concerns.”

A government program to do this is unnecessary. The private sector has KLAS and other companies providing such aggregations already.

“3) Information Blocking
a. Gives the Department of Health and Human Services (HHS) Office of the Inspector General the authority to investigate and establish deterrents to information blocking practices that interfere with appropriate sharing of electronic health information”

Although I have not personally experienced information blocking, I hear anecdotally that there are some places in the US where competing systems refuse to share data with each other. Giving the OIG the ability to investigate is reasonable. It’s not clear there will be much to investigate.

“4) Interoperability

Convenes existing data sharing networks to develop a voluntary model framework and common agreement for the secure exchange of health information across existing networks to help foster bridging between networks.”

Convening stakeholders to develop a voluntary framework is reasonable. However, I believe the private sector will do this on its own in 2016.

“Creates a digital provider directory to both facilitate exchange and allow users to verify the correct recipient.”

This is a good thing. CMS could leverage the existing national provider identifier system.
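As a minimal sketch of what such a directory lookup could build on: CMS already exposes NPI data through the public NPPES NPI Registry API. The endpoint and parameters below reflect that public API as I understand it, and the NPI number is a placeholder; treat the details as assumptions to verify against CMS documentation.

```python
import requests

# Public NPPES NPI Registry endpoint (assumed; verify against CMS docs).
NPPES_URL = "https://npiregistry.cms.hhs.gov/api/"

def lookup_provider(npi_number: str) -> dict:
    """Return basic registry data for one NPI, or {} if no match."""
    resp = requests.get(NPPES_URL, params={"version": "2.1", "number": npi_number})
    resp.raise_for_status()
    results = resp.json().get("results", [])
    return results[0] if results else {}

record = lookup_provider("1234567890")  # placeholder NPI, not a real provider
if record:
    basic = record.get("basic", {})
    print(basic.get("first_name"), basic.get("last_name"))
```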

“Requires that HHS give deference to standards developed in the private sector.”

This is a good thing. The private sector is moving very fast to embrace simpler standards such as FHIR.

“5)  Leveraging Health Information Technology to Improve Patient Care

Requires that certified HIT exchange data with registries if registries are certified to use standards endorsed by the Office of the National Coordinator (ONC).”

There are no mature/adopted standards for registry exchange at this time. In the interest of comprehensiveness, ONC has tended to publish/endorse standards that are not yet ready for adoption. Registry participation should be left to the marketplace.

“Includes vendors in Patient Safety Organizations to allow for improvements in the safety and effectiveness of HIT.”

This is very reasonable.

“6) Empowering Patients and Improving Patient Access to Their Electronic Health Information
a. Supports the certification and development of patient-centered health record technology so that patients can access their health information through secure and user-friendly software that may update automatically.”

Although patient and family engagement is very important, it is not something that the government should certify. Apple and other consumer companies are innovating at the speed of the market, taking us in new directions that government could not have predicted.

“Encourages the use of Health Information Exchanges to promote patient access by educating providers and clarifying misunderstandings.”

Health information exchanges really do not have a role in patient/family engagement. The new approaches implemented by Apple and other innovators directly connect the patient and provider.

“Requires HHS to clarify situations where it is permissible for providers to share patient information by providing best practices and common cases where sharing is allowed.”

Clarifying HIPAA through education is a good thing.

“7) GAO Study on Patient Matching
a. Directs the Government Accountability Office (GAO) to conduct a study to review methods for securely matching patient records to the correct patient.”

This is a good thing. We are not going to be able to consolidate records across the care continuum unless we can identify the patient.

There you have it – a dose adjustment for HITECH. Dose adjustments can have their own side effects. Hopefully the bill will be adjusted as suggested above before it is passed. The goal of any new legislation/regulation, just as with medical care itself, should be to first do no harm.

This blog post first appeared on Life as a Healthcare CIO.

Via HealthcareITNews.com »

At iHT2 Boston, Micky Tripathi’s Refreshing Take on Interoperability

During a closing keynote presentation last week at the iHT2 Boston Health IT Summit, Micky Tripathi, Ph.D., president and CEO of the Massachusetts eHealth Collaborative (MAeHC), debunked certain healthcare interoperability “myths” while offering a positive outlook on the future of data exchange.

The event, from the Institute for Health Technology Transformation (iHT2—a sister organization to Healthcare Informatics under the Vendome Group, LLC corporate umbrella), took place at the Aloft Seaport Hotel in Boston on June 23-24, and closed with Tripathi’s Friday keynote on healthcare interoperability.

In addition to his role at MAeHC, a collaboration of Massachusetts provider, payer, and purchaser organizations, Tripathi wears a variety of other health IT hats: he is chair of the Information Exchange Working Group and co-chair of the Privacy and Security Tiger Team (both of the federal Health Information Technology Policy Committee), a director of the New England Health Exchange Network (NEHEN), and a director and past board chair of the eHealth Initiative. Simply put, when it comes to interoperability and standards, no one in health IT is better versed than Tripathi. I compare it to the NBA, where players who can consistently score the basketball are labeled “go-to guys.” For healthcare, Tripathi is the go-to guy for all things interoperability.

Tripathi opened his presentation by asking two questions to the room full of attendees: first, if they believe information blocking significantly exists in healthcare; and second, if they think that the healthcare sector is woefully lagging behind other industries in terms of being interoperable. Predictably, the majority of hands went up in affirmation of both questions. Knowing this would be the likely answer to his two questions, Tripathi moved on in an attempt to debunk these “myths.”

Indeed, looking at other industries, Tripathi noted how he gets Google Calendar invites all the time that don’t sync well in Microsoft Outlook. Or, he said, books purchased at Barnes and Noble don’t play on the Amazon Kindle. He gave several more examples of how in other businesses, companies don’t always “play nice” with one another: Apple isn’t interoperable with anyone; Netflix and Verizon recently had a fight about who should pay for the infrastructure for Netflix consumers, resulting in poor streaming quality; Fitbit has said that it’s not connecting with Google Fit, choosing to create its own network; and finally, consumers can no longer use third-party coffee pods in a Keurig.

“Interoperability problems are rampant across all industries, public safety included,” Tripathi attested. “I’d argue that [these examples] are no different than what’s happening in healthcare. In some ways, since we have higher expectations in healthcare, we are actually doing better. We need to exchange data; other industries might not have to.” Tripathi then touched on how these interoperability issues get “resolved” in other industries, offering the example of universal product codes (UPC) in grocery stores that adopted them after having problems with inventory control. “Grocery stores created UPCs with a bunch of other grocery stores and vendors. They wanted to all purchase the same machines and get value from them,” Tripathi explained.

Thus, as HIE [health information exchange] matures, it is starting to organize itself like other industries, Tripathi said. Now the question becomes, how are these data exchange networks going to form? The early notions were of a single, federal top-down network, and that collapsed as an idea. But now, networks are starting to form, he said. “It’s not about connecting an EHR [electronic health record] to an EHR, but about being a part of a network and connecting a network to a network. That’s how the rest of the economy has solved the issue in literally every instance.”

Tripathi pointed to several examples of separate networks forming and connecting in healthcare today. He brought up the eHealth Exchange for government data, the Mass HIway for local, state-based, lightweight exchange in Massachusetts, Surescripts for e-prescribing, DirectTrust for secure email, and Carequality as an emerging framework that allows query-based exchange among different participants. “We have so many different ways to communicate with one another based on the kind of communication we want, so we have different networks—just like any other industry. The original notion was to have one way of health information exchange, but there are very few examples where that has worked,” Tripathi said.

He continued, “What type of transaction do you want to make? DirectTrust is nationwide interoperability of secure email, and it doesn’t do anything else. But it’s something that has been carved out from the broader picture.” This is different from the all-or-nothing approach, or “HIE 1.0,” in which data would be dumped into a repository for everyone to be able to use for multiple purposes, Tripathi said.

Tripathi then noted how the marketplace is just beginning to see solutions for point-to-point query exchange, so a provider can query someone else’s system to get a record document, and then query another system. Carequality and CommonWell are starting to solve this problem, Tripathi said, adding that pretty much every major vendor except Epic, NextGen and GE is on board. And regarding Epic’s exclusion in these interoperability frameworks, Tripathi reminded folks that while there is not yet interoperability between CommonWell and Epic (Epic’s Care Everywhere product is for Epic users only), looking at other industries as a precedent suggests that it will eventually happen in healthcare, too. “These are the beginnings of a nationwide network to solve the point and retrieve issue,” he said.

FHIR is Just a Standard

While Tripathi repeatedly pushed the idea that healthcare is not in as much trouble as people like to say in terms of its users’ ability to exchange data, he did caution that the FHIR (Fast Healthcare Interoperability Resources) standard, while certainly a big part of health IT’s future, is not the magic bullet to solve all interoperability problems. “Have we hit the peak of the hype curve?” Tripathi asked. “FHIR is being talked about as the universal magic bullet. I am amazed by the hype of it. People are looking for an answer, and need a magic bullet,” he said.

But more than that, FHIR is a genuine data-level interface that allows a provider to ask another provider just for allergies for patient X, for example, Tripathi added. Right now, he noted, a provider might receive an entire C-CDA (consolidated-clinical document architecture) document even though he or she just needed the allergies. “With FHIR, you can say ‘here are the allergies.’ So that’s why there’s excitement around it. It gets us closer to the data integration goal that we all want.”
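To make the contrast concrete, here is a minimal sketch of that kind of data-level request using FHIR’s REST conventions. The server base URL and patient ID are hypothetical, and the field names follow recent FHIR versions (earlier versions name some fields differently):

```python
import requests

FHIR_BASE = "https://ehr.example.org/fhir"  # hypothetical FHIR endpoint
PATIENT_ID = "12345"                        # hypothetical patient ID

# Ask for one resource type -- the allergies -- instead of receiving an
# entire C-CDA document and digging the allergy section out of it.
resp = requests.get(
    f"{FHIR_BASE}/AllergyIntolerance",
    params={"patient": PATIENT_ID},
    headers={"Accept": "application/fhir+json"},
)
resp.raise_for_status()
bundle = resp.json()  # a FHIR Bundle of AllergyIntolerance resources

for entry in bundle.get("entry", []):
    allergy = entry["resource"]
    print(allergy.get("code", {}).get("text", "unknown substance"))
```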

Nonetheless, Tripathi referenced a KLAS interoperability survey from last fall which found that FHIR was the top thing people were excited about. “But I’d bet that 90 percent of those people can’t tell you much about FHIR at all,” he estimated. “I’m a big proponent of FHIR, but people will realize that it’s just another standard, and it won’t solve problems like money, legacy systems, and things like that.”

Tripathi is the project manager of the Argonaut Project—an initiative launched by Health Level Seven International (HL7) to accelerate the development and adoption of FHIR—whose leaders are currently writing app-enabled implementation guides that would let a person take a mobile app, or a host of applications, and have those apps connect in a seamless way. That’s the hope of FHIR, and it’s based on RESTful application program interfaces (APIs), which the rest of the Internet is based on, Tripathi noted. “Once you base it on something like that, you bring in a lot of other brains that are willing to experiment. There is a whole economy of developers out there right now that don’t want to enter the healthcare world because they think the standards we use are 25 years old. And they’re right.”
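One practical consequence of the RESTful design is discoverability: a FHIR server advertises what it supports at its /metadata endpoint, returning a CapabilityStatement resource (called Conformance in early FHIR versions). A short sketch against a hypothetical server:

```python
import requests

FHIR_BASE = "https://ehr.example.org/fhir"  # hypothetical server

# Every FHIR server publishes its capabilities at [base]/metadata.
resp = requests.get(f"{FHIR_BASE}/metadata",
                    headers={"Accept": "application/fhir+json"})
resp.raise_for_status()
capability = resp.json()

# List the resource types and interactions the server says it supports,
# so an app can check compatibility before attempting to connect.
for rest in capability.get("rest", []):
    for resource in rest.get("resource", []):
        interactions = [i["code"] for i in resource.get("interaction", [])]
        print(resource["type"], "->", ", ".join(interactions))
```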

One of FHIR’s challenges, Tripathi continued, is that a whole ecosystem has to form. “People get excited about the notion of having apps and just connecting them. Providers want the apps, and EHR vendors also like the idea of apps because they can’t keep up with the demand,” he said. “Medicine is way too complex, so a Cerner or an Epic can’t implement those things. So they like the idea of plugging your app into their system. But an ecosystem needs to form around that. How does that work?” He mentioned the Apple App Store and the Google Play store, which act as intermediaries where a user can go find those apps, following basic usability and quality principles, and some security principles. But Tripathi wanted to know where this happens in healthcare and who will ultimately step up to the plate.

He added that there are several options for how this can occur in healthcare, and who the app store pioneer will be is still up for debate. Possibilities, according to Tripathi, include: Geisinger or a similar health system; major EHR vendors; a third-party company; or the SMART (Substitutable Medical Applications & Reusable Technologies) on FHIR app gallery, which as of today is the closest we have come to a vendor-neutral app environment. That could be the place that becomes trusted first, Tripathi acknowledged. However, the one organization that the industry cannot afford to have do it is the federal government, he said. “There was talk about it, but we absolutely don’t want that. That can go down a bad path, and really quickly.” Tripathi himself is betting on EHR vendors and provider organizations to lead the way.

Certainly, listening to Tripathi’s keynote was particularly fascinating for me, since I’m mostly told how it will be years before we see true interoperability in healthcare, an opinion that was hammered home at a recent iHT2 event in April. To this end, I wrote a blog earlier this month about how 2015 data exchange numbers amongst U.S. hospitals—particularly the ability of providers to perform all of the core functions of interoperability—were nothing to write home about.

It’s certainly possible that it will take five to 10 years, or perhaps even more, for healthcare interoperability to catch up to other industries. But I don’t think that was Tripathi’s point with his keynote last week in Boston. Rather, he wanted to call out the improvements that we have seen over the years, and prove that healthcare is far from the only sector with system connectivity problems. It was extremely refreshing to hear Tripathi’s expert and balanced take on the topic. Once again, healthcare’s interoperability “go-to guy” delivers.

Via Healthcare Informatics »

HHS ups its efforts to foster interoperability

When Sylvia Mathews Burwell took the stage as the opening keynote speaker at the 2016 Healthcare Information and Management Systems Society (HIMSS) conference on Feb. 29, it was to deliver the latest deal with the private sector to advance the future of health care.

The Secretary of Health and Human Services had just secured an agreement from some of the industry’s biggest players to openly share information across multiple platforms to make it easier for providers and patients to access their medical histories.

“Sometimes the most important part of our job is to step back and let others take the lead,” she said. “That’s why today, I am very excited to announce that companies that provide over 90 percent of electronic health records used by U.S. hospitals have stepped up and made public commitments to make data work better for consumers and providers.”

The commitment, which included five of the country’s largest health care systems in addition to the industry’s leading developers, was another step toward the elusive grail of putting health care online: interoperability.

The key to the adoption of EHRs and their future in health care rests on the ability of providers to share data across multiple IT systems and of patients to take more ownership of their medical records and share them.

But drawing up a road map for how to standardize health information and then connect and share it securely across multiple networks through an entire industry of IT products — each with its own proprietary information — has proven not only Herculean in scope, but more like a series of mythic tasks.

HHS’s Office of the National Coordinator for Health Information Technology — which spearheads EHR efforts, regulation and certification of interoperable health IT — released such a road map on Oct. 6, 2015. The plan outlined interoperability goals to be reached by 2024, including encouraging health information sharing.

In effect, ONC wants to help build what it calls a “learning health system” that is both patient-centered and allows health care providers to access patient information on demand and securely, regardless of device or software system.

But part of the challenge facing interoperability efforts is crafting the governance and compelling the industry to develop such a system. The report outlining ONC’s interoperability road map detailed the difficulties of getting providers to adopt information-sharing systems without their value being apparent.

“While important progress is being made today, the health care landscape continues to be dominated by fragmentation in care delivery and payment models that are largely based on the volume of services delivered, rather than the delivery of efficient, high-quality care and better patient outcomes,” the report read.

“When providers are rewarded for value, interoperability can be a significant tool to help them meet such requirements, but broad demand for interoperability has lagged and been insufficient to drive connectivity across health care providers.”

Collaboration for innovation

A day after Burwell’s announcement, Karen DeSalvo, National Coordinator for Health Information Technology and Acting Assistant Secretary for Health, announced a draft rule at HIMSS that would allow ONC to have greater oversight in the testing of health IT products.

The goal, DeSalvo said at the time, was twofold: to ensure that the private sector had a framework under which to apply interoperability to its technologies and to give the government more oversight as to which technologies were achieving the information-sharing functionality.

“So we would say that we are going to set goals, we’re going to move forward, we’re going to put all of our weight in one direction as the federal government, and we’ve asked the private sector to come along. They said they would,” she said at the HIMSS conference.

ONC also launched a website listing the health IT developers whose technologies met the certification. But one of the biggest challenges facing the industry when it comes to information sharing is not achieving ONC certification, but the legislation that protects patient information, the Health Insurance Portability and Accountability Act of 1996, or HIPAA.

For patient information to be shared, it still must be both secure and compliant with the law, which places strong restrictions on to whom such information can be disclosed. The interpretations of HIPAA—among other industry issues—and how the information can be shared have varied widely, leading to a practice called information blocking.

Information blocking presents problems for ONC’s interoperability goals because it is rooted in developers’ fluctuating interpretations of HIPAA privacy rules and in the protections they maintain for the proprietary information of their IT systems.

Elizabeth Sauve, a marketing and communications executive for Privacy Analytics — a Canadian software company specializing in de-identifying patient data so it can be used for research studies — said at the HIMSS conference that many companies are already sharing data. However, she said the exchange happens through private partnerships rather than building on an interoperability network.

“A lot of companies, they may not have a statistician who would be the one [looking at the data], but they do have a lawyer. So they are using data-sharing agreements, basically confidential disclosures instead of actually making steps. There’s a lot of companies who are using [HIPAA’s] Safe Harbor, because as long as you are compliant with the regulator, who else cares?”

The questions industry was asking at the HIMSS conference largely centered on how HHS planned to make HIPAA compliance clearer so developers know what they can share.

“There’s been a lot of progress on the technical front, with common standard APIs, [Health Level 7]-type formatting for the exchange of information, but then there is the whole piece that HL7 doesn’t address on security,” said Ken Georgi, General Dynamics’ vice president of health care enterprise systems.

“Once I give you my information, what’s the standard there that we’ve agreed to? Really, it’s around security’s cousin, privacy. If you have my information, what are you allowed to do with it?”

In an effort to educate the public on what level of access HIPAA provides, ONC announced on June 2 a series of instructional videos for patients and a Patient Engagement Playbook for Providers to better explain the patient’s role in controlling their health records.

“Many people are not fully aware of their right to access their own medical records under the Health Insurance Portability and Accountability Act, including the right to access a copy when their health information is stored electronically,” said Lucia Savage, ONC’s chief privacy officer, in a statement. “The videos we released today highlight the basics for individuals to get access to their electronic health information and direct it where they wish, including to third-party applications.”

Programs like instructional videos and provider playbooks are part of ONC’s move to shift the EHR framework to a more patient-centered model, in which the public retains ownership of and access to its medical records.

“We must engage individuals in order to advance the safe and secure flow of health information,” said Tom Mason, ONC’s chief medical officer, in a statement. “The playbook we’re releasing today provides clinicians with the resources they need to get the most out of their health IT and help patients put their electronic information to work to better manage their health.”

Another move is making sure that the capabilities of the health IT systems being sold to providers are clear. On June 1, ONC began listing transparency specifications to help IT buyers access information about the “costs and limitations they may encounter when implementing and using certified health IT products.”

The strategies afford HHS the opportunity to provide the public more access to the EHR information while encouraging developers to offer more compatible services with their products. Whether those products will be readily adopted is another question.

Plowing the path to interoperability

While HHS and industry are collaborating and influencing interoperability, the next challenge is to ensure the users adopt the technology.

At ONC’s annual meeting on June 1 in Washington, D.C., it showed off data from the American Hospital Association that said 96 percent of hospitals were using certified EHR systems, the lower tier of functionality, while nearly 84 percent use the higher-tier basic EHR systems.

But as the stats noted, “Possession means that the hospital has a legal agreement with the EHR vendor, but is not equivalent to adoption.”

And while 85 percent of non-federal acute care hospitals sent EHR information to an outside party in 2015, only 65 percent received information through their IT systems. Overall, only 26 percent of non-federal acute care hospitals were collectively finding, sending, receiving and integrating information through their EHR systems.

Adoption of a new technology can take time, and HHS has new rules in final development to incentivize EHR adoption for Medicaid and Medicare.

So while the federal government encourages and the private sector innovates, the future of interoperability will be determined by the very group it has been built for: the user.

Via Federal Times »

Slavitt: Vendors Bear Burden to Deliver On Promise of Health IT Connectivity

As the Centers for Medicare and Medicaid Services moves to new payment models based on value rather than volume, a lot more will be asked of health IT and the vendors who sell it to providers, according to CMS Acting Administrator Andy Slavitt.

Slavitt told the nation’s largest physician group on June 13 that the burden needs to be on the vendors, not the end users, to deliver on the promise of health IT and its potential benefits to transform healthcare.

Speaking before the American Medical Association’s annual meeting in Chicago, he said CMS has heard the calls for “putting more pressure on technology vendors” and less on physicians. The goal, Slavitt asserted, must be to “make healthcare technology a tool” serving clinicians and patients.

“This is particularly true in the area of what many call interoperability,” observed Slavitt, who argued that interoperable health information would enable physicians to do tasks as simple as tracking referrals when a patient sees another specialist or visits a hospital—capabilities that don’t exist today, he contends.

He told the AMA audience about a conversation with a specialist in Chicago who complained that electronic health record systems simply don’t talk to each other, making it impossible to view patient records in those kinds of scenarios.

Besides relief from Meaningful Use requirements, Slavitt said that health IT interoperability is the “number one ask of many physicians.”

To help address these challenges, the CMS chief remarked that EHR vendors—and the providers that use their products—will now be required to open their application programming interfaces (APIs) “so data can move in and out of an application safely and securely.”

APIs, which enable a software program to access the services provided by another software program, are included in the final Meaningful Use Stage 3 rule requiring certified EHR technology to provide an API through which patient information can be viewed, downloaded and transmitted to a third party.

In addition, APIs are included in the 2015 Edition of Health IT Certification Criteria, which requires certified EHRs to demonstrate the ability to provide a patient-facing app access to the Common Clinical Data Set via an API.
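Certified EHRs commonly meet this criterion with FHIR-style APIs, though the rule does not mandate FHIR specifically. As a hedged sketch, here is a hypothetical patient-facing app pulling one Common Clinical Data Set element (medications); the base URL, patient ID, and access token are placeholders, and a real app would obtain the token through an OAuth 2.0 authorization flow rather than hard-coding it:

```python
import requests

FHIR_BASE = "https://ehr.example.org/fhir"  # placeholder endpoint
ACCESS_TOKEN = "patient-authorized-token"   # placeholder; see note above
PATIENT_ID = "12345"                        # placeholder patient ID

def get_medications(patient_id: str) -> list:
    """Fetch the patient's medication statements, one CCDS data element."""
    resp = requests.get(
        f"{FHIR_BASE}/MedicationStatement",
        params={"patient": patient_id},
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Accept": "application/fhir+json",
        },
    )
    resp.raise_for_status()
    return [entry["resource"] for entry in resp.json().get("entry", [])]

for med in get_medications(PATIENT_ID):
    print(med.get("medicationCodeableConcept", {}).get("text", "unnamed"))
```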

“Systems will need to adapt to your needs,” concluded Slavitt. “Long-time frustration won’t disappear right away,” but he said “it is essential that physicians not only participate in but have a leading voice in the change that is ahead.”

In response to Slavitt’s comments, Leigh Burchell, chair of the EHR Association and vice president of government affairs at Allscripts, agrees that physicians must be engaged if the healthcare industry is to achieve health IT’s potential to help transform the healthcare delivery system. However, she contends that technology alone is never going to be the answer for vexing industry issues.

“Interoperability is a critical area of focus for us all. APIs and other tools will certainly move us ahead, but no one stakeholder can resolve all the issues that stand between where we are today and where we want to be in the secure sharing of patient information across provider organizations—issues such as HIE governance, data ownership and privacy agreements, and a consistent patient identity approach across disparate systems,” says Burchell.

Morgan Reed, executive director of ACT|The App Association and acting director of the Connected Health Initiative, said he supports Slavitt’s position.

“The United States is the largest consumer of healthcare services around the globe. But, physicians are unlikely to adopt new technology if it interrupts workflow or puts a patient at risk,” says Reed. “The lack of interoperable health information systems is an impediment to innovation that ultimately harms patients. We agree with Administrator Slavitt that there should be pressure on major vendors to make systems work well together.”

Via information-management.com »

The lingering challenge of healthcare interoperability

Front-line doctors, healthcare administrators, government officials—in fact, just about everyone connected to medical care—support interoperability, saying it will improve patient care, reduce medical errors and create money-saving efficiencies.

But interoperability has yet to become a reality, despite such overwhelming support for the free flow of patient data between caregivers. In fact, healthcare isn’t even close.

“The potential promise and hope was that your [digital] record would be available wherever it’s needed by whoever needs it. But the records have not been as portable as people had hoped they would be,” Steven J. Stack, MD, president of the American Medical Association, told Medical Economics.

Many challenges—technical, financial and procedural—remain as healthcare moves toward interoperability.

One of the biggest hurdles is getting the technology to the point where it will allow the different electronic health record (EHR) systems to talk to one another—the key underpinning needed for interoperability, according to Kelly Aldrich, DNP, RN-BC, CCRN-A, chief clinical transformation officer at the Center for Medical Interoperability. The center is a non-profit organization bringing together executives from healthcare organizations and other stakeholders to accelerate the seamless exchange of health information.

The Government Accountability Office identified five obstacles in its September 2015 report Electronic Health Records: Nonfederal Efforts to Help Achieve Health Information Interoperability. They are: insufficiencies in health data standards; variation in state privacy rules; challenges in accurately matching all the right records to the right patient; the costs associated with work to reach interoperability; and the need for governance and trust among entities to facilitate sharing health information.

Stack said he thinks the federal government is also an obstacle to reaching interoperability. He explained that EHR vendors developed their software products to meet the Centers for Medicare & Medicaid Services’ (CMS) Meaningful Use certification requirements, but that did nothing to promote interoperability.

“What we have to do is restore a marketplace where those of us who are purchasing these tools have more leverage and more power to tailor the technologies,” he said.

Progress is being made on that front.

For example, the Center for Medical Interoperability is pulling together stakeholders in an effort to bring about plug-and-play interoperability.

Major EHR vendors and some 30 large healthcare providers also came together last October at the KLAS Keystone Summit and agreed to establish measurements of interoperability performance across EHR systems.

Additionally, the U.S. Department of Health and Human Services (HHS) in February announced that the major EHR vendors, the country’s five largest private healthcare systems and more than a dozen professional associations and stakeholder groups pledged to implement three core commitments to improve the flow of health information to consumers and healthcare providers. Those core commitments center on consumer access, no information blocking and national interoperability standards.

And there’s CommonWell Health Alliance, a nonprofit association of health IT companies working together to create universal access to health data nationwide. It aims to create and execute a vendor-neutral platform that allows for this data exchange.

Clinicians themselves, usually in conjunction with the healthcare systems with which they’re affiliated, are also moving forward.

“Physicians are increasingly working in large healthcare systems with relatively mature electronic health records. These systems are working with their EHR vendors to implement the nationwide interoperability roadmap as quickly as they can,” said Sam Weir, MD, a national leader in medical informatics and lead informatics physician at UNC Health Care, a position in which he ensures that medical technology supports patients in their ability to access medical care.

Weir also credited the Office of the National Coordinator for Health Information Technology, its interoperability “roadmap” published last fall, and its work toward setting technology standards between vendors for moving the dial on interoperability.

As this work moves forward, Weir said clinicians need to prepare.

“If they don’t know already, they need to find out if their EHR does meet or will meet the Healthcare Information and Management Systems Society interoperability standards. If their vendor won’t give them a straight answer, they need to keep pushing. The train is leaving the station and they need to get on,” he said.

Via Medical Economics »

Is EHR data blocking really as bad as ONC claims?

Consensus that EHR vendors and profit-hungry hospitals are intentionally making it hard for patients and others to access data is based on evidence – much of it put forth by the Office of the National Coordinator for Health IT – that is largely anecdotal.

With $32 billion spent already to achieve the meaningful exchange of healthcare and patient information, the federal government is hard at work trying to find where and how data is being blocked.

But whether data blocking is intentional or not remains a subjective question, based largely on anecdotes that deserve scrutiny.

Many experts and industry leaders say they’ve never seen any cases of data blocking — others insist it’s more complicated than that. Some complaints are the result of a lack of technological progress, lack of standards, and misconceptions.

What’s the truth about data blocking? 
That all comes down to who you ask.

“I’ve never seen information blocking by anyone — vendor or hospital,” said John Halamka, MD, chief information officer of Beth Israel Deaconess Medical Center, Harvard medical professor and co-chair of the HIT Standards Committee at ONC. “I’ve seen a lack of infrastructure, a lack of a business model and a lack of a clinical imperative to share data, but never purposeful blocking.”

That purposeful distinction is substantial. But ONC still maintains that willful data blocking is a pressing problem that has to be addressed.

“From the evidence available, it is readily apparent that some providers and developers are engaging in information blocking,” National Coordinator Karen DeSalvo, MD, explained in a post on ONC’s site. It’s a serious problem, she said, and “one that is not being effectively addressed.”

That assertion appears to be based on a congressionally mandated report issued by the Office of the National Coordinator (ONC) in April 2015, which also calls for congressional action to put a stop to the data blocking.

But ONC has said it drew conclusions about the widespread data blocking problem based on “anecdotal evidence” collected from 60 unsolicited complaints, as well as phone surveys and a review of public records, testimony, industry analyses, trade and public news media and other sources.

DeSalvo is not the only one to contend that data is being blocked for nefarious reasons.

“It is very frustrating not to be able to send information electronically to another party, to find out they can’t receive it digitally, or that it doesn’t make sense when it’s received,” said David Kibbe, MD, president and CEO of DirectTrust, a nonprofit collaborative to support interoperability.

But he doesn’t feel it’s always just an innocent aspect of misplaced priorities or misaligned technology.

“In a fee-for-service health care system, information isn’t just power, it’s money, too,” said Kibbe. “So it is natural that we’ll get information hoarding, information blocking, and information channeling as means to an end by some entities.”

Is data flow being choked for other reasons?
Mari Savickis, vice president of federal affairs for the College of Healthcare Information Management Executives, or CHIME, said data exchange is improving on a daily basis – even if it’s being exchanged in less technical ways at times, through secure email, fax and even with paper.

“We’re moving toward improving data exchange; the numbers are going up,” Savickis continued. “Are we at the point of seamless data exchange? No. Is it being blocked? No one is setting out to do that.”

Savickis cited examples that she’s heard from CHIME members that could be interpreted as data blocking, but are far from it.

They are the result of a lack of granularity in standards and of financial limitations.

The looming question: Who foots the bill for exchange?
Kerry McDermott, vice president of public policy and communications at the Center for Medical Interoperability, says all the finger-pointing about data blocking isn’t helpful.

“It’s more of a systemic issue, not limited to a specific party,” McDermott explained. “In the grand scheme of things, sharing data is new.”

In the past, it wasn’t beneficial for doctors and vendors to share data and so it follows, she added, that most of the technology in place today was not exactly engineered with interoperability in mind.

But now with the changes in value-based care reimbursement models, sharing data is going to be imperative.

Gary Dickinson, co-chair of HL7’s EHR workgroup, said ONC seems to believe that if information is not flowing, it’s being blocked. But that might not necessarily be the case.

“More likely it’s a situation where there’s no existing electronic infrastructure to facilitate direct sharing,” Dickinson said. “Infrastructure requires investment and who’s going to make that investment?”

Via Healthcare IT News »

Why Can’t Everyone Just Get Along?

While the rest of the world has adopted a ‘plug and play’ mentality, healthcare technology still doesn’t play well with others.

By Cindy Sanders

You can get money out of an ATM in Istanbul and watch a movie 35,000 feet in the air on the flight back, but you cannot electronically access your patient’s x-ray taken at the urgent care center two blocks down the street.

What would be totally unacceptable in any other industry has somehow become widely tolerated as ‘business as usual’ in healthcare … but one group is determined that’s about to change. The Center for Medical Interoperability is on a quest to bring healthcare in line with other vertical markets to improve safety, outcomes and cost efficiency.

Ed Cantwell, executive director for the Center for Medical Interoperability, said the 501(c)(3) organization came about as a result of the philanthropic work and strategic vision of the Gary and Mary West Foundation and West Health Institute. Looking at what drives costs in healthcare and contributes to less-than-stellar outcomes within the industry, Cantwell and colleagues were given the task of identifying the elephant in the room.

Zeroing in on the technological disconnect from medical devices to electronic health records, Cantwell noted the team was asked to bring a fresh perspective from outside of healthcare to the problem at hand “instead of conceding defeat from a legacy attitude.”

Guiding Motivation

From the start, Cantwell said the Center has had five guiding motivators to address and resolve:

  1. High cost: “Technology is in the way instead of in the background,” he said of a lack of efficiency driving costs.
  2. Preventable deaths: “We lose nearly two 747s a day with about 400 people each,” Cantwell pointed out. “If an airline lost two planes, or 800 people a day, would the public tolerate it?”
  3. Caregiver burnout: A former pilot, Cantwell said the difference between when he flew regularly 15 years ago and today is unbelievable. Technological advancements guide decision-making and have vastly improved safety. “The systems are interconnected; they wrap the pilot in knowledge,” he pointed out. However, the same cannot be said in healthcare where there has been little effective change in the underlying technology infrastructure over the same time period. Cantwell noted in the absence of data interoperability, providers make calls without all the available information at hand. The current process of attempting to integrate data is cumbersome and exhausting.
  4. Precision medicine: “If you don’t have true data interoperability, how can you realize the benefits of personalized medicine?” he questioned of applying data to individuals and the broader population health mission.
  5. Innovation in an application-based economy: “Between Apple and Google, the app-driven economy is fundamentally changing the way you live and is starting to penetrate your wellness,” Cantwell noted. “In general, healthcare has had very little IT innovation, and it’s because access to data is so proprietary and so hard to hook into that it doesn’t attract investors. There’s not an underlying uniform infrastructure so innovators are shunning healthcare because it’s just so hard.”

Creating a Structure for Success

Cantwell said in every other vertical market – including the highly competitive cable, phone, financial, and airline industries – data has been made available to drive advancement. “That interoperability and data exchange allows for a level of wisdom that drives productivity and outcomes,” he pointed out.

Not only is communication difficult among healthcare entities across the continuum of care, but it is often hard to share data even within a single practice or health system. With so much of the equipment being proprietary, one device or piece of technology can’t ‘talk’ to another without the purchase of middleware. “Why do I need to pay for an interpreter? Why can’t you just speak the same language?” Cantwell questioned. “In many ways, the hospitals and health systems are just being held hostage.”

To change that, the core staff of the Center has spent the last few years studying other vertical markets. “There was one common denominator,” Cantwell said. “Each has the benefit of what we call a centralized lab.”

He noted these successful industries have created a non-profit made up of leading companies within their sector and have challenged the CEOs to support the non-profit by serving on the board and bringing in their technical staff to come together and agree on a fundamental architecture that is both vendor and member neutral.

Taking a page from these industries, Cantwell noted, “In mid-2012 we started building a lab for healthcare with a focus being on the seamless exchange of information.” Incubated in California, the Center is in the process of moving to its permanent home in ONEC1TY in Nashville.

The impressive members of the Center’s board represent nearly one of every eight dollars spent in healthcare. In addition to the founding chairman, Michael Johns, MD, the board includes the top executive from a host of academic, for-profit, and not-for-profit companies and organizations including HCA, Robert Wood Johnson Health System, Cedars-Sinai Health System, LifePoint Health, Community Health Systems, Ascension Health, Scripps Health, and Vanderbilt University, University of North Carolina and Johns Hopkins Schools of Medicine, among others.

“We’re using the industry leaders and their procurement power and technical advisors and a dedicated R&D organization that will work within the ecosystem to develop a data interoperability platform and make it available free to the ecosystem and then become the test and certification body for it,” Cantwell said of the Center.

Moving Forward

The Center is focused on five core platform requirements to achieve interoperability:

  • Plug-and-play so that when two independent pieces are connected, they self-configure how to talk to each other with minimal or no human intervention (see the sketch after this list).
  • One-to-many communication where once a device or system is certified as being conformant with reference specifications or set of standards, it can be used with similarly certified devices without additional testing.
  • Two-way data exchange enabling data to flow in both directions for feedback loops and automation.
  • Standards-based options that use open, as opposed to proprietary, solutions in reference architectures, interface specifications and testing.
  • Trust so that users have the confidence that interoperable systems will be safe, secure and reliable.
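What plug-and-play self-configuration implies, mechanically, is an automated capability negotiation when two devices first connect. The following is a purely illustrative sketch; the Center has not published a specification at this level of detail, and every name and data structure here is invented:

```python
# Purely illustrative: invented data structures, not a Center specification.
SUPPORTED_VERSIONS = {"1.0", "1.1", "2.0"}
LOCAL_CAPABILITIES = {"vitals-stream", "alarm-events", "time-sync"}

def negotiate(peer_announcement: dict) -> dict:
    """Agree on a protocol version and capabilities with no human input."""
    common = SUPPORTED_VERSIONS & set(peer_announcement["versions"])
    if not common:
        raise RuntimeError("no common protocol version; cannot self-configure")
    return {
        "version": max(common),  # highest mutually supported version
        "capabilities": sorted(
            LOCAL_CAPABILITIES & set(peer_announcement["capabilities"])),
    }

# A bedside monitor announcing itself to an EHR gateway:
announcement = {"versions": ["1.1", "2.0"],
                "capabilities": ["vitals-stream", "time-sync"]}
print(negotiate(announcement))
# {'version': '2.0', 'capabilities': ['time-sync', 'vitals-stream']}
```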

Calling the work to be done both “a moonshot and a marathon,” Cantwell said the Center has a very aggressive goal to have the basic components in place within two years. After that, he said the function of the Center would be to build strong governance that encourages continuing innovation. “You’re not going to be able to keep healthcare out of an app-based economy forever,” he said. “Once it tips, you’re not going to be able to stop it.”

Improving interoperability holds great promise both in terms of patient outcomes and increased cost efficiency. However, Cantwell noted, success has even broader implications for the overall health and wellbeing of the country. “With healthcare (spending) at nearly 25 percent of the GDP, if you take even 10 points out of that, you could almost fund all the other social issues that are threatened,” he pointed out.

Cantwell concluded, “I think as a nation, it’s time. It’s time as consumers, we demand more from our healthcare system.”

Via MedicalNewsInc »

Is ONC putting too much hope on the cloud?

The Office of the National Coordinator for Health Information Technology (ONC) is pinning its latest hopes for interoperability on the cloud, namely with application programming interfaces, or APIs. But is this hope realistic?

Experts feel it will be a close call to get the API standards ready in time for requirements found in the April 27 release of the Medicare Access and CHIP Reauthorization Act of 2015 (MACRA) proposed rule. The rule makes it a national objective to achieve health data interoperability by December 31, 2018.

A measure in the MACRA proposal, “View, Download, Transmit (VDT),” calls for eligible clinicians to help patients:

  • view, download or transmit their health information to a third party;
  • access their health information through the use of an API (a sketch follows this list); or
  • a combination of both.
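As one illustration of the API pathway, FHIR defines a $everything operation that returns a Bundle of the patient’s record, which a patient-facing tool could save locally (the “download” in VDT) or forward to a third party. Server support for the operation varies, and the endpoint and ID below are hypothetical:

```python
import json
import requests

FHIR_BASE = "https://ehr.example.org/fhir"  # hypothetical endpoint
PATIENT_ID = "12345"                        # hypothetical patient ID

# $everything returns a Bundle containing the patient's record;
# server support varies, so treat this as a sketch.
resp = requests.get(f"{FHIR_BASE}/Patient/{PATIENT_ID}/$everything",
                    headers={"Accept": "application/fhir+json"})
resp.raise_for_status()
bundle = resp.json()

# "Download": persist the record so the patient can keep it or transmit it.
with open(f"patient-{PATIENT_ID}-record.json", "w") as outfile:
    json.dump(bundle, outfile, indent=2)

print(f"Saved {len(bundle.get('entry', []))} resources")
```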

In an April 27 blog, Andy Slavitt, acting administrator for the Centers for Medicare & Medicaid Services (CMS), gave an indication of ONC’s hopes for APIs, saying they will “open up the physician desktop to allow apps, analytic tools, and medical devices to plug and play.”

Setting apps on ‘fire’

There’s one problem with that. The APIs aren’t ready yet. HL7 (Health Level Seven International), a group that sets standards, formats and definitions for exchanging and developing electronic health records (EHRs), is working on standards for healthcare APIs called Fast Healthcare Interoperability Resources (FHIR). It’s pronounced “fire,” and there are plenty of references to FHIR setting the whole health app world on fire.

Gary Dickinson, co-chair of HL7’s EHR workgroup, believes that ONC may be putting too much pressure on HL7 to get the APIs completed by 2018. He thinks FHIR will “probably” be ready in time for what is required in the MACRA proposal, but not for broader use. Some companies in the industry may go ahead and use the FHIR standards prior to HL7’s vetting them, he said.

Developing FHIR standards for APIs is not an easy thing to rush. FHIR divides data into “bite-sized chunks” for transmission. These are called resources. FHIR has more than 100 different resources identified. Each of these must be vetted, involving more than 50 workgroups at HL7. Developers rate each resource with a maturity level of zero through five. Most resources today are rated zero or one, Dickinson says.
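To make “resources” concrete, here is roughly what one bite-sized chunk looks like on the wire, shown as the JSON structure a server might return for a single AllergyIntolerance. Field names follow later FHIR versions (earlier versions name some fields differently), and the values are invented:

```python
# A simplified AllergyIntolerance resource as a server might return it.
# Field names follow the FHIR specification; values are invented, and the
# terminology code is a placeholder, not a real code.
allergy_resource = {
    "resourceType": "AllergyIntolerance",
    "id": "example-1",
    "patient": {"reference": "Patient/12345"},
    "code": {
        "coding": [{
            "system": "http://snomed.info/sct",
            "code": "000000",  # placeholder
            "display": "Penicillin",
        }],
        "text": "Penicillin",
    },
    "criticality": "high",
}
```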

HL7 is shooting to have the third in a series of trial versions of FHIR ready by Dec. 31, but that is likely “pushing it,” he says. The timeframe largely depends on the number and extent of updates required from comments received in the upcoming August FHIR ballot.

John Halamka, MD, chief information officer of the Beth Israel Deaconess Medical Center, Harvard medical professor and co-chair of the HIT Standards Committee, says FHIR is key to the future of query/response interoperability, which allows users to pull the data from wherever it resides.

“We’re at a time in history when the private sector–driven by customer demand–is implementing novel solutions to healthcare information exchange,” Halamka says. “FHIR is already in production in many applications and every major HIT vendor will have FHIR application program interfaces in production this year.”

Is ONC pushing too hard?

Justin Barnes, health IT industry advisor and thought leader, doesn’t think ONC is necessarily trying to force the use of APIs in the new MACRA proposal. Barnes has been an advisor to the White House and has worked with Congress and regulatory agencies on health IT issues for more than a decade. His interpretation is that ONC officials are trying to mandate flexibility, usability and interoperability. “I don’t feel they’re pushing for regulatory granularity,” he says. “They want to allow creativity.”

Robert Tennant, director of health information technology policy at the Medical Group Management Association, agrees that ONC is not forcing the use of APIs in its MACRA proposal, but “it’s clearly in there.” For Tennant, it all comes down to one thing–“trying to find a balance between what patients want and what doctors can handle.”

“The government would say that interoperability is the seamless flow of information,” Tennant says. “But the question is, does every record need to be interoperable? Probably not.”

David Kibbe, MD, is president and CEO of DirectTrust, a collaborative non-profit association of 145 health IT and healthcare provider organizations in support of secure, interoperable health information exchange via the Direct message protocols. Kibbe is also a senior advisor to the American Academy of Family Physicians. He says with the use of FHIR, APIs will help patients get a fuller picture of their own health information, because the apps will help them access it and see it in new ways. But some aspects of developing FHIR are going to be difficult, especially with cross-organizational use cases.

Of the MACRA proposed API requirements, Kibbe says, “it will be an enormous challenge for both providers and vendors to meet the new objectives and measures within the current time frames, with all the other additional objectives and measures required.”

Bipartisan optimism

At a May 11 House Ways and Means Committee hearing on MACRA, Slavitt said the proposal is just a starting place for the discussion. “It will take work and broad participation to get it right.”

MACRA was a bipartisan effort, and true to those roots, optimism about the proposal was also bipartisan at the hearing. Rep. Ron Kind (D-Wisc.) said MACRA is “all about finding ways to care for patients.”

Rep. Peter Roskam (R-Ill.) said, “We’re on the verge of good things.”

Via HealthDataManagement.com »

What holds healthcare back from the cloud

Web designer Chris Watterston put it best when he created a sticker that went viral: “There is no cloud, it’s just someone else’s computer.”

It’s that very issue that makes the cloud both appealing and unappealing to healthcare providers. It’s appealing because it provides the scalable, usable storage for the expanded needs of today’s healthcare market, including the storage of large genomic files and digital imagery. Few providers can store this kind of data in-house – and so, they use the cloud.

But the fear of the cloud being “out there” leaves the sensation that data is vulnerable, and keeps some healthcare providers away.

Ed Cantwell, executive director of the Center for Medical Interoperability, says people get tripped up with who accesses the cloud, and how. “They think, if it’s in the cloud, it’s a free-for-all. But that’s not the case at all,” he says. “I’m not so sure if a hacker cares if you are in the cloud or locked in a vault. If you’re in the cloud, you’re still located somewhere physically.”

Security is definitely a theme behind cloud concerns. James Custer, director of product development at Yorktel, says when it comes to the cloud, fears about HIPAA compliance are front and center.

“There is always this huge hesitation when the cloud is discussed,” Custer says, which is why the paperwork and sign-off for using the cloud can sometimes take a healthcare organization up to a year. But despite the difficulties, the cloud has been a real boon for smaller hospital systems that can’t afford their own infrastructure. “The cloud has been huge for them,” he says.

Ray Derochers, executive vice president of HealthEdge, a cloud host company that serves mainly health insurance companies, says despite any initial hesitancy, most large insurance companies are moving to the cloud.

Beyond security issues, there is also the need to decide what information to move to the cloud. Because of the confidentiality and complexities of the insurance business, there is no way all the data is going to the cloud, Derochers says. Because of this, “people are afraid to take a bold position. They don’t comprehend all the moving parts.”

Tips for managing cloud technical and security issues

David Furnus, the site information security director at Stanford Health Care – ValleyCare, says, “The cloud isn’t impervious to attack; that’s a given.” But knowing that can help ensure protection.

Furnus suggests engineering resilience into systems and applications. This means “to expect we will be breached and to be prepared to detect, contain, and correct the breach as quickly and as effectively as we are able.”

The security of data transmission to and from the cloud is “a non-issue,” Furnus says, if the cryptographic controls being used meet or exceed the requirements of Federal Information Processing Standard Publication 140-2 (FIPS PUB 140-2), a federal government computer security standard used to accredit cryptographic modules.
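As a concrete illustration of the transport side of this, here is a minimal sketch of an application refusing protocol versions older than TLS 1.2 when talking to a cloud endpoint. The endpoint URL is hypothetical, and note that FIPS 140-2 compliance itself additionally requires a validated cryptographic module (for example, a FIPS-mode build of OpenSSL) beneath the TLS stack.

```python
# Sketch: enforcing strong transport encryption for data sent to a cloud host.
# FIPS 140-2 compliance also requires a validated cryptographic module beneath
# the TLS stack; this only shows the application-side protocol floor.
import ssl
import urllib.request

context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2  # reject TLS 1.1 and older

# Hypothetical cloud endpoint, for illustration only.
with urllib.request.urlopen("https://cloud.example.com/api/health",
                            context=context) as resp:
    print(resp.status)
```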

According to Furnus, providers should only consider using the cloud if the cloud host, at a minimum, complies with the Federal Risk and Authorization Management Program (FedRAMP), a government-wide program that provides a standardized approach to security assessment, authorization, and continuous monitoring for cloud products and services. In addition, the cloud provider should “be subject to the successful negotiation of other client-specific security requirements.”

Lee Kim, director of privacy and security at the Healthcare Information and Management Systems Society (HIMSS), North America, says there are a number of things to look for when selecting a cloud provider.

First, make sure the cloud host will offer on-demand access to the data, with few interruptions; this is critical in healthcare. Does the host have a good track record of making data available during business hours? Cloud hosts schedule downtime for maintenance, but do they also have frequent unscheduled downtime when physicians might need patient records? Ask colleagues who have used the cloud provider. “Don’t believe what marketing people say on the website; it’s so much more than that,” says Kim, who advises getting any promises or assurances about medical record hosting in writing. If it’s not in writing, chances are it’s not part of the agreement.

Get a copy of the cloud host’s last risk assessment to see how well they are doing with security, Kim advises, and check what controls they are using. A good rule of thumb when it comes to cloud security: “sometimes you get what you pay for.”

Be wary of small start-up cloud services, she adds. Will they be around in a year? Many venture capital firms own cloud companies temporarily, planning to sell. With a large cloud provider that has been in business for 10 years or more, there’s a little more assurance it will be in business a while, Kim says.

Check out the company’s customer service capability; sometimes it’s limited. “In this day and age, it can make a world of difference what the customer service is like. If the company isn’t responsive and keeps kicking the can down the road, that’s not good, especially when it comes to caring for patients,” she says. “Physicians can’t fight with the technology and take care of patients at the same time.”

In terms of managing risk after you have legally bound yourself to a cloud company, Kim says to make sure someone in the organization serves as a liaison and keeps up with the vendor.

Via HealthDataManagement.com »

Power of the cloud spurs big push to boost interoperability

The cloud holds great potential for interoperable health data exchange, health IT experts believe, perhaps even raising hopes for international standardization to support the Internet of Things, and it may be leveraged for data-intensive exchange initiatives.

Precision medicine based on genomics, with its huge amounts of complex digital information, will tie masses of information to a single electronic health record, says Lee Kim, director of privacy and security for the Healthcare Information and Management Systems Society (HIMSS) North America. Test results, summaries of imaging studies and genomic data will make relying on in-house clinical servers impossible, Kim says. “It’s way too much data. I could definitely see the cloud playing a future role.”

“I’m not sure that cloud technologies by themselves necessarily enhance interoperability,” says John Halamka, MD, chief information officer of the Beth Israel Deaconess Medical Center and a professor of medicine at Harvard Medical School, who has co-chaired federal workgroups on the standards needed for interoperability. “However, there are cloud-based services that could reduce the burden of interoperability implementation.”


Still, the healthcare industry has some catching up to do when it comes to using the cloud, says Ed Cantwell, executive director of the Center for Medical Interoperability (CMI), a nonprofit with mainly health system members that is striving to accelerate the seamless exchange of information. Sectors like the financial industry have relied on the flow of data to survive, but something has blocked the healthcare industry from following suit, Cantwell says.

“You can walk into any hospital in this country and systems don’t talk to each other. They don’t work in a plug-and-play manner,” says Kerry McDermott, vice president of public policy and communications at CMI. “Health systems want change,” she says. “They are living and breathing the problem every day.”

CMI is currently working to select 100 engineers to participate in developing a blueprint for interoperability, which will include cloud and non-cloud solutions. The blueprint will be used to certify healthcare products as capable of working in the cloud. Under consideration for the project are “some of the biggest players” from other industries, Cantwell says.

CMI’s membership represents $100 billion in procurement power, and it is this, plus the opportunity to expand into the healthcare sector, that has drawn interest. When the selected engineers are revealed in a few weeks, they will work in CMI’s centralized lab to tackle interoperability. “It’s a game changer” to have the providers, who hold the purchasing power, driving the change, McDermott says. CMI aims to include all devices across the continuum of care in the blueprint. A pilot of the blueprint will be ready before the end of the year, she says.

In Massachusetts, Halamka says, some of these services include a cloud-based provider directory for fast, reliable lookup of any provider’s Direct address. The state also has a cloud-based master patient index, consent registry, and record locator service to support patient identification and the assembly of records from multiple sources, along with a cloud-based quality registry for aggregating quality data.
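For illustration, here is a minimal sketch of what a lookup against such a cloud-based provider directory might look like; the directory URL, query parameters, and response fields are all assumptions, since the actual Massachusetts service defines its own interface.

```python
# Sketch: querying a cloud-based provider directory for a clinician's Direct
# address. The URL, parameters, and response fields are hypothetical.
import requests

DIRECTORY_URL = "https://directory.example.org/providers"  # hypothetical

resp = requests.get(DIRECTORY_URL,
                    params={"last_name": "Smith", "state": "MA"})
resp.raise_for_status()

for provider in resp.json():
    print(provider["name"], "->", provider["direct_address"])
```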

Interoperability issues have been exacerbated, not lessened, by the adoption of electronic health records. To advance healthcare through the use of data, the federal government sees the need to boost interoperability; for example, rules to enact the Medicare Access and CHIP Reauthorization Act of 2015 (MACRA) emphasize achieving interoperability through the use of application programming interfaces (APIs).

In January, Centers for Medicare and Medicaid Services Acting Administrator Andy Slavitt gave a hint at the government’s direction when he said the agency will promote innovation “by unlocking electronic health information through open APIs – technology tools that underpin many consumer applications.”

“Interoperability will be a top priority, through the implementation of federally recognized, national interoperability standards and a focus on real-world uses of technology,” he said of CMS’s plans to interpret MACRA.

Even as interoperability remains up for grabs, it’s clear that the cloud will provide a great deal of help toward other healthcare goals, such as simplifying currently complex activities like finding participants for clinical trials and holding the data needed to conduct them.

“The cloud is extremely important for clinical trials,” says Luca Emili, founder and CEO of Promeditec, an Italy-based company that develops technology to accelerate clinical trials. About 10 years ago, the quantity of data collected per patient was quite low. Now, with the addition of wearable devices, digital images and genomic data, hospitals need a new strategy for collecting this data, he says.

Promeditec recently chose Verizon Cloud to support the delivery of its AppClinical Trial solution, a software-as-a-service (SaaS) offering built on a collaborative platform that can be scaled to set up trials and to capture and manage trial data. Using the cloud with this platform has helped cut the expense and time of clinical trials, which can cost as much as $3 billion over the 10 to 15 years required to conduct a trial.

A patient’s genomic information often runs to 300 gigabytes of data, and hospitals that want to participate in clinical trials in the future will need the cloud because of the sheer volume of data that large trials can involve. The cloud also enables the use of data gathered worldwide, and hospitals can no longer store this quantity of data in-house, Emili says.
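To put that figure in perspective, here is a quick back-of-the-envelope calculation using the 300-gigabyte-per-patient estimate above; the trial size is an assumption chosen purely for illustration.

```python
# Back-of-the-envelope storage estimate for a genomics-heavy clinical trial,
# using the article's 300 GB-per-patient figure. Trial size is hypothetical.
GB_PER_PATIENT = 300
PATIENTS = 1_000  # hypothetical mid-sized trial

total_gb = GB_PER_PATIENT * PATIENTS
print(f"{total_gb:,} GB = {total_gb / 1_000:,.0f} TB")  # 300,000 GB = 300 TB
```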

Jim Hollingshead, president of San Diego-based ResMed, a provider of connected healthcare solutions for remote monitoring of patients with respiratory illnesses, has found a way to use the cloud to save money and increase the flow of data. ResMed’s clients, mainly home medical equipment (HME) providers, are required by Medicare to show proof that patients are using VPAP and CPAP breathing devices. Previously, removable data cards were used for this purpose, but ResMed replaced them with cellular chips that send data straight to the cloud. Now an HME can go online and verify the usage, which greatly reduces the labor required of the HMEs and saves them money.
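Here is a minimal sketch of what that online verification might look like from the HME’s side; the endpoint, field names, and the four-hour nightly threshold are illustrative assumptions, not ResMed’s actual API.

```python
# Sketch: an HME verifying device usage via a cloud API instead of collecting
# removable data cards. Endpoint, fields, and threshold are hypothetical.
import requests

API = "https://device-cloud.example.com/v1"  # hypothetical endpoint

# Pull the last 30 nights of usage sessions for one patient's CPAP device.
sessions = requests.get(f"{API}/patients/987/usage",
                        params={"days": 30}).json()

# Count nights with at least 4 hours of use, a common adherence threshold.
compliant_nights = sum(1 for s in sessions if s["hours_used"] >= 4.0)
print(f"{compliant_nights}/30 compliant nights")
```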

A completely unexpected benefit of going to the cloud and online was the ability to identify patients who need intervention to stay compliant. Adherence levels jumped, as did consumer interest. Some 900 patients visit the site per day to check their usage data, and several hundred thousand patients look at their data persistently. “We were shocked” that patients latched onto this, he said. There was unquestionably an underlying need.

The software platform has an API that enables hospitals to connect with the HMEs through the cloud, making the EHRs interoperable, which is especially important to providers in networks and ACOs. “We see the world going to an integrated place,” Hollingshead says.

In 2014, the company launched its SaaS API, and it was quickly adopted. Berg Insight confirmed that, in a 16-month span, ResMed had become the world’s leader in remote patient monitoring, with more than 1 million connected devices. “The cloud is the next wave of creating value,” Hollingshead says.

The cloud fits many of healthcare’s needs to exchange information and will play a vital role, says CMI’s Cantwell. “It’s not an issue of whether or not healthcare is going to adopt the cloud; healthcare has already started to.”

But the speed of adoption depends on open data. “Once data becomes democratized, almost the only way you can allow that data to really be a source of true innovation is to open it up to more cloud-based services,” Cantwell says.

Still, “it’s only a matter of time,” says Kim of HIMSS. Some hospitals are becoming more geographically dispersed, and they’ll need data on demand. The cloud is scalable and can provide not only computing power and storage space but also convenience, she says.

Via HealthDataManagement.com »