The tech doesn’t talk: Nashville needs to fix it

Two hundred thousand people die every year because of a failure to communicate.

A trillion dollars is wasted every year.

That’s how health care experts set the stage when they talk about one of the biggest concerns facing the industry today: a lack of interoperability.

Yes, it’s a big word. And it’s one that means something different to nearly everyone you ask. But interoperability — the ability for medical records and technologies to seamlessly share data in a way that improves patient care — is a problem Nashville’s health care leaders must get serious about solving.

So the most powerful names in Nashville health care are joining forces to do just that. A nonprofit center ramping up in Nashville has brought together a group of competitors — including Middle Tennessee giants HCA Holdings Inc., Community Health Systems Inc. and LifePoint Health — in an effort to set standards for health care technology.

With the help of government dollars and innovative enthusiasm, entrepreneurs and established companies have spent the past decade developing countless technologies meant to improve care. But without the buy-in of hospital companies, the industry’s been left with hundreds of innovators casting about in the dark, each cultivating their own proprietary corner of the market.

That’s where the Center for Medical Interoperability steps in. Launched with $10 million from two philanthropists in San Diego, the group has front-loaded its board of directors with a slate of names that have the power to bring significant change to health care — and keep Middle Tennessee’s leading industry in the driver’s seat.

Building a powerhouse

Launched by the foundation for Gary and Mary West, two successful technology entrepreneurs, the Center for Medical Interoperability is aimed at creating a centralized lab where the health care industry can solve problems.

The center announced its board members last year. Among them: HCA’s Milton Johnson, CHS’ Wayne Smith and LifePoint’s Bill Carpenter, the CEOs of three of the nation’s five largest publicly traded hospital companies. They sit alongside other national figures (including the heads of Johns Hopkins Medicine and the Robert Wood Johnson Health System) and well-known Nashville leaders like Vanderbilt University Medical Center CEO Dr. Jeff Balser and Dr. Mike Schatzlein, market leader of Indiana and Tennessee ministries for the nation’s largest Catholic hospital system, Ascension, represented locally by Saint Thomas Health.

That who’s who of health care is key to making technology communicate better, said Ed Cantwell, the center’s executive director.

“The reason I came to Nashville is we wanted the largest for-profit and the largest nonprofit to be the anchor,” said Cantwell, who moved the center here from San Diego. “I came here to recruit Ascension and HCA. Little did I know that I would get CHS and LifePoint … and Vanderbilt. So right now those five are really a phenomenal force because they not only represent the diversity of health care, between the five there [is more than] $100 billion a year in revenue.”

That translates to purchasing power that can force vendors to meet standards set by the center’s board, Cantwell argues. The makers of health care technology products could have incentive to prevent their tools from playing well with others, as it might force a customer who’s on their system to buy only their products. But if the nation’s largest hospital companies won’t buy products or systems that don’t work with a broad array of other tools, that incentive goes away.

At a recent Nashville Health Care Council event, Schatzlein said this shift “needs to be driven by the providers.”

If the industry can “come together and certify against standards,” he continued, “we are in a position to implement … and enforce [those standards].”

The center is set to open its new Nashville facility and lab in the OneCity development by the end of this year. Along with the center’s board members, any type of health care business can join the center, which is supported by member dues. The center will not only work to set standards that health-tech products must meet, but it also will house a testing laboratory — designed to employ up to 120 engineers — where those products can be tested and certified for how well they meet the criteria.

“I don’t think anybody that was developing an [electronic medical record] decided to not make their system interoperable. That’s typically not how engineers think,” said Jeff Cunningham, chief technology officer for Nashville-based health tech firm Informatics Corporation of America. “But they were designed for a specific purpose.”

Why the tech doesn’t talk

There are reasons these technologies don’t work together now. First, it’s hard to do.

In the documentary “No Matter Where” — directed by Kevin Johnson, chairman of biomedical informatics at Vanderbilt University Medical Center — a West Tennessee doctor who’d seen care transformed by the effective exchange of information gave up on it a few years later, after the hospital switched to a software vendor he didn’t like.

“I’ve tried to use it, my colleagues have tried to use it, and it’s just not user-friendly,” David Wilcox, a doctor with Memphis’ St. Francis Hospital featured in the film, said of the hospital’s software for sharing information.

But the issues aren’t just about user-friendliness and design. The business incentives also aren’t always there.

For tech vendors and health care providers, there are advantages to keeping your customer — be it the hospital or a patient — tied to your system. Just as a vendor may want to keep a hospital tied to its products, a hospital system may want to keep patients attached to its internal systems so they don’t jump to another provider.

“True interoperability is not good for providers,” Zane Burke, president of health IT giant (and dominant electronic medical record provider) Cerner, said at the recent health care council panel on the issue. “There’s business model challenges in there.”

But, proponents argue, there also are advantages to data sharing, which extend beyond the patient safety rationale so often cited. Dr. Lynn Simon, chief medical officer at Community Health Systems, said increasing interoperability would save CHS money and increase its efficiency in rolling out new technologies.

“Certainly with a lot of time and effort and resources and dollars, anything can be connected together,” Simon said.

But making those systems work together without spending all those resources and dollars would be better for both hospitals and the vendors trying to sell to them, Simon argued.

“If we can’t easily integrate them into our environment, it limits their ability to sell their product, to test their product, to expand their product,” she said.

Without effective communication, providers are “in a bit of stalemate,” said Dr. Jonathan Perlin, chief medical officer and president of clinical services at HCA, because they don’t want to “bet on the wrong horse” and pick a technology that won’t work with their other products or in the industry’s future.

“It’s fundamentally a missed business opportunity,” Perlin continued, comparing the lack of tech communication in health care to the ease of it in other industries — such as the fact that your Sprint cell phone can call your friend’s phone, even if they have AT&T.

“Fundamentally this is a care-quality [and] a safety issue,” Perlin said. “It’s also got the business case on its side, as it’s a financial opportunity for providers, payers, patients and technology vendors in terms of a more efficient health care ecosystem and certainly one that’s far more informed.”


Government’s role: Health care’s efforts still need power of government behind new tech rules

All sorts of other industries come up when health care leaders talk about technologies working effectively together. You can take money out of any bank’s ATM, regardless of where you bank. Your iPhone can send a text message to my Android. Trains from any company can roll down the same tracks.

Those precedents have some in the health care industry convinced that when it comes to making health care technology work together, standards must be set — and enforced — by the government.

“It’s pretty clear that private industry has limited incentives to do the kind of work that gets a national standard up and running quickly,” said Kevin Johnson, chairman of biomedical informatics at Vanderbilt University Medical Center. “That being said,” Johnson continued, Nashville’s new “Center for Medical Interoperability contains the right people … and they’re in the right city at the right time.”

Dr. Jonathan Perlin, chief medical officer at HCA Holdings Inc., has been involved in public-sector efforts to effectively share information and data, serving on the U.S. Department of Health and Human Services Health IT Standards Committee, among similar efforts. He contends solutions will be required from both government and industry leaders.

“We’ve not gotten there entirely by private-sector [efforts], despite shared interest,” Perlin said. But with “increasing frustration and recognition of opportunity,” private-sector operations like the Nashville interoperability center may find success, he continued. “It’s quite an amazing group.”


What’s interoperability? That’s a big word. Here’s what it means to Nashville group

Ed Cantwell’s background isn’t in health care. But the mechanical engineer’s resume — which includes flying jets in the Air Force, working for Texas Instruments and running a wireless company that served health care companies — was enough to lead him to the director’s seat of the Center for Medical Interoperability, a group aiming to make health care technology communicate better.

Cantwell has five criteria he uses to define true interoperability:

1. Plug-and-play (no cost to switching platforms)

2. Two-way (able to both send and receive)

3. One-to-many (the addition of a new device doesn’t throw off what’s already hooked up)

4. Standards-based (not proprietary)

5. Truly trusted (guaranteeing safety, privacy and security)

“What happens between [electronic medical record vendors] is data-sharing. … It’s as if I allowed my GPS location of my Ford to go into a database to be accessible to others,” Cantwell said. “True interoperability is the 13 sensors in my truck connected to the brakes, connected to a computer, that senses a collision and applies the brakes, even though I might miss it.”
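To make that distinction concrete, here is a minimal Python sketch of the closed loop Cantwell describes, in which a sensor reading triggers an actuator directly instead of merely landing in a database for later review. The class names and threshold are invented for illustration, not a description of any real system.

```python
class BrakeActuator:
    def apply(self, force: float) -> None:
        print(f"Applying brakes at {force:.0%} force")


class CollisionController:
    """Fuses sensor readings and commands the brakes when a crash is imminent."""

    def __init__(self, actuator: BrakeActuator, threshold_m: float = 5.0):
        self.actuator = actuator
        self.threshold_m = threshold_m

    def on_reading(self, sensor_id: str, distance_m: float) -> None:
        # Machine-to-machine: the reading itself triggers the action,
        # with no human needed to notice the danger in time.
        if distance_m < self.threshold_m:
            self.actuator.apply(force=1.0)


controller = CollisionController(BrakeActuator())
controller.on_reading("front_radar", distance_m=3.2)  # brakes applied automatically
```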

Via Nashville Business Journal »

The lingering challenge of healthcare interoperability

Front-line doctors, healthcare administrators, government officials — in fact, just about everyone connected to medical care — support interoperability, saying it will improve patient care, reduce medical errors and create money-saving efficiencies.

But interoperability has yet to become a reality, despite such overwhelming support for the free flow of patient data between caregivers. In fact, healthcare isn’t even close.

“The potential promise and hope was that your [digital] record would be available wherever it’s needed by whoever needs it. But the records have not been as portable as people had hoped they would be,” Steven J. Stack, MD, president of the American Medical Association, told Medical Economics.

Many challenges—technical, financial and procedural—remain as healthcare moves toward interoperability.

One of the biggest hurdles is getting the technology to the point where it will allow the different electronic health record (EHR) systems to talk to one another—the key underpinning needed for interoperability, according to Kelly Aldrich, DNP, RN-BC, CCRN-A, chief clinical transformation officer at the Center for Medical Interoperability. The center is a non-profit organization bringing together executives from healthcare organizations and other stakeholders to accelerate the seamless exchange of health information.

The Government Accountability Office identified five obstacles in its September 2015 report Electronic Health Records: Nonfederal Efforts to Help Achieve Health Information Interoperability. They are: insufficiencies in health data standards; variation in state privacy rules; challenges in accurately matching all the right records to the right patient; the costs associated with work to reach interoperability; and the need for governance and trust among entities to facilitate sharing health information.

Stack said he thinks the federal government is also an obstacle to reaching interoperability. He explained that EHR vendors developed their software products to meet the Centers for Medicare & Medicaid Services’ (CMS) Meaningful Use certification requirements, but those requirements did little to promote interoperability.

“What we have to do is restore a marketplace where those of us who are purchasing these tools have more leverage and more power to tailor the technologies,” he said.

Progress is being made on that front.

For example, the Center for Medical Interoperability is pulling together stakeholders in an effort to bring about plug-and-play interoperability.

Major EHR vendors and some 30 large healthcare providers also came together last October at the KLAS Keystone Summit and agreed to establish measurements of interoperability performance across EHR systems.

Additionally, the U.S. Department of Health and Human Services (HHS) in February announced that the major EHR vendors, the country’s five largest private healthcare systems and more than a dozen professional associations and stakeholder groups pledged to implement three core commitments to improve the flow of health information to consumers and healthcare providers. Those core commitments center on consumer access, no information blocking and national interoperability standards. And there’s CommonWell Health Alliance, a nonprofit association of health IT companies working together to create universal access to health data nationwide. It aims to create and execute a vendor-neutral platform that allows for this data exchange.

Clinicians themselves, usually in conjunction with the healthcare systems with which they’re affiliated, are also moving forward.

“Physicians are increasingly working in large healthcare systems with relatively mature electronic health records. These systems are working with their EHR vendors to implement the nationwide interoperability roadmap as quickly as they can,” said Sam Weir, MD, a national leader in medical informatics and lead informatics physician at UNC Health Care, a position in which he ensures that medical technology supports patients in their ability to access medical care.

Weir also credited the Office of the National Coordinator for Health Information Technology and the “roadmap” for interoperability it published last fall for moving the dial on interoperability, along with its work toward setting technology standards between vendors.

As this work moves forward, Weir said clinicians need to prepare.

“If they don’t know already they need to find out if their EHR does meet or will meet the Healthcare Information and Management Systems Society interoperability standards. If their vendor won’t give them a straight answer they need to keep pushing. The train is leaving the station and they need to get on,” he said.

Via Medical Economics »

Is EHR data blocking really as bad as ONC claims?

Consensus that EHR vendors and profit-hungry hospitals are intentionally making it hard for patients and others to access data is based on evidence – much of it put forth by the Office of the National Coordinator for Health IT – that is largely anecdotal.

With $32 billion spent already to achieve the meaningful exchange of healthcare and patient information, the federal government is hard at work trying to find where and how data is being blocked.

But whether data blocking is intentional, or not, remains a subjective question based largely on anecdotes that deserve scrutiny.

Many experts and industry leaders say they’ve never seen any cases of data blocking — others insist it’s more complicated than that. Some complaints are the result of a lack of technological progress, lack of standards, and misconceptions.

What’s the truth about data blocking? 
That all comes down to who you ask.

“I’ve never seen information blocking by anyone — vendor or hospital,” said John Halamka, MD, chief information officer of Beth Israel Deaconess Medical Center, Harvard medical professor and co-chair of the HIT Standards Committee at ONC. “I’ve seen a lack of infrastructure, a lack of a business model and a lack of a clinical imperative to share data, but never purposeful blocking.”

That purposeful distinction is substantial. But ONC still maintains that willful data blocking is a pressing problem that has to be addressed.

“From the evidence available, it is readily apparent that some providers and developers are engaging in information blocking,” National Coordinator Karen DeSalvo, MD, explained in a post on ONC’s site. It’s a serious problem, she said, and “one that is not being effectively addressed.”

That assertion appears to be based on a congressionally mandated report issued by the Office of the National Coordinator (ONC) in April 2015, which also calls for congressional action to put a stop to the data blocking.

But ONC has said it drew conclusions about the widespread data blocking problem based on “anecdotal evidence” collected from 60 unsolicited complaints, as well as phone surveys and a review of public records, testimony, industry analyses, trade and public news media and other sources.

DeSalvo is not the only one to contend that data is being blocked for nefarious reasons.

“It is very frustrating not to be able to send information electronically to another party, to find out they can’t receive it digitally, or that it doesn’t make sense when it’s received,” said David Kibbe, MD, president and CEO of DirectTrust, a nonprofit collaborative to support interoperability.

But he doesn’t feel it’s always just an innocent aspect of misplaced priorities or misaligned technology.

“In a fee-for-service health care system, information isn’t just power, it’s money, too,” said Kibbe. “So it is natural that we’ll get information hoarding, information blocking, and information channeling as means to an end by some entities.”

Is data flow being choked for other reasons?
Mari Savickis, vice president of federal affairs for the College of Healthcare Information Management Executives, or CHIME, said data exchange is improving on a daily basis – even if it’s being exchanged in less technical ways at times, through secure email, fax and even with paper.

“We’re moving toward improving data exchange; the numbers are going up,” Savickis continued. “Are we at the point of seamless data exchange? No. Is it being blocked? No one is setting out to do that.”

Savickis cited examples she’s heard from CHIME members that could be interpreted as data blocking but are far from it: they are the result of a lack of granularity in standards and of financial limitations.

The looming question: Who foots the bill for exchange?
Kerry McDermott, vice president of public policy and communications at the Center for Medical Interoperability, says all the finger-pointing about data blocking isn’t helpful.

“It’s more of a systemic issue, not limited to a specific party,” McDermott explained. “In the grand scheme of things, sharing data is new.”

In the past, it wasn’t beneficial for doctors and vendors to share data and so it follows, she added, that most of the technology in place today was not exactly engineered with interoperability in mind.

But now with the changes in value-based care reimbursement models, sharing data is going to be imperative.

Gary Dickinson, co-chair of HL7’s EHR workgroup, said ONC seems to believe that if information is not flowing, it’s being blocked. But that might not necessarily be the case.

“More likely it’s a situation where there’s no existing electronic infrastructure to facilitate direct sharing,” Dickinson said. “Infrastructure requires investment and who’s going to make that investment?”

Via Healthcare IT News »

Why Can’t Everyone Just Get Along?

While the rest of the world has adopted a ‘plug and play’ mentality, healthcare technology still doesn’t play well with others.

By Cindy Sanders

You can get money out of an ATM in Istanbul and watch a movie 35,000 feet in the air on the flight back, but you cannot electronically access your patient’s x-ray taken at the urgent care center two blocks down the street.

What would be totally unacceptable in any other industry has somehow become widely tolerated as ‘business as usual’ in healthcare … but one group is determined that’s about to change. The Center for Medical Interoperability is on a quest to bring healthcare in line with other vertical markets to improve safety, outcomes and cost efficiency.

Ed Cantwell, executive director for the Center for Medical Interoperability, said the 501(c)(3) organization came about as a result of the philanthropic work and strategic vision of the Gary and Mary West Foundation and West Health Institute. Looking at what drives costs in healthcare and contributes to less-than-stellar outcomes within the industry, Cantwell and colleagues were given the task of identifying the elephant in the room.

Zeroing in on the technological disconnect from medical devices to electronic health records, Cantwell noted the team was asked to bring a fresh perspective from outside of healthcare to the problem at hand “instead of conceding defeat from a legacy attitude.”

Guiding Motivation

From the start, Cantwell said the Center has had five guiding motivators to address and resolve:

  1. High cost: “Technology is in the way instead of in the background,” he said of a lack of efficiency driving costs.
  2. Preventable deaths: “We lose nearly two 747s a day with about 400 people each,” Cantwell pointed out. “If an airline lost two planes, or 800 people a day, would the public tolerate it?”
  3. Caregiver burnout: A former pilot, Cantwell said the difference between when he flew regularly 15 years ago and today is unbelievable. Technological advancements guide decision-making and have vastly improved safety. “The systems are interconnected; they wrap the pilot in knowledge,” he pointed out. However, the same cannot be said in healthcare where there has been little effective change in the underlying technology infrastructure over the same time period. Cantwell noted in the absence of data interoperability, providers make calls without all the available information at hand. The current process of attempting to integrate data is cumbersome and exhausting.
  4. Precision medicine: “If you don’t have true data interoperability, how can you realize the benefits of personalized medicine?” he questioned of applying data to individuals and the broader population health mission.
  5. Innovation in an application-based economy: “Between Apple and Google, the app-driven economy is fundamentally changing the way you live and is starting to penetrate your wellness,” Cantwell noted. “In general, healthcare has had very little IT innovation, and it’s because access to data is so proprietary and so hard to hook into that it doesn’t attract investors. There’s not an underlying uniform infrastructure so innovators are shunning healthcare because it’s just so hard.”

Creating a Structure for Success

Cantwell said in every other vertical market – including the highly competitive cable, phone, financial, and airline industries – data has been made available to drive advancement. “That interoperability and data exchange allows for a level of wisdom that drives productivity and outcomes,” he pointed out.

Not only is communication difficult among healthcare entities across the continuum of care, but it is often hard to share data even within a single practice or health system. With so much of the equipment being proprietary, one device or piece of technology can’t ‘talk’ to another without the purchase of middleware. “Why do I need to pay for an interpreter? Why can’t you just speak the same language?” Cantwell questioned. “In many ways, the hospitals and health systems are just being held hostage.”

To change that, the core staff of the Center has spent the last few years studying other vertical markets. “There was one common denominator,” Cantwell said. “Each has the benefit of what we call a centralized lab.”

He noted these successful industries have created a non-profit made up of leading companies within their sector and have challenged the CEOs to support it by serving on the board and by bringing in their technical staff to agree on a fundamental architecture that is both vendor- and member-neutral.

Taking a page from these industries, Cantwell noted, “In mid-2012 we started building a lab for healthcare with a focus being on the seamless exchange of information.” Incubated in California, the Center is in the process of moving to its permanent home in ONEC1TY in Nashville.

The impressive members of the Center’s board represent nearly one of every eight dollars spent in healthcare. In addition to the founding chairman, Michael Johns, MD, the board includes the top executive from a host of academic, for-profit, and not-for-profit companies and organizations including HCA, Robert Wood Johnson Health System, Cedars-Sinai Health System, LifePoint Health, Community Health Systems, Ascension Health, Scripps Health, and Vanderbilt University, University of North Carolina and Johns Hopkins Schools of Medicine, among others.

“We’re using the industry leaders and their procurement power and technical advisors and a dedicated R&D organization that will work within the ecosystem to develop a data interoperability platform and make it available free to the ecosystem and then become the test and certification body for it,” Cantwell said of the Center.

Moving Forward

The Center is focused on five core platform requirements to achieve interoperability (a code sketch follows the list):

  • Plug-and-play so that when two independent pieces are connected, they self-configure how to talk to each other with minimal or no human intervention.
  • One-to-many communication where once a device or system is certified as being conformant with reference specifications or set of standards, it can be used with similarly certified devices without additional testing.
  • Two-way data exchange enabling data to flow in both directions for feedback loops and automation.
  • Standards-based options that use open, as opposed to proprietary, solutions in reference architectures, interface specifications and testing.
  • Trust so that users have the confidence that interoperable systems will be safe, secure and reliable.
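A loose illustration of how the first three requirements might fit together, assuming a simple publish-and-subscribe design; the class names, descriptor fields, and data types below are hypothetical, not the Center’s actual specification.

```python
from dataclasses import dataclass


@dataclass
class CapabilityDescriptor:
    device_id: str
    produces: list           # e.g. ["vitals/heart_rate"]
    protocol_version: str = "1.0"


class Platform:
    def __init__(self):
        self.routes = {}     # data type -> list of consumer callbacks

    def subscribe(self, data_type, consumer):
        self.routes.setdefault(data_type, []).append(consumer)

    def connect(self, descriptor: CapabilityDescriptor):
        # Plug-and-play: the descriptor tells the platform all it needs,
        # so no site-specific interface work is required per device.
        print(f"{descriptor.device_id} connected, produces {descriptor.produces}")

    def publish(self, data_type, payload):
        # One-to-many: every subscribed consumer receives the data,
        # with no additional pairwise testing per producer-consumer pair.
        for consumer in self.routes.get(data_type, []):
            consumer(payload)


platform = Platform()
platform.subscribe("vitals/heart_rate", lambda p: print("EHR stored:", p))
platform.subscribe("vitals/heart_rate", lambda p: print("Alarm engine checked:", p))
platform.connect(CapabilityDescriptor("monitor-42", ["vitals/heart_rate"]))
platform.publish("vitals/heart_rate", {"bpm": 72})
```

In a certified ecosystem, the descriptor and message formats would be the standards the Center tests against; the procurement power of its members is what would push vendors to adopt them.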

Calling the work to be done both “a moonshot and a marathon,” Cantwell said the Center has a very aggressive goal to have the basic components in place within two years. After that, he said the function of the Center would be to build strong governance that encourages continuing innovation. “You’re not going to be able to keep healthcare out of an app-based economy forever,” he said. “Once it tips, you’re not going to be able to stop it.”

Improving interoperability holds great promise both in terms of patient outcomes and increased cost efficiency. However, Cantwell noted, success has even broader implications for the overall health and wellbeing of the country. “With healthcare (spending) at nearly 25 percent of the GDP, if you take even 10 points out of that, you could almost fund all the other social issues that are threatened,” he pointed out.

Cantwell concluded, “I think as a nation, it’s time. It’s time as consumers, we demand more from our healthcare system.”

Via MedicalNewsInc »

What holds healthcare back from the cloud

Web designer Chris Watterston put it best when he created a sticker that went viral: “there is no cloud, it’s just someone else’s computer.”

It’s that very issue that makes the cloud both appealing and unappealing to healthcare providers. It’s appealing because it provides the scalable, usable storage for the expanded needs of today’s healthcare market, including the storage of large genomic files and digital imagery. Few providers can store this kind of data in-house – and so, they use the cloud.

But the fear of the cloud being “out there” leaves the sensation that data is vulnerable, and keeps some healthcare providers away.

Ed Cantwell, executive director of the Center for Medical Interoperability, says people get tripped up with who accesses the cloud, and how. “They think, if it’s in the cloud, it’s a free-for-all. But that’s not the case at all,” he says. “I’m not so sure if a hacker cares if you are in the cloud or locked in a vault. If you’re in the cloud, you’re still located somewhere physically.”

Security is definitely a theme behind cloud concerns. James Custer, director of product development at Yorktel, says when it comes to the cloud, fears about HIPAA compliance are front and center.

“There is always this huge hesitation when the cloud is discussed,” Custer says, which is why the paperwork and sign-off to using the cloud can sometimes take a healthcare organization up to a year. But despite the difficulties, the cloud has really served smaller hospital systems well that can’t afford their own infrastructure. “The cloud has been huge for them,” he says.

Ray Derochers, executive vice president of HealthEdge, a cloud host company that serves mainly health insurance companies, says despite any initial hesitancy, most large insurance companies are moving to the cloud.

Beyond security issues, there is also the need to decide what information to move to the cloud. Because of the confidentiality and complexities of the insurance business, there is no way all the data is going to the cloud, Derochers says. Because of this, “people are afraid to take a bold position. They don’t comprehend all the moving parts.”

Tips for managing cloud technical and security issues

David Furnus, the site information security director at Stanford Health Care – ValleyCare, says, “the cloud isn’t impervious to attack; that’s a given.” But knowing that can help to ensure protection.

Furnus suggests engineering resilience into systems and applications. This means “to expect we will be breached and to be prepared to detect, contain, and correct the breach as quickly and as effectively as we are able.”

The security of data transmission to and from the cloud is “a non-issue,” Furnus says, if the cryptographic controls being used meet or exceed the requirements of the Federal Information Processing Standard Publication 140-2 (FIPS PUB 140-2), a federal government computer security standard used to accredit cryptographic modules.

According to Furnus, providers should only consider using the cloud if the cloud host, at a minimum, uses the Federal Risk and Authorization Management Program (FedRAMP), a government-wide program that provides a standardized approach to security assessment, authorization, and continuous monitoring for cloud products and services. In addition, the cloud provider should “be subject to the successful negotiation of other client-specific security requirements.”
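As a small illustration of the transmission piece, a client can refuse weak transport settings before any record leaves for a cloud host. The endpoint below is hypothetical, and note that FIPS 140-2 validation is a property of the certified cryptographic module itself, which a snippet like this cannot confer.

```python
import ssl
import urllib.request

# Keep default certificate verification and refuse anything older than TLS 1.2.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

with urllib.request.urlopen(
    "https://cloud-host.example.com/api/records",  # hypothetical cloud endpoint
    context=context,
) as response:
    print(response.status)
```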

Lee Kim, director of privacy and security at the Healthcare Information and Management Systems Society (HIMSS), North America, says there are a number of things to look for when selecting a cloud provider.

First, make sure the cloud host will offer access to the data on-demand, with few interruptions. This is critical to healthcare. Does the host have a good track record of making the data available during business hours? Cloud hosts schedule down time for maintenance, but do they also have frequent unscheduled downtime when physicians might need records for patients? Ask colleagues who have used the cloud provider. “Don’t believe what marketing people say on the website; it’s so much more than that,” says Kim, who advises getting any promises or assurances of medical record hosting in writing. Chances are if it’s not in writing it might not be part of the agreement.

Get a copy of the cloud host’s last risk assessment to see how well they are doing with security, Kim advises. Check to see what controls they are using in terms of security. A good rule of thumb when it comes to cloud security: “sometimes you get what you pay for.”

Be wary of small start-up cloud services, she adds. Will they be around in a year? Many venture capital firms own cloud companies temporarily, planning to sell. With a large cloud provider that has been in business for 10 years or more, there’s a little more assurance they will be in business a while, Kim says.

Check out the company’s customer service capability. Sometimes it’s limited. “In this day and age, it can make a world of difference what the customer service is like. If the company isn’t responsive and keeps kicking the can down the road, that’s not good, especially, when it comes to caring for patients,” she says. “Physicians can’t fight with the technology and take care of patients at the same time.”

In terms of managing the risk after you have legally bound yourself to a cloud company, Kim says to make sure someone in the organization is keeping up with them, serving as a liaison.

Via HealthDataManagement.com »

Power of the cloud spurs big push to boost interoperability

The cloud holds great potential for interoperable health data exchange, perhaps even supporting hopes for international standardization to serve the Internet of Things, and it may help power intensive data exchange initiatives, health IT experts believe.

Precision medicine based on genomics, with its huge amounts of complex digital information, will tie masses of information to a single electronic health record, says Lee Kim, director of privacy and security for the Healthcare Information and Management Systems Society North America. Test results, summaries of imaging studies and genomic data will make using a clinical server model impossible, Kim says. “It’s way too much data. I could definitely see the cloud playing a future role.”

“I’m not sure that cloud technologies by themselves necessarily enhance interoperability,” says John Halamka, MD, chief information officer of the Beth Israel Deaconess Medical Center and a professor of medicine at Harvard Medical School, who has co-chaired federal workgroups on the standards needed for interoperability. “However, there are cloud-based services that could reduce the burden of interoperability implementation.”

Still, the healthcare industry has some catching up to do when it comes to using the cloud, says Ed Cantwell, executive director of the Center for Medical Interoperability (CMI), a nonprofit with mainly health system members that is striving to accelerate the seamless exchange of information. Sectors like the financial industry have relied on the flow of data to survive, but something has blocked the healthcare industry from following suit, Cantwell says.

“You can walk into any hospital in this country and systems don’t talk to each other. They don’t work in a plug-and-play manner,” says Kerry McDermott, vice president of public policy and communications at CMI. “Health systems want change,” she says. “They are living and breathing the problem every day.”

CMI is currently working to select 100 engineers for participation in the development of a blueprint for interoperability, which will include cloud and non-cloud solutions. The blueprint will be used to certify healthcare products as being capable of working on the cloud. Up for consideration for working on the project are “some of the biggest players” in other industries, Cantwell says.

CMI’s membership represents $100 billion in procurement power, and it is this, plus the opportunity to expand into the healthcare sector, that has drawn interest. When the selected engineers are revealed in a few weeks, they will work in CMI’s centralized lab to tackle interoperability. “It’s a game changer” to have the providers, which have purchasing power, implementing the drive for change, McDermott says. CMI aims to include all devices across the continuum of care in the blueprint. A pilot of the blueprint will be ready before the end of the year, she says.

In Massachusetts, Halamka says, some of these services include a cloud-based provider directory for fast, reliable lookup of any provider’s direct address. It also has a cloud-based master patient index, consent registry, and record locator service to support patient identification and assembly of records from multiple sources. The state also has a cloud-based quality registry for aggregation of quality data.

Interoperability issues have been exacerbated, not lessened, with the adoption of electronic health records. To advance healthcare through the use of data, the federal government sees the need to boost interoperability; for example, rules to enact the Medicare Access and CHIP Reauthorization Act of 2015 (MACRA) emphasize achieving interoperability through the use of application programming interfaces (APIs).

In January, Centers for Medicare and Medicaid Services Acting Administrator Andy Slavitt gave a hint at the government’s direction when he said the agency will promote innovation “by unlocking electronic health information through open APIs – technology tools that underpin many consumer applications.”

“Interoperability will be a top priority, through the implementation of federally recognized, national interoperability standards and a focus on real-world uses of technology,” he said of CMS’s plans to interpret MACRA.
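HL7’s FHIR specification is the most widely cited embodiment of the open-API approach Slavitt describes: any authorized application can retrieve the same standardized resource over plain HTTP. A minimal sketch, using a hypothetical server address and patient ID:

```python
import json
import urllib.request

BASE = "https://fhir.example-hospital.org/r4"    # hypothetical FHIR server

request = urllib.request.Request(
    f"{BASE}/Patient/123",                       # hypothetical patient ID
    headers={"Accept": "application/fhir+json"}, # standard FHIR media type
)
with urllib.request.urlopen(request) as response:
    patient = json.load(response)

# Any authorized app sees the same standardized Patient resource.
print(patient.get("name"))
```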

Even as interoperability is still up for grabs, it’s clear that the cloud is going to provide a great deal of help toward achieving other healthcare goals, such as facilitating currently complex activities like finding participants for clinical trials and holding the data needed to conduct them.

“The cloud is extremely important for clinical trials,” says Luca Emili, founder and CEO of Promeditec, an Italy-based company that develops technology to accelerate clinical trials. About 10 years ago, the quantity of data collected per patient was quite low. Now, with the addition of wearable devices, digital images and genomic data, hospitals need to find a new strategy for collecting this data, he says.

Promeditec recently chose Verizon Cloud to support the delivery of its AppClinical Trial solution, a software-as-a-service (SaaS) offering delivered through a collaborative platform that can be scaled to set up trials and to capture and manage trial data. Use of the cloud with this platform has helped to cut the expense and time of clinical trials, which can cost as much as $3 billion over the 10 to 15 years required to conduct a trial.

A patient’s genomic information often has 300 gigabytes of data, and hospitals that want to participate in clinical trials in the future will need to use the cloud because of the sheer volume of data that large trials could involve. In addition, the cloud enables the use of data gathered worldwide, and hospitals can no longer store this quantity of data in-house, Emili says.

Jim Hollingshead, president of San Diego-based ResMed, a provider of connected healthcare solutions for remote monitoring of patients with respiratory illnesses, has found a way to use the cloud to save money and increase the flow of data. ResMed’s clients, mainly home medical equipment (HME) providers, are required by Medicare to show proof that patients are using VPAP and CPAP breathing devices. Previously, removable data cards were used for this purpose, but ResMed replaced them with cellular chips that send data straight to the cloud. Now an HME can go online and verify the usage. It greatly reduces the labor required by the HMEs to verify usage, saving them money.

A completely unexpected aspect of going to the cloud and online was the ability to identify patients who need intervention for compliance. Adherence levels jumped, as did interest on the part of consumers: some 900 of them go on the site per day to check the data regarding their usage, and several hundred thousand patients persistently look at their data. “We were shocked” that patients latched onto this, he said. There was unquestionably an underlying need.

The software platform has an API that enables hospitals to connect with the HMEs through the cloud, making the EHRs interoperable, which is especially important to providers in networks and ACOs. “We see the world going to an integrated place,” Hollingshead says.

In 2014, the company launched its SaaS API, and it was quickly adopted. Berg Insight confirmed that ResMed in a 16-month span had become the world’s leader in remote patient monitoring, with more than 1 million connected devices. “The cloud is the next wave of creating value,” Hollingshead says.

The cloud fits many of healthcare’s needs to exchange information and will play a vital role, says CMI’s Cantwell. “It’s not an issue of whether or not healthcare is going to adopt the cloud; healthcare has already started to.”

But the speed of adoption depends on open data. “Once data becomes democratized, almost the only way you can allow that data to really be a source of true innovation is to open it up to more cloud-based services,” Cantwell says.

However, “it’s only a matter of time,” says Kim from HIMSS. Some hospitals are becoming more geographically diverse, and they’ll need data on demand. The cloud is scalable and can provide not only computing power and storage space, but also convenience, she says.

Via HealthDataManagement.com »

Making Technology Talk: How Interoperability Can Improve Care, Drive Efficiency, and Reduce Waste

THE ABUNDANCE OF PROPRIETARY PROTOCOLS AND INTERFACES THAT RESTRICT OR PROHIBIT HEALTHCARE DATA EXCHANGE TAKES A HUGE TOLL ON PRODUCTIVITY. HEALTH SYSTEMS AND PROVIDERS SHOULD DEMAND BETTER INTEROPERABILITY.

Many industries have harnessed the power of technology to improve outcomes and reduce costs, but despite continued technological advances, health care overall has experienced negative productivity over the past decade. Some organizations are doing better than others. To determine where their organizations might fit into this scenario, healthcare leaders should consider three questions:

  • Does the technology within our health system help clinicians excel in their jobs and achieve the best possible outcomes for patients?
  • Does technology function seamlessly in the background, allowing streamlined operations and freeing care teams to return to the patient bedside?
  • Have we optimized the return on our technology investments?

Chances are that most health system leaders’ answers are closer to “not exactly” than a resounding “yes.”

Unfortunately, the vast majority of medical devices, electronic health records (EHRs), and other IT systems lack interoperability—i.e., the ability to seamlessly share and use information across multiple technologies. Perhaps more precisely, they lack a common, built-in system that can exchange information across vendors, settings, and device types. Various systems and equipment typically are purchased from different manufacturers, and each comes with its own proprietary interface technology.

As a result, hospitals must spend time and money—both scarce resources—setting up each technology in a different way, instead of being able to rely on a consistent means of connectivity. Moreover, hospitals often must invest in separate “middleware” systems to pull together all these disparate pieces of technology to feed data from bedside devices to EHRs, data warehouses, and other applications that aid in clinical decision making, research, and analytics. Many bedside devices, especially older ones, don’t even connect; they require manual reading and data entry. The nation’s largest health systems employ thousands of people dedicated to dealing with what one system dubs “non-interoperability.” The exhibit below depicts the current state of data flow.

Exhibit: Typical existing data exchange predominated by one-way, high-cost, proprietary interfaces

Rethinking the Interoperability Challenge

The current lack of interoperability can compromise patient safety, undermine care quality and outcomes, contribute to clinician fatigue, and waste billions of dollars annually. It also hinders progress toward achieving goals for population health management and precision medicine. Worse yet, it impedes innovation, which may be the biggest missed opportunity for health care. People with ideas for doing things differently—both in terms of care processes and technologies used—face significant obstacles accessing data, validating solutions, integrating into highly configured environments, and scaling implementations across varied settings. As a result, innovators often steer clear of the healthcare market because navigating it is simply too difficult, which has the perverse effect of reinforcing entrenched, proprietary interests.

By contrast, the seamless exchange of information would improve care, increase operational efficiency, and lower costs. It would facilitate care coordination, enable informatics, reduce clinician workload, and increase the return on existing technologies. To realize these benefits, healthcare organizations must rethink how the disparate pieces are connected not only within one hospital, but also among every entity involved in a patient’s care, including physicians’ offices, home health agencies, and other post-acute care facilities.

Rather than continuing to be constrained by the high-cost, proprietary status quo, health systems and providers should demand and adopt a platform that is standards-based, addresses one-to-many communication, allows two-way data exchange in real time, and enables plug-and-play integration of devices and systems. Let’s explore these attributes.

The use of standards-based interfaces will reduce costs by decreasing the number of interfaces that must be built and maintained. In January 2016, the Senate Health, Education, Labor, and Pensions (HELP) Committee released draft legislation calling for such standards.[a] Several private-sector entities, including standards development organizations, continue to develop new protocols, and improve upon existing ones, for data exchange. Those responsible for purchasing and implementing technology should reward vendors that adhere to standard, as opposed to proprietary, approaches. Policymakers and regulators can provide the push toward standards-based solutions while the market creates the pull.

One-to-many communication refers to the need for one device or system to communicate with multiple other devices and systems, sometimes at the same time. For example, data elements that are critically important to the safety of care and patient outcomes, such as patient allergies, are manually entered and reentered even on the same inpatient or outpatient visit. Ideally, the data would be entered once and automatically shared with various systems, making it available to any care team and avoiding potential mistakes from delayed data entry. This capability, combined with real-time, two-way communication, would improve workflow by automating tasks as appropriate and ensuring that needed information is readily available—all with appropriate levels of privacy and security.

Two-way data exchange is the backbone of a learning health system, characterized by the Institute of Medicine as a health system “designed to generate and apply the best evidence for the collaborative healthcare choices of each patient and provider; to drive the process of discovery as a natural outgrowth of patient care; and to ensure innovation, quality, safety, and value in health care.”[b] Patient care technologies need to be able to send and receive data in a manner that enables feedback loops and automation. When connections are one-way, information doesn’t always reach the destination where it is needed and often places the onus on individuals to detect problems. In some cars, for example, sensors are able to communicate with brakes and automatically intervene to prevent crashes. The lack of such information exchange in health care frustrates efforts to apply advanced informatics and improve clinical workflow and care delivery through automation.

Plug-and-play means that when two independent pieces are connected, they self-configure and can talk to each other without (or with minimal) human intervention. For example, an ATM card can be used in any ATM around the globe. A platform with these attributes would give a health system or provider greater control over the data it needs to deliver safe, efficient, and effective care. A hypothetical platform is shown in the exhibit below.

Exhibit: Desired state of data exchange with real-time, two-way, low-cost, standards-based interfaces

A key feature of the platform is that it makes it possible to scale interoperability. It provides a blueprint for how the various technologies used in patient care can plug into a health system’s operations. The platform uses adaptors and standards-based interfaces to connect data producers with data consumers. If a vendor can certify that its product works with the platform, the product will be interoperable with any other system already connected to the platform. This interoperability is analogous to how the electrical system in your home works—you can plug a phone charger into any socket in any room and it will work.
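In software terms, those adaptors resemble the classic adapter pattern: a thin translation layer that lets a proprietary device present a standards-based face to the platform. A minimal sketch, with an invented device class and wire format:

```python
class ProprietaryPump:
    """Stand-in for a vendor device that speaks its own one-off dialect."""

    def read_rate(self):
        return "RATE|25|ml/h"    # invented vendor-specific wire format


class PumpAdaptor:
    """Normalizes the vendor dialect into the platform's standard record."""

    def __init__(self, device):
        self.device = device

    def to_standard(self):
        _, value, unit = self.device.read_rate().split("|")
        return {"observation": "infusion_rate", "value": float(value), "unit": unit}


# Consumers on the platform see only the standard record, never the dialect.
print(PumpAdaptor(ProprietaryPump()).to_standard())
```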

The blueprint is the foundation for standardization and innovation. Vendors can compete on capabilities and features, not on access to data, and provide more useful products because they can use all data that work with the platform. The use of standards-based interfaces also levels the playing field and reduces barriers to entry for those trying to innovate in health care.

Driving Interoperability With the Power of Procurement

Creating and implementing a standards-based system is no small task, and impossible for any single health system. The procurement process is an effective lever for this magnitude of change. Health systems and providers—as the organizations buying, implementing, and using technologies to care for patients—can and must transform the technical underpinnings of the healthcare industry. Purchasers can reward vendors and developers that work together to adhere to the blueprint, thereby instilling confidence that solutions will work as expected, safely, and securely.

The unified voice of health systems and providers making consistent requests of vendors would benefit purchasers and sellers alike. The need to create and support customized solutions often is a financial burden on vendors as well. A centralized approach to establishing requirements can overcome the inability of a single health system or provider to compel change on its own. Requirements should be specified in requests for proposals (RFPs) and upheld in contracting language.

For the vendor community, a centralized approach also provides a focal point for engaging customers in solving shared technical challenges. It also makes enlisting the help of other industries easier. Breaking legacy thinking is one of the hardest, yet most critical, aspects of revamping data flow in health care. Learning from industries that have conquered similar challenges is invaluable.

A Playbook for Change

Concurrent proprietary advances are common in the natural evolution of technology. Only after users have adopted disparate solutions do markets tend toward consensus and standardize on important operational parameters. The difficulty with regard to healthcare technology is that advances have been so rapid, and solutions have been so complex, that the “natural” evolution toward standardization is a decidedly uphill battle. If progress is to be made any time soon, it will require significant pull from users, to use marketing terminology, rather than waiting for a push from suppliers.

The following seven steps provide a methodical approach healthcare providers can take to encourage technology suppliers toward standardized information sharing.

Assemble a team. An effort as expansive and complex as improving the interoperability of a health system requires a champion (or team of champions). Start by assembling a cross-functional team that includes caregivers, clinical informaticists, IT professionals, clinical engineers, operations leaders, financial managers, and a patient advocate. This team should have CEO sponsorship to ensure organizational alignment and assistance in overcoming barriers.

Describe the desired state. A major objective of the team is to set a vision for the future. This vision should reflect organizational priorities and not be limited by the constraints of how things work today. The people, process, and technology framework can be a helpful organizing principle.

The vision should contemplate improving patient outcomes and treatment experiences. It might, for example, imagine a quieter care environment that leverages built-in technology for identifying and monitoring patients. The vision could set a goal for HCAHPS scores and address the necessary inputs.

The vision should account for care teams’ biggest pain points and reimagine workflow to create a better user experience with seamless technology that supports care. One large health system armed 800 of its frontline nurses with 3-by-5 index cards and asked them over the course of two days to write down what they needed to improve their care delivery. The top needs identified through this exercise were cleaning up the EHR terminology; making the nurses mobile without nuisance alarms and notifications; and, most important, getting the nurses back to the bedside with the support of seamless technology in the background that would help them provide the care their patients deserve and desire.

The vision also should consider how technology can drive efficiency and support care outcomes. For example, documentation tasks such as recording and transcribing patient information detract from time spent directly with patients. The burdens of manual documentation generally result in less frequent documentation, which leaves real-time vital sign data lacking and limits the use of clinical decision support tools, such as the Modified Early Warning System (MEWS), to monitor and respond to changes in patients’ conditions. Technology also can help reduce waste and manage throughput, which, in turn, helps care teams.

Assess the current state. To be able to make any improvements in interoperability, the team must understand the current state of interoperability across the health system. Interoperability, like security, is not a specific state but rather a continuum, ranging from complete inability to exchange even a single data point to the fluid exchange of information. It is therefore difficult to measure. The Center for Medical Interoperability has developed a maturity model to help evaluate an organization’s level of interoperability—basic, intermediate, or advanced—in five key dimensions:

  • Infrastructure. How connected, secure, and resilient is the health system’s infrastructure?
  • Contextual/dynamic. Do information exchanges enable safety and optimal decisions?
  • Conversational complexity. Is information exchange orchestrated to meet the organization’s needs?
  • Terminology/semantic. Do the places that send and receive the organization’s data speak the same language?
  • Syntactic. Is the information the health system needs to exchange formatted to meet those needs?

Exhibit: Interoperability maturity evaluation model, developed by the Center for Medical Interoperability

It is important to make coordinated progress along each dimension to increase the degree of interoperability from basic to advanced levels.
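A team could record such an assessment as a simple working artifact. The sketch below uses an invented numeric scale and example scores; the maturity model itself defines only the basic, intermediate, and advanced levels.

```python
LEVELS = {"basic": 1, "intermediate": 2, "advanced": 3}

# Hypothetical current-state scores against the model's five dimensions.
assessment = {
    "infrastructure": "intermediate",
    "contextual/dynamic": "basic",
    "conversational complexity": "basic",
    "terminology/semantic": "intermediate",
    "syntactic": "advanced",
}

# Coordinated progress means lifting the lowest-scoring dimensions first.
floor = min(LEVELS[level] for level in assessment.values())
lagging = [dim for dim, level in assessment.items() if LEVELS[level] == floor]
print("Prioritize:", ", ".join(lagging))
```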

Current-state assessment also should include analyses of costs and contracts. Health systems and providers need to know the total cost of installing and maintaining the thousands of interfaces and systems supporting data exchange. They also would benefit from understanding how contracts for each device and IT system, including EHRs, affect their level of interoperability.

Identify gaps. After the current state has been assessed, the team can identify gaps between the desired and current states. It will be important to clearly define gaps and measures to evaluate progress being made in filling those gaps. As needs will likely overwhelm available resources, it will be essential to prioritize and gain executive and organizational alignment around each remedial measure.

Develop an implementation plan. Having identified interoperability gaps, the team can create a plan for addressing them. This plan should detail priorities and suggest a phased approach that emphasizes clinical impact and allows for measurable levels of attainment.

The technical activities outlined in the road map could be framed through a clinical perspective. Clinical care may be divided into five care contexts, as shown in the exhibit below.

Technical Efforts Prioritized and Organized by Clinical Care Context

Technical activities also should take into account the most appropriate sequence for integrating devices and systems. One consideration is balancing criticality of data with ease of integration. Vital signs, for example, are deemed vital for a reason, and it would make sense that vital sign data be readily available. The first phase of integration could focus on noninvasive vital signs (e.g., temperature, blood pressure, pulse, respiration rate, and oxygen saturation). The second phase could address critical care vital signs (e.g., monitors in the intensive care unit, operating room, or emergency department), and infusion pump, ventilator, and EKG data. A third phase could integrate bed alarm management, call bell management, defibrillation data, and anesthesia data.
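
Expressed as a simple plan structure, the phasing described above might look like the following sketch. The phase contents mirror the sequence in this article; the representation itself is an illustrative assumption.

```python
# The phased device-integration sequence described above, expressed as a
# simple plan structure a team might adapt. Phase contents mirror the
# article; the representation is an illustrative assumption.

INTEGRATION_PHASES = [
    {"phase": 1,
     "focus": "noninvasive vital signs",
     "sources": ["temperature", "blood pressure", "pulse",
                 "respiration rate", "oxygen saturation"]},
    {"phase": 2,
     "focus": "critical care vital signs and high-acuity devices",
     "sources": ["ICU/OR/ED monitors", "infusion pumps",
                 "ventilators", "EKG"]},
    {"phase": 3,
     "focus": "alarm, alert, and procedure data",
     "sources": ["bed alarm management", "call bell management",
                 "defibrillation data", "anesthesia data"]},
]

def next_phase(completed):
    """Return the earliest phase not yet completed, or None when done."""
    for phase in INTEGRATION_PHASES:
        if phase["phase"] not in completed:
            return phase
    return None

print(next_phase({1})["focus"])  # critical care vital signs and high-acuity devices
```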

Another consideration is prevalence and extent of use of point-of-care devices. The exhibit below presents a sample tabulation of device occurrence across acute care domains. Such a survey can help an organization prioritize its interoperability improvement efforts.

Sample Prevalence Tabulation of Point-of-Care Devices Used in Acute Care Settings
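
A tabulation like the one in the exhibit can be derived from an asset inventory. The sketch below counts how many acute care domains each device type appears in, using a small hypothetical inventory; devices present in the most domains are natural early candidates for integration.

```python
# Derive a prevalence tabulation from a (hypothetical) asset inventory:
# count how many acute care domains each device type appears in, so the
# most widely used devices can be integrated first.

from collections import defaultdict

inventory = [  # (device type, care domain) rows from an asset inventory
    ("infusion pump", "ICU"), ("infusion pump", "med/surg"),
    ("infusion pump", "ED"), ("vital signs monitor", "ICU"),
    ("vital signs monitor", "med/surg"), ("ventilator", "ICU"),
]

prevalence = defaultdict(set)
for device, domain in inventory:
    prevalence[device].add(domain)

for device, domains in sorted(prevalence.items(),
                              key=lambda kv: len(kv[1]), reverse=True):
    print(f"{device}: {len(domains)} domains ({', '.join(sorted(domains))})")
```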

The key consideration when phasing improvements is making sure those improvements suit an organization’s needs, including what’s important to clinicians. Opportunities abound for interoperability to positively affect safe clinical outcomes and patient satisfaction while also reducing costs and improving operating efficiencies. Describing a path forward—both a starting point and a gradual evolution across the clinical and technical domains of the health system—is the foundation for an organized effort to drive progress.

Achieve immediate wins. Early success will help the team gain support for the implementation plan. Under the infrastructure dimension of interoperability, one candidate for a quick win is improving the wireless environment. In an increasingly mobility-centric world, a sound wireless environment is an essential preliminary step toward connected, interoperable, and trusted health care.

A reasonable starting point for evaluating wireless health is a Wi-Fi traffic assessment. Detailed findings can inform remediation recommendations to ensure a high level of wireless assurance.
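
As a rough illustration, such an assessment might reduce to a per-access-point summary like the sketch below. The metrics and thresholds shown are hypothetical placeholders, not published assessment criteria.

```python
# A per-access-point summary such as a Wi-Fi traffic assessment might
# produce. The metrics and thresholds are hypothetical placeholders.

THRESHOLDS = {"retry_rate": 0.15, "channel_utilization": 0.60}

access_points = [
    {"name": "AP-ICU-01", "retry_rate": 0.22, "channel_utilization": 0.71},
    {"name": "AP-MS-04", "retry_rate": 0.08, "channel_utilization": 0.35},
]

for ap in access_points:
    flagged = [metric for metric, limit in THRESHOLDS.items() if ap[metric] > limit]
    print(f"{ap['name']}: {'REMEDIATE ' + ', '.join(flagged) if flagged else 'OK'}")
```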

Maintain focus and communicate successes. Maintaining the enthusiasm and buy-in to execute long-term strategic initiatives is not easy. It is important to keep the established team engaged and celebrate milestones achieved. Communicating successes related to improved care, increased efficiency, and reduced waste will be important to sustaining momentum.

A Call to Action

Achieving interoperability requires effort on both the business and technical sides of operations. One tactical step that health systems and providers can take immediately, and in parallel with enacting long-term strategies, is to audit existing contracts. Any contractual language that impedes optimal patient care should be catalogued. Examples of such language include gag clauses, provisions that inhibit data sharing or make it prohibitively expensive, and limitations on engaging other vendors and third parties. Contractual terms that run counter to the best interests of patients must first be identified and understood before they can be remedied.
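
A first pass at such an audit can be partially automated before legal review. The sketch below scans contract text for a few categories of language this article flags; the phrase patterns are hypothetical starters and would need refinement by counsel.

```python
# First-pass contract audit: scan contract text for categories of
# language the article says should be catalogued. The phrase patterns
# are hypothetical starters; a real audit needs review by counsel.

import re

FLAGGED_PATTERNS = {
    "gag clause": r"\bshall not (disclose|publish|discuss)\b",
    "data-sharing restriction": r"\bdata may not be (shared|exported)\b",
    "third-party limitation": r"\bthird[- ]party (vendors?|integrations?) (prohibited|restricted)\b",
}

def audit_contract(text):
    """Return the categories of problematic language found in the text."""
    return [label for label, pattern in FLAGGED_PATTERNS.items()
            if re.search(pattern, text, re.IGNORECASE)]

sample = "Customer shall not disclose benchmark results. Third-party vendors prohibited."
print(audit_contract(sample))  # ['gag clause', 'third-party limitation']
```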

We can no longer accept the status quo of U.S. health care. We must require the same level of interoperability that we enjoy in other aspects of life. It’s time for the healthcare ecosystem to come together and drive change. Patients deserve better outcomes and care experiences, healthcare professionals deserve technology that helps them excel in their jobs, and our nation deserves a sustainable health system to care for generations to come.


Ed Cantwell is executive director, Center for Medical Interoperability, Nashville, Tenn.

Kerry McDermott, MPH, is vice president, public policy and communications, Center for Medical Interoperability, Washington, D.C.

Via HFMA.org »

Medical interoperability is next frontier in health technology

I had a routine doctor visit this week. The physician was not part of the Vanderbilt Medical Group, where my primary care physician practices. The first 20 minutes of my visit were spent filling out, by hand, a six-page medical history.

My daughter, Carrie, has been at school in Switzerland this year; we can send her money by pressing a button and talk to her by Skype from our cell phones or tablets. I will be going to the Middle East next month, and I will be able to answer emails and conduct business seamlessly across nine time zones. In health care, however, information gathering and transmission is maddeningly redundant and extraordinarily atomized. Hopefully, relief is in sight.

Earlier this month, the Nashville Health Care Council hosted a program on medical interoperability. Interoperability is a fancy word for the ability of health care technology systems to communicate and share data. Interoperability covers all the data involved in patient care, from a patient’s medical history to the real-time data that medical devices can transmit directly to electronic health records. Medical data holds the key not only to the prompt and accurate diagnosis of a patient’s present condition, but also to the aggregation and analysis of population health conditions for establishing evidence-based protocols and treatments. At the NHCC forum, panelist Zane Burke, president of Cerner, declared, “Either you’re going to be open and interoperable, or you will be obsolete.”

As information technology has advanced, health professionals and systems have seen improvement, but they remain frustrated by the inability to integrate data from multiple sources: a veritable “built to fail” model in which manufacturers create technology that communicates only with other products made by the same manufacturer.

Into this breach comes the Center for Medical Interoperability. The center’s seed funding came from the Gary and Mary West Foundation in Southern California, but it has quickly advanced to become the nation’s leading advocate of change. The center, founded by Dr. Michael Johns and led by Ed Cantwell, determined that to be most effective, it needed to relocate to Nashville, the center of health care. The center’s board includes a who’s who of Nashville CEOs, including Milton Johnson (HCA), Wayne Smith (CHS), Bill Carpenter (LifePoint), Jeff Balser (Vanderbilt) and Mike Schatzlein (St. Thomas), as well as the CEOs of such health care brands as Johns Hopkins, Northwestern and Cedars-Sinai.

The center is in the process of developing common communication standards for medical devices, with a multi-year plan that will bridge point-of-care information, health applications, enterprise technologies and, ultimately, national health information exchanges.  The desire is for all patient data to be connected across the spectrum of care, whether home, ambulatory, acute or post-acute.  The goal is for every health device to be able to “plug-and-play” off a single interoperable platform.

Banking was once ATM and swipe cards, but now it’s chips and wireless.  Similarly, the future of health information is wireless.  In Nashville, the Center for Medical Interoperability is developing the national wireless lab to establish construction standards, architectural processes and best practices for wireless medical devices.  Ultimately, the center will formulate wireless technology standards and certifications.  The era of health information is at an inflection point.  The bedrock of health decision making is current and accurate data.  As the nation moves to the next level of clinical efficiency and effectiveness, expect Nashville to be the center of architecture, planning, and certification for medical interoperability.

Richard Cowart is chairman of the health law and public policy departments at Baker Donelson. Reach him at dcowart@bakerdonelson.com.

Via Tennessean.com »

HealthIT: Both hard realities and major opportunities are C4MI targets

FRAGMENTED, misaligned HealthIT products contribute to the deaths of the equivalent of a jumbo jet full of hospital patients every day.

In fact, infections, poor communications, outright medical errors and spotty use of the latest medical evidence in decision-making are estimated to contribute directly to as many as 440,000 hospital patient deaths each year that might otherwise have been prevented.

Nonetheless, the chaos that results from disjointed databases, brain-fatiguing electronic alarms and frequently unreliable communication among clinicians quietly continues to set the stage for treatment failures and deaths that may reach 400,000 each year, according to wide-ranging published estimates.

This week, the Nashville-based Center for Medical Interoperability (C4MI) takes another step forward in addressing healthcare chaos and calamity, as well as opportunities to help improve U.S. healthcare outcomes and affordability, while spurring industry innovation among a myriad of entrants, from startups to mature enterprises.

C4MI’s 13-member Technology Coalition will convene its first full work session April 14-15 at St. Thomas Midtown Hospital in Nashville.

This week’s meeting follows close on the heels of last week’s Nashville Health Care Council panel discussion on HealthIT interoperability.

That event at the Hilton Downtown featured C4MI Executive Director Ed Cantwell, as well as Cerner President Zane Burke and Ascension Health SVP Mike Schatzlein, M.D., the former CEO of St. Thomas Health (Ascension) and founder and former chairman of St. Thomas spin-out MissionPoint Health Partners.

The Coalition’s meeting here this week includes high-profile technology architects and business-unit leaders. Their identities have not yet been announced.

However, comments during last week’s panel suggested that in addition to health care, the industries represented this week might include defense, airlines or even consumer electronics.

Moreover, given that the C4MI board of directors includes not only Ascension’s Schatzlein, but also the CEOs of HCA, Community Health Systems and LifePoint, we might anticipate seeing a platoon of local executives involved.

Regardless of the Tech Coalition’s makeup, C4MI has made clear that its members — all major customers of HealthIT and infrastructural software and services vendors — will, by the power of their collective purchases, “compel” HealthIT vendors to adopt C4MI standards that will soon be developed. C4MI will certify vendor products’ compliance, much as such laboratories do for other industries.

While all that sounds reasonable, last week’s NHCC panelists seemed to agree that the nation will never achieve personalized or precision medicine unless the healthcare industry soon gains use of unique national patient identifiers and makes a full commitment to ensuring that medical devices are seamlessly connected with patients’ electronic health records.

So great is the concern about patient identifiers that Cerner’s Burke signaled several times during last week’s panel that those who support interoperability and improved care outcomes at lower costs must be prepared to respond to political and policy challenges from “detractors.”

Both Burke and Cantwell said interoperability advocates must get across to stakeholders that the current chaos in the clinical environment is wearing out staff and stymieing many healthcare improvements.

During the NHCC panel discussion, Burke noted that Cerner is an active member of CHIME (the College of Healthcare Information Management Executives), which has launched its HERO-X National Patient Identifier Challenge, the winner of which will be announced in February.

CHIME’s website emphasizes both its educational mission and its policy advocacy in the nation’s capital. Both Cerner and fellow C4MI member Intermountain Healthcare hold seats on CHIME’s board. CHIME has long been allied with HIMSS, the Healthcare Information and Management Systems Society, in which the Nashville chapter plays an influential role. HIMSS has strongly called for Congress to remove the federal prohibition on establishing the patient identifier.

This week, the C4MI Tech Coalition’s “vendor neutral” agenda is to include discussion of roles and responsibilities, plus C4MI’s very tentative roadmap for its work and related matters, said C4MI VP Kerry McDermott, who handles policy and communications from her base in the Washington, D.C. area.

The tech-architects subset of the Coalition is likely to meet several more times ahead of the September meeting of C4MI’s board of directors, who currently number 14, said McDermott. By September, C4MI may have moved into new offices in the OneC1ty development on Charlotte Ave.

C4MI aims to complete Version 1.0 of the “reference architecture” by the end of this year; the architecture will ultimately enable full interoperability of the healthcare system. C4MI operates as a laboratory for testing and certification of HealthIT, in accordance with technology standards C4MI will promulgate.

The Tech Coalition will heavily influence the creation and sustainability of the technology components of the overall C4MI reference architecture. That includes ensuring that platform-consistent technologies are capable of “one-to-many” interrelationships with other technologies and systems, offer “plug-and-play” ease of connectivity, and address rising security and trust issues.

On a track parallel to the efforts of its Tech Coalition, C4MI will simultaneously move to advance its “Interoperability Maturity Model” (IMM), which will help participants understand HealthIT use-cases, assess state-of-the-art and deployed solutions, and estimate and visualize existing interoperability scope and systemic gaps that require solutions.

The IMM focuses on five dimensions, according to C4MI information online: contextual/dynamic (the ability of devices and applications to share data relevant to the patient and to the clinical workflow); infrastructure (transport-level connectivity, security); syntactic (the formats used for communication and exchange); semantic (vocabulary, nomenclature, ontologies and models); and conversational complexity (the sophistication of the information exchanged and the manner in which the conversation is conducted).

The outcome of all this effort cannot be assumed. Panelist and C4MI Board Member Schatzlein of Ascension reminded the NHCC audience in Nashville last week that some local providers had previously “failed” in trying to establish a regional health information exchange (HIE), “because we didn’t have our internal acts together.”

Meanwhile, what about all those deaths? Asked whether the C4MI and the industry are treading too lightly in spotlighting the scale of preventable patient deaths, McDermott said C4MI believes it is vitally important that professionals and the public-at-large understand that lives are at stake as a result of the lack of sufficient HealthIT interoperability.

At the same time, she said, it’s important not to sensationalize these issues, thereby clouding perception of the very real benefits of healthcare, while vilifying clinicians and contributing to distrust of providers.

She noted that earlier this month, in the course of advocating for inclusion of medical devices in any interoperability initiatives, C4MI wrote U.S. National Coordinator for Health IT Karen DeSalvo, saying, in part, that “the current lack of medical device interoperability is without question the cause of many adverse drug events, medication ordering errors, transcription errors, redundant testing, inadequate monitoring and miscommunication, all of which contribute to preventable medical errors…”

C4MI went on to say that, in addition to reducing serious medical errors and wasted time, making medical devices interoperable could save an estimated $30 billion per year within the healthcare system.

The presenting sponsor for last week’s NHCC program was BlueCross BlueShield of Tennessee. Supporting sponsors were Bass Berry & Sims, Cressey & Company, KPMG and LifePoint Health.

C4MI has retained Jarrard Phillips Cate & Hancock for PR counsel. Cantwell noted during last week’s panel that they had advised him to further simplify C4MI’s message. VNC

Via Venture Nashville »

Syncing health care tech is ‘moonshot and marathon’

Medical interoperability is health care’s Achilles heel that, unless solved, will put the industry in a backseat position while the government or Millennial innovators take the lead in figuring out solutions to a variety of tech problems, speakers at the panel said.

Health care technology, be it machines or software, is largely disconnected. It’s not the plug-and-play tech that allows people to save files on a USB drive to open on any computer.

Interoperability, noted moderator Ed Cantwell, executive director of the Center for Medical Interoperability, carries different meanings for different people and companies. In short, it’s the ability to connect, sync and harness technology, and the data it generates, with seamless exchange.

“You’ll know when you have it,” Cantwell said. “You’ll know when you don’t have it.”

The industry doesn’t have it.

Dr. Mike Schatzlein, Ascension Health’s senior vice president and group ministry operating executive, said one nurse described the intensive care unit, with all its unconnected machines and accompanying cords, as a “war zone.”

Outfitting a facility with technology is expensive, which Schatzlein said is counter to initiatives to push down the cost of care. Modernizing and integrating the tech system is going to be necessary for precision medicine — when treatment is tailored to a person’s genetic makeup.

Cantwell is leading the Center for Medical Interoperability, a new organization headquartered in Nashville that is building a lab and bringing together leaders of industry heavyweights to find a solution. Schatzlein is on the board of the center, along with HCA’s R. Milton Johnson, Vanderbilt University’s Dr. Jeff Balser, LifePoint Health’s William Carpenter III and Community Health Systems’ Wayne Smith.

Thinking about the future business model is challenging right now because success depends on harnessing the data, said Zane Burke, Cerner’s president.

“He who has the data or she who has the data will win,” said Burke.

“We would rather there be a private sector solution. … Whether you’re conservative or liberal you probably don’t want the government in this,” said Schatzlein, alluding to Big Brother concerns that could arise if the government is heavily involved in a solution.

Burke wants the industry to find a way to assign a single patient ID to each person so care is more easily matched to the right person. Federal efforts have thus far failed, he said.

A person’s medical record is strewn between many providers and facilities in different formats, ranging from paper to electronic files that don’t transfer. Burke said Cerner is passionate about easing the process of matching files and care to the person to improve treatment and outcomes as well as manage the information that will be necessary under a population health model.

“The industry has solved this in every other space,” said Burke. “It’s up to the collective ‘we’ to get our stuff together.”

Reach Holly Fletcher at 615-259-8287 or on Twitter @hollyfletcher.

Via The Tennessean »