Data-driven decisions

The Affordable Care Act provided a clear wake-up call to long-term and post-acute care (LTPAC) providers that they had to do a better job of measuring and reporting on quality. Looking ahead to 2016, both provider organization executives and software vendors say their efforts will continue to shift from using data analytics to maximize reimbursement under the fee-for-service system toward a focus on achieving better outcomes and partnering with others in the care continuum.

Acute-care hospitals, for their part, are taking a data-driven approach to post-discharge efficiency, comparing post-acute providers based on clinical and operational performance metrics. With the penalty structure in place for preventable readmissions, they can’t afford not to.

The transition to value-based payment is shifting the conversation to the measurement of success, according to Steven Littlehale, chief clinical officer at Cambridge, Mass.-based PointRight Inc., which provides predictive analytics services to post-acute providers.

That type of measurement, however, has never been a strong point of the skilled nursing facility (SNF) sector. “To this day, we do not know how much it costs to care for certain patient types within our SNF environments. We have no idea,” Littlehale says. “That is just one example of how the absence of data has been such a long-term issue in our industry. We can’t demonstrate the value we have. We can’t demonstrate we are an exceptional provider of care.” But these days, he adds, you must have the data to justify your existence.

Providers today need to know how they perform on outcome measures relative to peers and national benchmarks, notes John Damgaard, CEO of Bloomington, Minn.-based MatrixCare, an electronic health record (EHR) provider for the LTPAC setting. “More importantly, if they are failing in a particular area, why is that? Overall prescribing of anti-psychotics might be OK, but if you dig down and region one of nine is off, then you can drill down to line of service, and see that it is one doctor who seems to be prescribing 10 times more than anyone else.”
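A drill-down like the one Damgaard describes is straightforward to express in analysis code. The sketch below uses invented data and column names (region, line_of_service, prescriber) purely for illustration; it is not MatrixCare’s schema or reporting logic:

```python
# Illustrative drill-down on antipsychotic prescribing rates (hypothetical data).
import pandas as pd

orders = pd.DataFrame({
    "region": ["1", "1", "1", "2", "2", "3"],
    "line_of_service": ["SNF", "SNF", "ALF", "SNF", "ALF", "SNF"],
    "prescriber": ["dr_a", "dr_a", "dr_b", "dr_c", "dr_d", "dr_e"],
    "on_antipsychotic": [1, 1, 0, 0, 0, 0],
})

overall_rate = orders["on_antipsychotic"].mean()            # looks acceptable in aggregate
by_region = orders.groupby("region")["on_antipsychotic"].mean()

# Drill into the outlier region, then down to line of service and individual prescriber.
outlier_region = by_region.idxmax()
detail = (orders[orders["region"] == outlier_region]
          .groupby(["line_of_service", "prescriber"])["on_antipsychotic"]
          .mean()
          .sort_values(ascending=False))

print(overall_rate, by_region, detail, sep="\n")
```

The pattern is the same regardless of the metric: aggregate first, then slice by region and prescriber, which is what turns an acceptable-looking overall rate into an actionable outlier.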

Regarding the shift in relationships with hospitals, Damgaard says one LTPAC customer put it to him this way: “Five years ago we used to take flowers and cookies to the hospital to get their referrals. Now we take our charts and graphs. Tomorrow we’re going to take our contract lawyers, because we are going to be signing into bundled payments and other types of capitated arrangements with our partners in the spectrum of care.”

Retrospective vs. real-time data

Now that many providers can pull clinical data from the EHR, they are starting to access data that is fresher than the retrospective information provided by the Centers for Medicare & Medicaid Services. “We believe the data has got to be near real-time,” Damgaard says. “It is kind of interesting what happened 60 days ago, and it might help shape behaviors and improve processes, but 60 days is a long time in today’s world. You may have spent a quarter-million dollars of margin you didn’t need to spend in the last 60 days just in one practice area.”

Littlehale stresses that providers need both retrospective and near-real-time data. For quality improvement analysis, it doesn’t matter if your data is six months old. “The data is not changing that dramatically,” he says. If you look at re-hospitalization rates over the last year, you are not going to see a lot of fluctuation. Any downward trend is going to be slight at best, and that is just the nature of this kind of data. “But when it comes to moving the needle, you need real-time data,” he adds. “If you are trying to make changes at the bedside or decisions about acquisition or divestiture, having real-time data points can help you make better decisions.”

In general, Littlehale says, LTPAC providers have made gradual progress. “I think we are maturing to realize that there is a difference between data and analytics and a difference between analytics and insight. But if you think of that as a hierarchy, most providers are at that bottom level, saying OK, I need to have data.” But having that data is not insight. “That requires someone to manipulate the data to have these aha moments, and the majority of people in the SNF space don’t have the skills to manipulate the data, let alone case-mix adjust it and come up with the appropriate benchmarks.”

Some provider organizations turn to outside help for benchmarking. For instance, The Advisory Board Company’s SNF Benchmark Generator tool is designed to help member SNFs position themselves in relationships with health systems under bundled payment. Accordingly, the benchmark generator focuses on utilization and financial metrics: length of stay, payments and volume.

“Depending on how the organization wants to initiate the conversation with the hospital, this tool gives our members the ability to benchmark against a cohort of post-acute providers on length of stay,” says Jared Landis, an Advisory Board practice manager. “They can tell hospitals ‘we are cognizant of length of stay and not abusing the Medicare system. We are keeping them an appropriate amount of time, as measured against the cohort.’”
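A minimal sketch of that kind of cohort benchmarking appears below. The cohort figures, the facility’s average and the simple percentile calculation are all assumptions for illustration, not the Benchmark Generator’s actual data or methodology:

```python
# Sketch of benchmarking a facility's average length of stay against a peer cohort
# (hypothetical numbers throughout).
import statistics

cohort_avg_los_days = [31.2, 27.5, 25.9, 33.8, 29.4, 26.1, 30.7, 28.3]
our_avg_los_days = 26.8

# Share of the cohort with a longer average length of stay than ours.
shorter_than_us = sum(1 for los in cohort_avg_los_days if los > our_avg_los_days)
percentile = 100 * shorter_than_us / len(cohort_avg_los_days)

print(f"Our average LOS: {our_avg_los_days} days "
      f"(shorter than {percentile:.0f}% of the cohort; "
      f"cohort median {statistics.median(cohort_avg_los_days):.1f} days)")
```

In practice a benchmarking tool would also case-mix adjust the comparison, but even a raw percentile gives a provider a concrete talking point with hospital partners.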

Landis says that most post-acute providers don’t have the expertise or an internal team dedicated to analytics work. “Obviously a big chain can benchmark its own facilities internally and get that perspective across its own assets, and they are more likely to have employees who do this type of work,” he explains. “We are creating a shortcut for these organizations in terms of analytics they could do in house but don’t need to.” One trend in analytics, he adds, is independent post-acute care networks of providers coming together to affiliate under an LLC to share resources, including analytics, which gives them the opportunity to have capabilities they couldn’t create on their own.

A journey, not a destination

The business intelligence effort began eight years ago at Life Care Centers of America (LCCA), which operates more than 200 skilled nursing, assisted living, retirement living and Alzheimer's care facilities in 28 states. Mark Gooch, Cleveland, Tenn.-based LCCA’s director of business intelligence, says some software vendors offer dashboard views of the data in their systems, but that from a true business intelligence standpoint, the real value is to be able to take data from multiple sources, combine them and create something you didn’t have before.

LCCA started with a balanced scorecard tool. It combined financial, clinical, labor, census and human resources data to produce quarterly reports. “Quarterly is obviously too slow to make any management decisions on,” Gooch says. “It involved manual entry of data and accuracy was a problem.” LCCA then built a data warehouse, with automated integration every hour, night, week or month depending on the data source. “We have been able to streamline that process, so there is not a lot of human effort in it. I think we are making better decisions and our data quality is getting better,” he explains.
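How such a warehouse decides when to re-pull each feed can be sketched very simply. The source names and intervals below are illustrative assumptions, not LCCA’s actual configuration:

```python
# Sketch: per-source refresh cadences for loading a data warehouse.
from datetime import timedelta

REFRESH_SCHEDULE = {
    "census":       timedelta(hours=1),   # fast-moving operational data
    "clinical_ehr": timedelta(days=1),    # nightly extract
    "labor":        timedelta(weeks=1),
    "financials":   timedelta(days=30),   # monthly close
}

def is_due(source: str, hours_since_last_load: float) -> bool:
    """Return True if the source should be re-extracted into the warehouse."""
    return timedelta(hours=hours_since_last_load) >= REFRESH_SCHEDULE[source]

print(is_due("census", 2), is_due("financials", 48))  # True, False
```

The point is that cadence is set per source, so fast-moving operational data stays fresh without re-extracting slow-moving monthly feeds.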

Gooch describes the business intelligence effort as more of a journey than a destination. “It is not a project you can knock out and say we are done,” he says. “By the time it is done, you should be working on the next iteration. If we were tracking the same metrics now that we tracked 10 years ago, that would be awesome. But I am constantly being asked for new ones. A healthy data analytics initiative in any company should be living and breathing.”

The promise of predictive analytics

One bright spot is the use of predictive analytics to identify areas that require more resources, such as residents at risk of readmission. Dan Hogan built his company, Nashville, Tenn.-based Medalogix LLC, based on the capability to use data and algorithms to identify residents more likely to be readmitted to the hospital. Then customers requested a tool to predict which residents are nearing the end of life and might benefit from a transfer to hospice or home health services.

Hogan’s team went back to the drawing board, taking the same data set it had been working with for readmissions, and analyzing it for patient mortality. “Right out of the gate the model was as predictive as, if not more predictive than, the readmission tool we had built,” he says. Hogan stresses that the tool is about identifying an elevated probability within a 180-day window, and focusing the attention of clinical resources so they can have that conversation early. “It is a clinical decision support aid,” he adds. “There is nothing black and white about it. It is all about elevated probabilities.”
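In the spirit of what Hogan describes, a risk-scoring sketch might look like the following. The features, weights and threshold are invented for the example; this is not Medalogix’s model, only an illustration of surfacing an “elevated probability” for clinician review:

```python
# Illustrative 180-day risk scoring: flag elevated probabilities for an early
# care-planning conversation. Features, weights and threshold are hypothetical.
import math

WEIGHTS = {"intercept": -3.0, "age_over_85": 1.1, "recent_hospitalization": 0.9,
           "weight_loss": 0.8, "declining_adl_score": 1.2}
ELEVATED_THRESHOLD = 0.30  # a prompt for review, not a black-and-white determination

def risk_180_day(patient: dict) -> float:
    """Logistic score: probability of the event within a 180-day window."""
    z = WEIGHTS["intercept"] + sum(WEIGHTS[f] * patient.get(f, 0)
                                   for f in WEIGHTS if f != "intercept")
    return 1 / (1 + math.exp(-z))

patient = {"age_over_85": 1, "recent_hospitalization": 1, "declining_adl_score": 1}
p = risk_180_day(patient)
if p >= ELEVATED_THRESHOLD:
    print(f"Elevated 180-day risk ({p:.0%}); consider an early care-planning conversation.")
```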

Rolled out in 2013, the tool is used by 21 of the 50 largest home healthcare companies in the country, he says. “What we have seen on average is a 30 to 35 percent lift in hospice days. We are identifying more patients who are hospice-appropriate and we are identifying them earlier.”

MatrixCare’s Damgaard notes that his company has data on 6.5 million unique lives across 10 years. “We know when an 87-year-old with two particular chronic conditions walks in the door, he has a 72 percent chance of a fall in the first 10 days. The data tells us that,” he says. “A standard intervention is to invoke our clinical decision support and bring the Johns Hopkins Fall Risk Assessment into play. That is a standard workflow that our system will suggest to the nurse.”
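A hedged sketch of that kind of trigger is shown below; the matching rule is a simplified assumption for illustration rather than MatrixCare’s actual clinical decision support logic:

```python
# Sketch: suggest a fall-risk assessment workflow when an admission matches a
# high-risk profile. The rule here is a simplified, hypothetical stand-in.

def suggested_workflows(age: int, chronic_conditions: list[str]) -> list[str]:
    suggestions = []
    if age >= 85 and len(chronic_conditions) >= 2:
        # Historical data (per the article) shows a high probability of a fall
        # in the first 10 days for this kind of profile.
        suggestions.append("Johns Hopkins Fall Risk Assessment")
    return suggestions

print(suggested_workflows(87, ["CHF", "diabetes"]))
```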

In addition to predictive abilities, providers want to get more proactive in their use of analytics. PointRight’s Littlehale says more users now want to get beyond basic dashboards and be able to do additional manipulation themselves, or they want to focus on a certain outcome in a certain market. “So they are looking to customize their analytics for that, and that is completely appropriate,” he adds. “More and more, we are seeing those types of requests coming, so folks are maturing in terms of how to use analytics and what they can do for you.”

Philadelphia-based David Raths is senior contributing editor for Long-Term Living's sister-brand, Healthcare Informatics.