Provider perspectives on how to effectively present information on quality

Information on nursing home quality has become increasingly available through online data sets such as Nursing Home Compare and trade journals such as Nursing Homes/Long Term Care Management. Literature suggests that researchers and government bodies collecting information on quality often struggle to present the data in a format that is useful to providers without becoming overwhelming or burdensome.1 A great deal of money and effort has been invested in generating knowledge that could potentially shape decisions about health and healthcare and help providers improve quality of care.2,3,4 However, a major concern about this knowledge transfer is that “take-home messages” are not always clear to intended audiences.2,5 Other reports on the use of publicly reported comparative information on health plans and hospitals indicate that providers, consumers, and regulators have paid little attention to comparative quality information, either in print or from online sources.6

Our research team at Benjamin Rose's Margaret Blenkner Research Institute in Cleveland, Ohio, conducted focus groups with providers to find out how they prefer to receive information on quality of care, satisfaction data, and related topics for quality improvement (QI) purposes.

Methods and Sample

We implemented a mixed methods approach using focus group methodology, as well as a structured questionnaire, to collect provider opinions.

Short survey. Before the start of the focus groups, participants completed a short questionnaire to collect information on background characteristics and obtain basic report and Web site preferences.

Focus groups. Two focus groups were convened with members of two Ohio trade associations. Each focus group discussion was tape-recorded and lasted about two hours. The first focus group was conducted with members of the Ohio Health Care Association (OHCA), the Ohio counterpart of the American Health Care Association, whose members were employed primarily by proprietary nursing homes. This focus group was held following a meeting of the association's Facility Standards Committee and Nursing Council in Columbus, and members from all over the state attended.

The second focus group was a special event organized specifically for this study to balance the first group. These providers were members of HealthRays Alliance, a consortium of 15 nonprofit long-term care organizations in Northeast Ohio. Alliance members also belong to AOPHA, the Ohio counterpart of the American Association of Homes and Services for the Aging.

Focus group structure. We developed open-ended questions to guide the discussion. We also created sample reports from a number of online reports, including information on quality indicators, deficiencies, and consumer satisfaction. Our data sources included the Nursing Home Compare Web site of the Centers for Medicare & Medicaid Services (CMS) and Ohio's Long-Term Care Consumer Guide, sponsored by the Ohio Department of Aging.

The sample reports were used to generate a discussion of impressions of current reporting strategies, how research data should be reported to providers, formats that providers found easiest to follow, and preferences for comparisons with other facilities. We asked whether providers needed technical assistance in using such information for QI purposes and which publications they read to learn about such findings.

Findings From the Short Survey

Background characteristics of participants. Twenty women and seven men attended the two focus groups, with an average age of 45.3 years; 54% were college graduates, 35% had a master's degree, and 12% had attended some college. Most were Caucasian, with one African-American participant. Their job titles included administrator or director of nursing, director of clinical services, and risk manager. Most of these job titles could be classified as administrative (63%) or nursing/clinical (37%) positions. Participants had worked in their current facilities an average of 6.3 years, but had been employed in the long-term care field an average of 18 years.

Background information on sites. Participants represented 18 different nursing homes—15 from organizations that were part of a chain—and 3 from organizations that managed, advocated, or provided consulting services for nursing homes in Ohio. The nursing homes and their umbrella organizations provided a broad range of services, including skilled nursing services, respite, hospice, subacute care, home care, assisted living, independent living, and dementia/Alzheimer's care. The organizations were located in suburban (35%), rural (30%), or urban (13%) settings, and some (22%) organizations had multiple types of locations.

Findings From the Focus Groups

Participants agreed that, to be most effective for providers and consumers, reports should be simple. They felt that tables with numbers and percentages were generally appropriate and easy to understand for most audiences. While they agreed that bar graphs were easy to understand, they felt that they wasted space. They also suggested that additional in-depth information may be useful and most appropriate in electronic formats, perhaps as a hyperlink on a Web page.

Ranking systems. Participants were asked if they wanted their facilities to be ranked against competitors for consumer reference, and if they had a preference between ranking methods (e.g., a “star” versus percentile ranking). There was an interesting divergence of opinion between not-for-profit and proprietary professionals on this subject: Generally, not-for-profit participants favored a ranking system and felt that a “star” system or a percentile rank is fairly easy for the public to understand. In contrast, proprietary professionals doubted that either researchers or the government could design a reasonable ranking system and felt such ranking systems would unfairly penalize facilities that specialize in frail or sick populations. They also felt that percentiles can be confusing to the general public.
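To make the tradeoff between the two formats concrete: a percentile rank reports a facility's exact standing, while a star system collapses that standing into a handful of tiers. The sketch below is purely illustrative; the scores and the quintile cut points are invented, not drawn from any actual rating system.

```python
# Hypothetical sketch of the two ranking formats discussed above:
# an exact percentile rank versus a coarser 1-5 star tier.
# All scores and cut points are invented for illustration.

def percentile_rank(score, all_scores):
    """Percent of facilities scoring at or below this score."""
    at_or_below = sum(1 for s in all_scores if s <= score)
    return 100.0 * at_or_below / len(all_scores)

def stars(pct):
    """Collapse a percentile into a 1-5 star tier (quintile cut points)."""
    return min(5, int(pct // 20) + 1)

scores = [55, 60, 62, 70, 71, 75, 80, 82, 90, 95]  # ten facilities
pct = percentile_rank(82, scores)  # 80.0: exact, but harder to interpret
tier = stars(pct)                  # 5 stars: coarser, but easier to read
```

Note how the star tier hides the difference between a facility at the 81st percentile and one at the 99th; the coarser format trades precision for readability, which is roughly the divide the two groups of participants fell on.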

Peer grouping. Proprietary and not-for-profit professionals also differed in their opinions on the presentation of comparisons based on “peer groups” of similar facilities rather than another standard, such as a statewide average. Not-for-profit nursing home participants were more likely to indicate a preference for comparisons with peer groups, and they also felt that researchers or surveyors were capable of creating accurate peer groupings.

Professionals from proprietary homes were doubtful that statistical procedures based on research findings could create accurate peer groups. All participants—both not-for-profit and proprietary—agreed that creating accurate peer groups could be complex because variations in location, levels and types of care provided, proximity to other facilities, and profit versus nonprofit status could influence how peer groups were defined. They agreed that long-term care providers were capable of making their own comparisons with publicly reported data from their competitors.

Producing overall summary scores. Providers were asked if they would prefer to receive a single, overall measure of quality as a quick-reference tool. This measure could be created from a variety of outcomes from different data sets, such as the MDS, OSCAR, and consumer reports. Participants did not like this idea and questioned how information from diverse sources could be summarized into a single useful and meaningful overall quality measurement score. They were concerned about how the overall score would be calculated and weighted and whether the information would be timely. They also questioned the value of such a complex score for benchmarking with other facilities, much less indicating where to focus their QI efforts. Instead they preferred to receive information from the various sources and domains independently.
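Participants' concern about weighting can be made concrete with a small numeric sketch. Everything below is hypothetical (the domain names, scores, and weights are invented, not taken from MDS, OSCAR, or any actual scoring scheme); the point is only that two facilities can swap rank order depending entirely on how the component domains are weighted.

```python
# Hypothetical illustration: an "overall quality score" built as a
# weighted average of domain scores, showing how the choice of weights
# alone can reverse which facility looks better. All values invented.

def overall_score(domains, weights):
    """Weighted average of domain scores (each on a 0-100 scale)."""
    total = sum(weights.values())
    return sum(domains[d] * weights[d] for d in domains) / total

facility_a = {"clinical": 90, "deficiencies": 60, "satisfaction": 70}
facility_b = {"clinical": 70, "deficiencies": 80, "satisfaction": 85}

heavy_clinical = {"clinical": 3, "deficiencies": 1, "satisfaction": 1}
heavy_satisfaction = {"clinical": 1, "deficiencies": 1, "satisfaction": 3}

a1 = overall_score(facility_a, heavy_clinical)      # 80.0
b1 = overall_score(facility_b, heavy_clinical)      # 75.0 -> A ranks higher
a2 = overall_score(facility_a, heavy_satisfaction)  # 72.0
b2 = overall_score(facility_b, heavy_satisfaction)  # 81.0 -> B ranks higher
```

A consumer comparing the two facilities would reach opposite conclusions under the two weighting schemes, which is essentially the objection participants raised to a single summary score.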

Advanced analysis strategies valued by providers. Participants were interested in researchers’ work highlighting the specific correlates of certain quality and satisfaction outcomes, as well as work that combined information from multiple sources into a single reported outcome. While the basic presentation of descriptive results was interesting, providers found an additional layer of analysis done by researchers to be intriguing and helpful for refining QI efforts.

For example, in analyzing the data from the Ohio Nursing Home Family Satisfaction Survey, researchers found that families who communicate with staff (social workers, nurses, administrators) are overall more satisfied, in contrast to families who do not communicate as much. Additional findings showed that family assistance with residents’ activities of daily living (ADLs) is negatively correlated with family satisfaction. The more intimate the care provided by families when they visit, the greater the family dissatisfaction.7 Providers could see ways to use this type of information to design in-house programs that would improve communication between staff and families—for example, to increase consumer satisfaction—based on these more targeted analyses.

Researchers also merged cost report data with Ohio's family satisfaction data and found that greater nursing home expenditures on RN and LPN agency hours were negatively correlated with family satisfaction. In contrast, greater nursing home expenditures for direct care staff and fringe benefits were positively related to family satisfaction.8 Providers were surprised to find that staff recruitment and retention policies could have an effect on family satisfaction. They were excited to learn about such relationships, indicated that they would like to learn more about such findings, and said that such findings had clear, direct implications for changing practices.

Additional information desired. Providers indicated interest in finding out if other characteristics of resident and family respondents (e.g., educational level, previous experiences/extent of information, and personalities) were likely to affect consumer satisfaction results. They were curious whether presenting satisfaction scores in the context of a facility's case-mix score or an overall measure of resident impairment would help consumers understand why satisfaction levels might vary between facilities that care for frail and impaired residents and facilities dealing with a less impaired population. They also were excited about the potential for combining data from different sources and conducting a more in-depth analysis to understand how to use the findings for QI purposes.

Reading materials. Participants were interested in reading brief research reports and summaries published in trade association and professional publications they read regularly. They further believed that research findings and their implications for practice should be published in widely read professional magazines and newsletters.


Discussion
Providers wanted information presented in a simple manner. They were very clear about the kinds of information they found useful, liked information displayed in simple charts and graphics, and were comfortable reviewing tables with numbers and percentages.

On the other hand, participants had mixed feelings about using ranking systems. The group representing nonprofit facilities was more comfortable with ranking systems than their counterparts from proprietary facilities, who doubted the ability of researchers and regulatory agencies to devise a fair method of ranking facilities based on the type of populations they served. Participants from nonprofit facilities were more comfortable with peer grouping than those from proprietary facilities for similar reasons. The latter were more comfortable with comparing their own data with those of their competitors.

Participants, in general, were not in favor of creating an overall quality score and doubted its usefulness and value. They preferred receiving information from different sources independently. However, they were interested in understanding relationships among multiple data sets, especially when those relationships had clear implications for helping them change their practices and policies. Suggestions were offered for future research topics that could be helpful for QI efforts, including information on family characteristics and expectations before nursing home placement.


Findings from these focus groups have important implications for how researchers can present research data so that providers view them more favorably and find them more useful. Future research is needed to determine if other nonprofit and for-profit providers differ significantly on some of the questions asked in this project. Because our study was limited to only one group of each type of provider (proprietary and nonprofit), had representatives of only 18 of Ohio's approximately 970 nursing homes, and was conducted only in Ohio, it should be replicated with a larger sample of providers from different parts of the country. Findings from such a study could enable CMS, state agencies, regulatory bodies, and academic researchers to determine the best way to present research findings.

Researchers and technical consultants might also consider providers’ interest in deeper analysis of existing data and combining data sources to allow new analysis of previously unexplored relationships. QI organizations might find that providing this kind of technical resource to providers will be well received. Providers also voiced an interest in new avenues of study into their clients’ backgrounds and said that they would find pertinent information useful in their work. Further studies based on this experience will help narrow the gap between academic research and provider preferences for receiving research information to improve the quality of care in our long-term care organizations.


This project was supported by a grant from The Commonwealth Fund. The authors would like to thank the following for their support and assistance: Mary Jane Koren, program officer, The Commonwealth Fund; Kathleen Fox, senior research analyst, and Justin Johnson, research assistant, Margaret Blenkner Research Institute, Benjamin Rose; Jane Straker, Scripps Gerontology Center, Miami (Ohio) University; Kathy Chapman and Barbara Morgan, the Ohio Health Care Association; Cheryl McMillan, HealthRays Alliance; and Ferdinand Yomoah, doctoral candidate, University of Akron.

Note: The authors have no commercial associations that create a conflict of interest.

Farida Kassim Ejaz, PhD, is Senior Research Scientist II and Brenda Peters, MA, is Research Analyst at the Margaret Blenkner Research Institute, Benjamin Rose, in Cleveland. For more information, phone (216) 373-1660 or visit

To send your comments to the authors and editors, e-mail


  1. Owen P. Clinical practice and medical research: Bridging the divide between the two cultures. British Journal of General Practice 1995;45(399):557-60.
  2. Crofton C, Darby C, Farquhar M, Clancy C. The CAHPS Hospital Survey: Development, testing, and use. Joint Commission Journal on Quality and Patient Safety 2005;31(11):655-9, 601.
  3. Elixhauser A, Pancholi M, Clancy CM. Using the AHRQ quality indicators to improve health care quality. Joint Commission Journal on Quality and Patient Safety 2005;31(9):533-8.
  4. Ejaz FK, Straker JK, Fox K, Swami S. Developing a satisfaction survey for families of Ohio's nursing home residents. The Gerontologist 2003;43(4):447-58.
  5. Feldman PH, Nadash P, Gursen M. Improving communication between researchers and policy makers in long-term care: Or, researchers are from Mars; policy makers are from Venus. The Gerontologist 2001;41(3):312-21.
  6. Hibbard JH, Stockard J, Tusler M. Does publicizing hospital performance stimulate quality improvement efforts? Health Affairs 2003;22(2):84-94.
  7. Straker J, Ejaz FK. Keeping the customer satisfied: Correlates of family satisfaction with Ohio nursing homes. Symposium presented at the 56th Annual Conference of the Gerontological Society of America, San Diego, November 2003.
  8. Ejaz FK, Straker JK, Fox K. Using Information on Quality to Improve Nursing Home Care. Final report submitted to The Commonwealth Fund; June 15, 2005.
