Measuring clinical decision-making skills

In late 2008, I was grappling with the problem of prioritizing in-house and continuing education training programs for our 100 therapists scattered across seven metropolitan areas in four states. Since our goal as a company is to provide consistently high-quality therapy across all locations, I am constantly looking for better ways to identify which skills and techniques work best and then share them company-wide. One specific goal I had in mind was to achieve greater consistency in the decision-making criteria each discipline uses in the therapy admission and discharge process.

Our Area Directors are expected to visit each site regularly to observe and coach all therapists. However, because they have no control over the types of patients they will be observing, it takes many visits before they can obtain a complete picture of a single staff person’s training needs. The information on training needs that they bring back over the course of a year is therefore limited and does not yield enough actionable data to let us rank the training programs the company should be investing in. The travel expenses for these teaching and observation visits can also be substantial.

In a “previous life” as the head of strategic planning for Lexis (the legal research database company), I had several professional market researchers reporting to me. I learned from them that companies like Procter & Gamble had long ago perfected interviewing techniques for identifying and categorizing the thought processes consumers use when making purchasing decisions. Thinking back to my days at Lexis, I postulated that if I could capture and study the thought processes my therapists used when admitting and discharging patients, I could see where there were wide variations in clinical decision making. With this information in hand, I could then efficiently prioritize training programs to bring about the decision-making consistency I was looking for.

The other problem I was trying to solve was identifying scientific tools that could make admission and discharge decisions as objective as possible. As all of us in the long-term care industry know, there are constant debates between therapy and administration when it comes to admitting and discharging patients. In an ideal world, administration would have every resident with Medicare Part A benefits on therapy caseload for as long as possible. Therapists, on the other hand, have many professional and regulatory concerns about keeping someone on caseload too long. If I could find and implement assessment tools that both therapy and administration agreed upon, the admission and discharge process would become more of a science, which would reduce the frequency and length of these debates.

Since I could not find any evidence that the technique of interviewing therapists had been used successfully in the healthcare industry before, there was a risk of failure. To keep costs and risks low, I decided to first try these techniques with my smallest discipline: my 15 SLPs (Speech-Language Pathologists). The other benefit of starting with the SLPs is that the rehab directors who managed them (mostly OTs and PTs) did not have a detailed understanding of their clinical thought processes, and I hoped that seeing the answers to these structured questions would better prepare those supervisors to support them.

As I had learned when structuring interviews with busy attorneys while at Lexis, questions posed to highly educated individuals about complex thought processes must be worded carefully to avoid ambiguous answers. These questions, and the resulting answers, must be thoroughly tested with multiple sample groups before launching a full-scale project. Market research is much like information processing: garbage questions will only yield garbage answers.

With this past experience in mind, I began working on a 20-question interview with two of my SLPs. One was on maternity leave and had a block of time each week to devote to this project. The other was our company’s most experienced SLP; over the previous four years she had been a clinical instructor for three students and had supervised two new graduates through their clinical fellowship year. Over the course of several conference calls, we experimented with various types of open-ended questions, hoping to avoid leading the respondent to the right answer while still extracting long, detailed responses. Since these two SLPs had never built a market research instrument before and I had never admitted or discharged a patient, all three of us learned a lot in these highly interactive discussions.

We also debated whether to use live interviews or to have respondents read a written survey and write down their answers. Since we wanted top-of-mind answers that simulated the fast-moving environment of a skilled nursing facility, we decided to have a live person ask the questions and record the answers online, rather than give the respondents time to dwell on an issue and possibly give an answer that did not reflect their true thought processes. For example, since we wanted to find out who was using formal cognitive tests, as well as who may not be using tests at all, we deliberately never mentioned the word “test.” Instead, we repeatedly asked the following, inserting various diagnoses into the same question: “When you evaluate someone for possible addition to your caseload, what do you look for in the area of (diagnosis)?”

This same type of open-ended, non-leading question was also used when exploring discharge decision-making criteria, as follows: “When deciding if someone you are treating for (diagnosis) is ready for discharge, what are the criteria you use?”

We also ended the survey with an open-ended question asking for feedback; it read: “How can your management better support you?”
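For readers who want to assemble a similar instrument, the templating idea is simple enough to sketch in a few lines of Python. The diagnoses listed below are purely illustrative, not the ones we actually used; only the question wording comes from our interview.

```python
# Illustrative sketch of how the open-ended questions were parameterized by
# diagnosis. The diagnosis list here is hypothetical, not our actual list.
ADMISSION_TEMPLATE = (
    "When you evaluate someone for possible addition to your caseload, "
    "what do you look for in the area of {diagnosis}?"
)
DISCHARGE_TEMPLATE = (
    "When deciding if someone you are treating for {diagnosis} is ready "
    "for discharge, what are the criteria you use?"
)
FEEDBACK_QUESTION = "How can your management better support you?"

diagnoses = ["dysphagia", "aphasia", "dysarthria", "cognitive-linguistic deficits"]

def build_interview(diagnoses):
    """Return the ordered list of questions the interviewer reads aloud."""
    questions = [ADMISSION_TEMPLATE.format(diagnosis=d) for d in diagnoses]
    questions += [DISCHARGE_TEMPLATE.format(diagnosis=d) for d in diagnoses]
    questions.append(FEEDBACK_QUESTION)
    return questions

if __name__ == "__main__":
    for number, question in enumerate(build_interview(diagnoses), start=1):
        print(f"{number}. {question}")
```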

After more fine-tuning of the questions, I had the SLP on maternity leave interview one of our PRN SLPs; this initial interview took 30 minutes. Any question that yielded an ambiguous answer was rewritten, a few questions were dropped, and several more were added. We then conducted a second test interview with a full-time SLP. This second version of the questions yielded more meaningful results, so we decided to launch company-wide and made interview appointments with the other 15 full-time SLPs. The final work product I had in mind was a document with each question at the top of the page and all of the answers below it.
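The collation step itself is mechanical. As a rough illustration, assuming the interviewer’s notes were exported as simple records with respondent, question, and answer fields (the file name and column names below are hypothetical; our answers were actually recorded in an online form), something like the following would produce that one-question-per-page view:

```python
# Hypothetical sketch: collate interview answers so that each question appears
# once, followed by every respondent's answer. File and column names are
# illustrative only.
import csv
from collections import defaultdict

def collate(path="interview_answers.csv"):
    """Group answers by question, preserving the order questions were asked."""
    answers_by_question = defaultdict(list)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):  # columns: respondent, question, answer
            answers_by_question[row["question"]].append(
                (row["respondent"], row["answer"])
            )
    return answers_by_question

def print_report(answers_by_question):
    for question, answers in answers_by_question.items():
        print(question)
        print("=" * len(question))
        for respondent, answer in answers:
            print(f"- {respondent}: {answer}")
        print()  # blank line standing in for a page break between questions

if __name__ == "__main__":
    print_report(collate())
```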

What we learned…what we did

Viewing 15 sets of answers to the same question on the same page made it quickly apparent in which clinical areas our SLPs were not using similar decision-making criteria or a consistent set of tools. The training need that percolated to the top of the list was cognitive assessment. Based on the large variance in knowledge and techniques in this area, we decided to focus on cognitive assessment and treatment tools that Occupational Therapy and Speech could both use, while also exploring the other areas the interviews highlighted. Our one-year action plan was as follows:

  • We established semi-annual, SLP-only audio-video conference calls in which one SLP (or an invited guest) conducts an in-service addressing the areas the interviews identified as either inconsistent or in high demand. This allows a few of our SLPs with specific subject expertise to gain peer recognition by presenting these in-services company-wide. It also enables our SLPs in different parts of the country to connect with each other as clinical sounding boards. Our first company-wide in-service was an overview of the use of the cognitive assessment tools mentioned above. The second covered techniques for helping tracheotomy patients with Passy-Muir valves regain speech skills.

  • We sent six SLPs to the annual ASHA Conference with the assignment to attend every possible geriatric session and find new techniques and processes. We especially wanted to make sure we had the latest tools that enable therapists to obtain an assessment score for the level of support a patient would receive when discharged to a particular environment. These SLPs were also asked to recommend topics and/or speakers they had evaluated at ASHA for the above-mentioned conference calls within their discipline. Although every assessment tool covered at the conference was already known and/or used by our people, we did pick up useful information on many other new techniques.

  • We hired an SLP consultant who was experienced at administering the Joint Commission’s annual skills assessment process and began scheduling face-to-face sessions for all SLPs to meet with this person. Although we do not plan to seek Joint Commission accreditation, we saw value in adopting similar processes.

  • At the same time, we enhanced our existing cognitive assessment training materials with success stories from those SLPs who were obtaining quantifiable results for our clients, and our CEO (also an SLP) gave this in-service to the teams that appeared to need it the most.

  • We began educating our administrators, DONs, and social workers on our training plans to obtain their agreement that the measurements produced by these tools should be an integral part of the inter-disciplinary discharge discussion.

  • Since my management training at General Electric taught me that “if you can’t measure it, you can’t manage it,” we implemented a way to measure how often the above cognitive assessment tools are actually being used. We started requiring that each time a patient is cognitively assessed, the test name, along with the premorbid, start-of-care, and end-of-care scores, be entered into our server-based treatment and billing system. Each month, management reviews the list of therapists and the cognitive tests they used, and those whose names are absent receive gentle encouragement to sign up for retraining (a rough sketch of this monthly check appears after this list).

  • Now that the PTs and OTs know we have used structured interviews with the SLPs and have a consultant administering a Joint Commission-type annual skills test, they feel neglected. To keep them happy, we will someday need to build a structured interview instrument for them and start looking for consultants in their disciplines with experience administering skills assessments. However, since we have more than 50 OTs and PTs, I plan to save money and use written surveys to capture their decision-making processes rather than live interviews.
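Conceptually, the monthly usage check described above is just a comparison of the active therapist roster against the cognitive-test entries pulled from the treatment and billing system. The sketch below assumes two hypothetical CSV exports with made-up column names; our actual system is a proprietary server-based application, so this is illustrative only.

```python
# Hypothetical sketch of the monthly "who is actually using the cognitive
# tests" report. Assumes two CSV exports from the treatment/billing system;
# file and column names are illustrative, not our real schema.
import csv

def load_rows(path):
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))

def monthly_usage_report(roster_path="therapists.csv",
                         assessments_path="cognitive_assessments.csv"):
    therapists = {row["name"] for row in load_rows(roster_path)}
    # Each assessment row records: therapist, test_name, premorbid_score,
    # start_of_care_score, end_of_care_score.
    tests_used = {}
    for row in load_rows(assessments_path):
        tests_used.setdefault(row["therapist"], set()).add(row["test_name"])

    for name in sorted(therapists):
        tests = tests_used.get(name)
        if tests:
            print(f"{name}: {', '.join(sorted(tests))}")
        else:
            print(f"{name}: no cognitive assessments logged (flag for retraining)")

if __name__ == "__main__":
    monthly_usage_report()
```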

Conclusions

We drew several strong conclusions from this effort:

  1. These types of interviews are much less expensive and much more thorough than paying someone to observe a therapist while treating a patient and then question the therapist’s decision process afterwards. We were pleased with how the format allowed us to quickly see which decisions were being made in a consistent fashion across all respondents and which were not.

  2. We worried too much about using a live interview versus a written instrument. In the future we will use a written instrument administered in a supervised group setting. We feel that writing answers under supervision should discourage the sharing or fabrication of answers just as effectively as a live interview, and we anticipate that this will make the process even more cost-effective.

  3. This process may also be a viable method of identifying variations in the clinical decision-making processes of other types of healthcare professionals, including nurses and doctors.

Robert Kunio is the Chief Operating Officer of SimplyRehab LLC, a contract therapy provider with operations in Illinois, Colorado, Utah, and North Carolina. Prior to joining SimplyRehab, he was an instructor at the Lake Forest Graduate School of Management and an independent consultant. He has successfully applied management and analysis techniques he learned earlier in his career at GE, IBM, and Lexis to the long-term care industry. He can be contacted via e-mail at RobertK@SimplyRehab.com.


Long-Term Living 2010 March;59(3):36-39

