What a "Bounce Back" Actually Is
You've just spent three weeks in a nursing home or rehab facility recovering from a hip replacement, a stroke, a bout of pneumonia. The discharge planner says you're ready. You go home — or maybe to assisted living. Then, within 30 days, something goes wrong. You're back in the emergency room. Back in the hospital.
That's a bounce back. Clinically it's called a 30-day unplanned readmission. And it's one of the clearest signals we have that something failed in the care transition.
Bounce backs matter because the consequences compound fast. Patients who are readmitted within 30 days lose ground they fought hard to gain — physically, cognitively, emotionally. Families who thought the worst was over find themselves blindsided, often without a plan. The costs — to patients, families, and Medicare — are substantial. CMS has estimated preventable readmissions cost Medicare billions annually.
For a family choosing a nursing home, a facility's readmission rate is one of the most direct answers to the question: "Will my parent actually get better here?"
The Problem With the Number Most Sites Show
If you've browsed nursing home comparison sites before, you've probably seen a rehospitalization percentage somewhere on a facility page. What most sites don't tell you is where that number comes from — and why the source matters enormously.
The most commonly displayed figure comes from the MDS: the Minimum Data Set, a federally mandated assessment that facilities fill out for every resident. When a patient gets readmitted to the hospital, the facility is supposed to record it. That record flows into CMS's public data and gets republished as a quality measure.
The problem is straightforward: the facility is reporting on itself.
Coding incentives are real in healthcare. A skilled nursing facility that documents a readmission may face scrutiny, lower star ratings, and payment consequences. Facilities don't have to fabricate data outright — subtle variation in how a transfer is classified, or in whether a rehospitalization is captured in a particular assessment window, can move the number. Researchers have documented gaps between MDS self-reported rates and what Medicare billing records actually show. For the same facility, in the same year, the two measures can differ by five to ten percentage points.
That gap isn't noise. It represents real patients whose readmissions may or may not be reflected in the number a family is using to make a decision.
Why We Chose SNFRM
The Skilled Nursing Facility Readmission Measure — SNFRM, NQF #2510 — is different in two important ways: it's claims-based and it's risk-adjusted.
Claims-based means the data comes from actual Medicare billing records, not from what facilities report about themselves. When a Medicare fee-for-service patient is admitted to a hospital within 30 days of a SNF discharge, that's captured in a claims transaction — a financial event that flows through CMS whether the originating facility likes it or not. The facility doesn't code it. They don't choose whether to include it.
Risk-adjusted means the measure accounts for the fact that some facilities take harder cases. A SNF that primarily serves patients recovering from complex cardiac surgery will see more readmissions than one that serves straightforward hip replacements — not because it provides worse care, but because the patients are sicker. SNFRM uses a validated statistical model to adjust for patient case mix, so you can compare facilities more fairly.
This fairness issue is mostly invisible on other sites. Raw readmission rates punish facilities that accept high-acuity patients. Risk-adjustment is the difference between measuring care quality and measuring patient selection.
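The arithmetic behind risk adjustment can be sketched in a few lines. This is a minimal illustration of the common observed-to-expected approach, not the actual SNFRM model (which is a logistic regression over many clinical covariates); the facility figures and per-patient probabilities here are hypothetical.

```python
# Risk standardization, sketched: scale a facility's raw performance by how
# many readmissions its case mix would have predicted. All numbers hypothetical.

def risk_standardized_rate(observed_readmits, predicted_probs, national_rate):
    """Observed-to-expected ratio times the national rate."""
    expected = sum(predicted_probs)  # readmissions the risk model predicts for this case mix
    return (observed_readmits / expected) * national_rate

# Facility A takes hard cases: 30 actual readmissions, but its case mix
# predicted 40. Its standardized rate lands BELOW the national average.
rate_a = risk_standardized_rate(30, [0.20] * 200, national_rate=0.178)

# Facility B takes easy cases: 30 actual readmissions against an expected 25.
# Same raw count, but its standardized rate lands ABOVE the national average.
rate_b = risk_standardized_rate(30, [0.125] * 200, national_rate=0.178)
```

Same raw readmission count, opposite conclusions once case mix is accounted for. That is the point of the measure.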
There's a third reason we trust SNFRM: CMS puts money on it. Through the SNF Value-Based Purchasing (VBP) program, CMS adjusts up to 2% of a facility's Medicare payments based on SNFRM performance. When a federal agency is willing to move hundreds of millions of dollars based on a measure, it has been through more scrutiny than almost anything else in healthcare quality measurement. SNFRM is also endorsed by the National Quality Forum (NQF), the independent body that vets quality measures for clinical validity and scientific soundness.
We want families to see the number CMS is already using to pay facilities — not a softer self-reported alternative.
The Cross-Validation Stack We Show Alongside SNFRM
No single metric is enough. We know that. So we layer in additional measures to catch what SNFRM misses and to fill the gaps between annual data refreshes.
MDS short-stay rehospitalization rate. This is the self-reported number we just described — the one with potential coding issues. We still show it, clearly labeled as MDS-sourced. Why? Because it updates quarterly, which means it's more current than the annual SNFRM figure. When those two numbers are close, that's reassuring. When they diverge significantly, that's a signal worth flagging.
Hospitalizations per 1,000 long-stay resident days. SNFRM covers the post-acute (short-stay) population — patients recovering from a hospital stay. But many residents in nursing homes are there for the long term: people with dementia, Parkinson's, or other conditions that make independent living unsafe. This measure tracks whether that custodial population is also being sent to the hospital at unusually high rates. It captures a completely different group that SNFRM doesn't touch.
Discharge to Community rate. This one is subtle. SNFRM's 30-day clock starts at discharge from the prior hospital stay, so it only counts readmissions inside that window. A facility that keeps patients past day 30 before releasing them — regardless of clinical readiness — can mechanically improve its SNFRM score without actually providing better care. Discharge to Community tracks what fraction of residents successfully return home or to a community setting, which helps us catch this pattern.
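The 30-day window mechanics can be made concrete with a little date arithmetic. This is a sketch of the window described above, not the full SNFRM specification (which has additional inclusion and exclusion rules); the dates are hypothetical.

```python
from datetime import date

def counts_toward_snfrm(prior_hospital_discharge: date, readmission: date) -> bool:
    """True if an unplanned readmission lands inside the 30-day SNFRM window,
    which starts at discharge from the prior hospital stay."""
    return 0 <= (readmission - prior_hospital_discharge).days <= 30

# Patient leaves the hospital March 1 and enters the SNF.
# Readmitted on day 24: counts against the facility's SNFRM rate.
in_window = counts_toward_snfrm(date(2025, 3, 1), date(2025, 3, 25))

# If the SNF holds the patient past day 30 and the readmission happens on
# day 40, the event is invisible to SNFRM - exactly the pattern that the
# Discharge to Community rate helps surface.
out_of_window = counts_toward_snfrm(date(2025, 3, 1), date(2025, 4, 10))
```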
Five-year readmission trend from LTCFocus.org. LTCFocus is maintained by researchers at Brown University and aggregates longitudinal SNF quality data going back years. A facility's five-year trend tells you something a single-year snapshot can't: is this place improving, declining, or flat? We link directly to the LTCFocus profile for every facility we show.
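The divergence check in the first cross-validation layer can be sketched as a simple rule. The 5-point threshold below is illustrative, echoing the 5-to-10 point gaps researchers have documented between MDS and claims figures; it is not a published CMS cutoff, and the function name is ours.

```python
def divergence_flag(mds_rate: float, snfrm_rate: float, threshold: float = 5.0) -> str:
    """Compare the self-reported (MDS) rate with the claims-based (SNFRM) rate,
    both in percentage points. The 5-point threshold is illustrative."""
    gap = abs(mds_rate - snfrm_rate)
    if gap >= threshold:
        return f"FLAG: sources diverge by {gap:.1f} points - treat both with caution"
    return f"OK: sources agree within {gap:.1f} points"

# Self-report far below claims: the pattern worth flagging.
print(divergence_flag(12.0, 19.5))
# Close agreement between the two sources: reassuring.
print(divergence_flag(17.5, 18.4))
```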
The Fairness Rules We Enforce in How We Display These Numbers
Data without context misleads. We hold ourselves to a specific set of display rules.
We always show risk-adjusted numbers, and we always label them. If a number is risk-adjusted, we say so. If a number is not risk-adjusted (like the raw hospitalization rate), we say that too. You should never have to guess which you're looking at.
We never mix MDS and claims numbers in the same chart. They measure different things. Putting them on the same axis implies they're interchangeable. They're not.
If N is under 25 stays, we don't show a number. Small facilities — or facilities with low Medicare volume in a given year — don't have enough cases to produce a statistically reliable rate. Most comparison sites either show a number anyway (unreliable) or just blank the field (unhelpful). We do neither. We display: "Not enough qualifying stays to calculate a reliable rate." That's a real answer, not an empty cell.
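The suppression rule above amounts to a few lines of display logic. This is a sketch of the behavior we describe, with hypothetical function names; the message text is the one quoted above.

```python
SUPPRESSION_MESSAGE = "Not enough qualifying stays to calculate a reliable rate."
MIN_STAYS = 25  # below this, the rate is too noisy to display

def display_rate(rate_pct: float, n_stays: int) -> str:
    """Render a readmission rate, suppressing statistically unreliable values
    instead of showing a misleading number or a blank field."""
    if n_stays < MIN_STAYS:
        return SUPPRESSION_MESSAGE
    return f"{rate_pct:.1f}% (N={n_stays} stays)"

print(display_rate(18.4, 342))  # enough stays: show the rate with its N
print(display_rate(21.0, 12))   # too few stays: show the message, not a number
```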
We disclose the Medicare Advantage blind spot. SNFRM and most claims-based measures use Medicare fee-for-service data. Medicare Advantage (MA) enrollees are billed through private insurance, not CMS — so they don't appear in these metrics at all. For facilities with high MA enrollment, the published readmission rate may reflect only a fraction of their actual patient population. On every facility page, we note approximately what share of residents are MA-enrolled and therefore not captured in the data you're seeing. We want you to know the limits of what we can show.
What We Couldn't Get — Yet
Pennsylvania has an unusual asset: the Pennsylvania Health Care Cost Containment Council (PHC4) collects hospital-to-SNF linkage data that connects individual hospital discharge records to subsequent skilled nursing admissions in the state. This would allow finer-grained, more current readmission analysis than what CMS publishes — including readmissions from MA patients that federal measures currently miss.
Accessing that data requires a formal, paid research request. We haven't done it yet. It's on the Phase 2 roadmap. When we have it, we'll write another one of these explaining exactly what changed and why.
We'll always tell you what we don't have.
How to Actually Read the Number on a Facility Page
Here's a worked example. Suppose you're looking at Sunrise Nursing and you see:
18.4% (SNFRM FY2025, N=342 stays, 95% CI 16.1–20.9%, risk-adjusted). National average: 17.8%.
What does this tell you?
Sunrise's readmission rate is 18.4%. The national average is 17.8%. So Sunrise is slightly above average — but look at the confidence interval: 16.1% to 20.9%. The national average of 17.8% falls comfortably inside that range. That means the difference between Sunrise and average is not statistically meaningful given the sample size. We can't say with confidence that Sunrise performs differently from a typical facility.
Now contrast that with a facility showing 24.1% with a CI of 23.0–25.2%. The lower bound of that interval is still well above the national average. That's a genuinely elevated readmission rate — not noise.
The N matters too. 342 stays gives you a reasonably precise estimate. 28 stays would give you a very wide confidence interval — basically, noise dressed up as a number.
We show the CI and the N because without them, a percentage conveys false precision.
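The reading rule from this worked example can be written down directly: only call a facility different from the national average when the average falls outside the facility's confidence interval. This is a sketch of that rule, with a function name of our own; the figures are the two examples above.

```python
def compare_to_national(rate: float, ci_low: float, ci_high: float,
                        national: float) -> str:
    """If the national average sits inside the facility's confidence interval,
    the difference is not statistically meaningful at this sample size."""
    if ci_low <= national <= ci_high:
        return "not statistically distinguishable from the national average"
    return ("above the national average" if rate > national
            else "below the national average")

# Sunrise Nursing: 18.4% with CI 16.1-20.9%. The 17.8% national average
# falls inside the interval, so the difference is not meaningful.
print(compare_to_national(18.4, 16.1, 20.9, national=17.8))

# The contrasting facility: 24.1% with CI 23.0-25.2%. Even the lower bound
# sits well above the national average - genuinely elevated, not noise.
print(compare_to_national(24.1, 23.0, 25.2, national=17.8))
```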
Our Commitment
We show the bounce-back rate we would want to see if it were our own parent being admitted.
That means: claims-based over self-reported whenever possible, risk-adjusted always, confidence intervals always, sample sizes always, and clear disclosure when data is missing or limited.
We update SNFRM numbers when CMS updates them through the annual SNF VBP program release, typically each fall. The MDS short-stay rate updates quarterly and we refresh that on the same cadence CMS publishes it.
If we get something wrong — a data pipeline error, a miscalculated confidence interval, an out-of-date figure — we fix it fast and we link to the correction. We don't bury mistakes.
You can read the source data yourself. SNFRM figures are published in CMS's Provider Data Catalog at data.cms.gov. The SNF VBP program methodology is documented on cms.gov. LTCFocus data lives at ltcfocus.org. We link to all of them.
You should be able to check our work. That's the point.