MedSitter’s development team is constantly improving the MedSitter product. When MedSitter is deployed as a fall-avoidance tool, hospitals can save hundreds of thousands of dollars in fall-related costs. This post contains real data from the MedSitter platform.
At MedSitter, we’re always looking for new insights that might help us improve our product. One way we do this is by looking at how the system is being used out on the front lines of patient care. One of our recent data studies showed results that we couldn’t keep to ourselves.
It’s no secret that the primary ROI vector for MedSitter is avoiding patient falls. Patient falls are an extremely costly problem in healthcare: a quick search for “cost of patient falls” turns up a plethora of articles citing costs in the billions. The US Centers for Disease Control and Prevention says patient falls cost healthcare organizations $50 billion per year, and that’s just for Americans 65 or older (Source: CDC).
But is MedSitter doing its job? Last year, we started asking sitters to explicitly track both falls avoided (by verbal intervention with the patient, sounding the alarm to call nursing staff to the bedside, or similar measures) and “unassisted” cases in which patients still managed to fall despite sitter oversight. We’ve had several months to collect data now, and the results are in. So how did we do?
Below is a plot of the falls avoided reported by sitters using MedSitter between March and July 2022, across one sample of our customers:

The first thing we note is that this group of customers reports anywhere from 20 to 100 avoided falls per day. We also note that the number of avoided falls by day is highly variable, which isn’t very helpful when trying to describe the overall performance of the system.
To help deal with this, we will normalize the data by the total amount of patient monitoring time. We choose this metric because a patient who is monitored for longer is more likely to fall, which makes monitoring time a better weighting feature than a simple patient count. Normalizing the data should help “even out” the cyclical elements affecting patient census, like weekdays versus weekends, as well as sporadic quirks like holidays or a local emergency event (say, car accidents in bad weather).
With the data normalized this way, we can stabilize and smooth our graph by examining larger samples of the data. We do this by dividing a running total of the falls avoided by the running total of the patient monitoring time, over both a 30-day moving window and a cumulative running total. We’ll also rescale the metric to report falls avoided per 1000 patient days; this just makes the metric easier to understand than talking about “fractions” of a fall on a single patient-day.
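For readers who want to reproduce this kind of normalization on their own data, here is a minimal sketch in Python with pandas. The column names (“date”, “falls_avoided”, “monitoring_days”) are hypothetical stand-ins for whatever your export uses, with “monitoring_days” holding each day’s total patient monitoring time expressed in patient-days:

```python
import pandas as pd

def normalized_rates(df: pd.DataFrame) -> pd.DataFrame:
    """Add 30-day rolling and cumulative falls-avoided rates per 1000 patient days."""
    # Assumes "date" is a datetime column; time-based rolling needs a DatetimeIndex.
    df = df.sort_values("date").set_index("date")

    # 30-day moving window: running totals of events and of exposure, then divide.
    rolling_falls = df["falls_avoided"].rolling("30D").sum()
    rolling_days = df["monitoring_days"].rolling("30D").sum()
    df["rolling_30d_rate"] = rolling_falls / rolling_days * 1000

    # Cumulative running totals over the entire observation period.
    df["cumulative_rate"] = (
        df["falls_avoided"].cumsum() / df["monitoring_days"].cumsum() * 1000
    )
    return df
```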

The green line shows the cumulative average falls avoided over this 4-month period: roughly 339 falls avoided per 1000 patient days of monitoring. This line smooths out as it moves toward the right side of the graph, which is expected as the total amount of sample data grows. However, there is a very slight upward trend that we might have missed without the rolling 30-day window shown by the orange line above. This shows that the reported falls avoided are increasing over time, even when normalized to account for monitoring time as the user base of the platform grows.
There are several reasons why we might see this “improvement.” Perhaps customers are doing a better job of evaluating which patients are the highest fall risks and assigning those patients to sitters over less risky ones. Perhaps sitters are doing a better job at preventing falls, or perhaps they are simply growing more conscientious about logging potential fall-avoided events. To address the latter possibility, let’s also look at the unassisted falls metric over this period of time. The running averages below are normalized per 1000 patient days as well.

Both the 30-day and cumulative weighted averages here are very stable, and they suggest that we should expect roughly 1.5 unassisted falls per 1000 patient days of monitoring. Published studies cite the “normal” fall incidence rate per 1000 patient days as being somewhere between 2 and 14, depending heavily on the demographics of the patients.
Even with a range as broad as 2 to 14 “real” falls per 1000 patient days, the sitter-reported fall avoidance is clearly inflated, but this is not necessarily a bad thing. Sitters should err on the side of aggressively intervening during potential fall opportunities in order to protect patient safety. The 30-day running sample of reported falls avoided probably shows that the sitters are growing even more comfortable being proactive about acting on behalf of the patients.
But how can we gauge our progress in avoiding “real” falls? If the range of unassisted falls without MedSitter is expected to be between 2 and 14 per 1000 patient days, let’s choose a middling number in that range and assume for a moment that the actual rate for our population is 8 falls per 1000 patient days. This seems reasonable given that MedSitter customers span a variety of different patient demographics. If the unassisted fall rate using MedSitter is 1.53, then we could reasonably believe that MedSitter is preventing roughly 8 − 1.53 ≈ 6.5 falls per 1000 patient days.
In the data spanned by our graphs, MedSitter was used for a total of almost exactly 21,500 patient days. That means roughly 21,500 / 1000 × 6.5 ≈ 140 “real” falls were avoided in those four months.
One article from Johns Hopkins estimates that the cost of a single patient fall averages $34,294. At that estimate, the fall-avoidance cost savings realized by these MedSitter customers, 140 falls in just four months, come to roughly 140 × $34,294 ≈ $4.8 million!
We also learned that our sitters are very diligent and conscientious about paying attention to the patients. Based on their reporting rate, it’s likely that only roughly 1 out of 50 “fall avoided” reports by the sitters would have resulted in an actual fall: 339 reports for each ~6.5 real falls avoided means roughly 52.2 reports per real fall.
That will be an interesting metric to keep an eye on as we continue research into these fall avoided observations—if we can “calibrate” that reporting to real fall data with assistance from one or more of our customers, those reported falls avoided may be useful as a predictive indicator for real falls.
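To make these back-of-envelope numbers easy to check, or to re-run with your own baseline assumption, here is the whole calculation in a few lines of Python. The baseline of 8 falls per 1000 patient days is our assumption picked from the 2 to 14 range above; everything else comes straight from the figures in this post:

```python
# Back-of-envelope estimate of real falls avoided, cost savings, and the
# report-to-real-fall ratio. BASELINE_RATE is an assumption, not a measurement.
BASELINE_RATE = 8.0            # assumed unassisted fall rate without MedSitter
OBSERVED_RATE = 1.53           # measured unassisted fall rate with MedSitter
REPORTED_RATE = 339            # sitter-reported falls avoided per 1000 patient days
PATIENT_DAYS = 21_500          # total monitoring time in this sample
COST_PER_FALL = 34_294         # Johns Hopkins average cost of one fall, USD

prevented_rate = BASELINE_RATE - OBSERVED_RATE              # ~6.5 per 1000 patient days
real_falls_avoided = PATIENT_DAYS / 1000 * prevented_rate   # ~140 falls
savings = real_falls_avoided * COST_PER_FALL                # ~$4.8 million
reports_per_real_fall = REPORTED_RATE / prevented_rate      # ~52 reports per real fall

print(f"real falls avoided: {real_falls_avoided:.0f}")
print(f"estimated savings:  ${savings:,.0f}")
print(f"reports per fall:   {reports_per_real_fall:.1f}")
```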
If any of our customers reading this blog are interested in joining us to expand our data or explore more analyses, Contact Us here! In the meantime, keep an eye on this space for future data articles!