StudioPulse connects directly to a studio's management platform and pulls member data every night. I recently did a deep dive into one studio's complete dataset: every member record, every visit, every purchase, every retention segment. I am not naming the studio, but the data is real.
What I found was not surprising to me at this point, but I think it would surprise most studio owners if they looked at their own numbers the same way. Here is what the data actually showed.
Finding 1: Your database and your customer list are not the same thing
The studio I looked at had about 3,700 member records. That is the number you would see if you pulled a full member export from the platform. It is the number a lot of owners use when they think about how many people have come through their door.
About 700 of those 3,700 people actually visited last month. That is less than 1 in 5.
The other 3,000 or so are in the database as historical records. Some cancelled years ago. Some came in for a trial and never converted. Some moved away. Many are simply gone, and they will stay in that member count forever unless someone manually cleans it up.
This matters because "3,700 members" feels very different from "700 active customers." The second number is your business. The first number is your history. They are not the same thing, and conflating them leads to a distorted sense of where you actually stand.
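The database-versus-customers split is easy to compute from a raw member export. Here is a minimal sketch; the field names (`member_id`, `last_visit`) and the sample records are illustrative, not any specific platform's actual schema:

```python
from datetime import date, timedelta

# Hypothetical member export: one record per member with their last visit date.
# Field names are illustrative, not a real platform schema.
members = [
    {"member_id": 1, "last_visit": date(2024, 5, 20)},
    {"member_id": 2, "last_visit": date(2021, 3, 2)},   # cancelled years ago
    {"member_id": 3, "last_visit": date(2024, 5, 28)},
    {"member_id": 4, "last_visit": None},               # trial, never visited
]

today = date(2024, 6, 1)
window = timedelta(days=30)

# "Database" = every record; "active" = visited in the last 30 days.
database_count = len(members)
active_count = sum(
    1 for m in members
    if m["last_visit"] is not None and today - m["last_visit"] <= window
)

print(f"database records: {database_count}, active last 30 days: {active_count}")
```

The same two-line distinction scales to a 3,700-row export: one number is everyone who ever signed up, the other is who is actually coming through the door.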
Finding 2: New member drop-off is steeper and faster than most owners realize
I looked at every member grouped by how long ago they first visited. The retention numbers by cohort were striking.
Members who joined in the last two months: about 48% are still visiting. Members who joined two to six months ago: 16% are still active. Members who joined six to twelve months ago: about 10% are still visiting regularly.
That drop from 48% to 16% happens in just a few months. For many studios, that is the entire window between a new member becoming a regular and quietly drifting away.
The members in that two-to-six month range are past the initial enthusiasm but have not yet built a real habit. They are still technically "active" in the system. Their membership is still billing. They just stopped coming, and nobody flagged it yet.
This is not unique to this studio. It is a predictable curve. The question is whether you have a way to catch those members while they are still reachable, or whether the first signal you get is the cancellation itself.
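The cohort breakdown above comes from a simple grouping: bucket each member by how long ago they joined, then count how many in each bucket are still active. A sketch of that computation, with illustrative field names, sample records, and cohort boundaries matching the ones discussed here:

```python
from datetime import date

# Hypothetical records: join date plus whether the member is still visiting.
# Field names and sample data are illustrative.
members = [
    {"joined": date(2024, 5, 1),  "active": True},
    {"joined": date(2024, 4, 20), "active": False},
    {"joined": date(2024, 1, 10), "active": False},
    {"joined": date(2023, 9, 5),  "active": True},
    {"joined": date(2022, 7, 1),  "active": True},
    {"joined": date(2023, 2, 14), "active": False},
]

today = date(2024, 6, 1)

def cohort(joined):
    """Bucket a member by tenure, using ~30.4 days per month."""
    months = (today - joined).days / 30.4
    if months <= 2:
        return "0-2 months"
    if months <= 6:
        return "2-6 months"
    if months <= 12:
        return "6-12 months"
    return "12+ months"

# counts[cohort] = (total members, still-active members)
counts = {}
for m in members:
    c = cohort(m["joined"])
    total, active = counts.get(c, (0, 0))
    counts[c] = (total + 1, active + m["active"])

for c, (total, active) in sorted(counts.items()):
    print(f"{c}: {active}/{total} still active")
```

Run against a full export, the still-active share per cohort is exactly the 48% / 16% / 10% curve described above.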
Finding 3: There is always a lapsing window right now
At any given point, this studio had roughly 350 members who had visited within the last 90 days but had not come back in the last 30. They had not cancelled. They were not marked as inactive. They were just gone quiet.
That group represented about $2,000 in monthly revenue that was at risk. Not lost yet. Just at risk.
The window to reach those members and bring them back is real, but it is short. A member who hasn't been in for 30 days is meaningfully different from a member who hasn't been in for 90 days. At 30 days, a personal message from the studio owner has a good chance of working. At 90 days, the odds are much worse.
The problem is that without a system telling you who is in that window, those members tend to stay invisible right up until they cancel. They are not generating any alerts. They are not showing up on any report. They are just slowly not coming back.
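The lapsing window itself is a single filter over last-visit dates: visited within the last 90 days, but not within the last 30. A sketch, with illustrative names and dues amounts:

```python
from datetime import date

# Hypothetical data: per-member last visit date and monthly dues.
# Names and dollar amounts are illustrative.
members = [
    {"name": "A", "last_visit": date(2024, 5, 25), "monthly_dues": 60},  # still coming
    {"name": "B", "last_visit": date(2024, 4, 18), "monthly_dues": 60},  # gone quiet
    {"name": "C", "last_visit": date(2024, 1, 3),  "monthly_dues": 60},  # long gone
]

today = date(2024, 6, 1)

def in_lapsing_window(m):
    days_out = (today - m["last_visit"]).days
    # Visited within the last 90 days, but not in the last 30.
    return 30 < days_out <= 90

lapsing = [m for m in members if in_lapsing_window(m)]
revenue_at_risk = sum(m["monthly_dues"] for m in lapsing)

print([m["name"] for m in lapsing], revenue_at_risk)
```

Summing dues over the lapsing list is where a figure like "roughly 350 members, about $2,000 a month at risk" comes from: not lost revenue, just revenue attached to members who have gone quiet.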
Finding 4: How often active members visit tells you who is about to leave
The median active member at this studio visited about 5 times per month. The top 25% visited 9 or more times. The bottom 25% of active members visited about twice a month.
Visit frequency is one of the most reliable early signals of churn. A member who normally comes 8 times a month and drops to 2 times is telling you something. That change happens weeks before any cancellation does. It is visible in the data long before it is visible anywhere else.
The most engaged member in this studio visited 51 times in a single month. Some people build their entire daily structure around coming to class. Those are not members who are going to leave quietly. But the members at the lower end of the frequency range, especially if their frequency is declining, are the ones who need attention.
Aggregate attendance data does not show you this. Total check-ins going up does not mean every member is fine. You can have 10 members visiting more often while 30 others slip toward the door, and the aggregate number looks healthy the whole time.
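Catching the individual decline means comparing each member's latest month against their own recent baseline, not against the studio average. A minimal sketch; the 50% drop threshold and the three-month baseline are illustrative choices, not a fixed rule:

```python
# Hypothetical per-member monthly visit counts, oldest to newest.
visit_history = {
    "regular":  [8, 9, 8, 8],   # steady
    "slipping": [8, 7, 4, 2],   # declining toward the door
    "casual":   [2, 2, 2, 2],   # low but stable
}

def frequency_drop(months, baseline_window=3):
    """Flag a member whose latest month is under half their recent baseline."""
    prior = months[:-1][-baseline_window:]
    baseline = sum(prior) / len(prior)
    return baseline > 0 and months[-1] < 0.5 * baseline

flagged = [name for name, months in visit_history.items() if frequency_drop(months)]
print(flagged)
```

Note that the aggregate here looks fine (total visits barely move), while the per-member comparison flags exactly the one member who is slipping.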
Finding 5: If you can keep a member for a year, the curve flips
Here is the thing about long-term members that the cohort data showed: members who had been coming for more than a year were actually more likely to still be active than members in their six-to-twelve month window.
Members at six to twelve months: about 10% still visiting regularly. Members who had been coming for more than a year: about 19%.
That is not a statistical fluke. It reflects something real about how retention works. The people who make it past the one-year mark have integrated the studio into their life in a way that shorter-tenure members have not. They have built the habit. They have the relationships. They have a reason to keep coming that goes beyond just "I signed up for a membership."
Getting a member from month six to month twelve is one of the hardest parts of running a studio. The drop-off in that window is steep. But the members who make it through that window tend to stay for a long time. That asymmetry is worth knowing.
One more thing: the payment failure signal
I also looked at members who had a payment failure on their account. This studio had 27 of them. Of those 27, 26 had not visited in the past 30 days.
Payment failures and visit drop-off tend to arrive together. A member who is planning to cancel often stops coming before they actually cancel, and they also stop caring whether the payment goes through. The billing system flags the failed payment, but no one connects that flag to the fact that the member has also stopped showing up.
A payment failure on a member who is still visiting is usually an administrative issue. A payment failure on a member who hasn't been in for a month is a different situation entirely, and it deserves a different response.
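Connecting the two flags is a simple join of the failed-payment list against last-visit dates. A sketch, with illustrative names and dates:

```python
from datetime import date

# Hypothetical data: members with a failed payment, joined against last visits.
failures = [{"member": "A"}, {"member": "B"}]
last_visit = {"A": date(2024, 5, 28), "B": date(2024, 4, 1)}

today = date(2024, 6, 1)

classification = {}
for f in failures:
    days_out = (today - last_visit[f["member"]]).days
    if days_out > 30:
        # Stopped visiting AND payment failed: likely silent churn; reach out personally.
        classification[f["member"]] = "churn risk"
    else:
        # Still visiting: probably an expired card; a billing fix, not a retention problem.
        classification[f["member"]] = "billing issue"

print(classification)
```

Run over the 27 failures in this dataset, that split would have put 26 members in the churn-risk bucket and one in the billing bucket.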
What to do with this
These findings are not from a uniquely troubled studio. The studio in question has been growing steadily, revenue up month over month, new members coming in consistently. These are patterns that exist in the data of healthy studios too, because they reflect how member behavior actually works, not how we wish it worked.
The studios that retain members at above-average rates tend to do one thing consistently: they look at individual member data, not just aggregate reports. They know who is in the lapsing window. They have a process for reaching out to new members before the drop-off window closes. They notice when a regular's visit frequency drops before that member ever sends a cancellation email.
None of that requires a massive operation. It requires the right information at the right time, delivered to someone who can actually do something about it.
See your own member data, broken down like this
StudioPulse connects to your platform and shows you exactly who is in your lapsing window, who is slipping in their first 90 days, and how much monthly revenue is at risk. First report in 24 hours.
See a Sample Report. 30-day free trial. No credit card required.