The Wrong Question and the Right One
Most CTOs ask: "Should we build or buy?" That's the wrong question. The right question is: "Which specific tools in our current stack are costing us more than a custom replacement would?"
35% of teams have already replaced at least one SaaS tool with a custom build. But they didn't replace everything. They were selective. The companies that get the best results from SaaS replacement are the ones with a systematic framework for deciding what to replace and what to keep.
This is that framework.
The Five Criteria
Each criterion is scored 1-5. A total score of 15 or higher indicates a strong replacement candidate. Scores of 10-14 are worth investigating. Below 10, the SaaS tool is probably fine where it is.
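The scoring bands above can be sketched as a small helper. This is a minimal illustration, not part of any published tool; the function name and band labels are our own.

```python
def recommendation(total_score: int) -> str:
    """Map a total framework score (sum of five 1-5 criteria) to an action band."""
    if not 5 <= total_score <= 25:
        raise ValueError("total score must be between 5 and 25")
    if total_score >= 15:
        return "strong replacement candidate"
    if total_score >= 10:
        return "worth investigating"
    return "keep the SaaS tool"
```

For instance, `recommendation(22)` returns `"strong replacement candidate"`, matching the Salesforce example later in this article.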
Criterion 1: Feature Utilization (Weight: High)
What to measure: What percentage of the tool's features does your team actually use?
| Score | Utilization | Signal |
|---|---|---|
| 1 | 70%+ | You're using most of what you're paying for |
| 2 | 50-70% | Normal for enterprise SaaS |
| 3 | 30-50% | You're overpaying for features you don't need |
| 4 | 15-30% | Strong replacement signal |
| 5 | <15% | You're renting a mansion to use the bathroom |
Industry data shows that 30-53% of SaaS licenses go entirely unused in any given month. But feature utilization is even worse — most teams use less than 20% of the features in their enterprise SaaS tools.
How to measure it: Pull usage analytics from the SaaS vendor's admin panel. If they don't provide usage data (many don't — it's not in their interest), survey your team. Ask: "Which features do you use daily? Weekly? Monthly? Never?" The "never" list is usually the longest.
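Once you have a feature count from the admin panel or the team survey, mapping it onto the 1-5 scale is mechanical. A sketch, assuming the band edges from the table above (ties broken at the lower edge):

```python
def utilization_score(features_used: int, features_total: int) -> int:
    """Map a feature-utilization percentage onto the 1-5 scale from the table."""
    pct = 100 * features_used / features_total
    if pct >= 70:
        return 1   # using most of what you pay for
    if pct >= 50:
        return 2   # normal for enterprise SaaS
    if pct >= 30:
        return 3   # overpaying for unused features
    if pct >= 15:
        return 4   # strong replacement signal
    return 5       # renting a mansion to use the bathroom
```

A team using 20 of a platform's 200 features (10%) scores a 5.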
Criterion 2: Seat Count Impact (Weight: High)
What to measure: How much does per-seat pricing cost you annually, and how fast is that growing?
| Score | Annual Seat Cost | Signal |
|---|---|---|
| 1 | <$5K total | Low financial impact |
| 2 | $5K-$25K | Noticeable but manageable |
| 3 | $25K-$75K | Significant line item |
| 4 | $75K-$200K | Major budget allocation |
| 5 | $200K+ | This tool is a business unit unto itself |
The math here is straightforward but often shocking when you actually do it. Salesforce at $500/seat/month for the top tier costs $6,000 per seat per year. A 50-seat deployment is $300K annually. A custom CRM costs $30K-$60K to build with $500-$3,000/month in maintenance.
How to measure it: Pull the annual contract value. Include all tiers, add-ons, and overages. Divide by actual active users (not provisioned users). If the per-active-user cost is 2x+ the per-seat price, you have a ghost user problem compounding the issue.
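The ghost-user check above is a one-line calculation. A hedged sketch (the function name and 2x threshold restate the rule in the paragraph; the annual list price per seat is whatever the vendor quotes):

```python
def per_active_user_cost(annual_contract_value: float,
                         active_users: int,
                         annual_list_price_per_seat: float) -> tuple[float, bool]:
    """Return (cost per active user, True if ghost users are inflating it).

    Divides total contract value by *active* users, not provisioned seats,
    and flags the 2x+ threshold described above.
    """
    cost = annual_contract_value / active_users
    ghost_user_problem = cost >= 2 * annual_list_price_per_seat
    return cost, ghost_user_problem
```

Example: a $288K contract with 35 active users works out to roughly $8,229 per active user — more than double a $3,600 annual list price, so the ghost-user flag trips.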
Criterion 3: Integration Complexity (Weight: Medium)
What to measure: How many engineering hours per year does this tool's integration consume?
| Score | Annual Integration Hours | Signal |
|---|---|---|
| 1 | <20 hours | Minimal maintenance |
| 2 | 20-50 hours | Standard upkeep |
| 3 | 50-100 hours | Noticeable engineering drain |
| 4 | 100-200 hours | Significant technical debt |
| 5 | 200+ hours | This integration is basically a full-time job |
As covered in The Technical Debt Nobody Counts, engineering teams spend 20-30% of their time on SaaS integration maintenance. The tools with the most complex integrations — multiple API endpoints, custom sync logic, frequent version migrations — are the ones where custom replacement eliminates the most hidden cost.
How to measure it: Check your project management tool for tickets tagged with the SaaS tool's name. Review git history for commits touching integration code. Ask the engineers who maintain it — they know exactly how much time it takes. They've been wanting to tell you.
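A rough way to turn that ticket audit into a score — a sketch only, assuming you can export a year of tickets as (summary, logged hours) pairs from your project management tool:

```python
def annual_integration_hours(tickets: list[tuple[str, float]], tool_name: str) -> float:
    """Sum logged hours on tickets whose summary mentions the SaaS tool."""
    return sum(hours for summary, hours in tickets
               if tool_name.lower() in summary.lower())

def integration_score(annual_hours: float) -> int:
    """Map annual integration hours onto the 1-5 scale from the table."""
    if annual_hours < 20:
        return 1
    if annual_hours < 50:
        return 2
    if annual_hours < 100:
        return 3
    if annual_hours < 200:
        return 4
    return 5   # basically a full-time job

tickets = [("Salesforce sync failing again", 6),
           ("Update onboarding docs", 2),
           ("salesforce API migration", 30)]
hours = annual_integration_hours(tickets, "Salesforce")  # 36 hours
```

Ticket hours undercount the real drain — untracked Slack debugging doesn't show up — so treat this as a floor, not a ceiling.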
Criterion 4: Data Sensitivity (Weight: Medium)
What to measure: How sensitive is the data flowing through this tool, and what's the risk of it living on a third-party server?
| Score | Data Type | Signal |
|---|---|---|
| 1 | Public marketing data | Low sensitivity |
| 2 | Internal operational data | Standard business data |
| 3 | Customer PII | Compliance implications |
| 4 | Financial/health data | Regulatory requirements |
| 5 | Core IP or competitive data | Strategic risk |
This criterion matters more in regulated industries (healthcare, finance, government) but it's increasingly relevant everywhere. GDPR, CCPA, SOC 2, HIPAA — the compliance landscape keeps expanding. Every SaaS tool holding your data is another vendor to audit, another data processing agreement to maintain, another potential breach notification to manage.
With a custom-built tool running on your own infrastructure, the data never leaves your environment. One fewer vendor with access to your customers' information.
How to measure it: Map the data flowing into and out of each SaaS tool. Classify it by sensitivity tier. Cross-reference with your compliance requirements. The tools handling your most sensitive data in the least controlled environments score highest.
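Classification by sensitivity tier can be as simple as scoring each tool by the most sensitive data class it touches. A sketch — the tier labels mirror the table above, and the "score by the worst class present" rule is our assumption:

```python
SENSITIVITY_TIERS = {
    "public marketing": 1,
    "internal operational": 2,
    "customer pii": 3,
    "financial": 4,
    "health": 4,
    "core ip": 5,
}

def sensitivity_score(data_classes: list[str]) -> int:
    """Score a tool by the most sensitive data class flowing through it."""
    return max(SENSITIVITY_TIERS[c] for c in data_classes)
```

A tool handling both internal operational data and customer PII scores a 3, because the riskiest class drives the compliance burden.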
Criterion 5: Vendor Lock-in Risk (Weight: Medium)
What to measure: How difficult would it be to leave this vendor if you needed to?
| Score | Lock-in Level | Signal |
|---|---|---|
| 1 | Easy export, standard formats | Minimal switching cost |
| 2 | Exportable but requires cleanup | Some effort to migrate |
| 3 | Proprietary formats, some data loss | Significant switching cost |
| 4 | Deep workflow dependencies | Major migration project |
| 5 | Data hostage situation | You're trapped |
Vendor lock-in is the boiling frog problem. It starts with "we'll just use their proprietary field types." It ends with "we can't leave because 10 years of business logic is embedded in their platform." The more locked in you are, the more urgently you need an exit plan — and the more leverage the vendor has to raise your prices.
How to measure it: Try to export your data. Right now. If it takes more than an hour to get a complete, usable export of your data in a standard format, your lock-in score is at least a 3.
Scoring in Practice: Three Examples
Example 1: Enterprise CRM (Salesforce)
| Criterion | Score | Rationale |
|---|---|---|
| Feature Utilization | 5 | Team uses contacts, deals, and basic reporting — maybe 10% of the platform |
| Seat Count Impact | 5 | 80 seats × $300/mo = $288K/year |
| Integration Complexity | 4 | Custom Apex triggers, 3 integrations with internal tools, 120+ hours/year maintenance |
| Data Sensitivity | 4 | Full customer PII, deal financials, pipeline data |
| Vendor Lock-in | 4 | Custom objects, workflows, Apex code all proprietary |
| Total | 22/25 | Strong replacement candidate |
A custom CRM built for this company's actual workflow — contacts, deals, pipeline, and reporting — would cost $40K-$60K to build and $2K-$3K/month to maintain. Three-year cost: $132K vs. $864K for Salesforce. That's $732K in savings while getting a tool that does exactly what the team needs and nothing they don't.
Example 2: Project Management (Jira)
| Criterion | Score | Rationale |
|---|---|---|
| Feature Utilization | 3 | Uses boards, sprints, and basic reporting. Ignores advanced roadmaps, automation, and portfolio features |
| Seat Count Impact | 2 | 50 seats × $16/mo = $9,600/year |
| Integration Complexity | 2 | Standard webhook integration, minimal custom code |
| Data Sensitivity | 2 | Internal project data, no PII |
| Vendor Lock-in | 2 | Data exportable, workflows recreatable |
| Total | 11/25 | Borderline — investigate but likely keep |
At $9,600/year with low integration overhead, Jira isn't bleeding money. The custom alternative might cost $25K to build — a 2.5-year payback for a tool that doesn't cause major pain. Keep it unless other factors (like Atlassian's pricing trajectory) change the math.
Example 3: Workflow Automation (Zapier)
| Criterion | Score | Rationale |
|---|---|---|
| Feature Utilization | 4 | Using 15 out of 200+ connectors, basic trigger-action patterns only |
| Seat Count Impact | 4 | Team plan + task overages = $45K/year and growing with volume |
| Integration Complexity | 4 | 40+ Zaps, many with custom code steps, frequent failures requiring manual intervention |
| Data Sensitivity | 3 | Customer data flowing through third-party servers |
| Vendor Lock-in | 3 | Workflows are recreatable but undocumented tribal knowledge |
| Total | 18/25 | Strong replacement candidate |
Zapier is the classic replacement target. Per-task pricing means costs scale with usage. A custom workflow engine built around the 15 integrations actually in use would cost $30K-$45K and run on your infrastructure at a fraction of the per-task cost. Among the 35% of teams that have already replaced a SaaS tool, workflow automation is a frequent first target.
How to Run This Framework
Step 1: Inventory. List every SaaS tool with an annual cost above $5,000. Include integration maintenance hours (ask your engineers — they're spending 20-30% of their time on this).
Step 2: Score. Run each tool through the five criteria. Be honest. If you're not sure about utilization numbers, that's itself a signal — you're paying for something you can't even measure.
Step 3: Rank. Sort by total score. Your top 3-5 tools are your replacement candidates.
Step 4: Sequence. Start with the tool that has the highest score AND the lowest build complexity. You want an early win that demonstrates value before tackling the harder replacements.
Step 5: Calculate ROI. For each replacement candidate, compare the 3-year SaaS cost (including annual increases of 8-12%) against the custom build cost plus maintenance. Most tools scoring 15+ will show 50-75% savings over three years.
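The Step 5 comparison can be sketched in a few lines. This assumes a 10% annual renewal increase (the midpoint of the 8-12% range above) and uses the Salesforce example's figures; the function names are ours:

```python
def three_year_saas_cost(annual_cost: float, annual_increase: float = 0.10) -> float:
    """Sum three years of subscription cost with compounding renewal increases."""
    return sum(annual_cost * (1 + annual_increase) ** year for year in range(3))

def three_year_custom_cost(build_cost: float, monthly_maintenance: float) -> float:
    """One-time build plus 36 months of maintenance."""
    return build_cost + monthly_maintenance * 36

saas = three_year_saas_cost(288_000)            # ~$953K with 10% annual increases
custom = three_year_custom_cost(60_000, 2_000)  # $132K
savings_pct = 100 * (saas - custom) / saas
```

Note that factoring in renewal increases makes the gap wider than a flat-price comparison: the Salesforce example's $864K flat three-year cost becomes roughly $953K once 10% annual increases compound.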
Common Objections
"We don't have the engineering bandwidth." You don't have to build it yourself. A SaaS replacement agency handles the build, delivers a production-ready tool, and provides ongoing maintenance. Your engineers stay focused on product.
"What about updates and new features?" You only build the features you need. SaaS vendors ship features for their entire customer base — you're paying for R&D that benefits their other customers, not you. Custom tools get updates when you need them, scoped to your requirements.
"The switching cost is too high." Run the numbers. A tool scoring 20+ on this framework is costing you $200K-$500K per year in combined subscription + integration costs. The switching cost is a one-time expense. The savings compound annually.
The Decision Matrix
After scoring your entire stack, you'll likely find it falls into three buckets: tools scoring 15+ (strong replacement candidates), tools scoring 10-14 (worth a deeper look), and tools scoring below 10 (keep them — the SaaS model is working).
The goal isn't to replace everything. It's to replace the right things — the tools where the gap between what you're paying and what you're getting is widest.
Ready to score your stack? Get your free SaaS audit — we'll run this framework against your top tools and show you exactly where the savings are.