The Five SOC 2 Trust Service Criteria
This article is for educational purposes only and does not constitute professional compliance advice or legal counsel. Requirements and standards evolve, and you should consult with a qualified compliance professional about your specific situation.
You're reading through a SOC 2 report and the auditor keeps referencing the Trust Service Criteria. What does that mean, what was the auditor actually evaluating, and does your report cover the things your customers care about? The five criteria sound like alphabet soup at first: Security, Availability, Processing Integrity, Confidentiality, Privacy. In practice, they're the five categories customers weigh most when deciding whether to trust you with their data and systems. Every SOC 2 audit evaluates some combination of these five, and understanding what each one means tells you what the auditor was actually looking for, what evidence you needed to gather, and what your customers will look for when they read your report.
Security: The Foundation That Gets the Most Attention
Security is the criterion that commands the most resources in a SOC 2 audit because it addresses the most fundamental question: can bad actors break into your systems and steal data or cause damage? This is the heavyweight of the five criteria, and it covers a lot of ground. The auditor is looking at access controls—how you're restricting who can get into what systems and data. They're examining encryption, both for data at rest (sitting in your storage systems) and data in transit (moving across networks). They're looking at how you find and fix vulnerabilities, what your incident response plan looks like and whether you actually follow it, and how you're monitoring your systems to detect when attacks are happening. This is the full security perimeter.
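To make the access-control piece concrete, here is a minimal sketch of a role-based access check. The role names and the `RESOURCE_ROLES` mapping are hypothetical illustrations, not part of any SOC 2 requirement; real systems would typically enforce this through an identity provider or authorization service rather than an in-memory table.

```python
# Minimal sketch of a role-based access check. Roles, resources, and the
# mapping below are illustrative assumptions, not a prescribed control.
RESOURCE_ROLES = {
    "billing_db": {"accounting", "admin"},
    "prod_servers": {"sre", "admin"},
    "audit_logs": {"security", "admin"},
}

def can_access(user_roles: set[str], resource: str) -> bool:
    """Allow access only if the user holds a role permitted for the resource."""
    allowed = RESOURCE_ROLES.get(resource, set())
    return bool(user_roles & allowed)

# An engineer without an accounting role is denied billing data:
print(can_access({"sre"}, "billing_db"))    # False
print(can_access({"admin"}, "billing_db"))  # True
```

The point the auditor cares about is not the mechanism itself but that access follows a documented rule like this one, and that you can produce evidence of who holds which roles.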
In the audit, the auditor is verifying that you have documented controls for each of these categories and that they actually work in practice. You'll need to show evidence that you're restricting access to the right people using role-based access controls or similar mechanisms. You'll need to demonstrate that you're monitoring for unauthorized access attempts or suspicious behavior. You'll need to produce a documented incident response plan and evidence that you've actually used it when incidents happened. You'll need to show that you have a plan for what to do if a security breach occurs, including notification procedures. The practical reality is that most of the auditor's effort in any SOC 2 audit goes into Security because it's both the broadest and the highest-risk criterion.
Security is always in scope for a SOC 2 audit, so prepare to invest significant effort in documenting your access controls, incident response procedures, and security monitoring infrastructure. You'll need to organize access logs showing who has access to what, evidence of quarterly or other periodic access reviews, and documentation showing that your security policies are actually being followed. The auditor isn't trying to break into your systems or find every vulnerability. They're evaluating whether you've designed reasonable controls and whether you're actually maintaining them. The bar isn't "perfect security" (which doesn't exist). The bar is "you have thought about the main attack vectors and have controls to mitigate them."
Availability: Why Your Uptime Commitments Matter
Availability is about ensuring that your systems are usable when customers need them. For a SaaS company, a data center, or any service provider with uptime commitments, this is critical. The auditor is looking at backup and recovery procedures—can you restore service if something breaks? They're examining your change management process—how do you deploy updates without accidentally taking down your own system? They're looking at capacity planning—do you have enough resources to handle your expected traffic, or are you chronically undersized? And they're evaluating infrastructure monitoring—do you know when things are failing before your customers start complaining?
The audit verifies that you have monitoring in place to detect outages, that you can actually recover from failures with your documented procedures, and that you have a documented change management process that prevents accidents. The auditor will ask to see your monitoring dashboards or logs, your incident reports showing outages and how you recovered, your capacity planning documentation, and your change management logs showing who deployed what and when. If you're in the business of providing infrastructure or data processing services, Availability might actually be more important to your customers than Security in some contexts because an outage is an immediate revenue and operational impact for them, while a security breach is a potential future impact.
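As a rough illustration of the monitoring evidence described above, the sketch below runs a health check and records a timestamped result for each service. The `check` callables and the log structure are assumptions for illustration; in practice this role is filled by a monitoring platform, and the audit evidence is its dashboards and alert history.

```python
# Illustrative availability probe that records timestamped evidence of
# monitoring. The check functions and log shape are assumptions, not a
# prescribed SOC 2 control.
import datetime

def probe(name: str, check, history: list) -> bool:
    """Run one health check and append a timestamped result to the log."""
    try:
        healthy = bool(check())
    except Exception:
        healthy = False  # a crashing check counts as unhealthy
    history.append({
        "service": name,
        "healthy": healthy,
        "checked_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    })
    return healthy

log: list = []
probe("api", lambda: True, log)     # simulated healthy response
probe("worker", lambda: 1 / 0, log) # simulated failing check
print([(e["service"], e["healthy"]) for e in log])  # [('api', True), ('worker', False)]
```

What matters for the audit is the trail this produces: a record showing that failures were detected when they happened, not just that the system was up.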
Here's what makes Availability different from just "keeping your systems up." It's about having documented, repeatable processes for maintaining uptime. You could have systems that never go down by accident, but if you don't have a documented change management process and evidence that you're following it, you fail the Availability criterion. The auditor isn't just looking for uptime. They're looking for evidence that your uptime is the result of deliberate controls, not luck. They want to see that you have monitoring, that you have procedures for handling failures, and that you have a process for deploying changes safely. The discipline is what matters.
Processing Integrity: Making Sure Data Moves Correctly
Processing Integrity is one of those criteria that sounds dry but carries real operational and legal weight. It's about ensuring that data remains accurate and complete as it moves through your systems. This covers data validation—making sure incoming data is correct before you process it. It covers error handling—what you do when data is malformed, incomplete, or doesn't meet your validation rules. It covers completeness of processing—making sure that all records get processed, none get silently lost or dropped.
This matters because if a customer uploads 1,000 records and 50 of them silently fail to process, that's a Processing Integrity failure. If your system silently corrupts timestamps, drops decimal places, or modifies data in ways you don't intend, that's a Processing Integrity failure. If a transaction gets processed twice somehow, that's a Processing Integrity failure. The auditor verifies that you have controls to catch these failures and either fix the data or alert the user that something went wrong. You'll need to show evidence of data validation rules, error logs showing how your system handled bad data, and processes for identifying and remediating data quality issues.
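A minimal sketch of the completeness idea, under assumed field names and a single illustrative validation rule: every input record either lands in the processed set or in the error log, so nothing can vanish silently.

```python
# Sketch of a completeness control for a batch import. The "amount" field
# and its validation rule are illustrative assumptions.
def process_batch(records):
    processed, errors = [], []
    for i, rec in enumerate(records):
        if not isinstance(rec.get("amount"), (int, float)):
            errors.append({"index": i, "reason": "missing or non-numeric amount"})
            continue
        processed.append(rec)
    # Completeness check: processed + errored must account for every input.
    assert len(processed) + len(errors) == len(records)
    return processed, errors

ok, bad = process_batch([{"amount": 10.0}, {"amount": None}, {"amount": 3}])
print(len(ok), len(bad))  # 2 1
```

The reconciliation assertion is the part an auditor would recognize as a control: it turns "we probably processed everything" into a verifiable claim, and the error log becomes the evidence that bad records were handled rather than dropped.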
In financial services or healthcare, Processing Integrity is critical because bad data isn't just an annoyance—it's a legal and operational liability. A bank can't have transactions silently fail to post. A healthcare system can't have patient records become corrupted. In other industries, Processing Integrity is less central to the risk profile but still important because data corruption erodes trust even if it's rare. If you tell customers their data is safe and accurate, and then their data becomes corrupted, you've violated a fundamental trust expectation. The controls here are about proving that your systems maintain data integrity by design, not by accident.
Confidentiality: Access Control for Sensitive Information
Confidentiality is about controlling who can access data that's supposed to be restricted or secret. This includes access controls specifically for sensitive data, encryption of sensitive data (so that even if someone obtains it, they can't read it), logging of who accessed sensitive data so you can detect snooping, and deletion of sensitive data when it's no longer needed. The key difference from the Security criterion is focus. Security is about the entire system perimeter and protecting everything from external attacks. Confidentiality is specifically about protecting data that requires restricted access from internal and external actors alike.
A company with good Confidentiality controls might restrict access to customer billing information to only the accounting team, require audit logs showing who looked at what and when, and delete credit card data immediately after payment processing is complete. The auditor verifies that you have role-based or attribute-based access controls in place that restrict sensitive data access appropriately, and that you're actually maintaining those controls over time. They want to see evidence that the people with access to sensitive data are the people who should have it, and that your access logs show nobody inappropriate is looking at things they shouldn't.
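The pairing of restriction and logging can be sketched as below: every read of sensitive data, whether allowed or denied, leaves an audit entry. The in-memory `AUDIT_LOG` list and the function names are stand-ins for whatever database and logging pipeline you actually use.

```python
# Sketch pairing a sensitive-data read with a mandatory audit-log entry.
# The in-memory log and record lookup are illustrative placeholders.
import datetime

AUDIT_LOG = []

def read_sensitive(user: str, record_id: str, allowed_users: set[str]):
    """Every access attempt, allowed or denied, leaves an audit trail."""
    decision = "allowed" if user in allowed_users else "denied"
    AUDIT_LOG.append({
        "user": user,
        "record": record_id,
        "decision": decision,
        "at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    })
    if decision == "denied":
        raise PermissionError(f"{user} may not read {record_id}")
    return {"record": record_id}  # placeholder for the real lookup

read_sensitive("alice", "billing-42", allowed_users={"alice"})
try:
    read_sensitive("bob", "billing-42", allowed_users={"alice"})
except PermissionError:
    pass
print([(e["user"], e["decision"]) for e in AUDIT_LOG])  # [('alice', 'allowed'), ('bob', 'denied')]
```

Logging denied attempts as well as successful ones is deliberate: it's the denied entries that let you detect snooping, which is exactly what the auditor wants the access logs to demonstrate.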
Confidentiality is particularly important for SaaS companies because customer data is inherently sensitive and customers need to know that only the right people within your organization can see their data. If a customer is uploading confidential business information, they need confidence that it's not visible to random employees. This is different from preventing external attackers from getting in. This is about preventing internal access that shouldn't happen. The controls are about treating sensitive data differently from everything else, and about proving that access is restricted and monitored.
Privacy: Doing What You Promised with Personal Data
Privacy is where the focus shifts from internal controls to external commitments and regulatory compliance. It's about complying with the regulations and commitments you've made around what you do with personal data. This includes data collection—are you collecting only the data you need or are you hoarding? Notice and consent—did you tell people you were collecting it, and did they agree? Use restrictions—are you using data only for the purposes you said you would, or are you repurposing it? Retention—are you deleting data when you promised to? Breach notification—do you have a plan to tell people if their data gets out?
Privacy sounds like it's purely a legal compliance question, and there's definitely a legal layer, but from the SOC 2 perspective it's about operational controls that support your privacy commitments. The auditor verifies that you have documented privacy policies, that you're following those policies, and that you have procedures in place for things like honoring data deletion requests, responding to data access requests (important under GDPR), or notifying people when a breach happens. You'll need to show evidence of your privacy policies, documentation of how you classify and handle personal data, processes for honoring deletion or access requests, and procedures for breach notification.
Many companies think Privacy is just a policy document on their website. In reality, Privacy controls permeate operations. You need a data inventory—knowing where personal data lives in your systems. You need access controls preventing unauthorized access to that data. You need retention procedures ensuring data doesn't sit around indefinitely. You need deletion procedures ensuring you can actually remove data when requested. You need processes for responding to data subject access requests. You need incident response procedures that specifically address data breaches. And you need evidence that all of this is actually happening. If you process personal data from EU residents, you'll have GDPR commitments on top of your own privacy policies, and the auditor will be evaluating whether you're meeting those commitments too.
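The retention procedures above can be sketched as a periodic sweep: records older than the retention window are purged, and the sweep reports how many it removed so the run itself becomes evidence. The 365-day window and the record shape are assumptions for illustration; your actual retention periods come from your privacy policy and applicable law.

```python
# Sketch of a retention sweep. The 365-day window and record shape are
# illustrative assumptions; real retention periods come from policy and law.
import datetime

RETENTION_DAYS = 365

def sweep(records, now=None):
    """Drop records older than the retention window; report the purge count."""
    now = now or datetime.datetime.now(datetime.timezone.utc)
    cutoff = now - datetime.timedelta(days=RETENTION_DAYS)
    kept = [r for r in records if r["collected_at"] >= cutoff]
    purged = len(records) - len(kept)
    return kept, purged

now = datetime.datetime(2025, 6, 1, tzinfo=datetime.timezone.utc)
records = [
    {"id": 1, "collected_at": now - datetime.timedelta(days=400)},
    {"id": 2, "collected_at": now - datetime.timedelta(days=30)},
]
kept, purged = sweep(records, now=now)
print(len(kept), purged)  # 1 1
```

Scheduling a sweep like this and retaining its reports is what turns "we delete old data" from a policy statement into the kind of operational evidence an auditor can test.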
How the Criteria Interact and Which Ones Apply to You
In practice, every SOC 2 audit includes Security; the other four criteria are optional, added to scope based on your services and the commitments you make to customers, and they don't carry equal weight in every organization. Your scope defines which criteria are most relevant to your business and where the auditor should concentrate their evaluation effort. A pure infrastructure company focused on cloud hosting or storage might emphasize Security and Availability because customers primarily care about preventing intrusions and ensuring uptime. A data analytics company might emphasize Processing Integrity and Security because customers care that their data is accurate and not breached. A healthcare application would emphasize Security, Privacy, and Confidentiality because those are the deep concerns of healthcare law and HIPAA requirements. A company processing payments might emphasize Processing Integrity and Confidentiality.
The criteria also overlap, which is important to understand. A single control might satisfy multiple criteria at once. For example, encryption of sensitive data contributes to both Confidentiality (controlling access to sensitive data) and Privacy (protecting personal data). Role-based access controls contribute to both Security (controlling system-wide access) and Confidentiality (controlling sensitive data access). A documented incident response plan contributes to Security and also to Privacy if the incident response includes breach notification procedures. The auditor understands this interconnection and won't double-count controls or create unnecessary duplication. But you should understand it when you're building your evidence library so you don't miss the ways that a single operational control can provide evidence of multiple criteria.
The scope conversation with your auditor is critical because it determines which criteria are included at all and where you should concentrate your evidence-gathering efforts. You might tell your auditor "Security and Availability are most important because we're SaaS infrastructure" and they'll adjust their testing plan accordingly. Or you might say "we need Privacy in scope because we're handling EU personal data" and they'll focus more testing energy there. Scoping isn't about ignoring risk: Security is always evaluated, and each criterion you add is tested in full. It's about allocating the audit's effort where it matters most for your business. Understanding your own scope is crucial for planning your evidence collection and knowing what your auditor will be looking at most closely.
Reading Your Report Through the Lens of the Criteria
When you receive your SOC 2 report, it will be organized around these five criteria (or however many are in your scope). The report will have sections for Security, and within that section you'll see subsections describing your access controls, encryption practices, monitoring capabilities, and incident response procedures. You'll see sections for Availability describing your uptime monitoring and change management. You'll see Processing Integrity controls described, Privacy controls detailed, and Confidentiality controls explained. Each section will include the auditor's findings about whether those controls are effective.
Understanding what each criterion covers helps you read the report intelligently. When someone asks to see your SOC 2 report, you can tell them "it covers Security, Availability, and Processing Integrity" and they'll understand what was actually evaluated. If a customer asks whether your report covers Privacy, you can check the scope and give them a direct answer. When a prospect is reviewing your report, they can look for the criteria that matter most to their risk profile and make an informed decision. The report itself becomes much less mysterious when you understand that it's simply organized documentation of controls in five defined categories, and you understand what each category covers.
The Path to Understanding Your Audit
The five Trust Service Criteria (Security, Availability, Processing Integrity, Confidentiality, and Privacy) are simply the five categories of controls that customers care about most. Security is about keeping bad actors out and detecting incidents when they happen. Availability is about maintaining uptime and preventing operational failures. Processing Integrity is about ensuring data accuracy and completeness. Confidentiality is about controlling who can access sensitive data. And Privacy is about handling personal data according to your commitments and the law. Not every audit includes all five: Security is always in scope, and your business determines which of the others belong there too.
When you read your SOC 2 report or discuss audit scope with an auditor, you now understand what each criterion means and what the auditor was looking for when evaluating controls within that category. You can have informed conversations about what matters to your customers, what your business model requires, and where to concentrate your evidence-gathering efforts. The criteria aren't arbitrary bureaucratic categories. They're the result of decades of experience figuring out what customers actually need to know about vendors, and what controls actually matter most for building trust in a service relationship.
Fully Compliance provides educational content about IT compliance and cybersecurity. This article reflects general information about SOC 2 Trust Service Criteria as of its publication date. Standards, criteria emphasis, and audit scope requirements evolve — consult a qualified compliance professional for guidance specific to your organization.