Your SOAP scorecard, inspired by Gartner® Critical Capabilities

Gartner® publishes two complementary reports on Service Orchestration and Automation Platforms (SOAPs): the Magic Quadrant™ for SOAP and the Critical Capabilities for SOAP. The Magic Quadrant™ evaluates vendors at the organizational level, scoring their Ability to Execute and Completeness of Vision. In my view, the companion Critical Capabilities report takes the analysis deeper, focusing on the features and capabilities of the products themselves and mapping them to five key Use Cases.
Together, the two reports give a comprehensive view of the SOAP market landscape, but they remain market-level research, not an assessment of your specific business priorities.
Here, we offer a practical framework for translating Gartner’s approach into your own scorecard so you can evaluate SOAP platforms against your organization’s needs and goals.
Why capability-based evaluation matters
The Magic Quadrant™ is invaluable for seeing which vendors are positioned strongly in the market. It shows who’s executing effectively today and who has the vision and roadmap to meet tomorrow’s demands. But it’s not a detailed interrogation of product features or a guarantee of fit for your particular requirements.
That’s why the Gartner Critical Capabilities companion report is so useful. It zooms in on differentiators — why the SOAP software providers were recognized in particular areas. It asks: How well does this platform execute real-world tasks? How usable is it? What outcomes does it enable?
In the report, Gartner recommends, “When selecting a SOAP vendor, conduct thorough due diligence to understand their specific strengths in innovation, integration and responsiveness to emerging trends, rather than assuming parity in a mature market.”
Inspired by this approach, we’ve built a scorecard you can use to evaluate vendors for your particular purposes, for both functionality and fit, based on the five SOAP Use Cases.
Key capability domains to score
Each domain aligns with a Use Case from the Gartner report. Below, you’ll find:
- What the domain measures
- Traits to look for
- A 1–5 scoring rubric
Operational resilience and IT workload execution
Inspired by the IT Workload Automation Use Case
Can the platform orchestrate and safeguard large volumes of complex, time-sensitive IT workloads?
What to evaluate:
- SLA monitoring and escalation dashboards
- Automated failover, retry and recovery mechanisms
- Volume throughput and performance under stress
- System auditability and job history tracking
How to score:
Score | Description |
---|---|
1 | Minimal support; manual monitoring and recovery; no remote job monitoring; unreliable performance |
2 | Basic monitoring dashboards; manual recovery with some remote job monitoring |
3 | Real-time monitoring tools and alerts; basic recovery options; moderate reliability |
4 | SLA monitoring aligned with business requirements; intelligent recovery based on thresholds; strong dependency and decision-making features |
5 | Full observability for monitoring and problem management across system and job performance; automated rollback/recovery; extensive dependency management and resilient job execution; high SLA integrity |
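To pressure-test the recovery criteria above during a demo, it helps to picture what you’d otherwise script yourself. The sketch below is purely illustrative Python, not any vendor’s API: the job callable, thresholds and escalation hook are assumptions, and a strong SOAP should express this retry-and-escalate behavior as policy rather than code.

```python
import logging
import time

log = logging.getLogger("job-runner")

def run_with_recovery(job, max_retries=3, base_delay=60, sla_seconds=3600, escalate=print):
    """Retry a failing workload with backoff and escalate when the SLA is at risk.

    `job` is any callable representing a workload; `escalate` stands in for your
    alerting channel. All names and thresholds here are illustrative only.
    """
    start = time.monotonic()
    for attempt in range(1, max_retries + 1):
        try:
            result = job()
            log.info("Job succeeded on attempt %d", attempt)
            return result
        except Exception as exc:
            log.warning("Attempt %d failed: %s", attempt, exc)
            if time.monotonic() - start > sla_seconds:
                escalate(f"SLA window exceeded after {attempt} attempts: {exc}")
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))  # exponential backoff
    escalate(f"Job failed after {max_retries} attempts")
    raise RuntimeError("Retries exhausted")
```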
Hybrid orchestration and workflow flexibility
Inspired by the IT Workflow Orchestration Use Case
How well does the platform support both business and technical workflows across hybrid environments (on-prem, multi-cloud, SaaS)?
What to evaluate:
- Breadth of pre-built integrations across legacy and modern systems
- Ease of orchestration across teams and technologies (e.g., low-code)
- Flexibility to design, trigger and adapt complex workflows
- Support for both technical and non-technical users
How to score:
Score | Description |
---|---|
1 | Limited integrations; code-heavy; inflexible for cross-system workflows |
2 | Some connectors, but inflexible and code-heavy to customize; no low-code options; moderate flexibility |
3 | Connectors require manual installation; no connector library; limited reusability |
4 | Moderate connector library; community-supported connectors; some low-code options |
5 | Broad integration library; powerful no-code connector customization and reusable templates; non-technical user support |
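One practical way to compare platforms on this domain is to sketch one of your own hybrid workflows in neutral terms and ask each vendor to show how they’d model it. The structure below is a hypothetical declarative example in Python; the connector keys, trigger syntax and step names are assumptions, not any product’s schema.

```python
# A hypothetical, declarative view of one hybrid workflow: an event on an SFTP
# server kicks off an on-prem ERP batch job, a cloud function enriches the
# result, and a SaaS messaging step notifies the business team.
invoice_workflow = {
    "trigger": {"type": "file_event", "source": "sftp", "pattern": "invoices/*.csv"},
    "steps": [
        {"name": "load_to_erp", "connector": "erp_onprem", "action": "run_batch_job"},
        {"name": "enrich", "connector": "cloud_function", "action": "invoke",
         "depends_on": ["load_to_erp"]},
        {"name": "notify_finance", "connector": "saas_messaging", "action": "post_message",
         "depends_on": ["enrich"], "on_failure": "open_ticket"},
    ],
}
```

The more of this a platform lets non-developers assemble from a connector library rather than build in code, the higher it should score here.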
Data movement and pipeline governance
Inspired by the Data Orchestration Use Case
Can the platform reliably orchestrate large-scale, rule-based data flows across warehouses, lakes and BI systems?
What to evaluate:
- Availability of connectors for major data platforms (e.g., Snowflake, SAP Datasphere)
- Orchestration of rule-based, event-driven data flows
- SLA tracking for data jobs and throughput performance
- Guardrails like validations, retries and logging
How to score:
Score | Description |
---|---|
1 | Integrated with legacy data management solutions and databases; manual or scripted data transfers; low throughput; poor visibility |
2 | Core data management with very limited third-party integrations; some file management capabilities |
3 | Basic data management integrations; minimal guardrails; requires customization for downstream and upstream dependency management |
4 | Data pipeline (SaaS, iPaaS and MF) integrations; downstream dependency management and upstream management for reporting and analytics |
5 | High throughput; supports dynamic event-based orchestration; data governance; proactive SLA monitoring |
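As a reference point for the guardrails bullet above, here’s what a minimal validate-and-load step looks like when you have to write it yourself. The callables and the row-count check are placeholders; a capable platform should provide equivalent validation, logging and retry behavior out of the box.

```python
import logging

log = logging.getLogger("pipeline")

def transfer_with_guardrails(extract, load, min_rows=1):
    """Illustrative guardrails around a data movement step: validate, log, fail loudly.

    `extract` and `load` are placeholder callables for whatever moves your data
    (for example, warehouse or BI connectors); the row-count check stands in for
    the richer validation rules a SOAP should offer natively.
    """
    rows = extract()
    if len(rows) < min_rows:
        log.error("Validation failed: expected at least %d rows, got %d", min_rows, len(rows))
        raise ValueError("Source returned too few rows; aborting load")
    load(rows)
    log.info("Loaded %d rows", len(rows))
```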
Empowering business users
Inspired by the Citizen Automation Use Case
Can non-technical users safely create, edit and trigger automations with the right controls?
What to evaluate:
- Guided self-service tools for workflow design and execution
- Guardrails and governance features (e.g., approval workflows, role-based access)
- Training resources and onboarding ease
- Audit logs and rollback capabilities for business-created workflows
How to score:
Score | Description |
---|---|
1 | Designed only for developers/IT; no guardrails |
2 | Business users can receive scheduled email reports on the success or failure of workflows |
3 | Business users can view information about workflows in the UI but cannot influence them |
4 | Basic human-in-the-loop capabilities: business users can provide simple inputs to workflows to manage certain stages; some support for forms or reports in the UI |
5 | Full customization of the user experience, dashboards, forms and interfaces for visibility and management of workflows, safety checks and governance policies |
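One concrete way to probe governance in a demo is to ask what happens when a business user tries to change a workflow. The sketch below is a hypothetical stand-in for that control path: the roles, the approval rule and the audit format are assumptions, and a strong platform should enforce all of this through configuration rather than custom code.

```python
from datetime import datetime, timezone

AUDIT_LOG = []  # stand-in for a real audit store

def request_change(user, role, workflow, change):
    """Gate a business-user edit behind a role check, an approval step and an audit trail."""
    entry = {
        "user": user,
        "workflow": workflow,
        "change": change,
        "time": datetime.now(timezone.utc).isoformat(),
    }
    if role not in ("citizen_developer", "admin"):
        entry["status"] = "rejected"
        AUDIT_LOG.append(entry)
        raise PermissionError(f"{user} is not allowed to edit {workflow}")
    # Edits land in a pending state; an approver signs off before anything reaches production.
    entry["status"] = "pending_approval"
    AUDIT_LOG.append(entry)
    return entry
```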
DevOps readiness and automation agility
Inspired by the DevOps Automation Use Case
Does the platform integrate with DevOps toolchains and support agile release cycles?
What to evaluate:
- Native plugin availability for CI/CD tools
- API maturity and extensibility
- Support for version control, branching, rollback and parallel pipeline execution
- Ability to deploy and manage automation as code
How to score:
Score | Description |
---|---|
1 | No DevOps integration or versioning; versions managed manually; no way to move workflows between environments or systems to promote new workflows and other objects |
2 | Disconnected environments; automation developers manage change through manual export and import |
3 | Basic support for versioning and change management between environments; rigid and inflexible promotion and versioning |
4 | Integrated versioning and promotion of new workflows between environments; simple integrations with DevOps ecosystems |
5 | Comprehensive DevOps ecosystem integrations to automate and deploy new workflows from CI/CD pipeline management tools; low-code options to integrate with new environments; extensive in-product version and deployment control |
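If a vendor claims “automation as code,” ask to see a promotion step running inside a CI/CD pipeline. The snippet below shows the shape of such a step in Python; `soapctl` and its flags are entirely hypothetical placeholders for whatever export/import CLI or deployment API a given platform exposes.

```python
import subprocess

def promote(workflow_id, source_env, target_env, version):
    """Export a versioned workflow from one environment and import it into another.

    `soapctl` is a hypothetical CLI used only to illustrate the pattern; in practice
    a CI job would call the platform's real deployment API or command-line tooling.
    """
    bundle = f"{workflow_id}-{version}.json"
    subprocess.run(
        ["soapctl", "export", "--env", source_env,
         "--workflow", workflow_id, "--version", version, "--out", bundle],
        check=True,
    )
    subprocess.run(
        ["soapctl", "import", "--env", target_env, "--file", bundle],
        check=True,
    )
```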
Constructing your SOAP scorecard
You don’t need a complex spreadsheet to evaluate SOAPs. Just build a simple table:
Capability domain | Score (1-5) | Weight (%) | Weighted score |
---|---|---|---|
IT workload execution | 4 | 25 | 1.0 |
Workflow flexibility | 5 | 20 | 1.0 |
Data orchestration | 3 | 20 | 0.6 |
Citizen automation | 4 | 15 | 0.6 |
DevOps readiness | 2 | 20 | 0.4 |
Total | | 100 | 3.6 |
Adjust weights based on your priorities. If you’re focused on business agility, you might weigh citizen automation more heavily. If uptime is paramount, prioritize IT workload execution.
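If you’d rather not maintain even a spreadsheet, the same arithmetic fits in a few lines of Python; the domain keys and numbers below simply mirror the example table above.

```python
def weighted_total(scores, weights):
    """Combine 1-5 domain scores with percentage weights that sum to 100."""
    if sum(weights.values()) != 100:
        raise ValueError("Weights should sum to 100%")
    return sum(scores[domain] * weights[domain] for domain in scores) / 100

# Example using the illustrative numbers from the table above:
scores = {"workload": 4, "workflow": 5, "data": 3, "citizen": 4, "devops": 2}
weights = {"workload": 25, "workflow": 20, "data": 20, "citizen": 15, "devops": 20}
print(weighted_total(scores, weights))  # 3.6
```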
This approach doesn’t just tell you which provider offers the capabilities you want; it also shows how deep each capability goes.
Interpreting your results
- 4.5-5.0: Top-tier platform fit, capabilities with depth
- 3.5-4.4: Strong candidate, likely meets core needs with some tradeoffs
- 2.5-3.4: Mid-tier and may require customization or compromise
- <2.5: Unlikely to meet enterprise orchestration needs
Practical evaluation prompts
Use these conversation starters with vendors to dig into real-world capabilities.
- “Show me how a business user can edit this workflow safely.”
- “How many systems can I orchestrate without writing custom code?”
- “What happens if a data transfer job fails at 2 AM?”
- “Can this platform trigger deployments based on real-time events?”
- “How does the SLA dashboard escalate delays or job failures?”
Where Redwood leads — and what that signals for you
Redwood Software ranked #1 in all five Use Cases in the 2025 Gartner Critical Capabilities for SOAP report. We believe that reflects more than functional breadth: it confirms Redwood’s ability to deliver real-world orchestration across IT workloads, business workflows, citizen development, data movement and DevOps. This aligns with our mission to unleash human potential through automation fabric solutions.
A SOAP platform is not just a feature set but an enabler of better business outcomes. Use the scorecard above, and download the full Gartner Critical Capabilities report to optimize your search for the right SOAP.
About the author

Dan Pitman
Dan Pitman is a Senior Product Marketing Manager for RunMyJobs by Redwood. His 25-year technology career has spanned roles in development, service delivery, enterprise architecture and data center and cloud management. Today, Dan focuses his expertise and experience on enabling Redwood’s teams and customers to understand how organizations can get the most from their technology investments.