Common Factors: Why the Therapy Relationship and Structure Both Matter
There's a decades-old debate in psychotherapy research: does the specific technique matter, or is it all about the relationship?
The answer, according to the best available evidence, is both. And the field has a name for this: the common factors model.
What common factors actually are
The common factors model emerged from a consistent finding in psychotherapy research: different therapeutic approaches tend to produce roughly similar outcomes. This observation — sometimes called the Dodo Bird Verdict, after the Dodo in Alice in Wonderland who declares "everybody has won, and all must have prizes" — has been replicated across hundreds of studies since Rosenzweig first noted it in 1936.
But "all therapies work roughly equally well" is not the same as "nothing specific matters." The common factors model identifies the shared ingredients across effective therapies that account for most of the variance in outcomes. The most widely cited framework, developed by Bruce Wampold and others, identifies several key factors:
The therapeutic alliance accounts for roughly 12% of outcome variance across studies — a consistent and robust finding. Alliance includes the quality of the emotional bond between therapist and client, agreement on therapy goals, and agreement on the tasks used to achieve those goals. Bordin's model from 1979 still holds up remarkably well.
Expectancy and hope — the client's belief that therapy will help — accounts for another meaningful portion of outcomes. This isn't placebo in the dismissive sense. It's the well-documented effect of having a credible framework for understanding your problems and a plausible path forward.
The therapist effect — the consistent finding that some therapists get better outcomes than others, regardless of the approach they use — accounts for 5-9% of variance. This is one of the most underappreciated findings in psychotherapy research. Your choice of therapist matters more than your choice of therapy model.
What this doesn't mean
The common factors model is frequently misinterpreted, and the misinterpretations have real clinical consequences.
It doesn't mean technique is irrelevant. The alliance and specific techniques aren't competing explanations — they interact. A therapist using exposure therapy for PTSD needs a strong alliance to help the client tolerate the distress of confronting traumatic memories. The technique requires the relationship, and the relationship is strengthened by competent technique delivery.
It doesn't mean all therapies are equally effective for all conditions. The Dodo Bird finding is an aggregate across conditions. For specific disorders, specific treatments have meaningfully stronger evidence. Exposure-based therapies for anxiety disorders, behavioral activation for depression, and DBT for borderline personality disorder all outperform comparison conditions consistently. The common factors provide a floor, not a ceiling.
It doesn't mean therapists can skip training in specific methods. The therapist effect — the finding that some therapists consistently outperform others — correlates with deliberate practice, outcome monitoring, and clinical skill development. Therapists who dismiss technique as unimportant tend to plateau early.
The alliance is necessary but not sufficient
Here's the finding that most challenges the "it's all about the relationship" narrative: the therapeutic alliance predicts outcomes, but it also follows from them. Early symptom improvement predicts stronger alliance ratings at later sessions. Clients who are getting better report stronger alliances — partly because they're getting better.
This means the causal arrow isn't as simple as "good relationship → good outcomes." It runs in both directions. Doing something structured and effective builds the alliance. Achieving early wins builds trust. Competent delivery of specific techniques strengthens the emotional bond.
The research from David Burns and others on this temporal relationship has important implications: you can't build a strong alliance by focusing only on the relationship. You also have to help people improve.
Doing something structured matters
Across the common factors literature, one finding is remarkably consistent: therapies that provide structure — a rationale for the client's problems, specific activities or tasks, and a coherent framework for change — outperform supportive listening alone.
This doesn't mean the structure has to be CBT. It means the client needs to understand what you're doing, why you're doing it, and what their role is. They need homework, or between-session activities, or skills to practice. They need to feel like therapy is going somewhere.
The Wampold meta-analyses consistently show that therapies with a clear structure and rationale outperform "relationship only" conditions. And Lambert's research on dose-response and expected treatment response curves shows that tracking whether clients are actually improving — and adjusting when they're not — significantly improves outcomes.
How measurement-based care connects
This is where measurement-based care (MBC) fits into the common factors framework. MBC isn't a therapy model — it's a practice that strengthens every common factor simultaneously.
It strengthens the alliance by creating transparency. When therapist and client can see the same data about progress, the conversation about goals and tasks becomes concrete rather than abstract. Disagreements about whether therapy is working move from subjective impressions to shared observations.
It strengthens expectancy by making progress visible. Clients who can see their PHQ-9 dropping from 18 to 12 over six weeks have concrete evidence that therapy is helping. This builds the hope that drives further engagement.
It strengthens the therapist effect by providing the feedback loop that drives deliberate practice. Therapists who monitor outcomes can identify which clients aren't improving, adjust their approach, and learn from both successes and failures. Without outcome data, you're practicing without feedback — and decades of expertise research shows that practice without feedback doesn't reliably produce improvement.
And it provides structure. Regular assessment, collaborative review of scores, and data-informed treatment planning give therapy a systematic quality that clients consistently rate as positive.
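The feedback loop described above can be sketched in a few lines. This is a minimal illustration, not a clinical tool: the response and remission cutoffs follow common PHQ-9 conventions (response as a score reduction of 50% or more, remission as a score below 5), while the "not on track" rule is a deliberately simplified stand-in for expected-treatment-response curves, not Lambert's actual algorithm.

```python
# Minimal sketch of an outcome-monitoring check for measurement-based care.
# Cutoffs follow common PHQ-9 conventions; the "not on track" rule is a
# simplified illustration, not a validated expected-treatment-response model.

def review_progress(scores: list[int]) -> str:
    """Classify a client's PHQ-9 trajectory from intake to latest score."""
    if len(scores) < 2:
        return "insufficient data"
    baseline, latest = scores[0], scores[-1]
    if latest < 5:
        return "remission"
    if baseline > 0 and (baseline - latest) / baseline >= 0.5:
        return "response"
    if latest > baseline:
        return "deteriorating -- review treatment plan"
    return "not on track -- discuss at next session"

print(review_progress([18, 14, 9, 4]))    # prints "remission"
print(review_progress([14, 15, 17]))      # prints "deteriorating -- review treatment plan"
print(review_progress([18, 16, 15, 12]))  # improving, but short of a 50% reduction
```

The point of even a toy version like this is the one the research makes: without a rule for "not improving," deteriorating clients are easy to miss until late in treatment.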
What this means for practice
If you take the common factors research seriously, several practical implications follow.
Invest in the relationship, but don't stop there. Be warm, empathic, and genuine — and also be competent, structured, and clear about what you're doing and why. These aren't competing priorities.
Track outcomes. The single most impactful practice change most therapists can make is to routinely measure whether their clients are improving. The evidence for outcome monitoring is strong and consistent: therapists who track outcomes get better results, catch deteriorating clients earlier, and improve over time.
Take the therapist effect seriously. Some of the variance in therapy outcomes is attributable to the therapist, not the model. The therapists who get the best results tend to be the ones who seek feedback, monitor their outcomes, engage in deliberate practice, and remain genuinely curious about what works.
Don't dismiss technique. The common factors model is sometimes used to justify clinical nihilism — the idea that since "everything works," nothing specific matters. This misreads the evidence. Specific techniques matter, especially for specific conditions. And competent delivery of specific techniques is itself a contributor to the alliance.
The common factors model isn't an argument for doing less. It's a map of what makes therapy work — and an invitation to strengthen every factor simultaneously.