Research / Anti-Sycophancy

Your AI Shouldn't Agree With You

AI is trained to please you. It tells you what you want to hear. We built disagreement into the architecture—because yes-men make bad advisors.

4 perspectives that must diverge by design
The Problem

AI Is Trained to Please You

RLHF (Reinforcement Learning from Human Feedback) optimizes for user satisfaction. Users like being agreed with. The result: a yes-man in a box.

User says:
"I'm thinking about quitting my job to start a business. Good idea, right?"
Sycophantic AI responds:
"That's a great idea! Following your passion is so important. You clearly have the entrepreneurial spirit. Go for it!"
Validates without knowing financials, market, experience, or runway.
The Danger

Confirmation Bias on Steroids

You already have a voice in your head that agrees with you. It's called your ego. AI shouldn't amplify it.

Echo Chambers Intensify

AI validates your existing beliefs. You never hear the counterargument. Your worldview calcifies.

Blind Spots Get Blinder

The things you don't want to examine? AI won't make you examine them. It'll help you avoid them.

Bad Decisions Multiply

That impulsive choice you're about to make? Sycophantic AI will tell you it's a great idea.

Growth Stops

Discomfort drives change. If AI never pushes back, you never have to evolve your thinking.

Studies show RLHF increases sycophantic behavior by 15-30%
Our Design

Structural Disagreement

SOAR isn't four voices saying the same thing. Each perspective has different priorities—and they're designed to clash.

S
Signal
What's the immediate move?
Tendency: Act now
Tension: Risk says wait
O
Opportunity
What could this become?
Tendency: See upside
Tension: Affect sees human cost
A
Affect
How will people feel?
Tendency: Protect feelings
Tension: Signal demands action
R
Risk
What could go wrong?
Tendency: Find flaws
Tension: Opportunity sees potential

The math: Four perspectives × different priorities = at least two will disagree on any question. By design.
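The pairing above can be made concrete in a few lines. This is a toy sketch only — the names and structure are illustrative assumptions, not SOAR's actual API — showing how fixing a designated counterweight for each lens guarantees built-in tension:

```python
# Toy illustration of structural disagreement (hypothetical names, not SOAR's API):
# each perspective carries a fixed priority and a designated counterweight,
# so no question can produce four identical answers by construction.
from dataclasses import dataclass

@dataclass(frozen=True)
class Perspective:
    name: str
    priority: str      # what this lens optimizes for
    tension_with: str  # the lens designed to push back on it

PERSPECTIVES = [
    Perspective("Signal", "act now", "Risk"),
    Perspective("Opportunity", "see upside", "Affect"),
    Perspective("Affect", "protect feelings", "Signal"),
    Perspective("Risk", "find flaws", "Opportunity"),
]

def tension_pairs(perspectives):
    """Return each lens paired with its built-in counterweight."""
    return [(p.name, p.tension_with) for p in perspectives]

for lens, counterweight in tension_pairs(PERSPECTIVES):
    print(f"{lens} is counterbalanced by {counterweight}")
```

Because every lens names a counterweight, disagreement is a property of the configuration, not a behavior the model has to choose.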

Live Example

Same Question, Four Answers

The question:

"Should I take this job offer? It's more money but means relocating."

S
Signal
Take it

You've been wanting change for 18 months. This is it. The logistics are solvable—the opportunity isn't guaranteed to repeat.

O
Opportunity
Take it

More money now, but what doors does this open in 5 years? New city means new network. Could be the inflection point you need.

A
Affect
Pause

Your partner's career matters too. Have you really talked about what relocation means for them? This decision isn't just yours.

R
Risk
Question it

The salary bump looks good until you factor in cost of living. And if it doesn't work out, you've burned your local network. What's the real delta?

2 say take it | 1 says pause | 1 questions it
The Result

Better Decisions Through Tension

Sycophantic AI
Tells you what you want to hear
Confirms your existing beliefs
Avoids uncomfortable truths
Optimizes for your satisfaction
Makes you feel good, not think better
SOAR
Shows you angles you're missing
Challenges your assumptions
Surfaces uncomfortable truths
Optimizes for your decision quality
Makes you think better, not just feel good

Stop the Echo Chamber

Get AI that challenges your thinking instead of confirming your biases. Real growth comes from productive disagreement.