⚡ TL;DR
- Key difference: Critique improves designs; review evaluates them
- When to critique: During design exploration and iteration
- When to review: At decision points and approval gates
- Both are essential: You need both for an effective design process

The Problem Most Teams Have
Picture this: You schedule a "design feedback session." The designer presents their work. Sarah from marketing suggests changing the colors. John from engineering questions the technical feasibility. The product manager asks if it meets the requirements. The design director wants to explore other approaches.
New to design critiques? Start with our complete guide to design critiques for the full picture.
Twenty minutes later, everyone's talking past each other. The designer has a list of conflicting feedback. The stakeholders wonder if the design is approved. Nobody knows what happens next.
This happens because you mixed critique with review.
Critique is about making the design better. Review is about deciding if it's ready. They require completely different mindsets, participants, and outcomes. When you combine them in one meeting, you get the worst of both worlds.
Why Teams Get Confused
The problem starts with language. We use "feedback" to mean everything from "let's brainstorm ways to improve this" to "is this good enough to ship?" But those are fundamentally different conversations.
When you invite stakeholders to a "design review," half of them think you want their input on how to make it better. The other half think you need their approval to move forward. The designer doesn't know which feedback to act on.
This confusion wastes everyone's time. Worse, designs that could have been much better get approved prematurely because nobody knew they were allowed to suggest improvements.
The solution is simple: separate critique from review, and be explicit about which one you're doing.
Clear Definitions
Let's get specific about what each one actually means:
Design Critique
A collaborative conversation focused on making the design better. Everyone assumes the design is a work in progress and the goal is improvement.
Questions you hear:
- "What if we tried...?"
- "How might we improve...?"
- "I wonder about..."
- "Have you considered...?"
Typical participants:
- Other designers
- Subject matter experts
- Researchers
- Anyone with relevant insight
Outcome:
- List of improvements to try
- New ideas to explore
- Problems to investigate
- Next iteration plan
Design Review
An evaluative process where stakeholders determine if the design meets requirements and is ready to proceed. The goal is making decisions.
Questions you hear:
- "Does this meet our goals?"
- "Can we build this?"
- "Is this on-brand?"
- "Are we ready to proceed?"
Typical participants:
- Stakeholders with decision authority
- Product managers
- Engineering leads
- Business owners/clients
Outcome:
- Approve to proceed
- Request specific changes
- Reject and explore alternatives
- Conditional approval with constraints
Here's the key insight: critique makes designs better, review makes decisions about them. Both are essential, but they need to happen separately with different people and different processes.
Key Differences That Matter
These differences aren't just academic — they determine whether your feedback sessions actually improve the design or just waste everyone's time.
| Aspect | Design Critique | Design Review |
|---|---|---|
| Purpose | To improve and refine the design | To evaluate and make decisions |
| Timing | Throughout the design process | At specific decision points |
| Mindset | Collaborative and constructive | Evaluative and decisive |
| Participants | Designers, peers, subject experts | Stakeholders, decision makers, clients |
| Outcome | Actionable feedback for improvement | Approval, rejection, or revision request |
| Questions Asked | How can this be better? | Is this good enough? |
Critique Meeting Structure
Duration: 30-90 minutes
Time for deep exploration and ideation
Format: Open discussion
Questions, suggestions, and collaborative thinking
Energy: Generative
"What if..." and "How might we..." questions
Follow-up: Action items
Designer chooses which feedback to act on
Review Meeting Structure
Duration: 15-30 minutes
Focused on decision-making, not exploration
Format: Structured evaluation
Criteria-based assessment and decisions
Energy: Decisive
"Does this meet..." and "Can we..." questions
Follow-up: Clear decisions
Approved, rejected, or specific changes required
How Language Changes Everything
Critique Language
"What if we explored..."
"I'm curious about..."
"How might this impact..."
"Could we try..."
"I wonder if users would..."
Review Language
"Does this meet our requirements?"
"Can we build this on time?"
"Is this aligned with our brand?"
"Will this achieve our KPIs?"
"Are we ready to proceed?"
The bottom line: Critique asks "How can we make this better?" Review asks "Is this good enough to move forward?" Both questions are important, but they require different conversations with different people at different times.
When to Use Each
Use Critique When...
Early Design Exploration
When you're exploring different approaches and concepts
Example: Sketching wireframes and need input on information architecture
Iterative Refinement
When you have a direction but need to improve specific aspects
Example: Visual design is set but the interaction details need work
Skill Development
When someone wants to learn and grow as a designer
Example: Junior designer presenting work to learn best practices
Problem Solving
When you're stuck on a specific design challenge
Example: Complex navigation structure that users are struggling with
Use Review When...
Milestone Gates
At predetermined decision points in the project
Example: End of discovery phase, ready to move to detailed design
Client Approval
When external approval is required to proceed
Example: Presenting final designs to client for sign-off
Quality Assurance
Ensuring designs meet standards before handoff
Example: Pre-development review for accessibility and technical feasibility
Budget/Timeline Decisions
When business decisions affect design scope
Example: Determining which features to cut due to timeline constraints
What Makes Each Effective
Good Critique Has...
Specific and Actionable
Points to exact issues with clear suggestions
Example: "The login button gets lost against the blue background. Try white text on orange to match your brand colors."
User-Focused
References user needs and behavior
Example: "Users might not notice the search function in the sidebar. Consider placing it in the header where they expect it."
Constructive Tone
Assumes positive intent and offers help
Example: "I like the clean layout approach. What if we tried grouping related actions to reduce cognitive load?"
Evidence-Based
References design principles or user data
Example: "Research shows users scan in an F-pattern, so important info should align left."
Good Review Has...
Clear Criteria
Evaluates against predetermined standards
Example: "Does this meet our accessibility requirements? Does it align with brand guidelines?"
Business Context
Considers project goals and constraints
Example: "This design achieves the conversion goals we set, but may be too complex for our development timeline."
Decisive Outcome
Results in clear next steps
Example: "Approved to move forward," "needs revision before approval," or "explore an alternative approach"
Risk Assessment
Identifies potential problems before implementation
Example: "This approach might confuse existing users who are familiar with the current navigation."
Common Mistakes (And How to Fix Them)
These are the mistakes we see over and over. Here's how to avoid them:
"Critique and Review are the same thing"
Reality: They serve different purposes and require different approaches
Impact: Leads to ineffective meetings where people try to improve and approve at the same time
"You can skip critique and go straight to review"
Reality: Review without critique often results in poor design decisions
Impact: Designs that could have been much better get approved prematurely
"Critique should always be gentle and positive"
Reality: Good critique is honest about problems while remaining constructive
Impact: Real issues go unaddressed, leading to poor user experiences
"Only senior people can give good critique"
Reality: Anyone can provide valuable perspective when it's well-structured
Impact: Teams miss out on fresh perspectives and diverse viewpoints
"Review meetings should include everyone"
Reality: Review meetings should include decision makers and key stakeholders only
Impact: Meetings become unfocused and decisions take too long
"We'll do both in the same meeting to save time"
Reality: This actually wastes more time because people are operating in different modes simultaneously.
Fix: If you must combine them, clearly separate the phases: "First 30 minutes are critique — we explore ideas. Last 15 minutes are review — we make decisions."
"The designer should implement all feedback received"
Reality: In critique, the designer filters feedback and chooses what to act on. In review, specific changes are requirements.
Fix: Make it clear upfront: "This is critique — I'll consider all ideas but choose what to implement" vs "This is review — approved changes are requirements."
"Design critique is just being nice about problems"
Reality: Good critique is direct about problems while remaining constructive about solutions.
Fix: Frame feedback as opportunities: "This navigation might confuse users who expect X. What if we tried Y approach?" vs "This is confusing."
Real-World Examples
Here's how the same project would be handled differently in critique versus review sessions:
New E-commerce Homepage Design
Scenario: The design team needs feedback on their latest work. Here's how the approach changes completely based on the session type:
If this were a Critique Session
Who's invited:
UX designers, visual designer, content strategist
Questions they'd ask:
- "Does the information hierarchy guide users toward key actions?"
- "How might we improve the product discovery experience?"
- "What if we tested different hero section approaches?"
What happens next:
Refined design with better product navigation and clearer value proposition
If this were a Review Session
Who's invited:
Product manager, marketing director, development lead
Questions they'd ask:
- "Does this achieve our conversion rate goals?"
- "Can we build this within budget and timeline?"
- "Does it align with our Q4 marketing strategy?"
What happens next:
Approved to proceed to development with noted technical considerations
Mobile App Onboarding Flow
Scenario: The team has drafted a new onboarding flow and needs feedback. Again, the approach changes completely based on the session type:
If this were a Critique Session
Who's invited:
UX team, product designer, user researcher
Questions they'd ask:
- "How can we reduce the number of steps without losing key information?"
- "What patterns from successful apps could we adapt?"
- "How might we make each screen more engaging?"
What happens next:
Streamlined flow with progressive disclosure and personality
If this were a Review Session
Who's invited:
Product owner, engineering manager, business stakeholder
Questions they'd ask:
- "Will this onboarding improve our activation rates?"
- "Is the development complexity justified by the business impact?"
- "Does this support our user acquisition strategy?"
What happens next:
Approved with request to A/B test against current onboarding
What Happens When You Mix Them
This is the meeting that inspired this article — everyone's been in this situation:
🚫 The "Feedback" Meeting (Don't Do This)
Who shows up: Designer, product manager, engineer, marketing director, CEO
What happens:
- Designer presents work as "ready for feedback"
- Marketing director suggests exploring other color options (critique mindset)
- CEO asks if this version can ship next week (review mindset)
- Engineer points out technical challenges (review mindset)
- Product manager wonders if we tested other approaches (critique mindset)
- Designer has conflicting feedback and unclear next steps
Result: 30 minutes wasted, everyone confused, design doesn't get better
Better: Schedule a Critique First
Invite designers and domain experts. Focus on making it better. Get specific improvement ideas.
Then: Schedule a Review
Invite decision makers. Present the improved version. Get approval or specific requirements for changes.
Scripts & Frameworks You Can Use
Stop winging it. Here are specific frameworks and scripts that actually work:
Critique Session Framework
Email Template
Subject: Design Critique: [Project Name] - [Date/Time]
Hi team,
I'd like your help improving [specific aspect] of [project]. This is a critique session — our goal is to explore ideas and identify opportunities, not to approve anything.
We'll focus on:
• [Specific area 1]
• [Specific area 2]
• [Specific area 3]
Come prepared to:
• Ask questions
• Share ideas
• Think collaboratively
[Link to designs or context]
Session Agenda (45 min)
0-5 min: Context & goals
5-15 min: Present design & explain thinking
15-35 min: Open discussion
• Questions for understanding
• Ideas to explore
• Concerns to address
35-45 min: Synthesize & prioritize feedback
Questions That Work
For the presenter:
- "What specific feedback would be most helpful?"
- "What are you unsure about?"
- "What alternatives did you consider?"
- "What constraints should we keep in mind?"
For participants:
- "What if we explored [alternative approach]?"
- "How might users react to [specific element]?"
- "I'm curious about the reasoning behind [decision]"
- "Could we test [specific hypothesis]?"
Review Session Framework
Email Template
Subject: Design Review: [Project Name] - Decision Needed
Hi stakeholders,
Ready to review [project] for [specific milestone/approval]. This is a review session — our goal is to evaluate and make decisions.
We'll evaluate against:
• [Specific criteria 1]
• [Specific criteria 2]
• [Specific criteria 3]
Possible outcomes:
• Approve to proceed
• Request specific changes
• Explore alternative approach
[Link to designs & requirements]
Session Agenda (25 min)
0-5 min: Review goals & criteria
5-15 min: Present final design
15-20 min: Evaluate against criteria
• Does this meet [requirement]?
• Can we deliver this?
• Any risks or concerns?
20-25 min: Decision & next steps
Evaluation Criteria Template
Business Requirements
- Meets project goals?
- Within budget/timeline?
- Supports KPIs?
- Aligns with strategy?
Technical Feasibility
- • Can we build this?
- • Performance implications?
- • Integration challenges?
- • Maintenance complexity?
Quality Standards
- Accessibility compliant?
- Brand consistent?
- User testing validated?
- Design system aligned?
Want more specific guidance? Our complete guide to running design critiques has detailed scripts, facilitation techniques, and common pitfalls to avoid. For broader feedback skills, check out our design feedback best practices resource.
Measuring Success
How do you know if your critique and review processes are actually working? Here are the metrics that matter:
Signs Your Critique Process Works
Designers ask for critique early and often
They're not defensive; they actively seek input to improve their work
Multiple ideas emerge from sessions
People build on each other's suggestions; conversations generate new possibilities
Design quality improves iteration to iteration
Each round of critique leads to measurably better work
People learn from each other
Junior designers pick up techniques; senior designers gain fresh perspectives
Sessions end with clear next steps
Designer knows what to explore, test, or refine next
Signs Your Review Process Works
Decisions happen quickly and stick
No endless rounds of "just one more change" or second-guessing
Stakeholders trust the process
They know when to give input vs when decisions have been made
Projects move forward predictably
Review gates become smooth checkpoints, not bottlenecks
Criteria are clear and consistent
Everyone knows what "approved" means before the meeting starts
Feedback is actionable
When changes are requested, they're specific and tied to business requirements
Red Flags to Watch For
Critique Problems
Designers avoid critique sessions or seem defensive
Sessions focus on personal preferences rather than user needs
Feedback is vague ("make it pop") or purely negative
Same people dominate every conversation
Review Problems
Decisions get revisited or overturned frequently
Reviews turn into design exploration sessions
Stakeholders surprised by what they're reviewing
Approval criteria emerge during the meeting
Simple Health Check
Ask your team these questions monthly:
- Are our designs getting better through critique?
- Do people feel comfortable sharing honest feedback?
- Are review decisions sticking?
- Do stakeholders understand when their input is wanted?
- Are projects moving through reviews smoothly?
- Is the feedback process helping or hurting quality?
Implementation Guide
Ready to fix your feedback process? Here's your step-by-step plan:
Define Your Process
Establish when critique and review happen in your workflow
Train Your Team
Help everyone understand their role in each type of session
Structure Your Sessions
Use the right format for the right purpose
Follow Up Effectively
Ensure both types of sessions lead to action
Start With One Project
Pick a current project and implement both critique and review sessions deliberately
Iterate and Scale
Refine your process based on what you learn, then roll it out team-wide
Master Design Feedback
Understanding critique vs review is just the beginning. Here's what to read next:
How to Run Design Critique
Step-by-step guide to facilitating critique sessions that actually improve designs. Includes scripts, agendas, and facilitation techniques.
Design Feedback Best Practices
Learn to give and receive constructive feedback that builds trust and improves outcomes. Psychology, language, and timing that works.
AI vs Human Design Feedback
When to use AI tools vs human critique. Compare strengths, limitations, and the optimal combination for your workflow.
💡 Pro tip: Start with the critique guide to master facilitation, then read the best practices for specific feedback techniques. The AI comparison will help you decide what tools to add to your process.
