How to Give Good Feedback in Code Reviews: 5 Proven Techniques, Examples, and a Reviewer’s Checklist

Thoughtful code review feedback is one of the highest‑leverage habits in software development. Done well, it shortens lead time, reduces defects, spreads knowledge, and makes teams—and open source communities—more welcoming. Done poorly, it delays releases, creates friction, and demotivates contributors.
This guide distills five practical techniques for giving effective code review feedback on pull requests, plus pro tips, examples, and a lightweight checklist you can use today.
Why Effective Code Reviews Matter
Great code review feedback:
- Improves code quality, security, and maintainability before issues reach production
- Speeds up onboarding by modeling standards and architectural patterns
- Builds a culture of collaboration where engineers learn from one another
- Reduces rework and clarifies trade‑offs early in the development process
If you searched for “code review best practices,” “pull request feedback,” or “how to give good feedback,” you’re in the right place.
What Good Feedback Looks Like
Good code review feedback is:
- Specific: references the exact file, line, or behavior
- Actionable: includes a clear recommendation or suggested change
- Kind and respectful: focuses on the code, not the coder
- Contextual: explains the “why” behind the comment
- Prioritized: separates must‑fix items from nice‑to‑have suggestions
- Collaborative: invites discussion, alternatives, or clarifying questions
Example that lands well:
- “Consider using Array.prototype.every here to short‑circuit on the first failure. It keeps complexity O(n) and improves readability. Thoughts?”
Example to avoid:
- “This is wrong. Fix it.”
The 5 Techniques for Giving Better PR Feedback
1) The Show‑and‑Tell Review
When changes affect UI, UX, docs, or behavior, show what you mean—don’t just tell.
How to use it:
- Take a quick screenshot, screen recording, or GIF highlighting the behavior or visual issue
- In GitHub, open the PR → Files changed → add a comment on the relevant line/section → drag and drop your image or paste the link to a short clip
- Add one sentence of context: expected vs. actual outcome
When it shines:
- UI tweaks (layout shifts, contrast issues, spacing)
- Documentation updates (broken links, formatting, rendering differences)
- Behavioral changes (regression demonstrations, before/after states)
Sample comments:
- “Here’s how the tooltip renders on 125% zoom. The text gets truncated. Consider a max-width and white-space: normal.”
- “Docs render fine on GitHub but break on the docs site. Screenshot attached—front matter might need a title field.”
Pro tip: Add alt text for accessibility when you share images. It helps everyone—including future readers of the thread.
2) The Chameleon Review
Adapt your feedback to the type of contribution. Different PRs call for different lenses.
How to use it:
- For documentation and content: ask dialogic questions that sharpen clarity and usefulness
- “What makes this section unique compared to the tutorial above?”
- “Could we add a concrete example for first‑time users?”
- For backend changes: validate correctness and complexity
- “This adds a nested loop over records. Any concerns about O(n²) at scale? We could pre‑index by user_id.” (see the sketch after this list)
- For UI/Front‑end: check accessibility and responsiveness
- “Buttons lack accessible labels. Could we add aria-label attributes and focus states?”
- For data/infra/config: ask about blast radius and rollbacks
- “If this feature flag misbehaves, what’s the rollback path? Can we default to false for the first deploy?”
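To make the complexity point concrete, here’s a minimal sketch of the pre‑indexing fix suggested above (record shapes and function name are hypothetical):

```python
from collections import defaultdict

def attach_orders(users: list[dict], orders: list[dict]) -> list[dict]:
    # Before: a nested loop that rescans all orders for every user -- O(n^2).
    # for user in users:
    #     user["orders"] = [o for o in orders if o["user_id"] == user["id"]]

    # After: pre-index orders by user_id once, then do O(1) lookups -- O(n).
    orders_by_user = defaultdict(list)
    for order in orders:
        orders_by_user[order["user_id"]].append(order)
    for user in users:
        user["orders"] = orders_by_user[user["id"]]
    return users
```

Pasting a short sketch like this into the review thread often settles the discussion faster than a paragraph of prose.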
When it shines:
- You’re reviewing across domains (docs, infra, UI, backend, tests)
- You want to encourage deeper thinking without dictating solutions
3) Two Peas in a Pod (Tag‑Team Reviews)
Coordinate with another reviewer and divide feedback by strengths. You cover the copy and UX; they cover performance and tests.
How to use it:
- Quickly align in chat: “I’ll review a11y and copy; can you focus on query performance and test coverage?”
- Leave comments in separate threads, each focusing on your domain
- Summarize together in the final review to help the author prioritize
When it shines:
- Large PRs with cross‑cutting concerns
- Teams where pairing reduces review delay and spreads knowledge
4) The Teach‑Back Review
Don’t just flag issues—teach principles that help the contributor improve future work.
How to use it:
- Observation: “This div acts like a section heading.”
- Principle: “Semantic HTML improves accessibility and navigation.”
- Recommendation: “Use a heading element (e.g., <h2>) here, then style it; screen readers will announce it properly.”
- Reference (optional): link to docs or a snippet that illustrates the pattern
When it shines:
- Repeated patterns (semantic HTML, dependency injection, error handling)
- Mentoring and onboarding scenarios
- Open source communities where contributors have varied backgrounds
Sample comments:
- “Catching the base Exception obscures the error source. Could we catch ValueError instead? It keeps logs cleaner and avoids masking bugs.”
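Here’s a minimal sketch of the narrower exception handling that comment asks for (the function and logger names are hypothetical):

```python
import logging

logger = logging.getLogger(__name__)

def parse_quantity(raw: str, default: int = 1) -> int:
    try:
        return int(raw)
    except ValueError:
        # Catch only the failure we expect. A bare `except Exception` would
        # also swallow unrelated bugs (say, an AttributeError from a typo)
        # and hide the real error source in the logs.
        logger.warning("invalid quantity %r, using default %d", raw, default)
        return default
```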
5) Suggestion‑Driven Reviews (Inline Suggestions)
When time is tight—or the fix is obvious—propose direct changes inline so the author can accept with a click.
How to use it:
- In Files changed, hover the line → click the + icon → choose Add a suggestion
- Rewrite the line with your improved version (see the example after this list)
- Add a quick rationale: “avoids mutation,” “consistent with style guide,” or “simplifies logic”
- Use Request changes only for must‑fix items; otherwise, leave as comments or Approve with suggestions
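For reference, GitHub renders a fenced block tagged `suggestion` inside a review comment as a one‑click applyable change to the commented line; the replacement line below is hypothetical:

```suggestion
user_count = len(active_users)
```

The author can apply it with “Commit suggestion,” which keeps small fixes from stalling in back‑and‑forth.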
When it shines:
- Minor copy edits, small refactors, consistent naming, comment fixes
- Text-based PRs where examples speak louder than explanations
Caution: Use suggestions for small, safe edits. If your change alters behavior or contracts, request changes and explain why.
Pro Tips to Make Your Reviews Faster—and Kinder
- Start with what’s working: “The test matrix is thorough—great coverage on edge cases.”
- Separate critical from optional: “Blockers” vs. “Nice to haves”
- Offer rationale: explain trade‑offs and likely impact
- Ask, don’t command: “What do you think about…?” prompts collaboration
- Avoid tone traps: ditch “Obviously,” “Just,” or “Clearly” from review language
- Respect the style guide: if it’s stylistic, point to linters/formatters, not preferences
- Don’t bike‑shed: limit color/spacing nits unless they affect usability
- Prefer small PRs: if the PR is huge, suggest splitting next time to ease feedback and reduce cycle time
Make Security, Accessibility, and Performance Part of Every Review
Security (shift‑left mindset):
- Check for hardcoded secrets, insecure defaults, and missing input validation (see the sketch after this list)
- Flag outdated dependencies with known vulnerabilities
- Ask about authz/authn paths for new endpoints
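As a concrete example, here’s a minimal sketch of the kind of fix these checks push toward (the environment variable name and field limits are hypothetical):

```python
import os
from http import HTTPStatus

# Secrets come from the environment, never from source control.
PAYMENTS_API_KEY = os.environ["PAYMENTS_API_KEY"]

def create_user(payload: dict) -> tuple[int, dict]:
    # Validate input before acting on it instead of trusting the client.
    email = payload.get("email")
    if not isinstance(email, str) or "@" not in email or len(email) > 254:
        return HTTPStatus.BAD_REQUEST, {"error": "invalid email"}
    return HTTPStatus.CREATED, {"email": email}
```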
For a deeper look at weaving security into the dev lifecycle, see DevSecOps perspectives in this guide: DevSecOps vs. DevOps: Understanding the key differences (and why security can’t be an afterthought).
Accessibility:
- Verify keyboard navigation, focus states, color contrast, and alt text
- Prefer semantic elements over div soup
- Ensure form controls are properly labeled
Performance:
- Watch for N+1 queries, repeated DOM work, and hot-path allocations (see the sketch after this list)
- Ask for metrics where it matters: “Any baseline on p95 after this change?”
- Encourage lazy-loading or caching when appropriate
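To illustrate the N+1 pattern and its fix, a minimal sketch with a hypothetical `db` helper:

```python
def load_authors_n_plus_one(post_ids: list[int], db) -> list[dict]:
    # N+1: one query per post -- easy to miss when it hides inside a loop.
    return [db.fetch_author(post_id) for post_id in post_ids]

def load_authors_batched(post_ids: list[int], db) -> list[dict]:
    # Batched: one round trip for all ids, then an in-memory join.
    authors = db.fetch_authors(post_ids)  # hypothetical bulk query
    by_post = {a["post_id"]: a for a in authors}
    return [by_post[pid] for pid in post_ids if pid in by_post]
```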
Turn Review Outcomes Into Durable Decisions
Big design choices raised in reviews can get lost in comment threads. Capture them so future contributors understand the why.
- Summarize the decision in the PR description after consensus
- Link to an Architecture Decision Record (ADR) for consequential changes (APIs, data models, encryption choices, event schemas)
New to ADRs? This primer helps you create concise, useful records: Architecture Decision Records (ADRs): what they are and why they matter.
Fit Reviews Into Your Delivery Flow
Healthy teams treat code reviews as part of the work, not a side quest. Make it predictable:
- Set response expectations (e.g., first pass within one business day)
- Reserve daily review windows to reduce context switching
- Use labels like “needs product input,” “blocked on design,” or “ready for merge” to avoid limbo
- Align with your sprint cadence—reviews are integral to flow in iterative methods like Scrum
If you’re tuning your delivery process, this overview is a useful companion: Unlocking Efficiency: Scrum Guide.
A Simple Comment Template You Can Reuse
When you’re not sure how to phrase feedback, try this structure:
- Context: what this change touches and why it matters
- Observation: what you noticed (file/line/behavior)
- Impact: risks or benefits (correctness, security, performance, UX)
- Recommendation: a concrete alternative or suggestion
- Invitation: ask for thoughts or propose next steps
Example:
- “Context: This endpoint is on the checkout path. Observation: we parse JSON twice in handlers/cart.py:188–204. Impact: extra CPU and latency on a hot route. Recommendation: parse once in middleware and attach to request.ctx. Invitation: Does that fit with our current middleware pattern?”
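Since request.ctx suggests a Sanic-style app, here’s a minimal sketch of the “parse once in middleware” recommendation (app, route, and attribute names are hypothetical):

```python
import json

from sanic import Sanic, response

app = Sanic("checkout")

@app.middleware("request")
async def parse_body(request):
    # Parse the JSON body once per request and cache it on request.ctx
    # so handlers on the hot path don't re-parse it.
    request.ctx.payload = json.loads(request.body or b"{}")

@app.post("/cart")
async def add_to_cart(request):
    payload = request.ctx.payload  # already parsed by the middleware
    return response.json({"items": payload.get("items", [])})
```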
Examples of Phrases That Help (and Those That Hurt)
Helpful:
- “Could we…?”
- “To make this more readable…”
- “This might simplify…”
- “Given X requirement, how about Y?”
- “If we go with this approach, what’s the rollback?”
Avoid:
- “This is wrong.”
- “Why would you do this?”
- “Just change it.”
- “Obviously…”
Anti‑Patterns to Avoid
- Drive‑by approvals: “LGTM” without reading the diff
- Perfectionism that blocks delivery (especially on low‑risk style items)
- Bikeshedding on naming when behavior is unresolved
- Vague criticism without alternatives
- Scope creep: trying to redesign the entire module via PR comments
- Ignoring test impact or missing docs
FAQ: Practical Questions Reviewers Ask
- How long should a review take?
- Aim for timely first responses (same day or next business day). Large PRs may need a scheduled walkthrough.
- What if I disagree with the author?
- Seek shared goals: “We want clarity and performance.” Offer data or a minimal POC. If you’re still stuck, escalate decisions, then document in an ADR.
- Should I leave nitpicks?
- Yes—but mark them as non‑blocking and keep them brief. Offload style nits to linters/formatters where possible.
- When do I Request changes vs. Approve with suggestions?
- Request changes for correctness, security, performance, or contract violations. Approve with suggestions for polish and non‑blocking improvements.
A Lightweight Reviewer’s Checklist
Before you click Submit review:
- Ran locally or read test output; verified critical path behavior
- Checked correctness, security, and error handling on new logic
- Confirmed naming, clarity, and code structure follow conventions
- Scanned for performance red flags (hot paths, allocations, queries)
- Verified accessibility and responsiveness for UI changes
- Ensured tests and docs reflect the changes
- Prioritized comments (must‑fix vs. nice‑to‑have)
- Summarized key points in the review summary
- Suggested ADRs for consequential decisions
TL;DR
- Use Show‑and‑Tell to make UI and behavior feedback crystal clear
- Adapt your lens with the Chameleon approach (docs vs. backend vs. UI vs. infra)
- Tag‑team large PRs to reduce delay and improve coverage
- Teach‑Back to raise the bar long‑term—not just fix this PR
- Lean on Suggestion‑Driven edits for small, obvious improvements
- Bake in security, a11y, and performance; document big choices with ADRs
- Treat reviews as part of your delivery system and align with your sprint cadence
Clear, kind, and actionable code review feedback doesn’t slow you down—it’s how you go faster with confidence.