As a designer, you hand off your work to the development team, and they move on to implementing your designs. However, that doesn’t mean you can close Figma and call it a day. You want to ensure your designs are implemented correctly and meet the outlined requirements and specifications. This is where a design implementation review comes in.
Design implementation reviews, or design reviews, keep everyone (designers, developers, and software testers) on the same page. They help ensure that your carefully envisioned designs are implemented precisely as they should be.
What is a design review?
A design implementation review evaluates the current state of a project’s implementation against the original design, and it usually starts early in the development process.
The golden rule is: review early and often. Identifying potential bugs and UI issues early in the process reduces the chance of major problems going unnoticed until later in development.
Making changes early in development is much more cost-effective than altering a product later in its lifecycle. Additionally, reviews and feedback at any stage of digital product development are crucial for the product's success.
There is no one-size-fits-all approach to design reviews, so you should customize them to fit your project's specific needs.
How often should you do design reviews?
The frequency of design implementation reviews varies with the project's complexity, timeline, and workflow. Regular check-ins throughout design and development are recommended, often scheduled around milestones like the initial design phase, feature completion, or completion of new components.
In Agile methodologies like Scrum, these reviews are typically part of sprint cycles, occurring at the end of each sprint, within user stories, or as part of the Definition of Done (DoD) for specific tasks.
Synchronous design reviews – live or remote
Synchronous design implementation reviews involve everyone present simultaneously. The project’s designer, developer (at least one per platform), and software tester pull the latest build to compare it against the design documented in Figma.
Whether they’re taking place live or remotely, the gist is the same – designers or developers walk everyone through the design or demonstrate how it’s implemented.
Pros:
- Immediate feedback: Identifying and addressing issues promptly reduces the chances of a misunderstanding.
- Faster decision-making
- Minimized risk of misinterpretation
- Live-fixing minor issues: If we notice a small detail that can be easily fixed, we can take care of it immediately instead of creating a task and addressing it later.
- Improved relationship between design and development
Cons:
- Time-consuming: Design reviews can be quite time-consuming, especially on complex projects.
- Scheduling bumps: Coordinating everyone’s calendars can pose a challenge and lead to delays. A designated, recurring time slot is key to minimizing these bumps.
- The pressure of on-the-spot feedback
Pro-tip for synchronous design reviews: Whether reviews are conducted live or remotely, it is good practice to determine who is responsible for taking notes during the meeting. This ensures that nothing important is missed and makes it easier to assign tasks for development fixes after the meeting.
Asynchronous design reviews
Scheduling live or remote reviews can be a hassle at times. Asynchronous design implementation reviews offer a way to review and give feedback on a design or its implementation without everyone having to be available simultaneously. This can be convenient for distributed teams across different time zones or when stakeholders need more time for a comprehensive evaluation.
Pros:
- Saving time and effort
- Flexibility: The designer can provide feedback on their own schedule, accommodating different time zones and the development team's work hours.
- Written feedback: Written feedback provides a clear and permanent record of the review process.
- Iteration: Asynchronous reviews allow for multiple rounds of feedback, enabling iterative improvement. Developers and testers can respond at their own pace without feeling pressured.
Cons:
- Issue resolution time: Asynchronous reviews may delay issue resolution because participants are not available at the same time to discuss and address concerns.
- Potential for miscommunication: Without the opportunity for immediate clarification or real-time discussion, written feedback carries a higher risk of being miscommunicated or misinterpreted.
- Reduced collaboration: Without real-time communication, team members have less dynamic interaction.
Some features are better suited to live or remote reviews, while others can be reviewed just as thoroughly asynchronously.
Way of Working for asynchronous design reviews
A standard method for asynchronous reviews is for designers to compare implementation screenshots with the original designs in Figma. Here are a couple of general tips that will help you on your way:
- Before starting the review, note which build you’re using. Make sure you have the latest build installed on your testing device so you don’t report bugs and issues that have already been fixed.
- Make sure to take screenshots while going through flows. Alternatively, you can screen-record everything and then screenshot specific frames.
- If you’re doing a design implementation review for mobile, compare the implementation on both iOS and Android (one way to script matching screenshots is sketched below).
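If you review mobile implementations regularly, scripting the screenshot capture can save time. Below is a minimal Python sketch that saves a matching iOS-simulator and Android screenshot for each flow step. The `capture_step` helper and file naming are illustrative, and it assumes Xcode’s `simctl` with a booted simulator plus an `adb`-connected Android device or emulator.

```python
import subprocess
from datetime import datetime
from pathlib import Path

def capture_step(step_name: str, out_dir: Path = Path("review-screenshots")) -> None:
    """Save matching iOS and Android screenshots for one step of a flow."""
    out_dir.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")

    # iOS: ask the booted simulator for a screenshot (requires Xcode tools).
    subprocess.run(
        ["xcrun", "simctl", "io", "booted", "screenshot",
         str(out_dir / f"{step_name}-ios-{stamp}.png")],
        check=True,
    )

    # Android: screencap prints PNG bytes to stdout; redirect them to a file.
    with open(out_dir / f"{step_name}-android-{stamp}.png", "wb") as f:
        subprocess.run(["adb", "exec-out", "screencap", "-p"], stdout=f, check=True)

if __name__ == "__main__":
    for step in ["login", "home", "checkout"]:
        input(f"Navigate both devices to '{step}', then press Enter...")
        capture_step(step)
```

Naming the screenshots by step makes it easy to drop the iOS and Android versions side by side next to the corresponding Figma frames.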
To simplify the process, we have created a template for asynchronous design implementation reviews, ready for your next review.
When we review, we find it best to categorize issues and fixes into three groups: things that must be fixed, things that should be fixed, and “nice-to-haves.” This categorization helps developers prioritize tasks and fix critical issues first.
- 🔴 “Must fix” issues typically relate to critical UX or UI issues likely to disrupt the user experience.
- 🟠 “Should fix” issues usually relate to UI issues caused by a slight deviation from the original design during implementation.
- 🟢 “Nice-to-haves” relate to improvements or fixes that are not critical or essential but can enhance the quality, usability, or performance of the whole product.
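If your team tracks review findings in a spreadsheet or issue tracker, giving each finding a consistent shape makes prioritization easier. Here is a minimal, hypothetical Python sketch of how the three categories might be encoded so findings sort by priority; the field names are illustrative, not tied to any particular tool.

```python
from dataclasses import dataclass
from enum import IntEnum

class Severity(IntEnum):
    """Lower value = higher priority, so findings sort naturally."""
    MUST_FIX = 1      # 🔴 critical UX/UI issues that disrupt the user experience
    SHOULD_FIX = 2    # 🟠 visible deviations from the original design
    NICE_TO_HAVE = 3  # 🟢 non-critical polish and improvements

@dataclass
class Finding:
    screen: str        # e.g. "Checkout – payment step"
    description: str   # what differs from the Figma design
    severity: Severity
    screenshot: str    # path or link to the annotated screenshot

findings = [
    Finding("Login", "Primary button uses wrong color token", Severity.SHOULD_FIX, "login.png"),
    Finding("Checkout", "Error state missing entirely", Severity.MUST_FIX, "checkout.png"),
]

# Hand developers the list with critical issues first.
for f in sorted(findings, key=lambda f: f.severity):
    print(f"{f.severity.name}: {f.screen} – {f.description}")
```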
Document all the necessary changes with as much detail and clarity as possible so developers understand precisely what needs to be improved. Your design review documentation can include short notes next to the screenshots you have taken, with fuller details in the corresponding task.
After annotating the implementation and design screens, align with the developers and testers on how tasks will be handled, fixed, and retested.
When you create a task for the development team after a review, make sure to link the Figma file where they can find your design review documentation. For better file organization and clarity, it’s best to keep a separate Figma file just for design implementation reviews. Once the feedback is received, a follow-up meeting may be needed to clarify individual points and make decisions based on project needs or technical limitations.
Checking implementation during testing phases
As our software testers go through flows and functionalities, they either screen-record everything or take screenshots and attach them to the tasks they create for the development team. Whenever they flag issues in those tasks, look at the attached screenshots or videos and try to catch any UI issues that also need fixing.
“Mini design reviews” can be a great addition to your usual review practice, but they cannot replace the benefits and impact of a comprehensive review.
Review as you see fit
Design implementation reviews aren't "one size fits all." Choose an approach that suits your project's needs. You can also combine synchronous and asynchronous reviews to take advantage of their benefits at different stages. Flexibility is key – adapt your review process as the project evolves and based on team feedback.