Design implementation reviews

As a designer, you hand off your work to the development team, and they move on to implementing your designs. However, that doesn’t mean you can close Figma and call it a day. You want to ensure your designs are implemented correctly and meet the outlined requirements and specifications. This is where a design implementation review comes in.

Design implementation reviews, or design reviews, keep everyone (designers, developers, and software testers) on the same page. They help ensure that your carefully envisioned designs are implemented precisely as they should be.

What is a design review?

A design implementation review evaluates the current state of a project’s implementation, and the first one is usually performed early in the development process.

The golden rule is: review early and often. Identifying potential bugs and UI issues early in the process reduces the chance of major problems going unnoticed until later in development.

Making changes early in development is much more cost-effective than altering a product later in its lifecycle. Additionally, reviews and feedback at any stage of digital product development are crucial for the product's success.

There is no one-size-fits-all approach to design reviews, so you should customize them to fit your project's specific needs.

How often should you do design reviews?

The frequency of design implementation reviews varies with the project's complexity, timeline, and workflow. Regular check-ins throughout design and development are recommended, often scheduled around milestones like the initial design phase, feature completion, or completion of new components.

In Agile methodologies like Scrum, these reviews are typically part of sprint cycles, occurring at the end of each sprint, within user stories, or as part of the Definition of Done (DoD) for specific tasks.

Synchronous design reviews – live or remote

Synchronous design implementation reviews involve everyone present simultaneously. The project’s designer, developer (at least one per platform), and software tester pull the latest build to compare it against the design documented in Figma.

Whether they take place live or remotely, the gist is the same: designers or developers walk everyone through the design or demonstrate how it has been implemented.

Pros:

Cons:

Pro-tip for synchronous design reviews: Whether reviews are conducted live or remotely, it is good practice to determine who is responsible for taking notes during the meeting. This ensures that nothing important is missed and makes it easier to assign tasks for development fixes after the meeting.

Asynchronous design reviews

Scheduling live or remote reviews can be a hassle at times. Asynchronous design implementation reviews offer a way to review and give feedback on a design or its implementation without everyone having to be available simultaneously. This can be convenient for distributed teams across different time zones or when stakeholders need more time for a comprehensive evaluation.

Pros:

Cons:

Live and remote reviews work better for some features, while others can be thoroughly reviewed asynchronously.

Way of working for asynchronous design reviews

A standard method for asynchronous reviews is for designers to compare implementation screenshots with the original designs in Figma. Here are a couple of general tips that will help you on your way:

To simplify the process, we have created a template for asynchronous design implementation reviews, ready for your next review.

When we review, we find it best to categorize issues and fixes into three groups: things that must be fixed, things that should be fixed, and “nice-to-haves.” This categorization helps developers prioritize tasks and fix critical issues first.

Document all the necessary changes with as much detail and clarity as possible so developers understand precisely what needs to be improved. Your design review documentation can include short notes next to the screenshots you have taken, with more detailed information wherever your team keeps its review findings.

After annotating the implementation and design screens, align with the developers and testers on how tasks will be handled, how fixes will be developed, and how further testing will be done.

When you create a task for the development team after a review, make sure to link the Figma file where they can find your design review documentation. For better organization and clarity, keep a separate Figma file just for design implementation reviews. Once the feedback is received, there may be a follow-up meeting to discuss and clarify individual points and make decisions based on project needs or technical limitations.

Checking implementation during testing phases

As our software testers work through flows and functionalities, they either record their screens or take screenshots and attach them to the tasks they create for the development team. Whenever they report issues in those tasks, look at the provided screenshots or videos and try to catch any UI issues that also need fixing.

“Mini design reviews” can be a great addition to your usual review practice, but they cannot replace the benefits and impact of a comprehensive review.

Review as you see fit

Design implementation reviews aren't "one size fits all." Choose an approach that suits your project's needs. You can also combine synchronous and asynchronous reviews to take advantage of their benefits at different stages. Flexibility is key—adapt your review process as the project evolves and based on team feedback.