The many benefits of software peer reviews include improved quality and productivity, sharing of technical knowledge, and gaining insights that lead to process improvements. Sometimes, though, it is hard for potential reviewers to put their heads together in real time.
Increasingly, software projects involve teams that collaborate across multiple corporations, time zones, continents, nationalities, organizational cultures, and native languages. Such projects must modify the traditional face-to-face peer review method. The review issues include both communication logistics and cultural factors; the latter usually pose the greater challenge. Even if cultural barriers are not an issue, you’ll need to deal with the difficulties of holding reviews with participants who cannot meet in person.
The two dimensions to consider are time and place. If review participants can assemble in the same location, you can hold a traditional review meeting. Geographically separated participants can hold distributed review meetings, and reviewers who cannot connect concurrently can practice asynchronous reviews. With either nontraditional method, however, your collaborations will be more effective if the participants meet in person at least once early on. Use this meeting to establish the team rapport and the respect for the review moderator’s leadership that effective reviews require. Periodic face-to-face meetings throughout the project will help maintain the bond the team members established at the beginning.
Distributed Review Meeting
Today’s audio- and videoconferencing tools can facilitate communication if the participants are available at the same time but in different places, although "same time" becomes complicated when the participants reside in different time zones. My colleague Erik moderated several reviews that involved participants who spanned twelve time zones. You can manage this challenge by varying the time of day at which you hold the reviews, rotating the inconvenience of getting up in the middle of the night among the sites. This also avoids the perception that certain individuals or locations are subordinate to others.
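The rotation scheme can be sketched in a few lines. This is only an illustration, not something from the original text: the site names and UTC offsets below are hypothetical, and the idea is simply to cycle which site gets the convenient local-morning slot so that no one location always bears the off-hours burden.

```python
# Hypothetical sites and their UTC offsets, in hours (illustrative only).
SITES = {"Portland": -8, "London": 0, "Bangalore": 5.5}

def rotated_meeting(meeting_index, base_hour=9):
    """Choose the site whose local morning anchors this meeting.

    Rotating through the sites means each location periodically gets
    the comfortable 9 a.m. slot, and each periodically takes the
    inconvenient one.
    """
    names = sorted(SITES)
    host = names[meeting_index % len(names)]
    # Convert the host site's 9 a.m. local time to UTC for everyone else.
    utc_hour = (base_hour - SITES[host]) % 24
    return host, utc_hour

for i in range(3):
    host, utc_hour = rotated_meeting(i)
    print(f"Meeting {i + 1}: anchored to {host}, starts at {utc_hour} UTC")
```

A real scheduler would also account for daylight-saving shifts and working-day boundaries, but the core fairness idea is just this modular rotation.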
A distributed review places special burdens on both the participants and the review moderator. When I participated in one distributed review meeting by telephone, I was struck by the absence of body language and facial expressions. I couldn’t tell what the other participants were doing or thinking. I couldn’t see when someone looked puzzled or seemed ready to say something. It is also difficult to detect sidebar conversations over the telephone or to notice when participants have left the room or are distracted. Use expert moderators for such long-distance reviews.
Establish some ground rules for taking turns speaking, identifying yourself before making a comment, relinquishing control to the moderator, and timeboxing discussions. For instance, a "round robin" approach to raising issues can keep all participants engaged when the moderator has difficulty knowing who is not contributing. Johanna Rothman described many conference call dos and don’ts for multicultural project meetings in her article "Managing Multicultural Projects with Complementary Practices" (Cutter IT Journal, April 2001).
My colleague Chris used a moderator at each of the three locations participating in a series of conference-call review meetings. During the meeting, each moderator facilitated participation by the team members present in his room. The moderators conferred before and after each session to discuss which aspects worked well and which did not.
I know one moderator who uses a whistle when leading audioconference reviews. A short toot gains the attention of participants who can’t see when the moderator is trying to break into the discussion. Another moderator has used the dialing beeps on the telephone as an attention-getter. A simple tone sequence such as the opening notes of Beethoven’s Fifth Symphony (dial 3-3-3-7) is easily recognized.
Videoconferencing addresses some of the challenges of conference-call review meetings. However, the time lag in videoconference equipment can be distracting and makes it easy for multiple participants to begin speaking simultaneously. Then they all stop speaking, and the cycle begins anew. During a videoconference review, the moderator can hold up a colored piece of paper or wave a flag when he needs to get the group’s attention.
Distributed reviews benefit from Internet-based collaboration tools; visit http://www.coworking.com/html/tool.html for some examples. Some of these tools display the product being reviewed in a browser-like display so that all participants see the same image. The recorder captures items in an online issue log as the reviewers bring them up, perhaps displaying the log in the browser for remote participants to view. Hyperlinks between the product under review and supporting documents permit easy and convenient navigation during the distributed discussion. Some studies of such collaborative review approaches indicate that they can be as effective as face-to-face meetings (Vahid Mashayekhi et al., "Distributed, Collaborative Software Inspection," IEEE Software, September 1993).