By Elisabeth Hendrickson and Dale Emery

Note: This article appeared in a slightly different form on the Software Test and Performance Collaborative website.

We were in the middle of the WordCount simulation in the Introduction to Agile Testing class, and participants were building a "software system" on index cards. Despite the low-tech props, the group was struggling with the same issues in the simulation as in the real world.

The last round had been a chaotic explosion of frenetic activity. Testers madly wrote yellow test case cards, white input cards, and pink bug cards. Developers scribbled instructions on green code cards. Meanwhile, Product Managers brainstormed new feature ideas on blue cards. The room was littered with a confetti of pink, white, yellow, green, and blue.

Dale, facilitating the simulation, announced that the round was over. Participants groaned, voicing their frustration at their lack of success. Dale kicked off the retrospective for the round by asking, "What problems are you seeing?"

The Interoffice Mail Courier shook his head. "The left hand doesn't know what the right hand is doing."

"Yes," agreed an Observer. "The testers and developers were working on different versions of the code, but didn't know it. Testers found two bugs that developers had already fixed."

"I have no idea what the test status is," said a Product Manager.

"Me either," agreed a Developer.

A Tester concurred, "Yeah, me either."

The group laughed.

"OK," said Dale, "What would you like to do about that?"

Another Tester jumped up, listing one idea after another for status reporting. "We can report the number of tests executed, how many are passing and failing, the number of bugs found and fixed, test coverage…"

As she continued to list ideas, she moved to the flip chart at the front of the room, grabbed a pen, and sketched out a complex status board crowded with numbers and graphs.

One of the Developers pointed. "What does that number mean?"

"Is it an open bug count?" suggested a Product Manager, squinting at the reporting template.

"Who is going to update these statistics?" asked another Tester.

For the next 20 minutes the group struggled to perfect the design of the status board, debating what the statistics meant, who would update them, and when. The discussion grew loud and fragmented as multiple conversations erupted around the room.

Dale, sensing that the group was not converging, called the room to order. "I hear you all listing a lot of data that could be reported," he said. "But what information do you actually need to move forward?"

The Tester at the flip chart blinked, mouth open, pen hovering in mid-air.

"I need to know what bugs are open," a Product Manager said, breaking the silence. "The Customer keeps asking for a demo. I need to know if there's any reason not to give her one."

A Developer said, "I need to know what tests are passing and failing." She laughed and added, "And what version of the code you're testing."

Another Tester moved to the flip chart and started a new page. "What if we wrote the version here at the top? Then posted the test case cards on the left, and bug cards on the right? We can mark the test cards with green or red dots to indicate pass or fail."

Heads nodded. The approach was simple, and it would work. The group agreed to try it.

In the next round, the simplified status board became a focal point, supporting the group in delivering a working system to the Customer.

After the simulation, the Tester who had sketched the complex status board approached Dale. "Thank you," she said.

"You're welcome," Dale replied. "What are you thanking me for?"

"My QA director and I spent five months standardizing the test status report for all of our projects. It covers all the kinds of data I listed when I was at the flip chart. We've been frustrated that no one reads the reports. But today I realized why: we never asked our stakeholders what information they needed." She took a deep breath and continued. "So tomorrow I'll tell my QA director about this experience. We need to simplify our report. But only after we ask the stakeholders what information they want to see in it."