42 Critiquing Science Communication


SLIDE DECKS

Some of the material presented in this chapter will be discussed in the lab. You must ensure you cover all the concepts presented in the lab and in this textbook.

This lab will be graded. See slides for more information.

Critical Evaluation & Feedback

As you continue to develop requirements, your team might begin to take certain assumptions or aspects of the project for granted. This might include developing your own lexicon for the work that needs to be completed, or it might mean entirely overlooking a design component that no one considered.

For example, your team may have had many conversations that ultimately helped you define a user and the roles they might have, or that defined what a user profile might look like. However, if this isn’t formally documented somewhere, it could lead to ambiguity within the requirements themselves – and this is something we want to avoid.

Beyond this, building requirements takes practice, so it’s always good to get a fresh pair of eyes to look at the work you’ve done. This is especially true when you are developing requirements for a large and complex system.

Reviewing requirements can be done in any number of ways. In our lab, we are going to share our requirements document with another team and have them critique the work we are doing. If you really want to test the work you have produced, you might give it to someone who isn’t in the course and who has no prior knowledge of the system you are developing. Note: in a professional setting, you wouldn’t share the requirements with anyone other than the design team, the managers, and the client, since sharing more widely could violate confidentiality agreements.

But what is critical evaluation? Critical evaluation is an analytical process that extends beyond simply describing something (such as a requirements document). It involves reviewing something to assess its validity, feasibility, strengths, weaknesses, risks, and opportunities, among other things. It might also include a comparison to similar products.

A critical evaluation (in general) could answer the following questions:

  • What is it that is being evaluated? The answer to this question would provide a general description of the thing being evaluated (consider what, when, who, where, why, and how questions), why it’s being evaluated, and perhaps how it’s being evaluated.
  • What are the assumptions associated with the thing being evaluated? Are they valid assumptions? Do any unlikely or unfounded assumptions lead to positive, neutral, or negative outcomes that wouldn’t occur if the assumption were stated more realistically or honestly?
  • How does the thing being evaluated work? What are its strengths and weaknesses? How does it compare to similar products?
  • What are some of the risks? Could the thing being evaluated lead to harm? Is it at risk of failing because it relies on a large assumption?
  • What are some of the opportunities? Could the thing being evaluated be improved by incorporating other knowledge or products? If so, how?

From the point of view of evaluating a requirements document, you may also want to consider the following questions:

  • Are the requirements numbered correctly?
  • Are there spelling or grammar mistakes?
  • Are there Musts, Shoulds, Coulds, and Won’ts?
  • Do any of the Won’ts need to be implemented?
  • Is each requirement specific and measurable?
  • Is each requirement a complete sentence?
  • Are any of the requirements compound requirements?
  • Does anything need to be defined?
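Several of the checks above are mechanical, so they can be partially automated. The sketch below is a hypothetical illustration only: the one-requirement-per-line format (`R1. The system must ...`) is an assumption I am making for the example, not a convention from this chapter, and flagging the word “and” is only a rough heuristic for spotting compound requirements.

```python
import re

# Hypothetical format assumed for this sketch: one requirement per line,
# written as "R<number>. <sentence>".
MOSCOW = ("must", "should", "could", "won't")

def lint_requirements(lines):
    """Return a list of (line_number, issue) tuples for simple, mechanical checks:
    sequential numbering, a MoSCoW keyword, compound requirements, and a final period."""
    issues = []
    expected = 1
    for i, line in enumerate(lines, start=1):
        m = re.match(r"R(\d+)\.\s+(.*)", line.strip())
        if not m:
            issues.append((i, "does not match the numbered-requirement format"))
            continue
        number, text = int(m.group(1)), m.group(2)
        if number != expected:
            issues.append((i, f"expected R{expected}, found R{number}"))
        expected = number + 1
        if not any(word in text.lower() for word in MOSCOW):
            issues.append((i, "missing a MoSCoW keyword (must/should/could/won't)"))
        if " and " in text.lower():
            issues.append((i, "possible compound requirement (contains 'and')"))
        if not text.rstrip().endswith("."):
            issues.append((i, "not a complete sentence (no final period)"))
    return issues

reqs = [
    "R1. The system must store a user profile.",
    "R3. The system should export reports and email them to the client.",
]
for line_no, issue in lint_requirements(reqs):
    print(f"line {line_no}: {issue}")
```

A script like this only catches surface problems; whether a requirement is specific, measurable, and well defined still needs a human reviewer.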

Later, as we add priorities, dependencies, and time estimates to our requirements document, we will also want to evaluate these additions to ensure consistency and logical flow.
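Two consistency checks on dependencies are easy to state: every dependency should refer to a requirement that actually exists, and no requirement should (directly or indirectly) depend on itself. A minimal sketch, assuming dependencies are recorded as a mapping from requirement IDs to lists of prerequisite IDs (this data layout is my own assumption for illustration):

```python
def find_dependency_problems(deps):
    """deps maps each requirement ID to the IDs it depends on.
    Returns (missing, has_cycle): dependencies that refer to unknown
    requirements, and whether any circular dependency exists."""
    missing = sorted({d for targets in deps.values()
                      for d in targets if d not in deps})

    # Depth-first search with three colours to detect a cycle:
    # WHITE = unvisited, GREY = on the current path, BLACK = finished.
    WHITE, GREY, BLACK = 0, 1, 2
    colour = {r: WHITE for r in deps}

    def visit(r):
        colour[r] = GREY
        for d in deps.get(r, []):
            if d not in colour:
                continue  # unknown IDs are already reported in `missing`
            if colour[d] == GREY or (colour[d] == WHITE and visit(d)):
                return True
        colour[r] = BLACK
        return False

    has_cycle = any(colour[r] == WHITE and visit(r) for r in deps)
    return missing, has_cycle

deps = {"R1": [], "R2": ["R1"], "R3": ["R2", "R4"]}
missing, has_cycle = find_dependency_problems(deps)
print(missing, has_cycle)  # R4 is referenced but never defined; no cycle here
```

The same idea extends to priorities: for example, a Must requirement probably shouldn’t depend on a Won’t.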

Providing Good Feedback – Some Guidelines

  • Provide feedback in a way that you would want to receive it. Focus on the product, not the person. Refer back to the Wil Wheaton Rule.
  • Be positive. Identify things that work well and things that could be improved. That is, don’t focus just on the negatives.
  • Don’t simply indicate that something is wrong or ill-conceived. Provide justification for any such claims. Provide suggestions.
  • If you think something needs to be improved, provide an example of how that might happen. Links and references are a great help. Remember – we’re in this together.
  • Be specific and clear in your feedback. Identify any assumptions you are making.
  • Don’t pass judgement. Identify opinions as opinions.

License

Community Engaged Data Science Copyright © 2023 by Daniel Gillis. All Rights Reserved.
