
Are you talking CRAAP?

By Katie Smith, Information Literacy Group Health Sector Representative.

__________________________________________________________

Our favourite models for appraising health information and the best times to use them.

Many health information professionals deliver sessions on appraising health information, and we’ll probably use a nifty, memorable mnemonic or two. Demonstrating a checklist can be a useful way of showing learners which key areas to check when deciding how good or valid a source is. But there are so many out there! Which one should we use? Do we use different ones for different audiences or sources? Or do we all just teach the same one because we weren’t aware of the others?

Sye and Thompson’s 2023 article, Tools, tests, and checklists: the evolution and future of source evaluation frameworks, provides a brilliant in-depth look at some of the different source evaluation frameworks and their benefits and drawbacks. We also asked health literacy trainers what they use and why.

  1. The CRAAP Test

The CRAAP test (Blakeslee, 2004) is probably the most popular framework for health misinformation sessions. CRAAP stands for: Currency, Relevance, Authority, Accuracy and Purpose. The user asks closed questions about a source to decide its validity and usefulness. Its memorable name is one of the reasons for its popularity. Some critics have suggested that the name is confusing to students (Sye and Thompson, 2023), but it is effective in teaching the key elements to check when finding information and choosing sources for research. However, make sure you have the right number of A’s!

  2. The RADAR approach

The RADAR approach (Mandalios, 2013) is similar to the CRAAP test. Its handy mnemonic is arguably more meaningful and related to the task, suggesting that we ‘detect’ the quality of a source as a radar would (Sye and Thompson, 2023). RADAR stands for: Relevance, Authority, Date, Appearance and Reason for writing. It is mainly used to assess and select online sources for projects.

  3. SHARE

SHARE is a useful checklist for social media posts and webpages. The UK Government created it to combat social media misinformation. SHARE stands for: Source, Headline, Analyse, Retouched, Errors. The language is simpler than the terms used in the CRAAP test, which can make it easier to remember, although it isn’t always sufficient on its own. For example, a post may contain no errors and an unretouched photograph, but that doesn’t mean it is truthful or worth sharing.

  4. SIFT

SIFT (Caulfield, 2019) is designed for evaluating social media and online sources such as news articles, and it is particularly useful in the context of misinformation. SIFT stands for: Stop, Investigate the source, Find better coverage, and Trace the original context. Used alongside fact-checking websites, it works well when demonstrated with social media examples or fake news stories.

  5. The Trust in Online Health Information Scale

Proposed by Rowley et al. in 2015, the Trust in Online Health Information Scale was developed from undergraduate students’ responses about how they evaluate digital health information. It is much longer than the other mnemonics, but it is more detailed and focused on the health context. It covers brand, content, credibility, ease of use, recommendation, style, usefulness and verification.

Other frameworks

There are many other evaluation frameworks, such as DIG (Digital Image Guide), the ABCs of web evaluation, the 5 W’s (Who, What, Why, Where and When) and a Proactive Evaluation Approach. Checklists like SHARE, SIFT and DIG complement traditional source evaluation frameworks (such as CRAAP or RADAR) by focusing on digital content: social media posts, webpages and images. You can use these to train anyone in finding reliable online health information for personal use.

Takeaways: 

Give learners opportunities to practise with information they might find in their everyday lives.

Whichever evaluation framework you use, think about whether it is the right choice for your specific audience. CRAAP is not the only option!

_______________________________________

References:

Sye, D., & Thompson, D.S. (2023) Tools, tests, and checklists: The evolution and future of source evaluation frameworks. Journal of New Librarianship, 8(1), 76-100. https://doi.org/10.33011/newlibs/13/9

Blakeslee, S. (2004) The CRAAP Test. LOEX Quarterly, 31(3), 6-7.

Mandalios, J. (2013) RADAR: An approach for helping students evaluate internet sources. Journal of Information Science, 39(4), 470-478. https://doi.org/10.1177/0165551513478889

Caulfield, M. (2019) SIFT (The Four Moves) [blog post]. Hapgood. https://hapgood.us/2019/06/19/sift-the-four-moves/

Rowley, J., Johnson, F., & Sbaffi, L. (2015) Students’ trust judgements in online health information seeking. Health Informatics Journal, 21(4), 316-327. https://doi.org/10.1177/1460458214546772
