Recognizing scientific literacy & illiteracy

Scientific literacy – what it is, how to recognize it, and how to help people achieve it through educational efforts – remains a difficult topic.  The latest attempt to inform the conversation is a recent National Academies report, “Science Literacy: Concepts, Contexts, and Consequences” (https://www.nap.edu/download/23595).

While there is much substance to take away from the report, three quotes seem particularly telling to me. The first is from Roberts [1], who points out that scientific literacy has “become an umbrella concept with a sufficiently broad, composite meaning that it meant both everything, and nothing specific, about science education and the competency it sought to describe.”

The second quote, from the report’s authors, is that “In the field of education, at least, the lack of consensus surrounding science literacy has not stopped it from occupying a prominent place in policy discourse” (p. 2.6).  And finally, “the data suggested almost no relationship between general science knowledge and attitudes about genetically modified food, a potentially negative relationship between biology-specific knowledge and attitudes about genetically modified food, and a small, but negative relationship between that same general science knowledge measure and attitudes toward environmental science” (p. 5.4).

“Flat Earth”: The Flammarion engraving (1888), Wikipedia

Recognizing the scientifically illiterate

So, perhaps it would be useful to consider the question of scientific literacy from a different perspective, namely, how can we recognize a scientifically illiterate person from what they write or say? What clues imply illiteracy?[1]

To start, let us consider the somewhat simpler situation of standard literacy.  If we ask a person a clearly composed question, we might expect the illiterate person to have trouble correctly interpreting what a reasonable answer should contain.  Constructing a literate answer implies two distinct abilities: the respondent needs to be able to accurately interpret what the question asks, and they need to recognize what an adequate answer contains.

These are not innate skills; students need feedback and practice in both, particularly when the question is a scientific one. In my own experience with teaching, as well as data collected in the context of an introductory course [2], all too often a student’s answer consists of a single technical term, spoken (or written) as if a word were an argument or explanation.

We need a more detailed response in order to accurately judge whether an answer addresses what the question asks (whether it is relevant) and whether it has logical coherence and an empirical foundation, information that is traditionally obtained through a Socratic interrogation.[2]  At the same time, an answer’s relevance and coherence serve as a proxy for whether the respondent understood (accurately interpreted) what was being asked of them.

So what is added when we move from standard to scientific literacy; what is missing from the illiterate response?  At the simplest level, we are looking for mistakes, irrelevancies, and failures in logic or in recognizing contradictions within the answer, explanation, or critique. The presence of unnecessary language suggests, at the very least, a confused understanding of the situation.[3]

A second feature of a scientifically illiterate response is a failure to recognize the limits of scientific knowledge; this includes an explicit recognition of the tentative nature of science, combined with the fact that some things are, theoretically, unknowable scientifically.  For example, is “dark matter” real or might an alternative model of gravity remove its raison d’être?[4]

When people speculate about what existed before the “big bang,” or what is happening in various unobservable parts of the multiverse, have they left science for fantasy?  Similarly, speculation on the steps leading to the origin of life on Earth (including what types of organisms, or perhaps better put, living or pre-living systems, existed before the “last universal common ancestor”), the presence of “consciousness” outside of organisms, or the probability of life elsewhere in the universe can be seen as transcending either what is knowable or what is likely to be knowable without new empirical observations.

While this can make scientific pronouncements somewhat less dramatic or engaging, respecting the limits of scientific discourse avoids doing violence to the foundations upon which the scientific enterprise is built.  It is worth being explicit: universal truth is beyond the scope of the scientific enterprise.

The limitations of scientific explanations

Acknowledging the limits of scientific explanations is a marker of understanding how science actually works.  As an example, while a drug may be designed to treat a particular disease, a scientifically literate person would reject the premise that any such drug could be free of side effects, given the nature of its interactions with other molecular targets and physiological systems, and would recognize that these side effects will vary depending upon the features (genetic, environmental, historical, physiological) of the individual taking the drug.  While scientific knowledge reflects a social consensus, it is constrained by rules of evidence and logic (although this might appear to be anachronistic in the current post-fact age).

Even though certain ideas are well established (the laws of conservation and thermodynamics, and a range of evolutionary mechanisms), it is possible to imagine exceptions (and revisions).  Moreover, since scientific inquiry is (outside of some physics departments) about a single common Universe, conclusions from different disciplines cannot contradict one another – such contradictions must inevitably be resolved through modification of one or the other discipline.  A classic example is Lord Kelvin’s estimate of the age of the Earth (~20-50 million years) versus estimates of the time required for geological and evolutionary processes to produce the observed structure of the Earth and the diversity of life (hundreds of millions to billions of years), a contradiction resolved in favor of an ancient Earth by the discovery of radioactivity.

Scientific illiteracy in the scientific community

There are also suggestions of scientific illiteracy (or perhaps better put, sloppy and/or self-serving thinking) in much of the current “click-bait” approach to the public dissemination of scientific ideas and observations.  All too often, scientific practitioners, whom we might expect to be as scientifically literate as possible, abandon the discipline of science to make claims that are over-arching and often self-serving (this is, after all, why peer review is necessary).

A common example [of scientific illiteracy practiced by scientists and science communicators] is provided by studies of human disease in “model” organisms, ranging from yeasts to non-human primates. There is no doubt that such studies have been, and continue to be, critical to understanding how organisms work (and are certainly deserving of public and private support), but their limitations need to be made explicit: while a mouse that displays behavioral defects (for a mouse) might well provide useful insights into the mechanisms involved in human autism, an autistic mouse may well be a scientific oxymoron.

Discouraging scientific illiteracy within the scientific community is challenging, particularly in the highly competitive, litigious,[5] and high-stakes environment in which we currently find ourselves.[6]  How best to help our students, both within and without scientific disciplines, avoid scientific illiteracy remains unclear, but it is likely to involve establishing a culture of Socratic discourse (as opposed to posturing).  Understanding what a person is saying, what empirical data and assumptions it is based on, and what it implies and/or predicts are necessary features of literate discourse.

M.W. Klymkowsky  web site: http://klymkowskylab.colorado.edu  email: [email protected]  Twitter: @mikeklymkowsky

Literature cited:

  1. Roberts, D.A. (2007). Scientific literacy/science literacy. In S.K. Abell & N.G. Lederman (Eds.), Handbook of research on science education (pp. 729-780). Mahwah, NJ: Lawrence Erlbaum.
  2. Klymkowsky, M.W., J.D. Rentsch, E. Begovic, and M.M. Cooper (2016). The design and transformation of Biofundamentals: a non-survey introductory evolutionary and molecular biology course. LSE Cell Biol Edu, in press.
  3. Lee, H.-S., O.L. Liu, and M.C. Linn (2011). Validating measurement of knowledge integration in science using multiple-choice and explanation items. Applied Measurement in Education, 24(2): 115-136.
  4. Henson, K., M.M. Cooper, and M.W. Klymkowsky (2012). Turning randomness into meaning at the molecular level using Muller’s morphs. Biol Open, 1: 405-10.

[1] Assuming, of course, that what a person’s says reflects what they actually think, something that is not always the case.

[2] This is one reason why multiple-choice concept tests consistently over-estimate students’ understanding (see Lee et al. [3]).

[3] We have used this kind of analysis to consider the effect of various learning activities (see Henson et al. [4]).

[4] http://curious.astro.cornell.edu/physics/108-the-universe/cosmology-and-the-big-bang/dark-matter/659-could-a-different-theory-of-gravity-explain-the-dark-matter-mystery-intermediate

[5] http://science.sciencemag.org/content/353/6303/977 and http://www.cjr.org/the_observatory/local_science_fraud_misses_nat.php

[6] See as examples: http://www.nature.com/news/bitter-fight-over-crispr-patent-heats-up-1.17961 and http://www.sciencemag.org/news/2016/03/accusations-errors-and-deception-fly-crispr-patent-fight

*******
Mike Klymkowsky is a Professor of Molecular, Cellular, and Developmental Biology at the University of Colorado Boulder. In the area of biology education research, he developed (with Kathy Garvin-Doxas) the NSF-supported Biological Concepts Instrument (BCI), as well as a suite of virtual laboratory activities in molecular biology with Tom Lundy. He has been involved with the general question of how to develop more rigorous, coherent, and engaging courses and curricula in the biological sciences, including a re-designed introductory evolutionary, molecular, and systems biology course – Biofundamentals – and a general chemistry course – Chemistry, Life, the Universe & Everything (CLUE) – both with Melanie Cooper (Chemistry, Michigan State University). See “About this Blog” for more on Mike’s work.
