
Information processing…by humans

by Sebastien Wiertz on flickr.com

“Hey, I just read an article that said…”

I was reading an article about how dogs align themselves with Earth’s magnetic field while doing their business when I came across some words of wisdom regarding online information consumption. Marc Abrahams, editor of the Annals of Improbable Research, provided an apt description of how the components of an academic research paper are cannibalized for widespread consumption. “One part [of the paper] is where they are telling you what they saw, and the other part almost always comes at the end, where they tell you what it means,” he said. “Often, that’s the part that’s questionable. But that’s also the part that usually ends up getting passed around.”

Following this past season’s barrage of holiday social events, I could not help but notice just how much my friends and family were talking about research. Perhaps it is the ubiquity of consumption-based devices like tablets and smartphones, but this season more than ever I was struck by how enthusiastic people were about learning things and by their subsequent desire to share that knowledge. The topics ranged from how MOOCs (massive open online courses) were not reaching their target populations to an astounding new development in the fight against leukemia. For me, this was a welcome deviation from the garden-variety family gossip about cousin so-and-so’s insufferable new love interest.

One night I was zoning out for a few moments amidst a backdrop of “I just read an article that…” and “Did you hear that they found that…?” when one of the conversational threads snagged. A few people were presenting the take-away points from a recent medical study when their interlocutor, a physician, asked about Abrahams’ dreaded “first part” of the paper. Was it a university or private corporation study? Did they know whether the study was double-blinded? I could tell that he meant no harm; these were very basic questions for an expert in the field. Regardless, what followed was the social equivalent of a car coming to a screeching halt. Their discussion quickly unraveled into “I don’t know, but I will email you the link.” They never emailed him the link.

“Who are they and how do they know?”

Humans are generally awesome at configuring machines to process and deliver information, but many of us have never been explicitly trained in how to process information for ourselves. Last year, I witnessed the following exchange between a new graduate student and an experienced professor:

Student:         “I just read that [insert research finding here].”

Professor:       “How do you know?”

Student:         “I read about a study by [insert author here] in [insert source here].”

Professor:       “Ok. How do they know?”

Student:         “Umm, I don’t know. I will have to get back to you on that.”

Though I may not be as seasoned, I have had many similar exchanges with students. Indeed, many educators have written at length about the difficulties that students face in critically analyzing an article’s sources, methods, and conclusions. I do not necessarily fault them personally for skipping ahead to the conclusion, though. Headlines fixate on the “deliverables,” and mainstream articles seldom devote more than a few phrases to the source, the existing body of research, or the data collection and analysis process. While this practice may at times suffice for skimming the news, it deprives readers of information that could significantly impact the value judgments they make for themselves about the information presented. For example, a reader would be less likely to accept a report in the Telegraph (a reputable U.K. newspaper) that a University of Portsmouth study found that “reflexology may be as effective as [traditional] painkillers” if informed that the study had no real control group, a sample size of 15, and was not blinded.

“Critical thinking for reading your iPad”

How can non-specialists analyze that tedious “first part” of research? Is this not just teaching to (sigh) think critically? Yes, and no. The former question addresses a concrete skill that can be taught and may consequently facilitate the more abstract, nebulous skill of thinking critically. The reality is that many college students, especially those at less-selective institutions, will complete their schooling without being rigorously instructed in how to appropriately process the headlines and sound bites that shape their perceptions of the state of scientific progress in our world.

If I were to build a university from the ground up (any venture capitalists reading?), I would do the following. Along with core courses in language, writing, and math, there would be an Information Processing Seminar, mandatory for all incoming freshmen. Courses like this are often available for STEM majors preparing to either conduct research or apply its results, but they are rarely included in the core curriculum for all students. Unlike specialized courses that prepare students to produce research, the seminar would focus on the consumption of knowledge broadcast through platforms like news apps and websites. It would teach the basic principles behind quantitative and qualitative means of investigation in an accessible manner, without the horror generally associated with an Introduction to Statistics course. The most critical skills to develop would be assessment of a study’s source, design, and methodology.

“Who knows?”

The advice to “consider the source” seems trite, if not downright obvious. However, outside of academic and research communities, the difference between peer-reviewed and non-peer-reviewed material is not always well understood. Submitting a study with cooked books to a peer-reviewed publication can effectively ruin one’s career (see Wakefield’s vaccine debacle), whereas printing dubious claims on a non-peer-reviewed website may only lead to nasty postings in the comments section. In addition, a sophisticated-looking webpage or well-designed blog can make information appear more legitimate than it actually is (think The Onion). Conversely, studies published in reputable, established journals such as The New England Journal of Medicine or Nature undergo a lengthy review process, instilling more confidence in a study’s credibility.

Even when a source is reputable, there may be professional or personal affiliations that a reader needs to take into consideration. Did a company evaluate one of its own products or services in the study? Was the study funded by an organization with a social or political agenda? An affirmative answer to either of these questions should be motivation to dig a bit deeper before accepting any conclusions outright.

“How do they know?”

A study’s design and that design’s faithful implementation are arguably its most crucial components – they are what make the results credible and potentially generalizable to a larger population. Mainstream media reports, however, tend to include only the vaguest of design details, inserting perfunctory adjectives for a “double-blinded,” “qualitative,” or “randomized” study before laying out the conclusion. Information Processing would require students to develop a working understanding of bias and how it can creep into empirical research through factors such as flawed experimental design, attrition, issues with validity or reliability, or crossover between groups. Many studies that have suffered some degree of bias still publish their results (especially if funders want to see something for their investment), often using advanced statistical procedures to try to account for sources of bias such as attrition or differences between groups. A real-world example of a critique of bias in a widely reported study can be found in Dr. Ivory A. Toldson’s response to the recent study featured in Crime & Delinquency, which claimed that nearly 49% of black men and 38% of white men had been arrested by age 23. Toldson rightly points out deep issues with missing data and the small sample sizes used for critical components of the investigation.
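
For readers who like to see the mechanics, here is a minimal simulation sketch in Python (with invented numbers, not drawn from any study mentioned here) showing how differential attrition alone can inflate an apparent treatment effect:

```python
import numpy as np

# Hypothetical illustration: a program whose true effect is +2 points.
rng = np.random.default_rng(0)
n = 10_000

control = rng.normal(50, 10, n)
treatment = rng.normal(52, 10, n)  # true effect of the program: +2

# Suppose the lowest-scoring third of the treatment group drops out
# (e.g., participants who feel the program is not helping stop showing up),
# while the control group remains intact.
retained = treatment[treatment > np.percentile(treatment, 33)]

true_effect = treatment.mean() - control.mean()
observed_effect = retained.mean() - control.mean()

print(f"True effect:               {true_effect:.2f}")
print(f"Effect seen after dropout: {observed_effect:.2f}")  # noticeably inflated
```

The point is not the exact numbers but the pattern: who drops out, and why, can move a result as much as the intervention itself.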

“What do they know?”

Whether or not researchers find evidence for an existing phenomenon (dog defecation alignment) or an intervention (welfare programs, medical trials, etc.) depends upon calculations of factors such as statistical significance, power, and effect size. In Information Processing, college students would develop the ability to recognize these elements in a report and make judgments about how researchers analyze them to arrive at a conclusion. For example, social scientists may report a low-stakes educational intervention as “statistically significant” if there was less than a 10% probability that an effect of that size would appear by chance alone, but an informed reader would question such a lenient threshold for a high-stakes medical trial. Given the consequences of incorrectly attributing a chance finding to a medical intervention, researchers typically only interpret effects as “statistically significant” when that probability falls below 5%. Similar observations can be made for statistical power and effect size. These distinctions are incredibly important in that they allow us to decipher what we actually know from research findings and how that knowledge may be practically useful.
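
As a rough sketch of what that looks like in practice (using invented data and a standard two-sample t-test in Python, not any analysis from the studies above), the same result can be held up against both the lenient 10% threshold and the stricter 5% one:

```python
import numpy as np
from scipy import stats

# Hypothetical outcomes for a control group and an intervention group.
rng = np.random.default_rng(1)
control = rng.normal(100, 15, 40)
intervention = rng.normal(105, 15, 40)

t_stat, p_value = stats.ttest_ind(intervention, control)
print(f"p-value: {p_value:.3f}")

# The verdict depends on the threshold the researchers chose in advance.
for alpha in (0.10, 0.05):
    verdict = "statistically significant" if p_value < alpha else "not significant"
    print(f"At the {alpha:.0%} threshold: {verdict}")
```

An informed reader does not need to run the test themselves; recognizing which threshold was used, and whether it fits the stakes, is the skill the seminar would target.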

Back to reality

Of course, I understand that it is unrealistic for individuals to locate and then trudge through the full-length articles behind every headline. That is not what I am advocating here. Rather, young people should be required to develop the skills to critically assess the findings that resonate with them to the point of shaping their decisions or world views – questions like “Should I undergo this medical treatment?” or “What are the long-term effects of this habit?” As an ever-increasing onslaught of information is made available to us, we can become a society of empowered consumers, deliberately choosing what we are going to “buy into.” Oh, and the link to the original poop paper is here.

*A good overview of how to critically read a study, written by Kyle Hill, can be found here.

© HigherEdgy, 2014. Unauthorized use and/or duplication of this material without express and written permission from this blog’s owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to  http://www.higheredgy.com with appropriate and specific direction to the original content. Photo license: http://creativecommons.org/licenses/by/2.0/deed.en
