Food, TV and Confirmation Bias
There is plenty of research confirming that video games are addictive ....
Research shows the harmful effects of sugar on children's bodies and brains ...
Though to make my point I should throw in some less emotionally charged examples ...
Reports indicate eggs are bad for you ... Reports show now that eggs are good for you ...
Giving kids choice is a powerful tool for learning. By letting kids choose -- in an environment made safe for exploring -- they get experience with decision making. They discover what works and doesn't work for them, what they like and don't like.
Choice in an unschooling home sometimes means letting kids safely explore areas Mom decided aren't right (for her). It's very common for Moms to argue that they need to restrict their children's exploration of food, TV and other electronics because the Mom has thoroughly researched the subject.
Another thing to be aware of is Confirmation Bias. When your research focuses on finding confirmation for what you believe, you miss the reports that disprove what you believe. If you wanted to know whether the marbles in a bag were mostly red, you'd dump out all the marbles and look at every one of them.
The way Confirmation Bias works is, the bag of marbles is dumped out. Any marble that isn't red is eliminated from consideration.
That's so obviously wrong. Yet it's exactly what most people do when "researching" a belief. They sift through articles. They eliminate the ones that don't support their belief. They focus in on the ones that do. They end up with a satisfying pile of dozens and dozens of articles that confirm what they believe, totally ignoring the rest.
Because the blue marbles aren't red, they don't count. They aren't real marbles because ... they aren't red.
Food fears. I'm pulling this [Facebook] post (on Radical Unschooling Info) from the bottom of a very long thread. The topic of limiting kids' food choices based on fears of chemicals comes up often so I thought this might be useful.
Note: There is nothing counter to unschooling about surrounding kids with organic food or similar diets. The conflict with unschooling comes when parents forbid or fill their kids' heads full of "information" to prevent them from making their own choices that include conventional foods.
I assumed my choices based on past struggles with food were not fear-based. I thought fear-based decisions meant decisions based on threats I'd been told of but never experienced. But maybe they are just plain fear.
(A paraphrased version of the original quote that's a bit clearer out of context.)
It's both. If you avoid repeating a bad consequence, it's both sensible and fear-based. It's not a big deal if avoiding that experience doesn't limit your life. If avoidance limits your life or someone else's, then it is a big deal.
If you avoid certain chemicals in food yourself, not a big deal. If you decide your children must avoid certain chemicals, it limits not only their food choices but their ability to learn what's right for them. And, as they get older and the desire to explore beyond your limits grows, they will feel shackled not only by your no's but by the weight of your "information" that's built a wall between them and what you've forbidden.
Fear prevents us from trying again, from looking deeper to find a way past the bad consequences.
It's understandable that mothers don't want to keep trying when they believe the consequences are bad for their kids! But it's still fear. And it's limiting. Which is why forums like this are valuable. People here have had lots of experience doing what you fear to do. People here can tell you what they've found actually happens with radically unschooled kids so you don't need to risk your own children on something you believe dangerous.
One big obstacle to your seeing clearly is the so-called research you did on nutrition. (Did that sentence cause your defenses to rise? Are you unwilling to examine whether your certainty or your process might be flawed?)
You drew the conclusion that certain chemicals in food are bad for health. You then looked for confirmation. And you found writing that confirmed it. The more you looked for confirmation, the more you found. The more you found, the more convinced you became that the conclusion you'd drawn was right.
What you did is called confirmation bias.
All humans do this. It's natural for us to be biased towards confirming a belief. It's how our brains operate as we figure out how the world works. When we believe we understand, we try to confirm we're right.
If we test our belief against the physical world, it works. Or works well enough that we can, for ordinary everyday purposes, build an understanding of the world that way. (Though it's also how we develop superstitions! And folk beliefs. Even some religious beliefs.)
But if we test our conclusion against others' beliefs on the matter, if there are differing opinions, we'll naturally seek out the opinions that confirm ours.
Scientists, in order to do good science, can't design experiments to test their belief. They must design experiments to *disprove* their belief. They need to take on the role of the rival scientist who wants to see them fail ;-) They must ask the probing, skeptical, disbelieving questions their opponent would ask.
Good science is published in peer-reviewed science journals. (Researchers applying for government funding also undergo peer review before they begin the research.) The peer reviewers' purpose is to punch holes in how the research was (or will be) conducted. They ask, Is the researcher trying to punch holes in his own theory? Or is he designing experiments that can't fail to yield the results he wants? (While peer review has its flaws, the absence of independent, objective peer review on research is a giant red flag.)
It's very much like what we do here. We offer peer review of people's unschooling ideas. Knowing my ideas will be scrutinized with a critical eye -- for their logic and for how well they match what unschoolers experience -- has helped me think much more clearly about unschooling ideas. (And about life in general.)
Eliminating bias is really hard to do! Even for research scientists, whose job it is. Which is why peer review is used before research is published. The reviewers' job is to catch the bias, to be the disbeliever and the skeptic. (Reviewers are anonymous and independent of the study they're reviewing so that they aren't emotionally biased towards the research, the researcher or the research's funding source.)
Unless you're reading literature in peer-reviewed science journals, the confirmation from your "research" (which is more like literary research and not at all like science research) is not only the product of *your* confirmation bias but of the *authors* you're reading as well. (How much of your reading was of authors interpreting studies done by others? How much was of scientists doing peer-reviewed research?)
You've trusted that these writers were objective, that they (literarily) researched *all* the studies and drew an informed conclusion. It's very, very likely that they did not. It's very likely they scoured the scientific literature for anything they could use to support their pet theory. Not out of malice -- hopefully! though if they're selling something (like their book) ... -- but because, like you, they sought to confirm what they believed.
That's not scientific research. (It's not even good literary research.) That's bad science.
The disdain for big science makes confirmation bias even worse. The fringe believers tar everyone who disagrees with them with an evil "Big Pharma" brush. They protect themselves from criticism by criticizing their critics. They give the impression that all the other scientists are proving what corporations tell them to prove, and that they alone are independent, searching for the "truth". When in reality they're "independent" because peers would roll their eyes at their bias.
(It *is* hard for a scientist to be objective when their funding source is looking for a particular result. But, one, it's illegal to fudge data and, two, scientists can be very catty with each other if they think one of their own is deliberately doing bad science. But it *is* a good idea to check to see who funded the research. A corporation? A university? The government? Was it a request by the government or a grant from the government?)
Reading about logical fallacies is often tough going, but David McRaney's post on Confirmation Bias at You Are Not So Smart: A Celebration of Self Delusion is very good.
Wikipedia has a good article on peer review. (There's also a section on criticism and flaws of the process. But avoiding peer review isn't the answer to the flaws.)
The original post and thread is here.