
A recent article in The Washington Times titled “Climate scientists to fight back at skeptics” discusses the ways in which key climatologists are feeling pressure to fight back and respond to their critics, in light of what has been referred to as “Climategate.” “Climategate,” which has painted climate scientists in an unflattering light, concerns the leaking of emails between two top climate research scientists. The emails appeared to indicate that the two scientists were “massaging data” in favor of a certain conclusion and ignoring other key data points to do so.

In the article, Stephen H. Schneider, a Stanford professor, says that he believes the “social contract” between policymakers and scientists has been broken and needs to be fixed. He is quoted as saying,

“What I am trying to do is head off something that will be truly ugly… I don’t want to see a repeat of McCarthyesque behavior and I’m already personally very dismayed by the horrible state of this topic, in which the political debate has almost no resemblance to the scientific debate.”

Schneider was a participant in an email conversation in which several other climate scientists associated with the National Academy of Sciences discussed organizing their fellow researchers to form a nonprofit group in order to raise funds for an ad in the New York Times which would respond forcefully to critics of the climatologists.

This latest twist in the story of “Climategate” leads us to question why the “social contract” between scientists and policymakers has been broken. Is it because the scientific facts given to us by the climate researchers are entirely flawed? Or could it be a result of policymakers relying on the “science experts” to tell them what to believe rather than using their own critical thinking skills to evaluate the conclusions proposed by the experts?

Science, at its most basic level, is all about argument, opposing viewpoints, and varied interpretations. One of the central requirements of any scientific theory is that its conclusions be falsifiable and testable; even so, what constitutes a “proven fact” is debatable. Because of this, scientists argue; that’s simply what scientists do. Scientists have disputes about everything, from how to evaluate data to how to conduct experiments. Many graduate students have witnessed heated conflicts over the interpretation of scientific data, with scientists sometimes even attacking one another’s reputations at conferences! So it’s no surprise that climate scientists vehemently disagree about the data and what the data mean. However, when one point of view ceases to be questioned and is presented as “the whole truth,” science stops being scientific. “Climategate” is certainly the result of sloppy handling of data, but it is also likely the result of bias. When the theory that human CO2 emissions cause global warming became the sole accepted explanation for the melting of the polar ice caps, scientific objectivity was shelved. This one, very narrow interpretation of the data was presented as the entire reason behind climate change, while other factors, such as particulate matter from aerosols or the influence of solar activity, were downplayed. It doesn’t take a climate scientist, or even a scientist, to see that presenting one set of data as the only cause of a phenomenon while downplaying other data spells trouble. Anyone with a basic understanding of science and adequate tools in critical thinking can see this.

Politicians, journalists, and the general public need to take responsibility for their own understanding of science and the scientific process. Awareness of the fundamentals of science and of how science works is the key to forming good opinions about climate science data. Anyone can learn to evaluate scientific claims. If first graders can learn physics and chemistry, then journalists and policymakers, as well as the average adult, can learn the basics and from there think critically about scientific conclusions.

Essentially, “Climategate” is a problem with education and politics, not science. Conflict between scientists is exactly how science functions and is not out of the ordinary. There is a battle over climate data, as there should be. However, because policymakers viewed climate change through a narrow lens without critically evaluating counterarguments, errors in judgment were likely made. What we need is more science education for everyone, not necessarily more experts. If our policymakers had been taught to evaluate scientific claims for themselves and not merely rely on others to dictate scientific opinion to them, issues such as “Climategate” might have been avoided. However, if funding for education continues to be cut, as is currently happening in California to make up for the state’s financial shortfall, this problem will only get worse. To re-establish the “social contract” between scientists and policymakers, it might be a good idea to ask that both scientists and policymakers take responsibility for their own understanding of the scientific issues that impact politics.

In November of last year, e-mails between England’s University of East Anglia Professor Phil Jones, the head of the Climate Research Unit, and Professor Michael Mann of Pennsylvania State University were publicized by computer hackers. These e-mails received a great deal of media attention because they seemed to imply misconduct by the two scientists: data were hidden in order to influence the peer-review process and ultimately to keep scientific papers with dissenting points of view from being published. For example, in one e-mail, Dr. Jones noted, regarding the global climate status, “I’ve just completed Mike’s Nature trick of adding in the real temps to each series for the last 20 years (ie, from 1981 onwards) and from 1961 for Keith’s to hide the decline.” Jones has since stepped down as head of the Climate Research Unit.

Although Professor Jones has defended the content of his e-mails, most recently to BBC News in a question-and-answer format, there is no doubt that what he wrote has ignited a public outcry about how scientists treat data, especially as it relates to climatology. Responding to his critics in the BBC interview, he says that the “…’trick’ did not refer to any intention to deceive — but rather ‘a convenient way of achieving something’.”

This leads us to ask why and how something like “Climategate” could happen.  What do the data really reflect? If the periods studied do not show a warming trend of statistical significance, what does this mean? How well do the interpretations of the data reflect what is really happening with the climate? Was the need to “hide the decline” a scientific necessity or was it political? And if it was political how much should politics influence the representation of scientific data? Is there a “correct” way to interpret scientific data, and if not, how does one educate a young scientist to interpret data to reflect the best possible representation of reality?

First, there is no one right way to interpret scientific data and predict climate change. The science of climatology is complex because there are so many variables involved. Jones notes just a few of the numerous factors a climate scientist must take into consideration: “human and natural influences… natural internal variability of the climate system… Volcanic influences… Solar influence…” Because of all these variables, it becomes necessary to examine the collected data from various viewpoints.

Second, when looking at the science of a highly politicized issue such as climate change, it is imperative that scientists not be influenced one way or the other by political opinion. In the case of “Climategate,” it appears as though Jones may have mixed science with politics and then felt pressured to present the “right” data so that climate change would appear in one light, when in fact his records possibly showed otherwise. When questioned by the BBC, Jones admitted that, although there has been some warming, there has not been statistically significant global warming from 1995 to the present, and that “Achieving statistical significance in scientific terms is much more likely for longer periods, and much less likely for shorter periods.” What this means is that Dr. Jones may have felt the need to stretch his data to appear sympathetic to the issue of climate change. His stretch has now cast doubt on all of his data, not just the statistics he referred to in these specific e-mails.
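Jones’s point about statistical significance and period length lends itself to a quick numerical illustration. The sketch below is illustrative only: the series is synthetic (a made-up 0.01 °C/yr trend plus a deterministic sine wave standing in for natural variability, not real climate data), and the function name is mine. It fits a least-squares trend line and reports the t-statistic of the slope for the full 50-year record and for the last 10 years alone.

```python
import math

def trend_t_stat(temps):
    """Least-squares slope of a yearly series and the slope's t-statistic."""
    n = len(temps)
    xs = range(n)
    xbar = (n - 1) / 2            # mean of 0, 1, ..., n-1
    ybar = sum(temps) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, temps))
    slope = sxy / sxx
    intercept = ybar - slope * xbar
    ss_resid = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(xs, temps))
    se = math.sqrt(ss_resid / (n - 2)) / math.sqrt(sxx)  # standard error of slope
    return slope, slope / se

# Synthetic anomalies: 0.01 deg C/yr warming plus a sine wave standing in
# for year-to-year "natural variability" (illustrative numbers only).
temps = [0.01 * yr + 0.1 * math.sin(yr) for yr in range(50)]

slope_full, t_full = trend_t_stat(temps)          # full 50-year record
slope_short, t_short = trend_t_stat(temps[-10:])  # last 10 years only

print(f"50-year trend: {slope_full:+.4f} deg C/yr, t = {t_full:.1f}")
print(f"10-year trend: {slope_short:+.4f} deg C/yr, t = {t_short:.1f}")
```

With this many data points, a t-statistic above roughly 2 corresponds to significance at about the 5% level; the 50-year slope clears that bar easily while the 10-year slope does not, even though both windows contain the identical underlying warming trend. This is exactly the pattern Jones describes: the same data can look significant or insignificant depending on the length of the period examined.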

Last, the public is not well-versed in thinking critically when it comes to scientific matters. Science is not black and white and most scientific conclusions are complex and layered. Rather than looking at data objectively with a healthy dose of skepticism, sometimes the media and politicians will take their own point of view and then search for scientific data to back up their already-formed opinions.  This leads to an imbalance in the way scientific facts are presented to the public.

These are significant issues when it comes to science and how scientists present scientific data to the public, and these problems point to a desperate need to overhaul science education. It is not enough to rely on the “experts,” and although there will always be a need for expert opinion, everyone needs to have a better understanding of science and the scientific process. What is being discussed in “Climategate” is the reason I wrote Real Science-4-Kids (RS4K). RS4K curricula help provide a foundation for science that kids can build on in the future. With RS4K, children are given the tools they need to critically evaluate and interpret scientific facts. With the Kogs, kids are taught how science is connected to history, philosophy, technology, and critical thinking. Better science education is not just a necessity for children who want to become scientists when they grow up; it is also imperative that politicians, journalists, and everyday readers who follow the news be educated to think critically and understand the limitations of scientific investigation.

In early January, the President announced several new public-private partnerships that would invest more than $250 million to help prepare more than 10,000 new math and science teachers and provide extra training to more than 100,000 existing teachers.

The current administration’s campaign is called “Educate to Innovate” and is pursuing many avenues to increase U.S. students’ standing in science, technology, engineering and mathematics. There is a concerted push to have teachers who can confidently and enthusiastically teach science. (See the full White House news release for details on how universities and private companies are working on the problem.)

“Passionate educators with deep content expertise can make all the difference,” President Obama said in a prepared statement, “enabling hands-on learning that truly engages students — including girls and underrepresented minorities — and preparing them to tackle the ‘grand challenges’ of the 21st century such as increasing energy independence, improving people’s health, protecting the environment and strengthening national security.”

All of this points out that our schools still fall short in properly educating children in the science and math they will need to succeed in their adult careers. As reported in previous blog postings here, our students’ rankings in science continue to fall compared with those of many other countries, which does not bode well for our ability to innovate and compete in the future.

For those who teach at home or in private settings, it points out the need to use engaging science materials at an early age. To make sure both the students and the teacher are comfortable, use materials that include “how-to” manuals for non-scientist adults who are doing the teaching.

A case in point is that Gravitas Publications was begun in 2002 because one home-school mom – who actually was a scientist with a Ph.D. – could not find age-appropriate, engaging textbooks that built a real foundation for understanding science.

Home schoolers of all backgrounds must feel confident in being able to present the lessons and make it exciting for the student – just like the national effort to train professional teachers.

What about the President singling out the need to engage girls? Here’s just one statistic that illustrates the problem: only 17% of undergraduate engineering degrees are awarded to women.

The decline of serious science coverage in primary news media – and what that trend means for our future – was thoughtfully covered in an August 17 article in The Nation magazine entitled “Unpopular Science” by Chris Mooney and Sheril Kirshenbaum.

Good science coverage should report on immediate topics such as the spread of flu and medical discoveries for better health. It should also cover solid science news about climate change, technology advancement, and energy developments, because the public must understand facts about subjects like these in order to shape national policy and make informed judgments. To avoid accepting news straight off of a press release, we need reporters with the experience and specialized knowledge to separate important facts from “fluff.”

Mooney and Kirshenbaum point out that the decline in the number and size of newspapers has triggered cuts in knowledgeable science reporting. And in television, the proliferation of cable news channels has meant that the major broadcast networks have less of a captive audience and fewer financial resources to cover serious science topics in depth.

They write:

From 1989 to 2005, the number of US papers featuring weekly science-related sections shrank from ninety-five to thirty-four. Many of the remaining sections shifted to softer health, fitness and “news you can use” coverage, reflecting the apparent judgment that more thorough science or science policy coverage just doesn’t support itself economically. And the problem isn’t confined to newspapers. Just one minute out of every 300 on cable news is devoted to science and technology, or one-third of 1 percent. Late last year CNN cut its entire science, space and technology unit.

The overall result is that, although there is a great deal of science information available online, we must search for it and use our own critical thinking abilities to discern what is important to know and what is today’s fad. Learning basic science concepts and how they apply to our daily life is an important step toward making sense of science “news” in the future. Learning critical thinking skills and the discipline of the scientific method for determining facts will serve non-scientists as well as scientists throughout life.

Americans are knowledgeable about basic scientific facts that affect their health and daily lives, but they are less able to answer questions about more complex science topics, according to a Pew study released in early July. These results support Gravitas’ long-standing philosophy that we learn and retain science information better when it is put into context and associated with our real-world experience.

The Pew Research Center for the People & the Press in collaboration with the American Association for the Advancement of Science (AAAS), the world’s largest general scientific society, conducted a general survey of opinions about the state of science and its impact on society. They also asked science knowledge questions in a separate survey of 1,005 adult members of the general public. Quoting from that section of the published report:

Fully 91% know that aspirin is an over-the-counter drug recommended to prevent heart attacks and 82% know that GPS technology relies on satellites. And topics covered in major news stories also are widely understood; 77% correctly identify earthquakes as a cause of tsunamis and 65% can identify CO2 as a gas linked to rising temperatures.

Slightly more than half (54%) knows that antibiotics do not kill viruses along with bacteria, and about the same percentage (52%) knows that what distinguishes stem cells from other cells is that they can develop into many different kinds of cells. And some high-school science knowledge is elusive for most Americans: Fewer than half (46%) know that electrons are smaller than atoms.

There were several other interesting results in the survey of opinions about the state of science and its impact on society, as the report presented points of agreement and disagreement between scientists who were surveyed and the general public.

For example, majorities of both groups point to advances in medicine and life sciences as important achievements of science. About half of the public (52%) cites medicine – including health care, vaccines, and medical cures – when asked to describe ways that science has positively affected society; by comparison, just 7% mention communications and computer technology. Similarly, most scientists (55%) mention a biomedical or health finding when asked about the nation’s greatest scientific achievement of the last 20 years.

The published report (Public Praises Science) also reveals percentages of opinions of the public versus scientists on topics such as natural evolution, belief in climate change from human activity, the relative standing of U.S. science achievements, and more.

Read or download the full report from the Pew Research Center website.

“Kimberly Kauer was worried about her 6-year-old daughter’s math skills. Her school doesn’t assign homework, and Ms. Kauer wasn’t sure which math concepts her daughter fully understood. To quell her fears, Ms. Kauer started her daughter on an online educational program for young children called DreamBox Learning. DreamBox uses interactive games to teach math and analyzes users’ progress as they complete lessons.”

The above is a quote from a July 22, 2009, article by Joseph De Avila in the Wall Street Journal. The article goes on to discuss several interactive websites that concerned parents can access (and pay for by subscription) to test or supplement their child’s learning. The sites mentioned in the article include DreamBox, SmartyCard, Brightstorm, and Grockit.

With the caveat that the effectiveness of these newer sites has not been extensively studied, the point is made that this is a growing industry. The end of the article cites some studies suggesting that “blended” learning (traditional, face-to-face teaching plus online learning) may be the most effective, at least for older students.

Gravitas Publications this year took its first steps in using online, interactive technology with the introduction of the company’s Club Services. Among the benefits of the subscription service is online testing for each chapter of the publisher’s chemistry, biology and physics textbooks. Textbooks are currently available in all three subjects for kindergarten through third grade level and for fourth grade through sixth grade level. There is also a chemistry text for grades seven through nine. Online tests are graded automatically and results can be printed out. Tests may be retaken as needed, and questions are shuffled each time.

Just last month, the University of New Mexico Cancer Center issued a news release announcing that the National Institutes of Health (NIH) has selected a research team at the University of New Mexico Cancer Center to lead the tenth National Center for Systems Biology in the U.S. with a five-year, $14.5 million grant:

“This grant will bring together people from many different disciplines and backgrounds, including biologists, engineers, mathematicians and physicists at UNM, Los Alamos National Laboratory and Sandia National Laboratories,” said Janet Oliver, PhD and principal investigator of the new center, called the New Mexico Spatiotemporal Modeling Center (STMC). “Together, we expect to develop the new tools needed to understand the dynamic biochemical and spatial events that control the behavior of immune and cancer cells.”

It is helpful for home school parents and teachers to be aware of this relatively new method of scientific investigation and medical research. Systems biology is an emerging interdisciplinary field that joins biology, mathematics, engineering and the physical sciences. Using experimental and computational approaches, it builds on existing knowledge of genetic and molecular functions to study and understand biological processes in cells, tissues and organisms.

Readers of Gravitas’ blogs and articles already know that Gravitas promotes “interdisciplinary” approaches to learning. That is why the Kogs-4-Kids™ series links chemistry with other subjects such as history, philosophy and technology. We believe this promotes enhanced critical-thinking and problem-solving skills, as well as better learning by repetition in various contexts and linking facts to real world situations. The announcement of this tenth systems biology center validates that our students need to be versed in interdisciplinary understanding.

The systems perspective brings an engineering paradigm into the science of biology to study the complex design of living things. In many ways, it is a new biology that will add much to our knowledge base, much as quantum mechanics extended classical physics.

This is also a good example for children of how working scientists discover new facts and then put them into practical use, perhaps in this case producing new treatments or even cures for diseases such as cancer.

Recently a very supportive user of Real Science-4-Kids teaching materials wrote to Gravitas with a suggestion. She knows of some home school parents who have strong objections to the use of the word “design” in a few places in the RS4K biology texts.

The Gravitas philosophy is that its materials teach the facts about gathering and using scientific concepts and data but leave how that information is interpreted to each student and teacher. The objection to the word “design” seems to come from the idea that using the word is the same as promoting the science philosophy of “intelligent design” that pertains to the origin of life.

The specific language used in teaching is very important, so it is worth being clear that “design” is not a taboo word in science; it is commonly used in its generic meaning of “to draw up a plan or execute according to a plan.” In the automotive industry, one can design a car. In mathematics, one can design a set of formulas to solve a problem.

Books totally unrelated to the concept of intelligent design can use the term in a title, such as: An Introduction to Systems Biology: Design Principles of Biological Circuits (Chapman & Hall/CRC Mathematical and Computational Biology)

Here is an array of science articles, again not pertaining to the discussion of intelligent design:

Protein design in biological networks: from manipulating the input to modifying the output. Van der Sloot AM, Kiel C, Serrano L, Stricher F. Protein Eng Des Sel. 2009 Jul 2.

A systematic design method for robust synthetic biology to satisfy design specifications. Chen BS, Wu CH. BMC Syst Biol. 2009 Jun 30;3(1):66.

Synthetic biology: exploring and exploiting genetic modularity through the design of novel biological networks. Agapakis CM, Silver PA. Mol Biosyst. 2009 Jul;5(7):704-13. Epub 2009 May 14.

Common themes in the design and function of bacterial effectors. Galán JE. Cell Host Microbe. 2009 Jun 18;5(6):571-9

This illustrates that one should be careful about judging a science text’s philosophy by scanning for a particular word or checking to see if it is used in the index.

Now and then, a parent using a Real Science-4-Kids lab workbook contacts Gravitas Publications to say that a particular experiment did not “work” as expected. From an educator’s point of view, a “failed” experiment can be a very rich source of learning.

It is always possible to discuss what the outcome of any given experiment was meant to teach had it gone “right.” This can provide a benchmark to compare with the student’s actual results.

An in-depth discussion of how and why it went “wrong” can be just as fruitful. It is the perfect time for student and teacher to begin an “open inquiry” scientific investigation with lots of questions. (See the June 23 posting in this blog for the steps to use a critical thinking lens.) The teacher can spark a long list of questions by asking just a few for the student to consider, such as:

  1. What parts of the experiment were most likely to go wrong? Where should we begin looking for why we had a different outcome than expected?
  2. Should we write out a new checklist to make certain something was not overlooked in the process of doing the experiment?
  3. At what point did my results begin to differ from what was expected? What could have gone “wrong” up to that point?

After thinking about these questions, the student may actually write a new hypothesis about why the experiment result varied. Now the student has a new experiment that can be carried out to test the new hypothesis!

Let’s say the original experiment was to show a chemical reaction involving just a few household substances. The gathering of data (the many questions above) revealed that one of the ingredients had been on the shelf for a very long time or another ingredient had been cold from the refrigerator instead of being at room temperature. Now new questions could be posed about whether the effects of long exposure to air or to extreme temperatures could affect the experiment’s outcome. If the new hypothesis is that the baking soda was too “old,” then repeating the experiment with a new box of baking soda might let the student confirm or eliminate that as a cause.

Using a “failure” to let the student use his or her own critical-thinking (or problem-solving) skills is a terrific way to promote curiosity and confidence. You might also want to relate the following story about the importance of learning from failure, based on information from 3M’s Post-it Notes® Web page:

Dr. Spencer Silver, a 3M scientist, discovered the formula for a lightly sticky adhesive back in 1968 (reportedly making the weak adhesive while searching for a formula for a stronger adhesive). But it was Silver’s colleague, Art Fry, who finally came up with a practical use for it. The idea for repositionable notes struck Fry while singing in the church choir. His bookmark kept falling out of his hymnal, causing him to lose his page. So, taking advantage of a 3M policy known as the “bootlegging” policy, Fry used a portion of his working hours to develop a solution to his problem. More than 10 years after Silver’s original “discovery,” the world began singing the praises of his pet project: Post-it® Notes.

Has this happened to you? The gist of something you hear does not seem quite “right,” yet the speaker sounds so logical it seems it must be a statement of fact.

Or, your child says something that tells you she or he has drawn the wrong conclusion from diverse bits of information?

Just as an optical lens can help our eyes see more clearly, a properly balanced (objective) intellectual “lens” can help us think more clearly. It is called a “critical thinking lens,” and its use should not be limited to questions of science. It is helpful in all areas of our lives. Let’s look at the steps for evaluating information “critically” as presented in the Kogs-4-Kids Critical Thinking workbook.

A “critical thinker” of any age collects information, evaluates the information, draws conclusions using logic, and then further evaluates the logical conclusions. The most important part of all of these steps is to ask questions – the right questions.

To collect information, begin with questions journalists always try to answer when writing a story: Who? What? When? Where? How? Strive to clarify the facts further with questions such as: Who else? How much? Exactly what time?

To evaluate all of the collected details, ask questions that explore the relevance and significance of each fact. Is each fact substantial, crucial or applicable to the answer (conclusion) you wish to find?

Next, logic helps a critical thinker avoid errors in a conclusion by exploring validity, consistency and logical flaws.

Finally, evaluate the conclusion itself with questions about its fairness, reasonableness, depth and breadth. Ask questions such as: Does my conclusion seem practical? Did I gather my information from only one source? Could there be information that I am missing?

For illustration, let’s look at an overly simple example. Let’s say a child makes the statement that real cats are yellow. To help him learn if his view of the world (relating to the color of cats) is valid, you might ask him a series of fact-finding questions. Who or what makes you think this is true? He says: “I have seen yellow cats.” Where and when have you seen them? He says: “When I play in my backyard.” When exactly do you play in the yard and see the cats? He says, “Every afternoon.” How many cats do you see? He says, “Three.”

You agree that these are interesting facts, and they are relevant because he has experienced seeing yellow cats himself. It even seems logical for him to draw the conclusion that real cats are yellow based on his observations.

But now the “logical” conclusion must be evaluated once more for fairness, reasonableness, depth and breadth.

You ask another series of questions: Do you think there may be other cats that only go outside in the morning but not the afternoon? He says, “Maybe.” Do you think some neighbors have cats that stay in the house all of the time so that you never see them? He says: “Maybe.” Do you have friends who have a pet cat? “Yes.” Have you seen whether those cats are all yellow? “No.” Can you think of any other place where you have seen a cat? “At my cousin’s house.” What color was that cat? “It was black and white. So real cats are not always yellow after all!”

This critical thinking lens (process) can be used with any statement (hypothesis, in science) to test its validity. Try it with a conclusion you hear in a newscast or with an often-heard bit of folk wisdom.