Where’s the Science in Science Education?


A constant refrain in articles about education and the economy highlights the need for more of a focus on STEM: Science, Technology, Engineering, and Mathematics. In fact, the National Science Foundation and many other public and private entities spend billions each year to advance STEM education. STEM is indeed critical for American economic competitiveness and progress. So naturally you’d expect that STEM subjects would be among the best researched of all, right?

Wrong. My colleagues and I just published a review of research on elementary science programs in the most prestigious science education journal, the Journal of Research in Science Teaching (JRST). I’ll get to the substantive conclusions in a moment. What I want to focus on first is the most important finding of the review: that we found only 23 studies that met our inclusion standards. Our standards were not that tough. We required that studies compare experimental to well-matched or randomly assigned control groups on measures that fairly assessed what was taught in both groups. Studies had to last at least four weeks (less than the 12 weeks we’ve required in every other subject). Our 23 studies were the product of all qualifying research published in English throughout the world over a period of more than 30 years. That’s fewer than one study per year. Had we required random assignment and analysis at the level of random assignment, only seven studies would have qualified.

Of course, there are thousands of studies of elementary science teaching. Why did so few meet our standards? A lot of them had no control group, or no measure of science learning. Many were very brief lab studies lasting from an hour to a few days.

Among the few studies that did compare experimental and control classes over at least four weeks, most had obvious problems that made it impossible to include them. Many used measures designed to register gains in the experimental group but unrelated to what was taught in the control group. For example, many studies taught a unit on, say, electricity, to the experimental group and compared their gains on an electricity test to those of a group that was not taught electricity at all during the experiment.
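For readers who like to see the screening rules stated precisely, here is a minimal, hypothetical sketch of the inclusion criteria above expressed as a simple filter. The Study fields, the qualifies function, and the example records are my own illustration, not the review’s actual coding procedure.

```python
from dataclasses import dataclass

@dataclass
class Study:
    title: str
    has_matched_or_random_control: bool  # well-matched or randomly assigned control group
    fair_outcome_measure: bool           # measure assesses content taught in BOTH groups
    duration_weeks: float                # length of the intervention

def qualifies(study: Study, min_weeks: float = 4.0) -> bool:
    """Apply the three screening rules described in the text."""
    return (
        study.has_matched_or_random_control
        and study.fair_outcome_measure
        and study.duration_weeks >= min_weeks
    )

candidates = [
    Study("one-hour lab study", True, True, 0.03),                     # too brief
    Study("electricity unit vs. untaught controls", True, False, 6),   # unfair measure
    Study("year-long inquiry program", True, True, 36),                # qualifies
]

print([s.title for s in candidates if qualifies(s)])
# ['year-long inquiry program']
```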

Among the studies we could include, the outcomes favored inquiry approaches that emphasized professional development for teachers, using methods such as cooperative learning and reading-science integration. Inquiry methods using science kits did no better than control groups, and disturbingly, these were the highest-quality studies. There were positive effects for approaches emphasizing technology, but there were very few studies in this category.

The larger question posed by our review, however, is why there were so few qualifying studies. How could the entire field of science education produce fewer than one methodologically adequate experimental study of practical elementary science approaches per year?

At first, my colleagues and I assumed this was simply because science educators focus more on secondary schools than on elementary schools. However, we are now working on a review of secondary science programs, under a grant from the Spencer Foundation. We are not finding markedly more qualifying studies at that level, either.

The number of studies that meet similar inclusion standards in elementary and secondary reading and math is much higher than in science. What is it about science education that makes such research rare? Of course, there is a poignant irony in the observation that among all major branches of educational research, science education is least likely to use rigorous scientific evidence to evaluate its own programs. Science educators should be, and could still become, leaders in evidence-based reform, but this will require a serious change in direction in the field.

Lessons from Innovators: STEM Learning Opportunities Providing Equity (SLOPE)

The process of moving an educational innovation from a good idea to widespread effective implementation is far from straightforward, and no one has a magic formula for doing it. The William T. Grant and Spencer Foundations, with help from the Forum for Youth Investment, have created a community of grantees in the federal Investing in Innovation (i3) program to share ideas and best practices. Our Success for All program participates in this community. In this space, in partnership with the Forum for Youth Investment, I highlight observations from the experiences of i3 grantees other than our own, in an attempt to share the thinking and experience of colleagues out on the front lines of evidence-based reform.

This blog post is based on a conversation between the Forum for Youth Investment and Sharon Twitty, Project Director for the STEM Learning Opportunities Providing Equity (SLOPE) i3 project based at the Alliance for Regional Collaboration to Heighten Educational Success (ARCHES). SLOPE is a development project designed to help students succeed in Algebra in the 8th grade and prepare for careers in science, technology, engineering, and math (STEM). Throughout the conversation, Twitty reflected on how relationship building and her background in communications have helped her successfully navigate a complex and geographically dispersed effort. Her reflections and advice to others working to implement and evaluate interventions are summarized here.

Set realistic goals
The SLOPE project is currently active in 17 districts around the state of California and is serving close to 3,500 students. Although SLOPE has met all of the participation targets identified in their i3 proposal, and although Twitty feels implementation has been rigorous, she notes that their three-tiered model is quite complex and that it is too early to determine whether the intervention is worthy of further expansion. Her team has learned a lot during the first year of implementation, in particular about what to expect from schools and teachers. “We know from change theory that it takes people 3-5 years to get comfortable with an innovation of this nature. The more traditional your values and the more ‘stand and deliver’ your method, the harder it is to acclimate to an intervention like this. Change is hard and people change slowly.” She suggests building in a planning and development year for any complex change effort, because it helps keep expectations realistic and gives teachers time to adjust and prepare for new practices and tools. Twitty notes that, especially for complex projects, piloting in the field and refining the model before moving into a formal study environment is essential.

Relationships are the work
“Don’t let anyone tell you differently. Relationships are the work. You move at the speed of trust. Without relationships, any intervention, no matter how strong, is going to fail,” says Twitty. That is why she did everything she could to nurture and build personal relationships with every school and teacher involved, from big schools in a city to the single rural teacher implementing the intervention by herself. Twitty did a combination of small and big things to foster communication and engagement. She personally does a site visit at every school at least twice a year. Whenever possible, she highlights the good work of schools in newsletters, in local newspapers, and with policymakers. Sometimes she brings congressional delegates with her on site visits to highlight the project and show the schools she values their work. “I am not constantly in their face, but rather I focus on being responsive, respectful, and trying to make it as easy as possible for participants. I let them know that I need them and I thank them regularly,” says Twitty. Twitty has also learned that the more time principals spend in classrooms with SLOPE, the more they learn about the project. One strategy Twitty has used to encourage time in SLOPE classrooms is to send administrators a list of questions they can only answer by visiting classrooms and observing teachers in action. This makes participating teachers feel valued and generates useful anecdotes for communications with other sites, funders, and policymakers.

Be efficient
One concern with a relationship-intensive approach is that it may not be scalable. Twitty isn’t worried. In her opinion this just requires working smarter, not harder. “It is not that hard to build relationships. People just want to feel taken care of. Be deliberate and strategic, and watch for opportunities to do little things.” In fact, she has found that in some cases, just giving out her cell phone number and being responsive on email (which she tries to do within 24 hours), have been enough to garner support. In addition, she is intentional about making sure every school and every member of the team feels they have an important role to play. That’s why every school engaged in the project, whether in a small town or a large district where multiple schools participate, gets a site visit from someone in a leadership role. As the project grows, regional hubs can be established for personal contact and the project director can make their presence felt from a distance — through webinars, email, and Skype or other video-chat services.

Don’t forget the control group
One thing Twitty has learned directing this project is that teachers and schools are not used to participating in projects that include a randomized controlled trial. She has found that it is important to nurture relationships with teachers in the control group as well as the treatment group. This helps maintain their willingness to participate in the study and provides valuable information about what happens when the intervention is not in place. “You have to keep them engaged,” Twitty notes. “Just getting a stipend is not enough. I’ve worked hard to build community among the comparison teachers by empowering them to feel good about the project. I send them information about the intervention and how it is part of a national initiative, and explain why it is important to keep their classes ‘uncontaminated.’ I explained the What Works Clearinghouse and why it is a big deal for a little development project like ours to meet their evaluation standards. It is amazing how far a little personal attention and explanation can go.”

The bottom line in this day and age is time. People value their time and want to participate in something that is relevant and significant. Respecting that and building relationships go a long way.