Whadja Do In School Today?

Every parent of a four- or five-year-old knows the drill. Your child comes home after pre-kindergarten or kindergarten. “Whadja do today?” you say with eager anticipation, thinking of all the friends your child must have made, the stories your child heard, the songs your child sang, the projects or dress-up or phonics or math, or…well, anything.

“Nuffin,” your child says, wandering out of range to avoid further interrogation.

You know your child did a whole lot more than “nuffin.” But how can you find out so that you can build on what the teacher did each day?

One answer is something our group at the Success for All Foundation created using Investing in Innovation (i3) funding with partners at Sesame Workshop, Sirius Thinking, and Johns Hopkins University. We call it Home Links. Home Links are 10-15 minute videos, akin to short television shows, that parents and children watch together four evenings a week. Each show uses content from Sesame Street and animations we have made with Sirius Thinking, so they are a bit like Sesame Street shows themselves, with one huge difference: the content of the shows reflects the activities that children and teachers were doing that day in school.

The Home Links give kids reinforcement and extension of vocabulary and skills they learned that day, and that’s important. But more important, they tell parents what’s happening in school. When a show contains skits about fall, the letter V, counting to five, and singing traditional songs, the parents know that all of these things are happening in school. Our surveys found that 96% of the time, a parent, grandparent, or other relative watches with the child. At the end of each show there is music and movement, and parents tell us they dance with their children, and they love the closeness and fun.

But parents also now know how to support their children’s learning. If the topic is markets, they know to point out interesting things when they next are at the market with their child. If the letter is T, they know to point out things that begin with T. If the math segment is on shapes, parents know to ask children about shapes they see in daily life. Home should not be another classroom, but it’s the ideal place for a child to learn that the things he or she is learning in school are important to his or her parents and exist in his or her community. It also helps children understand that knowing about and learning about those things brings pride and builds curiosity.

Home Links are sent home on DVDs each day. We are now looking for funding to make an online version so families can download Home Links to digital devices such as phones and tablets.

Right now, Home Links are being used in approximately 300 preschool and kindergarten classes already using our proven Success for All whole-school approach. In the future, we hope to disseminate Home Links to preschools and kindergartens whether or not they use Success for All.

When this happens, more and more parents won’t have to ask, “Whadja do in school today?” They’ll know. And they’ll know how to build on what they find out.

And that ain’t nuffin’.


The Investing in Innovation (i3) program is a federal competitive grant program at the U.S. Department of Education, within the Office of Innovation and Improvement (OII). It provides funding to local education agencies (LEAs), and to nonprofit organizations in partnership with LEAs and/or schools, to develop and expand innovative practices that can serve as models of best practice, and to identify and document best practices that can be shared and taken to scale. Target areas include improving student achievement or student growth, closing achievement gaps, decreasing dropout rates, increasing high school graduation rates, and increasing college enrollment and completion rates.

More information on the i3 program can be found here.

More information on Success for All Foundation’s grant Around the Corner: A Technology-Enhanced Approach to Early Literacy can be found here.


The Rapid Advance of Rigorous Research

My colleagues and I have been reviewing a lot of research lately, as you may have noticed in recent blogs on our reviews of research on secondary reading and our work on our web site, Evidence for ESSA, which summarizes research on all of elementary and secondary reading and math according to ESSA evidence standards.  In the course of this work, I’ve noticed some interesting trends, with truly revolutionary implications.

The first is that reports of rigorous research are appearing very, very fast.  In our secondary reading review, there were 64 studies that met our very stringent standards.  Fifty-five of these used random assignment, and even the 9 quasi-experiments all specified assignment to experimental or control conditions in advance.  We eliminated all researcher-made measures.  But the most interesting fact is that of the 64 studies, 19 had publication or report dates of 2015 or 2016.  Fifty-one have appeared since 2011.

This surge of recent publications on rigorous studies was greatly helped by the publication of many studies funded by the federal Striving Readers program, but Striving Readers was not the only factor.  Seven of the studies were from England, funded by the Education Endowment Foundation (EEF).  Others were funded by the Institute of Education Sciences at the U.S. Department of Education (IES), the federal Investing in Innovation (i3) program, and many publishers, who are increasingly realizing that the future of education belongs to those with evidence of effectiveness.  With respect to i3 and EEF, we are only at the front edge of seeing the fruits of these substantial investments, as there are many more studies in the pipeline right now, adding to the continuing build-up in the number and quality of studies started by IES and other funders.  Looking more broadly at all subjects and grade levels, there is an unmistakable conclusion: high-quality research on practical programs in elementary and secondary education is arriving in amounts we never could have imagined just a few years ago.

Another unavoidable conclusion from the flood of rigorous research is that in large-scale randomized experiments, effect sizes are modest.  In a recent review I did with my colleague Alan Cheung, we found that the mean effect size for large, randomized experiments across all of elementary and secondary reading, math, and science is only +0.13, much smaller than effect sizes from smaller studies and from quasi-experiments.  However, unlike small and quasi-experimental studies, rigorous experiments using standardized outcome measures replicate.  These effect sizes may not be enormous, but you can take them to the bank.

In our secondary reading review, we found an extraordinary example of this. The University of Kansas has an array of programs for struggling readers in middle and high schools, collectively called the Strategic Instruction Model, or SIM.  In the Striving Readers grants, several states and districts used methods based on SIM.  In all, we found six large, randomized experiments, and one large quasi-experiment (which matched experimental and control groups).  The effect sizes across the seven varied from a low of 0.00 to +0.15, but most clustered closely around the weighted mean of +0.09.  This consistency was remarkable given that the contexts varied considerably.  Some studies were in middle schools, some in high schools, some in both.  Some studies gave students an extra period of reading each day, some did not.  Some studies went for multiple years, some did not.  Settings included inner-city and rural locations, and all parts of the U.S.
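For readers curious about the arithmetic behind a figure like the “weighted mean of +0.09,” here is a small Python sketch of the standard approach of weighting each study’s effect size by its sample size. The seven effect sizes and sample sizes below are invented for illustration only; they are not the actual SIM study data.

```python
# Sample-size-weighted mean effect size across several studies.
# All numbers are hypothetical, chosen only to mimic the pattern
# described in the review (effects ranging from 0.00 to +0.15).

studies = [
    # (effect_size, sample_size) -- invented values
    (0.00, 1200),
    (0.12, 800),
    (0.09, 1500),
    (0.10, 950),
    (0.12, 700),
    (0.15, 600),
    (0.08, 1100),
]

# Larger studies contribute proportionally more to the mean.
total_n = sum(n for _, n in studies)
weighted_mean = sum(es * n for es, n in studies) / total_n

print(f"Weighted mean effect size: {weighted_mean:+.2f}")
```

Meta-analyses often weight by inverse variance rather than raw sample size, but the intuition is the same: a pooled estimate that lets the biggest, most precise studies count the most.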

One might well argue that the SIM findings are depressing, because the effect sizes were quite modest (though usually statistically significant).  This may be true, but once we can replicate meaningful impacts, we can also start to make solid improvements.  Replication is the hallmark of a mature science, and we are getting there.  If we know how to replicate our findings, then the developers of SIM and many other programs can create better and better programs over time with confidence that once designed and thoughtfully implemented, better programs will reliably produce better outcomes, as measured in large, randomized experiments.  This means a lot.

Of course, large, randomized studies may also be reliable in telling us what does not work, or does not work yet.  When researchers get zero impacts and then seek funding to do the same treatment again, hoping for better luck, they and their funders are sure to be disappointed.  Researchers who find zero impacts may learn a lot, which may help them create something new that will, in fact, move the needle.  But they have to then use those learnings to do something meaningfully different if they expect to see meaningfully different outcomes.

Our reviews are finding that in every subject and grade level, there are programs right now that meet high standards of evidence and produce reliable impacts on student achievement.  Increasing numbers of these proven programs have been replicated with important positive outcomes in multiple high-quality studies.  If all 52,000 Title I schools adopted and implemented the best of these programs, those that reliably produce impacts of more than +0.20, the U.S. would soon rise in international rankings, achievement gaps would be cut in half, and we would have a basis for further gains as research and development build on what works to create approaches that work better.  And better.  And then better still.

There is bipartisan, totally non-political support for the idea that America’s schools should be using evidence to enhance outcomes.  However a school came into being, whoever governs it, whoever attends it, wherever it is located, at the end of the day the school exists to make a difference in the lives of children.  In every school there are teachers, principals, and parents who want and need to ensure that every child succeeds.  Research and development does not solve all problems, but it helps leverage the efforts of all educators and parents so that they can have a maximum positive impact on their children’s learning.  We have to continue to invest in that research and development, especially as we get smarter about what works and what does not, and as we get smarter about research designs that can produce reliable, replicable outcomes.  Ones you can take to the bank.

Love, Hope, and Evidence in Secondary Reading

I am pleased to announce that our article reviewing research on effective secondary reading programs has just been posted on the Best Evidence Encyclopedia, aka the BEE. Written with my colleagues Ariane Baye, Cynthia Lake, and Amanda Inns, our review found 64 studies of 49 reading programs for students in grades 6 to 12, which had to meet very high standards of quality. For example, 55 of the studies used random assignment to conditions.

But before I get all nerdy about the technical standards of the review, I want to reflect on what we learned. I’ve already written about one thing we learned, that simply providing more instructional time made little difference in outcomes. In 22 of the studies, students got an extra daily period of reading, beyond what control students got, for at least an entire year. Yet programs (other than tutoring) that provided extra time did no better than those that did not.

If time doesn’t help struggling readers, what does? I think I can summarize our findings with three words: love, hope, and evidence.

Love and hope are exactly what students who are reading below grade level are lacking. They are no longer naive. They know exactly what it means to be a poor reader in a high-poverty secondary school (almost all of the schools in our review served disadvantaged adolescents). If you can’t read well, college is out of the question. Decent jobs without a degree are scarce. If you have no hope, you cannot be motivated, or you may be motivated in antisocial directions that give you at least a chance for money and recognition. Every child needs love, but poor readers in secondary schools are too often looking for love in all the wrong places.

The successful programs in our review were ones that give adolescents a chance to earn the hope and love they crave. One category, all studies done in England, involved one-to-one and small group tutoring. How better to build close relationships between students and caring adults than to have individual or very small group time with them? And the one-to-one or small group setting allows tutors to personalize instruction, giving students a sense of hope that this time, their efforts will pay off (as the evidence says it will).

But the largest impacts in our review came from two related programs – The Reading Edge and Talent Development High School (TDHS). These were both developed in our research center at Johns Hopkins University in the 1990s, so I have to be very modest here. But beyond these individual programs, I think there is a larger message.

Both The Reading Edge (for middle schools) and TDHS (for high schools) organize students into mixed-ability cooperative teams. The team members work on activities designed to build reading comprehension and related skills. Students are frequently assessed, and on the basis of those assessments they can earn recognition for their teams. Teachers introduce lessons, and then, as students work with each other on reading activities, teachers can cruise around the class looking in on students who need encouragement or help, solving problems, and building relationships. Students are on task, eager to learn, and seeing the progress they are making, yet students and teachers are also laughing together, sharing easy banter, and encouraging each other. Yes, this really happens. I’ve seen it hundreds of times in secondary schools throughout the U.S. and England.

Many of the most successful programs in our review also are based on principles of love and hope. BARR, a high school program, is an excellent example. It uses block scheduling to build positive relationships among a group of students and teachers, adding regular meetings between teachers and students to review their progress in all areas, social as well as academic. The program focuses on building positive social-emotional skills and behaviors, and helping students describe their desired futures, make plans to get there, and regularly review progress on their plans with their teachers and peers. Love and hope.

California’s Expository Reading and Writing Course helps 12th graders hoping to attend California State Universities prepare to pass the test used to determine whether students have to take remedial English (a key factor in college dropout). The students work in groups, helping each other to build reading, writing, and discussion skills, and to visualize a future for themselves. Love and hope.

A few technology programs showed promising outcomes, especially Achieve3000 and Read 180. These do not replace teachers and peers with technology, but instead cycle students through small group, teacher-led, and computer-assisted activities. Pure technology programs did not work so well, but models taking advantage of relationships as well as personalization did best. Love and hope.

Of course, love and hope are not sufficient. We also need evidence that students are learning more than they otherwise would have. To produce positive achievement effects requires outstanding teaching strategies, professional development, curricular approaches, assessments, and more. Love and hope may be necessary, but they are not sufficient.

Our review applied the toughest evidence standards we have ever applied. Most of the studies we reviewed did not show positive impacts on reading achievement. But the ones that did show positive impacts inspire that much more confidence. The very fact that we could apply these standards and still find plenty of studies that meet them shows how much our field is maturing. This in itself fills me with hope.

And love.

Apology

In a recent blog, I wrote about work we are doing to measure the impact on reading and math performance of a citywide campaign to provide assessments and eyeglasses to every child in Baltimore, from pre-k to grade 8. I forgot to mention the name of the project, Vision for Baltimore, and neglected to say that the project operates under the authority of the Baltimore City Health Department, which has been a strong supporter. I apologize for the omission.