ReportWire

Tag: Data and research

  • Education Department takes a preliminary step toward revamping its research and statistics arm


    In his first two months in office, President Donald Trump ordered the closing of the Education Department and fired half of its staff. The department’s research and statistics division, called the Institute of Education Sciences (IES), was particularly hard hit. About 90 percent of its staff lost their jobs and more than 100 federal contracts to conduct its primary activities were canceled.

    But now there are signs that the Trump administration is partially reversing course and wants the federal government to retain a role in generating education statistics and evidence for what works in classrooms — at least to some extent. On Sept. 25, the department posted a notice in the Federal Register asking the public to submit feedback by Oct. 15 on reforming IES to make research more relevant to student learning. The department also asked for suggestions on how to collect data more efficiently.

    The timeline for revamping IES remains unclear, as is whether the administration will invest money in modernizing the agency. For example, it would take time and money to pilot new statistical techniques; in the meantime, statisticians would have to continue using current protocols.

    Still, the signs of rebuilding are adding up. 

    Related: Our free weekly newsletter alerts you to what research says about schools and classrooms.

    At the end of May, the department announced that it had temporarily hired a researcher from the Thomas B. Fordham Institute, a conservative think tank, to recommend ways to reform education research and development. The researcher, Amber Northern, has been “listening” to suggestions from think tanks and research organizations, according to department spokeswoman Madi Biedermann, and now wants more public feedback.  

    Biedermann said that the Trump administration “absolutely” intends to retain a role in education research, even as it seeks to close the department. Closure will require congressional approval, which hasn’t happened yet. In the meantime, Biedermann said the department is looking across the government to find where its research and statistics activities “best fit.”

    Other IES activities also appear to be resuming. In June, the department disclosed in a legal filing that it has reinstated, or plans to reinstate, 20 of the 101 terminated contracts. Among the activities slated to restart are 10 Regional Education Laboratories that partner with school districts and states to generate and apply evidence. It remains unclear how all 20 contracts can be restarted without federal employees to run competitive bidding processes and oversee them.

    Earlier in September, the department posted eight new jobs to help administer the National Assessment of Educational Progress (NAEP), also called the Nation’s Report Card. These positions would be part of IES’s statistics division, the National Center for Education Statistics. Most of the work in developing and administering tests is handled by outside vendors, but federal employees are needed to award and oversee these contracts. After mass firings in March, employees at the board that oversees NAEP have been on loan to the Education Department to make sure the 2026 NAEP test is on schedule.

    Only a small staff remains at IES. Some education statistics have trickled out since Trump took office, including the agency’s first release of higher education data, on Sept. 23. But the data releases have been late and incomplete.

    No new grants for education studies appear to have been issued since March, according to researchers familiar with the federal grant-making process who asked not to be identified for fear of retaliation. A big obstacle is that a contract to conduct peer review of research proposals was canceled, so new proposals cannot be properly vetted. The remaining staff is trying to make annual disbursements for older multi-year studies that haven’t been canceled.

    Related: Chaos and confusion as the statistics arm of the Education Department is reduced to a skeletal staff of 3

    With all these changes, it’s becoming increasingly difficult to figure out the status of federally funded education research. One potential source of clarity is a new project launched by two researchers from George Washington University and Johns Hopkins University. Rob Olsen and Betsy Wolf, who was an IES researcher until March, are tracking cancellations and keeping a record of research results for policymakers. 

    If successful, it will shed much-needed light on the chaos.

    Contact staff writer Jill Barshay at 212-678-3595, jillbarshay.35 on Signal, or barshay@hechingerreport.org.

    This story about reforming IES was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.

    The Hechinger Report provides in-depth, fact-based, unbiased reporting on education that is free to all readers. But that doesn’t mean it’s free to produce. Our work keeps educators and the public informed about pressing issues at schools and on campuses throughout the country. We tell the whole story, even when the details are inconvenient. Help us keep doing that.

    Join us today.

    Jill Barshay

  • OPINION: After-school and summer programs can help more students learn to embrace numbers and become ‘math people’ after all


    As a teacher, I heard it all the time: “I’m not a math person.” 

    I would be in line at the grocery store, wearing a math T-shirt one of my students got for me, and I’d hear it: “Algebra? Who needs it?”  

    I would ask the person if they’d shopped with a coupon, bought a cheaper store brand, looked at the unit price on toilet paper or if they’d mentally calculated their total before heading to the checkout line. 

    I’d smile and say — “All of that is algebraic thinking.”  

    Despite my assurances, the idea that “I am just not into math” was, and still is, pervasive. Sometimes the thought comes from students, often from parents or colleagues, and more often than not it is said with a kind of resignation — as if math were a club you either got into early or missed forever. 

    That mindset has never been more insidious than it is now, when mathematics knowledge is needed more than ever. Every day we rely on math to interpret data, whether it’s tracking public health trends, forecasting weather, making financial decisions or navigating technology. 

    Related: A lot goes on in classrooms from kindergarten to high school. Keep up with our free weekly newsletter on K-12 education.  

    The ability to reason quantitatively, spot patterns and make decisions based on evidence has become integral to how we all navigate the world. Yet recent national data shows we’re falling short. Fewer than one in three eighth graders are on grade level in math, according to the latest National Assessment of Educational Progress scores. 

    Across nearly every industry, from agriculture to aerospace, mathematical reasoning is becoming more essential. Employers across sectors increasingly need people who can interpret data, test ideas and solve unfamiliar problems.  

    If we want more young people to access these growing opportunities, we need to rapidly expand access to the after-school and summer programs that help them develop the confidence and curiosity to build math skills. 

    Right now, too many young people are missing out. After-school and summer learning programs are rarely included in state or federal improvement plans, even though research shows they reinforce classroom learning and build student confidence.

    In addition, educators in these programs could benefit from training and resources to help young people connect more fully with math.  

    With the right support from funders and policymakers, these challenges can be addressed, and millions more students can build the math skills they’ll need. Every student deserves the chance to build confidence in math, not just those who excel early.  

    The stakes are far too high to keep throwing the same solutions at the problem. We need to think differently — not just about how we teach math, but how and where young people experience it. 

    After-school and summer programs give young people a chance to engage with math in low-pressure settings that don’t feel like an extension of school. They aren’t bound by curriculum or high-stakes test prep.  

    In these programs, educators can naturally bring math into real-life experiences — budgeting for a community project, designing a video game, planning the route for a field trip or understanding the data behind a favorite sport or song.  

    These programs also create opportunities to engage families in everyday math and to elevate older youth as peer mentors or tutors — making math feel more personal, social and relevant. 

    Out-of-school experiences mean students aren’t expected to memorize a formula before they can explore an idea. They’re encouraged to ask questions, try things out and see what happens. 

    And, importantly, they can take time to try, reflect and try again, without fear of being wrong. 

    Related: A lot of hope was pinned on after-school programs — now some are shutting their doors 

    When mistakes are treated as part of the mathematical reasoning process, students start to take more risks. They begin trusting themselves to navigate challenges, which builds their confidence. 

    That shift is especially important for students who have internalized the message that math isn’t for them, and it will carry them much further than an emphasis on better test scores and grades.  

    At STEM Next, we’re working to foster that shift by supporting after-school and summer programs, training informal educators and strengthening the learning environments where math confidence takes root.  

    Our recent publication offers a closer look at how after-school and summer programs are helping students experience math differently, and why that shift matters now more than ever.  

    Expanding access to these programs isn’t just about helping kids grow math skills today; it’s a long-term investment in our workforce and our future.

    Camsie McAdams is director of the Institute for a STEM Ready America at STEM Next Opportunity Fund. 

    Contact the opinion editor at opinion@hechingerreport.org.  

    This story about after-school programs was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Hechinger’s weekly newsletter.  


    Camsie McAdams

  • Public school kids were already going missing. There’s even more to come


    For many Americans, the specter of missing children evokes forlorn images on milk cartons or Amber Alerts on cell phones. But a new report from the Brookings Institution suggests that the pandemic may have created a new generation of lost kids — this time, from classrooms.

    Lost but not found

    The number of students who are not in school exploded in 2020 after the Covid outbreak, and many still aren’t back. The missing kids are not in private schools or being homeschooled. Many children are simply not enrolled anywhere, according to the Brookings analysis of federal data. Some are older teens, nearly at the end of their high school years, but many are younger. And no one knows whether these kids are getting an education.

    During the 2021-22 school year, roughly 2 million additional students, ages 5 through 17, disappeared from both public and private school rolls, a 450 percent increase in missing kids over 2019-20, according to the report. I would have guessed that families had relocated during the pandemic, temporarily or permanently, and that administrative records were in too much disarray to track everyone down. But even by 2023-24, a normal school year, the number of children unaccounted for (not in public or private school) still totaled 2.1 million, or almost 4 percent of the nation’s 54 million kids ages 5 to 17, nearly five times the pre-pandemic number.

    To calculate the number of missing children, the Brookings researchers subtracted school enrollment figures from U.S. population data. It’s possible that there is some statistical discrepancy between data from the U.S. Census Bureau and the National Center for Education Statistics that will be sorted out in the future. But it’s also possible that these missing children are not learning to read and do math, and that doesn’t portend well for the nation’s future. A 2023 analysis of state data by Stanford University professor Thomas Dee, publicized by the Associated Press, first revealed the pandemic increase in missing children. This Brookings report confirms that it is an enduring mystery.
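    The subtraction described above can be sketched as back-of-the-envelope arithmetic. The figures below are the approximate 2023-24 numbers cited in this article, not the underlying Brookings dataset:

    ```python
    # Rough version of the Brookings-style calculation: children missing from
    # school rolls as a share of the school-age population.
    school_age_population = 54_000_000  # U.S. children ages 5 to 17
    unaccounted_for = 2_100_000         # enrolled in neither public nor private school

    share_missing = unaccounted_for / school_age_population
    print(f"{share_missing:.1%}")       # about 3.9 percent, i.e. "almost 4 percent"
    ```
    
    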

    Percentage of school-aged children who are not enrolled in traditional public schools, 2016-17 to 2023-24

    Source: Brookings, “Declining public school enrollment,” August 2025

    Private school enrollment flat

    Before the pandemic, the share of students in traditional public schools held steady, hovering near 85 percent between 2016 and 2020. After the pandemic, traditional public school enrollment plummeted to below 80 percent and hasn’t rebounded. 

    The mysterious missing children account for a big chunk of the decline. But families also switched to charter and virtual schools. Charter school enrollment rose from 5 percent of students in 2016-17 to 6 percent in 2023-24. The number of children attending virtual schools almost doubled from 0.7 percent before the pandemic in 2019-20 to 1.2 percent in 2020-21 and has remained elevated. 

    Surprisingly, private school enrollment has stayed steady at almost 9 percent of school-age children between 2016-17 and 2023-24, according to this Brookings estimate. 

    I had expected private school enrollment to skyrocket as families soured on public school disruptions during the pandemic and as 11 states, including Arizona and Florida, launched education savings accounts or new voucher programs to help pay tuition. But another analysis, released this month by researchers at Tulane University, echoed the Brookings numbers. It found that between 2021 and 2024, private school enrollment in voucher states increased by only 3 to 4 percent relative to states without vouchers. A new federal tax credit to fund private school scholarships doesn’t take effect until Jan. 1, 2027, so perhaps a greater shift into private education is still ahead.

    Defections from traditional public schools are largest in Black and high-poverty districts

    I would have guessed that wealthier families who can afford private school tuition would be more likely to seek alternatives. But high-poverty districts had the largest share of students outside the traditional public-school sector. In addition to private school, they were enrolled in charters, virtual schools, specialized schools for students with disabilities or other alternative schools, or were being homeschooled.

    More than 1 in 4 students in high-poverty districts aren’t enrolled in a traditional public school, compared with 1 in 6 students in low-poverty school districts. The steepest public school enrollment losses are concentrated in predominantly Black school districts. A third of students in predominantly Black districts are not in traditional public schools, double the share of white and Hispanic students. 

    Share of student enrollment outside of traditional public schools, by district poverty

    Source: Brookings, “Declining public school enrollment,” August 2025

    Share of students not enrolled in traditional public schools by race and ethnicity

    Source: Brookings, “Declining public school enrollment,” August 2025

    These discrepancies matter for the students who remain in traditional public schools. Schools in low-income and Black neighborhoods are now losing the most students, forcing even steeper budget cuts. 

    The demographic time bomb

    Before the pandemic, U.S. schools were already headed for a big contraction. The average American woman is now giving birth to only 1.7 children over her lifetime, well below the 2.1 fertility rate needed to replace the population. Fertility rates are projected to fall further still. The Brookings analysts assume more immigrants will continue to enter the country, despite current immigration restrictions, but not enough to offset the decline in births. 

    Even if families return to their pre-pandemic enrollment patterns, the population decline would mean 2.2 million fewer public school students by 2050. But if parents keep choosing other kinds of schools at the pace observed since 2020, traditional public schools could lose as many as 8.5 million students, shrinking from 43.06 million in 2023-24 to as few as 34.57 million by mid-century. 
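    The projection arithmetic above can be checked directly against the article’s own figures (all in millions of students):

    ```python
    # Check the enrollment projections cited above (figures in millions).
    enrollment_2023 = 43.06     # traditional public school enrollment, 2023-24
    demographic_loss = 2.2      # decline by 2050 from falling birth rates alone
    projected_2050_low = 34.57  # 2050 enrollment if post-2020 switching continues

    # Demographics alone would leave about 40.9 million students.
    print(round(enrollment_2023 - demographic_loss, 2))
    # Demographics plus continued switching imply a total loss near 8.5 million.
    print(round(enrollment_2023 - projected_2050_low, 2))
    ```
    
    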

    Between students gone missing, the choices some Black families and families in high-poverty districts are making and how many kids are being born, the public school landscape is shifting. Buckle up and get ready for mass public school closures.

    Contact staff writer Jill Barshay at 212-678-3595, jillbarshay.35 on Signal, or barshay@hechingerreport.org.

    This story about school enrollment declines was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.


    Jill Barshay

  • Behind the latest dismal NAEP scores


    The National Assessment of Educational Progress, called NAEP or the Nation’s Report Card, has long been considered the gold standard for understanding how American students are doing. So bad headlines were inevitable last week when the long-delayed 2024 results for 12th graders in math and reading and for eighth graders in science were finally released.

    It is tempting to blame the long tail of the pandemic for the dismal scores. But folks who keep a close eye on NAEP had some provocative analysis. 

    Eric Hanushek: It’s not just the pandemic

    Eric Hanushek, a senior fellow at Stanford University’s Hoover Institution, points out that the 3-point declines for 12th graders between 2019 and 2024 are in line with the long-term achievement losses that he’s been seeing since 2013. In a paper this month, written before the 12th grade 2024 NAEP scores were released, he documented that the learning losses during the pandemic match those that occurred before and after the pandemic. In other words, student achievement is declining for reasons other than Covid school disruptions.

    Hanushek calculated that restoring student achievement to 2013 levels would raise the lifetime earnings of today’s average student by an estimated 8 percent and would produce dramatic and sustained gains for the national economy.

    Related: Our free weekly newsletter alerts you to what research says about schools and classrooms.

    Dan McGrath: It could have been worse

    Dan McGrath, the former associate commissioner for assessments at the National Center for Education Statistics, oversaw NAEP until he lost his job in March during mass layoffs at the Education Department.

    Now he’s sharing his personal analysis of NAEP score data in a newsletter. McGrath points out that the slide in eighth grade science and 12th grade math and reading is “not as bad” as he had expected.

    He based that prediction on deteriorating scores for students this age before the pandemic, and pandemic-era losses for fourth and eighth graders. He said he would have expected drops twice as large: 8 points instead of just 3 to 4 points.

    Any decline is bad. McGrath said that students who were in eighth grade in the spring of 2024 (and are now starting 10th grade) are less prepared for difficult high school science courses, and that students who graduated high school in 2024 went to college or into the workforce “underskilled” compared with students before them.

    But given that McGrath had predicted far worse results, these NAEP scores are “kinda sorta good news,” he said. Why did 12th graders weather the pandemic better than eighth graders did, and why did science skills hold up better than math and reading for eighth graders? “I don’t know,” wrote McGrath. 

    Related: NAEP, the Nation’s Report Card, was supposed to be safe. It’s not 

    Andrew Ho: Missing data

    Harvard University education professor Andrew Ho lamented on LinkedIn that the recent NAEP release isn’t that useful. For starters, the long five-year gap (from 2019 to 2024) between the tests of 12th graders means that we cannot tell if the 2024 results represent a pandemic decline or recovery from an earlier nadir.

    That matters. Education policymakers have no way of knowing if high schools are back on an upward track (and should continue doing what they are doing) or not (and change course).

    Also, there’s no state data for 12th graders to help us see bright spots to emulate.

    Fourth and eighth graders are already tested more frequently and with state-level results. Leslie Muldoon, executive director of the board that oversees the NAEP test, said that more frequent and state-by-state testing of high schoolers is a future priority.

    Related: A smaller NAEP 

    Reversing course and rehiring at the Education Department

    Adding tests might seem like a pipe dream in the wake of budget and staffing cuts at the Education Department. All the staffers dedicated to NAEP were fired in March as part of a mass downsizing that Education Secretary Linda McMahon said was a first step toward eliminating the department.

    However, the Education Department is now starting to rehire staff to help administer the NAEP exam — a sign that the administration intends to preserve at least one function of the agency that President Donald Trump wants to abolish.

    So far, two new jobs have been posted — one to oversee the development of test questions and the other to supervise the administration of the tests. These are the first two of at least eight positions that the Education Department plans to fill this fall, according to an official with the National Center for Education Statistics who briefed reporters this month.

    Contact staff writer Jill Barshay at 212-678-3595, jillbarshay.35 on Signal, or barshay@hechingerreport.org.

    This story about NAEP scores was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.


    Jill Barshay

  • OPINION: The push to expand school choice should not diminish civic education


    From Texas to Florida to Arizona, school voucher policies are reshaping the landscape of American education. The Trump administration champions federal support for voucher expansion, and many state-level leaders are advancing school choice programs. Billions of public dollars are now flowing to private schools, church networks and microeducation platforms.  

    The push to expand school choice is not just reallocating public funds to private institutions. It is reorganizing the very purpose of schooling. And in that shift, something essential is being lost — the public mission of education as a foundation of democracy. 

    Civic education is becoming fragmented, underfunded and institutionally weak.  

    In this moment of sweeping change, as public dollars shift from common institutions to private and alternative schools, the shared civic entities that once supported democratic learning are being diminished or lost entirely. Traditional structures like public schools, libraries and community colleges are no longer guaranteed common spaces.

    The result is a disjointed system in which students may gain academic content or career preparation but receive little support in learning how to lead with integrity, think across differences or sustain democratic institutions. The very idea of public life is at risk, especially in places where shared experience has been replaced by polarization. We need civic education more than ever. 

    Related: A lot goes on in classrooms from kindergarten to high school. Keep up with our free weekly newsletter on K-12 education.  

    If we want students who can lead a multiracial democracy, we need schools of every type to take civic formation seriously. That includes religious schools, charter schools and homeschooling networks. The responsibility cannot fall on public schools alone. Civic formation is not an ideological project. It is a democratic one, involving the long-term work of building the skills, habits and values that prepare people to work across differences and take responsibility for shared democratic life. 

    What we need now is a civic education strategy that matches the scale of the changes reshaping American schooling. This will mean fostering coordinated investment, institutional partnerships and recognition that the stakes are not just academic but democratic.

    Americans overwhelmingly support civic instruction, yet teachers say schools aren’t delivering it. According to a 2020 Texas survey by the Center for Women in Politics and Public Policy and iCivics, just 49 percent of teachers statewide believed that enough time was being devoted to teaching civics knowledge, and just 23 percent said the same about participatory-democracy skills. This gap is not unique to Texas. There is little agreement on how civics should be taught, and even less structural support for the schools trying to do it.

    Without serious investment, civic formation will remain an afterthought — a patchwork effort disconnected from the design of most educational systems. 

    This is not an argument against vouchers in principle. Families should have options. But in the move to decentralize education, we risk hollowing out its civic core. A democratic society cannot survive on academic content alone. It requires citizens — not just in the legal sense, but in the civic one. 

    A democratic society needs people who can deliberate, organize, collaborate and build a shared future with others who do not think or live like they do. 

    And that’s why we are building a framework in Texas that others can adopt and adapt to their own civic mission. 

    The pioneering Democracy Schools model, to which I contribute, supports civic formation across a range of public and private schools, colleges, community organizations and professional networks.  

    Civic infrastructure is the term we use to describe our approach: the design of relationships, institutions and systems that hold democracy together. Just as engineers build physical infrastructure, educators and civic leaders must build civic infrastructure by working with communities, not for or on them. 

    We start from a democratic tradition rooted in the Black freedom struggle. Freedom, in this view, is not just protection from domination. It is the capacity to act, build and see oneself reflected in the world. This view of citizenship demands more than voice. It calls for the ability to shape institutions, policies and public narratives from the ground up. 

    Related: STUDENT VOICE: My generation knows less about civics than my parents’ generation did, yet we need it more than ever 

    The model speaks to a national crisis: the erosion of shared civic space in education. It must be practiced and must be supported by institutions that understand their role in building public life. Historically Black colleges and universities like Huston-Tillotson University offer a powerful example. They are not elite pipelines disconnected from everyday life. They are rooted in community, oriented toward public leadership and shaped by a history of democratic struggle. They show what it looks like to educate for civic capacity — not just for upward mobility. They remind us that education is not only about what students know, but about who they become and what kind of world they are prepared to help shape. 

    Our national future depends on how well we prepare young people to take responsibility for shared institutions and pluralistic public life. This cannot be accomplished through content standards alone. It requires civic ecosystems designed to cultivate public authorship. 

    We have an enormous stake in preparing the next generation for the demands of democratic life. What kind of society are we preparing young people to lead? The answer will not come from any single institution. It will come from partnerships across sectors, aligned in purpose even if diverse in approach. 

    We are eager to collaborate with any organization — public, private or faith-based — committed to building the civic infrastructure that sustains our democracy. Wherever education takes place, civic formation must remain a central concern. 

    Robert Ceresa is the founding director of the Politics Lab of the James L. Farmer House, Huston-Tillotson University. 

    Contact the opinion editor at opinion@hechingerreport.org.  

    This story about civic education was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Hechinger’s weekly newsletter.


    Robert M. Ceresa

  • OPINION: NAEP scores show we need new approaches, more resources and all hands on deck to address underlying education problems


    America’s future is not made in factories or innovation labs — it’s forged in classrooms. We can’t bring good jobs back to U.S. shores if we don’t first educate a workforce capable of doing them. The latest National Assessment of Educational Progress, or NAEP, known as the Nation’s Report Card, paints a grim picture, with test scores down since 2019 for eighth graders in science and 12th graders in math and reading.  

    The lowest-performing learners lost the most ground, leaving large percentages of students unable to perform the strong academic work required for postsecondary life. Only about 1 in 5 high school seniors scored at the NAEP Proficient level in math. That puts them at a terrible disadvantage since STEM positions make up a growing percentage of the workforce. Nearly half were working below even the NAEP Basic level, meaning they likely don’t know how to use percentages to solve real-world problems. 

    This isn’t the first bad report card we’ve seen since the pandemic upended learning five years ago, but progress in American education has generally been stalled for at least a decade. Leaders at every level need to stop using the pandemic as an excuse and start looking for solutions. There have been times in the past when Republicans and Democrats have come together around education. While that may be difficult to do today, it’s needed more than ever. 

    Related: A lot goes on in classrooms from kindergarten to high school. Keep up with our free weekly newsletter on K-12 education.  

    We need new approaches, more resources and all hands on deck to help students develop the knowledge and skills to thrive in an increasingly complicated world. The teens who took the 12th grade tests are now out of school. They’re facing a workforce that is disrupted by AI and demanding more from them, not less. Even young people who opt not to go to college, such as those looking to work in manufacturing, need more advanced STEM and literacy skills than in the past.

    There are some areas of educational progress around the country we can learn from. For example, Indiana is remaking the American high school experience to personalize it and connect it to the world of work, while Rhode Island is reinvigorating career and technical education to embed it with more rigor and ensure it provides an on-ramp to an array of postsecondary options, including college. 

    We can also expand on the reforms that are taking root in elementary education. An emphasis on the research behind teaching children to read, sometimes referred to as the science of reading, is effective. And states like Mississippi and Louisiana, leaders in this movement, have seen strong literacy gains. We can apply that kind of evidence-based approach across K-12 subjects and grades. 

    It’s also vital to listen to what students are saying. Fortunately, the Nation’s Report Card can help with this.  

    Survey data accompanying the eighth grade science assessment suggests that inquiry-based learning is in decline. Fewer students say they’re spending time on things like designing experiments to answer research questions. That kind of instruction helps students build science knowledge and develop key skills like the ability to think critically and to collaborate with peers, exactly the kind of skills that AI can’t replace. 

    Related: Nation’s Report Card at risk, researchers say 

    The best instruction has a purpose for learning, explores real-world problems and makes connections to work. Most states have passed science standards that promote this kind of instruction, but more resources are needed to get aligned materials into schools and provide teachers with the training to use them effectively. 

    Getting kids out of the classroom helps too. I invited elementary school students to my farm in western Massachusetts a few years ago and vividly recall a fourth grader’s aha moment, finally understanding decimals when collecting 2.25 inches of rain in a vial. It was a terrific example of how interdisciplinary science is, and how powerful it can be in experiential learning settings.

    It’s true that science resources, such as lab materials, can be expensive; however, schools can tap into community partners and business leaders for assistance. In Massachusetts, for example, General Electric has helped bring mobile technology labs into schools.  

    One thing I am grateful for, even amid all this bad education news, is the high-quality data shining a light on the problems we’re facing. There are too many voices today calling for a rollback of testing. That’s a mistake. Obtaining meaningful data, such as that found on the Nation’s Report Card, is crucial. Of course, what we do with it matters even more. 

    It has been 42 years since American leaders from across political parties and sectors came together to bring attention to “A Nation at Risk,” a landmark report that spurred significant education reforms. And it’s been 36 years since 49 governors came together and defined the state role in K-12 schooling.  

    After these milestones, the nation saw sustained progress on NAEP. We need that same leadership now. 

    Republican Jane Swift is a former governor from Massachusetts who serves on the National Assessment Governing Board, which oversees the Nation’s Report Card. She is also the CEO of Education at Work, a nonprofit that connects college students with work-based learning opportunities. 

    Contact the opinion editor at opinion@hechingerreport.org. 

    This story about NAEP scores was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Hechinger’s weekly newsletter.

  • A window into America’s high schools slams shut

    This story was reported by and originally published by APM Reports in connection with its podcast Sold a Story: How Teaching Kids to Read Went So Wrong.

    The choices you make as a teenager can shape the rest of your life. If you take high school classes for college credit, you’re more likely to enroll at a university. If you take at least 12 credits of classes during your first year there, you’re more likely to graduate. And those decisions may even influence whether you develop dementia during your later years.

    These insights, along with those from thousands of other studies, can all be traced to a trove of data that the federal government started collecting more than 50 years ago. Now that effort is over.

    On a single day in February, the Trump administration and its Department of Government Efficiency canceled a long-running series of surveys called the high school longitudinal studies. The surveys started in 1972, and they had gathered data on more than 135,000 high school students through their first decade or so of adulthood — sometimes longer.

    “For 50 years, we’ve been mapping a timeline of progress of our high school system, and we’re going to have a big blank,” said Adam Gamoran, who leads the William T. Grant Foundation and was nominated to head up the Education Department’s research and statistics arm under President Biden, but was never confirmed. “That’s very frustrating.”

    The data collection effort has been going on since before the founding of the modern Department of Education. Thousands of journal articles, books, dissertations and reports have relied on this data to form conclusions about American education — everything from how high school counselors should be spending their days to when students should start taking higher-level math classes.

    The Department of Government Efficiency first canceled contracts for the collection of new long-term high school data and then started laying off staff. The National Center for Education Statistics used to have nearly 100 employees. Today, only three remain.

    “The reduction — annihilation — of NCES functionally is a very serious issue,” said Felice Levine, former executive director of the American Educational Research Association, one of the groups suing the administration over these actions. “Maybe it doesn’t appear to be as sexy as other topics, but it really is the backbone of knowledge building and policymaking.”

    The Department of Education is reviewing how longitudinal studies “fit into the national data collection strategy based on studies’ return on investment for taxpayers,” according to an email from its spokesperson. The statement also said the department’s Institute of Education Sciences, which is in charge of overseeing research and gathering statistics, remains committed to “mission-critical functions.”

    “It seems to me that even if you were the most hardcore libertarian who wants the government to regulate almost nothing, collecting national statistics is about the most innocuous and useful thing that a government could do,” said Stuart Buck, executive director of the Good Science Project, a group advocating less bureaucracy in science funding. 

    “The idea of a Department of Governmental Efficiency is an excellent idea, and I hope we try it out sometime,” he said. But the effort, “as it currently exists, I would argue, is often directly opposed to efficiency. Like, they’re doing the exact opposite.”

    He likened the approach to “someone showing up to your house and claiming they saved you $200 a month, and it turns out they canceled your electricity.” 

    Related: Become a lifelong learner. Subscribe to our free weekly newsletter featuring the most important stories in education.  

    Since the effort began in the early 1970s, the federal government has collected data on six large groups of high school students, each numbering in the tens of thousands. Researchers surveyed each group at least once during high school, along with their parents and teachers. Researchers then contacted the students periodically after that, generally over the course of a decade or so — sometimes longer. They collected transcripts and other documents to track progress, too. In total, the data set contains thousands of variables.

    The studies are called longitudinal because they take place over a long time. The methodology is similar to studies that track twins over their lifetimes to determine which traits are genetic and which are caused by their experiences. Such data sets are valuable because they allow researchers to tease out effects that can’t be seen in a single snapshot, but they are rare because they require sustained funding over decades. And the high school data covers a large number of participants selected to represent the national population, giving insights that can be broadly applicable across states.

    That vast repository of data affects students “indirectly, but profoundly,” said Andrew Byrne, who runs the math department at Greenwich High School in Connecticut. For example, research based on the data has shown that high school students who take classes for college credit have a better chance of finishing their bachelor’s degrees on time. 

    Byrne said that research informed the school’s decision to start offering a new Advanced Placement precalculus class when the College Board unveiled it two years ago. The new offering gave some high school students in lower-level math classes the opportunity to get college credit for the first time.

    “Success in AP precalculus could empower them to believe they can succeed in college-level classes overall,” Byrne said. A student probably would not read the academic research, but “they live the results of the decisions that data informs,” he said.

    Follow-up surveys for the group first contacted in 2009 — made up of people who started high school during the Great Recession — and for students who were high school freshmen in 2022 have been canceled. The latter group, who were middle schoolers during the pandemic, will be graduating next year.

    Elise Christopher oversaw the high school longitudinal studies at the National Center for Education Statistics until she was laid off in March along with dozens of her colleagues. Christopher, a statistician who worked at the center for more than 14 years, is concerned about the data that was scheduled to be collected this year — and now won’t be. 

    “We can’t just pick this back up later,” she said. “They won’t be in high school. We won’t be able to understand what makes them want to come to school every day, because they’ll be gone.” 

    Researchers were hoping to learn more about why chronic absenteeism has persisted in schools even years after Covid-19 abated, Christopher explained. They were also hoping to understand whether students are now less interested in attending college than previous generations.

    “Every single person in this country who’s been educated in the past 50 years has benefited from something that one of these longitudinal surveys has done,” she said.

    Levine said the planned follow-up with students from the 2009 high school group would have helped reveal how a greater emphasis on math, science and technology in some states has influenced student decision-making. Were they more likely to study the hard sciences in college? Did they continue on to careers in those fields? 

    “These are the kinds of things that the public wants to know about, families want to know about, and school administrators and counselors want to know about,” she said.

    Related: Suddenly sacked

    About 25,000 people who completed the high school survey in 1980 were contacted again by researchers decades later. 

    Rob Warren, the director of the University of Minnesota’s Institute for Social Research and Data Innovation, is hoping those people — now in their 60s — may help him and other researchers gain new insights into why some people develop dementia, while others with similar brain chemistry don’t.

    “Education apparently plays a big role in who’s resilient,” Warren said. “That’s kind of a mystery.” 

    The people who participated in the high school study may offer a unique set of clues about why education matters, Warren explained.

    “You need all that detail about education, and you need to be able to see them decades later, when they’re old enough to start having memory decline,” he said. Other studies can measure cognition, but to measure whether education plays a role in dementia outcomes, “you can’t really test (that) with other data,” he said. 

    So Warren’s team got permission from the federal government to contact the group in 2019. Researchers asked all the usual types of questions about their jobs and lives, but also gave them cognitive tests, asked medical questions, and even collected samples of their blood to monitor how their brains were changing as they aged.

    Warren is continuing his research even though the federal government has canceled future high school surveys. But the staffing cuts at the Department of Education have hampered his ability to hand the data off to the center or share it with other researchers. To do that, he needs permission from the Department of Education, but getting it has been a challenge.

    “Very often you don’t hear anything back, ever, and sometimes you do, but it takes a very long time,” Warren said. Even drafting legal agreements to make the data available to the National Institutes of Health — another federal agency, which funded his data collection effort and would be responsible for handling the medical data — has been a bottleneck.

    Such agreements would involve a bunch of lawyers, Warren said, and the Department of Education has laid off most of its legal team. 

    If the data isn’t made available to other researchers, Warren said, questions about dementia may go unanswered and “NIH’s large investment in this project will be wasted.”

    Kate Martin contributed to this report.

    by Carmela Guaglianone, APM Reports

  • A researcher’s view on using AI to become a better writer

    Writing can be hard, equal parts heavy lifting and drudgery. No wonder so many students are turning to the time-saving allure of ChatGPT, which can crank out entire papers in seconds. It rescues them from procrastination jams and dreaded all-nighters, magically freeing up more time for other pursuits, like, say … doomscrolling.

    Of course, no one learns to be a better writer when someone else (or some AI bot) is doing the work for them. The question is whether chatbots can morph into decent writing teachers or coaches that students actually want to consult to improve their writing, and not just use for shortcuts.

    Maybe.

    Jennifer Meyer, an assistant professor at the University of Vienna in Austria, has been studying how AI bots can be used to improve student writing for several years. In an interview, she explained why she is cautious about the ability of AI to make us better writers and is still testing how to use the new technology effectively.

    All in the timing 

    Meyer says that just because ChatGPT is available 24/7 doesn’t mean students should consult it at the start of the writing process. Instead, Meyer believes that students would generally learn more if they wrote a first draft on their own. 

    That’s when AI could be most helpful, she thinks. With some prompting, a chatbot could provide immediate writing feedback targeted to each student’s needs. One student might need to practice writing shorter sentences. Another might be struggling with story structure and outlining. AI could theoretically meet an entire classroom’s individual needs faster than a human teacher.

    Related: Our free weekly newsletter alerts you to what research says about schools and classrooms.

    In Meyer’s experiments, she inserted AI only after the first draft was done as part of the revision process. In a study published in 2024, she randomly assigned 200 German high school students to receive AI feedback after writing a draft of an essay in English. Their revised essays were stronger than those of 250 students who were also told to revise, but didn’t get help from AI. 

    In surveys, those with AI feedback also said they felt more motivated to rewrite than those who didn’t get feedback. That motivation is critical. Often students aren’t in the mood to rewrite, and without revisions, students can’t become better writers.

    Meyer doesn’t consider her experiment proof that AI is a great writing teacher. She didn’t compare it with how student writing improved after human feedback. Her experiment compared only AI feedback with no feedback. 

    Most importantly, one dose of AI writing feedback wasn’t enough to elevate students’ writing skills. On a second, fresh essay topic, the students who had previously received AI feedback didn’t write any better than the students who hadn’t been helped by AI.

    Related: AI writing feedback ‘better than I thought,’ top researcher says

    It’s unclear how many rounds of AI feedback it would take to boost a student’s writing skills more permanently, not just help revise the essay at hand. 

    And Meyer doesn’t know whether a student would want to keep discussing writing with an AI bot over and over again. Maybe students were willing to engage with it in this experiment because it was a novelty, but could soon tire of it. That’s next on Meyer’s research agenda.

    A viral MIT study

    A much smaller MIT study published earlier this year echoes Meyer’s theory. “Your Brain on ChatGPT” went viral because it seemed to say that using ChatGPT to help write an essay made students’ brains less engaged. Researchers found that students who wrote an essay without any online tools had stronger brain connectivity and activity than students who used AI or consulted Google to search for source materials. (Using Google while writing wasn’t nearly as bad for the brain as AI.) 

    Although those results made headlines, there was more to the experiment. The students who initially wrote an essay on their own were later given ChatGPT to help improve their essays. That switch to ChatGPT boosted brain activity, in contrast to what the neuroscientists found during the initial writing process. 

    Related: University students offload critical thinking, other hard work to AI

    These studies add to the evidence that delaying AI a bit, after some initial thinking and drafting, could be a sweet spot in learning. That’s something researchers need to test more. 

    Still, Meyer remains concerned about giving AI tools to very weak writers and to young children who haven’t developed basic writing skills. “This could be a real problem,” said Meyer. “It could be detrimental to use these tools too early.”

    Cheating your way to learning?

    Meyer doesn’t think it’s always a bad idea for students to ask ChatGPT to do the writing for them. 

    Just as young artists learn to paint by copying masterpieces in museums, students might learn to write better by copying good writing. (The late great New Yorker editor John Bennet taught Jill to write this way. He called it “copy work” and he encouraged his journalism students to do it every week by copying longhand the words of legendary writers, not AI.)

    Meyer suggests that students ask ChatGPT to write a sample essay that meets their teacher’s assignment and grading criteria. The next step is key. If students pretend it’s their own piece and submit it, that’s cheating. They’ve also offloaded cognitive work to technology and haven’t learned anything.

    Related: AI essay grading is already as ‘good as an overburdened’ teacher, but researchers say it needs more work

    But the AI essay can be an effective teaching tool, in theory, if students study the arguments, organizational structure, sentence construction and vocabulary before writing a new draft in their own words. Ideally, the next assignment should be better if students have learned through that analysis and internalized the style and techniques of the model essay, Meyer said. 

    “My hypothesis would be as long as there’s cognitive effort with it, as long as there’s a lot of time on task and like critical thinking about the output, then it should be fine,” said Meyer.

    Reconsidering praise

    Everyone likes a compliment. But too much praise can drown learning just as too much water can keep flowers from blooming.  

    ChatGPT has a tendency to pour the praise on thick and often begins with banal flattery, like “Great job!” even when a student’s writing needs a lot of work. In Meyer’s test of whether AI feedback can improve students’ writing, she intentionally told ChatGPT not to start with praise and instead go straight to constructive criticism.

    Her parsimonious approach to praise was inspired by a 2023 writing study about what motivates students to revise. The study found that when teachers started off with general praise, students were left with the false impression that their work was already good enough so they didn’t put in the extra effort to rewrite.

    Related: Asian American students lose more points in an AI essay grading study — but researchers don’t know why

    In Meyer’s experiment, the praise-free feedback was effective in getting students to revise and improve their essays. But she didn’t set up a direct competition between the two approaches — praise-free vs. praise-full — so we don’t know for sure which is more effective when students are interacting with AI.

    Being stingy with praise rubs real teachers the wrong way. After Meyer removed praise from the feedback, teachers told her they wanted to restore it. “They wondered about why the feedback was so negative,” Meyer said. “That’s not how they would do it.”

    Meyer and other researchers may one day solve the puzzle of how to turn AI chatbots into great writing coaches. But whether students will have the willpower or desire to forgo an instantly written essay is another matter. As long as ChatGPT continues to allow students to take the easy way out, it’s human nature to do so. 

    Shirley Liu is a graduate student in education at Northwestern University. Liu reported and wrote this story along with The Hechinger Report’s Jill Barshay.

    Contact staff writer Jill Barshay at 212-678-3595, jillbarshay.35 on Signal, or barshay@hechingerreport.org.

    This story about using AI to become a better writer was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.

  • OPINION: If we are going to build AI literacy into every level of learning, we must be able to measure it

    Everywhere you look, someone is telling students and workers to “learn AI.” 

    It’s become the go-to advice for staying employable, relevant and prepared for the future. But here’s the problem: While definitions of artificial intelligence literacy are starting to emerge, we still lack a consistent, measurable framework to know whether someone is truly ready to use AI effectively and responsibly. 

    And that is becoming a serious issue for education and workforce systems already being reshaped by AI. Schools and colleges are redesigning their entire curriculums. Companies are rewriting job descriptions. States are launching AI-focused initiatives.  

    Yet we’re missing a foundational step: agreeing not only on what we mean by AI literacy, but on how we assess it in practice. 

    Two major recent developments underscore why this step matters, and why it is important that we find a way to take it before urging students to use AI. First, the U.S. Department of Education released its proposed priorities for advancing AI in education, guidance that will ultimately shape how federal grants will support K-12 and higher education. For the first time, we now have a proposed federal definition of AI literacy: the technical knowledge, durable skills and future-ready attitudes required to thrive in a world influenced by AI. Such literacy will enable learners to engage and create with, manage and design AI, while critically evaluating its benefits, risks and implications. 

    Second, we now have the White House’s American AI Action Plan, a broader national strategy aimed at strengthening the country’s leadership in artificial intelligence. Education and workforce development are central to the plan. 

    Related: A lot goes on in classrooms from kindergarten to high school. Keep up with our free weekly newsletter on K-12 education. 

    What both efforts share is a recognition that AI is not just a technological shift, it’s a human one. In many ways, the most important AI literacy skills are not about AI itself, but about the human capacities needed to use AI wisely. 

    Sadly, the consequences of shallow AI education are already visible in workplaces. Some 55 percent of managers believe their employees are AI-proficient, while only 43 percent of employees share that confidence, according to the 2025 ETS Human Progress Report.  

    A similar perception gap likely exists between school administrators and teachers. The disconnect creates risks for organizations and reveals how assumptions about AI literacy can diverge sharply from reality.

    But if we’re going to build AI literacy into every level of learning, we have to ask the harder question: How do we both determine when someone is truly AI literate and assess it in ways that are fair, useful and scalable? 

    AI literacy may be new, but we don’t have to start from scratch to measure it. We’ve tackled challenges like this before, moving beyond check-the-box tests in digital literacy to capture deeper, real-world skills. Building on those lessons will help define and measure this next evolution of 21st-century skills. 

    Right now, we often treat AI literacy as a binary: You either “have it” or you don’t. But real AI literacy and readiness is more nuanced. It includes understanding how AI works, being able to use it effectively in real-world settings and knowing when to trust it. It includes writing effective prompts, spotting bias, asking hard questions and applying judgment. 

    This isn’t just about teaching coding or issuing a certificate. It’s about making sure that students, educators and workers can collaborate in and navigate a world in which AI is increasingly involved in how we learn, hire, communicate and make decisions.  

    Without a way to measure AI literacy, we can’t identify who needs support. We can’t track progress. And we risk letting a new kind of unfairness take root, in which some communities build real capacity with AI and others are left with shallow exposure and no feedback. 

    Related: To employers, AI skills aren’t just for tech majors anymore

    What can education leaders do right now to address this issue? I have a few ideas.  

    First, we need a working definition of AI literacy that goes beyond tool usage. The Department of Education’s proposed definition is a good start, combining technical fluency, applied reasoning and ethical awareness.  

    Second, assessments of AI literacy should be integrated into curriculum design. Schools and colleges incorporating AI into coursework need clear definitions of proficiency. TeachAI’s AI Literacy Framework for Primary and Secondary Education is a great resource. 

    Third, AI proficiency must be defined and measured consistently, or we risk a mismatched state of literacy. Without consistent measurements and standards, one district may see AI literacy as just using ChatGPT, while another defines it far more broadly, leaving students unevenly ready for the next generation of jobs. 

    To prepare for an AI-driven future, defining and measuring AI literacy must be a priority. Every student will be graduating into a world in which AI literacy is essential. Human resources leaders confirmed in the 2025 ETS Human Progress Report that the No. 1 skill employers are demanding today is AI literacy. Without measurement, we risk building the future on assumptions, not readiness.  

    And that’s too shaky a foundation for the stakes ahead. 

    Amit Sevak is CEO of ETS, the largest private educational assessment organization in the world. 

    Contact the opinion editor at opinion@hechingerreport.org. 

    This story about AI literacy was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Hechinger’s weekly newsletter. 

  • A gender gap in STEM widened during the pandemic. Schools are trying to make up lost ground


    IRVING, Texas — Crowded around a workshop table, four girls at de Zavala Middle School puzzled over a Lego machine they had built. As they flashed a purple card in front of a light sensor, nothing happened. 

    The teacher at the Dallas-area school had emphasized that in the building process, there is no such thing as a mistake. Only iterations. So the girls dug back into the box of blocks and pulled out an orange card. They held it over the sensor and the machine kicked into motion. 

    “Oh! Oh, it reacts differently to different colors,” said sixth grader Sofia Cruz.

    In de Zavala’s first year as a choice school focused on science, technology, engineering and math, the school recruited a sixth grade class that’s half girls. School leaders are hoping the girls will stick with STEM fields. In de Zavala’s higher grades — whose students joined before it was a STEM school — some elective STEM classes have just one girl enrolled. 

    Efforts to close the gap between boys and girls in STEM classes are picking up after losing steam nationwide during the chaos of the Covid pandemic. Schools have extensive work ahead to make up for the ground girls lost, in both interest and performance.

    In the years leading up to the pandemic, the gender gap nearly closed. But within a few years, girls lost all the ground they had gained in math test scores over the previous decade, according to an Associated Press analysis. While boys' scores also suffered during Covid, they have recovered faster than girls', widening the gender gap.

    As learning went online, special programs to engage girls lapsed — and schools were slow to restart them. Zoom school also emphasized rote learning, a technique based on repetition that some experts believe may favor boys, instead of teaching students to solve problems in different ways, which may benefit girls. 

    Old practices and biases likely reemerged during the pandemic, said Michelle Stie, a vice president at the National Math and Science Initiative.

    “Let’s just call it what it is,” Stie said. “When society is disrupted, you fall back into bad patterns.”

    Related: A lot goes on in classrooms from kindergarten to high school. Keep up with our free weekly newsletter on K-12 education.

    In most school districts in the 2008-09 school year, boys had higher average math scores on standardized tests than girls, according to AP’s analysis, which looked at scores across 15 years in over 5,000 school districts. It was based on average test scores for third through eighth graders in 33 states, compiled by the Educational Opportunity Project at Stanford University. 

    A decade later, girls had not only caught up, they were ahead: Slightly more than half of districts had higher math averages for girls.

    Within a few years of the pandemic, the parity disappeared. In 2023-24, boys on average outscored girls in math in nearly 9 out of 10 districts.

    A separate study by NWEA, an education research company, found gaps between boys and girls in science and math on national assessments went from being practically non-existent in 2019 to favoring boys around 2022.

    Studies have indicated girls reported higher levels of anxiety and depression during the pandemic, plus more caretaking burdens than boys, but the dip in academic performance did not appear outside STEM. Girls outperformed boys in reading in nearly every district nationwide before the pandemic and continued to do so afterward.

    “It wasn’t something like Covid happened and girls just fell apart,” said Megan Kuhfeld, one of the authors of the NWEA study. 

    Related: These districts are bucking the national math slump 

    In the years leading up to the pandemic, teaching practices shifted to deemphasize speed, competition and rote memorization. Through new curriculum standards, schools moved toward research-backed methods that emphasized how to think flexibly to solve problems and how to tackle numeric problems conceptually.

    Educators also promoted participation in STEM subjects and programs that boosted girls’ confidence, including extracurriculars that emphasized hands-on learning and connected abstract concepts to real-life applications. 

    When STEM courses had large male enrollment, Superintendent Kenny Rodrequez noticed girls losing interest as boys dominated classroom discussions at his schools in Grandview C-4 District outside Kansas City. Girls were significantly more engaged after the district moved some of its introductory hands-on STEM curriculum to the lower grade levels and balanced classes by gender, he said.

    When schools closed for the pandemic, the district had to focus on making remote learning work. When in-person classes resumed, some of the teachers had left, and new ones had to be trained in the curriculum, Rodrequez said. 

    “Whenever there’s crisis, we go back to what we knew,” Rodrequez said. 

    Related: One state tried algebra for all eighth graders. It hasn’t gone well

    Despite shifts in societal perceptions, a bias against girls persists in science and math subjects, according to teachers, administrators and advocates. It becomes a message girls can internalize about their own abilities, they say, even at a very young age. 

    In his third grade classroom in Washington, D.C., teacher Raphael Bonhomme starts the year with an exercise where students break down what makes up their identity. Rarely do the girls describe themselves as good at math. Already, some say they are “not a math person.” 

    “I’m like, you’re 8 years old,” he said. “What are you talking about, ‘I’m not a math person?’” 

    Girls also may have been more sensitive to changes in instructional methods spurred by the pandemic, said Janine Remillard, a math education professor at the University of Pennsylvania. Research has found girls tend to prefer learning things that are connected to real-life examples, while boys generally do better in a competitive environment. 

    “What teachers told me during Covid is the first thing to go were all of these sense-making processes,” she said. 

    Related: OPINION: Everyone can be a math person but first we have to make math instruction more inclusive 

    At de Zavala Middle School in Irving, the STEM program is part of a push that aims to build curiosity, resilience and problem-solving across subjects.

    Coming out of the pandemic, Irving schools had to make a renewed investment in training for teachers, said Erin O’Connor, a STEM and innovation specialist there.

    The district last year also piloted a new science curriculum from Lego Education. The lesson involving the machine at de Zavala, for example, had students learn about kinetic energy. Fifth graders learned about genetics by building dinosaurs and their offspring with Lego blocks, identifying shared traits. 

    “It is just rebuilding the culture of, we want to build critical thinkers and problem solvers,” O’Connor said.

    Teacher Tenisha Willis recently led second graders at Irving’s Townley Elementary School through building a machine that would push blocks into a container. She knelt next to three girls who were struggling.

    They tried to add a plank to the wheeled body of the machine, but the blocks didn’t move enough. One girl grew frustrated, but Willis was patient. She asked what else they could try, whether they could flip some parts around. The girls ran the machine again. This time, it worked.

    “Sometimes we can’t give up,” Willis said. “Sometimes we already have a solution. We just have to adjust it a little bit.” 

    Lurye reported from Philadelphia. Todd Feathers contributed reporting from New York. 

    The Associated Press’ education coverage receives financial support from multiple private foundations. AP is solely responsible for all content. Find AP’s standards for working with philanthropies, a list of supporters and funded coverage areas at AP.org.


    Annie Ma and Sharon Lurye


  • Nation’s Report Card at risk, researchers say


    This story was reported by and originally published by APM Reports in connection with its podcast Sold a Story: How Teaching Kids to Read Went So Wrong.

    When voters elected Donald Trump in November, most people who worked at the U.S. Department of Education weren’t scared for their jobs. They had been through a Trump presidency before, and they hadn’t seen big changes in their department then. They saw their work as essential, mandated by law, nonpartisan and, as a result, insulated from politics.

    Then, in early February, the Department of Government Efficiency showed up. Led at the time by billionaire CEO Elon Musk, and known by the cheeky acronym DOGE, it gutted the Department of Education’s Institute of Education Sciences, posting on X that the effort would ferret out “waste, fraud and abuse.”

    A post from the Department of Government Efficiency.

    When it was done, DOGE had cut approximately $900 million in research contracts and more than 90 percent of the institute’s workforce had been laid off. (The current value of the contracts was closer to $820 million, data compiled by APM Reports shows, and the actual savings to the government was substantially less, because in some cases large amounts of money had been spent already.)

    Among staff cast aside were those who worked on the National Assessment of Educational Progress — also known as the Nation’s Report Card — which is one of the few federal education initiatives the Trump administration says it sees as valuable and wants to preserve.

    The assessment is a series of tests administered nearly every year to a national sample of more than 10,000 students in grades 4, 8 and 12. The tests regularly measure what students across the country know in reading, math and other subjects. They allow the government to track how well America’s students are learning overall. Researchers can also combine the national data with the results of tests administered by states to draw comparisons between schools and districts in different states.

    The assessment is “something we absolutely need to keep,” Education Secretary Linda McMahon said at an education and technology summit in San Diego earlier this year. “If we don’t, states can be a little manipulative with their own results and their own testing. I think it’s a way that we keep everybody honest.”

    But researchers and former Department of Education employees say they worry that the test will become less and less reliable over time, because the deep cuts will cause its quality to slip — and some already see signs of trouble.

    “The main indication is that there just aren’t the staff,” said Sean Reardon, a Stanford University professor who uses the testing data to research gaps in learning between students of different income levels.

    All but one of the experts who make sure the questions in the assessment are fair and accurate — called psychometricians — have been laid off from the National Center for Education Statistics. These specialists play a key role in updating the test and making sure it accurately measures what students know.

    “These are extremely sophisticated test assessments that required a team of researchers to make them as good as they are,” said Mark Seidenberg, a researcher known for his significant contributions to the science of reading. Seidenberg added that “a half-baked” assessment would undermine public confidence in the results, which he described as “essentially another way of killing” the assessment.

    The Department of Education defended its management of the assessment in an email: “Every member of the team is working toward the same goal of maintaining NAEP’s gold-standard status,” it read in part.

    The National Assessment Governing Board, which sets policies for the national test, said in a statement that it had temporarily assigned “five staff members who have appropriate technical expertise (in psychometrics, assessment operations, and statistics) and federal contract management experience” to work at the National Center for Education Statistics. No one from DOGE responded to a request for comment.

    Harvard education professor Andrew Ho, a former member of the governing board, said the remaining staff are capable, but he’s concerned that there aren’t enough of them to prevent errors.

    “In order to put a good product up, you need a certain number of person-hours, and a certain amount of continuity and experience doing exactly this kind of job, and that’s what we lost,” Ho said.

    The Trump administration has already delayed the release of some testing data following the cutbacks. The Department of Education had previously planned to announce the results of the tests for 8th grade science, 12th grade math and 12th grade reading this summer; now that won’t happen until September. The board voted earlier this year to eliminate more than a dozen tests over the next seven years, including fourth grade science in 2028 and U.S. history for 12th graders in 2030. The governing board has also asked Congress to postpone the 2028 tests to 2029, citing a desire to avoid releasing test results in an election year. 

    “Today’s actions reflect what assessments the Governing Board believes are most valuable to stakeholders and can be best assessed by NAEP at this time, given the imperative for cost efficiencies,” board chair and former North Carolina Gov. Bev Perdue said earlier this year in a press release.

    The National Assessment Governing Board canceled more than a dozen tests when it revised the schedule for the National Assessment of Educational Progress in April. This annotated version of the previous schedule, adopted in 2023, shows which tests were canceled. Topics shown in all caps were scheduled for a potential overhaul; those annotated with a red star are no longer scheduled for such a revision.

    Recent estimates peg the cost of keeping the national assessment running at about $190 million per year, a fraction of the department's 2025 budget of approximately $195 billion.

    Adam Gamoran, president of the William T. Grant Foundation, said multiple contracts with private firms — overseen by Department of Education staff with “substantial expertise” — are the backbone of the national test.

    “You need a staff,” said Gamoran, who was nominated last year to lead the Institute of Education Sciences. He was never confirmed by the Senate. “The fact that NCES now only has three employees indicates that they can’t possibly implement NAEP at a high level of quality, because they lack the in-house expertise to oversee that work. So that is deeply troubling.”

    The cutbacks were widespread — and far outside of what most former employees had expected under the new administration.

    “I don’t think any of us imagined this in our worst nightmares,” said a former Education Department employee, who spoke on condition of anonymity for fear of retaliation by the Trump administration. “We weren’t concerned about the utter destruction of this national resource of data.”

    “At what point does it break?” the former employee asked.

    Related: Suddenly sacked

    Every state has its own test for reading, math and other subjects. But state tests vary in difficulty and content, which makes it tricky to compare results in Minnesota to Mississippi or Montana.

    “They’re totally different tests with different scales,” Reardon said. “So NAEP is the Rosetta stone that lets them all be connected.”

    Reardon and his team at Stanford used statistical techniques to combine the federal assessment results with state test scores and other data sets to create the Educational Opportunity Project. The project, first released in 2016 and updated periodically in the years that followed, shows which schools and districts are getting the best results — especially for kids from poor families. Since the project’s release, Reardon said, the data has been downloaded 50,000 times and is used by researchers, teachers, parents, school boards and state education leaders to inform their decisions.

    For instance, the U.S. military used the data to measure school quality when weighing base closures, and superintendents used it to find demographically similar but higher-performing districts to learn from, Reardon said.

    If the quality of the data slips, those comparisons will be more difficult to make.

    “My worry is we just have less-good information on which to base educational decisions at the district, state and school level,” Reardon said. “We would be in the position of trying to improve the education system with no information. Sort of like, ‘Well, let’s hope this works. We won’t know, but it sounds like a good idea.’”

    Seidenberg, the reading researcher, said the national assessment “provided extraordinarily important, reliable information about how we’re doing in terms of teaching kids to read and how literacy is faring in the culture at large.”

    Producing a test without keeping the quality up, Seidenberg said, “would be almost as bad as not collecting the data at all.”


    Kate Martin and Carmela Guaglianone


  • Tutoring was supposed to save American kids after the pandemic. The results? ‘Sobering’


    Rigorous research rarely shows that any teaching approach produces large and consistent benefits for students. But tutoring seemed to be a rare exception. Before the pandemic, almost 100 studies pointed to impressive math or reading gains for students who were paired with a tutor at least three times a week and used a proven curriculum or set of lesson plans. 

    Some students gained an extra year’s worth of learning — far greater than the benefit of smaller classes, summer school or a fantastic teacher. These were rigorous randomized controlled trials, akin to the way that drugs or vaccines are tested, comparing test scores of tutored students against those who weren’t. The expense, sometimes surpassing $4,000 a year per student, seemed worth it for what researchers called high-dosage tutoring.

    On the strength of that evidence, the Biden administration urged schools to invest their pandemic recovery funds in intensive tutoring to help students catch up academically. Forty-six percent of public schools heeded that call, according to a 2024 federal survey, though it's unclear exactly how much of the $190 billion in pandemic recovery funds has been spent on high-dosage tutoring and how many students received it. 

    Related: Our free weekly newsletter alerts you to what research says about schools and classrooms.

    Even with ample money, schools immediately reported problems in ramping up high-quality tutoring for so many students. In 2024, researchers documented either tiny or no academic benefits from large-scale tutoring efforts in Nashville, Tennessee, and Washington, D.C.

    New evidence from the 2023-24 school year reinforces those results. Researchers are rigorously studying large-scale tutoring efforts around the nation and testing whether effective tutoring can be done more cheaply. A dozen researchers studied more than 20,000 students in Miami; Chicago; Atlanta; Winston-Salem and Greensboro, North Carolina; Greenville, South Carolina; schools throughout New Mexico; and a California charter school network. This was also a randomized controlled trial, in which 9,000 students were randomly assigned to get tutoring and compared with 11,000 students who didn't get that extra help.

    Their preliminary results were “sobering,” according to a June report by the University of Chicago Education Lab and MDRC, a research organization.

    The researchers found that tutoring during the 2023-24 school year produced only one or two months’ worth of extra learning in reading or math — a tiny fraction of what the pre-pandemic research had produced. Each minute of tutoring that students received appeared to be as effective as in the pre-pandemic research, but students weren’t getting enough minutes of tutoring altogether. “Overall we still see that the dosage students are getting falls far short of what would be needed to fully realize the promise of high-dosage tutoring,” the report said.

    Monica Bhatt, a researcher at the University of Chicago Education Lab and one of the report’s authors, said schools struggled to set up large tutoring programs. “The problem is the logistics of getting it delivered,” said Bhatt. Effective high-dosage tutoring involves big changes to bell schedules and classroom space, along with the challenge of hiring and training tutors. Educators need to make it a priority for it to happen, Bhatt said.

    Related: Students aren’t benefiting much from tutoring, one new study shows

    Some of the earlier, pre-pandemic tutoring studies involved large numbers of students, too, but those tutoring programs were carefully designed and implemented, often with researchers involved. In most cases, they were ideal setups. There was much greater variability in the quality of post-pandemic programs.

    “For those of us that run experiments, one of the deep sources of frustration is that what you end up with is not what you tested and wanted to see,” said Philip Oreopoulos, an economist at the University of Toronto, whose 2020 review of tutoring evidence influenced policymakers. Oreopoulos was also an author of the June report.

    “After you spend lots of people’s money and lots of time and effort, things don’t always go the way you hope. There’s a lot of fires to put out at the beginning or throughout because teachers or tutors aren’t doing what you want, or the hiring isn’t going well,” Oreopoulos said.

    Another reason for the lackluster results could be that schools offered a lot of extra help to everyone after the pandemic, even to students who didn’t receive tutoring. In the pre-pandemic research, students in the “business as usual” control group often received no extra help at all, making the difference between tutoring and no tutoring far more stark. After the pandemic, students — tutored and non-tutored alike — had extra math and reading periods, sometimes called “labs” for review and practice work. More than three-quarters of the 20,000 students in this June analysis had access to computer-assisted instruction in math or reading, possibly muting the effects of tutoring.

    Related: Tutoring may not significantly improve attendance

    The report did find that cheaper tutoring programs appeared to be just as effective (or ineffective) as the more expensive ones, an indication that the cheaper models are worth further testing. The cheaper models averaged $1,200 per student and had tutors working with eight students at a time, similar to small group instruction, often combining online practice work with human attention. The more expensive models averaged $2,000 per student and had tutors working with three to four students at once. By contrast, many of the pre-pandemic tutoring programs involved smaller 1-to-1 or 2-to-1 student-to-tutor ratios.

    Despite the disappointing results, researchers said that educators shouldn’t give up. “High-dosage tutoring is still a district or state’s best bet to improve student learning, given that the learning impact per minute of tutoring is largely robust,” the report concludes. The task now is to figure out how to improve implementation and increase the hours that students are receiving. “Our recommendation for the field is to focus on increasing dosage — and, thereby learning gains,” Bhatt said.

    That doesn’t mean that schools need to invest more in tutoring and saturate schools with effective tutors. That’s not realistic with the end of federal pandemic recovery funds.  

    Instead of tutoring for the masses, Bhatt said researchers are turning their attention to targeting a limited amount of tutoring to the right students. “We are focused on understanding which tutoring models work for which kinds of students.” 

    Contact staff writer Jill Barshay at 212-678-3595, jillbarshay.35 on Signal, or barshay@hechingerreport.org.

    This story about tutoring effectiveness was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.


    Jill Barshay


  • What Trump’s education cuts mean for literacy – The Hechinger Report


    This podcast, Sold a Story, was produced by APM Reports and reprinted with permission.

    There’s an idea about how children learn to read that’s held sway in schools for more than a generation – even though it was proven wrong by cognitive scientists decades ago. Teaching methods based on this idea can make it harder for children to learn how to read. In this new American Public Media podcast, host Emily Hanford investigates the influential authors who promote this idea and the company that sells their work. It’s an exposé of how educators came to believe in something that isn’t true and are now reckoning with the consequences – children harmed, money wasted, an education system upended.

    Episode 14: The Cuts

    Education research is at a turning point in the United States. The Trump administration is slashing government funding for science and dismantling the Department of Education. We look at what the cuts mean for the science of reading — and the effort to get that science into schools.



    Emily Hanford


  • Inaccurate, impossible: Experts knock new Trump plan to collect college admissions data


    President Donald Trump wants to collect more admissions data from colleges and universities to make sure they’re complying with a 2023 Supreme Court decision that ended race-conscious affirmative action. And he wants that data now. 

    But data experts and higher education scholars warn that any new admissions data is likely to be inaccurate, impossible to interpret and ultimately misused by policymakers. That’s because Trump’s own policies have left the statistics agency inside the Education Department with a skeleton staff and not enough money, expertise or time to create this new dataset. 

    The department already collects data on enrollment from every institution of higher education that participates in the federal student loan program. The results are reported through the Integrated Postsecondary Education Data System (IPEDS). But in an Aug. 7 memorandum, Trump directed the Education Department, which he sought to close in March, to expand that task and provide “transparency” into how some 1,700 colleges that do not admit everyone are making their admissions decisions. And he gave Education Secretary Linda McMahon just 120 days to get it done. 

    Related: Our free weekly newsletter alerts you to what research says about schools and classrooms.

    Expanding data collection on applicants is not a new idea. The Biden administration had already ordered colleges to start reporting race and ethnicity data to the department this fall in order to track changes in diversity in postsecondary education. But in a separate memorandum to the head of the National Center for Education Statistics (NCES), McMahon asked for even more information, including high school grades and college entrance exam scores, all broken down by race and gender.  

    Bryan Cook, director of higher education policy at the Urban Institute, a think tank in Washington, D.C., called the 120-day timeline "preposterous" because of the enormous technical challenges. For example, IPEDS has never collected high school GPAs. Some schools use a weighted 5.0 scale, giving extra points for advanced classes, and others use an unweighted 4.0 scale, which makes comparisons messy. Other issues are equally thorny. Many schools no longer require applicants to report standardized test scores, and some no longer ask about race, so the data that Trump wants doesn't exist for those colleges. 
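    The GPA problem is easy to see with a toy calculation. This is a minimal sketch with an invented transcript; the one-point bonus for advanced courses is one common weighting convention (an assumption here, as conventions vary by school), and none of the numbers come from IPEDS. The same grades produce two different GPAs depending on which scale a school reports.

    ```python
    # Hypothetical transcript: why weighted (5.0) and unweighted (4.0) GPAs
    # for the same student aren't directly comparable. Invented data only.

    GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0}
    AP_BONUS = 1.0  # one common weighting convention; schools vary

    # (grade, is_advanced) for each course on the transcript
    transcript = [("A", True), ("A", True), ("B", False), ("A", False)]

    unweighted = sum(GRADE_POINTS[g] for g, _ in transcript) / len(transcript)
    weighted = sum(GRADE_POINTS[g] + (AP_BONUS if ap else 0.0)
                   for g, ap in transcript) / len(transcript)

    print(round(unweighted, 2))  # 3.75
    print(round(weighted, 2))    # 4.25
    ```

    Pooling both kinds of numbers into one federal field would mix a 3.75 and a 4.25 that describe the identical student.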

    “You’ve got this effort to add these elements without a mechanism with which to vet the new variables, as well as a system for ensuring their proper implementation,” said Cook. “You would almost think that whoever implemented this didn’t know what they were doing.” 

    Cook has helped advise the Education Department on the IPEDS data collection for 20 years and served on technical review panels, which are normally convened first to recommend changes to the data collection. Those panels were disbanded earlier this year, and there isn’t one set up to vet Trump’s new admissions data proposal.

    Cook and other data experts can’t figure out how a decimated education statistics agency could take on this task. All six NCES employees who were involved in IPEDS data collection were fired in March, and there are only three employees left out of 100 at NCES, which is run by an acting commissioner who also has several other jobs. 

    An Education Department official, who did not want to be named, disputed the claim that no one left inside the Education Department has IPEDS experience. The official said that staff inside the office of the chief data officer, which is separate from the statistics agency, have a "deep familiarity with IPEDS data, its collection and use." Former Education Department employees told me that some of these employees have experience in analyzing the data, but not in collecting it.

    In the past, there were as many as a dozen employees who worked closely with RTI International, a scientific research institute, which handles most of the IPEDS data collection work. 

    Technical review eliminated

    Of particular concern is that RTI’s $10 million annual contract to conduct the data collection had been slashed approximately in half by the Department of Government Efficiency, also known as DOGE, according to two former employees, who asked to remain anonymous out of fear of retaliation. Those severe budget cuts eliminated the technical review panels that vet proposed changes to IPEDS, and ended training for colleges and universities to submit data properly, which helped with data quality. RTI did not respond to my request to confirm the cuts or answer questions about the challenges it will face in expanding its work on a reduced budget and staffing.

    The Education Department did not deny that the IPEDS budget had been cut in half. “The RTI contract is focused on the most mission-critical IPEDS activities,” the Education Department official said. “The contract continues to include at least one task under which a technical review panel can be convened.”  

    Additional elements of the IPEDS data collection have also been reduced, including a contract to check data quality.

Last week, the scope of the new task became more apparent. On Aug. 13, the administration released more details about the new admissions data it wants, describing how the Education Department is attempting to add a whole new survey to IPEDS, called the Admissions and Consumer Transparency Supplement (ACTS), which will disaggregate all admissions data and most student outcome and financial aid data by race and gender. Colleges will have to report on both undergraduate and graduate school admissions. The public has 60 days to comment, and the administration wants colleges to start reporting this data this fall.

    Complex collection

    Christine Keller, executive director of the Association for Institutional Research, a trade group of higher education officials who collect and analyze data, called the new survey “one of the most complex IPEDS collections ever attempted.” 

    Traditionally, it has taken years to make much smaller changes to IPEDS, and universities are given a year to start collecting the new data before they are required to submit it. (Roughly 6,000 colleges, universities and vocational schools are required to submit data to IPEDS as a condition for their students to take out federal student loans or receive federal Pell Grants. Failure to comply results in fines and the threat of losing access to federal student aid.)

    Normally, the Education Department would reveal screenshots of data fields, showing what colleges would need to enter into the IPEDS computer system. But the department has not done that, and several of the data descriptions are ambiguous. For example, colleges will have to report test scores and GPA by quintile, broken down by race and ethnicity and gender. One interpretation is that a college would have to say how many Black male applicants, for example, scored above the 80th percentile on the SAT or the ACT. Another interpretation is that colleges would need to report the average SAT or ACT score of the top 20 percent of Black male applicants. 

The Association for Institutional Research used to train college administrators on how to collect and submit data correctly and sort through confusing details — until DOGE eliminated that training. “The absence of comprehensive, federally funded training will only increase institutional burden and risk to data quality,” Keller said. Keller’s organization is now dipping into its own budget to offer a small amount of free IPEDS training to universities.

    The Education Department is also requiring colleges to report five years of historical admissions data, broken down into numerous subcategories. Institutions have never been asked to keep data on applicants who didn’t enroll. 

    “It’s incredible they’re asking for five years of prior data,” said Jordan Matsudaira, an economist at American University who worked on education policy in the Biden and Obama administrations. “That will be square in the pandemic years when no one was reporting test scores.”

    ‘Misleading results’

    Matsudaira explained that IPEDS had considered asking colleges for more academic data by race and ethnicity in the past and the Education Department ultimately rejected the proposal. One concern is that slicing and dicing the data into smaller and smaller buckets would mean that there would be too few students and the data would have to be suppressed to protect student privacy. For example, if there were two Native American men in the top 20 percent of SAT scores at one college, many people might be able to guess who they were. And a large amount of suppressed data would make the whole collection less useful.
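The suppression logic described above can be sketched in a few lines. Note that the minimum cell size of 3 and the category labels below are illustrative assumptions for this sketch, not an actual IPEDS rule or data:

```python
# Sketch of privacy suppression: demographic cells with fewer students
# than a minimum count are withheld so individuals can't be re-identified.
# The threshold of 3 is an assumed value for illustration only.
MIN_CELL = 3

def suppress(cell_counts, min_cell=MIN_CELL):
    """Replace counts below min_cell with None (suppressed)."""
    return {k: (v if v >= min_cell else None) for k, v in cell_counts.items()}

counts = {
    ("Native American", "male", "top SAT quintile"): 2,    # too small: suppressed
    ("white", "female", "top SAT quintile"): 180,          # large enough: published
}
print(suppress(counts))
```

The finer the disaggregation, the more cells fall below the threshold, which is why heavy slicing by race, gender and score band can leave much of a collection blanked out.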

Also, small numbers can lead to wacky results. For example, a small college could have only two Hispanic male applicants with very high SAT scores. If both were accepted, that’s a 100 percent admittance rate. If 200 white women out of 400 with the same test scores were accepted, that would be only a 50 percent admittance rate. On the surface, that can look like both racial and gender discrimination. But it could have been a fluke. Perhaps both of those Hispanic men were athletes and musicians. The following year, the school might reject two different Hispanic male applicants with high test scores but without such impressive extracurriculars. The admissions rate for Hispanic males with high test scores would drop to zero. “You end up with misleading results,” said Matsudaira.
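The swings in that hypothetical can be made concrete with a few lines of arithmetic, using the same made-up numbers as the example:

```python
# Small-sample volatility in admittance rates, using the hypothetical
# figures from the example above (not real admissions data).

def admit_rate(admitted, applicants):
    """Admittance rate as a percentage; None if there were no applicants."""
    if applicants == 0:
        return None
    return 100 * admitted / applicants

# Year 1: two high-scoring Hispanic male applicants, both admitted.
hispanic_y1 = admit_rate(2, 2)      # 100.0
# 200 of 400 white women with the same scores admitted.
white_y1 = admit_rate(200, 400)     # 50.0
# Year 2: two different high-scoring Hispanic male applicants, both rejected.
hispanic_y2 = admit_rate(0, 2)      # 0.0

print(hispanic_y1, white_y1, hispanic_y2)
```

The two-applicant group's rate swings from 100 percent to 0 percent in a single year, while the 400-applicant group's rate barely moves; that instability, not discrimination, can drive the year-to-year differences.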

    Reporting average test scores by race is another big worry. “It feels like a trap to me,” said Matsudaira. “That is mechanically going to give the administration the pretense of claiming that there’s lower standards of admission for Black students relative to white students when you know that’s not at all a correct inference.”

    The statistical issue is that there are more Asian and white students at the very high end of the SAT score distribution, and all those perfect 1600s will pull the average up for these racial groups. (Just like a very tall person will skew the average height of a group.) Even if a college has a high test score threshold that it applies to all racial groups and no one below a 1400 is admitted, the average SAT score for Black students will still be lower than that of white students. (See graphic below.) The only way to avoid this is to purely admit by test score and take only the students with the highest scores. At some highly selective universities, there are enough applicants with a 1600 SAT to fill the entire class. But no institution fills its student body by test scores alone. That could mean overlooking applicants with the potential to be concert pianists, star soccer players or great writers.
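The cutoff mechanics can be demonstrated with a small simulation. The distributions, means and sample sizes below are illustrative assumptions chosen only to show the effect, not real admissions data:

```python
# "Average score trap" sketch: apply the SAME cutoff to two applicant
# pools whose score distributions differ, then compare averages among
# the admitted. All parameters here are assumed for illustration.
import random

random.seed(0)

def admitted_average(mean, sd, n, cutoff=1400, cap=1600):
    """Average score of simulated applicants at or above the cutoff."""
    scores = [min(cap, max(400, random.gauss(mean, sd))) for _ in range(n)]
    admits = [s for s in scores if s >= cutoff]
    return sum(admits) / len(admits)

# Group A's applicant pool skews higher than group B's.
avg_a = admitted_average(mean=1350, sd=120, n=100_000)
avg_b = admitted_average(mean=1250, sd=120, n=100_000)

print(round(avg_a), round(avg_b))
```

Both simulated groups clear the identical 1400 bar, yet the group whose pool skews higher shows a higher average among its admits, because more of its applicants sit far above the cutoff. Comparing those averages says nothing about whether the bar itself differed.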

    The Average Score Trap

This graphic by Kirabo Jackson, an economist at Northwestern University, depicts the problem of measuring racial discrimination through average test scores. Even for a university that admits all students above a certain cut score, the average score of one racial group (red) will be higher than the average score of the other group (blue). Source: graphic posted on Bluesky Social by Josh Goodman

Admissions data is a highly charged political issue. The Biden administration originally spearheaded the collection of college admissions data by race and ethnicity. Democrats wanted to collect this data to show how the nation’s colleges and universities were becoming less diverse with the end of affirmative action. That data collection is slated to start this fall, following a full technical and procedural review.

    Now the Trump administration is demanding what was already in the works, and adding a host of new data requirements — without following normal processes. And instead of tracking the declining diversity in higher education, Trump wants to use admissions data to threaten colleges and universities. If the new directive produces bad data that is easy to misinterpret, he may get his wish.

    Contact staff writer Jill Barshay at 212-678-3595, jillbarshay.35 on Signal, or barshay@hechingerreport.org.

    This story about college admissions data was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.

    The Hechinger Report provides in-depth, fact-based, unbiased reporting on education that is free to all readers. But that doesn’t mean it’s free to produce. Our work keeps educators and the public informed about pressing issues at schools and on campuses throughout the country. We tell the whole story, even when the details are inconvenient. Help us keep doing that.

    Join us today.

  • OPINION: Trump is back. We’re still waiting on his plan for schools – The Hechinger Report

    OK. I guess we’re doing this (again).

    It feels awful for lots of reasons, of course, but mostly it’s because the country chose political vibes over policy ideas. As a researcher who spends his days trying to find evidence-based ways to make schools better, I’m at something of a loss.

    See, whatever you thought about the Harris-Walz ticket’s particular proposals, the Democrats had things to say about education issues that genuinely shape children’s development: affordable early care and learning, access to nutritious school meals, funding for English learners, and more.

    President-elect Trump’s education platform was made of much vaguer stuff — mostly culture war vibes. For instance, conservatives are eager to get the government involved in biological screenings to determine if kids have the “correct” genitalia for peeing in a particular bathroom or playing on a particular sports team. Trump talks about schools secretly imposing gender transition surgery on children. Finally, it’s likely that the administration will try to voucherize more public dollars to support families sending their children to private schools.

    Related: Become a lifelong learner. Subscribe to our free weekly newsletter to receive our comprehensive reporting directly in your inbox.

    But, again, all of this is light on substance. It’s pretty hard to see how bathroom-usage policies will help kids recover from the pandemic’s academic consequences, or get more children ready for kindergarten, or more third graders ready to read on grade level. School voucher programs may give anxious parents public money to pay for private education, but there’s not much evidence that they help students or the public schools they’re leaving behind.

    Worse yet, some of conservatives’ K–12 ideas are at war with themselves. The Republican platform calls for federal defunding of schools teaching curricula that conservatives don’t like, but it also pledges — immediately afterward — to “veto efforts to nationalize Civics Education [sic].” So they’re promising not to nationalize how schools teach history, except when they don’t like how certain schools teach history.

    Now, there was a detailed conservative plan for federal K–12 education drifting around during the campaign. The Heritage Foundation’s Project 2025 proposes to “eliminate” and “redistribute the various congressionally approved federal education programs across the government.” But Trump claimed to want nothing to do with it.

    Related: How would Project 2025 change education?

    Maybe he’s telling the truth — perhaps he’s realized that Project 2025 would significantly reduce his ability to enact any sort of affirmative education policy agenda. It would be harder to remake American schools in a Trumpian image without a federal Education Department, after all.

Of course, that’s assuming 1) that Trump has given K–12 enough thought to work through that strategic calculus, and 2) that conservatives actually have an affirmative agenda for making schools more effective, something that goes deeper than lines like this from their platform: “Our Great Teachers, who are so important to the future wellbeing of our Country, will be cherished and protected by the Republican Party so that they can do the job of educating our students that they so dearly want to do.”

    Related: What education could look like under Trump and Vance

    Perhaps there’s a concrete, substantive plan for reforming Title II of the Elementary and Secondary Education Act lurking in those words, and I just don’t have the right GOP decoder ring?

    So look, conservatives: You’ve got to figure something out. The country’s schools can’t afford another four years like the first round of President Trump’s leadership, which left U.S. public schools reeling.

    By 2018, the leadership at the Fordham Institute, the country’s most august conservative education policy think tank, was calling for Secretary Betsy DeVos to resign in the hopes that troubles from her first two rocky years could be sorted out by a replacement.

    In a January 2021 piece headlined “The Wreckage Betsy DeVos Leaves Behind,” the New York Times editorial board wrote, “The Department of Education lies in ruins at precisely the time when the country most needs it.”

    Related: Trump’s deportation plan could separate millions of families, leaving schools to pick up the pieces

    Please forgive me if this reads like I’m being overdramatic. Perhaps it’s my outmoded instincts as a Very Serious Beltway Policy Researcher; I still think about policymaking as an effort to actually solve big public problems.

    I’m a hidebound fossil that way. Of course, if you really want to own me, really want to prove experts like me wrong (again), you could shock everyone by setting aside the culture wars and giving substantive education reform a try.

    Conor P. Williams is a senior fellow at The Century Foundation, a founding partner with The Children’s Equity Project, and a father of three children currently enrolled in public schools in Washington, DC. The views here are strictly his own.

  • OPINION: Encouraging Black and Latinx students to apply to selective colleges has become more urgent than ever – The Hechinger Report

Those of us who worked with high school students in the wake of the Supreme Court’s historic decision overturning race-conscious admissions can’t profess shock over news showing decreases in enrollment among Black and Latinx students across many college campuses, especially the most competitive ones.

    We saw this coming.

    Last year we saw too many highly qualified students shy away from applying to schools because they were sent a message that they wouldn’t get in without affirmative action. This year, it is more important than ever that we encourage our Black and Latinx students to apply to schools attended by similar students before the court’s reversal. Mentoring is a critical catalyst to achieve this goal.

Another year of enrollment dips among Black and Latinx students could snowball at some of our nation’s most recognized institutions, feeding a perception that they are unwelcoming to students of color.

    As a society, we simply can’t afford this. We are at a demographic crossroads: Generation Z is forecast to be the last majority-white generation; the majority of Americans under the age of 18 are “nonwhite.” If we don’t increase the numbers of Black and Latinx students going to colleges where they belong and deserve to have a seat at the table, we are impacting the future of America.

To change this new dynamic, we need to think outside the [check your race] box. College-educated adults hold the key to reshaping how we support Black and Latinx students in getting to and through the college process so that they can unlock their full potential and achieve the “holy grail” of economic mobility.

    As colleges put more emphasis on early action and deadlines specific to first-generation students, our Black and Latinx high school seniors have the chance to make their voices heard through the power of their applications.

    Increasing applications by November’s early admission deadlines is a critical first step.

    Related: Interested in innovations in the field of higher education? Subscribe to our free biweekly Higher Education newsletter.

    Also, vitally, first-generation students need to have strong, trusting relationships in place before, during and after the application process to reinforce a sense of belonging. The adults these students meet early in their lives — often outside the home — can help blunt a seismic shift in the makeup of college enrollment across our nation.

    Over the past 25 years of working with primarily first-generation and low-income students, I have found that the path to and through college is built on a mentorship model that relies heavily on schools, corporations and communities working in lockstep. This tripod of support needs to work even more closely together to encourage students to increase their applications to and enrollment in selective universities.

Drawing on data from the 42,000 students it has mentored since 1999, my organization has found that starting 1:1 mentoring in the junior year of high school ensures that every student has an adult champion to not only help them chart a path to college but also build the sense of belonging needed to persevere to graduation.

    Mentoring develops the social capital to help establish careers and create the building blocks needed for long-term economic mobility.

    Every adult needs to adopt a mentoring mindset. We cannot sit back and watch as Black and Latinx students are shut out of college.

    One successful mentoring model I’ve seen uses partnerships with corporations that open their doors to high school students. This helps students start charting a course toward college and career paths based on interactive experiences in the conference room as well as the classroom.

Related: How did students pitch themselves to colleges after last year’s affirmative action ruling?

As DEI initiatives decline on college campuses, many corporations are expanding their own affinity groups and diversity programs. For students, these corporate communities foster a sense of belonging in both college and careers. For adults, these experiences hone a greater understanding of the many inequities that Black and Latinx, often first-generation, students face.

    Seemingly simple connections matter. Planting seeds of trust and confidence early in a relationship helps students see their future selves in their mentors. More Black and Latinx students need to hear “we don’t know if we don’t try,” and this work needs to start well before the beginning of senior year.

    Looking through the lens of a trusted adult, students can better trust the process and not be deterred by such things as the reversals of court decisions.

    While the decrease of Black and Latinx students enrolled in some selective universities this fall is discouraging, there is hope. The vast majority of students (97 percent) mentored in my organization who apply to college are accepted.

Higher education also has a critical accountability role as we head into this admissions year. I applaud those who have already reached out to encourage underrepresented students to apply to college.

    Through an ecosystem of support, more Black and Latinx students will earn seats at the table in college and beyond.

    Mentoring helps close equity gaps for first-generation students, guiding them toward successful college careers and beyond. Together, we can turn these recent challenges into a transformative opportunity for lasting impact. The future needs as many Black and Latinx college-educated students as possible.

Heather D. Wathington is CEO of iMentor, a national leader in 1:1 mentoring that builds long-term, personal relationships to help students, largely first-generation college students from underresourced communities, access and navigate postsecondary education and careers.

    Contact the opinion editor at opinion@hechingerreport.org.

    This story about mentoring for college was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for our higher education newsletter. Listen to our higher education podcast.

  • Dual enrollment has exploded. But it’s hard to tell if it’s helping more kids get a college degree

Share of new college students in the fall of 2015 who were still in high school and taking a dual enrollment class. Map reprinted from “The Postsecondary Outcomes of High School Dual Enrollment Students: A National and State-by-State Analysis” (October 2024), Community College Research Center.

    Dual enrollment is exploding. During the 2022-23 school year, nearly 2.5 million high school students took college classes, simultaneously earning high school and college credits. That’s up from 1.5 million students in the fall of 2021 and roughly 300,000 students in the early 2000s. Figures released last week show that dual enrollment grew another 7 percent in the fall of 2024 from a year earlier, even as the number of traditional college freshmen fell. 

Exactly how much all of this is costing the nation isn’t known. But the state of Texas, which accounts for 10 percent of high schoolers who are taking these college classes, was investing $120 million annually as recently as 2017, according to one estimate. It wouldn’t be far-fetched to extrapolate that over $1 billion a year in public funds is being spent on dual enrollment across the nation.
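The back-of-the-envelope extrapolation works like this, using only the figures quoted in the paragraph above:

```python
# Scaling Texas's reported dual enrollment spending to a national
# estimate: if one state holds ~10% of the students, dividing its
# spending by that share gives a rough national figure.
texas_spending = 120_000_000   # dollars per year, per the estimate cited
texas_share = 0.10             # Texas's share of dual enrollment students

national_estimate = texas_spending / texas_share
print(f"${national_estimate:,.0f}")  # on the order of $1.2 billion a year
```

This assumes, crudely, that other states spend at roughly the same per-student rate as Texas, which is why the article hedges the figure as an extrapolation rather than a measurement.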

    Alongside this meteoric rise of students and resources, researchers are trying to understand who is taking advantage of these early college classes, whether they’re expanding the pool of college educated Americans, and if these extra credits help students earn college degrees faster and save money.

A new analysis released in October 2024 by the Community College Research Center (CCRC), at Teachers College, Columbia University, and the National Student Clearinghouse Research Center tracked what happened to every high school student who started taking dual enrollment classes in 2015. Of these 400,000 high schoolers, more than 80 percent enrolled in college straight after high school. That compares favorably with the general population; only 70 percent of all high school graduates went straight to college. Almost 30 percent of the 400,000 dual enrollees, roughly 117,000 students, earned a bachelor’s degree in four years. But a majority (58 percent) had not earned any college degree, either a four-year bachelor’s or a two-year associate, or any post-secondary credential, such as a short-term certificate, within this four-year period. (The Hechinger Report is an independent unit of Teachers College.)

This is the most detailed and extensive analysis of dual enrollment that I’ve seen, covering all students in the nation, and tracking them for years after high school. But the analysis does not answer the fundamental question of whether dual enrollment is a worthwhile public policy.

It’s not clear that an early taste of higher education encourages students who wouldn’t otherwise have gone to college to enroll. And it’s hard to tell from this report if the credits are helping students get through college any faster.

    The fact that students with dual enrollment credits are faring better than students without dual enrollment credits isn’t terribly persuasive. In order to qualify for the classes, students usually need to have done well on a test, earned high grades or be on an advanced or honors track in school. These high-achieving students would likely have graduated college in much higher numbers without any dual enrollment courses.

    “Are we subsidizing students who were always going to go to college anyway?” asked Kristen Hengtgen, a policy analyst at EdTrust, a nonprofit research and advocacy organization that lobbies for racial and economic equity in education. “Could we have spent the time and energy and effort differently on higher quality teachers or something else? I think that’s a really important question.”

    Related: High schoolers account for nearly 1 out of every 5 community college students

    Hengtgen was not involved in this latest analysis, but she is concerned about the severe underrepresentation of Black and Hispanic students that the report highlights. A data dashboard accompanying the new report documents that only 9 percent of the high schoolers in dual enrollment classes were Black, while Black students made up 16 percent of high school students. Only 17 percent of dual enrollment students were Hispanic at a time when Hispanic students made up almost a quarter of the high school population. White students, by contrast, took 65 percent of the dual enrollment seats but represented only half of the high school population. Asian students were the only group whose participation in dual enrollment matched their share of the student population: 5 percent of each. 

Advocates of dual enrollment have made the argument that an early taste of college can motivate students to go to college, and the fact that so few Black and Hispanic students are enrolling is perhaps the most troubling sign that the giant public-and-private investment in education isn’t fulfilling one of its main objectives: to expand the college-educated workforce.

    Hengtgen of EdTrust argues that Black, Hispanic and low-income students of all races need better high school advising to help them sign up for the classes. Sometimes, she said, students don’t know they have to take a prerequisite class in 10th grade in order to be eligible for a dual enrollment class in 11th grade, and by the time they find out, it’s too late. Cost is another barrier. Depending upon the state and county, a family may have to pay fees to take the classes. Though these fees are generally much cheaper than what college students pay per course, low-income families may still not be able to afford them. 

    Tatiana Velasco, an economist at CCRC and lead author of the October 2024 dual enrollment report, makes the argument that dual enrollment may be most beneficial to Black and Hispanic students and low-income students of all races and ethnicities. In her data analysis, she noted that dual enrollment credits were only providing a modest boost to students overall, but very large boosts to some demographic groups. 

    Among all high school students who enrolled in college straight after high school, 36 percent of those with dual enrollment credits earned a bachelor’s degree within four years compared with 34 percent without any dual enrollment credits. Arguably, dual enrollment credits are not making a huge difference in time to completion, on average.

    However, Velasco found much larger benefits from dual enrollment when she sliced the data by race and income. Among only Black students who enrolled in college straight away, 29 percent of those who had earned dual enrollment credits completed a bachelor’s degree within four years, compared to only 18 percent of those without dual enrollment credits. That’s more than a 50 percent increase in college attainment. “The difference is massive,” said Velasco.

    Among Hispanic students who went straight to college, 25 percent of those with dual enrollment credits earned a bachelor’s degree within four years. Only 19 percent of Hispanic college students without dual enrollment credits did. Dual enrollment also seemed particularly helpful for college students from low-income neighborhoods; 28 percent of them earned a bachelor’s degree within four years compared with only 20 percent without dual enrollment. 

    Again, it’s still unclear if dual enrollment is driving these differences. It could be that Black students who opt to take dual enrollment classes were already more motivated and higher achieving and still would have graduated college in much higher numbers. (Notably, Black students with dual enrollment credits were more likely to attend selective four-year institutions.) 

There is a wide variation across the nation in how dual enrollment operates in high schools. In most cases, high schoolers never set foot on a college campus. Often the class is taught in a high school classroom by a high school teacher. Sometimes community colleges supply the instructors. English composition and college algebra are popular offerings. The courses are generally designed and the credits awarded by a local community college, though 30 percent of dual enrollment credits are awarded by four-year institutions.

    A few other takeaways from the CCRC and National Student Clearinghouse report:

    • States with very high rates of college completion from their dual enrollment programs, such as Delaware, Georgia, Mississippi and New Jersey, tend to serve fewer Black, Hispanic and low-income students. Florida stood out as an exception. CCRC’s Velasco noted it had both strong college completion rates while serving a somewhat higher proportion of Hispanic students.
    • In Iowa, Texas and Washington, half of all dual enrollment students ended up going to the college that awarded their dual enrollment credits. 
    • In Montana, New Hampshire, Ohio, and Wisconsin, dual enrollment students have become a huge source of future students for community colleges. (A separate cost study shows that some community colleges are providing dual enrollment courses to a nearby high school at a loss, but if these students subsequently matriculate, their future tuition dollars can offset those losses.)

And that perhaps is the most worrisome unintended consequence of the explosion of dual enrollment credits. Many bright high school students are racking up credits from three, four or even five college classes, and they’re feeling pressure to take advantage of those credits by enrolling in the community college that partners with their high school. That might seem like a sensible decision. But it’s iffy whether these dual enrollment credits can be transferred to another school, or, more importantly, count toward a student’s requirements in a major, which is what really matters and most often holds students back from graduating on time.

    But a lot of these students could get into their state flagship or even a highly selective private college on scholarship. And they’d be better off. Dual enrollment students who started at a community college, the report found, were much less likely than those who enrolled at a four-year institution to complete a bachelor’s degree four years after high school.

    Contact staff writer Jill Barshay at 212-678-3595 or barshay@hechingerreport.org.

    This story about dual enrollment was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters. 

    The Hechinger Report provides in-depth, fact-based, unbiased reporting on education that is free to all readers. But that doesn’t mean it’s free to produce. Our work keeps educators and the public informed about pressing issues at schools and on campuses throughout the country. We tell the whole story, even when the details are inconvenient. Help us keep doing that.

    Join us today.

  • OPINION: Why we need a joint and urgent effort to teach data science and literacy in the U.S. – The Hechinger Report

    Data is now everywhere in our lives, informing our decisions about which new show to watch, what path to take or whether to grab an umbrella. But it’s practically absent from the way our kids learn.

    Our approach to teaching data science and data literacy has hardly evolved since I started my teaching career in 1995. Yet now more than ever, K-12 students need basic modern data science skills.

    Nearly 1 in 4 job postings in the United States require data science skills. These aren’t just tech jobs — they span industries from manufacturing to agriculture to transportation. The ability to capture, sort and analyze data is as important for small business owners as it is for computer scientists.

    Now is the time to reprioritize curricular emphases to reflect the importance of data science and data literacy. With data talent in high demand globally, other countries are investing billions in data education.

    But American K-12 education still underemphasizes data science and data literacy skills — including the ability to understand qualitative and quantitative data, assess claims based on data and make data-driven predictions.

    How do we know? Look at the data.

    Related: Become a lifelong learner. Subscribe to our free weekly newsletter to receive our comprehensive reporting directly in your inbox. 

    According to the most recent NAEP results, student performance in data analysis, statistics and probability fell by a full 10 points for eighth grade students between 2019 and 2022 — a drop that some experts consider equivalent to a full grade level of lost progress.

    Data science education is typically reserved for higher education, but only slightly more than a third of Americans have a college degree. The opportunity to learn basic data skills should not be reserved for a select group of students.

    Every student needs a chance to practice these vital skills from kindergarten through high school. That’s why I am excited for the National Council of Teachers of Mathematics to be a part of Data Science 4 Everyone’s national Chart the Course initiative, exploring the integration of data literacy and science across our most important school subjects. It will build upon NCTM’s work to reimagine, revitalize and increase math’s relevance for high schoolers.

    As president of NCTM, I’ve had the honor of helping to lead the mathematics education community through a time of profound technological change, which has included developing a position statement on AI.

    Additionally, in partnership with the National Science Teaching Association, the Computer Science Teachers Association, the National Council for the Social Studies and the American Statistical Association, we made an unprecedented joint call to build data science as an interdisciplinary subject across K-12 education.

    Early in my teaching career, we focused on teaching students how to use a dataset to create a bar graph or scatter plot. Now, students need to know how to formulate the question that will generate the data, how to collect the data and how to interpret the data.

    Students are eager to make sense of the world around them, but many don’t see how classroom instruction is related to the problems they will face as adults.

    Data — in the form of numbers, graphics and videos — can provide the hook that pulls students into lessons with real-world examples and applications.

    While a math teacher might look at a graph and observe that a certain variable decreased, a social studies teacher might say, “Of course there was a decrease, look at what was happening at that moment in history.”

    If we want students to think with and use data analysis skills in their everyday lives during and after high school, we need to create relevant data-learning experiences that engage students in using statistics to make sense of the world around them. This will also result in better test scores because students will understand the material and be able to apply what they know.

    Related: Do we need a ‘Common Core’ for data science education? 

    We are now joining with Data Science 4 Everyone in an even broader effort to create the first-ever national K-12 data learning progression that stretches across school subjects. It will shape how generations of students study data.

    Educator voices are vital to this process. We need input from the people who are closest to students and who will be rolling out data science lessons in their classrooms, so we’re asking them to weigh in; engaging educators is how we effect change.

    Data Science 4 Everyone’s Chart the Course voting platform is open through October 31, and we are encouraging teachers to vote for the learning outcomes they believe are the most important for K-12 students to learn by the time they graduate from high school.

    The selection of the learning outcome options in Chart the Course was informed by 11 focus groups made up of students, educators, higher education leaders, policymakers, researchers, curriculum designers and industry professionals.

    The collaborative approach was designed to create a framework that meets the needs of students and reflects the cross-disciplinary potential of data science. We hope to equip students with the skills they need to understand data and think critically and carefully as they interact with AI tools and draw their own conclusions about the world around them.

    Engaging with data is a way to make education relevant for all our students and bring our many subjects together in unique ways. It’s time to chart a course that connects classroom learning to the lives of students. That should be our goal for all teachers.

    Kevin Dykema is president of the National Council of Teachers of Mathematics (NCTM), an international mathematics education organization with more than 30,000 members. He has taught eighth grade mathematics for over 25 years in southwest Michigan.

    Contact the opinion editor at opinion@hechingerreport.org.

    This story about data science education was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Hechinger’s weekly newsletter.

  • What one state learned after a decade of free community college

    View of the Tennessee State Capitol, where lawmakers were the first in the nation to pass a law in 2014 to make community college tuition free for future high school graduates. Credit: Joe Sohm/Visions of America/Universal Images Group via Getty Images

    The free community college movement effectively began in 2014 when Republican Gov. Bill Haslam of Tennessee signed the Tennessee Promise Scholarship Act, which offered the state’s high school graduates free tuition to attend any two-year public community college or technical college in Tennessee.

    Communities around the country had been experimenting with free college programs since 2005, usually with private funding, but Tennessee was the first to make it a statewide policy, and it inspired 36 states to follow suit. This year, Massachusetts was the most recent to make community college free. (Here is a search tool for all the free college programs, including more than 400 local ones.) 

    But as free-tuition programs have multiplied, so have questions and doubts. Are low-income students benefiting? Is free tuition leading to more college graduates? 

    Thirty-seven states operate statewide free college tuition programs. Some programs cover all tuition and fees; others don’t. Some cover only two-year community colleges, while others include four-year institutions. Some give assistance only to low-income students; others give aid only to students who meet certain academic thresholds. Some states offer free tuition to a combination of those with need and merit. Source: College Promise

    Unfortunately, researchers have to wait years for students to make their way through college, but answers to these important questions are starting to emerge from Tennessee. College Promise, a nonprofit that advocates for making college free, along with tnAchieves, the nonprofit that helps administer the Tennessee program, released a 10-year anniversary report on Oct. 14. The report offers encouraging signs that the Tennessee Promise scholarship program, which now costs about $29 million a year in tuition subsidies and other services, has helped more students go to college and earn two-year associate degrees. In addition, Tennessee shared some of the lessons it has learned. 

    First, the numbers. The report highlights that more than 90 percent of all Tennessee high school seniors apply for the free college program. All students, regardless of family income, are eligible, and roughly 15,000 students a year ultimately use the program to enroll in college right after high school. About half come from low-income families who qualify for the Federal Pell Grant.

    Thirty-seven percent of students who stuck with the Promise scholarship program earned a two-year associate degree within three years, compared with only 11 percent of students who didn’t maintain eligibility, often because of incomplete financial aid paperwork, failure to complete required service hours or failure to stay enrolled in college at least part time. Tennessee projects that, since its inception, the scholarship program will have produced a total of 50,000 college graduates by 2025, administrators told me in an interview.

    Before the free tuition program went statewide, only 16 percent of Tennessee students who started community college in 2011 had earned an associate degree three years later. Graduation rates then rose to 22 percent for students who started community college in 2014. By then, 27 Tennessee counties had launched their own free tuition programs, but the statewide policy had not yet gone into effect. 

    By 2020, when free tuition statewide had been in effect for five years, 28 percent of Tennessee’s community college students had earned a degree in three years. Not all of these students participated in the free tuition program, but many did. 

    It’s unclear whether the free tuition program is the driving force behind the rising graduation rates. It could be that motivated students sign up for it and abide by the rules of the scholarship program, and might have graduated in higher numbers even without it. It could also be that unrelated nationwide reforms, from increases in federal financial aid to academic advising, have helped more students make it to the finish line.

    I talked with Celeste Carruthers, an economist at University of Tennessee Knoxville, who has been studying the free tuition program in her state. She is currently crunching the numbers to figure out whether the program is causing graduation rates to climb, but the signs she sees right now are giving her “cause for optimism.” Using U.S. Census data, she compared Tennessee’s college attainment rates with the rest of the United States. In the years immediately following the statewide scholarship program, beginning with the high school class of 2015, there is a striking jump in the share of young adults with associate degrees a few years later, while associate degree attainment elsewhere in the nation improved only mildly. Tennessee quickly went from being a laggard in young adult college attainment to a leader – at least until the pandemic hit. (See graph.)

    Computations by Celeste Carruthers, University of Tennessee Knoxville. Data Source: American Community Survey, via IPUMS (https://usa.ipums.org/usa/index.shtml). Graph produced by Jill Barshay/The Hechinger Report.

    While evaluation of the Tennessee program continues, researchers and program officers point to three lessons learned so far: 

    • The scholarship program hasn’t helped many low-income students financially. The Federal Pell Grant of $7,395 far exceeds annual tuition and fees at Tennessee’s community colleges, which hover around $4,500 for a full-time student. Community college was already free for low-income students, who represent roughly half of the students in Tennessee’s free college program. Like other free college programs around the country, Tennessee’s is structured as a “last dollar” program, which means that it only pays out after other forms of financial aid are exhausted. 

    That means that tuition subsidies have primarily gone to students from higher-income families who don’t qualify for the Pell Grant. In Tennessee, the funding source is the state lottery. Roughly $22 million of lottery proceeds were used to pay for community college tuition in the most recent year.

    • Free tuition alone isn’t enough help. In 2018, Tennessee added coaching and mentoring for low-income students to give them extra support. (Low-income students hadn’t been receiving any tuition subsidies because other financial aid sources already covered their tuition.) Then, in 2022, Tennessee added emergency grants for books and other living expenses for needy students – up to $1,000 per student. The extra assistance for low-income students is financed through state budget allocations and private fundraising. For students who are the first generation in their families to attend college, current graduation rates have jumped to 34 percent with this extra support compared with 11 percent without it, the 10-year report said. 

    “Pairing the financial support with the non-financial support – that mentoring support, the coaching support – is really the sweet spot,” said Graham Thomas, chief community and government relations officer at tnAchieves. “It’s the game changer, and that is often overlooked for the money part.” 

    Coaching is best conducted in person on campus. During COVID, Tennessee launched an online mentoring platform, but students didn’t engage with it. “We learned our lesson that in-person is the most valuable way to go when building relationships,” said Ben Sterling, chief content officer at tnAchieves.

    • The worst-case scenario didn’t happen. When free community college was first announced, critics fretted that the zero price tag would lure students away from four-year colleges, which aren’t free. That matters because the transfer process from community college to a four-year school can be rocky, with students losing credits and the time invested in them. Studies have shown that most students are more likely to complete a four-year degree if they start at a four-year institution. But the number of bachelor’s degrees did not fall. It seems possible that the free tuition policy attracted students who wouldn’t have gone to college at all, without cannibalizing four-year colleges. However, bachelor’s degree attainment in Tennessee, though rising, remains far below the rest of the nation. (See graph.)
    Computations by Celeste Carruthers, University of Tennessee Knoxville. Source: American Community Survey, via IPUMS (https://usa.ipums.org/usa/index.shtml). Graph produced by Jill Barshay/The Hechinger Report.

    As an aside, students are also able to use their Tennessee Promise scholarship funds at a limited number of public four-year colleges that offer associate degrees. About 10 percent of the program’s students take advantage of this option.

    Despite all the positive signs for educational attainment in Tennessee, recent years have not been kind. “Everything that’s happened to enrollment since COVID kind of erased all of the gains from Tennessee Promise,” said the University of Tennessee’s Carruthers. The combination of pandemic disruptions, a strong job market and changing public sentiment about higher education hammered enrollment at community colleges nationwide. Students have started returning in Tennessee, but community college enrollment is still below what it was in 2019.

    Contact staff writer Jill Barshay at 212-678-3595 or barshay@hechingerreport.org.

    This story about free community college was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters. 

  • Why an end-of-the-alphabet last name could skew your grades

    A dashboard from the Canvas learning management system is displayed to students in this college lecture hall. A University of Michigan study finds that students with last names at the end of the alphabet are penalized when instructors grade in alphabetical order, a default setting in Canvas and other widely used learning management systems (LMS). Credit: Brandon Bell/Getty Images

    If your last name starts with an A, that could mean that you’re also more likely to score an A on a test. But if you’re a Wilson or a Ziegler, you may be suffering from a new slight of the modern age: lower college grades.

    Grading processes have profoundly changed at colleges and universities in the past decade. Instead of placing assignments on a table at the front of the classroom, students today upload their work to a website called a learning management system, or LMS, where course documents, assignments and communications are all housed. Students can even take their exams directly within the LMS. 

    Course instructors mark assignments, papers and exams within the LMS, which also functions as a computerized grade book. The default setting is to sort student submissions in alphabetical order by surname. The computer system automatically guides the instructor to grade Adams before Baker all the way down to Zimmerman.

    A trio of researchers at the University of Michigan, including one whose surname begins with W, documented an unintended consequence of grading in alphabetical order. “There is such a tendency of graders to give lower grades as they grade more,” said Helen Wang, lead author of the study and a doctoral student at the University of Michigan’s business school.

    Wang and her two co-authors analyzed over 30 million grades at a large university that uses the most popular LMS, which is called Canvas. They calculated that surnames starting with U to Z were docked a little more than half a point (0.6 points) on a 100-point scale compared with A-to-E surnames. That’s a rather small penalty. But these small dings can add up, eventually translating into the difference between an A-minus and a B-plus on a final grade. 

    The study is described in a 2024 draft paper posted on the website of SSRN, formerly known as the Social Science Research Network. It is currently undergoing revisions with the academic journal Management Science.

    The researchers detected grading bias against the end of the alphabet in a wide range of subjects. However, the grading penalty was more pronounced in the social sciences and the humanities compared to engineering, science and medicine. 

    In addition to lower grades, the researchers also found that students at the bottom of the alphabet received more negative and impolite comments. For example, “why no answers to Q 2 and 3? You are setting yourself up for a failing grade,” and “NEVER DO THAT AGAIN.” Top-of-the-alphabet students were more likely to receive, “Much better work on this draft, [Student First Name]! Thank you!” 

    The researchers cannot prove precisely why extra points are deducted for the Wilsons of the world, but they suspect it’s because instructors – mostly graduate students at the unnamed university in this study – have heavy grading loads and they get tired and cranky, especially after grading the 50th student in a row. Even before the era of electronic grading, it’s quite likely the instructors were not as fair to students at the bottom of the paper pile. But in the paper world, a student’s position in the stack was always changing, depending on when the papers were turned in and how the instructors picked them up. No student was likely to be in the bottom of the pile every time. In the LMS world, the U’s, V’s, W’s, X’s, Y’s and Z’s almost always are.

    Another theory mentioned by the authors in the paper is that instructors may feel the need to be stricter if they’ve already given out a string of A’s, so as not to be too generous with high marks. Students at the bottom of the alphabet may be the victims of a well-intentioned effort to restrain grade inflation. It’s also possible that instructors are too generous with students at the top of the alphabet, but grade more accurately as they proceed. Either way, students at the bottom are being graded differently. 

    Some college instructors seem to be aware of their human frailty. In 2018, one posted on a message board at Canvas, asking the company to randomize the grade book. “For me, bias starts to creep in with fatigue,” the instructor wrote. “I grade a few, go away from it, grade a few more, take a break. Or that’s the goal when I’m not up against a deadline.” 

    If you’ve read this far, perhaps you are wondering how the researchers know that the grades for the U-to-Z students were unfair. Maybe they’re simply worse students? But the researchers matched the grades in Canvas with the student records in the registrar’s office and were able to control for a host of student characteristics, from high school grades and college GPA to race, ethnicity, gender, family background and income. End-of-the-alphabet surnames consistently received lower marks, even among similar students who were graded by the same instructor.

    The researchers also found that a tiny fraction of instructors tinkered with the default settings and graded in reverse alphabetical order, from Z to A. That produced the exact opposite result: students with end-of-the-alphabet names earned higher grades, while the grades for A, B and C surnames were lower.

    The bias against end-of-alphabet surnames is probably not unique to students who use the Canvas LMS. All four major LMS companies, which collectively control 90 percent of the U.S. and Canadian market with more than 48 million students, order submissions alphabetically for grading, according to the researchers. Even Coursera, a separate online learning platform, does it this way.   

    Wang’s solution is to shake things up and have the LMS present student work for grading in random order. Indeed, Canvas added a randomize option for instructors in May 2024, after the company saw a draft of this University of Michigan study.  “It was something that we had on our radar and that we’d heard from some users, but had not completed it yet,” a company spokesman said. “The report from the University of Michigan definitely pushed that work to top priority.” 

    However, the default remains alphabetical order and instructors need to navigate to the settings to change it. (Changing this default setting, according to the study authors, has “low visibility” within system settings on the site.) I hope this story helps to get the word out. 
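    The mechanics of the fix are simple to picture. The sketch below is a hypothetical illustration of the two grading orders the study compares — it is not code from Canvas or any real LMS, and the student names and variable names are invented:

    ```python
    import random

    # Invented roster for illustration only.
    submissions = ["Nguyen", "Adams", "Wilson", "Baker", "Ziegler"]

    # The LMS default: alphabetical by surname, so Wilson and Ziegler
    # always land at the end of the queue, when grader fatigue is highest.
    alphabetical_order = sorted(submissions)

    # The randomized alternative: a fresh shuffle for each assignment,
    # so no student sits at the bottom of the pile every time.
    randomized_order = random.sample(submissions, k=len(submissions))

    # Same students either way; only the grading order differs.
    assert sorted(randomized_order) == alphabetical_order
    ```

    In spirit, this is what a randomize option does: each fresh shuffle spreads the late-queue fatigue penalty across different students rather than concentrating it on the same surnames.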

    Contact staff writer Jill Barshay at (212) 678-3595 or barshay@hechingerreport.org.

    This story about learning management systems was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters. 
