ReportWire

Tag: different things

  • ‘Plant-Based’ Peanut Butter … And Shampoo … And Booze

    Several years ago, I made a New Year’s resolution to eat more plants. Doing so, I assumed, would be better for my health, for animals, and for the planet. Besides, it would be easy: The rise of plant-based meat alternatives, offered by companies such as Impossible Foods and Beyond Meat, made it a breeze to eat less meat but still satisfy the occasional carnivorous urge. I could have my burger and eat it too.

    Or so I thought. Meat alternatives, I found, cost more than their conventional counterparts and are made with complicated ingredients that raise doubts about their healthiness—and even then, taste just okay. Other people have had similar concerns, part of the reason the popularity of those products has declined in recent years to such a degree that Beyond Meat is reportedly now in “survival mode.” But beyond the meat aisle, the “plant-based” label lives on in virtually every food product imaginable: instant ramen, boxed mac and cheese, Kraft singles, KitKat bars, even queso. You can now buy plant-based peanut butter. You can also wash your hair with plant-based shampoo and puff on a plant-based vape.

    Queso made from cauliflower instead of milk is correctly described as plant-based. But if peanut butter is vegan to begin with, then what is the point of the label? And who asked for plant-based liquor? On packaging and ad copy, plant-based has been applied to so many items—including foods that are highly processed, or those that have never contained animal ingredients—that it has gotten “diluted to nothing,” Mark Lang, a marketing professor at the University of Tampa who studies food, told me.

    Technically, plant-based does have a clear definition. The Cornell University biochemist T. Colin Campbell is often credited with coining the term in the 1980s as a neutral, less fraught descriptor for diets considered “vegan” or “vegetarian.” That is what made plant-based a popular term for companies eager to sell their meat replacements to a wide range of eaters. The Plant Based Foods Association uses essentially the same criteria—foods made from plants that do not contain animal products—to determine which products can bear its “Certified Plant Based Seal.”

    Some companies describe products as “plant-based,” however, even if they don’t meet these criteria. Items sold as such include foods that have always been vegan, such as prepackaged jackfruit, and foods mixed with some animal products, such as Wahlburgers’ “Flex Blend” patties. But even a product that is properly described as “plant-based” might mean different things to different people, because there is no single reason people try to avoid the consequences of animal rearing and consumption. Health is the leading one, followed by environmental and ethical concerns, Emma Ignaszewski, the associate director of industry intelligence and initiatives at the Good Food Institute, told me.

    The label’s vagueness has been a marketer’s dream, creating an enormous opportunity to capitalize on the perceived virtuousness and healthiness of eating plant-based. Brands use the “plant-based” label to “draw people’s attention to the aggregate goodness of a particular product” and simultaneously “deflect attention” from any less appealing attributes, Joe Árvai, a professor of psychology and biological sciences at the University of Southern California, told me. Some, like coconut water, are relatively good for you; others, like booze, are probably not. And their environmental benefits remain murky: Using fewer animal ingredients generally decreases emissions, but the climate impacts are not always straightforward.

    In this way, the evolution of plant-based mirrors that of organic or gluten-free. These terms have specific meanings that are legitimately useful for helping people make choices about their food, but they have been overused into oblivion. You can now buy organic marijuana and gluten-free water along with your plant-based energy drinks. With multiple labels, including gluten-free, plant-based, GMO-free, Earth-friendly, and Fair Trade, “some products look like a NASCAR” vehicle, Lang said. “You’re just putting buttons all over the place, trying to get my attention.”

    We may have already hit peak “plant-based.” According to a recent survey from the Food Industry Association, there is substantial confusion about what the label means—and that could be discouraging people from buying plant-based products. Some are now outright skeptical of the label. A 2023 study co-authored by Árvai suggested that people are less likely to go for foods described as “plant-based” (or “vegan”) compared with those called “healthy” or “sustainable.” One reason may be negative associations with plant-based meat alternatives, which are seen as “artificial” because of their ultra-processed nature, co-author Patrycja Sleboda, an assistant professor of psychology at Baruch College, City University of New York, told me.

    Another may be that consumers are not sure whether “plant-based” foods are healthy. Americans may respond better when the actual benefits of the food are highlighted, she said. Similarly, market research conducted by Meati, a company that sells meat alternatives made of mushrooms, found that the “plant-based” label, applied to food, signaled “bad eating experience, bad flavor, bad texture, poor nutrition, too many ingredients, and overprocessing,” Christina Ra, Meati’s vice president of marketing and communications, told me.

    Some good may still come out of the messiness of “plant-based” everything. Meati deliberately avoids the label altogether, opting instead to highlight the contents of its products (“95 percent mushroom root”). A recent Whole Foods report predicted that in 2024, consumers will want to “put the ‘plant’ back in ‘plant-based’” by replacing “complex meat alternatives” with recognizable ingredients such as walnuts and legumes. In a particularly literal interpretation of this prediction, the company Actual Veggies sells a greens-and-grains patty called “The Actual Green Burger.” And some milk alternatives are also now skipping “plant-based” and simplifying their ingredient lists to just two items (nuts and water).

    Shoppers just want to know what’s in their food without having to think too hard about it. Plant-based hasn’t helped with that. Even Campbell, after he coined the term, acknowledged that it was a limiting, potentially misleading phrase that left too much room for unhealthy ingredients, such as sugar and flour. Perhaps shoppers’ exasperation with the vagueness of “plant-based” will eventually lead brands to promote more plant-based eating: that is, just eating plants.

    Yasmin Tayag

  • Quit Your Bucket List

    Years ago, just after I finished my psychiatry residency, a beloved supervisor called to say she had some bad news. At a routine checkup, she had glanced at her chest X-ray up on the viewing box while waiting for her doctor to come into the room. She was a trauma surgeon before becoming a psychiatrist and had spent years reading chest X-rays, so she knew that the coin-size lesion she saw in her lung was almost certainly cancer, given her long history of smoking.

    We had dinner soon after. She was still more than two years away from the end of her life and felt physically fine—vital, even. That’s why I was so surprised when she said she had no desire to spend whatever time she had left on exotic travel or other new adventures. She wanted her husband, her friends, her family, dinner parties, and the great outdoors. “Just more Long Island sunsets. I don’t need Bali,” she told me.

    At the end of life, you might expect people to feel regret for all the things they wanted to do and never made time for. But I have yet to know a patient or friend who, facing the blunt fact of their own mortality, had anything close to a bucket list. This squares with recent research showing that people tend to prefer familiar experiences even more when reminded that their days are limited. The people I know even regretted the novelty they’d chased along the way, whether it was recreational-drug use or dating exciting people who they knew weren’t relationship material.

    Deathbed pronouncements can have limited applications for the rest of life, but this pattern suggests that novelty is perhaps overrated. Chasing the high of new sensations simply isn’t appealing for many people, and can sometimes even be bad for our health. I suspect that’s because, too often, the pursuit of novelty requires sacrificing the things we already know we love.

    It’s a common misconception that people who don’t have a taste for the newest, sexiest experience are dull, incurious, and unimaginative. A 2002 study found that people will switch away from their favorite, habitual choices when they know others are watching, in order to avoid being judged as narrow-minded. And yet, Warren Buffett famously eats breakfast at the same fast-food restaurant every day and sticks to a strict work schedule. Taylor Swift’s music can be redundant and predictable. Barack Obama is known for his strict morning exercise regimen and daily reading time.

    Even when they’re not facing death, many people just don’t seem to like novelty that much. In 2017, a poll by a British soup company found that 77 percent of U.K. workers had consumed the exact same lunch every day for nine months and that one in six people had done so for at least two years. You might think it’s just a matter of convenience or economic exigency (the study didn’t say), but I’m not so sure; wealthy people I know partake in similar behavior, even if they do it at a fancy restaurant. Consider, too, that when people lose a pet, many run out and get a replacement of the same breed with a similar temperament. They repeatedly date people with the same quirks and problems. They return to a favorite vacation spot. They listen to the same musical artists and styles time and again.

    Research shows that humans have an intrinsic preference for things and people they are familiar with, something called the mere exposure effect. Several studies have shown that people who repeatedly listen to unfamiliar songs grow fonder, by the end of the experiment, of the ones they hear most often, even if they did not initially like them very much. You don’t even have to be aware that you’re growing used to something for the effect to work.

    This tendency toward repetition may seem natural, even lazy, but it runs counter to much of our history. We, along with other animals, evolved to be exquisitely sensitive to novel experiences. Way back in the Paleolithic era, there was a clear survival advantage to being attuned to new situations, which could lead someone to a potential mate or a piece of mastodon, or reveal a deadly threat. Nowadays, though, with every conceivable reward—food, sex, drugs, emotional validation, you name it—either a click, tap, or ChatGPT query away, conventional novelty-seeking has lost much of its adaptive advantage.

    As Arthur Brooks has written in The Atlantic, novelty can be fun and exciting. New and unexpected experiences activate the brain’s reward pathway more powerfully than familiar ones, leading to greater dopamine release and a more intense sense of pleasure. But on its own, excitement won’t bring about enduring happiness. Human beings habituate rapidly to what is new. To achieve a lifetime of stimulation, you would have to embark on an endless search for the unfamiliar, which would inevitably lead to disappointment. Worse, the unfettered pursuit of novelty can lead to harm through excessive thrill-seeking—including antisocial behavior such as reckless driving—particularly when the novelty seeker has poor impulse control and a disregard for others.

    There’s a better way. Research shows that when novelty-seeking is paired with persistence, people are far more likely to be happy, probably because they are able to achieve something meaningful. You might, for example, take a variety of courses in college or try different summer internships if you’re not yet sure what interests you. When one really clicks, you should explore it in depth; it might even become a lifelong passion. This principle relates to less consequential pleasures, too: If you’re checking out a new neighborhood joint, consider ordering different things during your first few visits, then picking your favorite and sticking with it.

    Novelty-seeking is most valuable when you use it as a tool to discover the things and people you love—and once you find them, go deep and long with those experiences and relationships. The siren call that tells you there might be a new and better version of what you already have is likely an illusion, driven by your brain’s relentless reward pathway. When in doubt, pick a beloved activity over an unfamiliar one.

    This golden rule of novelty may help explain why some people at the end of their life regret having spent so much time exploring new things, even if they once brought fleeting pleasure. Age, too, might partly explain this feeling, because older people tend to be less open to new experiences. But that’s probably not the whole story. My colleagues who treat children and adolescents have mentioned that, in the face of life-threatening diagnoses, even young people prefer the familiar. They do so not only because the familiar is known and safe, but because it is more meaningful to them. After all, things become familiar to us because we choose them repeatedly—and we do that because they are deeply rewarding.

    Imagine, just for a moment, that your death is near. What might you miss out on if you put your bucket list on hold? Sure, you won’t make it to Bali or Antarctica. But maybe instead you could fit in one last baseball game with your kids, one last swim in the ocean, one last movie with your beloved, one last Long Island sunset. If you prioritize the activities and people you already love, you won’t reach the end of your life wishing you’d made more time for them.

    Richard A. Friedman

  • Expiration Dates Are Meaningless

    For refrigerators across America, the passing of Thanksgiving promises a major purge. The good stuff is the first to go: the mashed potatoes, the buttery remains of stuffing, breakfast-worthy cold pie. But what’s that in the distance, huddled gloomily behind the leftovers? There lie the marginalized relics of pre-Thanksgiving grocery runs. Heavy cream, a few days past its sell-by date. A desolate bag of spinach whose label says it went bad on Sunday. Bread so hard you wonder if it’s from last Thanksgiving.

    The alimentarily unthinking, myself included, tend to move right past expiration dates. Last week, I considered the contents of a petite container in the bowels of my fridge that had transcended its best-by date by six weeks. Did I dare eat a peach yogurt? I sure did, and it was great. In most households, old items don’t stand a chance. It makes sense for people to be wary of expired food, which can occasionally be vile and incite a frenzied dash to the toilet, but food scientists have been telling us for years—if not decades—that expiration dates are mostly useless when it comes to food safety. Indeed, an enormous portion of what we deem trash is perfectly fine to eat: The food-waste nonprofit ReFED estimated that 305 million pounds of food would be needlessly discarded this Thanksgiving.

    Expiration dates, it seems, are hard to quit. But if there were ever a moment to wean ourselves off the habit of throwing out “expired” but perfectly fine items out of excessive caution, it is now. Food waste has long been a huge climate issue—rotting food’s annual emissions in the U.S. approximate those of 42 coal-fired power plants—and with inflation’s brutal toll on grocery bills, it’s also a problem for your wallet. People throw away roughly $1,300 a year in wasted food, Zach Conrad, an assistant professor of food systems at William & Mary, told me. In this economy? The only things we should be tossing are expiration dates themselves.

    Expiration dates, part of a sprawling family of labels that includes the easily confused siblings “best before,” “sell by,” and “best if used by,” have long muddled our conception of what is edible. They do so by insinuating that food has a definitive point of no return, past which it is dead, kaput, expired—and you might be, too, if you dare eat it. If only food were as simple as that.

    The problem is that most expiration dates convey information only about an item’s quality. With the exception of infant formula, where the date really does refer to expiration, dates generally represent a manufacturer’s best estimate of how long food is optimally fresh and tasty, though what this actually means varies widely, not least because there is no federal oversight of labeling. Milk in Idaho, for example, can be “sold by” grocery stores more than 10 days later than in neighboring Montana, though the extra days make no difference in quality. Some states, such as New York and Tennessee, don’t require labels at all.

    Date labels have been this haphazard since they arose in the 1970s. At the time, most Americans had begun to rely on grocery stores to get their food—and on manufacturers to know about its freshness. Now “the large majority of consumers think that these [labels] are related to safety,” Emily Broad Leib, a professor at Harvard Law School and the founding director of its Food Law and Policy Clinic, told me. A study she co-authored in 2019 found that 84 percent of Americans at least occasionally throw out food close to the date listed on the package. But quality and safety are two very different things. Plenty of products can be edible, if not tasty, long past their expiration date. Safety, to food experts, refers to an item’s ability to cause the kind of food poisoning that sends people to the hospital. It’s “no joke,” Roni Neff, a food-waste expert at Johns Hopkins University, told me.

    Consider milk, which is among the most-wasted foods in the world. Milk that has already soured or curdled can—get this—still be perfectly safe to consume. (In fact, it makes for fluffy pancakes and biscuits and … skin-softening face masks.) “If you take a sip of that milk, you’re not going to end up with a foodborne illness,” Broad Leib said, adding that milk is one of the safest foods on the market because pasteurization kills all of the germs. Her rule of thumb for other refrigerated items is that anything destined for the stove or oven is safe past its expiration date, so long as it doesn’t smell or look odd. In industry speak, cooking is a “kill step”—one that destroys harmful interlopers—if done correctly. And then there is the pantry, an Eden of forever-stable food. Generally, dry goods never become unsafe, even if their flavor dulls. “You’re not taking your life into your hands if you’re eating a stale cracker or cereal,” said Broad Leib.

    Of course, it would be easier if labels were geared toward safety, but for the majority of foods, the factors are too complex to sum up in a single date. Food is considered unsafe if it carries pathogens such as listeria, E. coli, or salmonella that can cause foodborne illness. These sneak into food through contamination, as when E. coli–tainted water is used to grow romaine lettuce. Proper storage, which means temperatures colder than 40 degrees Fahrenheit or hotter than 140 degrees Fahrenheit, inhibits their growth (except for listeria, which is particularly scary because it can thrive during refrigeration). It would be extremely difficult for a label to reflect all of this information, especially given that unsafe storage and contamination tend to occur after purchase, in hot car trunks and on unsanitized countertops. But as long as food doesn’t carry these germs to begin with, pathogens won’t suddenly appear the moment the clock strikes midnight on the expiration date. “They’re not spontaneous. Your crackers aren’t, like, contracting salmonella from the shelf,” said Broad Leib.

    There is, however, one category of food that should be labeled. Sometimes referred to as “foods pregnant women should avoid,” it includes certain ready-to-eat products such as deli meats, raw fish, sprouted vegetables, and unpasteurized milk and cheese, Brian Roe, a professor at Ohio State University’s Food Innovation Center, told me. These require extra caution because they can carry listeria, which is invisible to the senses, and are usually served cold—that is, they don’t go through a kill step before serving. Experts I spoke with agreed that high-risk foods should be identified as such, because there’s no way to tell whether they’ve become unsafe. As things stand, the date label is the only information available, and it is “not helping people protect themselves from that handful of foods,” said Broad Leib. To close this gap, efforts are under way in the Senate and the House to replace all date labels with two standard phrases: “best if used by” to denote quality and “use by” for safety.

    But it’s one thing to know expiration dates are bogus and another to live accordingly. In America, dates have become a tradition we can’t escape, Neff said, adding that the stickler of each household usually gets to set the rules. And even for more adventurous eaters, date labels serve a purpose: They’re a tool for calibrating judgment, or merely for providing the comfort of a reference point. “There’s something about seeing a number there that we think tells us something that gives us a sense of security,” Neff said. Manufacturers, meanwhile, maintain date labels because they don’t want to risk consumers buying products past their prime, even if they are safe and still (mostly) tasty.

    Although there’s no perfect way to know whether food is safe, there are better ways than expiration dates to tell. The adage “When in doubt, throw it out” doesn’t cut it anymore, said Neff; if you’re not sure, just look it up. Good tools are available online: She recommends FoodKeeper, an app developed by the U.S. Department of Agriculture, which lets users look up roughly how long food lasts. The Waste-Free Kitchen Handbook, by the food-waste pioneer Dana Gunders, gives detailed practical advice, such as scraping a half-inch below blue-green mold on hard cheese to safely recover the rest. Leftovers require slightly more caution, noted Broad Leib, because reheating, transferring between containers, and frequent touching with utensils (which, admit it, have been in your mouth) introduce more risk of contamination; her recommendation is to eat them within three to five days, and reheat them well—to a pathogen-killing internal temperature of 165 degrees Fahrenheit. And if doing so proves tedious, consider Roe’s take on the old saying: “When in doubt, cover it with panko, fry it up, and give it to your kids.”

    Yet for most foods, one tactic reigns supreme: the smell test. Your senses can give you most of the information you need. “If something smells off, you know,” said Broad Leib. Humans evolved disgust because it taught us to avoid the stench of pathogen-tainted food. But because most people are out of practice, they struggle to tell good from bad or don’t trust their senses. To be fair, it can be hard to discern whether weird smells are coming from the milk or the carton. To restore the food knowledge that has been lost since Americans shifted away from agriculture, all of the experts I spoke with supported the revival of home-economics classes—albeit with different branding and less sexism. Teaching students how to handle perishable food means teaching them what perished looks and smells like. Adults can learn this at home, of course, by opening that milk carton and daring to sniff deeply. It may be the first sniff of the rest of your life.

    It’s unlikely that we’ll ever return en masse to the pre-1970s idyll of purchasing food directly from farmers or growing it ourselves. Americans are “several generations removed now from agriculture and food production, so we don’t know our food as well as they once did,” Jackie Suggitt, the director of capital, innovation, and engagement at ReFED, told me. A smell rebellion, if you will, can’t restore our severed relationship with food, but hey, it’s a start. The lonely items lingering in one’s post-Thanksgiving fridge may be one inhale away from renewed relevance. If I deigned to sniff that “expired” heavy cream, I might be delighted to encounter a future garnish for pumpkin pie. And what is wilted spinach anyway but a can of artichokes away from dip?

    Yasmin Tayag

  • The College-Admissions Merit Myth

    Tomorrow, the Supreme Court will hear oral arguments in two cases that could end America’s experiment with affirmative action in higher education. The challenges to the admissions programs at Harvard and at the University of North Carolina at Chapel Hill—both brought by Students for Fair Admissions, a coalition of unnamed students assembled by the conservative legal strategist Edward Blum—argue that the institutions discriminate against Asian American students, and that eliminating the use of race in admissions would fix the problem.

    Lower courts have rejected SFFA’s arguments, leaning on more than 40 years of precedent that says the use of race in admissions is permissible in narrow circumstances. “Harvard has demonstrated that no workable and available race-neutral alternatives would allow it to achieve a diverse student body while still maintaining its standards for academic excellence,” Judge Allison Burroughs wrote in her 2019 opinion. But SFFA pressed on, and now the case sits before a conservative Supreme Court that has shown a willingness to overturn well-established precedents.

    In her new book, Is Affirmative Action Fair? The Myth of Equity in College Admissions, Natasha Warikoo, a sociologist at Tufts University who has spent years examining race-conscious admissions, assesses the positions of those for and against affirmative action, and argues that we’re asking the wrong questions about how students get into college. By exalting merit, Warikoo warns, Americans have developed a skewed perception of the process—a perception that leads to challenges such as the one before the Court.

    I spoke with Warikoo about her book, the Supreme Court hearing, and how we can better understand admissions.

    This conversation has been edited for length and clarity.


    Adam Harris: You write, “When we recognize the diverse goals that universities attempt to address through college admissions, it becomes clear that admission is not a certification of individual merit, or deservingness, nor was it ever meant to be.” Can you expand on that idea? Where do we have flaws in our understanding of college admissions?

    Natasha Warikoo: In the past, it was like “We want to have a bar.” You had to have some demonstration that you could handle the work that we’re going to give you. And some of that was exclusionary. It was like “Can you pass the Latin test?” Well, most schools didn’t teach kids Latin, so it’s not that that was fair—it was “You’re going to be doing Latin; do you know Latin?”

    But now, when we’re talking about super-selective places—there are more than 200 of them, so not just the Ivies, but also not most colleges—they have so many different interests that are playing into who they’re admitting. You’ve got the sports coaches who are trying to get their recruits; you’ve got the development office that gives a list and says, “These people have done a lot for this university—make sure you take a close look at that”; there are the humanities departments, which want to make sure there are people interested in the humanities, not just in STEM; the orchestra’s bassoon player may have graduated, and now the orchestra needs a bassoon player. So, there are all these different things that are going on, and the admissions office is trying to fulfill all these different interests and needs.

    But ordinary people treat admissions as, you know, they’re lining people up from best to worst and taking the top ones, and if one of these says they’re not coming, then they take the next person. Well, that’s not how it works. They’re fulfilling organizational needs and desires. But somehow, we treat it as a prize—and whoever is most deserving gets in.

    Harris: That plays into the broader idea in America around merit, and the way that we’ve oriented our society around merit. How do merit and the idea of fairness work together to give us the wrong idea about admission systems?

    Warikoo: In all of these international surveys, when you look at respondents’ belief about whether people should be rewarded for merit over other things, Americans are much more likely to say yes than people in most other countries. A lot of modern societies believe in these ideas of meritocracy, but the United States is especially attached to the idea. We have this belief that some people are deserving—and the unspoken idea that some are undeserving. And there’s a sense of entitlement, like I did all of these things; I deserve a spot at these places.

    But we should stop treating college admissions as if everybody is on a level playing field and the person who is the smartest, the most hardworking, the one with the most grit, is the one getting in. Instead of arguing about how affirmative action goes against our ideas of meritocracy, we should look at what colleges are actually trying to do.

    Harris: Well, let’s talk about affirmative action. How has it been viewed since Justice Lewis Powell accepted the diversity rationale in the Regents of the University of California v. Bakke case in 1978?

    Warikoo: There’s a whole industry of research that develops after that decision to really try to dig into the impact of a diverse learning environment: What is the impact of having a roommate of a different race, going to a college that is diverse, being in a class with students who are a different race? And this research shows all these benefits: Groups make better decisions; students have more intellectual engagement; they improve their racial attitudes. There are even some findings that show a positive impact on civic engagement down the line. A student may not even have a diverse set of friends, but if they’re on a diverse campus, there seems to be some kind of impact.

    So, all of this research shows these positive effects, and those data have been used in subsequent court cases defending affirmative action. But in the public conversation, many people recognize that it’s also an equity issue.

    Harris: In 2003, Justice Sandra Day O’Connor said the Court expects that 25 years from now, the use of racial preferences will no longer be necessary. And that’s what a lot of opponents of affirmative action say now: It may have been justified in the past, but it’s no longer necessary—and if we need something, we might be able to find a proxy. Are there proxies for race in admissions?

    Warikoo: The legal requirement is that when you’re using these suspect categories such as race in a policy, you have to show that there’s no other way that you could do things instead. And it’s pretty clear that there’s no good stand-in for race. We can use class, and class is important. But I don’t see these as either-or. The Georgetown law professor Sheryll Cashin has looked at zip code as a stand-in, and it’s pretty clear that such an approach is not going to have an impact on the numbers of underrepresented minority students on campus. Because, you know, the overwhelming majority of people in the United States today are white. The majority of people who are poor in this country are white. So you’re not really going to racially diversify by looking at class.

    Colleges have tried different things, such as the Texas “10 percent plan.” The research suggests that these other ideas are somewhat helpful, but the problem has been that graduation rates can go down when you’re just using a percent plan. And it’s not a stand-in for race-based affirmative action.

    We can look at the data from the states that have banned affirmative action to understand that they have not figured out a stand-in. We see declines in every state, year on year, of the number of underrepresented minorities when affirmative action gets banned.

    Harris: One of the through lines in the book is the purpose of higher education. What can colleges do better to be more honest about their goals?

    Warikoo: One is being careful about how they talk about admissions. And when you dig into their language, many schools say that they’re looking to build a class, and that everyone makes a unique contribution. But they’re still publishing acceptance rates. There are so many ways in which the language they use buys into this idea that they are a place of excellence. This is the best class ever, you’re told when you’re a freshman.

    When you have these elite colleges in which the student body comes from more resourced families than the average across 18-year-olds, it’s not just the best of the best. Your family’s resources play a role—whether you have parents who went to college, whether you grew up in certain neighborhoods or went to certain schools. Two-thirds of American adults don’t have a bachelor’s degree.

    But I keep coming back to the question of What are we trying to do here? Our spending in the U.S. on higher education is regressive. The most elite colleges accept students who are the highest achieving and most resourced. But who needs the most support? When you look at what community colleges are doing in terms of social mobility, they blow places like Harvard and Tufts out of the water. Colleges should think much more about the role they want to play in our society, and how they should align admissions to those goals.

    Harris: As I got toward the end of the book, where you talk about solutions, a couple of things really stuck out: the sort of anti-inclusive instinct that a lot of institutions have in terms of increasing their enrollment, where they don’t want to increase enrollment because that may upset alumni who attach value to the selectiveness of their institution. Or, if there were an admission lottery, families of high achievers may be frustrated. And my takeaway was: There’s really nothing the institutions may be able to do that is going to make everyone happy, so maybe they should just do what’s just.

    Warikoo: Yes. There are so many more amazing 18-year-olds in our country—deserving, hardworking, ambitious, smart, whatever superlative you want to use—than there is space for them at Harvard, at UNC, at any given school.

    But we have to stop acting like you deserve it and you don’t deserve it. It’s not about who deserves it. And that’s why I talk about a lottery system, because it implies you don’t deserve this more than anyone else—you got lucky. It already is luck: that your parents could afford to buy a house near a school that had a college counselor, or you had a tutor who could help you with your essay, or you went to a school with a crew team and you got recruited for crew—all kinds of things. It is luck. Why not call it what it is?

    Adam Harris
