ReportWire

Tag: charles darwin

  • Today in History: December 27, Charles Darwin sets out on world voyage

    Today is Saturday, Dec. 27, the 361st day of 2025. There are four days left in the year.

    Today in history:

    On Dec. 27, 1831, naturalist Charles Darwin set out on a round-the-world voyage from Plymouth, England, aboard the HMS Beagle.

    Also on this date:

    In 1904, James Barrie’s play “Peter Pan: The Boy Who Wouldn’t Grow Up” opened at the Duke of York’s Theatre in London.

    Associated Press

  • James Watson, co-discoverer of the double-helix shape of DNA, has died at age 97

    James D. Watson, whose co-discovery of the twisted-ladder structure of DNA in 1953 helped light the long fuse on a revolution in medicine, crimefighting, genealogy and ethics, has died. He was 97.

    The breakthrough — made when the brash, Chicago-born Watson was just 24 — turned him into a hallowed figure in the world of science for decades. But near the end of his life, he faced condemnation and professional censure for offensive remarks, including saying Black people are less intelligent than white people.

    Watson shared a 1962 Nobel Prize with Francis Crick and Maurice Wilkins for discovering that deoxyribonucleic acid, or DNA, is a double helix, consisting of two strands that coil around each other to create what resembles a long, gently twisting ladder.

    That realization was a breakthrough. It instantly suggested how hereditary information is stored and how cells duplicate their DNA when they divide. The duplication begins with the two strands of DNA pulling apart like a zipper.

    Even among non-scientists, the double helix would become an instantly recognized symbol of science, showing up in such places as the work of Salvador Dali and a British postage stamp.

    The discovery helped open the door to more recent developments such as tinkering with the genetic makeup of living things, treating disease by inserting genes into patients, identifying human remains and criminal suspects from DNA samples, and tracing family trees and ancient human ancestors. But it has also raised a host of ethical questions, such as whether we should be altering the body’s blueprint for cosmetic reasons or in a way that is transmitted to a person’s offspring.

    “Francis Crick and I made the discovery of the century, that was pretty clear,” Watson once said. He later wrote: “There was no way we could have foreseen the explosive impact of the double helix on science and society.”

    Watson never made another lab finding that big. But in the decades that followed, he wrote influential textbooks and a best-selling memoir and helped guide the project to map the human genome. He picked out bright young scientists and helped them. And he used his prestige and contacts to influence science policy.

Watson died in hospice care after a brief illness, his son said Friday. His former research lab confirmed he died a day earlier.

    “He never stopped fighting for people who were suffering from disease,” Duncan Watson said of his father.

    Watson’s initial motivation for supporting the gene project was personal: His son Rufus had been hospitalized with a possible diagnosis of schizophrenia, and Watson figured that knowing the complete makeup of DNA would be crucial for understanding that disease — maybe in time to help his son.

    He gained unwelcome attention in 2007, when the Sunday Times Magazine of London quoted him as saying he was “inherently gloomy about the prospect of Africa” because “all our social policies are based on the fact that their intelligence is the same as ours — where all the testing says not really.” He said that while he hopes everyone is equal, “people who have to deal with Black employees find this is not true.”

    He apologized, but after an international furor he was suspended from his job as chancellor of the prestigious Cold Spring Harbor Laboratory in New York. He retired a week later. He had served in various leadership jobs there for nearly 40 years.

    In a television documentary that aired in early 2019, Watson was asked if his views had changed. “No, not at all,” he said. In response, the Cold Spring Harbor lab revoked several honorary titles it had given Watson, saying his statements were “reprehensible” and “unsupported by science.”

    Watson’s combination of scientific achievement and controversial remarks created a complicated legacy.

    He has shown “a regrettable tendency toward inflammatory and offensive remarks, especially late in his career,” Dr. Francis Collins, then-director of the National Institutes of Health, said in 2019. “His outbursts, particularly when they reflected on race, were both profoundly misguided and deeply hurtful. I only wish that Jim’s views on society and humanity could have matched his brilliant scientific insights.”

    Long before that, Watson scorned political correctness.

    “A goodly number of scientists are not only narrow-minded and dull, but also just stupid,” he wrote in “The Double Helix,” his bestselling 1968 book about the DNA discovery.

    For success in science, he wrote: “You have to avoid dumb people. … Never do anything that bores you. … If you can’t stand to be with your real peers (including scientific competitors) get out of science. … To make a huge success, a scientist has to be prepared to get into deep trouble.”

    It was in the fall of 1951 that the tall, skinny Watson — already the holder of a Ph.D. at 23 — arrived at Britain’s Cambridge University, where he met Crick. As a Watson biographer later said, “It was intellectual love at first sight.”

    Crick himself wrote that the partnership thrived in part because the two men shared “a certain youthful arrogance, a ruthlessness, and an impatience with sloppy thinking.”

    Together they sought to tackle the structure of DNA, aided by X-ray research by colleague Rosalind Franklin and her graduate student Raymond Gosling. Watson was later criticized for a disparaging portrayal of Franklin in “The Double Helix,” and today she is considered a prominent example of a female scientist whose contributions were overlooked. (She died in 1958.)

    Watson and Crick built Tinker Toy-like models to work out the molecule’s structure. One Saturday morning in 1953, after fiddling with bits of cardboard he had carefully cut to represent fragments of the DNA molecule, Watson suddenly realized how these pieces could form the “rungs” of a double helix ladder.

    His first reaction: “It’s so beautiful.”

    Figuring out the double helix “goes down as one of the three most important discoveries in the history of biology,” alongside Charles Darwin’s theory of evolution through natural selection and Gregor Mendel’s fundamental laws of genetics, said Cold Spring Harbor lab’s president, Bruce Stillman.

    Following the discovery, Watson spent two years at the California Institute of Technology, then joined the faculty at Harvard in 1955. Before leaving Harvard in 1976, he essentially created the university’s program for molecular biology, scientist Mark Ptashne recalled in a 1999 interview.

    Watson became director of the Cold Spring Harbor lab in 1968, its president in 1994 and its chancellor 10 years later. He made the lab on Long Island an educational center for scientists and non-scientists, focused research on cancer, instilled a sense of excitement and raised huge amounts of money.

    He transformed the lab into a “vibrant, incredibly important center,” Ptashne said. It was “one of the miracles of Jim: a more disheveled, less smooth, less typically ingratiating person you could hardly imagine.”

    From 1988 to 1992, Watson directed the federal effort to identify the detailed makeup of human DNA. He created the project’s huge investment in ethics research by simply announcing it at a news conference. He later said it was “probably the wisest thing I’ve done over the past decade.”

    Watson was on hand at the White House in 2000 for the announcement that the federal project had completed an important goal: a “working draft” of the human genome, basically a road map to an estimated 90 percent of human genes.

    Researchers presented Watson with the detailed description of his own genome in 2007. It was one of the first genomes of an individual to be deciphered.

    Watson knew that genetic research could produce findings that make some people uncomfortable. In 2007, he wrote that when scientists identify genetic variants that predispose people to crime or significantly affect intelligence, the findings should be publicized rather than squelched out of political correctness.

    James Dewey Watson was born in Chicago on April 6, 1928, into “a family that believed in books, birds and the Democratic Party,” as he put it. From his birdwatcher father he inherited an interest in ornithology and a distaste for explanations that didn’t rely on reason or science.

    Watson was a precocious child who loved to read, studying books like “The World Telegraph Almanac of Facts.” He entered the University of Chicago on a scholarship at 15, graduated at 19 and earned his doctorate in zoology at Indiana University three years later.

    He got interested in genetics at age 17 when he read a book that said genes were the essence of life.

    “I thought, ‘Well, if the gene is the essence of life, I want to know more about it,’” he later recalled. “And that was fateful because, otherwise, I would have spent my life studying birds and no one would have heard of me.”

    At the time, it wasn’t clear that genes were made of DNA, at least for any life form other than bacteria. But Watson went to Europe to study the biochemistry of nucleic acids like DNA. At a conference in Italy, Watson saw an X-ray image that indicated DNA could form crystals.

    “Suddenly I was excited about chemistry,” Watson wrote in “The Double Helix.” If genes could crystallize, “they must have a regular structure that could be solved in a straightforward fashion.”

    “A potential key to the secret of life was impossible to push out of my mind,” he recalled.

    In the decades after his discovery, Watson’s fame persisted. Apple Computer used his picture in an ad campaign. At conferences, graduate students who weren’t even born when he worked at Cambridge nudged each other and whispered, “There’s Watson. There’s Watson.” They got him to autograph napkins or copies of “The Double Helix.”

A reporter asked him in 2018 if any building at the Cold Spring Harbor lab was named after him. No, Watson replied, “I don’t need a building named after me. I have the double helix.”

    His 2007 remarks on race were not the first time Watson struck a nerve with his comments. In a speech in 2000, he suggested that sex drive is related to skin color. And earlier he told a newspaper that if a gene governing sexuality were found and could be detected in the womb, a woman who didn’t want to have a gay child should be allowed to have an abortion.

    More than a half-century after winning the Nobel, Watson put the gold medal up for auction in 2014. The winning bid, $4.7 million, set a record for a Nobel. The medal was eventually returned to Watson.

    Both of Watson’s Nobel co-winners, Crick and Wilkins, died in 2004.

    ___

    Ritter is a retired AP science writer. AP science writers Christina Larson in Washington and Adithi Ramakrishnan in New York contributed to this report.

    ___

    The Associated Press Health and Science Department receives support from the Howard Hughes Medical Institute’s Department of Science Education and the Robert Wood Johnson Foundation. The AP is solely responsible for all content.

  • Rachel Ruysch’s Tirade of Beauty at Boston’s MFA

    Rachel Ruysch, Posy of Flowers with a Beetle on a Stone Ledge, 1741. Oil on canvas. Courtesy the Museum of Fine Arts, Boston.

Craving ever new varieties in nature for experimentation, Darwin wrote to his good friend and botanist, Joseph Hooker, “I have a passion to grow orchid seeds…for love of Heaven favour my madness & have some lichens or mosses scraped off & sent me. I am a gambler & love a wild experiment.” It seems that Darwin was not the only one to crave exotic flowers. More than two centuries earlier, the Dutch set out to expand their imperial power by collecting exotic specimens from all over the world. The Dutch East India Company was established in 1602 and the Dutch West India Company in 1621, enabling the empire’s expansion through their maritime fleets. Using enslaved labor, they amassed huge collections of flowers, insects, reptiles and birds from North and South America, Africa, Australia, India and even Borneo. Transporting these delicate specimens across vast oceans was extremely difficult: there were rats on board the ships, and radical changes of temperature going from the tropics to frigid Europe. The Dutch gardens and greenhouses at the Cape of Good Hope were a stopover for the exotics before the last treacherous sail home, through some of the deadliest seas on Earth.

During the 1600s in the Netherlands, hundreds of devoted scientists and artists documented these discoveries. One of the most famous was the painter Rachel Ruysch. Her father, Frederik Ruysch, a renowned anatomist and collector, was known for his anatomical, zoological and botanical specimens, as well as his embalming technique. This was Rachel’s early laboratory until she went on to study painting, eventually becoming the highest-paid painter in the Netherlands, earning more money than Rembrandt.

Born in 1664, Ruysch painted for seven decades, from the age of 15 until well into her 80s, dying in 1750 at the age of 86. She left 185 known works (possibly as many as 250). She was lauded in her time, internationally famous and the subject of poems. Lest we forget, Ruysch also had ten children. None of the poems mentions that.

    And her paintings are downright gorgeous. The vitality of her work, the meticulous accuracy, the fullness of color and the enchanting compositions are a wonder to behold. She painted nature in all its blooming, populated with exotic flowers, fruits, insects, reptiles, moths and butterflies. The paintings are rich in vibrant color, deeply shaded and with exact anatomical precision. She recorded for the ages flora and fauna, insects and reptiles, that may now already be extinct or on their way to extinction.

An oil painting depicts a woman artist, believed to be Rachel Ruysch, seated at a table with a palette and brushes as she delicately arranges a flower beside an open botanical book, emphasizing her dual role as painter and scientific observer.
    Michiel van Musscher, Rachel Ruysch, 1692. Oil on canvas. Courtesy the Museum of Fine Arts, Boston.

The MFA in Boston is displaying 35 of Ruysch’s paintings in all their glory in “Rachel Ruysch: Artist, Naturalist, and Pioneer.” In the floral still lifes, she focuses not just on the blooms but also on the creatures that populate the flowers. In Forest Recess with Flowers (1686), the blooms are framed in loping, draping milk thistle leaves, almost like reptilian skin. A curling mushroom below, a frog, a snail, moths, a tree trunk, the clay forest floor: these details lift her far beyond a flower painter into a deep and astute scientific observer.

In 1714, she paints a still life with 25 species from 15 botanical families of flowers and fruit. Still Life with Fruits and Flowers displays a profusion of pomegranates, peaches, corn, wheat, grapes, squash and pumpkin, along with tulips, peonies, a lizard, butterflies and moths. You wonder how long it took her to paint these bounties before decay set in. Everything is fresh, glistening, delicious, fragrant, alive: a sumptuous, irresistible feast, shared with the hungry reptiles and insects.

She doesn’t stop there. In Still Life of Exotic Flowers on a Marble Ledge (1735), she paints 36 species from around the world. Represented are flowers native to North and South America, South Africa, the Caribbean, and East and Southeast Asia. Across her many paintings she includes 17 species of diurnal butterflies (active during the day), 24 species of moths, spiders, and many species of bees and beetles, including the mango longhorn beetle from South America. There are lizards and birds and eggshells, and many plants in the cactus family. A painting technique prevalent in nature paintings during her early career was lepidochromy: butterfly wings were pressed into the wet paint for further authenticity. Ruysch often placed exotic and native animals, butterflies and flowers together, always with an astute eye for composition.

A densely detailed still life painting shows an overflowing arrangement of flowers, fruits, and plants—such as tulips, peonies, grapes, peaches, and pomegranates—intermixed with insects and small animals, illustrating the abundance and scientific precision characteristic of Rachel Ruysch’s work.
    Rachel Ruysch, Still Life with Fruits and Flowers, 1714. Oil on canvas. © Kunstsammlungen und Museen Augsburg / Photo: Bayerische Staatsgemäldesammlungen, Nicole Wilhelms / Courtesy Museum of Fine Arts, Boston

She also included frogs and toads. One, a Surinam toad (Pipa pipa), gets a portrait all to herself. The entire painting is dark green and brown, hard to see; does it need cleaning? The toad is accompanied nearby by a specimen in a glass jar, the better to see the indentations in her back where the fertilized eggs are embedded. The eggs incubate in these small craters on her back until they hatch, fully formed.

    The curator, Anna Knaap, has organized the exhibit into six luxurious sections, highlighted against sumptuously painted dark, rich burgundy and deep green walls. In the sections are specimens in glass jars of reptiles, cases of pinned butterflies and moths, maps of the empire, botanical drawings, as well as paintings by her sister Anna Ruysch and many other Dutch painters of that time. The plant and insect specimens are from Harvard University’s Herbarium and Museum of Comparative Zoology.

Ruysch’s last painting, Posy of Flowers with a Beetle on a Stone Ledge (1741), is comparatively small, with very few flowers. The bowl of the pink peony is flecked with dew and visited by a bee. It is a tender, luminous painting. To see an exhibition including all three giants (Darwin, Ruysch and Emily Dickinson, another lover of botany and flowers) would be exciting. As Dickinson wrote in “Flowers – Well – if anybody”:

    Butterflies from St. Domingo
    Cruising round the purple line—
    Have a system of aesthetics—
    Far superior to mine.

“Rachel Ruysch: Artist, Naturalist, and Pioneer” is at the Museum of Fine Arts, Boston, through December 7, 2025. An excellent, comprehensive, award-winning catalogue accompanies the exhibition.

    Dian Parker

  • Paradigm Shifts: A Complete Change in Worldview

    Discover the power of paradigm shifts in driving individual and societal transformation, from overcoming cognitive dissonance to fueling scientific revolutions.


    When’s the last time you changed your mind about something?

Many people are stuck in their beliefs and worldview, especially past a certain age. Our map of reality is shaped most by early life experiences, including lessons learned from parents, teachers, and friends.

    A worldview can be hard to break out of on a purely psychological level.

Once we are set in a view, we seek information that confirms our existing beliefs, looking only at sources that already agree with us. When new information contradicts those beliefs, we can easily ignore it or distort it to keep our map of reality intact.

Accepting that we are wrong about something can hurt our ego and pride, and in many ways our brains are designed to protect us from this discomfort by simply ignoring contradictory information unless it has a real-world effect on our lives. As Philip K. Dick once said, “Reality is that which, when you stop believing in it, doesn’t go away.”

The average person isn’t primarily driven by a search for truth; they just need a map of reality that is good enough to navigate their lives effectively and keep them out of too much trouble, including the social pressure to conform to certain beliefs or stay silent about others.

People can go through radical changes in beliefs, though. Young adults and teenagers may go through “phases” as they come of age, questioning what they’ve been taught, rebelling against orthodoxy, and searching for their own meaning or purpose in life. These transformative years can lead to paradigm shifts that last a lifetime, such as adherence to new political, religious, or philosophical ideologies. Many may still return to their old beliefs later in life, but with a fresh perspective.

Learning about a new worldview, ideology, or philosophy doesn’t mean you need to adopt it, and it doesn’t necessarily lead to a paradigm shift. Oftentimes, learning about radically different belief systems gives us a firmer understanding of our current beliefs. There’s wisdom in studying worldviews you find wrong or mistaken; at the very least, it will give you a better understanding of where other people are coming from.

Paradigm shifts aren’t just new or updated knowledge; they represent a complete change in your perspective that makes you see and interpret old knowledge in a different way.

This shift in perspective can be jolting and uncomfortable at first. We depend on worldviews to make sense of reality, so deep changes in perspective can make reality feel confusing or unstable for a time.

    We often need to re-evaluate old knowledge and experiences through a new lens, and re-integrate them into a new and better map of reality. This is a mental shift that can sometimes take months or years before it is fully developed.

    My Paradigm Shifts

My mind has changed a lot over the past decade, which hopefully is a sign that I’m learning and growing. When I first started this website over 15 years ago, my worldview was very different from what it is today.

    A few ways my mindset has changed:

• Less Individualistic – During my college years, I explored a lot of libertarian philosophy that emphasized the individual over the collective. This is a common starting point in many “self-help” circles too, which have an ethos of “take responsibility” and “pull yourself up by your bootstraps.” While I still believe strongly in individual responsibility and initiative, I’ve come to embrace the “no man is an island” mantra and focus more on the importance of social support, community-mindedness, and asking for help. This understanding has led to changes in my political and economic views too.
• Less Materialistic and Money-Focused – It’s a bit embarrassing looking back on it, but I used to want to be rich and famous. I think a lot of that is just part of America’s narcissistic culture, where everyone strives to become some type of celebrity. As I’ve gotten older, I’ve discovered new core values that have helped me focus on the more important things in life. I’ve also learned that much of my drive for money was really a drive for independence, and those aren’t the same thing. A person can make a lot of money yet be trapped in a career that sustains a luxurious lifestyle, while a person of more modest fortune, who can be happy with less, often has more independence to focus on other things in life. That was a counterintuitive idea for me that took a while to process.
• Focus on Social and Cultural Forces – When I was younger, likely as a product of my libertarian days, I focused more on the importance of economics than of culture. I saw things like music, art, and film as peripheral aspects of society, but now I’m beginning to understand their central importance. Every culture reflects and propagates a certain set of values, and a culture that promotes harmful and destructive values will lead to a harmful and destructive society. When I look at today’s world, I see a lot of cultural forces going in the wrong direction. I’m not pro-censorship in any way, but I find many aspects of our culture need to be analyzed, criticized, and abandoned if they are hurting the happiness and health of a people.

    This is how my mindset has shifted over the years – and my mind will likely keep changing as long as I stay open to new information, new knowledge, and new experiences. At this point, most of my learning has happened outside of school and that’s a path I will continue on for the rest of my life.

    The Structure of Scientific Revolutions

One of the most popular discussions of paradigm shifts is Thomas Kuhn’s 1962 book The Structure of Scientific Revolutions.

Kuhn argues that scientific progress isn’t just an accumulation of facts, which he calls “normal science,” but also includes periods of “revolutionary science,” when anomalies are discovered that force scientists to look at a field in a completely new way.

    Common examples of paradigm shifts in science include:

• The Copernican Revolution in the 16th century, which replaced geocentrism (“the earth is the center of the universe”) with heliocentrism (“the sun is the center of the solar system”)
• Newtonian physics in the 17th century, when the classical mechanics formulated by Isaac Newton replaced earlier models of Aristotelian physics.
    • Darwin’s theory of evolution and natural selection in the 19th century, which changed how humans viewed themselves in relation to animals and nature.

Often there is initial resistance to new paradigms, which can pass through heated periods of controversy and criticism among contemporary scientists and laymen.

However, once these new paradigms are adopted, they allow for research and discoveries into new phenomena, ultimately expanding the boundaries of science and learning.

    New paradigms completely change how a scientific field is looked at. Thomas Kuhn used the example of the duck-rabbit optical illusion to demonstrate how new paradigms can change how we see old information:

    duck-rabbit optical illusion

    A duck or rabbit? It depends on your perspective.

New paradigms can take a while to be fully adopted. Old facts need to be looked at through a new lens. New books, research, studies, lectures, and textbooks need to be rewritten from the new perspective, leading to a kind of cognitive restructuring of society. The philosopher Immanuel Kant referred to the advances of Greek mathematics and Newtonian physics as “revolutions in thinking,” and such revolutions take time to process.

Generally, new scientific paradigms are better than old ones because they have more power to explain natural phenomena and predict future outcomes.

    The best measure of scientific truth is its predictive power: if a new paradigm fails to better explain or predict a natural occurrence over a previous paradigm, then there’s no real point in replacing the old model (from a scientific perspective).

    Paradigm Shifts: An Antidote to Cognitive Dissonance

Paradigm shifts are spurred on when new facts don’t fit into old worldviews. This produces cognitive dissonance, the discomfort of holding two contradictory beliefs at the same time.

Often the only way to reconcile this disconnect between facts and experience is to find a completely new paradigm that accounts for all old and new knowledge. This may require admitting wrong or mistaken beliefs from your past, or cultivating a worldview with more complexity and nuance.

Cognitive dissonance is a painful experience that most people choose to ignore or avoid. Many people double down on wrong beliefs when they are passionately invested in them, which leads to excessive confirmation bias, and to conspiracy theories when those beliefs continue unchecked.

At the same time, cognitive dissonance can be a catalyst for change: it’s a signal that we need to adjust our understanding of reality. It can become a real avenue for transformative thinking as long as you are honest with yourself, seek out diverse sources of information, and stay open-minded enough to see things in a new light.

    Conclusion

    Paradigm shifts are a part of learning and growing on both an individual and societal level. They are necessary for both radical self-improvement and radical scientific progress.

While it’s important not to “change your mind just for the sake of changing your mind,” honest searches for knowledge and truth inevitably run up against walls that require a paradigm shift to get over and move on to the next stage.


    Steven Handel

  • WTF Fun Fact 13544 – What Darwin Ate

    You might assume that Charles Darwin, the famed naturalist, was a vegetarian since he was so enamored with living creatures, but he was just the opposite – in fact, Darwin ate some of his discoveries.

    During his journey on The Beagle, he indulged in an array of exotic meats – from puma, which he found “remarkably like veal in taste,” to armadillos and iguanas.

His curiosity even led him to taste the bladder contents of a giant tortoise. Darwin’s palate wasn’t just adventurous; it was scientific. He was known for eating the specimens he was studying and trying to describe.

    Modern Biologists Follow Suit

    This gastronomic curiosity didn’t end with Darwin. Many modern scientists continue to eat their study subjects, either out of convenience (as with those researching edible plants and animals like trout or blueberries) or driven by sheer curiosity. From bluegill and sea urchin to more peculiar choices like beetles and cicadas, the range of their dietary experiments is vast.

    Notably, Richard Wassersug, while conducting a study on the palatability of tadpoles in the 1970s, had graduate students (bribed with beer) taste but not swallow various tadpole species. This experiment, now impossible to conduct due to ethical restrictions, showed that easy-to-catch tadpoles often tasted worse. Wassersug himself described the taste of toad tadpoles as “astonishingly bitter.”

    The Drive Behind Why Darwin Ate an Unusual Diet

    The motivation behind these gastronomic explorations varies. Sometimes it’s an academic pursuit, as in Wassersug’s study. Other times, it’s a quest to manage invasive species, turning them from pests into menu items. Sarah Treanor Bois, during her Ph.D. research on invasive plants, attended a cook-off featuring dishes made from invasive species like nutria and bullfrog legs. Eating invasives is not just about satiating curiosity but also about drawing attention to ecological problems.

    However, the most common reason cited for these unusual diets is pure scientific curiosity. Robert Thorson, a geologist, once tasted 30,000-year-old meat from a giant steppe bison found in permafrost. His verdict? It was stringy and flavorless, with a “pungent rankness.”

    Scientists’ Gastronomic Adventures

    Why are scientists so inclined towards tasting their research subjects? Mark Siddall, a leech expert, believes it’s about familiarity. Just as an omnivore eats chicken, beef, or pork, scientists consume what they’re familiar with. To a biologist, an organism they’ve studied extensively may not seem so different from regular food. Richard Wassersug views it as a part of being a naturalist. To fully understand and connect with nature, one must engage all senses, including taste.

    It’s not just about curiosity but also about a sense of community and perhaps a bit of competitiveness among scientists. The stories of Darwin and others set a precedent, and many modern scientists feel compelled to follow in their footsteps, driven by peer or ‘beer’ pressure.


    Source: “Dining Like Darwin: When Scientists Swallow Their Subjects” — NPR


  • Today in History: December 27, Soviets take Afghanistan



    Today in History

    Today is Tuesday, Dec. 27, the 361st day of 2022. There are four days left in the year.

    Today’s Highlight in History:

    On Dec. 27, 1979, Soviet forces seized control of Afghanistan. President Hafizullah Amin (hah-FEE’-zoo-lah ah-MEEN’), who was overthrown and executed, was replaced by Babrak Karmal.

    On this date:

    In 1822, scientist Louis Pasteur was born in Dole, France.

    In 1831, naturalist Charles Darwin set out on a round-the-world voyage aboard the HMS Beagle.

    In 1904, James Barrie’s play “Peter Pan: The Boy Who Wouldn’t Grow Up” opened at the Duke of York’s Theatre in London.

    In 1932, New York City’s Radio City Music Hall first opened.

    In 1945, 28 nations signed an agreement creating the World Bank.

    In 1958, American physicist James Van Allen reported the discovery of a second radiation belt around Earth, in addition to one found earlier in the year.

    In 1985, Palestinian gunmen opened fire inside the Rome and Vienna airports in terrorist attacks that killed 19 people; four attackers were slain by police and security personnel. American naturalist Dian Fossey, 53, who had studied gorillas in the wild in Rwanda, was found hacked to death.

    In 1995, Israeli jeeps sped out of the West Bank town of Ramallah, capping a seven-week pullout giving Yasser Arafat control over 90 percent of the West Bank’s 1 million Palestinian residents and one-third of its land.

    In 1999, space shuttle Discovery and its seven-member crew returned to Earth after fixing the Hubble Space Telescope.

    In 2001, Defense Secretary Donald H. Rumsfeld announced that Taliban and al-Qaida prisoners would be held at the U.S. naval base at Guantanamo Bay, Cuba.

    In 2002, a defiant North Korea ordered U.N. nuclear inspectors to leave the country and said it would restart a laboratory capable of producing plutonium for nuclear weapons; the U.N. nuclear watchdog said its inspectors were “staying put” for the time being.

    In 2016, Japanese Prime Minister Shinzo Abe (shin-zoh AH’-bay), accompanied by President Barack Obama, visited Pearl Harbor in Hawaii, where he offered his “sincere and everlasting condolences to the souls of those who lost their lives” in Japan’s 1941 attack; Abe did not apologize, but conceded his country “must never repeat the horrors of war again.” Actor Carrie Fisher died in a hospital four days after suffering a medical emergency aboard a flight to Los Angeles; she was 60.

    Ten years ago: An Indian-born man, Sunando Sen, was shoved to his death from a New York City subway platform; suspect Erika Menendez later pleaded guilty to manslaughter and was sentenced to 24 years in prison. (Authorities say Menendez pushed Sen because she thought he was Muslim; Sen was Hindu.) Retired Army general Norman Schwarzkopf, 78, died in Tampa, Florida.

    Five years ago: Freezing temperatures and below-zero wind chills socked much of the northern United States. Houston Astros star second baseman Jose Altuve was named AP Male Athlete of the Year after leading the team to its first World Series title. A power outage struck parts of Disneyland in California, forcing some guests to be escorted from stalled rides.

    One year ago: U.S. health officials cut isolation restrictions for asymptomatic Americans infected with the coronavirus from 10 to five days, and similarly shortened the time that close contacts needed to quarantine; officials said the guidance was in keeping with growing evidence that people with the coronavirus were most infectious in the two days before and the three days after symptoms developed. Defense officials said a U.S. Navy warship, the USS Milwaukee, remained in port in Guantanamo Bay, Cuba, with about two dozen sailors – or nearly a quarter of its crew – testing positive for COVID-19.

    Today’s Birthdays: Actor John Amos is 83. Rock musician Mick Jones (Foreigner) is 78. Singer Tracy Nelson is 78. Actor Gerard Depardieu is 74. Jazz singer-musician T.S. Monk is 73. Singer-songwriter Karla Bonoff is 71. Rock musician David Knopfler (Dire Straits) is 70. Actor Tovah Feldshuh is 69. Journalist-turned-politician Arthur Kent is 69. Actor Maryam D’Abo is 62. Actor Ian Gomez is 58. Actor Theresa Randle is 58. Actor Eva LaRue is 56. Wrestler and actor Bill Goldberg is 56. Bluegrass singer-musician Darrin Vincent (Dailey & Vincent) is 53. Rock musician Guthrie Govan is 51. Musician Matt Slocum is 50. Actor Wilson Cruz is 49. Actor Masi Oka is 48. Actor Aaron Stanford is 46. Actor Emilie de Ravin is 41. Actor Jay Ellis is 41. Christian rock musician James Mead (Kutless) is 40. Rock singer Hayley Williams (Paramore) is 34. Country singer Shay Mooney (Dan & Shay) is 31. Actor Timothee Chalamet is 27.


  • Review: How Meacham’s Lincoln defeated ‘Big Lie’ of his day



    “And There Was Light: Abraham Lincoln and the American Struggle” by Jon Meacham (Random House)

    Fun fact: Feb. 12, 1809, is the birthdate for both Abraham Lincoln and Charles Darwin. While we tend to contemplate “The Great Emancipator” as fully formed well before he became the 16th president, his moral perspectives and political goals developed in a gradual process more akin to Darwin’s theories.

    Jon Meacham’s excellent new biography, “And There Was Light: Abraham Lincoln and the American Struggle,” illuminates how Lincoln’s personal growth and travails enabled him to lead a nation along a fitful evolution toward freedom despite a catastrophic rebellion that denied it. Fueling the national disaster was the “Big Lie” of Lincoln’s day — that slavery was a justifiable institution.

    Meacham does not portray Lincoln’s backstory as mere iconography — the log cabin, the backwoods education, the rail splitting. Rather, this account of his hardscrabble youth is less an any-boy-can-be-president morality tale than a foundation of Lincoln’s personal values and empathy informed by crushing poverty and loss. It is little wonder that Lincoln sought to deliver more fairness in an unfair world.

    The light that powered this desire was the gift of literacy acquired in what Lincoln called “A.B.C. schools” and any books he could hungrily consume thereafter. The darkness of early 19th century America was vividly embodied by enslaved Blacks herded in chains down his native Kentucky roads.

    At 23, Lincoln formally entered the political arena running for office in Illinois to feed his great ambition “of being truly esteemed of my fellow men, by rendering myself worthy of their esteem.” Meacham expertly peels back the historic to reveal the familiar in his coverage of the swirl of politics, largely unchanged to this day.

    The author girds his analysis with a comprehensive survey of the variety of social, political and theological writings that influenced Lincoln and resonate across his career. Keenly attuned to public opinion, Lincoln recognized both in himself and the entire nation two realities — anti-Black prejudice and a passionate desire in the North to abolish slavery. It was the same empathy that recoiled from the brutal practice of slavery that also connected him to the humanity of those who supported it.

    This led to Lincoln’s finely calibrated debates with Stephen A. Douglas in which he called for the status quo limiting slavery unto its eventual end, yet hewed to the stance of abolitionist supporters who otherwise resisted a multiracial, egalitarian society. Lincoln added that slave owners’ unearned wealth created a decidedly un-American class system that disadvantaged poor whites. Douglas was eventually sent to the U.S. Senate to advocate slavery’s expansion and the continuation of unfettered white supremacy.

    The stage set for his White House candidacy under the Republican Party banner, Lincoln won in 1860 with only a plurality of the vote. Before taking office, he grew his trademark whiskers, watched as the South seceded, then took command committed to his official duty to restore the Union, not his personal wish that all men everywhere be made free.

    Buffeted by Confederate victories, impatient abolitionists and South-sympathizing Democrats, all while fearing the loss of the border states, Lincoln’s first term was the American presidency’s greatest tightrope act: incremental policy advances balanced by principle. The victory at Antietam in September 1862 steadied the North, and Lincoln issued his Emancipation Proclamation to add explicitly the cause of freedom to the preservation of the Union.

    Meacham details the messy political caveats that necessarily riddle the more convenient, more heroic Northern narrative. Emancipation was limited. Some states in the northwest sued for peace allowing for the expansion of slavery or even the expulsion of the New England states. A draft to enlarge the army led to rioting. By 1864, fellow Republicans were advising Lincoln to moderate his abolitionist views to get reelected. He convinced them that abandoning emancipation would be worse than losing the presidency.

    Ultimately, it was not virtue but victory — the fall of Atlanta in September 1864 turned Northern skeptics into hawks — that delivered Lincoln a second term. Meacham reveals in his examination of the second inaugural address how Lincoln repurposed the Psalms and the Gospels to capture the moral essence of “this mighty scourge” in which “the prayers of both (sides) could not be answered.” The war, and slavery with it, finally ended only to be tragically punctuated by his assassination.

    An admirer across the Atlantic wrote before the 1864 election that supporters in England observed in Lincoln’s career “a grand simplicity of purpose and patriotism which knows no change and which does not falter.” Meacham’s fine account of America’s greatest president delivers a close-up that captures — wart and all — why Lincoln’s political sensibilities and moral vision were, like the Union itself, indivisible.

    ———

    Douglass K. Daniel is the author of “Anne Bancroft: A Life” (University Press of Kentucky).


  • Weird Facts



    Before marrying his cousin Emma Wedgwood, Charles Darwin carefully considered the pros and cons. The pros won out; they were married from 1839 until his death in 1882 and had ten children.
