ReportWire

Category: Bazaar News

Bazaar News | ReportWire publishes the latest breaking U.S. and world news, trending topics, and developing stories from around the globe.

  • Why Does Snow Squeak When You Step On It?


    While you might not have a winter wonderland on your hands just yet, snow is sure to start falling soon as we get closer to the end of December. And with that comes a pretty important question: Why is snow, which is so quiet when it’s falling out of the sky, so loud on the ground, squeaking, creaking, and crunching under our boots?

    The reason, as it turns out, is science. Snow is made up of ice crystals. While ice is a solid, it actually has a thin (as in, a few nanometers) quasi-liquid layer (QLL) on its surface. Michael Faraday, better known for his work on magnetism and electrochemistry, first suggested this idea in the 1850s. While scientists have confirmed it since then, the origins and many of the characteristics of the QLL are unclear.

    One thing we do know, though, is that the thickness of the QLL depends on temperature. When snow is warmer, the QLL around all those ice crystals is softer. When you step on the snow, you compress the crystals, but the liquid allows them to quietly slide past each other. When snow is colder and the QLL is stronger, there’s more friction between the crystals, and they don’t slide so easily. When you step on colder snow, the crystals rub against each other and also break, making that oh so familiar squeaking sound.

The dividing line between squeaky and non-squeaky snow is around 14°F (-10°C), so if that crunchy noise tends to bother you, that’s worth keeping in mind before you lace up your boots and go for a walk in the snow.

Have you got a Big Question you’d like us to answer? If so, let us know by emailing us at bigquestions@mentalfloss.com.

    A version of this article was originally published in 2016 and has been updated for 2023.

Matt Soniak

  • 8 Tips for Beating a Claw Machine


    Unless you’re small enough to climb inside, grabbing a prize out of a claw machine can be pretty tough. But Los Angeles Times film reporter Jen Yamato and film critic Kim Morgan are very, very good at it: Yamato estimates that she’s nabbed 100 toys from the prize pits of claw machines (which she’s deposited in her car and at her house), and at one point, Morgan says, she had “two large garbage bags overflowing with stuffed animals from just one year. I donated them.”

    Morgan has always been drawn to claw machines, but got really hooked in 2008: “Must be the dumb kid in me that spies an enormous box of stuffed toys,” she says. “A claw? It’s almost something out of the Brothers Grimm … One time I clawed six animals in a row. There was a crowd around me! It was so silly.” Yamato’s obsession with claw games began in her adult life. “I only realized I was good at it because I kept winning stuff and I was keeping track of it on Instagram,” she says. “I’m a professional person most of the time, and it’s one of the only things that I will let myself be completely competitive about. … You get to bask in the glory of holding your bounty high above your head and saying, ‘Yes, I snatched this prize out of this machine! I beat it!’”

    It might seem like fun and games—and, of course, it is. But there’s real skill involved, too. Here are the strategies Morgan and Yamato use to nab a prize.

The first thing you should look at when thinking about playing a claw machine is the prize pit—specifically, how tightly the prizes are packed. “An easy tell is when all of the stuffed animals have been front faced and they’re packed in like sardines,” Yamato says. “That means nobody has jiggled anything loose yet, or maybe an employee has just stuffed them in super tight.” A tightly packed prize pit will make your job a lot harder: “I’m not going to bother playing a machine that is clearly stuffed too tight,” Yamato says. “I won’t be able to reel anything in.”

    Morgan agrees. “If the toys are stuffed so tightly that grabbing is impossible, don’t waste your time,” she says. “I think it’s better to find those weird lone claw machines in places that seem more abandoned—they don’t get stuffed as much. Those are the only places you can win because there’s more room to drag an animal.”

It also pays to watch other people play before you put your money in. “Don’t necessarily watch how they play, but watch how the machine reacts when they play—that information can help you whenever it comes to be your turn,” Yamato says. “I can see if the claw grip is too loose, or if it’s designed to let go or give a jiggle after it grasps something, then I won’t play because I know the odds are definitely against me … unless it’s a really, really sweet toy that I want. Then I’ll spend a little extra time.”

    Which one will you attempt to snag? / Aaron C Photography/Moment Open/Getty Images

    Yamato and Morgan go after the prize that looks the most attainable. “Sometimes, the most desirable prizes are the hardest ones to get,” Yamato says. “Being realistic about what you can win in any given machine will help you win a lot more.”

    “If the pretty pony in the far end, stuffed tightly next to the cute teddy bear, is an impossible option, you’re going to have to settle with the ugly duck/monster thing with red shoes and a cape or whatever the hell it is and live with it,” Morgan says.

    The ideal prize is “sticking out a little bit, isn’t being blocked or obstructed by any other prizes, and isn’t too close to the side,” according to Yamato. (If a prize is leaning against the glass, the claw track won’t allow the claw to get close enough to nab it.) Morgan also advises sticking to prizes that are close to the chute: “Don’t drag something from the very end of the machine,” she says. “That rarely works.”

    Yamato also avoids round or rotund objects. “Those are hard because a lot of the time there’s nothing to grab onto,” she says. Instead, aim for a prize that has some kind of appendage—a head, or an arm or a leg—sticking out: “Something you can get one of the claw prongs under is your best bet, if the angle’s right.”

    After Yamato has picked her prize, she’ll play once, “to test the tensile grip of the claw to see how easily it will hold after it closes,” she says. “A lot of them will jiggle open right after they close, so even if you’ve caught something, it’ll screw you over by opening up the claws a little bit.” If that happens, Yamato says she won’t play again … “probably.”

    In general, it’s easier to play machines that have a three-pronged claw rather than a two-pronged claw: “It’s all about the grip—if the claw has a weak grip, forget it,” Morgan says. “The two-pronged claws seem weaker to me.”

    If you want to win, use the claw to move your toy closer with each play. / Richard Drury/Photodisc/Getty Images

    “One strategy is bumping another animal out of the way to grab another,” Morgan says. She also advises grabbing and dragging a prize closer to the chute to make it easier to grab on your second try.

    Most claw machines drop and grab with one push of a button; some need two pushes—one to drop the claw, another to close it—but that’s rare. Either way, “most machines give you enough time to position your claw, and most of them will let you move it forward and backward and then sideways,” Yamato says. “I usually try to spend most of the time of the clock running down to make sure that I’m exactly above where I want the claw to drop.” Once you’re in the absolute best position, drop it.

    Most machines cost 50 cents to play, so Yamato will put in a dollar. “Maybe half the time I get a prize on my first dollar,” she says. “I’ll usually play a couple of dollars at most before I realize that I should walk away. It’s like gambling—for no monetary gain!”

Grabbing a prize usually takes Morgan a few tries “on good machines,” she says. “On bad machines—and they seem worse now—it takes me about five or 10 times or never. I will not go past 10.”

    In 2015, Vox posted an article that explained how claw machine owners can rig them—but Yamato doesn’t think that’s true for every game. “People might play less because they think every claw machine is rigged to screw them over, but not all claw machines are rigged,” she says. “I always believe that every claw is winnable—it’s just a matter of how much I want to stand there and keep playing if I already know that this particular machine is sort of stuck.” But people should avoid the machines that have money wrapped around the prizes: “In my experience,” Yamato says, “those are usually the ones that are rigged.”

    Morgan, on the other hand, does believe that many of the machines are rigged—which is why she prefers to play machines in places off the beaten path, like in California’s Yucca Valley. “Are they less rigged in the desert? I think so,” she says. “I have incredible luck out there. I always play in the desert.”

    A version of this story ran in 2018; it has been updated for 2023.

Erin McCarthy

  • 12 Fascinating Facts About Zelda Fitzgerald


    Zelda Fitzgerald was a writer, dancer, and Jazz Age celebrity who struggled on and off with mental illness. Her husband, famed writer F. Scott Fitzgerald, called her the first American flapper, and she became a 1920s icon thanks to her vivacious nature and bon vivant lifestyle. Here’s what you need to know about her.

BORN: July 24, 1900, Montgomery, Alabama

DIED: March 10, 1948, Asheville, North Carolina

NOTABLE WORKS: ‘Save Me the Waltz’ (novel)

    Zelda Sayre was born in Montgomery, Alabama, in 1900. Her father, Anthony Dickinson Sayre [PDF], worked as a lawyer, representative in the Alabama state legislature, state senator, city judge, and justice of the Supreme Court of Alabama. Additionally, both Zelda’s great-uncle and grandfather served in the United States Senate.

    In high school, Zelda’s desire to be unconventional and rebellious meant that she smoked, drank alcohol, and snuck out of her parents’ house to spend time with boys. Her friends described her as fearless, daring, and attention-seeking. Later, when she was living with her husband in New York, her carefree spirit and profligate behavior (such as jumping into fountains fully clothed) became a symbol of the 1920s.

    As a child, Zelda had taken ballet lessons, but her interest in dance was renewed in her late twenties while the couple was living in France. Hoping to become a professional ballerina, she took ballet lessons in Paris with Russian dancer Lubov Egorova. Zelda trained obsessively for a few years, spending all day practicing until her dancing dreams ended when she suffered a mental breakdown in 1930.

    F. Scott and Zelda Fitzgerald. / Minnesota Historical Society/GettyImages

Zelda met her future husband—then an officer at nearby Camp Sheridan—at a country club dance in 1918 when she was just 17. According to Zelda Sayre Fitzgerald: An American Woman’s Life, she caught the 21-year-old’s eye while performing a ballet solo, but Zelda initially wasn’t interested; they wouldn’t marry until April 1920, after Scott’s first book, This Side of Paradise, was published. They had a daughter in 1921.

    Their marriage was reportedly a toxic one, complete with alcoholism, mutual infidelity, and jealousy. Zelda accused her husband of having a relationship with his friend and fellow writer Ernest Hemingway, and she had nervous breakdowns throughout their marriage. Although they never divorced, the couple was estranged when F. Scott died in 1940.

    Scott based some of his characters on Zelda, and he adapted his real-life interactions and experiences with her into his novels. He also copied, verbatim, entries from Zelda’s journals and put them into his books, blurring the line between fiction and reality. In a review she wrote for The New York Tribune, Zelda poked fun at her husband, saying that he “seems to believe that plagiarism begins at home.”

On the flip side, Scott dismissed and undermined his wife’s literary ambitions. He criticized her novel Save Me the Waltz, Zelda’s only published work, accusing her of using autobiographical details of their lives that he was going to use in his novel Tender Is the Night and of borrowing a character’s name from one of his early protagonists.

    Throughout the 1930s and 1940s, Zelda was in and out of mental hospitals. Although she was diagnosed with schizophrenia, her fluctuations between depression and mania would most likely get her a bipolar diagnosis today. During her time in these hospitals, Zelda kept herself creatively occupied by writing and painting. She worked on her second novel, called Caesar’s Things, and she painted scenes from Alice in Wonderland, the Bible, and New York locations like Times Square, Washington Square Park, and the Brooklyn Bridge.

The cover of ‘Save Me the Waltz’ by Zelda Fitzgerald. / Culture Club/GettyImages

    Zelda began writing her semi-autobiographical novel, Save Me the Waltz—about a Southern belle named Alabama Beggs who longs to be a ballerina and marries an army-officer-turned-successful-painter—in early 1932 and finished it in under a month after she entered the Phipps Psychiatric Clinic in Baltimore, Maryland, to receive treatment for a breakdown.

    “I am proud of my novel, but I can hardly restrain myself enough to get it written,” she wrote to her husband. “You will like it—It is distinctly École Fitzgerald, though more ecstatic than yours—perhaps too much so.” She sent the manuscript to Scott’s editor, Max Perkins, without showing him first: “Scott being absorbed in his own [novel] has not seen it,” she wrote, “so I am completely in the dark as to its possible merits, but naturally terribly anxious that you should like it.”

When he found out she had sent the manuscript to his editor without showing him first, Scott was furious (the fact that she had used material from their own lives was also a sore point). After getting a look at it, he wrote to Perkins that one section would have to be “radically rewritten,” and though at first she refused to make any revisions, Zelda eventually came around, “changing what was rather flashy and self-justifying ‘true confessions’ that wasn’t worthy of her into an honest piece of work.” Perkins essentially left the novel as it was, but Scott made Zelda edit it down even more before it was published by Scribner’s in October 1932. The publisher had advanced Scott so much money for his own novel that, according to A. Scott Berg in his biography of Perkins, “they arranged for half of Zelda’s royalties to be applied against Scott’s debt until $5000 had been paid back.”

    Unfortunately for them all, Save Me the Waltz didn’t sell well (Zelda earned just $120.73), and was largely panned by critics. “It is not only that her publishers have not seen fit to curb an almost ludicrous lushness of writing,” the New York Times review of the book noted, “but they have not given the book the elementary services of a literate proofreader.”

    After that, Zelda turned to writing plays and exhibiting paintings, but didn’t see success there, either.

    During the 1940s, Zelda worked on writing a novel and lived intermittently in Highland Hospital in Asheville, North Carolina. On March 10, 1948, a fire started in the hospital’s kitchen. Reportedly, Zelda was scheduled for an electroshock therapy session and was sedated and locked in a waiting room. Regardless of where exactly she was, the fire spread through the floors of the building via the dumbwaiter shaft, and Zelda was killed along with eight other women. She was 47.

    In the mid-1980s, Japanese video game designer Shigeru Miyamoto needed a name for his new Nintendo heroine, and Zelda had just the right ring to it. “She was a famous and beautiful woman from all accounts, and I liked the sound of her name,” Miyamoto has said, and thus he called the princess in his fantasy game Zelda. The game was an immediate hit.

    After reading a biography about Zelda, Don Henley of the Eagles wrote the 1972 song “Witchy Woman” about her. It was “an important song for me,” Henley said, “because it marked the beginning of my professional songwriting career.” Describing her as a restless spirit in the song, Henley also referred to her use of absinthe (“she drove herself to madness with a silver spoon”).

In the years since her death in relative obscurity, Zelda has returned to the icon status she enjoyed in her 1920s heyday—and served as inspiration for filmmakers and other writers. The Last Flapper, a play by William Luce based on Zelda’s writings and billed as “the definitive portrait of Mrs. F. Scott Fitzgerald,” premiered in 1990. Natasha Richardson played Zelda in a 1993 TV movie about her life, and Therese Anne Fowler published Z: A Novel of Zelda Fitzgerald, about Zelda’s early life and marriage to Scott, in 2013. The novel was later adapted into Z: The Beginning of Everything, an Amazon series starring Christina Ricci. Jennifer Lawrence and Scarlett Johansson have both been attached to films about Zelda, too.

    In 2013, Jeni’s Splendid Ice Creams offered a limited edition line of ice creams inspired by Zelda. Called The Zelda Collection, the sweet treats came in four flavors meant to reflect Zelda’s life from Alabama to New York to St. Paul, Minnesota (F. Scott’s hometown). The flavors featured were blackberries and sweet cream, cognac and marmalade, dark chocolate rye, and Loveless biscuits and peach jam. Zelda, with her appreciation for delicacies, would likely have approved.

    A version of this story ran in 2016; it has been updated for 2023.

Suzanne Raga

  • 8 Historical Methods of Detecting Pregnancy


    Home pregnancy tests became widely available in 1978, although they took two hours to develop and were accurate for negative results only 80 percent of the time. Nowadays, they can supposedly tell as early as five days before a person’s missed period. Home pregnancy tests work by detecting trace levels of the pregnancy hormone human chorionic gonadotropin (hCG) in urine; hCG is present after egg implantation, which occurs six to 12 days after fertilization, and is secreted by the cells that are beginning to form the placenta.

    Before the invention of this miraculous device, the most reliable test was just to wait and see. But while it might be a nice surprise to find out you’re pregnant the old-fashioned way—vomiting, missing periods, having a baby—people still wanted to know as early as possible whether they were harboring a tiny human.

    So how’d they do it? Weirdly enough, it often comes back to pee.

    Modern people shouldn’t start peeing on random wheat fields. / HAUKE-CHRISTIAN DITTRICH/GettyImages

    One of the earliest, if not the earliest, home pregnancy tests came from ancient Egypt. In 1350 BCE, potentially pregnant people were advised to urinate on wheat and barley seeds over the course of several days; if the wheat sprouted, they were having a girl, and if the barley sprouted, a boy. If neither sprouted, they weren’t pregnant. The most interesting thing about this test was that it actually worked: In 1963, a laboratory experimented with the wheat and barley test and found that 70 percent of the time, the urine of pregnant people would cause the seeds to sprout, while the urine of non-pregnant people did not.

    Onions are for eating, not sticking up your orifices. / Fototeca Storica Nazionale./GettyImages

    While the ancient Egyptians were on to something with the wheat and barley test, they and the ancient Greeks seem to have had a fuzzy understanding of anatomy. Both Egyptian medical papyri and Hippocrates, lauded as the father of medicine, suggested that a person who suspected they might be pregnant insert an onion or other strong-smelling bulbous vegetable into their vagina overnight. If the person’s breath smelled of onions the next morning, they weren’t pregnant; this was based on the idea that if the womb was open, and wafting the oniony scent up to the mouth like a wind tunnel, there was no fetus. If the person were pregnant, then the womb would be closed, so no wind tunnel.

    Whoever came up with this test was missing some key information on how to accurately detect pregnancy. / Heritage Images/GettyImages

    From The Distaff Gospels, a collection of women’s medical lore written in the late 15th century: “My friends, if you want to know if a woman is pregnant, you must ask her to pee in a basin and then put a latch or a key in it, but it is better to use a latch—leave this latch in the basin with the urine for three or four hours. Then throw the urine away and remove the latch. If you see the impression of the latch on the basin, be sure that the woman is pregnant. If not, she is not pregnant.” Say what now?

    A physician examining pee to determine a woman’s pregnancy. / Buyenlarge/GettyImages

As bizarre as the “latch test” sounds, it still recognized that something in a pregnant person’s pee was different from non-pregnant urine, a fact that 16th-century European “piss prophets” also grasped. These so-called experts claimed they could determine whether or not a person was with child by the color and characteristics of their urine. Some also mixed urine with wine and observed the results, a test that might have seen some success, given that alcohol can react to proteins present in a pregnant person’s pee. Of course, these piss prophets didn’t limit their divination to pregnant people; they could also, by examining urine, intuit whether the urine’s owner was suffering from any illness or disease.

    Eyes are not the window to the womb. / Mark Mainz/GettyImages

    One 16th-century physician, Jacques Guillemeau, claimed you could tell by a person’s eyes whether they were pregnant. Guillemeau, author of an influential treatise on ophthalmology, claimed that as early as the second month, “a pregnant woman gets deep-set eyes with small pupils, drooping lids and swollen little veins in the corner of the eye.” That is likely not true, but he was right about one thing: Eyes can change during pregnancy, affecting your vision. This is why it’s not a good idea to get new contacts or prescription glasses during pregnancy.

    Early on in pregnancy, at roughly six to eight weeks, the cervix, labia, and vagina can take on a dark bluish or purple-red hue, owing to the increased blood flow to the area. This remarkable indication of pregnancy was first noticed in 1836 by a French physician. It later became known as Chadwick’s sign, after James Read Chadwick, an obstetrics doctor who brought the discovery up at a meeting of the American Gynecological Society in 1886.

    Pregnancy tests fortunately no longer require killing bunnies. / Tony Evans/Timelapse Library Ltd./GettyImages

    Aside from observational tests such as Chadwick’s sign, pregnancy tests were still an unpleasant crapshoot until the 20th century. Investigation into hormones, the big thing in science at the turn of the century, just made pregnancy testing unpleasant for a bunch of rabbits, mice, and rats.

    In the 1920s, two German scientists, Selmar Aschheim and Bernhard Zondek, determined that there was a specific hormone present in the urine of pregnant people that seemed to be linked to ovary growth; we now know it as human chorionic gonadotropin, or hCG. They figured this out by injecting the urine of pregnant people into sexually immature rabbits, rats, and mice, which would induce ovarian development. Most of the time, the pregnant person’s pee would produce bulging masses on the animals’ ovaries, a sure indication of the presence of hCG. So, the Rabbit Test was born.

According to a contemporary medical journal, it worked like this: A sample of urine was injected into a group of young female mice over a period of five days. On the fifth day, the mice were killed and autopsied to examine the state of their ovaries. If their reproductive bits looked excited, the test was positive. If results were needed in fewer than five days, the lab could simply use more mice.

    This method ran through a lot of rabbits, mice, and rats; though the phrase “the rabbit died” popularly meant that the woman was pregnant, in actuality, all of the rabbits—and the mice and rats—died. Though doctors could look at the ovaries of the animal without killing it, that tended to be too much trouble.

    Frogs can now remain free of human pee. / SEBASTIAN WILLNOW/GettyImages

    Though it worked on the same principle as the Rabbit Test, this one was actually a bit better—at least the animal remained alive at the end of it. By the late 1940s, scientists had determined that when a pregnant person’s pee is injected into a live toad or frog, the unfortunate amphibian will produce eggs within 24 hours. The toad or frog lived to see another day and, usually, another test.

    A version of this story originally ran in 2016; it has been updated for 2023.

Linda Rodriguez McRobbie

  • When Quentin Tarantino Directed an Episode of ‘ER’


    Quentin Tarantino has long maintained that his success as a writer and director stems in large part from his encyclopedic knowledge of film and an adolescence spent absorbing as many frames of it as he possibly could.  

    But Tarantino was also an avid consumer of television. While he was editing 1994’s Pulp Fiction, he would come home, collapse on his sofa, and watch The X-Files and Home Improvement. When he was hoping to cast John Travolta in 1996’s vampire drama From Dusk Till Dawn, he invited the actor over to his house and played the Welcome Back, Kotter board game with him.

    Travolta ultimately ended up in Pulp Fiction; George Clooney, who had broken his losing streak of canceled TV shows in the fall of 1994 with NBC’s hospital drama ER, wound up taking a role alongside Tarantino in From Dusk Till Dawn (which Tarantino wrote and Robert Rodriguez directed). That’s when the actor proposed that Tarantino consider directing an episode of the hit series.

“Motherhood,” the penultimate episode of ER’s first season, aired on May 11, 1995. Written by Lydia Woodward, the episode’s intertwined plots were all related to the title’s theme and given a copious amount of screen blood to mesh with Tarantino’s reputation for unsettling imagery. A boy is wheeled in after a playground mishap with a rod poking through his chest (it’s removed, slowly, with help from a bone saw); a gang member’s ear is lopped off; a drug overdose leads to foaming at the mouth. Woodward says she didn’t really write the episode with Tarantino in mind, though she did try to “gross out” the medical procedures.

    Accustomed to having the final word on film sets, Tarantino told an interviewer he initially found it off-putting that ER executive producer John Wells told him he needed to shoot more footage for a scene he thought he had completed. “Then I realized, this is their show, this isn’t my show,” he said. “In TV, the producer is the man, the auteur.”

Tarantino received $30,000 for the assignment, which probably seemed like a bargain to NBC. Just two months earlier, he had been awarded the Oscar for Best Original Screenplay alongside Pulp Fiction co-writer Roger Avary. Woodward later told a reporter that Tarantino was interested in directing more episodes and possibly even appearing as a recurring character. That didn’t come to pass, though Tarantino did direct “Grave Danger,” a two-part episode of CSI, for CBS in 2005.

Jake Rossen

  • 15 Eerie Facts About Japan’s Suicide Forest


Northwest of the majestic Mount Fuji is Aokigahara, 13.5 square miles of forest so thick with foliage that it’s known as the “sea of trees.” Many visitors have chosen this place as the setting for their final moments, walking in with no intention of ever walking back out—so much so that the area is reportedly the second most common suicide site in the world, after San Francisco’s Golden Gate Bridge.

    In years past, Aokigahara was also believed to contain yūrei, or mythological Japanese ghosts filled with anger and vengefulness. Its grim history made the woods a fitting location for the 2016 horror film The Forest. Here are more facts you might not have known about Japan’s “suicide forest.”

Statistics on Aokigahara’s suicide rates vary, in part because the forest is so lush that some bodies can go undiscovered for years or might be lost forever. Some estimates claim between 30 and 100 people a year take their lives there. However, other sources report that statistics for recent years are unavailable, in part because the Japanese government has stopped releasing numbers in order to prevent future deaths by suicide.

    Self-inflicted death doesn’t carry the same stigma in Japan as it does in other countries. The practice of seppuku—a samurai’s honorable suicide—dates back to Japan’s feudal era. And while the tradition is no longer the norm, “vestiges of the seppuku culture can be seen today in the way suicide is viewed as a way of taking responsibility,” Yoshinori Cho, author of Why do People Commit Suicide? and then-director of the psychiatry department at Teikyo University in Kawasaki, told the Japan Times in 2011.

    A bench in the dense Aokigahara forest / Carl Court/GettyImages

The global financial crisis of 2008 and ensuing economic instability seemed to spur a 15 percent increase in suicides in Japan. The incidence peaked in March 2009, the end of Japan’s fiscal year. While the number of suicides in the country fell in 2021 by 0.4 percent compared to the previous year, rates for women rose and remained high for younger people. And in 2022, suicide rates increased by 2.7 percent, making suicide one of the leading causes of death for men between the ages of 20 and 44, and for women between the ages of 15 and 34.

    In 2017, the Japanese government announced plans to reduce Japan’s suicide rates by 30 percent over the next decade, reducing the number of suicides from 18.5 per 100,000 people in 2015 to 13 per 100,000 people by 2025.

    Part of these measures included posting security cameras at the entrance of Aokigahara and increasing patrols. Suicide prevention counselors and police have also posted signs on various paths throughout the forest that offer messages like “Think carefully about your children, your family.” Another posted sign reads: “Your life is a precious gift from your parents.”

    The forest’s trees organically twist and turn, their roots winding across the forest floor in treacherous threads. Because of its location at the base of a mountain, the ground is uneven, rocky, and perforated with hundreds of caves. But more jarring than its tricky terrain is the feeling of stillness; the trees are too tightly packed for winds to whip through, and the wildlife is sparse. Some visitors also have reported strange phenomena, like compasses breaking, as well as GPS devices and smartphones no longer working (although several visitors have also reported these gadgets working fine for them).


    The sun shines weakly through the dense forest. / Carl Court/GettyImages

    Jumping from a high place is the second most common method of suicide in Japan, according to a 2004 study. The government has raised bridge railings and taken other preventive steps in Aokigahara to curb suicide attempts.

    Mystery author Seichō Matsumoto’s popular 1960 novel Tower of Waves featured a protagonist who dies in the forest, while Wataru Tsurumi’s controversial 1993 work, The Complete Manual of Suicide, called Aokigahara “the perfect place to die.” The manual has been found among the possessions left behind by visitors to the forest.

    Ubasute is a form of euthanasia that translates roughly to “abandoning an old woman.” In this practice—allegedly resorted to in Japan during times of famine—a family decreases the number of mouths to feed by leading an elderly relative to a mountain or similarly remote environment to die from dehydration, starvation, or exposure. Many argue that ubasute was never a real tradition, but a product of folklore, potentially connected to the suicide forest.


    A lost shoe in Japan’s Aokigahara forest. / Carl Court/GettyImages

    Some believe that the ghosts—or yūrei—of those abandoned by ubasute, as well as the mournful spirits of those who took their own lives, still linger in the woods. Folklore claims they are vengeful, dedicated to tormenting visitors and luring those who are sad and lost off the paths.

    Volunteers patrol the area and recover the remains of the deceased. Police and volunteers trek through the sea of trees to bring bodies out of the forest for a proper burial. In the early 2000s, 70 to 100 people’s remains were uncovered each year. More recently, the Japanese government has declined to publicize the number of bodies recovered from the searches, especially amid controversies like the one that erupted in 2017, when YouTuber Logan Paul shared a video of his experiences in Aokigahara.

    Camping is allowed in the area, but police consider visitors who bring tents to be potentially contemplating suicide (visitors who stay for multiple days are believed to be weighing their decisions). People on prevention patrol will gently speak with campers and encourage them to leave the forest.


    Tape marks the way in the Aokigahara forest. / Carl Court/GettyImages

    Volunteers who search the area for bodies and those considering suicide typically mark their way by tying plastic ribbon or tape around trees. This method prevents searchers from losing their bearings after leaving the paths.

    The forest’s soil is rich in magnetic iron, which can disrupt cell phone service, GPS systems, and even compasses. If you get lost, you may not be able to report your emergency—hence the comparatively low-tech plastic tape.

    Local residents lament the lethal reputation the peaceful forest has acquired. Many tourists visit simply to take in gorgeous views of Mount Fuji and visit highlights of the natural landscape, like the distinctive lava plateau, 300-year-old trees, and the enchanting Narusawa Ice Cave.


    Don’t leave the path. / Carl Court/GettyImages

    The internet is littered with disturbing images from the suicide forest, from abandoned personal effects in the undergrowth to human bones. If you dare to venture into this legendary place, do as the signs suggest and stay on the path.

    A version of this story ran in 2016; it has been updated for 2023.


    Kristy Puchko


  • 11 Ways School Was Different in the 1800s



    For most kids in the United States, August marks the start of back-to-school season. They probably aren’t thrilled to start hitting the books again. But taking a look at what American schools were like in the 1800s might convince them how much tougher it could be—and just how good they’ve got it.

    Students and their teacher outside their one-room schoolhouse. / Wikimedia Commons // Public Domain

    In the 19th and early 20th centuries, one-room schoolhouses were the norm in rural areas. A single teacher taught grades one through eight together. The youngest students—called Abecedarians, because they would learn their ABCs—sat in the front, while the oldest sat in the back. The room was heated by a single wood stove.

    All those stories you hear about people having to walk five miles to school, uphill both ways, have a bit of truth to them. Most schoolhouses were built to serve students living within four or five miles, which was considered close enough for them to walk.


    You wouldn’t find a boy in a sewing class like this. / Heritage Images/GettyImages

    At some schools, boys and girls entered through separate doors; they were also kept apart for lessons.

    When the Department of Education first began gathering data on the subject in the 1869–70 school year [PDF], students attended school for about 132 days (the standard year these days is 180) depending on when they were needed to help their families harvest crops. Attendance was just 59 percent. School days typically started at 9 a.m. and wrapped up at 2 p.m. or 4 p.m., depending on the area; there was one hour for recess and lunch, which was called “nooning.”


    Chalk did a lot of heavy lifting. / Heritage Images/GettyImages

    Forget Trapper Keepers and gel pens. In the 19th and early 20th centuries, students made do with just a slate and some chalk [PDF].

    In the monitorial or Lancasterian system, the older, stronger students learned lessons directly from the teacher, then taught the younger, weaker students.


    Not a cell phone in sight. / Heritage Images/GettyImages

    Teachers taught subjects including reading, writing, arithmetic, history, grammar, rhetoric, and geography (you can see some 19th century textbooks here). Students would memorize their lessons, and the teacher would bring them to the front of the room as a class to recite what they’d learned—so the teacher could correct them on things like pronunciation on the spot—while the other students continued to work behind them.

    According to Michael Day at the Country School Association of America, teachers often lodged with their students’ families, a practice called “boarding round” that could have the teacher moving from one student’s house to the next as often as every week. As one Wisconsin teacher wrote of boarding with families in 1851:

    “I found it very unpleasant, especially during the winter and spring terms, for one week I would board where I would have a comfortable room; the next week my room would be so open that the snow would blow in, and sometimes I would find it on my bed, and also in it. A part of the places where I boarded I had flannel sheets to sleep in; and the others cotton. But the most unpleasant part was being obliged to walk through the snow and water. I suffered much from colds and a cough.”


    They knew better than to step out of line. / Heritage Images/GettyImages

    Sure, stepping out of line in the 1800s and early 1900s could result in detention, suspension, or expulsion, but it could also result in a lashing. According to a document [PDF] outlining student and teacher rules created by the Board of Education in Franklin, Ohio, from 1883,

    “Pupils may be detained at any recess or not exceeding fifteen minutes after the hour for closing the afternoon session, when the teacher deems such detention necessary, for the commitment of lessons or for the enforcement of discipline. … Whenever it shall become necessary for teachers to resort to corporal punishment, the same shall not be inflicted upon head or hands of the pupil.”

    Not all places had such a rule, though; in other areas, teachers could use a ruler or pointer to lash a student’s knuckles or palms [PDF]. Other punishments included holding a heavy book for more than an hour and writing “I will not …” do a certain activity on the blackboard 100 times.

    Kids brought their lunches to school in metal pails, and every student drank water from a bucket filled by the older boys, all sharing the same tin cup. That began to change in the early 1900s.


    Education was for the young. / Heritage Images/GettyImages

    In order to graduate, students would have to pass a final exam. You can see a sample of a typical eighth grade exam in Nebraska circa 1895 in this PDF. It includes questions like “Name the parts of speech and define those that have no modifications,” “A wagon box is 2 ft. deep, 10 feet long, and 3 ft. wide. How many bushels of wheat will it hold?,” and “What are elementary sounds? How classified?”

    A version of this story originally ran in 2016; it has been updated for 2023.


    Erin McCarthy


  • 15 Facts About Édouard Manet’s ‘Le Déjeuner sur l’Herbe’ (‘Luncheon on the Grass’)



    Today, Le Déjeuner sur l’Herbe (Luncheon on the Grass) is regarded as 19th-century French artist Édouard Manet’s greatest triumph. But when its unusual take on classical nudes was first displayed, it shocked Paris and earned the artist a reputation as a reckless rebel.  

    In 1515, High Renaissance artist Raphael designed Judgment of Paris. The intricately detailed print originated as a drawing by Raphael that master engraver and collaborator Marcantonio Raimondi then recreated as a print. More than 300 years later, Manet would pull inspiration and poses from the engraving’s lower right corner, where two men lounge with a woman whose elbow is perched on her raised knee. 

    Titian and/or Giorgione, ‘The Pastoral Concert’ (ca. 1510). / Wikimedia Commons // Public Domain

    The mix of clothed men and casually nude women caused quite a stir among prim and proper Parisians, but it was not a new subject. Circa 1510, The Pastoral Concert  (which used to be attributed to Giorgione but is now believed to be an early Titian) famously depicted a similar scene.

    Manet tried to get Le Déjeuner sur l’Herbe accepted by the Paris Salon in 1863, but the casual nudity of the women among clothed men in a public space so stunned the salon’s jury that they refused to display it. Manet was not alone in being snubbed that year—the stingy salon rejected so many artists that Napoleon III created an exhibition for this outcast art. Manet’s misfit masterpiece debuted at the Salon des Refusés (Salon of the Refused) with its fellow failed salon applicants. 

    The secondary salon boasted names familiar to any art fan—including Pissarro, Whistler, and Cézanne—but Manet’s painting was the show’s standout. With its unconventional representation of nudity, the artwork became the subversive salon’s main attraction. But that doesn’t mean it was beloved. It’s said men scooted their wives past the piece as quickly as possible, then doubled back to gawk. Mostly, Manet’s work drew laughter and sneers. The public was scandalized.


    Self-portrait by Édouard Manet / Fine Art/GettyImages

    Nude women had long been the subjects of classical art, but those were generally women meant to represent the divine. In Manet’s Le Déjeuner sur l’Herbe, the women were not goddesses. One’s shed clothes are clearly visible in the lower left corner. And the men in contemporary garb underline Manet’s intention of showing modern, real people in place of fantastical or classical figures. These details made the painting feel sexual in a way classical works did not, leading critics and the public to call the piece obscene. More alarming to detractors, the woman in the foreground even dares to look confrontationally out at the viewer, showing no shame over her nakedness.

    The title Manet gave the painting for its debut supplies a milder explanation for the female nudity. But once the piece sparked controversy over its perceived sexual nature, the artist jokingly nicknamed it “la partie carrée,” which translates loosely to “the foursome.” As has happened with many great works of art, the painting’s name shifted along with the public’s perception of it. 

    One is his brother, Eugène Manet. The other is his future brother-in-law, Dutch sculptor Ferdinand Leenhoff.

    Victorine-Louise Meurent was a popular muse of Parisian painters of the late 1800s. Her nickname, la crevette (the shrimp), referred not only to her petite size but also to her rosy complexion and red hair. She sat for Manet on a number of occasions, appearing in Le Déjeuner sur l’Herbe and eight other pieces.


    ‘Olympia’ by Edouard Manet / Fine Art/GettyImages

    The pair collaborated on another nude that showed the redhead lounging on a white, unmade bed, once more calmly staring down her audience. Again breaking from classical tradition, this character seemed less like a fictional figure or a goddess and more like a flesh-and-blood woman owning her sexuality. When it debuted in 1865, it marked one of the first times a female nude had been presented in such a realistic manner. As with Le Déjeuner sur l’Herbe, the art world was stunned by Olympia.

    Because of the intimacy projected in these pieces, many assumed that Manet and Meurent were lovers, but that was just the tip of the gossip iceberg. A popular reading of both Le Déjeuner sur l’Herbe and Olympia is that the brazenly naked women in the paintings must be prostitutes. This fueled rumors that Meurent herself was a sex worker who had met an alcohol-fueled end at a young age. In truth, she lived to be 83 and earned acclaim outside of Manet’s canvases. 


    ‘Victorine Meurent’ by Édouard Manet / Fine Art/GettyImages

    In 1876, Meurent submitted a self-portrait to the Salon, and it was accepted while Manet’s submission was denied. She would again show at this prestigious venue in 1879, 1885, and 1904. In 1903, she was inducted into the esteemed Société des Artistes Français. Sadly, only one of her paintings has survived. Dating from the 1880s, Palm Sunday can be found on display at Musée Municipal d’Art et d’Histoire de Colombes, northwest of Paris.

    It measures 81.9 inches by 104.5 inches, or nearly 7 feet by 9 feet.

    In the lower left corner, the wrinkled polka-dot dress topped by a toppled basket of fruit, a shiny flask, and a jaunty bonnet prove Manet possessed great mastery of technique. This traditional talent makes his less conventional choices in Le Déjeuner sur l’Herbe all the more compelling.  


    ‘A Bar at the Folies-Bergere’ by Edouard Manet / Fine Art/GettyImages

    In 1882, the Parisian painter offered his last great work, A Bar at the Folies-Bergère. Like Le Déjeuner sur l’Herbe and Olympia, the painting featured a redhead whose eyes face out toward the viewer. The model in the painting was presumed, again, to be a real-life prostitute. Notably, Manet played with perspective here in a way that demanded audiences give the piece a second look, just as his challenging “foursome” did decades before. 

    With Le Déjeuner sur l’Herbe, Manet not only condensed cultural elements from different times, but also painted his backdrop with no dimension, as if it were a theatrical flat. Manet likewise rejected rules of proportion, most noticeably in the size of the woman bathing in the background in comparison to the men before her. At the time, these choices rejected conventional perspectives and caused a lot of head-scratching from Paris’s arts community. But over time, Manet’s rebellious style proved foundational to artists like James Tissot, Claude Monet, Paul Cézanne, and Pablo Picasso.

    Want to see the piece for yourself? Today, Le Déjeuner sur l’Herbe is proudly displayed in Paris’s Musée d’Orsay.

    A version of this story was published in 2016; it has been updated for 2023.


    Kristy Puchko


  • 25 Famous Songs With Misunderstood Meanings



    In the history of the music industry, there are some songs that are pretty straightforward—think Color Me Badd’s “I Wanna Sex You Up,” for example (hey, we didn’t say they had to be good songs). And then you have something like The Beatles’ “I Am the Walrus.” So it’s hardly surprising that once you get beyond the title, there are some song lyrics that are either open to interpretation or just downright confusing.

    Here’s a look at 25 songs that got their meanings twisted and misconstrued—and the original intentions put forth by the artists who wrote them.

    Semisonic frontman Dan Wilson predicted the second life of the band’s only big hit; in 2010, Wilson told The Hollywood Reporter, “I really thought that that was the greatest destiny for ’Closing Time,’ that it would be used by all the bartenders.” But when Wilson penned lyrics like “Time for you to go out to the places you will be from,” the song was less an ode to kicking late-night barflies to the curb than a meditation on the miracle of childbirth.

    In 2010, Wilson admitted to American Songwriter that he had babies on his mind partway through writing Semisonic’s gangbuster breakout hit, stating, “My wife and I were expecting our first kid very soon after I wrote that song. I had birth on the brain, I was struck by what a funny pun it was to be bounced from the womb.”

    When Rolling Stone named the former Beatle’s ubiquitous hit the third-greatest song of all time, John Lennon’s hallmark lyrics were described as “22 lines of graceful, plain-spoken faith in the power of a world, united in purpose, to repair and change itself.” But the feel-good sentiments behind the song Jimmy Carter once said was “used almost equally with national anthems” have some serious communist underpinnings.

    Lennon called the song “virtually the Communist Manifesto,” and once the song became a hit, went on record saying, “Because it’s sugarcoated it’s accepted. Now I understand what you have to do—put your message across with a little honey.”

    “Total Eclipse of the Heart” is the kind of big, bombastic power ballad that could only flow from the pen of frequent Meat Loaf collaborator Jim Steinman. He called the number a “Wagnerian-like onslaught of sound and emotion” in an interview with People, and American Songwriter’s Jim Beviglia christened it a “garment-rending, chest-beating, emotionally exhausting ballad.” It’s also a vampire love song.

    When Steinman featured “Total Eclipse” in his Broadway musical Dance of the Vampires—a flop that lost $12 million—in 2002, he opened up about the song to Playbill, stating, “With ‘Total Eclipse of the Heart,’ I was trying to come up with a love song and I remembered I actually wrote that to be a vampire love song. Its original title was ‘Vampires in Love’ because I was working on a musical of Nosferatu, the other great vampire story. If anyone listens to the lyrics, they’re really like vampire lines. It’s all about the darkness, the power of darkness and love’s place in dark.”

    Entertainment Weekly recognized The Cure’s synth-slathered love song as the 25th greatest love song of all time, but also questioned, “Just what is this scream/laugh/hug inducing trick?” Turns out, the lyric that threw most fans of The Cure for a loop just refers to a sudden shortness of breath.

    The only thing that might be more oblique than the lyrics is frontman Robert Smith’s explanation for the love song’s cryptically esoteric poetry. In a 2003 interview with Blender, Smith said “Just Like Heaven,” inspired by a trip with his girlfriend to Beachy Head in southern England, was “about hyperventilating—kissing and falling to the floor.”

    Smith’s dissection of the song’s opening lines (“Show me, show me, show me how you do that trick”) is less obvious. According to the singer, the line is equal parts a reference to his affinity for performing magic tricks in his youth and “about a seduction trick, from much later in my life.”

    Turns out Mr. Brown (who thinks “Like a Virgin” is “a metaphor for big d**ks”) and Mr. Blonde (“It’s about a girl who is very vulnerable”) both misinterpreted Madonna’s smash hit in the opening scene of Reservoir Dogs. Even though Madonna famously settled the fictional debate by autographing a CD for Quentin Tarantino—“Quentin, it’s about love, not d**k”—“Like a Virgin” is only autobiographical for songwriter Billy Steinberg.

    Not originally meant for a female performer, the lyrics Steinberg penned for “Like a Virgin” tackle his own relationship woes. He explained in depth to the Los Angeles Times: “I was saying … that I may not really be a virgin—I’ve been battered romantically and emotionally like many people—but I’m starting a new relationship and it just feels so good, it’s healing all the wounds and making me feel like I’ve never done this before, because it’s so much deeper and more profound than anything I’ve ever felt.”

    At first blush, the single off Maroon 5’s debut album Songs About Jane seems to be, well, just another song about Jane, the name of a girlfriend with whom lead singer Adam Levine shared a rocky relationship. But though the album’s lead-off single sounds like a racy nod to the jilted lover Levine claimed to be his muse, “Harder to Breathe” stemmed from a different kind of suffocating relationship. The song serves as a bitter indictment of music industry pressures.

    “That song comes sheerly from wanting to throw something,” Levine said in a 2002 interview with MTV. “It was the 11th hour, and the label wanted more songs. It was the last crack. I was just pissed. I wanted to make a record and the label was applying a lot of pressure, but I’m glad they did.”

    Born in the winter of 1959, Bryan Adams would’ve been only 9 during the eponymous summer of one of his best-known hits, released in 1985. But “Summer of ’69” isn’t so much Adams waxing nostalgic over the dog days of 1969 as it is a reference to the sexual position of the same name. In 2008, Adams told CBS News that “a lot of people think it’s about the year, but actually it’s more about making love in the summertime. It’s using ’69 as a sexual reference.”

    Parts of the song are still steeped in hints of truth, though: Adams has gone on record saying that he picked up his second-ever electric guitar at a pawn shop, and that his fingers indeed bled while he was “totally submersed in practicing.” Other facts are indisputably wrong; Adams’s first band, Shock, formed when the singer was 16, and “Summer of ’69” co-writer Jim Vallance stands by the song as a wistful trip in the wayback machine.

    When Georgia natives R.E.M. unleashed their first Top 10 single in concert, the band’s guitar-slinger Peter Buck felt baffled by audiences’ romantic reactions. “I’d look into the audience and there would be couples kissing,” Buck said. “Yet the verse is … savagely anti-love … People told me that was ‘their song.’ That was your song?”

    Singer Michael Stipe echoed Buck’s emotions in a 1992 interview with Q magazine, admitting that he almost didn’t even record the song, calling it “too brutal” and “really violent and awful.” After five years of “The One I Love” going out to loved ones as dedications over the radio waves, Stipe took a complacent stance on his song’s misconstrued fate, saying, “It’s probably better that they think it’s a love song at this point.”

    Radio purists of the ’90s probably missed out on the fact that the upbeat Third Eye Blind anthem is about a couple on a crystal meth binge—the two censor-triggering words in the line “doing crystal meth will lift you up until you break” would get backmasked in an edited version of the song played by radio stations.

    Why make a song about such a serious topic so light and bouncy? Lead singer Stephen Jenkins explained that the musical and lyrical juxtapositions were completely intentional: The music reflects “the bright, shiny feeling you get on speed,” he told Billboard.

    Sorry, urban legend enthusiasts. Tom Petty’s 1977 standard wasn’t inspired by a University of Florida girl who died by suicide. Though the song’s second verse references a girl standing “alone on her balcony” who “could hear the cars roll by out on 441” (a highway that runs near the Gainesville campus), Petty shot down the misunderstanding on numerous occasions.

    In the book Conversations With Tom Petty, the lead Heartbreaker was quoted as saying, “It’s become a huge urban myth down in Florida. That’s just not at all true. The song has nothing to do with that. But that story really gets around.” Heartbreakers’ guitarist Mike Campbell has backed Petty up, stating that some interpretations of the song took the lyrics at face value: “Some people take it literally and out of context. To me it’s just a really beautiful love song.”

    In Round Two of Song Meanings Getting Twisted By Urban Legends, Phil Collins’s first solo single wasn’t written about the singer’s brush with a man who refused point-blank to save a drowning swimmer. And, according to Collins himself, he most definitely didn’t invite the man to stand front row in the concert to be verbally berated by “In the Air Tonight.”

    Instead, the song is simply a tense, introspective look at Collins’s divorce from his first wife. Collins swears by the story that he pulled together the lyrics in a snap during a studio recording session, and laughs off the rumors swirling around the origins of “In the Air Tonight.” He admitted to the BBC that he doesn’t know what the heck the song is actually about, saying, “What makes it even more comical is when I hear these stories which started many years ago, particularly in America, of someone coming up to me and say[ing], ‘Did you really see someone drowning?’ I said, ‘No, wrong’ … This is one song out of all the songs probably that I’ve ever written that I really don’t know what it’s about …”

    At its heart, one of The Clash’s most scathing political statements is less a song about the state of British politics and more a song about Joe Strummer’s personal fear of drowning. In a dissection of “London Calling” published by the Wall Street Journal, Mick Jones mentioned the band’s nervousness regarding a 1979 London Evening Standard headline about the possibility of the Thames overflowing and flooding London. How did The Clash react to the news? According to Jones, “We flipped.”

    That nagging fear of drowning propelled Strummer’s first few drafts of the song’s lyrics, at least until Jones stepped in to broaden the scope until “the song became this warning about the doom of everyday life.” Jones joked about the band’s sink-or-swim anxiety: “We were a bit ahead of the global warming thing, weren’t we?”

    Paul McCartney told Santa Monica radio station KCRW that “It’s not really about a blackbird whose wings are broken, you know, it’s a bit more symbolic.”

    A highlight from the McCartney songbook (and written at his kitchen table in Scotland), Sir Paul penned “Blackbird” about the American Civil Rights Movement, drawing inspiration from the racial desegregation of the Little Rock, Arkansas, school system. Put succinctly by USA Today, “Paul McCartney penned Blackbird about the Black struggle.”

    In a 2008 interview with Mojo, McCartney elaborated on just how enamored The Beatles were with the Civil Rights Movement happening across the pond. “I got the idea of using a blackbird as a symbol for a Black person. It wasn’t necessarily a black ‘bird,’ but it works that way, as much as then you called girls ‘birds’ … it wasn’t exactly an ornithology ditty; it was purely symbolic.”

    A perennial choice for the best prom song, Green Day’s acoustic ballad was originally meant to be anything but a romantic affair. Brooding frontman Billie Joe Armstrong wrote the number about a girlfriend who was moving away to Ecuador, and titled the song “Good Riddance” out of frustration with the breakup.

    Not that the misinterpretation of the ballad as a high school slow dance number fazes Armstrong. As he told VH1’s Behind the Music, “I sort of enjoy the fact that I’m misunderstood most of the time. That’s fine.”

    No list of misunderstood songs is complete without “Born in the U.S.A.” Music critic Greil Marcus believes the use of The Boss’s hit as a rah-rah political anthem fuels its legacy: “Clearly the key to Bruce’s popularity is in a misunderstanding,” he said. “He is a tribute to the fact that people hear what they want to hear.”

    As Songfacts points out, “Most people thought it was a patriotic song about American pride, when it actually cast a shameful eye on how America treated its Vietnam veterans … with the rollicking rhythm, enthusiastic chorus, and patriotic album cover, it is easy to think this has more to do with American pride than Vietnam shame.”

    “Born in the U.S.A.” is the antithesis of the American Dream-chasing optimism that listeners construe the rock number as; the song captures the desperate feelings of a working-class citizen in post-Vietnam America. Springsteen explains that the song’s protagonist is “isolated from the government, isolated from his family, to the point where nothing makes sense.”

    Another song whose meaning was obscured by its party anthem vibes, this Calypso-lite tune featured a delightful (and then annoyingly ubiquitous) call-and-response question that never got answered. Asking who let the dogs out became low-hanging comedy fruit after the song’s release in 2000, which meant most people missed that it was “a man-bashing song.” Songwriter Anslem Douglas said in an interview with Rock Cellar Magazine that it’s a song about a good time being ruined by men catcalling and harassing women. Everyone’s having a great time (Yippie-Yi-Yo), and then jerks start treating women like objects, and it ruins everything (woof, woof, woof). The video for Baha Men’s cover of Douglas’s song featured a ton of literal dogs escaping past a guard at a doggie daycare, further obfuscating its feminist roots.

    The surface-level reading of the flighty pop song is pretty standard: Forbidden young love, the thrills of disappointing your parents, and a boyfriend imploring you to run away with him. Yet, as songwriter/lead singer Johnny Rzeznik explained during the band’s VH1 Storytellers session, there’s an even deeper cause to the couple’s angst. “The song is actually about these two teenage kids, and the girlfriend gets pregnant, and they’re trying to decide whether she should get an abortion, or they should get married, or what should go on,” he said. So it’s not exactly Taylor Swift’s “Love Story.” The explanation brings lyrics about loving “the life you killed,” and the priest being on the phone into sharper clarity.

    There’s an entire subgenre of music whose lyrics are ignored because the instrumentals are so fun, and “Macarena” may be its queen. Not understanding Spanish also gave listeners another reason to gleefully swing their hips while the duo sang about a young woman who cheats on her boyfriend with two of his friends while he’s enlisting in the Army. Not great, Macarena!

    It’s that moment in every rom-com when the budding architect or bakery owner faces some initial rejection from her/his object of affection. Cue Blondie and a montage of personal improvement projects and/or cheeky conspiracy theory hobby board-level planning to win them over. Sadly, our well-meaning soul mate is preparing their romantic overtures to a song about a stalker. Songwriter/lead singer Debbie Harry told EW, “I was actually stalked by a nutjob, so it came out of a not-so-friendly personal event. But I tried to inject a little bit of levity into it to make it more lighthearted. I think in a way that’s a normal kind of survival mechanism.”

Psy’s crowning earworm with its invisible horse dance was Korea’s first massive global musical export, and it came with its very own goofy music video where the elements of Psy’s big-money lifestyle are revealed to be something absurdly pathetic. Without knowing Psy or Korean culture, it’s easy to think he’s simply making fun of himself, but the song and the video are both mocking a specific lifestyle of chasing the appearance of wealth without taking care of your core needs. The hollow commercial attitude is typified in the song by the Gangnam district (think Beverly Hills), where trust-funders eat cheap food in order to afford expensive coffee that they conspicuously down in one sip (instead of savoring). As the song’s satire of pointless materialism reached unprecedented YouTube success, Psy told The Atlantic, “Human society is so hollow, and even while filming [the music video] I felt pathetic.”

You just donated to your local pet shelter. We get it. It’s an uplifting song about finding solace at your lowest point through the comforting arms of an angel, a sweet message carried by Sarah McLachlan’s heavenly voice and soothing piano tones. As it turns out, the “angel” is heroin, or, more broadly, anything someone might use to escape themselves at a low point. McLachlan wrote the song following a brutal two-year stint of touring and recording. She read a Rolling Stone article about Smashing Pumpkins touring keyboardist Jonathan Melvoin overdosing on heroin and felt moved by his struggles with drug addiction.

    Andre 3000 was right when he sang “Y’all don’t wanna hear me/ you just wanna dance” near the end of an incredibly joyful jam with deeply depressing lyrics about the state of modern relationships. His baby loves him! Or maybe she’s just afraid of being alone! Separate’s always better! Why would love be any different if nothing lasts forever? In May 2021, Outkast tweeted a meme with Andre 3000’s face from the music video, labeling a very small portion as “A bop” and a huge portion as “The saddest song ever written.”

For everyone who has made this their first dance at their wedding, James Blunt thinks you’re “f***ed up.” It’s a saccharine-sweet ballad, sure, but it’s also extremely clear from the lyrics that it’s about a creepy dude, high on drugs, reveling in the beauty of a stranger who is with another man, and then despairing that he will never, ever, ever get together with her. Blunt has explained that the stalker dies by suicide in his vision for the song, and the music video even reinforces that subtext, showing Blunt removing his shoes and clothing before jumping off a cliff.

    Not only do people turn on Jimmy Buffett’s classic tune to chill, but the singer/songwriter has also built an entire lifestyle brand around the vibe, complete with restaurants and, naturally, his own tequila. Once again, it’s an incredibly breezy song, tailor-made for sipping and lounging as long as you don’t think about the lyrics too much. The narrator of the song is constantly blackout drunk, getting tattoos he can’t remember, and grappling with who’s responsible for his downfall (a woman … maybe him … definitely him) while “wasting away” so badly he can’t even find the salt, salt, salt.

Whether in its original German language or in English, the happy-pop New Wave jam is easily the most danceable song about a nuclear holocaust caused by balloons. Guitarist Carlo Karges initially got the idea for the song when Mick Jagger released thousands of balloons into the sky at the end of a 1982 Rolling Stones concert in West Berlin. It got him wondering what would happen if the balloons crossed into Soviet airspace, were mistaken for UFOs, and set off a chain reaction of nukes flying around the world. As added depression fuel, the song ends with the narrator finding a single balloon in the ruins of the world, thinking of the listener, and letting it go. Dance on!

    A version of this story ran in 2016; it has been updated for 2023.


    Erik van Rheenen & Scott Beggs


  • Can You Spot the Cat in This Sea of Owls?



    If you’re into hidden image brainteasers, you’re probably familiar with Gergely “Dudolf” Dudás, a Budapest-based artist creating work that’s as challenging to the brain as it is pleasing to the eye. Dudolf’s drawings tend to captivate the attention of the Internet, largely because so many people have trouble spotting the creatures and objects he hides in his work. That’s because puzzles like these play up the human brain’s tendency toward “laziness,” or difficulty in identifying subtle visual differences without investing a bit of time in careful scrutiny.

In this brainteaser, Dudolf has hidden one cat in the middle of dozens of owls in the image below. The colorful critters are all packed together tightly, making it difficult to determine where one owl ends and another begins.

[Image: Dudolf’s puzzle, with a cat hidden among dozens of owls]

    Owl-eyed viewers will need to scan the image closely to find the hidden kitty. It’s a tough puzzle—the beasts share a body shape and boast unusual colors. And because their shapes are so similar, our brains tend to lump them together.

    If you haven’t found the cat yet, here’s a tip: Pay close attention to the animals’ faces.

    Looking for more animal-themed brainteasers? See if you can find the doe hiding among the stags or the mouse concealed amid the squirrels. If you’re still trying to spot the sneaky feline hidden among the owls, check out the image below for the answer. When you’re done, be sure to brush up on your owl facts here.

[Image: the answer, with the cat highlighted among the rows of owls]

    A version of this story originally ran in 2015; it has been updated for 2022.


    Jennifer M Wood


  • Talk is Sheep: Behind the Christmas Eve Myth That Animals Speak at Midnight



    For all its very logical and sensible legends and traditions, Christmas has quite a few strange ones too (like, say, gravity-defying reindeer). Some rare bits of Christmas mythology are even stranger still—like the one that claims that at the stroke of midnight on Christmas Eve, animals gain the power of speech.

[Image: an old painting of horses and ponies in a stable] Beware of fortune-telling horses. / Print Collector/GettyImages

    The legend—most common in parts of Europe—has been applied to farm animals and household pets alike. It operates on the belief that Jesus’s birth occurred at exactly midnight on Christmas Day, leading to various supernatural occurrences. Many speculate that the myth has pagan roots or may have morphed from the belief that the ox and donkey in the Nativity stable bowed down when Jesus was born. In any case, the story has since taken on a life of its own, with different versions ranging from sweet to scary.

    According to The Christmas Troll and Other Yuletide Stories by Clement A. Miles, variations of the legend can be surprisingly sinister for holiday lore. One tells the story of vengeful pets plotting against their masters, like this tale from Brittany:

    “Once upon a time there was a woman who starved her cat and dog. At midnight on Christmas Eve she heard the dog say to the cat, ‘It is quite time we lost our mistress; she is a regular miser. To-night burglars are coming to steal her money; and if she cries out they will break her head.’
    ‘Twill be a good deed,’ the cat replied. The woman in terror got up to go to a neighbor’s house; as she went out the burglars opened the door, and when she shouted for help they broke her head.”

Another tale, this time hailing from the German Alps, features animals foretelling a caretaker’s death. On Christmas Eve, a young farm servant hides in the stables hoping to witness the animals’ speech and overhears an alarming conversation between two horses:

“We shall have hard work to do this day week,” said one horse.
“Yes, the farmer’s servant is heavy,” replied another.
“And the way to the churchyard is long and steep,” said the first.

    The servant dies a few days later, leaving those horses to do some heavy lifting.

A more modern version of the tale first aired on ABC in 1970, and while it’s animated and for children, it’s still surprisingly grim. In the made-for-TV cartoon The Night the Animals Talked, animals gain the power of speech and sing a song exalting their newfound ability—to insult each other: “You can bicker with anyone you hate / It’s great to communicate.”

    By the time the animals realize that they’ve been given the ability in order to spread the message of Jesus’s birth, it’s too late. While running through the streets of Bethlehem, they lose their speech one by one. The ox, last to lose the ability, is left to lament that so many humans seem to waste the gift of speech.

    And then there’s “The Friendly Beasts,” a lighter version of the legend in the form of a Christmas carol. The hymn takes a less literal approach to the “talking animals” theory, instead focusing more on the connection each animal had to Jesus’s birth: “’I,’ said the donkey, shaggy and brown, ‘I carried His mother up hill and down; ‘I,’ said the cow, all white and red, ‘I gave Him my manger for His head,’” and so on with the sheep and dove.

    The song’s origins purportedly lie in a mostly forgotten French medieval feast day, the Fete de L’Ane, or the Feast of the Ass, which honors Mary, Jesus, and Joseph’s flight into Egypt, and the donkey who transported them. The carol was born of an early Latin hymn commonly sung at the feast, “Orientis partibus Adventavit asinus,” or “From the East the ass has come,” which included a chorus of “Hail, Sir donkey, hail!”

[Image: an old illustration of bees] Who needs human carolers when you have singing bees? / Hulton Archive/GettyImages

The variations of Christmas legends about special or supernatural animal behavior are diverse and far-reaching, and not all of them involve animals speaking. In John Howison’s 1821 Sketches of Upper Canada, the author recounts being told by a Native American man that “[It’s] Christmas night and all deer fall upon their knees to the Great Spirit.” William Henderson’s 1879 book Folk-lore of the Northern Counties of England and their Borders recounts the legend that, on Christmas Eve, bees assemble into a type of choir:

    “Thus the Rev. Hugh Taylor writes: ‘A man of the name of Murray died about the age of ninety, in the parish of Earsdon, Northumberland. He told a sister of mine that on Christmas Eve, the bees assemble and hum a Christmas hymn, and that his mother had distinctly heard them do this on one occasion when she had gone out to listen for her husband’s return. Murray was a shrewd man, yet he seemed to believe this implicitly.’”

    In some cases, the myth of the singing bees circles back to that of the kneeling oxen: “[…]In the parish of Whitebeck, in Cumberland, bees are said to sing at midnight as soon as the day of the Nativity begins, and also that oxen kneel in their stalls at the same day and hour.”

    So, singing bees, plotting pets, clairvoyant horses, praying oxen, and more, all to illustrate the power of Christmas Eve—short of supernatural power, it certainly has a strong hold on the collective human imagination.

    A version of this story originally ran in 2015; it has been updated for 2022.


    Maureen Monahan


  • 15 Books Whose Endings Were Changed for the Movie



    So you read the book before you saw the movie. Congrats! Unfortunately—as these examples prove—that doesn’t always make you an expert on what, exactly, is going to unfold on the big screen. (It should go without saying, but this article contains spoilers—lots of them. You’ve been warned!)

Jurassic Park, one of the most popular summer blockbusters of all time, doesn’t completely line up with the events described in Michael Crichton’s best-selling novel of the same name. At the end of the book, the Costa Rican military comes to the rescue by bombing Site A on Isla Nublar. But director Steven Spielberg felt like changing it up. Instead of a military intervention, Spielberg decided to have the T. Rex return to save the protagonists from a Velociraptor attack. “I think the star of this movie is the T. Rex,” Spielberg explained at the time. “The audience will hate me if the T. Rex doesn’t come back for one more heroic appearance.”

The book and movie’s body counts vary, too. By the end of the novel, John Hammond has died, and it is implied that Ian Malcolm has as well. Both survive in the movie. On the other hand, the park’s game warden, Robert Muldoon, and InGen’s attorney, Donald Gennaro, perish in the big screen adaptation, but live on in the book.

    Planet of the Apes (1968) features one of the most iconic twist endings in movie history: Astronaut George Taylor (Charlton Heston) discovers he has been marooned on a post-apocalyptic Earth the entire time. But in La Planète des Singes, the French novel it is based on, the main character—journalist Ulysse Merou—lands on a different planet during the course of his travels, one inhabited by self-aware apes, sentient monkeys, and tribes of dimwitted humans. When Ulysse finally makes it back to Earth, he is shocked to learn that it is now 700 years in the future, and that a similar hierarchy has emerged at home. 

    Twilight Zone creator Rod Serling, who co-wrote the film’s screenplay, was the one who ultimately decided to make the planet of the apes Earth in the distant future. 

Stanley Kubrick based his screenplay on the shortened American version of the British novel by Anthony Burgess, which omitted the final chapter of the book, focused on Alex after he is rehabilitated. Though he grows out of his murderous tendencies in Burgess’s text, in Kubrick’s interpretation, Alex remains as psychotic as ever. Kubrick didn’t like the tale’s original ending; he felt it was entirely too optimistic given the story’s tone and themes. “I think whatever Burgess had to say about the story was said in the book,” Kubrick said. “But I did invent a few useful narrative ideas and reshape some of the scenes.”

    Burgess wasn’t a fan of the final product. “The book I am best known for, or only known for, is a novel I am prepared to repudiate: written a quarter of a century ago,” Burgess later recalled. “It became known as the raw material for a film which seemed to glorify sex and violence. The film made it easy for readers of the book to misunderstand what it was about, and the misunderstanding will pursue me until I die.”

The film version of Fight Club remains faithful to author Chuck Palahniuk’s original plot—until the very end, that is. The movie version wraps up as the narrator, standing beside Marla, watches a series of explosions caused by his now-absent alter ego, Tyler Durden. At the end of the book, however, the narrator wakes up in recovery from his gunshot wound. He thinks he’s in heaven, but Palahniuk makes it clear that he’s actually in a mental institution. Several hospital attendants ask him when he’s going to start Project Mayhem again, implying that Tyler Durden is still very much a part of him.

    Director David Fincher explained his choice by arguing that the book was too devoted to the narrator’s alter ego: “[I] wanted people to love Tyler (Durden), but I also wanted them to be OK with his vanquishing.”

Nathaniel Hawthorne’s novel The Scarlet Letter is an exploration of guilt, punishment, and mob mentality in 17th-century New England. At the end of the classic tale, the townspeople persecuting Hester Prynne learn that the father of her baby is Reverend Dimmesdale, who eventually dies from immense guilt.

    The 1995 film version of The Scarlet Letter opted instead for a happy Hollywood ending (read: no one dies). Instead, Reverend Dimmesdale and Hester Prynne leave their town in order to build a new life together.

    Truman Capote’s beloved novella was also given a simplified and sanitized ending when it was adapted for Hollywood. In the book, Holly Golightly loses her cat and abandons New York for Argentina, and it’s unclear where the free spirit will end up next. The movie, on the other hand, ends with Audrey Hepburn’s Holly reuniting with Cat and sharing a passionate kiss with neighbor Paul. (There’s no romance between them in Capote’s version.)

    Capote wasn’t a fan of the movie based on his work, nor of the casting of Audrey Hepburn. “I had lots of offers for that book, from practically everybody,” Capote said in an interview. “And I sold it to this group at Paramount because they promised things, they made a list of everything, and they didn’t keep a single one.”

    Jodi Picoult’s My Sister’s Keeper tells the story of a young leukemia patient named Kate whose parents conceive another daughter, Anna, in order to have an organ donor for their firstborn. When she turns 13, Anna is asked to donate one of her kidneys to her dying sister. She refuses and sues her parents for medical emancipation. 

In the book, Anna gets into a terrible car accident, and her kidneys are posthumously harvested for Kate, who survives. But for the 2009 adaptation, director Nick Cassavetes chose to reverse the sisters’ fates. Kate ends up succumbing to her illness after she refuses to accept her sister’s organs. Cassavetes believed his movie’s ending was more accurate after he visited pediatric hospitals and talked to terminally ill patients.

    “Going and visiting people in the hospital, this story repeated over and over and over again,” Cassavetes told About.com. “In reality, none of these stories ended like the book did.”

    Stephen King’s The Mist ends on a vague note: A few survivors head toward the source of a mysterious radio transmission as the titular mist creeps around them. But director Frank Darabont decided to give the film a more definitive—and more gut-wrenching—conclusion. David, played by Thomas Jane, comes to realize that the group’s survival efforts are futile. To prevent any further suffering, he kills the remaining survivors, including his son, just before the military shows up and the mist clears. 

    “How primitive do people get?” Darabont said of his new ending. “It’s Lord of the Flies that happens to have some cool monsters in it.” King, for his part, gave the new ending two thumbs-up: “The ending is such a jolt—wham! It’s frightening. But people who go to see a horror movie don’t necessarily want to be sent out with a Pollyanna ending.”

At the end of Dr. Seuss’s The Lorax, the tale’s Once-ler gives the boy the last-ever Truffula seed in the hopes that he’ll be able to grow a new forest. But there’s no room for ambiguity in the story’s 2012 cartoon version: Before the credits roll, new Truffula trees are flourishing and the Lorax has returned to the forest.

    Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb, based on Peter George’s Red Alert, takes a comedic approach to the source material. Instead of narrowly avoiding a nuclear catastrophe at the zero hour (like the book does), Stanley Kubrick decided to blow up the world because of some petty bickering. 

    Originally, Kubrick planned to have everyone in the situation room get into a big pie fight. Ultimately, though, “I decided it was farce and not consistent with the satiric tone of the rest of the film,” he said.

There are some pretty major differences between Forrest Gump’s book and film versions. Though the movie ends with Jenny’s death and shows Forrest raising their child alone, the book wraps up with Forrest starting up his own shrimp business, in memory of his college friend Bubba. (Another key difference: In Winston Groom’s book, Jenny survives, but she marries another man and raises Forrest’s son with him.)

    “[Screenwriter] Eric Roth departed substantially from the book,” director Robert Zemeckis told the Chicago Tribune. “We flipped the two elements of the book, making the love story primary and the fantastic adventures secondary. Also, the book was cynical and colder than the movie. In the movie, Gump is a completely decent character, always true to his word. He has no agenda and no opinion about anything except Jenny, his mother, and God.”

    Groom believed that the movie “took some of the rough edges off” his beloved character. In fact, he was so unhappy with the film that he started the book’s sequel, Gump and Co., with Forrest telling readers, “Don’t never let nobody make a movie of your life’s story.”

Who Censored Roger Rabbit?, the inspiration for Who Framed Roger Rabbit, is a surprisingly dark murder mystery. In the novel, Roger hires Detective Eddie Valiant to figure out why Rocco DeGreasy, the man who has the cartoon rabbit under contract, hasn’t given him his own comic strip. During Valiant’s investigation, Roger Rabbit is murdered and his wife Jessica is framed. Valiant spends the rest of the story trying to figure out who killed Roger. (The book ends with the revelation that a mysterious genie is the culprit.)

    Although there’s still a murder at the center of the 1988 movie version—this time, Toontown owner Marvin Acme is the victim—Disney and Touchstone Pictures gave the entire story an overhaul when the company acquired the film rights from author Gary K. Wolf. The studios hoped to make a family-friendly blockbuster in order to rejuvenate their flagging animation department, and saw Who Censored Roger Rabbit? as a means to that end.

In 2007, Will Smith starred in the film adaptation of Richard Matheson’s I Am Legend as Dr. Robert Neville, a survivor of a worldwide plague that turns humans into infected, vampire-like creatures. The book ends with Dr. Neville, who spends his days slaying the infected to protect himself, learning that he’s considered a monster to the creatures who are now the dominant race on the planet. He’s imprisoned and later executed for his crimes. In the movie, however, Neville solidifies his hero status by handing off a cure for the virus ravaging the planet to a healthy woman and boy.

An alternate ending that showed more interaction between Neville and the creatures was shot, but the filmmakers opted to go with an ending in which Neville sacrifices himself for the sake of the human race.

The first Rambo movie is based on the novel First Blood by author David Morrell. The book and the movie both tell the story of a troubled Vietnam War vet, but the book ends with his death after a violent showdown with Chief Teasle. In the movie, Rambo and Teasle survive, and Rambo turns himself in to the authorities.

The reason for the change: Early test audiences didn’t approve of the original ending and wanted to see Rambo live to fight another day.

    The 1956 black-and-white classic, based on Jack Finney’s The Body Snatchers, ends with protagonist Miles ranting and raving (“You’re next!”) along a busy highway of pod people and non-believers. But in the book, the titular body snatchers flee Earth after Miles discovers where their pods are grown and begins to set them on fire.

    Though director Don Siegel and screenwriter Daniel Mainwaring were happy with their unsettling ending, the movie studio demanded a more hopeful outcome. To keep their bosses happy, the filmmakers added in a brief epilogue, during which the audience learns that local police had alerted national authorities to the presence of the space invaders. “The film was nearly ruined by those in charge at Allied Artists, who added a preface and an ending that I don’t like,” Siegel said.

    A version of this story ran in 2015; it has been updated for 2023.


    Rudie Obias


  • 15 Festive Facts About Christmas Lights



The Christmas tree may have German roots, but the twinkling lights adorning it each December are distinctly American. From Thomas Edison’s ingenious marketing strategy to Carson Williams’s viral “Wizards in Winter” display, here are some facts about fairy lights to keep you warm throughout the season.

    Determined to make good on his promise to electrify downtown Manhattan, Thomas Edison sought to draw attention to his incandescent light bulb during the 1880 Christmas season. The Wizard of Menlo Park, who was known for his PR savvy, laid eight miles of underground wire to power strings of lights around the outside of his New Jersey laboratory. Train commuters traveling between New York and Philadelphia were so amazed by the glowing fields that one reporter labeled Edison “the Enchanter” and described the spectacle as “a fairy-land of lights.”

    Just two years later in 1882, Edison set up a central power plant on Pearl Street in Manhattan. That holiday season, his friend and colleague Edward H. Johnson decorated the first Christmas tree with 80 blinking red, white, and blue electric lights at his home on Fifth Avenue. To add to the historic presentation, Johnson’s tree sat atop a revolving box that spun every 10 seconds. Documenting the amazing display, a reporter from the Detroit Post and Tribune wrote, “I need not tell you that the scintillating evergreen was a pretty sight—one can hardly imagine anything prettier … The tree was kept revolving by a little hidden crank below the floor which was turned by electricity. It was a superb exhibition.”

    Benjamin Harrison was the first to have a Christmas tree in the White House in 1889, but it wasn’t until 1895—four years after the White House was wired with electricity—that Grover Cleveland requested the first family’s tree be adorned with hundreds of multi-colored bulbs. The 22nd and 24th president is credited with warming the public to the idea of electric Christmas lights. At the time, many people mistrusted electricity and thought that dangerous vapors would seep into their homes through the lights and wires.

    By 1900, it could cost as much as $300 (around $10,000 today!) to pay for the lights, a generator, and a wireman’s services to illuminate a Christmas tree with electric lights. A breakthrough came in 1903 when General Electric (created after another company merged with Edison’s factory) offered the first pre-wired, eight-socket strings of lights. When GE attempted to patent their Christmas “festoons,” their patent application was refused because the product was based on knowledge that ordinary electricians possessed. Once the market was open, other companies and inventors began to produce their own tree light sets, and the American Christmas light industry was born.

    Inspired by a tragic fire that was started by candles decorating a Christmas tree, then-teenager Albert Sadacca suggested adapting the novelty lighting his parents sold for Christmas trees. While only a hundred strings sold in the first year, once Sadacca thought to paint the bulbs red, green, and other colors, the business took off. Sadacca started the National Outfit Manufacturers Association, a trade group of several small companies that consolidated into the NOMA Electric Company in 1926. NOMA became the largest Christmas light company in the world until the mid-1960s.

    Many of the early figural light bulbs were blown from the molds that were also used to make glass ornaments, and were then painted by toy makers. The paint would often flake from the constant expansion and contraction of the glass (due to the heat generated as a byproduct of making light), so milk glass was typically used to make the flaking paint less noticeable.

President Calvin Coolidge was responsible for the first National Christmas tree in 1923. The 48-foot-tall balsam fir came from Coolidge’s home state of Vermont and was adorned with 2500 red, white, and green electric bulbs. But for most of the country, outdoor lights wouldn’t be widely available until 1927. To increase sales, General Electric and various distribution companies sponsored neighborhood “decorating with color-light” contests, a tradition that continues today.

    Move over, Clark Griswold. The Gay family strung 601,736 lights around their LaGrangeville, New York, home in 2014 to reclaim the Guinness World Record for the most lights on a residential property. Set to more than 200 songs, the installation took the crown with help from Ritz Crackers, which contributed a 200,000-light display.

Universal Studios Japan, in Osaka, smashed its own Guinness World Record for the most lights on an artificial Christmas tree in November 2022. Its eye-popping display contained 612,000 lights.

Mason, Ohio, resident Carson Williams spent nearly two months programming 25,000 lights to the Trans-Siberian Orchestra’s song “Wizards in Winter,” which he transmitted through an FM channel. The Christmas light display became one of YouTube’s early viral videos in 2005. “It just shocked everybody that it took on a life of its own,” TSO creator Paul O’Neill told The Providence Journal. “When you go to Disney World or MGM, all the lights are all going off to Trans-Siberian Orchestra music.”

    Those red-tipped mini lights that enable a string to twinkle work through a simple design. When electricity heats a strip of metal in the bulb, it bends and breaks the circuit. As the metal cools, it bends back and reconnects the circuit to create an intermittent flashing effect. But more modern light displays use an integrated circuit.

Residents of Aurora, Illinois, face a fine of $50 or more if they do not take their lights down by February 25.

    Several homeowners associations across the country ban outdoor light displays, with one in Idaho making news in 2015 for threatening to sue a resident for an overly elaborate display.

    According to the Consumer Product Safety Commission (CPSC), thousands of people are treated in emergency rooms around the country for injuries connected to holiday lights, Christmas trees, ornaments, and other decorations during the holiday season. The most common injuries are falls, lacerations, and back strains.

    While electric lights greatly reduced the risk of fire, the CPSC estimates that fire departments respond to several hundred fires each year in which the Christmas tree was found to be the first item ignited. To prevent fires, make sure to water trees frequently, unplug lights when leaving home, and discard holiday light sets with damage such as exposed wires and broken sockets.

    A version of this story ran in 2015; it has been updated for 2022.


    Liz Loerke


  • 50 Words That Sound Dirty But Actually Aren’t



    To paraphrase Krusty the Clown, comedy isn’t dirty words—it’s words that sound dirty, like mukluk. He’s right, of course. Some words really do sound like they mean something quite different from their otherwise entirely innocent definition (a mukluk is an Inuit sealskin boot, in case you were wondering), and no matter how clean-minded you might be, it’s hard not to raise an eyebrow or a wry smile whenever someone says something like cockchafer or sexangle. Here are 50 words that might sound rude, but really aren’t. Honest.

    If you read that as “a-hole,” then think again. Aholehole is pronounced “ah-holy-holy,” and is the name of a species of Hawaiian flagtail fish native to the central Pacific.

    Aktashite is a rare mineral used commercially as an ore of arsenic, copper, and mercury. It takes its name from the village of Aktash in eastern Russia, where it was first discovered in 1968. The final –ite, incidentally, is the same mineralogical suffix as in words like graphite and kryptonite.

[Image] Oregon flying squirrel / Heritage Images/GettyImages

    While exploring the coast of Virginia in 1606, Captain John Smith (of Pocahontas fame) wrote in his journal of a creature known to local tribes as the assapanick. By “spreading their legs, and so stretching the largeness of their skins,” he wrote, “they have been seen to fly 30 or 40 yards.” Assapanick is another name for the flying squirrel.

Assart is an old medieval English legal term for an area of forested land that has been converted into arable land for growing crops. It can also be used as a verb meaning “to deforest” or “to prepare wooded land for farming.”

    Derived from bastón, the Spanish word for a cane or walking stick, bastinado is an old 16th-century word for a thrashing or caning, especially on the soles of the feet.

    In addition to being the name of a former shipping port in northern Tasmania, boobyalla is also a name for the wattlebird, one of a family of honeyeaters native to much of Australia, that according to the Oxford English Dictionary is “a borrowing from Tasmanian Aboriginal.”

    In his Dictionary of the English Language (1755), Samuel Johnson described a bum-bailiff as “a bailiff of the meanest kind,” and in particular, “one that is employed in arrests.”

    One possible meaning of bumfiddle is “to pollute or spoil something,” in particular by scribbling or drawing on a document to make it invalid. A bumfiddler is someone who does precisely that. (But there are dirty meanings of the word, too.)

    Like the aholehole, the bummalo is another tropical fish, in this case a southeast Asian lizardfish. When listed on Indian menus, it goes by “Bombay duck.”


    This guy’s a clatterfart. / Hans Neleman/The Image Bank/Getty Images

    According to a Tudor dictionary published in 1552, a clatterfart is someone who “wyl disclose anye light secreate”—in other words, a gossip or blabbermouth.

    Cockapert is an Elizabethan name for “a saucy fellow,” according to the OED, but it can also be used as an adjective meaning “impudent” or “smart-alecky.”

The word cock-bell can refer to a small handbell or a type of wildflower that blooms in the spring; it’s also an old English dialect word for an icicle. In any case, it’s derived from coque, the French word for a seashell.

    The cockchafer is a large beetle native to Europe and western Asia. The origin of its name is a mystery, but one theory claims the beetles are so characteristically aggressive that they can be made to fight one another like cockerels.


    One of the smallest members of the antelope family, the dik-dik. / Anadolu Agency/GettyImages

Standing little more than a foot tall at the shoulder, the dik-dik is one of the smallest antelopes in all of Africa. Its name is apparently an imitation of its alarm call.

    A dreamhole is an opening made in the wall of a building to let in sunlight or fresh air. It was also once used to refer to holes in watchtowers used by lookouts and guards, or to openings left in the walls of church towers to amplify the sounds of the bells.

    According to one 19th-century glossary of industrial slang, a fanny-blower or fanner was “used in the scissor-grinding industry,” and comprised “a wheel with vanes, fixed onto a rotating shaft, enclosed in a case or chamber to create a blast of air.” In other words, it’s a fan.

    Fartlek is a form of athletic training in which intervals of intensive and much less strenuous exercise are alternated in one long continuous workout. It literally means “speed-play” in Swedish.


    There are several fuksheets in this painting. / Print Collector/GettyImages

    Fuk was an old Middle English word for a sail, specifically the foremost sail on a ship. The word fukmast referred to a ship’s foremast, while fuksheet or fuksail were used for the sail attached to the ship’s fukmast.

    To grope a gull is an old Tudor English expression meaning “to take advantage of someone” or “to swindle an unsuspecting victim”—and a gullgroper does just that.

    Taking its name from an Arabic word meaning “blustering” or “blowing,” haboob refers to a dry wind that blows across deserts, dustbowls, and other arid regions often at great speed, forming vast sandstorms as it goes. Haboobs are typically caused by the collapse of a cold front of air, which blasts dust and sediment up from the desert floor as it falls.


    A humpenscrump/hurdy-gurdy player. / Heritage Images/GettyImages

    The OED defines humpenscrump as “a musical instrument of rude construction.” It was originally another name for the hurdy-gurdy, as were humstrum, celestinette, and wind-broach.

    Invagination is simply the process of putting something inside something else (in particular, a sword into a scabbard). It’s also the proper name for turning something inside out. The opposite is called “evagination.”

    Jaculation is the act of throwing or jostling something around; jaculate means “to rush or jolt forward suddenly.”

    A jerkinhead is a roof that is only partly gabled (i.e., only forms part of a triangle beneath its eaves) and is instead levelled or squared off at the top, forming a flattened area known as a hip. Jerkinheads are also known as “half-hipped” or “clipped-gable” roofs.

    In addition to being an old nickname for a walking stick or truncheon, knobstick is an old 19th-century slang word for a workman who breaks a strike, or for a person hired to take the place of a striking employee. (These days, we call them “scabs.”)

    A kumbang is another hot, arid wind, in this case one that blows seasonally in the lowlands of western Indonesia.

    Lobcock is an old Tudor English word for an unsophisticated, clownish bumpkin. Lobcocked is an equally ancient adjective meaning “boorish” or “naïve.”

    A nestle-cock is the last bird to hatch from a clutch of eggs. It dates from the early 1600s, when it was also used as a nickname for an overly spoilt or pampered child.


    The European green woodpecker, a.k.a. a nicker-pecker. / Marcos del Mazo/GettyImages

    Nicker-pecker is an old English dialect name for the European green woodpecker, the largest woodpecker native to Great Britain. In this context, nicker is probably a derivative of nick, meaning “a small cut or scratch.”

    In early 19th-century English, boxers were nicknamed nobbers, a name apparently derived from the earlier use of nobber as a slang term for a punch or blow to the head.

    Nodgecock is another Tudor word for a foolish person. It likely derives from an even earlier word, noddypoll, for someone who nods their head in agreement with any idea, no matter how good or bad it might be.


    You’d use a lot of these tickets in a pakapoo. / kevinjeon00/E+/Getty Images

    Pakapoo is a 19th-century Australian word for a lottery or raffle. It apparently derives from a Cantonese phrase, baahk gáap piu, literally meaning “white pigeon ticket”—the Oxford English Dictionary suggests that in the original form of the game, a white dove might have been trained to select the winning ticket from all of the entries.

    The word peniaphobia definitely doesn’t mean what it sounds like—it’s actually the fear of poverty.

    Penistone (pronounced “PEN-is-tun,” before you ask) is the name of a picturesque market town in Yorkshire, England, which has given its name to both a type of coarse woolen fabric and a type of locally produced sandstone.

The Scots word pershittie means “prim” or “overly meticulous.” It’s one of a family of late 18th- to early 19th-century Scots words all of similar meaning, including perjinkity, perskeety, and, most familiar of all, pernickety.

    Pissalat is a condiment popular in southern French cookery made from puréed anchovies and olive oil, mixed with garlic, pepper, and herbs. It’s used to make a type of open bread tart called a pissaladière, which is flavored with onions and black olives.

    Pissasphalt is a thick semi-liquid form of bitumen, similar to tar. The first part of the name is the Greek word for pitch, pissa.

    Poonga oil is obtained from the seeds of the Indian beech tree, Pongamia pinnata, and is widely used across southern India as everything from a skin treatment to a replacement for diesel in engines and generators.


    A whole lot of sack-butts. / David Silverman/GettyImages

    When it’s spelled with one t, the word sackbut refers to an early Renaissance brass instrument similar to a trombone. When sack-butt has two ts, however, it’s a word for a wine barrel.

    The adjective sexagesimal means “relating to the number 60,” and anything that proceeds sexagesimally does so in sets of 60 at a time. So the word sexagesm means “one-sixtieth of something.”

    Both sexangle and the equally indelicate sexagon are simply 17th-century names for what is otherwise known as a hexagon, a plane geometric shape with six sides. The prefix sexa– is derived from the Latin word for “six” rather than its Greek equivalent, heks.

    Foil is an old-fashioned name for a leaf or petal dating back to the Middle English period; it’s retained in the names of plants like the bird’s-foot trefoil, a type of clover, and the creeping cinquefoil, a low-growing weed of the rose family. The word sexfoil refers to a six-leaved plant or flower, or a similarly shaped architectural design or ornament incorporating six leaves or lobes.

    The shittah is a type of acacia tree native to Arabia and north-east Africa that is mentioned in the Old Testament Book of Isaiah as one of the trees that God “will plant in the wilderness” of Israel, alongside the cedar, pine, and myrtle. Its name was adopted into English from Hebrew in the early Middle Ages, but it can probably be traced all the way back to an Ancient Egyptian word for a thorn-tree.

    Billcock, brook-ouzel, oar-cock, velvet runner, grey-skit, and skiddy-cock are all old English dialect names for the water rail, a small and notoriously elusive wading bird found in the wetlands of Europe, Asia, and north Africa. The name skiddy-cock is thought to be derived from skit, a 17th-century word meaning “to act shyly,” or “to move rapidly and quickly”—but it could just as probably be derived from an even older 15th-century word, skitter, meaning “to produce watery excrement.”

In 19th-century English, a slagger was a workman in a blast furnace whose job it was to siphon off the stony waste material, or slag, that is produced when raw metals and ores are melted at high temperatures. Even earlier than that, in 16th-century English, slagger was a verb, variously used to mean “to loiter or creep,” or “to stumble or walk awkwardly.”


    Taking a piece of glass out of the teasehole. / Carlos Alvarez/GettyImages

    A teasehole is the opening in a glassmaker’s furnace through which the fuel is added.

Sheep farmers in some rural parts of Britain once had their own traditional counting systems, many of which are particularly ancient and predate even the Norman and Anglo-Saxon invasions of England. Most of these counting systems vanished during the Industrial Revolution, but several remain in use locally and have become fossilized in local rhymes, sayings, and folk songs. Tether was an old Lake District name for the number 3, while dick was the number 10; tetheradick, ultimately, was a count of 13.

    Tit-bore—or tit-bore-tat-bore in full—is a 17th-century Scots name for a game of peekaboo. It was once also called hitty-titty, as was, incidentally, hide and go seek.

    The tit-tyrants are a family of eight species of flycatcher native to the Andes Mountains and the westernmost rainforests of South America. One of the species, the ash-breasted tit-tyrant, is one of the world’s most endangered birds, with fewer than 1000 individuals left in a handful of remote, high-altitude sites in Peru and Bolivia.

    Wankapin, or water chinquapin, is another name for the American lotus, Nelumbo lutea, a flowering plant native to Central American wetlands. The lotus was apparently introduced to what is now the southern United States by native tribes who would use the plant’s tubers and seeds (known as “alligator corn”) as a source of food.

    This list first ran in 2015 and has been updated for 2023.

    Are you a logophile? Do you want to learn unusual words and old-timey slang to make conversation more interesting, or discover fascinating tidbits about the origins of everyday phrases? Then get our new book, The Curious Compendium of Wonderful Words: A Miscellany of Obscure Terms, Bizarre Phrases, & Surprising Etymologies, out now! You can pick up your copy on Amazon, Barnes & Noble, Books-A-Million, or Bookshop.org.


    Paul Anthony Jones


  • Why Do We Only Say “Merry” for Christmas?



    For well wishes on all occasions, from general holidays like Halloween and Valentine’s Day to personal milestones like anniversaries and birthdays, English speakers are happy to let happy do the heavy lifting. But for some reason, we’ve decided that Christmas deserves its own bespoke greeting.

    So, as Thanksgiving fades to black, the word merry shakes off the dust of its nearly year-long hibernation and emerges—along with eggnog, ugly sweaters, and jolly old St. Nick himself—into the glorious red and green glow of seasonal relevance.

    Which leaves the curious with one question: How exactly did merry become the go-to modifier for Christmas—and only Christmas?

    It all began when merry arrived in Old English by way of Germanic. It essentially meant “pleasing,” but that definition expanded over the centuries to cover “festive,” “joyous,” and other celebration-related senses. The earliest known reference to merry Christmas dates back to 1534—in a letter from John Fisher, bishop of Rochester, to Henry VIII’s chief minister Thomas Cromwell. “And thus our Lord send yow a mery Christenmas, and a comfortable, to yowr heart desyer,” Fisher wrote.

    Happy got a slightly later start, showing up in English around the 14th century from hap, meaning “good fortune.” Happy, too, enjoyed a broadening of its definition into the territories of pleasure and celebration, and it wasn’t long before people were wishing each other happy holidays. According to the Oxford English Dictionary, Happy New Year came first in the mid-16th century, and Happy Christmas was in play by the late 17th.


    Merry young carolers in ‘Aunt Louisa’s London Toy Books: The Robin’s Christmas Eve’ published in 1867. / whitemay/DigitalVision Vectors/Getty Images

    For a while after that, merry and happy were both regularly paired with Christmas. It wasn’t until the Victorian era that merry pulled ahead in the rankings, thanks to some seminal Yuletide content. Charles Dickens peppered 1843’s A Christmas Carol with roughly 20 Merry Christmases, for example, and not a single happy Christmas. The first commercial Christmas card, which debuted that same year, featured Merry Christmas as well.

    The phrase also cropped up in carols, including early versions of “We Wish You a Merry Christmas” favored by 19th-century British kids. As one stanza went, “I wish you a merry Christmas / And a happy new year / A pocket full of money / And a cellar full of beer.”

    Though not all Victorian Christmas traditions have prevailed, our modern conception of the holiday is still very much a reflection of that era—as evidenced by the fact that we’re still reading (or watching adaptations of) A Christmas Carol, sending Christmas cards, and listening to “We Wish You a Merry Christmas.” Moreover, we’ve shored up the staying power of Merry Christmas by adding our own memorable references to the heap, from Judy Garland’s warbling “Have Yourself a Merry Little Christmas” to Home Alone 2: Lost in New York’s iconic catchphrase, “Merry Christmas, ya filthy animals!”

    Using merry for other occasions wasn’t always unheard of; merry Thanksgiving and merry birthday continued making appearances into the 20th century. But the ever-swelling volume of Christmas culture containing merry has anchored it to the holiday in a manner that hasn’t happened with any other fête.

All things considered, it’s quite an achievement that the UK has managed to avoid merry’s monopoly and keep happy Christmas on the market. Semantics just might explain why.

    Despite their definitional overlap, merry and happy aren’t mirror images of each other. Since the 14th century, per the OED, people have used merry to mean “boisterous or cheerful due to alcohol.” Merry Christmas, therefore, might be construed as a winking way to say, “I hope your cup runneth over … with champagne at all the best Christmas parties, that is!”

    You could argue that it’s vaguely sacrilegious, or at least in poor taste, to focus on booze-heavy revelry during a holiday that’s about as holy in origin as they come. And you certainly wouldn’t be the first.

    “We make Christmas excessively merry, only by being excessively wicked; and we celebrate the festivity of our Savior, as if we were ministering the mad orgies of Bacchus,” one observer wrote in a 1772 issue of The London Magazine: Or, Gentleman’s Monthly Intelligencer. “But profligacy is the characteristic of this wretched age.”

    And the next age, too: A North London reverend named Gordon Calthrop pointed out the debauchery often involved in a merry Christmas during an 1864 address that advocated for happy Christmases rather than simply merry ones. But his thesis was less about condemning merrymakers and more about questioning whether merriment equaled happiness. In Calthrop’s estimation, it did not.


    A 19th-century illustration of Christmas punch drinkers by Randolph Caldecott. / ilbusca/DigitalVision Vectors/Getty Images

    “The boisterous gaiety which many put on, is oftentimes only a mask. It covers a sad—sad face,” he said. “And if a man tries to reassure me, or to persuade himself, by extravagant demonstrations of delight, that he is exceedingly happy, I always feel disposed to take the liberty to doubt the statement. True happiness is not a noisy and boisterous, but a quiet thing.”

    You can write it off as a personal hot take that true happiness is never expressed noisily. But Calthrop’s opinion does jibe with the connotations of the words merry and happy. The former is typically characterized by some energetic and short-lived expression of cheer: laughing, singing, dancing, clinking beer steins, etc. Happy, meanwhile, often implies a deeper-seated and less fleeting kind of contentment—not to mention its original sense regarding good fortune.

    This distinction could shed light on why people started wishing each other a merry Christmas and a happy New Year: as if to say, “I hope you have a really fun Christmas, and then after that I hope the new year brings you lasting pleasure and prosperity.”

    Calthrop wasn’t the only 19th-century Christian who found something lacking in a really fun Christmas. Plenty of others contended that the notion of a merry Christmas was juvenile, irreligious, or just not a very accurate representation of how it feels to actually celebrate the holiday.

“Merry Christmas is quite the term for the young, but it a little jars upon the ears as life goes on, and we know more of its troubles and sorrows. For myself, I confess that I much prefer the ‘Happy Christmas.’ It speaks to all of the birthday of our King,” one person wrote in an 1878 issue of a Gloucestershire parish magazine.

    These sentiments were evidently pervasive enough in the UK that by the early 20th century, the phrase Merry Christmas had gained a bad rap as an Americanism. “I send you of course the greetings of the season: Merry Christmas (a foolish American wish!) and a Happy New Year,” someone wrote to the editors of The Catholic Fortnightly Review in 1909.

    Great Britain’s Happy Christmas crusaders, like baby Jesus before them, were soon blessed with a gift from a king. During the monarchy’s first-ever Christmas Day message in 1932—written by Rudyard Kipling and broadcast over the radio to the entire empire—George V wished everyone a happy Christmas. George VI took up the happy mantle during his reign, as did Elizabeth II after him. Their Christmas Day broadcasts made it abundantly clear that Happy Christmas was high society’s holiday greeting of choice. (That said, some members of the royal family do sometimes use Merry Christmas these days.)

    All feelings about the merits of a merry Christmas versus a happy one aside, we can all agree that Crimbo has at least earned a hat tip for heading off merry’s descent into obsolescence. (Not to diminish the good work of the humble merry-go-round.)

Have you got a Big Question you’d like us to answer? If so, let us know by emailing us at bigquestions@mentalfloss.com.


    Ellen Gutoskey


  • 7 Sentences That Sound Bizarre But Are Still Grammatical



Let’s not look at grammar as a cold, harsh mistress. She can also be a fun, kooky aunt. Here are some tricks you can do to make strange-sounding sentences that are still grammatical.

    Take advantage of the fact that the same sentence can have two different structures. This famous joke from Groucho Marx assumes that most people expect the structure of the first part to be “One morning [I shot an elephant] [in my pajamas].” But another possible, and perfectly grammatical, reading is “One morning [I shot] [an elephant in my pajamas].”

Make a garden path sentence, like “The horse raced past the barn fell.” In this one, we think we’ve reached the main verb when we get to raced, but instead we are still inside a reduced relative clause. Reduced relative clauses let us say, “the speech given this morning” instead of “the speech that was given this morning” or, in this case, “the horse raced past the barn” instead of “the horse that was raced past the barn.”

This garden path sentence, “The complex houses married and single soldiers and their families,” depends on the fact that complex, houses, and married can serve as different parts of speech. Here, complex is a noun (a housing complex) instead of an adjective, houses is a verb instead of a noun, and married is an adjective instead of the past tense of a verb.

This sentence, “The rat the cat the dog chased killed ate the malt,” contains multiple center embeddings. We usually have no problem putting one clause inside another in English. We can take the phrase the rat ate the malt and stick in more information to make the rat the cat killed ate the malt. But the more clauses we add in, the harder it gets to understand the sentence. In this case, the rat ate the malt. After that it was killed by a cat. That cat had been chased by a dog. The grammar of the sentence is fine. The style, not so good.

    This is another wild center-embedded sentence. Can you figure it out? Start with “anyone who feels X is likely to agree.” Then go to ”anyone who feels if X then Y is likely to agree.” Then fill out the X and Y. You might need a pencil and paper.

    Buffalo: It’s a noun! It’s a city! It’s a verb (meaning “to intimidate”)! We’ve discussed the notorious buffalo sentence before, but it never stops being fun. It plays on reduced relative clauses, different part-of-speech readings of the same word, and center embedding, all in the same sentence. Stare at it until you get the following meaning: “Bison from Buffalo, New York, who are intimidated by other bison in their community, also happen to intimidate other bison in their community.”

    This sentence takes advantage of the versatile English –ing. The author of a 19th-century grammar guide lamented the fact that one could “run to great excess” in the use of –ing participles “without violating any rule of our common grammars,” and constructed this sentence to prove it.

    It doesn’t seem so complicated once you realize it means, “This very superficial grammatist, supposing empty criticism about the adoption of proper phraseology to be a show of extraordinary erudition, was displaying, in spite of ridicule, a very boastful turgid argument concerning the correction of false syntax, and about the detection of false logic in debate.”

    Not only is this a great example of the wonderful and wild things you can do within the bounds of proper English, it’s the perfect response to pull out the next time someone tries to criticize your grammar.

    A version of this story ran in 2015; it has been updated for 2023.



    Arika Okrent


  • The Origins of 10 Nicknames



    The origins of some nicknames are obvious. It’s easy to see why Ed is short for Edward, Nick is short for Nicholas, and Ally is short for Allison. Other diminutives require more explanation. If you’re curious how Margaret turned into Peggy, or how Richard led to Dick, check out the histories of 10 nicknames that push the limits of the term.


    The actor Richard “Dick” Van Dyke. / Paul Morigi/GettyImages

The name Richard is very old and was popular during the Middle Ages. In the 12th and 13th centuries, everything was written by hand, and nicknames for Richard like Rich and Rick were common simply to save time. Rhyming nicknames were also common, and eventually Rick gave way to Dick and Hick, while Rich became Hitch. Dick, of course, is the only rhyming nickname that stuck over time. And boy did it stick. At one point in England, the name Dick was so popular that the phrase “every Tom, Dick, or Harry” was used to describe the everyman.


    William Henry Gates III, who goes by the nickname Bill. / WPA Pool/GettyImages

There are many theories on why Bill became a nickname for William; the most obvious is that it was part of the Middle Ages trend of letter swapping. Much how Dick is a rhyming nickname for Rick, the same is true of Bill and Will. Because hard consonants are easier to pronounce than soft ones, some believe Will morphed into Bill for phonetic reasons. Interestingly, when William III ruled over England in the late 17th century, his subjects mockingly referred to him as “King Billy.”


    Baseball player Henry “Hank” Aaron. / John Vawter Collection/GettyImages

The name Henry dates back to medieval England. (Curiously, at that time, Hank was a diminutive for John.) So how do we get Hank from Henry? Well, one theory says that Hendrik is the Dutch form of the English name Henry. Henk is the diminutive form of Hendrik; ergo, Hank from Henk. Hanks were hugely popular here in the States for many decades, though by the early ’90s the name no longer appeared in the top 1000 names for baby boys. But Hank is making a comeback! In 2010, it cracked the top 1000, settling at 806. By 2013 it was up to 632.


    John “Jack” Nicholson. / Roy Jones/GettyImages

The name Jack dates back to about 1200 and was originally used as a generic name for peasants. Over time, Jack worked his way into words such as lumberjack and steeplejack. Even jackass, the commonly used term for a donkey, retains its generic essence in the word Jack. Of course, John was once used as a generic name for English commoners and peasants (John Doe), which could be why Jack became his nickname. But the more likely explanation is that Normans added -kin when they wanted to make a diminutive. And Jen was their way of saying John. So little John became Jenkin and time turned that into Jakin, which ultimately became Jack.


    Charles “Chuck” Berry. / Michael Ochs Archives/GettyImages

“Dear Chuck” was an English term of endearment, and Shakespeare, in Macbeth, used the phrase to refer to Lady Macbeth. What’s this have to do with Charles? Not much, but it’s interesting. However, Charles in Middle English was Chukken, and that’s probably where the nickname was born.

    The name Margaret has a variety of different nicknames. Some are obvious, as in Meg, Mog, and Maggie, while others are downright strange, like Daisy. But it’s the Mog/Meg we want to concentrate on here as those nicknames later morphed into the rhymed forms Pog(gy) and Peg(gy).


    Senator Edward “Ted” Kennedy. / Wally McNamee/GettyImages

    The name Ted is yet another result of the Old English tradition of letter swapping. Since there were a limited number of first names in the Middle Ages, letter swapping allowed people to differentiate between people with the same name. It was common to replace the first letter of a name that began with a vowel, as in Edward, with an easier to pronounce consonant, such as T. Of course, Ted was already a popular nickname for Theodore, which makes it one of the only nicknames derived from two different first names.


    Prince Harry, whose full name is Henry Charles Albert David. / Samir Hussein/GettyImages

Since medieval times, Harry has been a consistently popular nickname for boys named Henry in England. Henry was also very popular among British monarchs, most of whom preferred to be called Harry by their subjects. This is a tradition that continues today as Henry Charles Albert David, as he was christened, goes by Prince Harry. Of course, Harry is now used as a given name for boys. In 2006, it was the 595th most popular name for boys in the United States. One reason for its upsurge in popularity was the huge success of the Harry Potter books.

There are no definitive theories on how Jim became the commonly used nickname for James, but the name dates back to at least the 1820s. For decades, Jim carried an ugly association thanks to “Jim Crow”: the name derived from a minstrel character used to perpetuate racist stereotypes in 19th-century America, became associated with African Americans, and by the early 20th century was attached to the laws promoting segregation in the South. Jim has since shed its racial past, and is once again a popular first name for boys all by itself, sans James.

Sally was primarily used as a nickname for Sarah in England and France. Like some English nicknames, Sally was derived by replacing the R in Sarah with an L. The same is true for Molly, a common nickname for Mary. Though Sally from the Peanuts never ages, the name itself does and has declined in popularity in recent years. Today, most girls prefer the original Hebrew name Sarah.

    A version of this story ran in 2015; it has been updated for 2023.


    David K. Israel


  • 34 Misleading Misnomers Explained



A light-year is the distance traveled by light in a single year—5,878,499,810,000 miles, or 9,460,528,400,000 kilometers. So, despite how it sounds, when we talk about things being “light-years away,” we’re not talking about an enormously vast amount of time but rather an enormously vast distance. The stories behind more misleading misnomers are explained here.
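The arithmetic behind that figure is just speed multiplied by time. Here’s a minimal sketch (not from the article; the constant names are my own) that assumes a Julian year of 365.25 days, which is why the result differs slightly from the article’s figure, computed from a different year length:

```python
# Illustrative sketch: a light-year is the speed of light times the
# number of seconds in a year. Constant names are hypothetical.
SPEED_OF_LIGHT_KM_S = 299_792.458              # exact, by SI definition
SECONDS_PER_JULIAN_YEAR = 365.25 * 24 * 3600   # 31,557,600 seconds
KM_PER_MILE = 1.609344                         # exact, by definition

light_year_km = SPEED_OF_LIGHT_KM_S * SECONDS_PER_JULIAN_YEAR
light_year_miles = light_year_km / KM_PER_MILE

print(f"{light_year_km:,.0f} km")       # roughly 9.46 trillion km
print(f"{light_year_miles:,.0f} miles") # roughly 5.88 trillion miles
```

Either way, the number describes distance, not duration.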

    This game isn’t a form of checkers, nor is it from China. It was invented in Germany in 1892; the name was changed to make the game more marketable in the late 1920s.

    Arabic numbers (1, 2, 3, 4, 5 …) originated in India, not the Arabian Peninsula. They’re named for the Arab mathematicians who introduced them to Europe in the Middle Ages.

    And while we’re on the subject of math, the Fibonacci sequence, in which each number is the sum of the previous two (1, 1, 2, 3, 5, 8, 13, 21…), was first discussed by Indian scholars several centuries before Fibonacci.
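    The rule is easy to state in code—each term is the sum of the two before it. A quick illustrative Python sketch (the function name is mine, not anything from the sources above):

    ```python
    def fibonacci(n):
        """Return the first n terms of the Fibonacci sequence (1, 1, 2, 3, ...)."""
        terms = []
        a, b = 1, 1
        for _ in range(n):
            terms.append(a)   # record the current term
            a, b = b, a + b   # each new term is the sum of the previous two
        return terms

    print(fibonacci(8))  # [1, 1, 2, 3, 5, 8, 13, 21]
    ```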

    The Babylonians had an understanding of the Pythagorean Theorem more than 1000 years before Pythagoras.

    A Koala joey at Taronga Zoo. / Lisa Maree Williams/GettyImages

    Koala bears are marsupials, not bears, and king crabs aren’t crabs. They’re one of the many animals that are referred to as “false crabs,” along with the closely related hermit crabs.

    Glow-worms and fireflies aren’t worms or flies, but insect larvae and beetles, respectively.

    The horned toad and the slow worm are both species of lizard.

    A starfish near Turkey. / Anadolu Agency/GettyImages

    Starfish and jellyfish aren’t fish—they are echinoderms and cnidarians, respectively.

    Despite looking like fashionable ants, velvet ants are actually wasps.

    Strawberries aren’t berries. And neither are blueberries, raspberries, or blackberries. By definition, berries have to be produced from a single ovary, like a redcurrant or a grape … or a banana. Confused? Well…

    Peanuts aren’t nuts, but they are related to peas. Coconuts and walnuts aren’t nuts either, but rather “drupes”—like dates, coffee beans, and olives—which are fleshy fruits surrounding a hard shell containing a seed. Hazelnuts and chestnuts, however, are true nuts, as are acorns.

    Theodore Roosevelt in a construction vehicle at the Panama Canal. / George Rinhart/GettyImages

    Panama hats come from Ecuador. The term Panama hat was in use as early as the 1830s, but the term—and the hats themselves—grew in popularity after then-President Theodore Roosevelt visited the Panama Canal construction site wearing one in 1906.

    English horns come from Poland. And they aren’t horns, but woodwind instruments related to the oboe.

    Jerusalem artichokes come from North America. The “Jerusalem” part of the name might be a corruption of the Italian word for “sunflower,” girasole—because the Jerusalem artichoke isn’t an artichoke at all, but a member of the sunflower family.

    Bombay duck is actually a fish. According to the BBC, there are a few theories as to how the fish got its moniker, the most popular of which is that “the name came from the British mail trains that huffed odoriferous orders of dried fish from the city to the interior of India. These wagonloads became known as ‘Bombay Dak.’ (The word dak means ‘mail.’)”

    ‘Pont Neuf: Plate One From The Paris Set,’ 1904. / Print Collector/GettyImages

    Paris’s Pont Neuf is the oldest bridge in the city, but its name still means “new bridge.” It was completed in 1607.

    The Isle of Dogs in central London isn’t an island, but rather a peninsula-like loop of land surrounded on three sides by the river Thames.

    Catgut is typically made from sheep gut, and has never come from cats. The “cat” part of the word is mysterious, but it might come from a corruption of kit, an old dialect word for a fiddle.

    When you hit your funny bone, you’re actually hitting your ulnar nerve. According to the Oxford English Dictionary, the name funny bone is “probably at least partly punning on the homophones humerus”—the upper bone of the arm—“and humorous.”

    The Battle of Bunker Hill. / Fine Art/GettyImages

    The Battle of Bunker Hill, the famous Revolutionary War clash, mainly took place on nearby Breed’s Hill in Boston.

    There are more than 1500 islands in North America’s Thousand Islands archipelago.

    Napoleon’s Hundred Days—the period between his return from exile on March 20, 1815, and the restoration of the French monarchy on July 8—lasted 111 days.
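    The 111-day figure comes from counting both endpoints inclusively; a quick check with Python’s standard datetime module, using the dates given above:

    ```python
    from datetime import date

    # Napoleon's return from exile and the restoration of the monarchy.
    start = date(1815, 3, 20)
    end = date(1815, 7, 8)

    # Plain subtraction gives the gap between the dates (110 days);
    # counting both endpoints inclusively gives the traditional figure.
    span = (end - start).days + 1
    print(span)  # 111
    ```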

    The Thousand Days’ War (also known as The War of a Thousand Days) lasted 1130 days.

    The Thirty Days War—a.k.a. the Greco-Turkish War—was part of a series of larger skirmishes that lasted 304 days.

    The Hundred Years’ War lasted 116 years. (But the Eighty Years’ War did last eighty years.)

    A version of this story ran in 2015; it has been updated for 2022.

    Paul Anthony Jones

  • How Did ‘Gross’ Become a Term of Disgust?

    The word gross has been in English for hundreds of years. We got it from French, where it means “big, thick, coarse.” It took on a variety of senses in English related to size, including “coarse” (gross grains as opposed to fine), “strikingly obvious” (“grosse as a mountaine,” as Shakespeare put it in Henry IV, Part 1), and “whole” (gross as opposed to net value). It also picked up negative senses like “vulgar,” “crude” (“Grose folke of rude affection Dronkerdes. Lubbers, knaues” from the 16th century), or “ignorant” (“a grosse unlettered people” from 1833). Uncivilized and indecent behavior was called “gross.” Low-quality food was called “gross.” And history is filled with gross abuses, gross misconceptions, gross perfidy, and gross folly. From there it’s not a big jump to the current sense of “disgusting.” There’s always been something repulsive, or at least unsavory, in the word gross.

    Still, “Ew! That is so gross!” has a very modern ring to it. It feels like a very different word from the one they were using 200 years ago. In contrast, a word like disgusting feels essentially the same. So what happened to gross? What separates the gross of today from the gross of the past?

    Gross did not undergo a big change in meaning, but it did undergo a big change in context. In the late 20th century, young people started to use it a lot—like, a lot a lot. So much so that older people noticed it, and didn’t like it. As one critic said in a 1971 issue of The Saturday Review, “Gross has always meant something coarse and vulgar. But as used by the teens, it runs the gamut of awfulness from homework to something the cat contributed to ecology.” In other words, gross became slang.

    At first, sometime in the 1950s, it became an in-group term, one of a number of words (including great, the greatest, the most) that, according to a 1959 article about university slang, were “either complimentary or derogatory depending on how they are said.”

    It’s hard to imagine anyone today using gross in the complimentary way. It’s also strange to learn that, according to a 1973 article, gross out could mean “wild and shocking” (“That was a real gross-out party”) and, as author Hugh Rawson wrote in Wicked Words: A Treasury of Curses, Insults, Put-Downs, and Other Formerly Unprintable Terms from Anglo-Saxon Times to the Present, could also be used as a noun (“Fred is a real gross-out”). The early slang meaning of gross was broader than it is now.

    The development of the verb form to gross out in the ’60s and ’70s (probably on analogy with cop out and freak out) helped contribute a sense of newness to the word and made it seem even more slangy. By the ’80s it was a staple of “valley girl” speak, so often repeated, mocked, emulated, and imitated that it spread far beyond the teen world it came from. Its sense narrowed into a succinct judgment of visceral disgust, capturing the colorful bodily emotion of gag me with a spoon but in a less wordy way. Gross was always there, but young people, needing a more compact package to deliver their disdain, fixed it up and made it grosser.

    Now that you know how gross came to its current meaning, find out why we find words like moist really, really gross.

    A version of this story ran in 2015; it has been updated for 20

    Arika Okrent

  • The Origins of 13 Christmas Words

    Most people have heard that Christmas is literally the “Christ Mass” of the Christian church, and that Santa Claus takes his name from a corruption of “Saint Nicholas,” the 4th-century figure whose fondness for secretly handing out gifts apparently inspired the traditional image of Father Christmas. But what about all of the other festive words that crop up at this time of year? From worthless trinkets to misnamed chickens, here are the histories and etymologies of 13 Christmassy words.

    Bauble derives from beaubelet, an old French word for a child’s toy or plaything, and dates back as far as the 14th century in English (if not earlier) when it originally referred to any showy but ultimately valueless ornament. In the years that followed, however, bauble also came to be used for the baton carried by court jesters (who were nicknamed bauble-bearers in Tudor England) and foolish people; to give the bauble meant to make fun of someone in 17th century English.

    ‘The Carol Singers.’ / Culture Club/GettyImages

    The earliest references to Christmas carols date back to Tudor England, but before then the word carol could be used to refer to any joyous or celebratory song, bands or choruses of singers or musicians, birdsong or the chorus of songbirds at dawn, or to a particular type of circular folk dance or piece of music written to accompany a ring-dance. Whatever its earliest meaning might have been, carol was borrowed into English from French in the early Middle Ages and can probably be traced back to an ancient Latin or Greek word for a flute-player, choraules.

    American Chestnut (Castanea Dentata). / Heritage Images/GettyImages

    Those chestnuts roasting on an open fire are actually Castanea nuts, named for the ancient region of Castana in central Greece from where they might once have been imported into the rest of Europe. Just like Brazil nuts, however, it might actually be the case that the region of Castana took its name from chestnut trees that grew there, not the other way around—in which case the name chestnut might instead derive from some ancient and long-lost name for the chestnut tree itself.

    Eggnog is delicious—and its etymology mysterious. / Yulia Naumenko/Moment/Getty Images

    The nog of eggnog is an old 17th century word for strong beer, and in particular an ale or beer once brewed in Norfolk in the east of England. Before then, however, no one is entirely sure where the name nog originates, although one plausible explanation is that it comes from an even older Scots word, nugg, for beer warmed by having a red-hot poker placed into it. If so, then your Christmas eggnog can probably be traced back to an old Norwegian word, knagg, for a metal peg or spur.

    Frankincense plant. / Culture Club/GettyImages

    You’ll no doubt have heard the word, but you might not know the meaning—frankincense is actually a type of fragrant resin, obtained from the sap of the frankincense tree, which has long been used to make incense; the frank– of frankincense is an old French word essentially meaning “high quality.”

    Myrrh is another much-prized and highly fragrant resin or oil obtained from the sap of the myrrh tree. Its name comes from an Arabic word meaning “bitter.”

    In Old English, a gift was specifically a wedding dowry, but by the early Middle Ages, its meaning had broadened to mean simply something given freely from one person to another. It ultimately derives from some ancient Germanic word root meaning something like “give” or “bestow”—which is also the origin of the not-so-festive German word gift, meaning “poison.”

    Viscum Album L. (Mistletoe). / Heritage Images/GettyImages

    The –toe of mistletoe is an Old English word for “twig,” but the mistle– part is much more puzzling. Originally, the mistletoe plant was just called mistel, which in Old English was also used as a word for birdlime, a sticky substance pasted onto the branches of trees to trap birds. How these two meanings came together in mistletoe is unclear, but one idea is that, because birds would eat mistletoe berries and then poop out the seeds elsewhere (with their poop acting as a fertilizer), mistel might originally have meant bird droppings, in the related sense of a sticky, unpleasant substance. Kissing under the poop-twig suddenly doesn’t seem quite so romantic.

    That’s a lot of poinsettias. / Manuel Medir/GettyImages

    Poinsettias are the large, bright red “flowers” (the red parts are actually leaves) popular at Christmas that are native to Mexico and parts of Central America. They’re named after Joel Roberts Poinsett, a former congressman and diplomat, who is credited with introducing the plant to the United States in the early 1800s.

    Lazy Rudolph. / Vermilya/GettyImages

    He might be the most famous of Santa’s reindeer, but the name Rudolph actually means “famous wolf,” and would once have been an epithet bestowed on the fiercest or most audacious of warriors. The rein– of reindeer, incidentally, is an old Germanic word meaning “horn.”

    Children putting tinsel on a Christmas tree circa 1955. / Orlando/GettyImages

    Though it had several earlier and different uses in the 1300s and 1400s, beginning in the early 1500s, tinsel was the name of an iridescent fabric interwoven with gold- or silver-colored thread that took its name from a French word, étincelle, meaning “sparkle” or “spark.” Tinsel as we know it today dates from the late 1500s, and took its name from the sparkling silvery or golden threads that made tinsel fabric so shiny.

    The first birds ever known as turkeys in English were African guinea fowl, which were so named because they were imported to European dinner tables via Turkey. When the first Europeans came across wild turkeys in North America in the early 1500s, however, they wrongly assumed that they were relatives of the guinea fowl they knew from back home, and so they too came to be known as turkeys.

    English isn’t the only language to have geographically misnamed the turkey: In French it is called dinde (from poule d’inde, or “Indian chicken”); in Portuguese it is the peru (because the birds were mistakenly thought to come from South America); and in Malaysia it is called the ayam belanda or “Dutch chicken” (because the birds were originally introduced by Dutch settlers).

    Yule derives from an Old Norse word, jól, which, according to the Oxford English Dictionary, was once the name of a 12-day pagan festival. This was borrowed into Old English as gol or geol as far back as the 8th century, and was originally used both as another name for December (which was called ǽrra geóla, or “before Yule”) and January (which was æftera geóla, or “after Yule”).

    A version of this story ran in 2015; it has been updated for 2022.

    Paul Anthony Jones
