ReportWire

Tag: Made by History

  • History Tells Us How the Israel-Hamas War Will End

    The Hamas terrorist attack on Oct. 7 killed some 1,200 people in Israel. Since then, a fierce Israeli military campaign has destroyed much of Gaza, with more than 22,000 dead and untold numbers injured, buried under rubble, or wasting away from disease and malnutrition, as Israeli troops continue to blockade much-needed humanitarian assistance.

    Israel has declared an ever-expanding list of war aims that includes the obliteration of Hamas, the return of all hostages, assurances that Gaza will never again threaten Israel, the banning of the Palestinian Authority from any role in governance of the territory, prevention of any “element” that “educates its children for terrorism, supports terrorism, finances terrorism and calls for the destruction of Israel,” and creation of a wide buffer zone within Gaza separating Israel from the Strip’s population.

Yet, no Israeli leader has explained how “Operation Swords of Iron,” as Israel’s military campaign has been dubbed, could possibly achieve these objectives. That’s because it can’t—and they know it. This very situation has existed in almost every Israeli war since 1948 because the country’s problems are political in nature, not something military force alone can solve. This reality means that instead of “Operation Swords of Iron” ending upon completion of Israeli objectives, it will end—as other Israeli wars have—when leading nations (often the U.S.) determine that its military has gone too far.

    This dynamic is rooted in the history of Israel and the revolutionary, if partial, success of Zionism. Inspired by other nationalist movements and a tide of antisemitism in Europe, Zionists in the 19th and 20th centuries sought to construct a Jewish country in the largely Arab-Muslim Middle East, which fiercely opposed its creation and expansion. They understood that a Jewish state, or homeland, would eventually require peace with the Arab and Muslim Middle East. Yet, as Revisionist Zionist leader Vladimir (Ze’ev) Jabotinsky famously admitted in 1923, anyone with their eyes open could recognize “the complete impossibility” of transforming “Palestine from an Arab country to a country with a Jewish majority” by voluntary agreement.

    Read More: How the Yom Kippur War Changed Israel

    Jabotinsky is key to this history because he made clear what Zionists across the political spectrum understood: their movement’s success would require the use of force to ensure survival and convince their neighbors of the Jewish polity’s permanence. But there was a paradox: while force was necessary to survive, it would be insufficient to gain peace and mutual recognition. 

    His solution was the “Iron Wall strategy,” to which mainstream Zionists—and later, Israeli political parties—more or less subscribed. This strategy involved using crushing levels of violence to force Arabs to accept that a Jewish state was here to stay. That would create the conditions for successful negotiations, which did not exist as long as Arabs had a “gleam of hope” of ending the Zionist project altogether. After that hope was extinguished, Jabotinsky stressed, Israelis would be ready to imagine a future for “two peoples in Palestine.” As he wrote, “I am prepared to swear, for us and our descendants, that we will never violate this equality and we will never attempt to expel or oppress the Arabs.”

In the 1930s, after Britain had declared its support for the idea of a Jewish homeland in Palestine and the League of Nations endorsed the declaration, Zionists began implementing this vision of creating negotiating opportunities by inflicting crushing defeats. They hoped to split their opponents into diehard extremists and moderates willing to negotiate. In November 1947, in the wake of the Holocaust, the United Nations passed a resolution recommending the end of British Mandate Palestine and the creation of two independent states: one Arab and one Jewish. A civil war between Jewish and Palestinian forces and a chaotic abandonment of Palestine by British forces soon ensued.

When Israel declared independence in May 1948, its Arab neighbors attacked. Israeli forces were victorious on almost all fronts, displacing roughly 750,000 Palestinians—and the war ended only when Israel’s leaders, fearful of British military intervention, agreed to withdraw their forces from Sinai and cancel operations to conquer the West Bank.

    In 1956, after Egyptian leader Gamal Abdel Nasser nationalized the Suez Canal and struck an arms deal with the Soviet Union, Israel, along with Britain and France, invaded Egypt and occupied the Sinai Peninsula and the Gaza Strip. Israel left the territories it occupied only when President Dwight Eisenhower threatened direct and painful retribution if it failed to do so.  

    The pattern of crushing military defeats of Arab forces that began in 1948, and was repeated in reprisal raids in the 1950s and in the 1956 war, continued in June 1967: Egypt, Syria, and Jordan threatened to attack. Israel responded with a devastating preemptive strike. The U.N. Security Council ordered a ceasefire, but Israel did not stop fighting until the U.S. and the U.N. pressured Prime Minister Levi Eshkol and Defense Minister Moshe Dayan into an abrupt ceasefire.

    In October 1973, after Egypt and Syria waged a surprise assault on Israeli positions in the Sinai and Golan Heights, Israeli forces slowly gained the upper hand and began a drive to isolate and destroy the Egyptian Third Army. It only halted this campaign because the U.S.—in a tense standoff with the Soviet Union—categorically refused to allow it to continue. President Richard Nixon and Secretary of State Henry Kissinger insisted that Israel abide by the terms of Security Council Resolution 338, requiring Israel to go back to positions it had held on the day the resolution was passed. 

Yet, despite deploying Jabotinsky’s strategy of overwhelming force for 75 years now, Israel has never gotten to the part of his theory in which it gains acceptance from its neighbors and the Palestinian people. That’s in part because he failed to foresee that as Israel’s victories over Arab resistance to Zionism accumulated, the political center of gravity within Zionism, and then within Israel, would shift toward territorial expansion and ethnonational exclusivism.

    Read More: The West Is Losing the Global South Over Gaza

This expansion of Israel’s ambitions scuttled the potential for peace. Right when the Iron Wall strategy might have been reaching its pivot point in the 1970s—when Israel’s use of overwhelming force had finally convinced a moderate faction of the Arab world to accept its existence in exchange for peace, and Israel could have shifted from using force to negotiating a political settlement—its goals were changing in ways that made such a deal impossible. In the 1970s, Prime Minister Golda Meir rejected several peace offers from Jordan and Egypt because they would have required territorial concessions.

Beginning in 1977, Israeli politics shifted rightward. Elections swung back and forth in the 1980s and 1990s between Labor and the right-wing Likud Party—which trumpeted expansionist policies.

That had major implications in the 1990s, when the peace process was finally gaining a foothold with the signing of the Oslo Peace Accords. In 1995, a far-right Israeli assassinated Labor Prime Minister Yitzhak Rabin. That enabled Likud’s Benjamin Netanyahu to win election, and he did everything in his power to destroy the Oslo peace process. After its collapse, the Arab world, led by Saudi Arabia, proposed a comprehensive peace with Israel based on the creation of a Palestinian state in the West Bank and Gaza Strip, but no Israeli government has ever officially responded to this initiative.

    And since 2010, Likud has dominated Israeli politics, making peace all but impossible. Though battered by its attempted overhaul of Israel’s court system and its colossal failures with respect to the Gaza Strip, Likud and other parties that insist on Israeli rule of the greater “Land of Israel” will continue to control Israeli governments for the foreseeable future.

    Without peace, wars have regularly broken out. Each time, the Israeli government promises that finally, unlike in the past, the Israel Defense Forces (IDF) will be allowed to fight until victory. In reality, however, as Jabotinsky anticipated in the 1920s, the IDF can never deliver real victory because military means alone cannot transform Middle Eastern realities or create the basis for stable political arrangements based on the principle of the equality of nations. 

That explains why Israel has never ended a war without being ordered to do so. The pattern has held steadily into the 21st century, including in 2006, when the Israel-Lebanon war ended only after the U.S. and France intervened, brokering a U.N. resolution. Even if an Israeli government wants to stop fighting, it cannot, because that would mean admitting that it never had a chance of keeping its promise to solve Israel’s problems by simply “letting the IDF win.” In effect, Jabotinsky’s insight—that crushing military victories were necessary, but only to set the stage for a political compromise that could bring peace—has been forgotten.

This history exposes that Israel’s current laundry list of war aims isn’t designed to be achievable. Rather, these goals are political statements put in place so that after the U.S. inevitably forces an end to the fighting, Netanyahu and his government can claim they would have achieved lofty goals, if only the U.S. had not interceded.

An American order to stop is not what Israeli leaders fear; it is what they expect, and it is what the country always needs. And history shows it will come—eventually. Like previous presidents, President Biden is learning that there are excruciating trade-offs between the dangers, risks, and human costs of allowing an Israeli military campaign to continue and the domestic political consequences of stopping it. When he decides the former concerns outweigh the latter, he will give the order. Then and only then will this war end.

    Ian S. Lustick is the Bess W. Heyman professor emeritus in the department of political science of the University of Pennsylvania. His most recent book is Paradigm Lost: From Two-State Solution to One-State Reality. Made by History takes readers beyond the headlines with articles written and edited by professional historians. Learn more about Made by History at TIME here.

  • The Semiconductor Industry Needs to Learn From the Past

The CHIPS and Science Act, which Congress enacted in 2022, promised $280 billion in funding to reverse a decline in U.S. semiconductor manufacturing (the nation went from producing 37% of the global supply of semiconductors in 1990 to just 10% in 2022). The White House hoped the legislation would make it possible for U.S. workers and communities to “win the future” through domestic, high-tech economic development. Just as it hoped, the new law ignited a race to build government-subsidized semiconductor factories (“fabs”) on U.S. soil.

Yet the news is not all good. The rushed process has been rife with construction-site injuries, safety concerns, and union avoidance. The semiconductor industry is also taking a toll on the environment. In 2022, semiconductor manufacturing accounted for 11% of the U.S.’s non-domestic water usage even though production in the U.S. was low, and it generated massive volumes of greenhouse gas emissions and hazardous waste. These health, safety, and environmental problems raise doubts about whether the U.S. has learned from the industry’s history. When the U.S. was a global leader in semiconductor production, the industry was wracked with occupational hazards, environmental injustices, and union-busting. As the Biden Administration pushes to rebuild the industry, it can learn from this history to ensure that what emerges is better for workers and the environment than the industry of the 1970s to 1990s.

Public memory usually credits the rise of American computing to inventive executives in their labs and garages. Yet this mythology ignores how the industry’s rapid growth from the 1960s to the 1990s also relied on factory workers who produced crucial components. Their contributions came at great risk to their health. Computer chip production was a chemically intensive process that required caustic, understudied solvents to purify and process chip materials. Chemicals used in chip making like trichloroethylene (TCE), ethylene-based glycol ethers, and 1,1,1-trichloroethane (TCA) were linked to maladies including chemical sensitivity, miscarriages, birth defects, and cancer.

Companies rarely let workers know about these hazards in the industry’s early years, but many could tell the chemicals were toxic from firsthand experience. For example, when Pat Lamborn worked on the National Semiconductor production line in the 1970s, she was never told about any hazards of the chemicals she worked with, including TCA. But when she experienced severe acne, her doctor told her it was chemically induced chloracne.

    Read More: The Only Way the U.S. Can Win the Tech War with China

When Lamborn first got her job at National Semiconductor, she had sought to unionize the workplace. Encountering barriers while personally experiencing surprising health effects from her chemically intensive work, she instead joined the growing occupational health movement. In 1978, Lamborn, lawyer Amanda Hawes, and industrial hygienist Robin Baker founded the Project on Health and Safety in Electronics (PHASE) to educate workers on the risks of semiconductor production. The next year they also launched the Electronics Committee on Safety and Health (ECOSH), which focused on organizing. Both organizations eventually became part of the Santa Clara Center for Occupational Safety and Health (SCCOSH).

    These new groups aimed to address widespread health problems in the growing electronics industry. In 1978, electronics manufacturers in California had over four times the state’s average rate of occupationally related illness. 

    PHASE and ECOSH researched the chemicals used in the industry, reached out to workers with a hotline and home visits, and provided them with health, legal, and labor organizing resources. After talking with hundreds of workers about their concerns, they developed a campaign to ban TCE, a common solvent used to produce chips that had already been linked to liver cancer and brain, kidney, and heart damage. The industry fought back, with the pioneering manufacturer Fairchild Semiconductor claiming that such a ban would be based on inadequate research and therefore premature. Nonetheless, by the early 1980s, the activist groups’ campaign succeeded in massively reducing the legal limit of TCE usable in California.

    In addition to limiting the use of TCE in California, occupational health groups worked in coalitions alongside labor unions at the federal, state, and local levels to secure workers’ right to know about the chemicals they worked with. These efforts produced a range of new policies, from local ordinances in Silicon Valley to a new federal OSHA standard, that dramatically increased transparency around workplace chemicals.

Evidence was also starting to emerge that the chemicals involved in electronics manufacturing could pose risks to surrounding communities—something that drew far more attention than the potential danger for employees. In the early 1980s, local residents in South San Jose began to notice unusually high rates of miscarriages and birth defects. They suspected the cause might be toxins in their water, because a spill of chemical solvents had recently spread 2,000 ft. from a nearby Fairchild Semiconductor fab. Research from county and state health officials soon bolstered their suspicions, revealing that residents in the polluted area experienced about twice as many miscarriages and three times as many birth defects as those in a nearby, uncontaminated control neighborhood (though the research did not definitively identify the cause).

In response to these environmental issues, SCCOSH joined diverse allies to launch the Silicon Valley Toxics Coalition, which mounted a grassroots campaign to monitor, clean up, and prevent the industry’s toxic waste. Their campaign shined a light on toxic spills and demanded cleanups. By 1984, Santa Clara County led the nation with 20 EPA Superfund cleanup sites, 16 of which stemmed from computer manufacturing. In 1986, Fairchild reached a multimillion-dollar settlement with local residents in a case tied to the TCA spill (the company had also helped pay cleanup costs).

    Yet, while semiconductor workers’ awareness of chemical risks increased over time, the risks themselves did not simply disappear. Some chemical injuries were severe and unambiguous. For example, in 1986 Judy Ann Myer inhaled chloroethene vapors while trying to retrieve circuit boards from a four-foot-deep vat of solvent, passed out, and died in the vat.

    Read More: The U.S. Releases New Plans In the Fight To Bring Chip-Making Back

Longer-term illnesses like cancer were more difficult to link to any particular chemical exposure, sometimes producing contentious legal battles. When 37-year-old Amy Romero, a former GTE Lenkurt semiconductor worker who was unemployed, suffering from pulmonary disease and cancer, and without health insurance, visited attorney Josephene Rohr in 1984, Rohr remarked that she seemed young to have cancer. Romero replied, “Actually, all the women where I work have lost their uteruses.” In disbelief, Rohr began speaking to other employees at GTE Lenkurt, discovering dozens with ovarian, uterine, colon, skin, breast, brain, and thyroid cancers. This discovery led to the largest workplace illness case in New Mexico state history. Between 1984 and 1992, 225 workers sued GTE Lenkurt and its chemical suppliers. The companies denied responsibility for the workers’ ailments but settled three lawsuits for a total of $9 million.

The coalitions of health, environmental, and labor organizers achieved many partial successes throughout the 1980s and ’90s. However, they realized that more systemic changes would be necessary to avoid repeated problems. They demanded the industry use only chemicals that had been adequately tested and reallocate research funding so that chips would not only become exponentially more efficient over time, but also exponentially safer. They also called for a unionized industry with democratically elected health and safety committees in semiconductor plants. This, they believed, would give workers tangible power over their own safety rather than making them resort to lawsuits after harm was already done.

    But these calls went for naught. The computer industry left its priorities and safety policies up to corporate managers, and it responded harshly to unionization efforts. At a time when union power was waning and employers were heavily exporting manufacturing jobs overseas, union drives in high-tech industries led to more firings and factory closures than union contracts.

[Photo: Speaker of the House Nancy Pelosi (D-CA), alongside House Democrats, signs the CHIPS for America Act during a bill enrollment ceremony outside the U.S. Capitol on July 29, 2022, in Washington, D.C. Drew Angerer—Getty Images]

But the time might be ripe for change in 2024. The CHIPS Act incentivizes returning computer manufacturing to the U.S., the National Labor Relations Board is far less tolerant of actions like the firing of union organizers, and the labor movement is experiencing a renaissance.

Once again, environmental and labor organizations are pushing for a safer, more worker-friendly semiconductor industry. A new coalition of over 50 organizations, including the United Auto Workers, Communications Workers of America, and the Sierra Club, is now demanding that the industry phase out hazardous chemicals, respect semiconductor workers’ right to organize a union, and negotiate with local communities to ensure new fabs support their needs. This coalition is insisting that the Biden administration act to “avoid the problems of the past.”

And their activism exposes the truth: for American workers and communities to truly “win the future” as the administration hopes, lawmakers, regulators, and employers will need to learn from the past to make the industry safer and more sustainable. These goals are not just technical but social; they cannot be attained with more advanced technology alone. History shows that safety and sustainability will require a far more disruptive idea: a union-friendly, democratized tech industry.

    Adam Quinn is a PhD candidate in history at the University of Oregon, where he is writing a dissertation on the environmental and labor history of computers. He is a dissertation fellow with the Just Futures Institute/Center for Environmental Futures/Andrew W. Mellon Foundation. Made by History takes readers beyond the headlines with articles written and edited by professional historians. Learn more about Made by History at TIME here.

  • Langston Hughes' Christmas in Uzbekistan

On a snowy Christmas morning in 1932, the writer and poet Langston Hughes woke up to find a stocking hanging from the post of his bed. It was stuffed with halva, cashews, and pistachios grown by his hosts, a group of African American agronomists who had been living in Yangiyul, Uzbekistan, at the invitation of the Soviet government.

The day was filled with yet more surprises. “We even had pumpkin pie for dessert,” Hughes later wrote in I Wonder as I Wander, his self-described autobiographical journey, “and the tables were loaded down with all the American-style dishes that those clever Negro wives could concoct away over there in Uzbekistan.” The meal was a joint effort, owing to the male agronomists’ ability to coax a harvest from the unforgiving soil and to their wives’ skill at turning the ingredients into a meal reminiscent of the kind they used to enjoy back home in the United States.

    As we enter a holiday season set against the backdrop of war and other conflicts, we would do well to remember Hughes’ forgotten Uzbek celebration. It stands as testament to the power of creating a sense of home and community even under the most challenging of circumstances.

    Read More: The World’s Greatest Places—Uzbekistan

    A chance encounter in Moscow led to the Christmas visit. Hughes had been hired to help punch up the script for a Communist International (Comintern)-financed film titled Black and White. The production was to be shot in Moscow but set at a steel mill in Birmingham, Ala. According to Hughes: “its heroes and heroines were Negro workers. The men were stokers in the steel mills, the women domestics in wealthy homes…Its villains were the reactionary white bosses of the steel mills and the absentee owners, Northern capitalists, who aroused the poor white Southern workers against both the [incipient] union and the Negroes.”

    In short, the film was meant to bolster the Soviet model of trade unionism and criticize the United States’ divisive racial politics. At the same time, it was also an opportunity for African American actors and writers to get work in their chosen fields during a time when such opportunities were in short supply back home.

    While the film crew waited out some production delays, they made their way through Moscow’s popular gathering spots, like the Metropol, the Bolshoi Moscow, and the Grand Hotel, where they were staying. Somewhere along this circuit, Hughes ran into another group that happened to be in the city for reasons somewhat like his own.

That group was led by Oliver Golden, a Mississippi-born veteran of the First World War and graduate of the Tuskegee Institute (now University) who, despite having studied under George Washington Carver, could only find work in the United States as a Pullman porter and dishwasher.

    Those early roadblocks made it relatively easy for him to accept an invitation from a Soviet recruiter to pursue studies at the Communist University of the Toilers of the East (KUTV) in the 1920s. The student body represented more than 70 nationalities and ethnicities and included, at various points, future leaders like North Vietnam’s Hồ Chí Minh and Deng Xiaoping of the People’s Republic of China. The school included African Americans in its mission because the Soviets considered them, like their KUTV brethren, a colonized people (in their case, within their own country, by their fellow citizens), and sought to train them to lead Communist movements in the United States.

Golden’s experience taught him that the Soviets valued his intellect and contributions, and he continued to work for Communist causes when he returned to the United States. That work soon put him in the path of another recruiter, who invited Golden to lead a team of African American agronomists on a two-year project in the Soviet Union’s burgeoning agricultural industry in Uzbekistan. This gave Golden the opportunity to do the kind of work he had gone to school for, and had wanted to do for years. In exchange, the Soviets would pay several hundred dollars a month, which amounted to a fortune during the Depression, along with a home to live in, an extended paid vacation each year, and the services of a household maid.

    Read More: The Black Power Movement Is a Love Story

However appealing he and his wife Bertha Bialek (a U.S.-born daughter of Jewish immigrants from Poland) personally found the Soviet offer, Golden initially struggled to assemble a team. It was a hard sell to ask recruits to leave their home country for a place even Golden had never visited. They knew nothing of its customs and culture, and they worried that they would travel vast distances only to encounter the same kind of racism they had left behind, all without familiar support systems to navigate it. But what choice did they have?

Golden eventually managed to convince a group that included fellow Tuskegee alumnus John Sutton, Wilberforce University graduate George Tynes, Virginia State College’s Joseph Roane, and several of their wives (not all the men were married). Their first official stop was in Moscow, where they pulled into the Oktyabrsky rail terminal on Nov. 7, 1931—a date that, not coincidentally, fell on the 14th anniversary of the 1917 Bolshevik Revolution that overthrew the tsarist government and brought Vladimir Lenin to power. They were treated to a ceremonial program in Red Square, and the next day’s issue of the Moscow News happily reported that the agronomists had front-row views of the festivities.

    Golden and Roane got on with Hughes so well during this busy Moscow interlude that they invited him to call on them in Yangiyul sometime if he was ever willing to make the trip. No one knew it at the time, but the Black and White production would soon fall apart, dashing the film crew’s artistic hopes but ultimately freeing up Hughes to travel around the Soviet Union and take his new American friends up on their offer to visit.

    Read More: The Overlooked LGBTQ+ History of the Harlem Renaissance

    Hughes arrived just in time for Christmas in 1932. By then, the agronomists had begun to tally several successes like introducing a new species of cotton to the region (one that is still grown today), growing their own food, getting married, having children, developing ties with the local Uzbek community, and generally finding ways to settle into their lives by drawing inspiration from both old worlds and new.

Still, to a casual visitor like Hughes, Yangiyul’s pleasures were slow to reveal themselves. He had had a long, somewhat arduous journey east, and the end was marked by the surprising discovery that his train would only be slowing down, not stopping, to let him disembark.

Golden and Roane eagerly met him at the train station for what should have been a happy reunion. “But as we slithered through the sticky snow and pulled our feet from the sucking mud at each step,” Hughes admitted, “I am afraid I failed to hide very well my lack of joy at seeing at last a large group of fellow American Negroes away out in Asia.” While not the most easygoing guest upon his arrival, Hughes soon understood what a feat the agronomists and their families had managed. Out in this vast expanse, a world away from the home they had been forced to leave by circumstances beyond their control, they came together to find refuge, community, and, in that, a reason to celebrate.

    Tamara J. Walker is an Associate Professor of Africana Studies at Barnard College, co-founder of The Wandering Scholar, and the author of Beyond the Shores: A History of African Americans Abroad. Made by History takes readers beyond the headlines with articles written and edited by professional historians. Learn more about Made by History at TIME here.

  • The Nigerian Activist Whose History Holds a Lesson for Today

Around the world, workers are mobilizing for higher wages while citizens are questioning the very functioning of democratic institutions. In the United States, workers in industries from film to auto manufacturing to coffee shops are demanding adequate compensation.

In Nigeria, similar demands are being made. Meanwhile, just as in the U.S., there is rising distrust of electoral practices. With ordinary people in Nigeria facing significant hardship, partly because of the government’s removal of a critical oil subsidy, the country’s largest unions called a short national strike in November.

    These events, coinciding with the 63rd anniversary of Nigeria’s independence, highlight ongoing questions about Nigeria’s political landscape that reach back to the colonial period and demonstrate the connections between democracy and labor activism.

    Read More: Why Nigeria’s Election Is a Key Test for Democracy in 2023

    In particular, the relatively unsung ideas of Olaniwun Adunni Oluwole are worth revisiting. She placed the political and economic needs of everyday Nigerians at the forefront of her agenda. Remembering her activism is important not only for rethinking Nigeria’s political future but also for illustrating how solidarity—forged across class and gender lines—can spur change in many contexts.

Oluwole grew up amid the British forces’ gradual subjugation of the communities that were later amalgamated into the colony named Nigeria. Born in 1905, Oluwole attended primary school in Lagos and then worked as an actress and itinerant preacher. Her transition from the pulpit to the political arena occurred in 1945 during one of the most significant moments in Nigeria’s history: the colony’s first and longest labor strike, which involved 40,000 workers and lasted for six weeks.

    Since 1941, Nigerian trade unions had agitated for higher wages in response to degraded working conditions and economic inflation exacerbated by the Second World War. Shortly after the war’s outbreak in 1939, the British government had diverted food supplies to the army and mandated overtime hours in railway and factory work. Policies pushing farmers to cultivate cash crops at the expense of food crops limited Nigerians’ food production, as did price regulations. These initiatives inflated the cost of basic necessities and made life difficult for colonized people.

These policies also brought Nigerian women into the fight. The Lagos Market Women Association (LMWA), which represented female traders, rejected the wartime price controls, including the government’s decision to buy and sell food at its own stores. Led by the association’s president, Madam Alimotu Pelewura, LMWA members argued that the British government undercut their profits and disrupted the local marketing system. The traders protested price restrictions by storming official meetings, petitioning British officials, and holding street demonstrations. Recognizing the women’s political influence, labor leader Michael Imoudu requested the LMWA’s support in case of a strike, telling Pelewura in a 1941 letter that “our strike is also your own strike.” The labor unions and the broader community shared in the goal of defeating British economic oppression.

    Read More: Nigeria Is Bracing for a Potentially Mammoth Strike Over a Hike in the Fuel Price

The overworked and underpaid unions pressed their employers for cost-of-living salary increases. Although the British governor initially claimed that the government had no funds for African workers, by May 1942 his administration was providing a “separation allowance” to European expatriates whose wives resided abroad. Finally, in June 1945, the railway workers in Lagos kickstarted the strike that spread across Nigeria. The strikers demanded that the British colonizers increase wages and offer African workers benefits like “family allowances,” similar to the European separation allowances.

    Nigerian women sustained the strike by donating funds to the Workers’ Relief Fund. Oluwole, the actress and preacher, devoted herself to the workers’ cause. In addition to feeding workers who visited her Lagos home, Oluwole spoke at strikers’ meetings, encouraging them to maintain the picket line. Her efforts later earned her the title of the “workers’ mother.”

    The labor unions also depended on the connections that leaders had forged with the LMWA. The traders aligned with the strikers because of their shared criticisms of the British government’s price controls as well as family ties. Male wage earners were the fathers, brothers, and sons of working women. Traders across Nigeria followed their Lagos peers by selling items to the strikers on credit or reduced prices.

The strike slowed the British government’s emergency war production and forced it to recognize the power of working-class people. By December 1945, the British administration agreed to pay workers a living allowance 50% higher than in the pre-strike era. The LMWA’s agitations also forced the European administrators to remove gari, a staple food made from cassava, from the control list after the end of the war.

    The strike’s success demonstrated the transformative power of class and gender solidarity in pressing for workers’ rights and temporarily crippling the colonial economy.

    The strike also strengthened Oluwole’s resolve to agitate for Nigeria’s freedom from British rule. Oluwole was convinced that only expelling British colonizers would relieve workers’ suffering and improve Nigerians’ living conditions. She turned her full attention to supporting the Nigerian political parties who were committed to self-rule.

    Read More: Queen Elizabeth II’s Death Is a Chance to Examine the Present-Day Effects of Britain’s Colonial Past

    The country’s leading politicians had been pushing the British government to accelerate the date for independence to 1956, as well as replace the existing government with a federalist one that would divide power among the three regional legislatures.

However, Oluwole opposed the parties’ agenda, explaining her stance in a 1954 newspaper op-ed: “I am not anti-self-government when it is to the benefit of the majority of the people. The stand of my party is that self-government for Nigeria in 1956 is premature.” She established the Nigerian Commoners’ Party (NCP) to promote everyday people’s interests and to oppose other parties’ agitation for self-government by 1956, concerned that such an acceleration would set up the new nation for failure.

Oluwole’s discontent was shaped by the nepotism, electoral fraud, and aggrandizement of wealth that were already unfolding among these parties. Advancing an alternative vision, Oluwole wanted a decentralized yet unitary government that would prevent regional power plays and imbalances. She advocated a political structure in which local councils with greater administrative powers would coexist with a single legislature.

    Before she could do more to achieve her vision, though, Oluwole died at age 52 in 1957.

    But revisiting her alternative proposals yields new insights. Her criticisms of Nigeria’s emerging political structure proved prescient. Moreover, lifting up the example of forgotten women like Oluwole today reminds us of the possibilities of “anti-colonial world-making” that exposed the unequal relationship between colonized people and European rulers.

Women’s organizing changed workers’ conditions and weakened the colonial economic order: the 1945 labor dispute’s success depended on the strikers’ connections with women leaders and on a recognition of their shared struggles against colonial exploitation. By foregrounding the working class’s interests, Nigerian women envisioned and created transformative political changes. Any future for Nigeria will have to do the same.

    Halimat Somotan is a historian of 20th century Lagos, Nigeria and assistant professor of African studies at Georgetown University’s Edmund A. Walsh School of Foreign Service. Made by History takes readers beyond the headlines with articles written and edited by professional historians. Learn more about Made by History at TIME here.

  • America's War on Drugs Was Always Bipartisan—And Unwinnable

For 70 years, politicians in both parties have fought an unwinnable war on drugs. In the latest chapter, the Biden Administration has labeled Mexican cartels the top criminal threat facing the U.S. and proposed devoting even more resources to trying to keep drugs from crossing the border—policies that historically have only made things worse. The Republican candidates for president want to go even further. They fantasize about invading Mexico, destroying the cartels, and shooting suspected smugglers at the border. Both sides see American drug users as innocent victims rather than the source of demand driving a lucrative illegal market.

This bipartisan consensus has two racialized foundations. Politicians have long competed to punish drug traffickers—whom they typically portray as foreigners and racial minorities. Meanwhile, government policy historically has defined most white, middle-class illegal drug users (not just addicts) as both criminals and victims, to be arrested and forced into treatment. As a result, drug warriors have poured more than a trillion dollars into law enforcement and involuntary rehabilitation—with little more to show for it than a punitive and racially discriminatory system of mass incarceration.

    This history exposes the truth: the drug war isn’t winnable, as the Global Commission on Drug Policy stated in 2011. And simply legalizing marijuana is not enough. Instead only a wholesale rethinking of drug policy—one that abandons criminalization and focuses on true harm reduction, not coercive rehabilitation—can begin to undo the damage of decades of a misguided “war.”

    The modern drug war began in the 1950s, with liberals—not conservatives—leading the charge. In California, the epicenter of the early war on narcotics, white suburban grassroots movements prodded liberal politicians like Governor Pat Brown into action. They blamed “pushers,” usually perceived and depicted as people of color, and demanded that elected officials crack down on the drug supply. Legislators in California, Illinois, and New York responded by passing the nation’s first mandatory-minimum sentencing laws in an effort to save teenagers from these traffickers.

    Read More: Decriminalizing Opioids Will Save Countless Lives

    In 1951, the initial wave of grassroots activism and state legislation pushed Congress to enact the first federal mandatory-minimum law, which likewise targeted Black and Mexican American “pushers” who allegedly supplied heroin and marijuana to innocent white teenagers. Policymakers included marijuana because of the mythology that youthful experimentation would inevitably lead to heroin addiction. To add further urgency, politicians and the news media routinely depicted a horror story in which these “pushers” hooked white middle-class girls and women on drugs, consigning them to a downward spiral that almost invariably resulted in prostitution.

    While the enforcement of these new drug laws initially focused on the ominous “pushers,” police ultimately arrested millions of white teenagers and young adults for marijuana and other drug offenses—albeit with a different goal in mind. For white middle-class youths, a drug arrest almost always led to either dropped charges (often after parents agreed to seek private rehab) or diversion to a treatment program through a process that did not leave any traces on their permanent record.

    Law enforcement focused their attention on marijuana because it held the most allure for white middle-class youths. This made it the number-one enemy for white parents, and therefore the illegal drug that politicians in both parties cared the most about. The crackdown on marijuana aimed to save these suburban youths from themselves and from what smoking pot symbolized—an alleged gateway to heroin addiction during the 1950s and 1960s, political radicalism and hippie values during the 1960s and 1970s, and the “amotivational syndrome” of laziness and apathy in the 1970s and 1980s.

    In 1970, the obsession with rehabilitating young white marijuana users directly shaped a seminal federal drug law, jointly crafted by congressional Democrats and the Nixon Administration. This legislation reduced the 1950s-era penalty for possession of all illegal drugs from a mandatory-minimum felony to a misdemeanor. Politicians designed this provision to provide prosecutors and judges with more leverage to coerce white marijuana offenders into rehabilitation through conditional probation that would not leave a formal record.

    The use of law enforcement to deter and rehabilitate recreational pot smokers reached its peak between the mid-1960s and the mid-1970s, when mass disregard for marijuana laws accompanied the rise of the campus antiwar movement and the counterculture. The proportion of white Americans arrested on drug charges reached historically high levels and the percentage of drug arrests in the suburbs quadrupled. White youths accounted for around 89% of juvenile drug arrests during the 1970s, a percentage that would drop precipitously once the racially selective war on crack cocaine began.

    Soaring arrest rates prompted a dramatic turnabout from white suburbanites. Instead of clamoring for a crackdown, parents of teens facing criminal charges usually demanded leniency or no punishment at all for what they began redefining as a victimless crime committed by “otherwise law-abiding people.” Many students and young adults joined vibrant political movements for marijuana legalization or decriminalization, led by the ACLU and the National Organization for the Reform of Marijuana Laws (NORML). This grassroots pressure convinced politicians in 11 states to decriminalize marijuana possession—but not sale—during the 1970s. Young activists kept demanding full legalization as a right of personal freedom and denounced both the incarceration and forcible rehabilitation of recreational pot smokers.

    In the late 1970s and 1980s, however, their crusade ran into a wall because of a new group based in the white suburbs: the National Federation of Parents for Drug-Free Youth. This coalition began sounding the alarm that the growing rates of marijuana smoking by teenagers and even preteens would destroy the futures of middle-class children and represented “the most massive and pervasive drug epidemic in human history.” Pressure from this movement convinced the Carter Administration to reverse its support for marijuana decriminalization and re-escalate the war on drugs—targeting marijuana as well as cocaine.

    Later, the National Federation of Parents also worked closely with the Reagan Administration to target nonwhite and foreign drug traffickers to cut off the supply for affluent white suburbs. Instead of militarized law enforcement, however, their own children received the “just say no” message made famous by First Lady Nancy Reagan. The white suburban movement’s focus on marijuana prompted her husband’s administration to shift funding away from urban treatment centers to prioritize the perceived emergency of white teenage and preteen marijuana use. In response, Democrats in Congress charged that President Reagan was losing the war on crack cocaine because of his administration’s obsession with saving white kids from smoking pot.

    Read More: Black Silent Majority and What We Can Learn From History’s Criminal Justice Reformers

    As this criticism exposed, at every step along the way, politicians in both parties shared similar goals: cracking down on evil suppliers, protecting “innocent” victims, and using law enforcement to force addicts and illegal drug users into rehabilitation. This resulted in extreme racial disparities in drug-war enforcement, with punitive incarceration for “pushers” almost always defined as nonwhite, and diversion into treatment and rehabilitation for white middle-class law-breakers overwhelmingly labeled their “victims.”

    There was really only one big difference between liberal drug warriors and conservative ones. While liberal drug warriors propagated the “pusher” myth, they also often wanted to spend more on treatment and rehabilitation in nonwhite communities. Many liberals believed that at least some Black and Mexican American youths were also victims of illegal drug traffickers and therefore should be sent to rehabilitation rather than prison. But this conviction never stopped them from drafting most of the get-tough laws that police, prosecutors, and judges predictably used in discretionary and discriminatory ways to target nonwhite neighborhoods and send their residents to prison, jail, or juvenile detention at far higher rates.

    Over the last 15 years, liberal policymakers have become more willing to acknowledge the massive racial disparities in drug war enforcement and to prioritize alternatives to incarceration for “nonviolent” drug offenders, especially those arrested for possession. They want to extend the diversion programs long used for affluent white drug users to everyone. The Biden Administration’s National Drug Control Strategy includes a commitment to “advance racial equity” in drug-related arrests and sentencing while also reducing the clear racial disparities in the diversion of individuals to rehabilitation alternatives.

Yet, this well-intentioned vision will not bring substantial changes to our nation’s vast and discretionary drug control system, and it does not abandon the fundamental and disastrous policy of criminalization of drug use or its corollary of coercive public health. The stark truth is that the drug war is a failure, and only a wholesale rethinking—one that focuses on harm reduction and renounces criminalization itself—can save us from repeating the mistakes that have defined seven decades of drug policy, cost taxpayers trillions, and done little to reduce drug usage.

    Matthew D. Lassiter is the author of The Suburban Crisis: White America and the War on Drugs, available now from Princeton University Press. Made by History takes readers beyond the headlines with articles written and edited by professional historians. Learn more about Made by History at TIME here.

  • Joe Biden’s Unprecedented Support for Israel

    President Biden is proving himself, by some measures, the most pro-Israel president in American history. 

    Starting on the very day of the Oct. 7 Hamas terrorist attacks in Israel, Biden has emotionally expressed his solidarity with Israel and affirmed its need to fight fire with fire. He has expedited military assistance and, most dramatically, he flew to Israel and sat as part of Prime Minister Benjamin Netanyahu’s war cabinet. While there certainly is a time-honored tradition of a “special relationship” between the U.S. and Israel, Biden has gone where no other U.S. president ever has gone during one of Israel’s wars—not only physically but also politically and strategically.

Biden’s predecessors have traditionally decided how to handle Arab-Israeli wars by making calculations about geopolitics and American politics. Often they prioritized limiting the impact of these wars on U.S. strategic goals. The domestic political position of each president also shaped his actions. Biden’s support for Israel, by contrast, has been more full-throated and less qualified. It’s also less clear that he’s acting in a way congruent with U.S. strategy and his own political needs.

In November 1947, the United Nations (U.N.) voted to create a Jewish state—and a Palestinian one—out of what had been Britain’s Palestine mandate, and war erupted in the Holy Land. President Harry S. Truman conferred recognition on the State of Israel after its leaders declared independence in May 1948. Truman may have hoped to reap a political benefit with Jewish voters as he stood for election later that year. Even so, he refrained, then and afterward, from providing military assistance to Israel. Truman wished to prevent an arms race in the Middle East, and he did not want to completely estrange the Arab states.

    In 1956, Israel conspired with France and Great Britain to attack Egypt. Israel seized the Sinai Peninsula, which it coveted as a security buffer against Egypt’s military.

    President Dwight D. Eisenhower was livid over what he saw as his European allies’ deceit and Israel’s mischief. Eisenhower’s focus was the Cold War and he was convinced that Britain and France’s imperialist effort to repress Egyptian nationalism, abetted by Israel for its own purposes, would increase Soviet influence in Asia and Africa. The invasion occurred only days before Eisenhower won a huge reelection victory, and his strong political position domestically encouraged him to take the diplomatic offensive. He put the screws to the three invading states and forced Israel to retreat from the Sinai in 1957.

    Read More: Everything President Biden Has Said About the Israel-Hamas War

    In contrast to Eisenhower, a decade later, when Israeli fears about aggression from Arab states prompted another war in the Middle East, President Lyndon B. Johnson sympathized with Israel. Even so, he carefully calibrated his expression of support. Johnson wrote to Prime Minister Levi Eshkol, “I must emphasize the necessity for Israel not to make itself responsible for the initiation of hostilities. Israel will not be alone unless it decides to go alone.”

    Johnson took care to place some distance between the U.S. and Israel’s military decisions. He might have prevented the war by sending American troops on a peacekeeping mission to Israel’s borders, but there was no way he was going to plunge the U.S. into what might become a war zone. He didn’t want to own Israel’s war and probably hoped to maintain his government’s credibility as a diplomatic broker after the war ended—with Israel again taking the Sinai, in addition to Gaza, the West Bank, East Jerusalem, and Syria’s Golan Heights.

    The U.S. supported subsequent U.N. efforts to set the table for a land-for-peace deal between Israel and the Arab states. However, the Arab world’s rejection of negotiations with Israel enabled Israel to begin solidifying its hold on these newly conquered lands and forestalled, at least in the near term, any potential mediation by the Americans. Johnson navigated the diplomacy of the Middle East conflict shrewdly and it played no role in his eventual downfall over his war in Vietnam.

In 1973, Egypt and Syria attacked Israel, seeking to win back lost territories and bargaining power. President Richard M. Nixon—operating with great political latitude after his landslide reelection the previous year—gave Israel unstinting military aid. Nixon’s unqualified material support for Israel is the closest approximation of Biden’s full-on allyship today. Yet the October War of 1973 was a conventional war between armies, not a conflict like the current one, in which the great majority of those killed by both parties are civilians.

    The Cold War had also taken a new turn and Nixon saw the war as a proxy conflict between East and West. While Nixon’s actions brought the U.S. and the Soviets perilously close to a confrontation, in the end his support for Israel helped make the war a draw. This outcome prepared the way for the later peace agreement between Egypt and Israel, as each state proved its military potency, while each also now recognized that their country could not overwhelm the other on the battlefield.

    By the 1980s, the conflict between Israel and the Arab world shifted to a new front and new players. Tensions had intensified amid attacks by Palestinian groups against northern Israel and the political threat that the Palestinian nationalist movement—based in Lebanon—posed to Israeli control of the West Bank. President Ronald Reagan gave Israel the green light to invade Lebanon and decimate the Palestine Liberation Organization, which Israel’s prime minister, Menachem Begin, likened to Nazi Germany.

    But the entire U.S. government was shocked when Israeli General Ariel Sharon laid siege to Beirut, exceeding the plans he had shared with the Americans. Reagan, staunchly pro-Israel, felt used. The public outcry against Israel’s shelling of civilian neighborhoods added to Reagan’s alienation from Israel’s behavior. He told Begin that Israel was perpetrating a “holocaust” and he demanded that the prime minister reverse Israel’s cut-off of water and electricity to Beirut. Begin was outraged, but he complied with Reagan’s wishes.

Although a first-term president, Reagan proved willing to chastise Israel when he deemed his ally’s actions reckless and beyond decency’s bounds. He expressed care and compassion about the deaths of Arab civilians, especially children. Reagan later succeeded in pressing the PLO to forswear terrorism and thus brought the group into international diplomacy, helping build the path toward the Oslo Peace Accords of the 1990s.

    Read More: Biden to Ask Congress for $105 Billion to Bolster Israel and Ukraine, the U.S.-Mexico Border, and the Indo-Pacific

Today, Biden confronts a situation different from those his predecessors faced. The grisly killings of 1,200 people, mostly Israeli civilians, as well as the kidnapping of hundreds, by Hamas on Oct. 7 were unprecedented. It is not surprising that Israel’s response has been ferocious and vastly larger than its past reprisals. Yet Biden seems to have pushed aside the strategic and political considerations that guided past American leaders’ responses to Israel’s wars. He has joined himself at the hip politically with Netanyahu and echoed Israel’s talking points faithfully. However, after criticism for appearing to write Israel a blank check, Biden worked to broker the release of Hamas’s hostages in exchange for a limited ceasefire and the freeing of Palestinian prisoners from Israel.

Biden has expressed concern over Israel’s lack of a clear and plausible endgame for Gaza, but he nonetheless gave Israel a bright green light to wage war there to root out Hamas. By his own account, his revulsion over Hamas’s “pure evil” and his heartfelt embrace of the framing of this terrorism as a continuation of age-old violent hatred against the Jewish people gave him no alternative.

    In the eyes of the world, there is no space between the American president and the Israeli war, and this reality poses strategic and political risks. American efforts to persuade other states that Russia’s siege of Ukrainian cities is atrocious may now fall on deaf ears, considering Biden’s support for Israel’s siege of Gaza, which has killed well over 10,000 people. Biden’s fervent defense of Israel’s war also threatens the president’s support among younger, more diverse Americans, who empathize with the Palestinians in a way that Americans of Biden’s generation scarcely comprehend. His statements of concern over Palestinian deaths have been belated and unemotional, creating a vivid contrast with his rousing declarations that he, as America’s leader, will keep faith with Israel and with Jewish life.

    We don’t yet know whether Biden can pave a road to peace out of war or whether his actions will dangerously destabilize domestic or international affairs. He faces a challenge in undoing the perception that he rushed to approve Israel’s war with no true red lines, although he shows signs of seeking a more balanced posture. Up to now, Biden has redefined what American support looks like during an Israeli war.

    Doug Rossinow is professor of history at Metro State University in St. Paul, Minnesota. The author of works including The Reagan Era: A History of the 1980s (2015), he is writing Promised Land: The Worlds of American Zionism, 1942–2022, to be published by Oxford University Press. Made by History takes readers beyond the headlines with articles written and edited by professional historians. Learn more about Made by History at TIME here.

  • Ronald Reagan Offers a Lesson for Biden on Iran

    Ronald Reagan Offers a Lesson for Biden on Iran

    Hamas’ brutal surprise attack on Israel on Oct. 7 has upended the Biden Administration’s desire to embrace a new world order. Last month, Secretary of State Antony Blinken declared that the U.S. finds itself at a “hinge moment in history.” He insisted that “the core assumptions that shaped our approach to the post–Cold War era no longer hold.”

    Blinken’s statements reflected the desire of President Biden and his team to reorient foreign policy focus away from the “forever wars” that have dominated the past two decades and toward a new era of great power competition with China and Russia. The threat from China looms especially large in the minds of both parties in Washington, and it had seemed as if Barack Obama’s vice president was finally on the cusp of fulfilling his former boss’s long-awaited strategic “pivot to Asia.”

    Instead, the actions of Hamas, which has been supported by Iran (though Iran denies direct involvement in the events of Oct. 7), have proved that “rogue states”—which have been the defining threat to U.S. and global security since the end of the Cold War—remain a potent menace to international peace and stability. Rogue regimes such as Iran warrant continued attention and resources from the U.S. government and its allies as they work to protect the “rules-based” order that Biden’s predecessors established after the Soviet Union’s demise. To formulate a successful foreign policy in this fraught moment, Biden can look to the strategies that Ronald Reagan’s administration forged at the end of the Cold War to combat the emerging threat.

    Read More: As Biden Responds to Iran-Linked Attacks With Air Strikes, Fears of a Wider War Grow

    In the late 1980s, rogue states like Iran and Libya emerged from the ashes of the Cold War as the central menace to global security. Though pariah states had long challenged the established international order, their threat grew with the end of the superpower competition between the U.S. and the Soviet Union. The close of the Cold War diminished the restraints the two rivals had imposed on rogue states in their orbits. Now unshackled, these regimes brought together three new security threats that had seemed to be peripheral nuisances as the 1980s began: regional military aggression, state-sponsored terrorism, and the pursuit of weapons of mass destruction.

    The rising urgency of these new threats became all too apparent in 1983. The year before, Israel had invaded Lebanon to put an end to attacks from the Palestine Liberation Organization (PLO). As the situation spiraled into a humanitarian crisis, President Reagan grew appalled by the images of horrific violence consuming the Lebanese capital. He deployed U.S. Marines as part of a multinational peacekeeping force to contain the bloodshed and oversee a shaky ceasefire in Beirut. On Oct. 23, 1983—40 years ago this month—a truck loaded with explosives destroyed the Marines’ barracks, killing 241 U.S. military personnel in the worst terrorist attack against the U.S. until 9/11. Reagan later wrote in his memoirs that the attack amounted to “the lowest of the low” points of his eight years in the White House.

    It quickly became apparent that Iran had supported the terrorists who carried out the attack. But a decisive American response crumbled in the face of sharp disagreement between the State and Defense Departments over whether to use military force against terrorists. Though Reagan had vowed soon after taking office to enact “swift and effective retribution” for terrorist attacks against Americans, he did not yet have a clear strategy to back up his bold words with action.

    This realization, combined with the escalation of attacks against the U.S. and its allies, triggered a recalibration of America’s approach toward terrorism. Reagan declared in July 1985 that terrorist attacks were “acts of war” and he began implementing the first comprehensive counterterrorism strategy in U.S. history. This policy emphasized offensive as well as defensive measures, including focusing attention on state sponsors of terrorism. Reagan proclaimed that the United States would not tolerate “attacks from outlaw states run by the strangest collection of misfits, loony tunes, and squalid criminals since the advent of the Third Reich.”

    Libya became the first poster child for these emerging “outlaw states.” Reagan referred to the Libyan dictator Muammar Gaddafi as the “mad clown” of Tripoli. Gaddafi’s regime had established itself by the mid-1980s as the most brazen supporter of international terrorism against Western and American targets, including attacks against passengers of the Israeli national airline in the Rome and Vienna airports in Dec. 1985 and then the bombing of a disco in West Berlin in early April 1986. The latter killed three and wounded 229, among them 81 American servicemen. With clear evidence of Libyan sponsorship, this last attack finally triggered a decisive American response.

    The president began deploying his new counterterrorism strategy against Gaddafi’s regime in late 1985 and early 1986. His administration implemented a multifaceted plan that brought together diplomatic, economic, intelligence, and eventually military initiatives in an escalating series of steps to isolate Gaddafi’s rogue regime. These efforts included economic sanctions and a display of U.S. military power through naval exercises off Libya’s coast. They culminated in airstrikes on April 14–15, 1986.

    Reagan’s administration followed the strikes with a determined push to enlist the support of previously reluctant European allies. The aim was to forge an international coalition to turn back the threat posed by terrorists and their state sponsors. Reagan sought to maintain pressure on Libya through tightened sanctions and covert measures designed to foment regime change against Gaddafi.

    Though Gaddafi clung to power, his support for terrorism notably faded in the aftermath of Reagan’s bombing of Libya. His agents resurfaced for one final major attack in the last days of Reagan’s presidency, bombing Pan Am Flight 103 over Lockerbie, Scotland, in Dec. 1988 (though Western intelligence did not confirm Libya’s role for several years). That same month, Reagan threatened military action to destroy a Libyan chemical weapons factory under construction. By the time Reagan left office, his strategy against rogue states had largely succeeded in curbing the Libyan threat, which never again came close to its peak destructiveness of the mid-1980s.

    Moreover, this strategy created a precedent for using military force against states sponsoring terrorism, laying the groundwork for the coercive diplomacy used during the Gulf War and after 9/11 during the War on Terror.

    Reagan’s experience with rogue states and their terrorist allies offers a blueprint for President Biden. Reagan’s uncertain response to the 1983 Beirut bombing, which eventually led to the withdrawal of American forces from Lebanon, emboldened rogue states such as Iran and disheartened America’s allies in the Middle East. But his decisive response to Libya’s brazen terrorist acts in 1986—which integrated all elements of American power, including diplomatic and economic measures as well as military force—degraded this foe from a growing threat to a waning, second-tier troublemaker.

    Biden faces a similar moment in the wake of the Hamas attack on Israel. Iran will be watching to see if the U.S. responds decisively. Clear repercussions, not just for Hamas but also for Iran, are essential. Biden should gear U.S. efforts toward accelerating the diplomatic and economic isolation of both regimes, enlisting the support of allies while making clear the willingness to display—and use if necessary—U.S. military force against any further escalation of the conflict against Israel. That’s the lesson of Ronald Reagan’s fight against terrorism.

    Speaking at another hinge moment in history, on the day in 1990 that Saddam Hussein’s Iraqi tanks rolled across the border to invade Kuwait, President George H. W. Bush predicted the threats that would define the post–Cold War era: “Terrorism, hostage-taking, renegade regimes and unpredictable rulers, new sources of instability—all require a strong and an engaged America.” This era may be evolving, but it is not yet over.

    Matthew Frakes is an America in the World Consortium Postdoctoral Fellow at the Henry A. Kissinger Center for Global Affairs at the Johns Hopkins University School of Advanced International Studies (SAIS). He is currently writing a book that examines the reshaping of U.S. national security at the end of the Cold War.

    Made by History takes readers beyond the headlines with articles written and edited by professional historians. Learn more about Made by History at TIME here.

  • The Integration Story that Challenges What We Think We Know

    The Integration Story that Challenges What We Think We Know

    Students in most history classes around the United States learn how white students hounded James Meredith in 1962 when he integrated the University of Mississippi. They might see the footage of Governor George C. Wallace barring the door of Foster Auditorium to prevent two Black students from enrolling at the University of Alabama a year later. These moments demonstrate both the persistent hopes for equal access to education and the barriers to achieving them. But there are other, less familiar moments in civil rights history that complicate our conventional understandings of the Black freedom struggle and white reaction to it. One of these is the integration of the oldest public university in Alabama.

    Just five days before the bombing of the 16th Street Baptist Church in Birmingham, Ala., Wendell Wilkie Gunn quietly matriculated at Florence State College (today, the University of North Alabama). Reinserting Gunn’s story into the history of the civil rights movement invites us to consider the complex historical logics of integration and offers new insights into, as Martin Luther King Jr. memorably put it, where “we go from here.”

    Gunn was a native of the area in Northwest Alabama known as the Shoals, having grown up across the Tennessee River in Spring Valley, Ala. After a few years of college at the historically Black Tennessee State, he came home, hoping to enroll in the local college.

    Read More: How Robert F. Kennedy Shaped His Brother’s Response to Civil Rights

    Inspired by a yearbook at a family friend’s home, Gunn simply walked into the Florence State College registrar’s office and requested an application. The surprised receptionist summoned the university’s president, E.B. Norton, who informed Gunn that all colleges in the state were segregated and referred him to the state’s HBCUs. But then Norton did something unexpected: he advised Gunn that if he were to file suit in federal court, Florence State might be compelled to consider his application. Gunn, reflecting on this moment later, surmised that the college president “was waiting for me.”

    He likely was. A pragmatist, Norton accepted that integration was inevitable, and he thought that it would be best to allow it quietly and peacefully. 

    Gunn’s request to apply presented Norton with an opportunity. For one thing, Gunn arrived alone and seemingly without an affiliation with any civil rights or legal organization. He had, in fact, participated in an attempt to integrate the University Church of Christ in Nashville in 1960, but he insisted that his Florence State College application was born simply of a desire to go to school near home.

    Norton also thought that, despite the presence of white supremacy in the region, the residents would peacefully comply with a federal order. Because of its centrality to the Tennessee Valley Authority, Lauderdale County had long been reliant on federal money and jobs. It also had a relatively small Black population compared to the rest of the state, which meant that, for white residents, defying the federal government would pose a greater existential threat to their way of life than integration of the local college.

    When Gunn returned home from campus that day and told his mother, Mattie Crawley Gunn, what Norton had said, she called Fred Gray, the famous civil rights attorney and a family friend. Gray took the case and filed suit. A few weeks later, in a Birmingham courtroom, Federal Judge H.H. Grooms ordered Florence State to evaluate Gunn’s application as it would that of any white student. Gunn enrolled that week.

    Not that it was easy. When Gunn’s attempt to enroll became public, there came predictable, terrifying calls: “You show up on that campus, there’s going to be rifles pointing at your head. I guarantee you.” Gunn’s aunt, visiting from Gary, Ind., put covers over the windows so snipers couldn’t see whom to shoot. His father, a union leader at the local Reynolds Metals plant, was threatened; his mother lost her job as a cook.

    Facing the threat of violence, Gunn’s family and the larger Black community offered steadfast support. At lunch on Florence’s West Side, in clubs on Friday nights, and in church on Sunday morning, people gathered around him, understanding the significance of what he was doing, and the sacrifice. 

    While Gunn excelled academically at Florence State, his first months were isolating. “Almost no one spoke to me except my instructors. I had zero social life on campus.” It didn’t help that the dean walked with him to class every day for security purposes. Beyond the loneliness, the threats continued. Norton anticipated this. He assembled the football team on Gunn’s first day and told them to make sure that nothing happened. When one player heard a racial slur directed at Gunn, he intervened, grabbing the offender and informing him, “We don’t do that here.”

    Read More: The South Could Mend America’s Divide—If It Reckons With Its Past

    The turning point for Gunn came at the end of his first year when he won the college’s Physics Achievement Award. “As I stood up to accept the award,” he recalled, emotionally, “the audience began to applaud. It started low and grew quickly. Until that moment, I had no idea how much eight months of silence and isolation had affected me. My emotions exploded, with tears to match. The more I cried, the louder the audience applauded. Ten seconds later, the entire audience was on its feet, cheering.” From this point on he was “just another student.”

    Gunn graduated with honors from Florence State in 1965 with a degree in chemistry. After working for four years as an industrial chemist, he made his way to Chicago, where he enrolled in an M.B.A. program at the University of Chicago’s business school in 1969.

    After years working in upper management at Chase Manhattan Bank and PepsiCo, Gunn accepted a position in 1982 as a Special Assistant to President Ronald Reagan for Policy Development for International Trade, a post he held for two years. During George H.W. Bush’s administration, he served as chief of staff to Secretary of Housing and Urban Development Jack Kemp.

    In 2017, he got another invitation: to deliver the commencement address at his alma mater, now the University of North Alabama, where he received an honorary doctorate. The next year, the school named its newly constructed student center in his honor, and the University invited Gunn to serve on its Board of Trustees, which he accepted. “I know of no other example out there where a student entered under the circumstances that he did, went on to graduate, and then returned to join the institution’s governing body,” UNA President Ken Kitts stated. “This is truly full circle.” Fred Gray agreed: “Of all my many clients, Dr. Gunn is the first to be appointed to the governing board of the institution that he helped desegregate.”

    This is not the typical narrative of school integration in the American South. Instead, civil rights histories have tended—correctly, in most cases—to point out the instances of violent racist opposition, the trauma inflicted on children, the rise of Christian segregation academies, and the persistence of unequal education. But there are other narratives, too, that offer not just hopeful accounts of moral sense but also visions of progress based on shared interests, what civil rights scholar and activist Heather McGhee calls the “Solidarity Dividend.”

    Gunn’s unexpected story is one of these, offering a different, if still complex, version of integration’s afterlives. Undoubtedly, the University of North Alabama, in publicly embracing Gunn and his successes, might be acting in its own interests, not unlike Norton did 60 years ago. But Gunn’s position on the Board of Trustees also reveals genuine power-sharing, a move toward real inclusion. The paradox does not invalidate the story; instead, it invites Americans to think pragmatically about common interests in justice and to pursue collaborative possibilities for progress. Cultivating a sense of shared purpose is crucial in our current racial politics, where white supremacist violence endures and divides appear entrenched. Teaching Black history is essential work. And that means telling the truth about racial exclusion while also acknowledging and celebrating moments of cooperation.

    Ansley Quiros is an associate professor of history at the University of North Alabama and author of God With Us: Lived Theology and the Freedom Struggle in Americus, 1942-1976. She and Dr. Matthew Schoenbachler are currently working on a memoir with Dr. Wendell Gunn.

    Made by History takes readers beyond the headlines with articles written and edited by professional historians. Learn more about Made by History at TIME here.

  • How the World Got Hooked on Sugar

    How the World Got Hooked on Sugar

    Cornflakes and yogurt, ketchup and salad dressings, sodas and sports drinks: what do they all have in common? Lots and lots of sugar.

    The sweet stuff is all around us. It wreaks havoc on our bodies and contributes to obesity. And its ubiquity is rooted in a food system that has long reproduced systemic inequality: from slavery to colonialism to the modern food industries that have made sugary food cheap and easily accessible to marginalized communities. Indeed, we might think craving sweetness is innate, but that is far from the whole story.

    For most of human history, crystalline sugar simply did not exist, and people were happy with honey, sweet beans, glutinous rice, barley, or maple syrup. More than 2,000 years ago, however, peasants in Bengal learned how to boil cane juice into a raw dark sweet mass. But that alone didn’t drive sugar consumption. Indeed, just two centuries ago even in the wealthiest countries, people rarely consumed more than a few kilograms a year—while today, in many high- and middle-income countries, people annually consume 30 to 40 kilograms, and in the U.S. more than 45 kilograms. And this figure does not include high-fructose corn syrup, a caloric sweetener widely used by the U.S. beverage industry.

    What happened?

    This explosion of sugar consumption was entwined with imperialism and the rise of modern industrial societies, where sugar became a cheap supplier of calories for urban workers and industrialization enabled the mass production of refined sugar.

    Read More: Why Ultra-Processed Foods Are So Bad for You

    Initially, white crystals of purified sugar were so precious that emperors, rajas, and caliphs ordered them molded into sculptures to decorate their lavish dinner tables. Sugar was also coveted as a medicine. Dissolved in a bit of water, it did wonders for people suffering from intestinal diseases and generally reinvigorated exhausted human bodies.

    Across Asia, long caravans crossed the deserts loaded with sugar and other spices and precious metals. Indeed, Europe was entirely marginal to this history of sugar. That all changed after the 15th century, when sugar gradually became part of urban consumption in Western Europe.

    By 1500, demand in Europe outgrew production in the Mediterranean region, and it was not long before sugar production found another frontier: the Americas. Tragically, this expansion led to the enslavement of millions of Africans. Overall, of the approximately 12.5 million people who were kidnapped in Africa and survived the transport across the Atlantic, nearly two-thirds ended up on sugar plantations. Conditions were horrible on plantations of all kinds throughout the Americas, but those on the sugar plantations were the worst.

    Planting sugar cane in the West Indies, 1833. Universal Images Group/Getty Images

    The consumers of sugar in Philadelphia, London, and Paris became more and more aware of these horrors, informed of frequent slave rebellions by the rapidly growing press. A vocal minority of literate, urban people in Europe and the U.S., especially Quakers, increasingly protested slavery as a mortal sin. A popular pamphlet condemned the consumption of sugar “stained with spots of human blood.” Thanks to dozens of petitions bearing hundreds of thousands of signatures, in 1807 the British Parliament banned the slave trade in territories under its control.

    But sugar production and consumption endured. The German inventor Franz Karl Achard developed an industrial process for extracting sucrose from beet roots instead of sugar cane. Other enthusiastic entrepreneurs advocated opening up trade with India, arguing that sugar could be obtained there in much larger quantities and at a lower price. Yet neither Indian sugar nor beet sugar could make slavery disappear. By the 1860s, half of the sugar consumed by industrial workers in Europe and North America was still produced by enslaved people. It was the world’s most traded commodity.

    Read More: Inside Barbados’ Historic Push for Slavery Reparations

    Government subsidies helped ensure sugar’s overproduction, leading to steadily declining prices, which facilitated consumption. In late 19th-century Europe, farmers switched from wheat to sugar beets; by 1900, beet sugar made up 50% of all internationally traded sugar. As the U.S. gained imperial power over Hawaii, Cuba, Puerto Rico, and the Philippines after 1898, it also built a strong beet sugar industry. The federal government introduced the Sugar Program in 1934, a system that protected American farmers and provided a market for U.S. client states. Throughout the 20th century, the world’s largest beet and cane sugar exporters tried to rein in overproduction and sugar dumping, notably through the Brussels Convention of 1902 and the International Sugar Agreement of 1937. These treaties did not hold, however, and the flooding of the world with cheap sugar continued.

    But what about the consumers? How did they become accustomed to devouring so much sugar, from a spoonful per week around 1800 to almost a kilogram every week for the average American today? In the 19th century, urban workers were often undernourished and lacked energy. According to the medical wisdom of that time, all a proper diet needed was a fair amount of calories, and sugar was the cheapest and fastest way to supply them. The U.S. Army leadership—as well as their counterparts in Europe and Japan—added sugar to the rations of recruits to increase their endurance. From there runs a straight line to the chocolate bars and Coca-Cola that traveled with GIs liberating Europe from the Nazi regime.

    Yet stuffing our food with sugar did not happen without warnings. By the early 19th century, the medical profession already suspected a significant correlation between sugar, obesity, and what’s now known as type 2 diabetes. The first low-carb, sugar-free diet was published by Britain’s William Banting in the 1860s and achieved wide popularity. But his work was almost forgotten in the subsequent decades.

    Of course, people did know that sugar in large quantities could make you fat and sick, but the sugar and beverage industries devoted their marketing efforts to convincing people of the opposite. Sugar corporations, for instance, contributed to research that identified fat, not sugar, as the real danger to our hearts and arteries, while sugary beverages are too often advertised as delights and as part of active, athletic lives.

    Read More: Soda Taxes Are a ‘No-Brainer’ for Public Health, Says the Author of a New Study on Them

    And yet, the history of sugar offers an important reminder for navigating this health crisis today: it shows that there is nothing natural about the amount of sugar we consume now; it is the result of a confluence of political, social, and economic forces. Our overconsumption of sugar is only to a limited extent a matter of individual choice; it is very much the result of how, over the past centuries, much of our food has become an industrial product. Sugar played a central role in that transformation. The next chapter in the history of sugar is up to us, particularly as citizens summoning our governments to protect not only industrial interests but also our public health.

    Ulbe Bosma is Professor of International Comparative Social History at the Vrije Universiteit Amsterdam and the author of The World of Sugar: How the Sweet Stuff Transformed Our Politics, Health, and Environment over 2,000 Years, published by Harvard University Press.

    Made by History takes readers beyond the headlines with articles written and edited by professional historians. Learn more about Made by History at TIME here.

  • Biden’s Border Wall is a Bipartisan Tradition

    Biden’s Border Wall is a Bipartisan Tradition

    In a seemingly stark policy reversal, President Biden announced his administration will build 20 miles of new fences along the U.S.-Mexico border. DHS Secretary Alejandro N. Mayorkas says Biden’s hand has been forced: Congress allocated funds for this fencing in 2019, and those funds cannot be repurposed. Moreover, Mayorkas argues that Biden has been under pressure from both parties to show decisive action at the border. In short, Biden officials claim that even though he may not want to build a wall, he must, or he will face serious political consequences.

    But new fences are not a reversal of the Democratic Party’s agenda. They are part of an extensive history of both Democrats and Republicans selling Americans on the idea that they can stop border-crossings by simply starting a new program or building a big fence. Politicians from both parties have consistently attempted to “close the border,” as if doing so is actually possible, let alone desirable. Biden is not continuing construction on Trump’s border wall; he is continuing to build America’s border wall.

    Read More: In Reversal, Biden Moves to Expand Border Wall

    Construction of the first fences along the U.S.-Mexico border meant to curb immigration from Mexico began in earnest under Democrats Franklin Delano Roosevelt and Harry S. Truman. After building fences for decades to stop animals, the federal government shifted its focus when people began migrating in significant numbers from south to north in the 1940s and 1950s.

    In this transitional moment, both Mexico and the United States embraced the border’s permeability. To fill labor gaps left by World War II, the nations agreed to a guest worker program, known as the Bracero Program. Not everyone qualified to participate, though, so thousands began migrating independently. Growers in the north yearned for affordable labor. Mexicans within and outside of the program provided it. Under pressure to control the flow of people, the Roosevelt Administration began planning fence construction in urban areas to divert traffic to more isolated areas. By the end of the Truman Administration, most border cities were fenced. Even as both nations facilitated Mexican migration, they looked to fences to aid them in filtering who could enter.

    The Bracero Program ended in 1964, and a year later, Democrat Lyndon B. Johnson signed the Immigration and Nationality Act, which for the first time placed a cap on the number of people who could immigrate to the U.S. from Western Hemisphere countries like Mexico. This shift in regulation directed greater attention to the border.

    Read More: What Donald Trump Got Right—and Wrong—About the History of Deportation

    Despite new laws and fences, immigrants kept coming. Lured by U.S. demand, smugglers brought drugs, too. In 1969, Republican Richard Nixon launched Operation Intercept. He tried to close the border for weeks to stop the movement of illicit drugs. The initiative increased security and surveillance—a virtual fence, not a material one—but it failed by its own measure.

    Two years later, First Lady Pat Nixon established Friendship Park along the border near San Diego, where people could celebrate cross-border culture. At the dedication ceremony, Nixon requested that her security detail cut strands of barbed wire there so that she could greet Mexicans across the borderline. “I hope there won’t be a fence here too long,” the Republican famously said. Nixon’s administration never built significant barriers.

    Facing economic distress and American angst over rising labor migration from Mexico, Democrat Jimmy Carter replaced the fence Nixon had cut with a bigger, stronger one in 1979. A year before it went up, its design stirred controversy when the contractor stated it would “sever the toes” of anyone who dared to breach it. After public outcry, Carter’s administration redesigned the fence as plain but tightly woven wire mesh topped with barbed wire. Even if that fence did not sever toes, it did tear through Pat Nixon’s bi-nationally spirited park.

    Republican Ronald Reagan also closed the border for a few weeks in 1985, repeating Operation Intercept. Yet despite acting as if he could close the border at whim, Reagan, like First Lady Nixon, demonstrated hesitation about actual border fences. In a 1980 debate with future President George H.W. Bush, Reagan had said, “Rather than talking about putting up a fence, why don’t we work out some recognition of our mutual problems, make it possible for them to come here legally with a work permit and then while they’re working and earning here they pay taxes here.”

    Reagan later signed the 1986 Immigration Reform and Control Act. The law provided legalization to over two million undocumented immigrants who had been working in the United States, increased the legal culpability for employers who hired undocumented people, and provided funding for more Border Patrol agents. Although Reagan did not build fences, his administration did maintain the ones that existed, and he provided funds to increase border surveillance, as did George H.W. Bush.

    In the 1990s, intense xenophobia and public debate about unauthorized immigration escalated in the United States, prompting both parties to move toward physically securing the border. Democrat Bill Clinton’s policies would not just tear through Pat Nixon’s park; they would effectively destroy it. In 1993 and 1994, Clinton launched three separate border operations: Operation Hold the Line in Texas, Operation Safeguard in Arizona, and Operation Gatekeeper in Southern California.

    Read More: Barriers to a Border Wall

    The fences were part of what Clinton referred to as a “get tough policy at our borders.” He used surplus steel military landing mats, which the Army Corps of Engineers welded together, to build an allegedly impassable wall. In the middle of Friendship Park, the Immigration and Naturalization Service built three parallel fences. Multiple fences, officials argued, would allow agents to catch fence-jumpers in between them. Clinton’s barriers to humans went up alongside NAFTA, which opened the border to material goods, once again making the border more of a sieve than a seal.

    Instead of stopping people from crossing, a more militarized border diverted them to dangerous landscapes, sharply increasing migrant deaths. In the decade following Clinton’s fences, deaths along the border doubled.

    Like his father, George W. Bush began his presidency hoping to build bridges with Mexico. He floated the idea of reviving and expanding a Bracero-style guest worker program to allow Mexicans to work in the United States legally. He made that recommendation consistently, even after the terrorist attacks of 2001. But reacting to those same attacks also led Bush and Congress to tighten border security and ultimately abandon his plan.

    In 2006, Bush signed the Secure Fence Act, authorizing 700 miles of double-layered, reinforced fencing. By the time he left office, he had completed more than 500 miles. Barack Obama continued the work, building 130 more miles of fencing. He also expanded funding for the Border Patrol and deported more people than any president before him.

    Read More: Trump’s Immigration Crackdown Seems Designed to Spread Fear

    Although Donald Trump championed building his wall, his administration only built about 85 miles of new fences. Biden will now add 20 more.

    Additional fencing will do what previous fencing has done: impose severe harm on the environment, on borderland communities and livestock, and most of all on the human beings hoping to cross, who will be diverted into costlier and deadlier routes. Fences have transformed the borderlands into a racialized graveyard, but they have not and will not stop people from migrating when doing so is a matter of survival. In a future where climate crises and political unrest are certain, so too are continued waves of migration.

    Fences cannot “close the border” because borders are never simply open or shut. And the costs of making them impenetrable are grave.

    As it stands, fences are piecemeal and violent. And historically, Republicans have been less inclined to build them than Democrats. There are currently 700 miles of non-contiguous fences along the 1,951-mile border. A Republican built most of those, but we cannot ignore that Democrats have also built and supported their fair share, showing a bipartisan commitment to this symbol of illusory control. Biden has not made an about-face; he is simply continuing a long trend of border-building policies and, like many who came before him, falling into the same familiar pattern.

    Mary E. Mendoza is an assistant professor of history and Latino/a Studies at Penn State University and an environmental historian of the U.S.-Mexico borderlands. Made by History takes readers beyond the headlines with articles written and edited by professional historians. Learn more about Made by History at TIME here.
