The Curious History of Leftovers

Whether you parcel them out for lunch the next day or squirrel them away with the best intentions until they’ve gone bad, leftovers are a mostly unremarkable reality of modern life. But leftovers have a story to tell, and their curious history tells us about changes in technology and in attitudes both toward affluence and dinner.

Until the icebox (aka proto-refrigerator) became standard in many homes at the turn of the 20th century, “leftovers” didn’t exist. Because there was no way to keep food in the form a freshly prepared meal took at the table, preservation of remaining food was as much a part of the culinary process as preparation. Cookbooks would often follow directions for a meal with instruction for pickling, curing, or salting the remains to prolong the life of all ingredients.

These weren’t leftovers as we think of them today, but the basis of another meal or food item entirely. But the ability to reliably keep things cool changed all that, as people could hang onto last night’s dinner without worrying about immediate spoilage. And so the notion of “leftover”—the remains of a meal that could be kept and consumed in a recognizably similar form later—was born, thanks to this technological innovation of the early 20th century.

The most interesting thing about leftovers, however, is not their invention but shifting attitudes toward them. The luxury of an icebox didn’t mean abundance was taken for granted. In fact, in World War I, eating one’s leftovers was positioned as so patriotic that some celebrated killing house pets rather than recklessly waste human food on them (in those days, pets ate scraps from human meals). From the wartime years through the intense poverty of the Depression, resourcefulness with this new category of “leftover” proved one’s virtuous frugality even more strongly. A 1917 U.S. Food Administration poster reminded citizens to “serve just enough/use what is left,” while a Good Housekeeping headline from 1930 admonished, “Leftovers Shouldn’t Be Left Over.”

By the 1960s, when the majority of American homes had electricity and refrigeration technology improved, leftovers potentially had a much longer life. Yet as food prices fell, leftovers lost status; throwing them away became a mark of middle-class affluence, historian Helen Veit notes in her book, Modern Food, Moral Food: Self-Control, Science, and the Rise of Modern American Eating in the Early Twentieth Century. Fast food restaurants and frozen meals were newly affordable, and often more convenient than cooking at home. Consuming these innovations conveyed a modern, casual affluence in a way that packing up last night’s painstakingly prepared pot roast most definitely did not.

Simultaneously, as many middle-class women entered the workforce, feminists questioned the domestic ideal in general and kitchen labor in particular by highlighting the considerable uncompensated housework that limited women from professional endeavors. Understandably, that worldview defined getting creative with last night’s dinner as drudgery. Paradoxically, the convenience of serving leftovers (especially for a working woman increasingly out of the kitchen) also earned the disapproval of conservatives, who perceived it as cutting corners on a homemaker’s primary responsibility.

Eating leftovers, or worse, serving them to a guest, thus made one an object of disdain or ridicule rather than a paragon of civic virtue, as in earlier eras. Etiquette columns throughout the 1960s and early 1970s regularly fielded questions about whether it was even acceptable to ask for a “doggy bag” at restaurants, the uncertainty of letter writers revealing this ambivalence about how to act appropriately around leftovers.

Are leftovers poised for a return to glory? Not only did portion sizes grow by 50 percent from 1977 to 1996, but Veit points out that the recent popularity of foods like curries and stews that taste better after a few days bodes well for the resurgence of the leftover, if for personal sensory pleasure rather than civic purpose.

A Brief History of Leftovers

If you cooked a turkey for Thanksgiving — whether you deep fried, roasted, grilled or smoked it — chances are you served up a bird far too large to be consumed in one sitting. And chances also are you did this intentionally, mashing too many potatoes and baking too many pies along the way. If you simply ate someone else’s grub, you likely went home with too much of everything — enough to throw your own feast. It’s a gluttonous day.

Leftovers have been a part of human eating culture since ancient man realized the fruits of a hunt would stay edible for a while if they were stored in the back of a cold, dark cave. Ancient Greeks and Romans hauled ice and snow down from the mountains, wrapped it in straw or buried it in cellars where it slowed down food spoilage, although “leftovers” back then were more along the lines of fall harvest foods that could be stored and eaten when sustenance was scarce.

But by the end of the 19th century, ice delivery men visited American homes as regularly as milk men, depositing large cubes into ice boxes made of wood and often lined with tin. According to DuPont, which later invented the coolant Freon, ice was harvested where it formed naturally — including from New York City’s rivers — and shipped to the South, all in the name of food storage. In the 1840s, a Florida physician named John Gorrie, trying to cool the rooms where patients were suffering from yellow fever, figured out how to make ice using mechanical refrigeration, paving the way for household refrigerators that appeared in American homes en masse in the 1920s and 1930s. It wasn’t a moment too soon. As families struggled to feed their children during the Great Depression, it was unthinkable to throw away leftovers.

As home cooks reveled in their convenient new food storage box, plastics innovators pounced on an unmet need for containers that would seal in food and keep refrigerators smelling fresh. New Hampshire native Earl S. Tupper launched Tupperware in the 1940s, and by the following decade, he was marketing the containers via Tupperware “parties” where salespeople could demonstrate the distinctive “burp” that guaranteed longer lives for leftovers. (Tupperware was a roaring success; Tupper sold the company for $9 million in 1958.) For Americans who didn’t want to purchase an entire line of pastel plastic containers, Dow Chemical started selling Saran Wrap in 1953, and Ziploc storage bags in 1968.

But perhaps the greatest innovation in the development of leftover culture came in the 1970s, when the first affordable home microwave ovens went on sale. By 1986, a quarter of American homes were outfitted with microwaves able to reheat leftovers in seconds. The appliance is now in more than 90% of U.S. households. Still, if you’re not so keen on beaming molecule-shaking waves into your food, advice abounds on how to fit leftovers into your diet more creatively, with cookbooks on the market like “The Use It Up Cookbook,” “Second Time Around,” and “The Rebirth of Leftovers.”

Of course, no cookbook or microwave oven is required for the most beloved leftover Thanksgiving meal of all time. Recipe: Two slices of white bread, cold turkey, and lots of mayo.

The Curious History of ‘What Did the President Know, and When Did He Know It?’

A half-century ago an ally helped bring down a president with one simple question.

What all these outlets want to know is the same question last asked in another presidential corruption scandal, Watergate, 45 years ago: “What did the president know, and when did he know it?”

The simple inquiry became world famous. But what is less well-known is the story behind the question, and perhaps most surprising of all, that it was asked in order to defend President Richard Nixon.

The person who asked the question was Tennessee GOP Sen. Howard Baker Jr. His party credentials were unassailable. His father was a GOP Congressman and his father-in-law was Senate minority leader for a decade. Baker was the ranking Republican on the special Senate committee that investigated Watergate.

In February 1973, before the hearings began, Baker had a secret Oval Office meeting with Nixon. He told the President the committee’s game plan, which was to start out with minor witnesses in an effort to ratchet up the pressure on major witnesses to appear before the panel. As Nixon later related, Baker suggested having major figures such as White House Chief of Staff H.R. Haldeman and top aide John Ehrlichman testify first, “to deflate the whole thing.”

Through the first half of that year, Nixon trusted Baker and vice versa. The White House and Baker were in frequent contact. Nixon’s staff prepared a strategy memo for Baker suggesting ways he could keep the hearings from becoming a “political circus.” The whole inquiry, they wrote, was a “witch hunt.” (Sound familiar?) In fact, they wrote, the President himself could inform Baker that it was actually the Democrats who had bugged his offices in 1968.

The Senate’s Watergate hearings began in May. North Carolina Democrat Sam Ervin chaired the proceedings on a special Committee whose membership was equally divided between the parties.

At first the testimony backed the spin being pushed from the White House: Watergate was a third-rate burglary attempt by a small group of bad apples. The President was uninvolved.

But on June 25, former White House Counsel John Dean, who had been fired by Nixon in the spring, started testifying. Reading a 245-page statement to the Committee over the course of two days, he systematically linked Nixon to a pattern of corruption and obstruction of justice. Then the questioning began.

When it was Baker’s turn to interrogate Dean, his goal was to prove that the accusations against the President were based on circumstantial evidence. Baker carefully questioned the former White House lawyer, attempting to prove that he had no direct evidence of the President’s role in the break-in or in any cover-up. Dean held his own, ultimately testifying that he and Nixon discussed the cover-up 35 times.

Baker was subdued after Dean’s testimony. The Committee took a two-week break. Shortly after it resumed work, it learned of the President’s secret Oval Office tapes. Now, it wasn’t just Dean’s word against the President’s. Not only that, but the level of Baker’s involvement with the White House could be exposed.

Baker’s opinion of the President and the scandal was changing. In 1992, he explained to the Associated Press, “I believed that it was a political ploy of the Democrats, that it would come to nothing…But a few weeks into that, it began to dawn on me that there was more to it than I thought, and more to it than I liked.”

The Committee’s discovery of the tapes marked the turning point in Watergate and set in motion the events that would lead to Nixon’s resignation. On July 23, 1973, less than a month after Baker asked his famous question, the Committee voted to subpoena the tapes. Baker and all the Republicans on the committee voted to issue it. It was the first time a congressional committee had ever issued a subpoena to a President, and only the second time since 1807 that anyone had subpoenaed the chief executive.

Meanwhile, Special Prosecutor Archibald Cox also subpoenaed the tapes. Nixon spurned both demands. Baker warned that the nation was "on the brink of a constitutional confrontation between the Congress and the White House."

The legal maneuvering between Nixon, the Senate Committee, and Cox continued into the fall of 1973. Finally, on October 19, Nixon offered a compromise. He would let the famously hard of hearing Senator John C. Stennis (D-Miss.) listen to the tapes and summarize them. Cox rejected the offer, and the next morning, a Saturday, Nixon ordered him fired.

All through 1973 and into 1974, Baker slowly moved out of the President’s camp. He shifted from being a Nixon apologist to acting as a mediator between the Senate and the President. Ultimately, he let the chips fall where they might. Baker’s eyes started opening in the days after he posed his famous question.

Forty-five years later, is there a Senator or Representative poised to inherit Baker’s mantle? It’s not impossible to imagine.

The views expressed are the author's own and not necessarily those of the Brennan Center for Justice.

A significant portion of the narrative in this piece is derived from The Wars of Watergate (1990) by Stanley I. Kutler.

Image: WASHINGTON, DC -- CIRCA 1973: Members of the US Senate Watergate Committee during the Committee's hearings in Capitol Hill, circa 1973 in Washington, DC. In attendance, among others, are minority counsel (later senator and presidential candidate) Fred Thompson (left) and Ranking Member Sen. Howard H. Baker, Jr. (right).

A few steps ahead of the Nazis

In 1940, the Reys returned to Paris for work. This time, they escaped only a few days before the Nazis marched into the capital city. They fled Paris on bicycles, carrying only a few possessions — including, of course, their portfolio of Curious George stories and drawings. They biked 75 miles, then managed to get a train to Spain. They were stopped and searched by police once again; their stories and drawings of Curious George charmed the authorities into letting them pass to safety.

"It's this narrow escape, saving the day, and those are ideas that are very much embedded in their books. It felt like that year, they were living it through their art, as well as in their life," Nahson said.

They made their way to Lisbon, then on a ship to Brazil, then eventually to the United States in 1940; this year marks the 80th anniversary of their journey to safety.

Had the Reys not escaped Europe, they would have faced a high likelihood of death. Two-thirds of the Jewish people in Europe were killed during World War II; a total of 6 million Jews died in the Holocaust. In their home country of Germany, according to the U.S. Holocaust Memorial Museum, there were 525,000 Jews in 1933, and only 37,000 in 1950.

The Reys started a new life in America, settling first in New York City and then moving to Cambridge, Massachusetts. They wrote more than 30 books together, including seven starring Curious George. Their books have been translated into more than a dozen languages, including Yiddish and Hebrew.


Rebecca Dube is the Head of TODAY Parents, Digital, and a mom of two boys.

The curious history of the rise and fall of twin beds

Twin beds—the end of an era in a marriage or a hygienic 'mod-con'?

For the best part of a century, twin beds were not only seen as acceptable but were actually championed as the sign of a modern and forward-thinking couple.

But what lay behind this innovation? And why did so many married couples ultimately abandon the twin bed?

Lancaster University academic Professor Hilary Hinds offers a fascinating insight into the combination of beliefs and practices that made twin beds an ideal sleeping solution.

"A Cultural History of Twin Beds," funded by the Wellcome Trust, challenges ingrained assumptions about intimacy, sexuality, domesticity and hygiene by tracing the rise and fall of twin beds as a popular sleeping arrangement for married couples between 1870 and 1970.

Professor Hinds, who heads up the English Literature and Creative Writing Department at Lancaster University, studied everything from marriage guidance and medical advice books to furnishing catalogues, novels, films (including the all-time great "Brief Encounter") and newspapers to glean the information.

Her key findings reveal that twin beds:

  • Were initially adopted as a health precaution in the late nineteenth century to stop couples passing on germs through exhaled breath.
  • Were seen, by the 1920s, as a desirable, modern and fashionable choice, particularly among the middle classes.
  • Featured as integral elements of the architectural and design visions of avant-garde Modernists such as Le Corbusier, Peter Behrens and Wells Coates.
  • Were (in the early decades of the 20th century) indicative of forward-thinking married couples, balancing nocturnal 'togetherness' with a continuing commitment to separateness and autonomy.
  • Never entirely replaced double beds in the households of middle-class couples but, by the 1930s and 1940s, were sufficiently commonplace to be unremarkable.
  • Enjoyed a century-long moment of prominence in British society and, as such, are invaluable indicators of social customs and cultural values relating to health, modernity and marriage.

The backlash against twin beds as indicative of a distant or failing marriage partnership intensified in the 1950s and by the late 1960s few married couples saw them as a desirable choice for the bedroom.

The trigger for the research came while Professor Hinds was researching interwar fiction written by women, when she chanced upon a reference to twin beds.

"I thought I knew what twin beds signified until I came across a comment by the protagonist in one of the novels. She looks across at her sleeping husband, on the far side of their double bed, and thinks 'modern twin beds' would be so much more comfortable and hygienic.

"I was fascinated by the perception that twin beds were seen as 'modern.' I wanted to know what identified them as fashionable items.

"This then reminded me of a curious clipping in my great-grandmother's scrap book (covering the 1880s to the 1890s) which discussed 'the dangers of bed sharing' and indicated that a weaker person sharing a bed with a stronger one would 'leach the life force' from the stronger person."

The Curious History of ‘Chain Migration’

Early in negotiations to revamp immigration laws in 1964, President Lyndon B. Johnson drew opposition from some in Congress who feared changes in which countries’ immigrants would predominate.

For more than 50 years, American immigration law has favored candidates who already have relatives in the country. A result of that policy is what President Donald Trump calls “the horrible chain migration,” because of the way it allows lawful immigrants to invite their spouses, parents and even adult siblings to join them in the United States.

Mr. Trump says that he wants instead an immigration system that would award visas to people on the basis of their skills and education, as Canada does. As it happens, the U.S. Congress was on the verge of enacting just such an immigration policy in 1965. But it didn’t come to pass—because of some legislators’ concerns that it would open the doors too widely to immigrants of color. In a classic case of unintended consequences, Congress chose a different approach and got precisely the outcome it thought it could avoid: a flood of immigrants from the developing world.

Before 1965, immigrant visas were allocated primarily on the basis of national origin, with tens of thousands set aside for people from northern and western Europe. Countries in Asia, Africa and the Middle East were allocated as few as 100 visas each per year. The discriminatory policy was justified, in the words of Democratic Senator John McClellan of Arkansas, on the grounds that it rewarded “those countries that contributed most to the formation of this nation.”

In 1964, President Johnson proposed replacing the national origin quota system with a merit-based system. “A nation that was built by the immigrants of all lands can ask those who now seek admission, ‘What can you do for our country?’” he said. “But we should not be asking, ‘In what country were you born?’”

Johnson’s proposed reform went nowhere that year, however, largely because of opposition from conservative southern Democrats, some Republican allies and groups such as the American Legion. Progress on immigration reform came only after the election in late 1964 of a new Congress with a liberal majority. Even then, it was halting.


You Must Forward This Story to Five Friends

On the occasion of the release of the horror film Chain Letter, in which “a maniac murders teens when they refuse to forward chain mail,” Slate asked Paul Collins to plot out the real history of the form.

Chain letters have a long, sordid history

“This prayer has been sent to you for good luck. The original copy came from the Netherlands. It has been around the world nine times. The luck has been sent to you. You are to receive good luck within nine days of receiving this letter. It is no joke. You will receive it in the mail. Send 20 copies of this letter to people you think need good luck. … Zorin Barrachilli received the chain. Not believing it, he threw it away. Nine days later he died. For no reason whatsoever should this chain be broken.”

Unlike the unfortunate Zorin Barrachilli, the chain letter lives on. If that 1974 sample from an online archive of chain letters sounds familiar, it’s probably thanks to generations of e-mail and photocopying. But the real origin of the letter wasn’t the Netherlands: Like any truly great crooked scheme, it began in Chicago.

It was there in 1888 that one of the earliest known chain letters came from a Methodist academy for women missionaries. Up to its eyes in debt, that summer the Chicago Training School hit upon the notion of the “peripatetic contribution box”—a missive which, in one founder’s words, suggested that “each one receiving the letter would send us a dime and make three copies of the letter asking three friends to do the same thing.”

The chain letter had been born.

The “peripatetic contribution box” was seized upon in Britain as a weapon against, of all people, Jack the Ripper. That November, the Bishop of Bedford oversaw a “snowball collection” to fund the Home for Destitute Women in Whitechapel, where crimes against prostitutes were raising an outcry for charitable relief. The Bishop’s snowball worked: Indeed, it worked diabolically well. It snowballed, so that along with 16,000 correctly addressed letters a week burying the hapless originator, garbled variants of the return address also piled upon the Bishop of Bangor—as well as Bradford and Brighton.

During the 1890s, chain-letter fundraising proliferated for everything from a bike path in Michigan to a consumptive railroad telegrapher. By July 1898, the New York World was preprinting chain letter forms to fundraise for a memorial for Spanish-American War soldiers. (“Do not break the chain which will result in honoring the memory of the men who sacrificed their lives,” it chided.) Upon seeing what the World’s proprietor had wrought, his rivals at the New York Sun were blunt in their assessment: “Pulitzer is insane.”

They had good reason to scoff. Earlier that year, a 17-year-old Red Cross volunteer in Long Island, Natalie Schenck, had contrived a chain to provide ice for troops in Cuba, causing 3,500 letters at a time to pour into the tiny post office of Babylon, N.Y. “We did not consider what patriotic Americans are capable of,” the girl’s mother fretted to the press.

Chains had taken on a life of their own: Along with loopy “bad luck” and “good luck” imprecations to recipients, conmen used them to raise money for, among other things, a fictitious charity case in Las Vegas. But the best cons, as always, played on greed: Schemes like the “Self Help Mutual Advance Society” of London combined the exponential growth of chain letters with a pyramid-scheme payment structure. Recipients were now told to mail dimes to previous senders while adding their name to a list that, enough links later, would bring the coins of subsequent generations showering down on them. One American chain-grifter was promptly immortalized in 1896 with the Chicago Tribune’s sardonic headline: PLANS TO BECOME A TRILLIONAIRE.

This, of course, was the scheme’s appeal: The exponential structure of a four-copy chain (four letters, then 16, then 64) meant that 20 rounds would generate 1,099,511,627,776 recipients. Or it would, at least, were it not for the inconvenient limitation of the earth’s population. Even a perfectly executed chain inevitably left its final “round” out of a great deal of money, and only the first few generations of participants were greatly enriched. By 1899, the U.S. Postal Service had seen enough: It declared “dime letter” chains a violation of lottery laws and cracked down.
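The arithmetic behind that collapse is easy to check. A minimal sketch in Python (the function name here is just for illustration, not from the source):

```python
# Exponential growth of a chain letter: each recipient is asked
# to mail a fixed number of fresh copies, so round n needs
# copies_per_letter ** n new participants.
def final_round_size(copies_per_letter: int, rounds: int) -> int:
    """People needed in the final round alone of a chain letter."""
    return copies_per_letter ** rounds

# A four-copy chain: 4 letters, then 16, then 64, ...
# After 20 rounds the last round alone would need over a trillion
# people -- far beyond the earth's population, so the chain must
# break long before, leaving the final participants out of pocket.
print(final_round_size(4, 20))  # 1099511627776
```

The same calculation shows why only the earliest generations could profit: the pool of fresh recipients is exhausted within a couple of dozen rounds no matter how faithfully the letters are copied.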

Chain letters have never gone away, of course: They made a comeback in World War I, when they were used by pro-German Americans during the neutral era “to send a substantial sum to Field Marshal Hindenburg”; by 1917, they were fingered by the New York Times as “a German plot … to clog the United States mails.” A Jewish anti-Nazi chain letter circulated in 1933, and the invention of photocopiers and then e-mail has ensured a reliable afterlife for endlessly copied chains that threaten woe upon all who chuck them in the trash. The most cunning variant was the “Circle of Gold” scheme, which first propagated through parties in Marin County, Calif., in 1978; it skirted Postal Service enforcement by insisting that participants hand-deliver their letters.

But no chain-letter craze has ever quite topped the spring madness of 1935. Gutted by the Great Depression, Americans turned back to the allure of the “dime letter.” After letters for a “Prosperity Club” came blossoming out of Denver, the city of Springfield, Mo., was seized with a mania for the idea: Chain letter “stores” sprang up in vacant storefronts selling official-looking “certificate” shares in high-ranked names on chain letters. “Beauty shops,” reported the AP, “sold the letters to their customers while administering facials and permanents.” Emboldened by hazy laws governing their business, chain-letter brokerages appeared in a matter of days from Portland, Ore., to Buffalo, N.Y.; at its mad height, one chain-letter shop in Toledo, Ohio, boasted 125 employees.

The chains became such a cultural phenomenon that Paramount announced plans for Chain Letter, a movie to star Fred MacMurray. Spoofs appeared in the mail, like the “Send-a-Packard” letter (“Think how nice it would be,” it rhapsodized, “to have 15,625 automobiles”). Other letters promised fantastic exponential results in procuring dames, whiskey, and elephants. A few residents of Springfield even attempted a “drunk chain”—doubling their crowd in size with each round of highballs at a new tavern, while “the originators were hazily trying to figure out how long it would take to get the whole city drunk.” Alas, they passed out before completing their calculations.

Soon the entire country had a hangover: The chain-letter market crashed after a few weeks, chain-letter brokers fled town with tens of thousands of dollars, and a $26.9 million suit was filed against Western Union for allowing the first electronic chains via telegraph. As dazed customers woke up to discover their investments were worthless, the U.S. Postal Service was left in July 1935 with “between 2,000,000 and 3,000,000 letters in the dead letter offices.”

It all has a curious ring of familiarity, which makes the earliest chain-letter fiascos all the more instructive. After all, what happened to Natalie Schenck, the teenager who nearly capsized her Long Island town with chain-letters for the Spanish-American War troops? The one who shook down a cascade of money, embarrassed a respected institution, and left government agencies tied up in knots?


We all know the song, and we all know the lyrics. There’s the thing about your left arm, it goes in, it comes out, you repeat, there’s a bit of business with a funny name and a turn around, and that’s basically what it is all about.

However, the history of the song itself is anything but simple. A hit on both sides of the Atlantic, for two different songwriting teams, “The Hokey Cokey” was born out of a spirit of wartime cooperation and wound up with multiple claims of authorship and a web of transatlantic disagreement over what it should even be called.

So, let’s start at the beginning. There’s a traditional children’s song called “I Put My Little Hands In” (based on an old English/Scots folk dance called “Hinkim-Booby”), the lyrics to which go like this:

“I put my both hands in,
I put my both hands out,
I give my both hands a shake shake shake
And turn myself about.” (repeat for other body parts)

So the idea of having a song which contains actions and requires the putting in and taking out of bits of your body was not a brand new one, even during the Second World War, when there was a vogue for communal dances with actions, like “Underneath The Spreading Chestnut Tree.”

Enter British band-leader Al Tabor. He claimed to have been approached by a Canadian officer, who suggested he write a knees-up sort of song with actions. Possibly half-remembering “I Put My Little Hands In”, he came up with a similar line of instructions and added the term hokey pokey after remembering an ice-cream seller yelling “hokey pokey penny a lump. Have a lick make you jump.”

Hokey pokey was a generic name for a serving of ice cream, at the time, which you’d get from a hokey pokey man. It was also the name of the waxed paper on which the servings of ice-cream were sold, in the days before cones. Even now, hokey pokey lives on as an ice-cream flavour: vanilla with honeycomb pieces.

Having finished his masterpiece, Al presented it to the Canadian officer, who suggested he change the name to “The Cokey Cokey” because cokey was Canadian miner slang for crazy. And so in 1942, the first sheet music for this new dance was published.

However, it was published by the great Jimmy Kennedy, the songwriter behind “The Teddy Bears’ Picnic” and “Istanbul (Not Constantinople),” among other astonishing hits. And Jimmy’s son always claimed it was his dad who had had the experience of talking to Canadian soldiers about the whole cokey thing. So it’s not really clear who has the rightful claim to have written it. Al and Jimmy fell out over royalty payments, eventually resulting in legal action, and Al settled out of court, finally renouncing all claim on the song.

Somewhere down the line, during all the fuss, the song became “The Okey Cokey” and then “The Hokey Cokey” and that’s the way it has stayed, over in the UK at least.

Meanwhile, in America, Robert Degen and Joseph P. Brier, two Pennsylvania club musicians, had copyrighted a song called “The Hokey Pokey Dance” in 1944. Then a group called The Ram Trio recorded “The Hokey Pokey,” claiming to have come up with it while entertaining the apres-ski crowd in Idaho’s Sun Valley. Their song was recorded by Ray Anthony’s big band in 1953 and became an enormous worldwide hit. You can imagine the legal kerfuffle which ensued.

Curiously, the arguments between Al and Jimmy, and Robert/Joseph and the Ram Trio, appear to have existed entirely in parallel, with no lawsuits crossing the Atlantic. However, due to the enormous success of Ray Anthony’s hit, the song remains “The Hokey Pokey” (occasionally “The Hokey Tokey”) in America, Canada, New Zealand, Ireland, and Australia. And of course now no one is entirely sure who wrote it, and when.

To make matters worse, in 2008, Anglican Canon Matthew Damon, of Wakefield Cathedral in West Yorkshire, claimed the movements in the song came from a mockery of a Catholic mass, and that the name derives from hocus pocus. So it could have been used as a taunting, anti-Catholic song. A theory that was taken seriously enough to provoke calls from Catholic clergy to have the song banned.

The theory has been widely discredited, however, not least because of the song’s popularity among Catholic families. Still, it’s a matter of no small irony that a song which was written to bring people together, and which has brought nothing but pleasure to children of all ages the world over, should leave behind it such an unpleasant trail of bitterness and acrimony.


Researchers figure out the function of mysterious heart structures first described by da Vinci.

The heart and the trabeculae.

Scientists have worked out the purpose of mysterious structures in the human heart, first described by Leonardo da Vinci 500 years ago. The mesh of muscle fibers called trabeculae lines the inner surface of the heart and was shown to affect how well the heart functions.

The mesh, exhibiting distinctive fractal patterns that resemble snowflakes, was first sketched by Leonardo da Vinci in the 16th century. Early in human development, hearts form trabeculae, which create geometric patterns on the inner surface. While their purpose at this stage appears to be aiding oxygenation of the growing heart, what they do in adults had not previously been understood. Da Vinci thought the structure warmed the blood passing through the heart.

To really understand what these networks do, an international research team used artificial intelligence to go through data from 25,000 MRI scans of the heart. They also looked at the related data pertaining to heart morphology and genetics.

The scientists observed that the rough surface of the heart ventricles improves the efficiency of blood flow during a heartbeat, much as dimples on a golf ball lower air resistance, the team’s press release explains. They also discovered six regions in human DNA that determine exactly how the fractal patterns in the muscle fibers form.

The team working on the project included Ewan Birney from the European Molecular Biology Laboratory’s European Bioinformatics Institute.

"Our findings answer very old questions in basic human biology," explained Birney. "As large-scale genetic analyses and artificial intelligence progress, we're rebooting our understanding of physiology to an unprecedented scale."

Another important insight is that the shape of the trabeculae influences the heart’s performance. Analysis of data from 50,000 patients established that different fractal patterns can influence the risk of heart failure. Interestingly, the study showed that people with more trabeculae branches seem to be at lower risk of heart failure.


Declan O'Regan, Clinical Scientist and Consultant Radiologist at the MRC London Institute of Medical Sciences, said that while their work builds on quite old observations, it can be highly relevant to people today.

"Leonardo da Vinci sketched these intricate muscles inside the heart 500 years ago, and it's only now that we're beginning to understand how important they are to human health," said O'Regan. "This work offers an exciting new direction for research into heart failure, which affects the lives of nearly 1 million people in the UK."

Other participating scientists came from Heidelberg University, Cold Spring Harbor Laboratory, and the Politecnico di Milano.

Check out their study published in the journal Nature.

Renewed interest

While few gay men today actively use Polari, in recent years it has gained a kind of latent respectability as an historic language – similar to the way Latin is seen by the Catholic faith. From a political standpoint, Polari is now recognised as historically important, an example of the perseverance of a reviled group of people who risked arrest and attack just for being true to who they were.

In 2012 a group of Manchester-based artists used Polari to highlight the lack of LGBT inclusivity in education. They created an exam in LGBT studies, getting volunteers to sit it under strict exam conditions. The language portion of the exam was about Polari.

Another group of activists called the Sisters of Perpetual Indulgence created a Polari Bible, running a Polari wordlist through a computer program on an English version of the Bible. The Bible was bound in leather and displayed in a glass case at the John Rylands Library in Manchester. This was not to mock religion but to highlight how religious practices are filtered through different cultures and societies, and that despite not always being treated well by mainstream religions, there should still be space for gay people to engage with religion.

In 2012, I participated in a group effort to carry out the longest ever Polari Bible reading which took place in a Manchester Art gallery. In a nice touch of high camp we had to wear white gloves while touching the Bible, to ensure the oils from our fingers didn’t ruin the paper. We took turns reading lines such as: “And the rib, which the Duchess Gloria had lelled from homie, made she a palone, and brought her unto the homie.” Translation: “And the rib which God had taken from man was made into a woman and brought to the man.”

The Polari Evensong at Cambridge, carried out by trainee priests, however, took place in a more official context and provoked a range of conflicting opinions. Some people think it is hilarious, some are concerned about Church of England rules being broken and disrespect for religious tradition, while others think that God should be prayed to in any language and that the Evensong was perfectly valid. As someone who has spent 20 years documenting the rise and fall of Polari, I find it fascinating that even now, it is finding new ways to cause controversy. Never has a dead language had such an interesting afterlife.
