Sociologists, economists, therapists, and every other sort of -ists have studied The Principle of Least Interest, but it’s incredibly important for writers as well. This is one of those areas in which science has confirmed what common sense has long maintained: the person who cares the least has the most power. This principle works everywhere from the housing market to the marriage market. (I wrote about this topic previously in 2015.)
If the buyer is more eager to buy than the seller is to sell, the seller will determine the selling price. If he loves her more than she loves him, he could end up the proverbial hen-pecked husband of so many comedies; vice versa and she is a candidate for the downtrodden foot-wipe—perhaps abused—wife of so many tragedies.
This principle is so well understood that sometimes people try to disguise their true levels of caring/interest (talk of other great offers forthcoming, flirting with or dating a rival). Inherent in disguise is the understanding that what counts is often the perception of least interest.
The First Take-Away for Writers:
For your characters, know who has the power (the least interest) and who is perceived to have it. And if your work has more than two characters, you need to understand the power relationships for each pair.
Unlike a credit score, people can’t go online and check out their power ratings. The primary reason that power relationships are often unclear is that the bases of power are virtually limitless: expertise, physical attractiveness, intelligence, wealth, athletic ability, knowledge of secrets, ability to make the other’s life miserable, being popular, a great sense of humor—anything and everything that is important to that pair. Knowing the facts doesn’t tell you/the reader who has the power.
If she married him for the money and he married her for the Green Card, who cares more? What if we add in that she is beautiful and he’s a great problem-solver; she’s moody and he’s uncommunicative; he’s a natural athlete and she manages their money; they’re both extremely intelligent and care mightily for their two children. As the author, you can determine who has the power by giving weight to these factors based on the characters’ perceptions of what is important.
The Second Take-Away for Writers:
Power is seldom one-dimensional, and if you don’t recognize the complexity, your characters will be flat and unrealistic.
In many relationships—for example, boss/employee, parent/child, older sibling/younger sibling, teacher/student—the general expectation would be that the total power package would favor the former. But my guess is that most readers don’t read to confirm the norm; they like to be surprised.
The Third Take-Away for Writers:
You should at least consider writing against common power expectations.
And just to end on a high-brow note: according to Lord Acton, “Power tends to corrupt, and absolute power corrupts absolutely.” Consider how less-than-absolute power might corrupt your character(s).
Bottom Line: Know who has the power and who is perceived to have it. Power is seldom one-dimensional, and if you don’t recognize the complexity, your characters will be flat and unrealistic. And at least consider writing against common power expectations.
The men’s beauty and makeup market, already a billion-dollar industry, is expected to grow to nearly $20 billion by 2027.
A recent Ipsos survey found that among heterosexual men ages 18-65, 15% reported currently using male cosmetics and makeup, and another 17% said they would consider doing so in the future.
Who were the nay-sayers? 73% of men 51 and over, compared to 37% of men 18-34.
Makeup on Ancient Male Faces
Some might wonder “What’s the world coming to?” A more accurate question might be, “What’s the world getting back to?” An article—with pictures—at humanistbeauty.com makes the following five points about men’s early use of cosmetics.
Men were wearing makeup as long ago as 3000 BCE in China and Japan. Men used natural ingredients to make nail polish, face powder, rouge, and eyeliner, all signs of status and wealth. Archaeologists found a “portable” makeup box with a bronze mirror, large and small wooden combs, a scraper, and a powder box. In the Han Dynasty, civil servants known as Lang Shi Zhong wore elaborate makeup and hairstyles when they appeared in court. Male attendants of Emperor Hui (210-188 BCE) of the Han Dynasty were forbidden “to go on duty without putting on powder.”
In ancient Egypt, men rimmed their eyes in black “cat” eye patterns as a sign of wealth (it also helps to reduce sun glare — as modern baseball and football players have found). They also wore pigments on their cheeks and lip stains made from red ochre. Makeup was an important way of showcasing masculinity and social rank.
In ancient Korea, the Silla people believed that beautiful souls inhabited beautiful bodies, so they embraced makeup and jewelry for both genders. Hwarang, an elite warrior group of male youth, wore makeup, jade rings, bracelets, necklaces, and other accessories. They used face powder and rouge on their cheeks and lips.
Skipping to Elizabethan England: the goal was for skin to look flawless. Men powdered their faces to whiten the skin as a sign of wealth, intelligence, and power. Fashionable courtiers dyed, curled, starched, and waxed their beards and moustaches into elaborate arrangements. To achieve the desired effect, men spent hours painting their faces, necks, hands, and hair into fantastic configurations that lasted for days before being removed. However, cosmetics during the Elizabethan age were dangerous due to the lead, mercury, arsenic, and alum in the majority of products. These cosmetics could lead to blindness, seizures, hair loss, sterility, and premature death.
Makeup on More Recent Male Faces
Men’s love affair with makeup (for specific purposes, traditions, and enjoyment) died a slow death in the 19th century, when Queen Victoria associated makeup with the devil and declared it a horrible invention.
I read somewhere or other that George Washington issued a pound of flour with each soldier’s rations for use on his wig or hair. Though few soldiers wore full wigs, many attached fake plaits to their own hair or the backs of their hats. During the Revolutionary War, American wig and hair fashions were much less elaborate than those of British aristocrats, like the simpler fashions for ladies’ dresses on this side of the Atlantic. (Washington himself curled and powdered his own hair rather than wearing a wig; he was a natural redhead!)
After the American Revolutionary War, the use of visible “paint” (color for lips, skin, eyes, and nails) gradually became socially unacceptable for both sexes in the U.S. Painting one’s face was considered vulgar and was associated with prostitution and actresses/actors. But did people stop using them?
Of course not! True, few cosmetics were manufactured in America during most of the nineteenth century. However, folks (mostly women) went DIY, using recipes that circulated among friends, family, and sometimes printed in women’s magazines and cookbooks.
Lip Salve: Take 1 ounce of white wax and ox marrow, 3 ounces of white pomatum, and melt all in a bath heat; add a drachm of alkanet, and stir it till it acquire a reddish colour.
To Blacken the Eye-lashes and Eye-brows: The simplest preparations for this purpose are the juice of elder-berries; burnt cork, or cloves burnt at the candle. Some employ the black of frankincense, resin, and mastic; this black, it is said, will not come off with perspiration.
Pearl Powders, for the Complexion: 1. Take pearl or bismuth white, and French chalk, equal parts. Reduce them to a fine powder, and sift through lawn. 2. Take 1 pound white bismuth, 1 ounce starch powder, and 1 ounce orris powder; mix and sift them through lawn. Add a drop of attar of roses or neroli.
Of course, the simplest way to “lighten” the complexion was with starch, applied with a hare’s foot or soft brush. Pale skin indicated social class and wealth; brown skin signaled outdoor labor.
Thus lotions, powders, and skin washes—to lighten complexions and diminish the visibility of blemishes or freckles—remained in use.
Druggists sold ingredients for these recipes, and sometimes ready-made products. Given the association of “paint” with prostitution (and actors), products needed to appear “natural.” Some secretly stained their lips and cheeks with pigments from petals or berries, or used ashes to darken eyebrows and eyelashes.
Technological advances in photography, interior lighting, and reflective surfaces led to a rise in “visual self-awareness” throughout the 19th and 20th centuries. This, coupled with a rise in widespread advertising through print media, created a wider market for commercially produced cosmetics.
In the late 1960s, norms again celebrated ideals of natural beauty—as in the Victorian era—including a rejection of make-up altogether by some. Cosmetics companies returned to touting products for a “natural” look.
Makeup on Performing Male Faces
Makeup for actors never went out of fashion, so it’s no surprise that the recent increase in makeup use for men has been led by entertainers. Performers used cosmetics as part of costumes or to ensure their facial features remained visible on stage or on screen. Stylized makeup designs correspond to specific roles in classical forms of Japanese, Thai, Indian, and Chinese theater traditions.
The popularity of the Ballets Russes in Paris at the beginning of the 20th century led to an increase in the social acceptability of wearing makeup. When the company went on tour, there was a corresponding boom in cosmetic sales and advertising in the countries where it performed.
Waves of glam rock, heavy metal, goth, and punk musicians in the 1970s and 1980s inspired legions of fans to don makeup to perform and to disrupt social norms. Just think of KISS, Mötley Crüe, Marilyn Manson, King Diamond, Boy George, or Alice Cooper.
The elaborate makeup and costumes of Glam Rock stars such as Boy George and David Bowie challenged gender expectations.
Heavy Metal performers such as Alice Cooper and Marilyn Manson are as recognizable for their stage makeup as for their stage costumes and music styles.
Makeup on Modern Male Faces
Men are now open to using a variety of products, including facial cleansers, exfoliants, serums, moisturizers, and most recently, cosmetics.
For centuries, gender binaries established during the 17th and 18th centuries influenced who typically wore makeup–women! But make-up for men (and those who identify as male) may be here to stay—and goes way beyond entertainers and political statements.
Young Yuh, who has 1.6 million followers on TikTok and posts skin care and makeup tutorials full-time, says makeup is key to his self-expression. His view is that it’s like hygiene, or hairstyle, or any number of other personal choices and should not be bound by gender identification. His daily routine includes cleanser, toner, some type of serum, moisturizer, sunscreen, primer, concealer, contour, blush and eyeliner—no doubt a bit much for many!
The hashtag #meninmakeup has more than 250 million views on TikTok. And The New York Times Style Magazine article “Makeup Is For Everyone” gives a great overview of the most recent developments and resources online.
In 2017, Maybelline launched their Colossal mascara campaign featuring Manny Gutierrez with the tagline “Lash Like a Boss.” Patrick Starr, a Filipino-American makeup artist and fashion designer, collaborated with MAC Cosmetics to launch a collection of his own design. In 2016, Gabriel Zamora became the first male makeup artist to join Ipsy. Advertisements both reflect the current culture and feed it.
Bottom line: Men are now open to using a variety of products, including facial cleansers, exfoliants, serums, moisturizers, and most recently, cosmetics.
(Writers note: depending on your audience, you might want your guys’ grooming to include more than a shave and a hair cut.)
English is pretty anemic when it comes to scent. We have to attach descriptors like “putrid” or “mown grass.” The Jahai people of Malaysia, on the other hand, have words for specific smells, with meanings like “to have a stinging smell, to smell of human urine” and “to have a bloody smell that attracts tigers.”
Researchers at Rockefeller University estimated that humans can detect at least a trillion distinct smells. That leads me to conclude that need determines what we do with specific ones. This conclusion is supported by other evidence: in tribes that have recently switched from hunting and gathering to farming, smell words often vanish.
Abigail Tucker explores the sense of smell in the Smithsonian Magazine article “Scents and Sensibility.” (Full reference below). Do read it! To whet your appetite, here are some bits that I found particularly interesting.
Scents and the Body
Females are more sensitive to smells than males.
Research indicates that infants are habituated via mothers’ milk to react more positively to the smell of things the mothers eat.
The human nose comes in 14 basic shapes and sizes.
(FYI, not in the article: noses and ears do not continue to grow during adulthood. They do change shape, however, due to skin changes and gravity.)
Many “tastes” are actually smell; chocolate, for example.
(I remember a classic psychology experiment that demonstrated that, without olfactory or visual clues, people couldn’t tell bits of apple from bits of onion.)
The exposed nature of scent receptors in the nose makes them especially vulnerable to environmental toxins.
Apparently olfactory receptors can become fatigued.
The sense of smell declines with age, especially in those over 50. By age 80, 75% of people exhibit what could be classified as a smell disorder. (Oh, sigh.)
Pretty much everyone has “blind spots” when it comes to smell. For example, not everyone can smell asparagus in their pee—but if you can, you can smell it in anyone’s urine.
Also not covered in this article: virtually everyone can become noseblind when exposed to the same smell for a prolonged period of time. Consider entering a room and noticing an odor at first but not later.
A human without visual or auditory cues can track a scent through the grass of a public park—but not as well as a dog can.
Some psychological conditions affect the sense of smell. For example, research has linked autism to an enhanced sense of smell. On the other hand, depression and Parkinson’s disease are related to decreased sensitivity.
Culture affects what we smell and how we react to specific smells.
Besides genetic and cultural factors, certain smells evoke a visceral reaction specific to the individual, depending on life history. Research participants are able to access more emotional memories when exposed to a smell as opposed to a picture of the source of the smell.
Andreas Keller, a prominent neuroscientist specializing in olfaction, has opened a gallery, Olfactory Art, where smell is central to the experience!
How does the nose know? We still don’t know! “Olfaction has always been an underdog sense. It’s both primitive and complex, which makes it hard to study and harder still to transfer to our increasingly digital existence. … smells cannot at this point be recorded or emailed or Instagrammed.”
In a 2011 survey, more than half of the young adults said they’d rather give up their sense of smell than their cell phones. Little did they know what that sacrifice would entail.
BOTTOM LINE: COVID’s notorious effects on the sense of smell have triggered a new appreciation of the role of scents in our lives, for both pleasure and safety.
“Scents and Sensibility” by Abigail Tucker, Smithsonian Magazine, October 2022, pp 66-80.
It’s human nature to have an energy slump in the afternoon, sometime between 1:00 and 4:00. It’s tied to our circadian rhythm. Two ways to combat midday fatigue: napping and exercising. This blog deals only with the former! (I’ve previously written about sleeping habits here.) On average, adults who nap do so 94.3 days each year.
In the 1990s, James Maas, a social psychologist and sleep expert, coined the term power nap: a 10- to 20-minute sleep meant to boost energy and alertness. A power nap is reputed to let workers get back to work right away because this amount of sleep does not reach the deeper stages of a sleep cycle; the napper stays in the lighter stages of non-dreaming sleep. And for some, apparently, it works: 42.7% of U.S. full-time workers say they regularly nap during a break in a typical workday.
Avoid 30-minute naps.
They cause “sleep inertia,” a groggy state that can last for another 30 minutes after waking up. This is because the body is forced awake right after beginning, but not completing, the deeper stages of sleep.
A 60-minute nap might be okay.
Sleeping for 60 minutes includes the deepest type of sleep, slow-wave sleep. Because of this, the one-hour nap is ideal for helping an individual better remember faces, names, and facts. However, your brain will not complete a sleep cycle in only 60 minutes, so you may not be very alert for some time after waking up.
The ideal nap is 90 minutes.
This is the length of one full sleep cycle, which includes all the light, deep, and REM (dreaming) stages of sleep. A full-sleep-cycle nap improves procedural and emotional memory (e.g., for playing musical instruments and driving). A 90-minute nap can also significantly boost one’s creativity. Because the nap is a full sleep cycle, waking up should come much more easily. (This according to the National Sleep Foundation.)
On the other hand, the Mayo Clinic is very specific: the ideal nap occurs between 2pm and 3pm and lasts between 10 and 30 minutes. This takes advantage of one’s normal post-meal dip in energy and, if finished by 3pm, poses the least risk for causing sleeplessness at night.
Among older adults, shorter naps (less than 30 minutes) are reported by adults with better health; long naps (e.g., longer than 90 minutes) have been linked to cardiovascular problems and diabetes, declining cognitive function, and increased mortality.
Benefits of Napping
There are lots of benefits to sneaking in power naps every once in a while.
Curb the side effects of temporary sleep deprivation. If you missed getting adequate sleep the night before, a quick nap can be restorative.
Note: Temporary sleep deprivation refers to a night every once in a while in which you don’t get enough sleep.
Improve memory function and job performance. Younger people definitely benefit from a quick nap in the afternoon, which can help them immensely with their studies, if they are in school. People of all ages can enhance job performance (and physical performance, in general) with a brief period of shut eye. If you feel sluggish while at work or in school, you may be able to improve the situation with a nap.
Lower blood pressure.
Prevent mistakes in judgment or accidents while driving or operating machinery. Drowsy driving is dangerous and can strike anybody at any time.
Heal the body. A brief nap can help relieve stress, allow the body to heal inflammation and injury, and improve mood.
Napping Can Be Problematic
If you have insomnia, you might exacerbate or even cause it by taking naps. If you take long naps or nap later in the afternoon, they may alter your circadian rhythms, leading to trouble with falling asleep at bedtime. On the other hand, people with severe insomnia might find themselves only ever able to take short naps, rather than sleeping all night.
If you are diabetic, or likely to develop diabetes, note that recent research has linked long afternoon naps (over an hour) to Type II Diabetes. Observational studies of more than 300,000 people by the University of Tokyo found a link between long napping and a 45 percent increase in the incidence of diabetes when naps lasted at least 60 minutes.
If you don’t know what is causing your daytime fatigue, it might be better to avoid napping altogether. Aside from sleep disorders, there’s a whole range of other causes, from prescription medications to underlying health problems to depression and mood disorders.
The prevalence of napping in older adults ranges from 20% to 60% in different studies, but is consistently reported to be higher than in other age groups. Age-related changes in circadian rhythm and sleep patterns, cultural beliefs, chronic conditions, medications, and lifestyle changes contribute to the high prevalence of napping in older adults.
(FYI: If people lived alone in total dark, “days” would be about 25 hours each. However, our body clocks reset each day based on the sun’s light/dark cycles—plus alarm clocks, work schedules, and the world in general.)
Bottom Line: Both short and long naps can increase alertness and be useful. Choose depending on personal rhythms, why you are napping, and environmental constraints.
We’ve just celebrated the biggest candy month of the year! The single day with the most candy sales is October 28th, and the top five candy-selling days of the year all fall in October.
Just How Sweet Is It?
Over 10% of annual candy sales happen in the days leading up to Halloween, nearly $2 billion in sales.
Chocolate is the preferred choice of sweets for many. Of the $1.9 billion sold in Halloween candy each year, $1.2 billion was for chocolate candy and only $680 million for sugar candy.
Consumers buy an incredible 90 million pounds of chocolate candy during Halloween week, giving it a strong lead over other holidays. Almost 65 million pounds is sold during the week leading up to Easter but only 48 million pounds during Valentine’s week.
The average American household spends $44 a year on Halloween candy.
Americans purchase nearly 600 million pounds of candy a year for Halloween.
These “facts” popped up during multiple searches about candy. Could all of these “facts” be true? I don’t know. But without vouching for truthfulness or accuracy, I hereby present candy info from across the web.
Americans purchase over 20 million pounds of candy corn a year. With that said, it’s unlikely that every last one of those millions of candies was actually consumed. For one thing, it is the most hated Halloween candy of all. (See below)
After the beloved and beleaguered candy corn, the leading best sellers are as follows: Snickers, Reese’s, Kit Kats, and M&Ms.
Candy corn is the most searched-for candy term in Google — more popular than candy apples, gummy worms, and candy pumpkins.
Looking Beyond October
Candy, at its simplest, is the result of dissolving sugar in water. The cooking temperature determines the type of candy: high temperatures make hard candy, medium temperatures make soft candy, and low temperatures make chewy candy.
In Europe during the middle ages, the high cost of sugar made sugar candy a delicacy available only to the wealthy.
About 65% of American candy bars were introduced more than 50 years ago.
Americans over 18 years of age consume 65 percent of the candy produced each year.
Frank and Ethel Mars, who created the Snickers candy bar in 1929, named it after the family horse.
Retailers sell more than 36 million heart-shaped boxes of chocolate every year for Valentine’s Day.
In the 1800s, physicians commonly advised their broken-hearted patients to eat chocolate to calm their pining.
Not A New Thing
Fry’s Chocolate Cream: Candy as we know it today was first recorded in 1847 and can be considered the first candy ever made and sold commercially. Joseph Fry created it using bittersweet chocolate. Today, Cadbury manufactures this “Rich dark chocolate with a smooth fondant center.”
Good & Plenty is believed to be the oldest candy brand in the USA. The pink-and-white capsule-shaped chewy licorice was first produced in 1893 in Philadelphia. It’s still found at concession stands everywhere, which makes Good & Plenty a treat that can be enjoyed by candy lovers of all ages.
Dryden & Palmer dates back to 1880 when rock candy enjoyed great popularity as a cough-cold remedy and delicious confection. Every bar and saloon had its own creation of rock candy dissolved in rye whisky to “cure their patrons’ colds” or at least make them forget they had a cold in the first place. Prohibition hit the rock candy industry hard and, of the original manufacturers, only Dryden & Palmer remains today.
John Ross Edmiston may have been the accidental creator of saltwater taffy in Atlantic City in 1883. He jokingly offered “saltwater taffy” to customers after his boardwalk shop was flooded, soaking his taffy stock with salt water.
Tootsie Rolls debuted in 1896, introduced by Leo Hirshfield of New York who named them after his daughter’s nickname, “Tootsie”.
The War Office added Tootsie Rolls to soldiers’ rations during World War II due to their durability in all weather conditions.
According to USMC apocrypha, marines used Tootsie Rolls as emergency first aid to plug bullet holes during the Korean War.
In the 1940s and 1950s, “Captain Tootsie” fought crime with his sidekick Rolo in a daily ad comic strip.
Milton Hershey of Lancaster, PA introduced the first Hershey milk chocolate bar in 1900. Hershey’s Kisses appeared in their familiar foil wraps in 1906.
NECCO wafers are pastel-colored candy disks that first appeared in 1901, named for the acronym of the New England Confectionery Company.
Baby Ruth candy bars were first sold in 1920, named for President Grover Cleveland’s daughter – not the famous baseball player.
Milky Way Bar is the first of many candies to be introduced by the Mars family in 1923. It was created to taste like a malted milk that would be available anywhere, anytime. One of the earliest advertisements for Milky Way listed “sunlight and fresh air” as primary ingredients.
M&M/Mars introduced the Snickers Bar in 1930. It is the number-one selling candy bar in the U.S. today.
M&M/Mars debuted the 3 Musketeers Bar in 1932. It was originally made as a three-flavor bar featuring chocolate, vanilla and strawberry nougat. In 1945, M&M/Mars changed to making them with only chocolate nougat.
Soldiers’ rations in the Spanish Civil War inspired Forrest Mars, Sr to create M&Ms: plain chocolate candies in a shell of hard sugar.
Mars joined Bruce Murrie (son of Hershey executive William Murrie) to produce M&Ms in 1941, marketing them as durable in response to slack chocolate sales in summer.
During World War II, M&Ms were sold exclusively to the US military because of their durability.
Hershey’s had an exclusive contract with the American military to supply chocolate for soldiers’ rations during World War II.
They specifically created the D-Ration Bar to “taste a little better than a boiled potato” to discourage soldiers from eating only their chocolate ration and nothing else.
The recipe for these emergency chocolate rations made a viscous liquid so thick that it clogged the regular manufacturing machines and had to be packed into molds by hand.
Hershey produced a Tropical D-Ration specifically designed to withstand the high temperatures in the Pacific Theater.
Multiple sources claim to be the creators of Skittles, including the Wrigley’s candy company and a nebulous British man named Skittle. Today, 200 million Skittles are produced each day.
Sugar Daddies, the caramel lollipops, were originally called Papa Suckers.
Dum Dums “mystery” flavor is always a mix of two flavors. The machine creates them when it switches to producing a new flavor.
Reese’s Peanut Butter Cups are the No. 1 selling candy brand in the United States, consisting of white fudge, milk, or dark chocolate cups filled with peanut butter. H.B. Reese invented them in 1928 after he founded the H.B. Reese Candy Company in 1923.
11/7 National Bittersweet Chocolate with Almonds Day
12/7 National Cotton Candy Day
12/19 National Hard Candy Day
12/26 National Candy Cane Day
12/28 National Chocolate Candy Day
What Candy Does to Your Body
Less than two percent of the calories in the American diet come from candy.
A one-ounce piece of milk chocolate contains about the same amount of caffeine as a cup of decaffeinated coffee.
When we eat sweet foods, we activate the brain’s reward system — called the mesolimbic dopamine system. Dopamine is a brain chemical released by neurons and can signal that an event was positive. When the reward system fires, it reinforces behaviors — making it more likely for us to carry out these actions again.
The recommended dose is just two to three pieces of candy a day.
While eating too much candy in one sitting can do a number on your blood sugar and your teeth, it’s true that occasional excess probably won’t do major lasting harm. In the long-term, however, repeated indulgence in high-sugar foods can increase your risk for a number of health problems.
The effects of added sugar intake — higher blood pressure, inflammation, weight gain, diabetes, and fatty liver disease — are all linked to an increased risk for heart attack and stroke.
Candy has some physical health benefits. Dark chocolate is rich in antioxidant flavonoids, which are healthy for your heart; regularly eating this rich treat can decrease your risk of stroke and heart attack by 39 percent.
Chocolate has been shown to improve depression and anxiety symptoms and to help enhance feelings of calmness and contentedness. Both the flavanols and methylxanthines are believed to play a role in chocolate’s mood-enhancing effects.
Chocolate can’t replace traditional treatment options for mood disorders, but science may support its role in your diet. In one cross-sectional survey, people who had eaten dark chocolate within the previous 24 hours were approximately 70% less likely to report depressive symptoms.
BOTTOM LINE: Some ways good, some ways bad, always sweet!
I can’t help it: every October my thoughts turn to bones. Bones—especially skulls and skeletons—are sort of my thing. Athletics, not so much.
Still, I have it on the best authority—an authority, anyway—that October is the best month for sports, too. Sammy Sucu (bleacherreport.com) ranks October #1 for sports fans.
World Series and MLB playoffs
NBA and NHL seasons begin
NFL is in full swing
College basketball begins
College football rivalry matches
Soccer rivalry matches
Clearly, this is a biased list. There are roughly 200 sports that are internationally recognized, and besides those listed above, dozens of them are played in October: ice skating, rugby, weight lifting, cricket, badminton/table tennis, sailing, tennis, beach volleyball, chess, karate, golf, various motor sports, swimming, field hockey, skiing, and gymnastics, among others. Plus, October is National Roller Skating Month!
Put them together, and October might also be the month with the most broken bones.
Most Breakable Sports—Where Broken Bones are Common
Bones that are most commonly fractured during sports are in the wrist, hand, ankle, foot, and collarbone. (FYI, in talking about bones, a break is the same as a fracture.)
Stress fractures are most commonly seen in athletes whose sports require repetitive movements such as marathon runners. I know a woman who developed stress fractures in her ankle while training for a marathon but decided to run anyway. She ran 26.2 miles on a fractured ankle, in a tremendous amount of pain.
A fracture occurred in 20.6% of the emergency department visits for sports-related injuries.
Most of the fractures occurred in football players (22.5%).
The OR (odds ratios) for fracture was highest for inline skating (OR, 6.03), males (OR, 1.21), Asians, whites, and Amerindians (OR, 1.46, 1.25, and 1.18, respectively), and those older than 84 years (OR, 4.77).
Fractures are most common in contact sports such as basketball, rugby, and football. The most commonly fractured bones in contact sports are the hands, wrist, collarbone, ankle, feet, and the long bones of the lower extremities. Overall, contact sport athletes have a high risk of fractures in ankles and feet because they get into vulnerable positions while playing.
Among high school athletes, the highest rate of fractures was in football (4.61 per 10,000 athlete exposures) and the lowest in volleyball (0.52). Boys were more likely than girls to sustain fractures in basketball and soccer.
Most fractures heal in 6-8 weeks, but this varies tremendously from bone to bone and in each person. Hand and wrist fractures often heal in 4-6 weeks whereas a tibia fracture may take 20 weeks or more.
But broken bones aren’t the biggest risk. I’m surprised that the top 7 most frequent sports injuries seldom involve bone fractures.
Knee Injury. About 55% of sports injuries occur in the knee.
ACL Tear. Your anterior cruciate ligament (ACL) is responsible for connecting your thigh to your shinbone at your knee.
Tennis or Golf Elbow
Safest Sports—Where Broken Bones are Rare
1. Swimming: It’s easy on the joints, can aid recovery after an injury, and is the safest sport in America.
2. Cheerleading: Occasional falls may cause broken bones, especially while practicing new routines.
3. Golf: Any sport in which players are not required to physically touch one another will more than likely be safer. Golf injuries most often result from the repetitive action of swinging the club.
4. Track and Field: The most common injuries are running injuries such as ankle arthritis, knee sprains, shin splints, and other knee problems.
5. Baseball: Also not a contact sport; the most common injuries are rotator cuff tears, especially for pitchers. Others include ulnar collateral ligament tears, knee injuries, and muscle strains. A pitched ball may also hit a batter’s face, and fielders going for a catch risk concussions from falls.
FYI, Top 10 broken bones overall (not just athletes)
Not all fractures get a cast! A clavicle, for example. Also a coccyx.
Sports That Help Prevent Broken Bones
Athletes participating in weight-bearing sports have approximately 10% higher bone mineral density (BMD) than nonathletes, and athletes in high-impact sports have a higher BMD than those in medium- or low-impact sports.
Investigators found that soccer players and gymnasts have the highest bone density in most body segments and the lowest fat mass, while swimmers had the lowest bone mineral density at most skeletal sites.
Boxing improves bone mineral density. The forces through the hands and arms stimulate bones to mineralize and strengthen, ultimately reducing the risk of developing osteopenia or osteoporosis and potentially even reversing the conditions in some cases.
Osteoporosis is a disorder characterized by low bone density and impaired bone strength, an important risk factor for fracture. Low bone mass poses a particular challenge for athletes because it predisposes to stress-related bone injuries and increases the risk of osteoporosis and insufficiency fractures with aging.
My Personal Bone Break Stories
1) In second grade I climbed to the top of the swing set and fell, breaking my left arm. That was pretty cool, getting attention, signatures, and artwork on the cast.
2) The first time I tried downhill skiing, I sat down on the edge of my ski and broke my tailbone. The local ski injury doctor (!) said I should sit on a rubber donut, and then he gave me a prescription for pain pills that I could refill ten times. (This was decades ago, of course.) When I asked whether there was nothing he could actually do about it, he said that if I still had trouble a year or so down the line (difficulty riding in a car, for instance), a doctor could surgically remove it.
(Last year, I posted a blog about human bones/skeletons in general and another about the all-important spine. Still good info there.)
Bottom line: Choose your activities carefully and take care of your bones.
Having consumed all the pawpaws, I’ve turned to pumpkins. Pumpkins, too, are a native fruit. (Yes, botanically, pumpkins are fruits, a type of berry known as a pepo, to be precise. But cooks and diners commonly class pumpkins with vegetables—along with squash, tomatoes, eggplant and other “vegetables” that have their seeds on the inside—allowing pawpaw to be the largest native food that is considered and eaten as fruit.)
Pumpkins and winter squash are native to the Americas, from the southwestern part of what is now the United States through much of Central and South America. People have cultivated pumpkins at least since 3500 B.C.E. Corn and pumpkins are the oldest known crops in the western hemisphere.
And who hasn’t heard about the Cahokian, Muscogee, and Iroquois “three sisters” system of companion planting: corn, beans, and squash/pumpkins grown together to the benefit of all.
Native peoples baked pumpkins whole in wood ashes, stewed them, and sometimes made a sort of succotash with beans and corn. Pumpkin was a popular ingredient in meat stews. They roasted long strips of pumpkin on an open fire until edible.
Roasted seeds were (and are) eaten as a delicacy. In fall, people cut pumpkins into rings and hung up the strips to dry, later to grind the strips into flour to add to bread.
Perhaps more unexpectedly, Native Americans dried strips of pumpkin flesh and wove them into mats. And, they made a fermented drink from pumpkins. (Researchers have recently found that fermenting pumpkin reduces insulin-dependent sugars, making it a particularly suitable beverage for diabetics.)
Native Americans introduced colonists to pumpkins and they, too, relied heavily on pumpkin for food as evidenced by this poem (circa 1630):
For pottage and puddings and custard and pies,
Our pumpkins and parsnips are common supplies:
We have pumpkins at morning and pumpkins at noon,
If it were not for pumpkins, we should be undoon.
Early colonists used pumpkins as the Native Americans taught them, also making pumpkin butter (similar to apple butter) and pumpkin syrup (as a substitute for molasses).
During the Revolutionary War, they made pumpkin sugar! (Pumpkin, Pumpkin, Anne Copeland) FYI: at one time, the Port of Boston was called Pumpkinshire.
Now, eating pumpkin is more seasonal. Come October, one can easily find pumpkin muffins, bread, meatloaf, soup, ice cream, and drinks. Thoughts of pumpkin pie stir. (FYI, the canned product sold for making pumpkin pies actually is Cucurbita moschata, a species of winter squash. The FDA does not distinguish among varieties of squash when labeling canned foods.)
Although once an important food source, pumpkins are now more prominent in Halloween and Thanksgiving decorations.
Jack-o-lanterns originated in Ireland. According to legend, Stingy Jack fooled the devil so many times that when Jack arrived at the gates of hell, the devil wouldn’t let him in. Instead, he sent Jack off into the night with a burning lump of coal, which Jack put into a hollowed-out turnip; he has been roaming the Earth ever since.
“If you knew the sufferings of that forsaken craythur, since the time the poor sowl was doomed to wandher, with a lanthern in his hand, on this cowld earth, without rest for his foot, or shelter for his head, until the day of judgment… oh, it ‘ud soften the heart of stone to see him as I once did, the poor old dunawn, his feet blistered and bleeding, his poneens (rags) all flying about him, and the rains of heaven beating on his ould white head.”
Immigrants to America continued the tradition of making jack-o-lanterns but switched to easier-to-carve pumpkins. The influx of Irish immigrants in the 18th and 19th centuries greatly increased the popularity of Halloween celebrations. They adapted the customs and traditions of Samhain to their new homes in North America, including dressing in costumes, trick-or-treating, pranking houses, and carving jack-o-lanterns.
Imagine a pumpkin. Chances are, what came to mind first was a “typical” pumpkin, 12-18 pounds, oblong and orange, as commonly seen around and about in October, suitable for painting and carving. But consider the variety!
One of the most popular miniature pumpkin varieties is Jack Be Little, orange, about 3” in diameter and 2” high. Typically used for fall decorations, they’re also edible and grow well on trellises, making them ideal for small growing spaces.
Baby Boo are small white pumpkins, also suitable for decorating and eating. Each plant produces about 10 pumpkins. Extreme sun and frost don’t affect growth adversely.
At the other end of the continuum, you’ll find giant pumpkins: in 2022, a pumpkin set a new North American record, weighing 2,560 pounds. This was at the 49th Safeway World Championship Pumpkin Weigh-Off in Half Moon Bay, California, though Travis Gienger grew the pumpkin in Minnesota.
Half Moon Bay considers itself the pumpkin capital of the world because local growers produce more than 3,000 tons of pumpkins each year. But in 2021, Stefano Cutrupi, an Italian grower, set the world record for giant pumpkins with a 2,703-pound pumpkin.
To truly appreciate pumpkins, go to a pumpkin festival. My home state of Ohio hosts the Circleville Pumpkin Show—“The Greatest Pumpkin Show on Earth”—always held the 3rd Wednesday through Saturday in October. There is, of course, every pumpkin food and beverage you might want available for purchase. Plus you can enjoy a giant pumpkin weigh-in, pumpkin carving demonstrations, and the crowning of Little Miss Pumpkin Show. And concerts for music lovers (this year featuring DJ Tune Stoned and The Poverty String Band).
Truth be told, once upon a time I used canned pumpkin for cooking and fresh pumpkins only for jack-o-lanterns. But when I had three daughters, and thus three pumpkins, I couldn’t bear the waste and started collecting pumpkin recipes. I once thought of writing The Great Pumpkin Cookbook, but never got beyond a notebook full of clippings. I lost momentum when I found the following:
But I will share one pumpkin soup recipe I made up, based on a side dish my son-in-law made.
Savory Pumpkin Soup
1-2 cloves garlic, chopped
Chopped onion
Vegetable or olive oil to sauté
Equal amounts of pumpkin puree and diced canned tomatoes
Vegetable or chicken broth
Optional: your favorite herb or spice, such as basil, curry, etc.
Blue cheese or feta cheese
Gauge the garlic and onion on the basis of your taste and the amount of soup you are making. (For 15 oz. cans of tomatoes and puree, I use 1 clove of garlic and half a medium onion.) Sauté garlic and onion till soft. Add the pumpkin and tomatoes, and enough broth to make a soup of the consistency you like. If using additional seasonings, add now. Simmer to blend. When hot, add cheese to taste and stir to melt.
BOTTOM LINE: there’s a lot more to pumpkins than decorations and pie!
Periodically, a friend of a friend gifts me a few pawpaws. The pawpaw (Asimina triloba) is a little-known and (IMHO) not pretty fruit. Pawpaws are especially not pretty when left in the fridge during a week at the beach.
These are what remain of my most recent gift, received two days before I left town. Surprisingly, five of them are not just edible after a week in the fridge; they’re delicious. Which brings me to wax poetic—or at least, try to—about this fruit native to Virginia and most of the eastern United States and southern Canada.
For one thing, it’s the only fruit native anywhere in North America that resembles tropical fruits. It is also the largest edible fruit native to North America. Open a pawpaw and you’ll find a sunshine-yellow pulp dotted with dark brown/black seeds. The flesh is the consistency of pudding and tastes like some combination of banana, mango, and pineapple. What’s not to love?
In 1541, a Portuguese explorer who accompanied Hernando de Soto wrote, “The fruit is like unto Peares Riall [pears royal]; it has a very good smell and an excellent taste.”
Pawpaws are high in vitamin C, magnesium, iron, copper, and manganese. They are a good source of potassium and several essential amino acids, and they also contain significant amounts of riboflavin, niacin, calcium, phosphorus, and zinc.
I eat them “as is,” but people who have enough to save for later can freeze the flesh for baking or make it into preserves. Pawpaws will not ripen if plucked from the tree too early, but unripe pawpaws can be fermented into a sweet wine that pawpaw connoisseurs highly prize.
And about those seeds: since the pawpaw is the largest edible fruit native to North America (5-16 oz., 3-6 inches long), there is plenty of room for seeds. The seeds are reminiscent of lima beans in shape and adorn the flesh in two rows, 10-14 seeds per fruit. Each seed is 1/2 to 1-1/2 inches long. Reputedly, pawpaws grow easily from seeds, but I’ve never tried. In the wild, pawpaws send out suckers, creating the “pawpaw patch” of song. Pawpaw cultivators frequently grow new trees from grafts, which can produce fruit up to a pound and a half in size.
When sucked clean, the seeds feel satin smooth. One might be tempted to carry one as a lucky charm or worry “stone.” I can imagine these seeds used in children’s games: money, tokens… But if one chooses to play with dry pawpaw seeds, be aware that dry seeds won’t germinate.
Unlike most fruit trees, pawpaws do not attract bees for pollination. The flowers attract carrion flies and beetles. Pawpaw leaves are the only host for zebra swallowtail butterfly larvae.
If you aren’t familiar with pawpaws, you aren’t alone. You might know them as a poor man’s banana, Indiana banana, prairie banana, frost banana, custard apple, fetid-bush, or bandango. They aren’t easy to store or ship, so until recently they hadn’t been developed as a commercial food. Food scientist Neal Peterson is one of many pawpaw enthusiasts who have spent decades breeding and cultivating pawpaws to make them commercially viable, greatly widening their availability.
But they were a key component of American Indian diets; indeed, the Shawnee even had a “pawpaw month” (ha’siminikiisfwa) when they harvested and preserved pawpaws. It was a cultivated food for many tribes along the Eastern Seaboard. Archaeologists have found huge quantities of pawpaw seeds and remnants at the sites of the earliest Native American settlements all along the east coast of North America.
A wise move, because pawpaws are incredibly nutritious.
At least two U.S. presidents favored pawpaws: reportedly, they were George Washington’s favorite dessert. Thomas Jefferson grew pawpaws at Monticello and had the seeds shipped to friends in Paris when he was the American ambassador to France.
Journal entries document that pawpaws fed the Lewis & Clark expedition on their return trip in the fall of 1806. In fact, pawpaw fruits and nuts saved the expedition from starvation and death when, in western Missouri, their rations ran low and no game was to be found.
Our party entirely out of provisions. Subsisting on poppaws. We divide the buiskit [sic] (biscuits) which amount to nearly one buisket [sic] per man, this in addition to the poppaws is to last us down to the Settlement’s which is 150 miles.
For a time, many European settlers viewed the pawpaw as a marker of racial difference, according to food historian Rebecca Earle. As ideas about racial and societal divides developed and codified, white settlers often dismissed pawpaws. Rejecting “different” foods, including pawpaws, as fit only for “different” races, became part of the colonial identity.
Their hardiness and tendency to grow wild made pawpaws a common food source along several areas of the Underground Railroad.
During the Great Depression, people often ate pawpaws as a substitute for other fruits, hence their nickname “poor man’s bananas.” Though the pawpaw continued to be an important fruit in the North American diet, interest waned after World War II with the introduction of other fruits. Racist views of the pawpaw’s place in the American diet contributed to its marginalization. As Dr. Devon Mihesuah, a scholar of Indigenous foodways, says, pawpaws haven’t been forgotten so much as “ignored, disliked, and unavailable.”
Rural populations relied heavily on pawpaw fruit as a food source, so naturally other parts of the tree figured heavily in medicine and folklore traditions. In some communities, people wore pawpaw seeds as an amulet to prevent disease. Shawnee and Catawba artisans used pawpaw bark fiber to make fishing nets and lines, weaving designs for luck and good fish catches into the nets.
Pawpaws offered powerful protection against Ozark Witches. Ozarkers used many means to thwart witches, especially to protect the home. One method was driving several tiny pegs of pawpaw wood into the doorsill.
The (supposedly) powerful Pawpaw Conjure used wood from the pawpaw tree:
This charm could be employed if the witch master could obtain the witch’s nail parings, a lock of hair, a tooth, or a cloth with her blood on it. The hair, nail parings, or other personal effects were stuck to the end of a wooden peg with beeswax. The witch master took this peg out into the woods at midnight, bored a hole in the fork of a pawpaw tree, and drove the peg into the hole. The witch, and her powers, were expected to dwindle.
I’m in Corolla, NC now, reveling in the wonder that is water. I grew up more-or-less in the middle of Ohio—not exactly water country. I first saw the ocean at age twenty, during spring break on the east coast of Florida near Tequesta/Jupiter. It was love at first sight: soft, white sand; clear, warm water; and the sounds of moving water…
Since then I’ve been near—or better yet, sailing on—water at every opportunity. Life is just better on water.
And this isn’t a placebo effect, specific to me!
The Wonder of Water Outside the Body
There are psychological benefits to water, especially oceans. Research indicates that being by the sea has a positive impact on mental health. (Psych Central)
Minerals in the sea air reduce stress
Negatively charged ions in the sea air combat free radicals, improving alertness and concentration
Salt in the water preserves tryptamine, serotonin and melatonin levels in the brain, which aid in diminishing depression or increasing your overall sense of wellness
The sounds of waves alter the brain’s wave patterns, producing a state of relaxation
So, even the sound of water is powerful, soothing. Water sounds have long been used in meditation. The benefits of “blue space” – the sea and coastline, but also rivers, lakes, canals, waterfalls, even fountains – are less well publicized, yet the science has been consistent for at least a decade: being by water is good for body and mind.
Whenever I’m near the ocean, a bay, a river, I’m awed by the vastness and the interconnectedness of water. Water covers 71% of the Earth’s surface. I often think about cells sanded off my feet ending up oceans away.
And I’ve experienced nothing more awesome than being on the water in a small boat during a storm. Watching lightning go from the earth up. Furling the sails and trying to hold the tiller steady. And knowing that the water is primal and ultimately has all the power. I’m inconsequential.
Listening to ocean sounds is a popular sleep aid: people are able to let go of thoughts and allow sleep in.
And then there are the beneficial environmental factors, such as less polluted air and more sunlight. Also, people who live by water tend to be more physically active – not just with water sports, but walking and cycling. (The Guardian)
The Wonder of Water Inside the Body
In addition there are physiological benefits of water: reducing muscle tension and joint stress, and keeping skin moisturized, hair shiny, etc. (Fix)
When was the last time you thought about—really thought about—water? (Not counting hurricane Ian, of course.) How many times a day do you unthinkingly turn on a faucet? Water is so prevalent it’s easy to forget that life depends on it. People deprived of food and water will die of dehydration first.
Water makes up 75% of the human brain. People who consume too much alcohol often wake so parched that their tongues stick to the roof of their mouths and their lips stick together. Imagine what has happened to your watery brain. (For the handful of you out there who have never had such an experience, think cotton balls and glue.)
The Wonder of Water and the History of the Body
Much of our nutrition comes from seafood. Waterways have long been a means of transportation and an avenue of trade. But the wonder of water goes way beyond its utility.
Once upon a time, our ancestors slithered out of the sea. People still want to live and stay by water. Waterfront property values are consistently higher than others. Of course, which water, and whether there is access to it, etc., count for a lot, but still…
Papua New Guinea was long isolated from the rest of the world. The island is mountainous, and tribes located in tiny villages have warred with one another for five hundred generations. Even today, almost a thousand languages can be found (approximately 12% of all the languages spoken in the world). It is a wild place, where the outside world has had little influence, and cannibalism may be practiced, though there is some debate on how widespread the custom is today.
Actually, the only tribe that may still practice cannibalism is the Korowai (a.k.a. Kolufo), in the south-eastern part of Papua, the western half of New Guinea. But still, that is one marker of how very different human societies can be.
Archeological records suggest that pigs were introduced to New Guinea between 2,500 and 10,000 years ago, by way of a land bridge to Asia that has since disappeared. Pigs play important roles among the peoples of Papua, especially so among those living in the Central Highlands.
Apart from pigs and deer (originally brought in by the Europeans) there are not many mammals on these islands. So, yes, pigs are bred for their meat, but they are rarely killed just for eating.
The Pig Culture
In a section known as Kaulong, a pig culture prevails. The people believe pigs and humans are on a single continuum of existence, such that pigs may behave more or less like humans and humans may behave more or less like pigs.
Villagers say, “Pigs are our hearts!” Young pigs are treated as pets: they share their owner’s cooked food, are ritually named and baptized, are given magical treatments for illness, and women pre-chew tubers to feed to weak piglets.
The men own the pigs. Although there is some assertion that women or children (rarely) own a pig, the counterargument is that the man has “given the pig into their care,” and thus they speak of it as their own.
Raising pigs is an important responsibility, and there is no argument that women are the ones who care for these precious animals.
At birth, powdered lime is blown into the nostrils of the piglet to make it forget its natural mother and cause it to bond with its human one.
Pigs are named and share the women’s sleeping quarters. The women pet and handle them. Occasionally, orphaned piglets, or small pigs unable to compete against siblings, are breast-fed by nursing mothers.
(If you search for “woman suckling pig” online, you can find images: a Huli woman breast-feeding a child and a piglet at the same time, and a Chimbu woman in the Eastern Highlands of Papua New Guinea breast-feeding a piglet.)
Pigs as Part of Ceremonies
Pigs are sacrificed in some places to appease the ancestral spirits, and they play central roles at major rites of passage: births, weaning of children, initiation of boys, a girl’s first menstruation, weddings, and funerals. The most frequent occasion for eating pig meat is a funerary cremation.
Pig killing/eating accompanies many undertakings, such as house building and boat building.
Much feasting accompanies festivals in which local men of influence match themselves in prestige competitions.
Pigs are exchanged at peacemaking ceremonies after violent disputes.
During the major pig feasts, held at regular intervals, men, women, and children eat pig meat every day for weeks on end.
There are two exceptions to the ceremonial slaughter: a pig that is sick and a pig that has been stolen. Such pigs are consumed as soon as possible, without the usual ceremony.
Pigs for Status and Trade
Pigs are an important symbol of political and social power. The more pigs an individual has, the more pigs he can give away, leading to bigger feasts and higher social status.
Pigs are the main dowry offered in exchange for brides.
The most valuable pig to own is a “tusker.” These are pigs which have had their upper canines removed by a specialist, so that their lower canines can grow unimpeded, sometimes—after ten or twelve years—turning in a full circle to re-enter the lower jaw. After the ceremonial removal, the owner will use spells and all-night ceremonies to enhance that growth.
Adult tribe members blacken their own teeth with manganese oxide because white, visible teeth signify aggression (like a pig’s tusks). Tusks are made into ornaments, which a man must kill another man to earn permission to wear. When men are challenging another tribe in battle, they clench pigs’ tusks between their own teeth to appear more aggressive: “Watch out. I can be like a pig. I am powerful and dangerous.”
BOTTOM LINE: Pigs are a very valuable commodity in this part of the world because they are used to buy brides, in general commerce and trading, and for feasts and important ceremonies. Pig ownership is a sign of a man’s wealth. Thus, at (virtually) all costs, pigs are kept alive and pampered until needed. Seems like hog heaven to me!