
5 Misconceptions about Health and Wellness

2023-10-10 00:23
How much water should you really drink a day? Well, it’s complicated.

We have a lot of misconceptions when it comes to health and wellness, from the cold-fighting powers of Vitamin C to whatever it is that crystals are supposed to do. Let’s break down a few of them, adapted from an episode of Misconceptions on YouTube.

1. Misconception: You need to drink eight glasses of water a day.

Everybody knows that if you don’t drink enough water each day, you’ll get dehydrated. Your lips will turn dry, your tongue will feel fuzzy, and you might pass out. But there’s an easy way to prevent this kind of dehydration-related discomfort: drinking eight glasses of water a day. That’s why we’re carrying around 16- to 24-ounce water bottles, right? So we can easily calculate how many 8-ounce increments we consume each day?

It’s a fact that our bodies lose water by breathing, sweating, and answering nature’s call, and that water needs to be replaced to maintain our metabolism and other normal bodily functions.

Yet, the Centers for Disease Control and Prevention does not specify the optimal amount of water a person should drink each day. The closest we come to official advice is from the Institute of Medicine, which in 2005 established “adequate intakes” of total water per day to maintain health. “Total water” means the entire amount derived from drinking water, beverages, and food. And yes, that includes caffeinated drinks like coffee, tea, and soda.

Based on survey data, the adequate intake for young adults is 3.7 liters for men and 2.7 liters for women. Men were drinking about 3 liters of actual liquids per day, and women were drinking 2.2 liters. That’s actually more than eight 8-ounce glasses of water per day.
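To see why, it helps to make the conversion explicit. A quick back-of-the-envelope check, assuming US fluid ounces (about 29.6 milliliters each):

$$8 \times 8\,\text{oz} = 64\,\text{oz} \approx 64 \times 29.6\,\text{mL} \approx 1.9\,\text{L}$$

So the roughly 3 liters men reported drinking and the 2.2 liters women reported drinking both exceed the 1.9 liters that “eight glasses” works out to.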

So where does this “eight glasses” idea come from?

Back in 1945, the Food and Nutrition Board of the National Research Council released an update to its bulletin Recommended Dietary Allowances, which set out dietary standards. It’s likely the earliest instance of a government recommendation for daily water intake. In a short section about water, the researchers wrote:

“A suitable allowance of water for adults is 2.5 liters daily in most instances. An ordinary standard for diverse persons is 1 milliliter for each calorie of food. Most of this quantity is contained in prepared foods. At work or in hot weather, requirements may reach 5 to 13 liters daily.”

These guidelines might not be surprising, considering they’re not far from what the Institute of Medicine recommends today.

But according to a 2002 study [PDF], later scientists and nutritionists apparently overlooked the middle sentence, the one noting that most of that daily water is already contained in food, and misinterpreted the first sentence as recommending that people drink 2.5 liters in addition to whatever they consume in other beverages and food. That misinterpretation has been repeated for decades. One 2011 commentary in the medical journal BMJ even blamed multinational food conglomerates for keeping the misconception going, since it helped them sell more bottled water.

2. Misconception: Caffeinated beverages dehydrate you.

Going back to caffeine: don’t caffeinated beverages dehydrate you? Caffeine does have a diuretic effect, meaning it makes you pee more, and you might assume that all that peeing adds up to dehydration. But multiple studies, dating all the way back to 1928, have not found a strong link between caffeine and total water deficit. The Institute of Medicine report suggests that “caffeinated beverages appear to contribute to the daily total water intake, similar to that contributed by noncaffeinated beverages.”

3. Misconception: Vitamin C prevents colds.

The misconception that Vitamin C prevents the common cold has been around for a long time.

Before Vitamin C—also called ascorbic acid—was discovered in the early 20th century, people believed that eating certain fresh fruits and vegetables prevented illness.

In the 1750s, Scottish physician James Lind suggested citrus prevented scurvy, a lethal disease that was rampant on long naval voyages. He didn’t know that Vitamin C was the active ingredient preventing scurvy, but his years as a naval surgeon were enough to convince him that fruits and veggies had a curative element. By 1795, the British Admiralty began issuing rations of lemon juice. Later it switched to lime juice, which is how British sailors got the nickname “limeys.”

So there was a strong case for consuming lime, lemon, and other fruits for good health by the early 20th century, but the mechanism behind it was still unknown. Researchers struggled to pinpoint the molecules or chemicals responsible. In 1930, a Hungarian biochemist named Albert Szent-Györgyi and a colleague conducted an experiment on guinea pigs, which, like humans, cannot produce Vitamin C in their bodies the way most animals can. Szent-Györgyi fed one group of guinea pigs boiled foods; another group ate food supplemented with what was then known as hexuronic acid, a molecule he had discovered during his earlier study of biological combustion.

The animals that ate boiled foods without the hexuronic acid developed scurvy-like symptoms and died, because boiling destroys the foods’ Vitamin C. The group that ate the hexuronic acid-laced foods remained healthy. Szent-Györgyi renamed the molecule “ascorbic acid” to reflect its anti-scurvy (or antiscorbutic) property.

After more experiments to confirm his findings, Szent-Györgyi was awarded the Nobel Prize in Physiology or Medicine in 1937 for his discovery of Vitamin C.

We know now that Vitamin C enables the body to use carbs, proteins, and fats, and to maintain healthy bones, teeth, gums, blood vessels, and ligaments. It may also lower your risk of cardiovascular disease and cancer.

But beyond those health benefits, Vitamin C has not been shown in studies to prevent colds for most people. Some studies have suggested that taking large doses of Vitamin C may lessen the severity or duration of cold symptoms, but it generally doesn’t have these effects if you start taking it after the cold hits you. One paper also found a possible preventive benefit for marathon runners and for soldiers exercising in subarctic conditions, but that doesn’t apply to the vast majority of us.

The widely held opinion that Vitamin C prevents colds probably stems from the work of another Nobel Laureate—Linus Pauling, one of the giants of 20th-century science. This chemist and peace activist won the Nobel Prize in Chemistry in 1954 for his work understanding chemical bonds, and a few years later won the Nobel Peace Prize for his actions opposing the nuclear arms race during the Cold War.

In the 1970s, Pauling became interested in a string of inconclusive studies of Vitamin C and colds. After analyzing the literature, Pauling published a bestseller called Vitamin C and the Common Cold, which made specific recommendations for readers. He wrote that 1 to 2 grams of ascorbic acid per day “is approximately the optimum rate of ingestion. There is evidence that some people remain in very good health, including freedom from the common cold, year after year, through the ingestion of only 250 mg of ascorbic acid per day.” He also said you should carry 500-mg Vitamin C tablets “with you at all times.”

While the American public rushed out to buy Vitamin C supplements, skeptical researchers launched more than two dozen studies about these claims, and even today, no study has shown conclusively that ascorbic acid prevents colds among the general population.

While we’re on the topic, what’s the deal with chicken soup for treating colds? Some studies have found that, like Vitamin C, chicken soup may lessen the symptoms and length of colds. A 2000 study in the journal CHEST found a mild anti-inflammatory effect from chicken soup, which seemed to clear cold sufferers’ stuffy airways. But the researchers were unable to tell which ingredient (chicken, onion, sweet potatoes, carrots, etc.) was responsible.

4. Misconception: Cracking your knuckles is bad for your joints.

First, let’s talk about why your joints crack. Your knuckles are where your metacarpals, or hand bones, connect with the proximal phalanges, or finger bones. The area where the bones meet is cushioned by a structure called the synovial capsule, which is filled with a lubricating fluid containing dissolved gases and nutrients.

When you pull or bend your fingers backward to crack your knuckles, you stretch the synovial capsule and create more space inside it. Gases rush into the resulting vacuum, creating a bubble. The cracking sound corresponds to the formation of those bubbles. After the bubbles form, the gases take a while to re-dissolve in the fluid, which is why you can’t crack your knuckles again right away.

Repeatedly popping your synovial capsules sounds bad—but does it cause joint damage like arthritis? One physician in Thousand Oaks, California, was determined to find out, using himself as a test subject.

When Donald Unger was a child, his mother and aunts repeatedly told him that cracking his knuckles would cause arthritis. Out of curiosity, or perhaps spite, he began cracking his left hand’s knuckles twice a day and kept it up for 50 years, avoiding his right hand so it could serve as a control. After approximately 36,500 left-knuckle cracks, Unger compared his hands and found no arthritis in either one. Writing in the journal Arthritis and Rheumatism in 1998, Unger acknowledged that a sample size of one was too small to confirm population-scale results, but noted that his findings supported a 1970s study of 28 nursing home residents that found no link between knuckle cracking and arthritis [PDF].

For his valuable contributions to science, Unger received the 2009 Ig Nobel Prize for Medicine.

In the years since Unger began his experiment, a handful of studies have examined a possible link between knuckle-cracking and joint damage. A 1990 study found that people who cracked their knuckles had less grip strength and more hand swelling than those who didn’t [PDF]. Another report described minor injuries to ligaments and tendons in the joints, which healed within a month [PDF]. But these effects were modest, and none led to arthritis. So, crack away, knowing that the worst consequence of your knuckle-popping will likely be annoying your friends and co-workers.

5. Misconception: Crystals have healing powers.

A lot of people put their faith in healing crystals—they might hold a rose quartz if they’re seeking love, or sleep with obsidian on their pillow to feel calm and grounded. Let’s look at where these notions originated.

In the 1st century CE, the Roman scholar Pliny the Elder described various precious stones, and the healing remedies associated with them, in his massive book Natural History. Most of the remedies involved ingesting the stones in some form along with food or drink. For example, hematite was “taken with wine for the cure of wounds inflicted by serpents,” among other ailments.

Pliny also sought to describe crystals in relatively scientific terms. He wrote that crystal (what we would call rock crystal or quartz today) was formed from rainwater and pure snow, congealing over time into a solid, clear, often spiky mineral. While Pliny didn’t suggest eating or drinking it, he said that “medical men” had told him that it made a great lens for use in surgery. “The very best cautery for the human body is a ball of crystal acted upon by the rays of the sun,” he wrote.

The supposed link between crystals and health or medicine was well established by the Middle Ages. At that time, Christian writers in Europe mentioned crystals in their works, and important religious books were bound in covers encrusted with crystals and precious stones. According to Stanford University medievalist Marisa Galvez, crystals symbolized transcendence and intellectual illumination. Among Christians, they also became associated with purity, faith, and perfection, all characteristics of the Virgin Mary—who, not coincidentally, was believed to have a healing touch.

Skipping past the Enlightenment: by the late 19th century, crystals had gained a following among spiritualists in the UK and the U.S. Gazing into crystal balls for visions of the future or answers to existential questions became a major fad. Some spiritualists identified crystals as a kind of holistic medicine, perhaps in the same category as then-popular patent drugs and potions. Others associated this self-curated healthcare with values like self-reliance, self-awareness, creativity, and psychological growth [PDF].

In the 1970s and ‘80s, healing crystals got another boost in the New Age movement, which explored gray areas between science, magic, nature, and the occult. One overarching New Age belief is that health is a balance of body, mind, and spirit, and that illness can be traced to an imbalance among the three. Crystals are thought to help maintain the balance in one way or another.

That theory underlies the concept of complementary medicine, sometimes called alternative medicine, which also includes acupuncture, music therapy, herbal remedies, prayer, yoga, and other practices used with or in place of modern Western medicine. These remedies are gaining legitimacy today; there’s even a center at the NIH devoted to studying their efficacy. And at least one researcher in the UK has looked at crystal healing as a form of complementary medicine.

But behind all this, researchers have found that crystals’ supposed healing powers are most likely a result of the placebo effect. For example, a crystal user might convince herself that the stones can improve her mood, and actually feel better while holding them or meditating with them, even if the stones have no intrinsic mood-boosting properties.

The placebo effect is real, scientists have found, and can help patients improve their outlook while undergoing treatments. For some patients, the belief in the power of a non-medical intervention, like prayer or crystals, can actually improve the effectiveness of Western medical practices. So if you find crystals helpful, have at it—just know that your mind is more powerful than a mineral.

This article was originally published on www.mentalfloss.com as 5 Misconceptions about Health and Wellness.