We generally assume moderate drinking (two standard drinks per day) is good for our health.
This idea comes from studies over the past three decades showing moderate drinkers are healthier and less likely to die prematurely than those who drink more, less, or don’t drink at all.
I would be glad if this were true.
But our latest research challenges this view. We found that while moderate drinkers are healthier than relatively heavy drinkers or non-drinkers, they are also wealthier. When we control for the influence of wealth, alcohol’s apparent health benefit is much reduced in women aged 50 years or older, and disappears completely in men of a similar age. [continue]
David Fajgenbaum was the fittest of his friends at the University of Pennsylvania’s medical school, a 6-foot-3 gym addict and former quarterback at Georgetown. His mammoth hands seemed more suited to spiraling footballs than the fine fingerwork a doctor-in-training might need. He had endurance to match, taking multiple hits and returning to the field to play on.
“This guy was a physical specimen,” said his former roommate, Grant Mitchell, who used to walk to work with him. When they would arrive at the hospital for his obstetrics rotation, his friend recalled, “he would basically coerce me into doing pull-ups on the tree outside.”
In July 2010, that all changed. The 25-year-old woke up at night drenched in sweat. His lymph nodes were swollen. He felt stabs of abdominal pain, and odd red bumps began sprouting across his chest. Most bizarre of all, he felt very tired — so tired that he began slipping into empty exam rooms between patients, stealing five-minute naps to get through the day.
“Guys, I think I’m dying,” he recalled telling his friends.
A visit to the emergency room confirmed his fears. A doctor told him that his liver, kidneys and bone marrow were not working properly. Even more troubling, the doctor had no idea why his body was failing. “What do you think is going on?” he remembers the doctor asking him. [continue]
Have you come across any of Gary Taubes’ writing? If you’ve got the slightest interest in nutrition, history, and health, he is worth your attention. Here’s a recent article of his from the Guardian: Is sugar the world’s most popular drug?
Imagine a drug that can intoxicate us, can infuse us with energy and can be taken by mouth. It doesn’t have to be injected, smoked, or snorted for us to experience its sublime and soothing effects. Imagine that it mixes well with virtually every food and particularly liquids, and that when given to infants it provokes a feeling of pleasure so profound and intense that its pursuit becomes a driving force throughout their lives.
Could the taste of sugar on the tongue be a kind of intoxication? What about the possibility that sugar itself is an intoxicant, a drug? Overconsumption of this drug may have long-term side-effects, but there are none in the short term – no staggering or dizziness, no slurring of speech, no passing out or drifting away, no heart palpitations or respiratory distress. When it is given to children, its effects may be only more extreme variations on the apparently natural emotional rollercoaster of childhood, from the initial intoxication to the tantrums and whining of what may or may not be withdrawal a few hours later. More than anything, it makes children happy, at least for the period during which they’re consuming it. It calms their distress, eases their pain, focuses their attention and leaves them excited and full of joy until the dose wears off. The only downside is that children will come to expect another dose, perhaps to demand it, on a regular basis. [continue]
Stories abound that undermine the notion that elite athletes are healthy. From the running world, marathoner Alberto Salazar, at the age of 48, suffered a heart attack and lay dead for 14 minutes before a cardiologist placed a stent in a blocked artery, saving his life. Micah True, the ultra-marathoner and protagonist of the bestselling book Born to Run, went for a 12-mile run in the New Mexico wilderness and was later found dead.
Of course, these tragic tales are preceded by the origin story of an endurance athlete running himself, literally, to death. An enlarged, thickened heart with patchy scar tissue is common in long-term endurance athletes and is dubbed “Pheidippides cardiomyopathy” after the 40-year-old running messenger (and prototypical masters endurance athlete) who died after bringing the news of Greek victory at the battle of Marathon to Athens. Pheidippides was a hemerodrome (an all-day running courier in Ancient Greece), and he had run 240km over two days to request help from Sparta against the Persians at Marathon, before expiring after running the additional 42km (26.2 miles) back from the battlefield. We celebrate his death by running marathons.
These deaths are even more alarming when you consider the subjects — highly trained athletes in what many would consider peak physical condition. Isn’t exercise supposed to prevent us from falling victim to a heart attack? [read the whole article]
If you’re an endurance athlete, does this give you pause for thought?
Researchers at McMaster University have found that a single minute of very intense exercise produces health benefits similar to longer, traditional endurance training.
The findings put to rest the common excuse for not getting in shape: there is not enough time.
“This is a very time-efficient workout strategy,” says Martin Gibala, a professor of kinesiology at McMaster and lead author on the study. “Brief bursts of intense exercise are remarkably effective.” [continue]
Today the BBC published an article that is awesome on so many levels. It is The sugar conspiracy. The summary:
In 1972, a British scientist sounded the alarm that sugar – and not fat – was the greatest danger to our health. But his findings were ridiculed and his reputation ruined. How did the world’s top nutrition scientists get it so wrong for so long?
And indeed, how did they?
If you care about health, science, and whether the nutrition advice you’ve tried to follow is nonsense or not, this is worth your time.
Health. Diabetes. Pre-diabetes. Diabetes and the link to heart disease. How the body processes carbs. What causes people to get fat, anyway? If any of this interests you, you’ll want to read blogs like the one Dr Malcolm Kendrick writes. His latest post is What happens to the carbs – part II. It begins:
My interest in nutrition began many years ago as part of my over-riding interest in cardiovascular disease. This means that, unlike many other people, I backed into this area with no great interest in the effect of food on health. For most doctors nutrition takes up about an hour of the medical degree course. We are pretty much given to understand that it is of little medical significance. Eat a balanced diet…end of. I also paid nutrition about that much heed.
However, because of the power and influence of the diet/heart hypothesis I felt the need to understand more about this whole area, and how the system of digestion and metabolism actually worked. At first my interest was purely to find out if there was any clear and consistent association between diet and cardiovascular disease (which I shall call heart disease from now on, as it is simplest to do so).
Like many others, before and since, I could not find any such association. Nor could I find any biochemical or physiological reason why saturated fat, in particular, could cause heart disease. That issue, of course, represents a long and winding road that I am not going down here.
However, it did not take long before I became sidetracked by the very powerful and consistent association between heart disease and diabetes. People with diabetes have far higher rates of heart disease than people who do not. In the case of women with diabetes, the increase in risk hovers around five times the rate of non-diabetics. So it became clear that I would need to understand diabetes if I was going to fully understand heart disease. [continue]
The dewy chill over Leicester, England, in March 1885 did not deter thousands of protesters from gathering outside nearby York Castle to protest the imprisonment of seven activists. Organizers claimed as many as 100,000 people attended, although historians estimate it was closer to 20,000.
The cause they rallied against? Vaccination.
This movement has faded from popular memory, obscured by the controversy of more recent anti-vaccination efforts, which gained momentum in the 1990s. However, the effects of the Victorian anti-vaccination movement still echo in the debate over the personal belief exemption, which was banned in California in June.
On the day the Leicester protesters gathered, vaccination was mandatory in England. Nearly a century before, Edward Jenner, an English physician, had invented a method of protecting people against the raging threat of smallpox. The treatment was called vaccination, and it involved [continue]
There’s a good article on Over-Training Syndrome (OTS) at Outside Online: Running on Empty.
“OTS is one of the scariest things I’ve ever seen in my 30-plus years of working with athletes,” says David Nieman, former vice president of the American College of Sports Medicine. “To watch someone go from that degree of proficiency to a shell of their former self is unbelievably painful and frustrating.”
Nieman, a professor of health and exercise science at Appalachian State University in North Carolina, has spent his career studying the effects of training on the immune system. In 1992, he received the first of a dozen distressingly similar letters from endurance athletes, each of them describing a sudden loss of ability as they struggled with everything from anemia to chronic dehydration to a basic inability to get out of bed. Nieman was both troubled and fascinated by these tales. Their symptoms all seemed to point to overtraining syndrome, and he’s been looking into the root causes of the condition ever since. [continue]
Believe it or not, there are a few cultures in the world where back pain hardly exists. One indigenous tribe in central India reported essentially none. And the discs in their backs showed little sign of degeneration as people aged.
An acupuncturist in Palo Alto, Calif., thinks she has figured out why. She has traveled around the world studying cultures with low rates of back pain — how they stand, sit and walk. Now she’s sharing their secrets with back pain sufferers across the U.S.
About two decades ago, Esther Gokhale started to struggle with her own back after she had her first child. “I had excruciating pain. I couldn’t sleep at night,” she says. “I was walking around the block every two hours. I was just crippled.” [continue]
She’s sure not crippled anymore. Now Esther teaches other people what she learned. Go look her up on YouTube if you’d like to see what she teaches.
A few years ago I stumbled upon Esther Gokhale’s book, 8 Steps to a Pain-Free Back. Worth every penny. Esther’s book taught me things I didn’t know about posture and lifting. I was captivated by her story.
I follow Esther’s advice. It’s not the only thing I do for my back, but it’s an important part of being supple and pain-free.
(Link to the NPR article found here at Mark’s Daily Apple.)
The Fore people, a once-isolated tribe in eastern Papua New Guinea, had a long-standing tradition of mortuary feasts — eating the dead from their own community at funerals. Men consumed the flesh of their deceased relatives, while women and children ate the brain. It was an expression of respect for the lost loved ones, but the practice wreaked havoc on the communities they left behind. That’s because a deadly molecule that lives in brains was spreading to the women who ate them, causing a horrible degenerative illness called “kuru” that at one point killed 2 percent of the population each year.
The practice was outlawed in the 1950s, and the kuru epidemic began to recede. But in its wake it left a curious and irreversible mark on the Fore, one that has implications far beyond Papua New Guinea: After years of eating brains, some Fore have developed a genetic resistance to the molecule that causes several fatal brain diseases, including kuru, mad cow disease and some cases of dementia.
The research also sheds light on how modern Europeans came to look the way they do – and that these various traits may originate in different ancient populations. Blue eyes, it suggests, could come from hunter gatherers in Mesolithic Europe (10,000 to 5,000 BC), while other characteristics arrived later with newcomers from the East. [continue]
“Do me a favor and don’t wear any eye makeup when you come in,” I recall the receptionist having requested over the phone. “It messes with the goggles.”
Instead of saying, “Goggles?” as I was thinking, I said, “Eye makeup?”
“Mascara, eye shadow, eyeliner,” the receptionist said.
I’d been to this functional neurology center in a suburb of Portland, Oregon, several times since 2007, when I was diagnosed with a puddle of cerebrospinal fluid—the water that your brain floats in—about the size of a lemon where my right parietal lobe would be. The parietal lobe is the part of the brain responsible for judging time, space, distance, and the location of the body, among other tasks. I was diagnosed only a couple of months before leaving for grad school in Southern California; I had been hoping to get to the bottom of why learning to drive had proven impossible for me. [continue]
In a stunning discovery that overturns decades of textbook teaching, researchers at the University of Virginia School of Medicine have determined that the brain is directly connected to the immune system by vessels previously thought not to exist. That such vessels could have escaped detection when the lymphatic system has been so thoroughly mapped throughout the body is surprising on its own, but the true significance of the discovery lies in the effects it could have on the study and treatment of neurological diseases ranging from autism to Alzheimer’s disease to multiple sclerosis. [continue]
Wow, just … wow. Suddenly all my anatomy textbooks are out of date!
Portugal decriminalized the use of all drugs in 2001. Weed, cocaine, heroin, you name it — Portugal decided to treat possession and use of small quantities of these drugs as a public health issue, not a criminal one. The drugs were still illegal, of course. But now getting caught with them meant a small fine and maybe a referral to a treatment program — not jail time and a criminal record. (…)
The prevalence of past-year and past-month drug use among young adults has fallen since 2001, according to statistics compiled by the Transform Drug Policy Foundation, which advocates on behalf of ending the war on drugs. Overall adult use is down slightly too. And new HIV cases among drug users are way down. [continue]
This is great news for Portugal, but it makes me feel so frustrated about our approach to drug use in Canada. I’d like to be in charge long enough to change a few laws, and to make naloxone (Narcan) available to paramedics and members of the general public. I think that would save a lot of lives.
The “glass delusion” is an extraordinary psychiatric phenomenon in which people believe themselves to be made of glass and thus liable to shatter. It peaked centuries ago but there are still isolated cases today, writes Victoria Shepherd.
The late medieval French King Charles VI was one of the most notable sufferers of glass delusion. He was reported to have wrapped himself in blankets to prevent his buttocks from breaking.
Instances of the delusion cropped up in medical encyclopaedias from across Europe. There were references in fiction – most notably Cervantes’ short story The Glass Graduate of 1613, in which the hero is poisoned by a quince intended as an aphrodisiac but which instead triggers a glass delusion. [continue]
Better than thinking one’s skin is covered with bugs, I guess.
Can a routine hospital stay upset the balance of microbes in our bodies so much that it sets some older people up for a life-threatening health crisis called sepsis? A new University of Michigan and VA study suggests this may be the case.
It shows that older adults are three times more likely to develop sepsis — a body-wide catastrophic response to infection — in the first three months after leaving a hospital than at any other time.
What’s more, the risk of sepsis in that short post-hospital time is 30 percent higher for people whose original hospital stay involved care for any type of infection — and 70 percent higher for those who had a gut infection called Clostridium difficile.
In fact, one in 10 C. diff survivors ends up with sepsis within three months of their hospital stay, according to the new study published online in the American Journal of Respiratory and Critical Care Medicine. It’s the first analysis of its kind.
The researchers chose to look at the relationship between hospitalization and sepsis because of a growing understanding that antibiotics and other infection treatments disrupt the body’s microbiome — the natural community of bacteria and other organisms that is vital for healthy body function. In turn, C. difficile preys upon hospital patients who have a disrupted gut microbiome. [continue]
Has news about the importance of the microbiome been showing up everywhere in your world, too? Or is it just the stuff I read that is full of microbiome news?
The first time Grigory Kessel held the ancient manuscript, its animal-hide pages more than 1,000 years old, it seemed oddly familiar.
A Syriac scholar at Philipps University in Marburg, Germany, Dr. Kessel was sitting in the library of the manuscript’s owner, a wealthy collector of rare scientific material in Baltimore. At that moment, Dr. Kessel realized that just three weeks earlier, in a library at Harvard University, he had seen a single orphaned page that was too similar to these pages to be coincidence.
The manuscript he held contained a hidden translation of an ancient, influential medical text by Galen of Pergamon, a Greco-Roman physician and philosopher who died in 200 A.D. It was missing pages and Dr. Kessel was suddenly convinced one of them was in Boston.
Dr. Kessel’s realization in February 2013 marked the beginning of a global hunt for the other lost leaves, a search that culminated in May with the digitization of the final rediscovered page in Paris. [continue]
I love it when a story relates to so many of my interests. (History, books, medicine, old manuscripts, digitization, Vatican library… and on and on!)
Being alone may be the central anxiety of our time but, as it turns out, you are never really alone — at least in a biological sense: Every single cell of you — that is, every cell made of human DNA — is kept company by ten cells of microbes that call your body home. And because microbes are single-celled organisms that each carry their own DNA, the difference is even starker in genetic terms — you carry approximately twenty thousand human genes and two to twenty million microbial ones, which makes you 99% microbe. What’s more, although you and I are 99.99% identical in our human DNA, we are vastly different in our individual microbiomes — you have only one in ten of my microbes. Even more striking than the sheer number of these silent and invisible cohabitants is their power over what we consider our human experience — they influence everything from our energy level to how we handle illness to our moods to how tasty we are to mosquitoes. [continue]
My father, a neurologist, once had a patient who was tormented, in the most visceral sense, by a poem. Philip was 12 years old and a student at a prestigious boarding school in Princeton, New Jersey. One of his assignments was to recite Edgar Allan Poe’s The Raven. By the day of the presentation, he had rehearsed the poem dozens of times and could recall it with ease. But this time, as he stood before his classmates, something strange happened.
Each time he delivered the poem’s famous haunting refrain—“Quoth the Raven ‘Nevermore’ ”—the right side of his mouth quivered. The tremor intensified until, about halfway through the recitation, he fell to the floor in convulsions, having lost all control of his body, including bladder and bowels, in front of an audience of merciless adolescents. His first seizure. [continue]
“Slim by Chocolate!” the headlines blared. A team of German researchers had found that people on a low-carb diet lost weight 10 percent faster if they ate a chocolate bar every day. It made the front page of Bild, Europe’s largest daily newspaper, just beneath their update about the Germanwings crash. From there, it ricocheted around the internet and beyond, making news in more than 20 countries and half a dozen languages. It was discussed on television news shows. It appeared in glossy print, most recently in the June issue of Shape magazine (“Why You Must Eat Chocolate Daily”, page 128). Not only does chocolate accelerate weight loss, the study found, but it leads to healthier cholesterol levels and overall increased well-being. The Bild story quotes the study’s lead author, Johannes Bohannon, Ph.D., research director of the Institute of Diet and Health: “The best part is you can buy chocolate everywhere.”
I am Johannes Bohannon, Ph.D. Well, actually my name is John, and I’m a journalist. I do have a Ph.D., but it’s in the molecular biology of bacteria, not humans. The Institute of Diet and Health? That’s nothing more than a website.
Other than those fibs, the study was 100 percent authentic. My colleagues and I recruited actual human subjects in Germany. We ran an actual clinical trial, with subjects randomly assigned to different diet regimes. And the statistically significant benefits of chocolate that we reported are based on the actual data. It was, in fact, a fairly typical study for the field of diet research. Which is to say: It was terrible science. The results are meaningless, and the health claims that the media blasted out to millions of people around the world are utterly unfounded.
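Why are the results meaningless? The core trick is multiple comparisons: measure enough different outcomes in small groups, and chance alone will almost always hand you something that crosses the p < 0.05 threshold. As a rough back-of-the-envelope sketch (the figure of 18 endpoints comes from Bohannon’s own account of the hoax; the calculation is an idealization assuming independent outcomes, not the study’s actual data):

```python
# If each of n independent outcomes has a 5% chance of a spurious
# "significant" result, the odds that at least one comes up positive
# grow quickly with n.
def false_positive_rate(n_outcomes: int, alpha: float = 0.05) -> float:
    """Probability that at least one of n independent null tests yields p < alpha."""
    return 1 - (1 - alpha) ** n_outcomes

for n in (1, 5, 18):
    print(f"{n:2d} outcomes -> {false_positive_rate(n):.0%} chance of a 'significant' finding")
# Prints roughly: 5% for 1 outcome, 23% for 5, 60% for 18.
```

In other words, with 18 things measured, a “statistically significant” chocolate benefit was more likely than not before a single subject ate a square.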
If you are reading this before breakfast, please consider having an egg. Any day now, the US government will officially accept the advice to drop cholesterol from its list of “nutrients of concern” altogether. It also wants to “de-emphasise” saturated fat, given “the lack of evidence connecting it with cardiovascular disease”.
This is a mighty U-turn, albeit hedged about in caveats, and long overdue. The evidence has been building for years that eating cholesterol does not cause high blood cholesterol. A 2013 review by the American Heart Association and the American College of Cardiology found “no appreciable relationship between consumption of dietary cholesterol and serum [blood] cholesterol”.
Cholesterol is not some vile poison but an essential ingredient of life, which makes animal cell membranes flexible and is the raw material for making hormones, like testosterone and oestrogen. Your liver manufactures most of the cholesterol found in your blood from scratch, and adjusts for what you ingest, which is why diet does not determine blood cholesterol levels. Lowering blood cholesterol by changing diet is all but impossible.
Nor is there any good evidence that high blood cholesterol causes atherosclerosis, coronary heart disease or shorter life. It is not even a risk factor in people who have already had heart attacks. In elderly people — ie, those who have the most heart attacks — the lower your blood cholesterol, the greater your risk of death. Likewise in children. [continue]
Modern lifestyles have famously made humans heavier but, in one particular way, noticeably lighter than our hunter-gatherer ancestors: in the bones. Now a new study of the bones of hundreds of humans who lived in Europe during the past 33,000 years finds that the rise of agriculture and a corresponding fall in mobility drove the change, rather than urbanization, nutrition or other factors. [continue]
Excuse me while I go off to become ridiculously active.