In dopamine primers 1 and 2, I introduced this useful neurotransmitter and key to the thinking style of humanity. Today we'll go over the different neurotransmitters on the different sides of the brain, and why that matters.
Dopamine is the neurotransmitter responsible not only for modulating a lot of our physical movement, but also for sex, aggression, motivational drive, and, counter-intuitively, long-term planning and restraint or impulse control. In nearly all right-handed humans and most left-handed ones (that is, the left-brain dominant), dopamine rules the left side of the brain. Lateral dopamine pathways modulate working memory, cognitive shifting, and other executive functions (such as planning).
On the right side of the brain - the artsy part - serotonin and norepinephrine are more dominant. Serotonin and norepinephrine have more to do with emotional activation and arousal systems. The serotonin systems manage our movements and focus on a close, personal level. They work in close concert with the opiate reward system and some other hormonal and neuronal systems associated with close-in work - feeding babies, eating - stuff we would do with our gaze directed downward, and which usually doesn't require as much planning. Norepinephrine and serotonin networks also manage our vestibular systems, helping with balance, postural control, and knowing which way is up or down. Again, one associates balance with looking down or being centered in space. Norepinephrine tracts in the brain are also heavily involved in the "close-in" sense of touch.
The dopamine systems on the left side of the brain become more active with longer-term thinking and planning - scanning the horizon, searching, recognition. These systems rarely operate close in, within our personal space, as it is rare to bring something within arm's reach that we do not already recognize. The medial dopamine systems are used in exploration, navigation, and orientation to landmarks. Far vision, upward gaze, hearing, and smell are more associated with the dopamine tracts. Dopamine's linkage to distant space seems to have associated itself in humans with distant time as well. There is no species quite so future-oriented as Homo sapiens. And no other species has such high concentrations of dopamine in the brain.
An obvious and perhaps controversial correlate to these different neurotransmitter systems responsible for close/feeling/balance versus far/navigation/planning is the inherent female/male parallel. Aggressive "male" behaviors are mediated by dopamine, whereas receptive female and maternal behaviors (grooming, feeding) are mediated by close-in cues, norepinephrine, and opiate and oxytocin systems.
Psychiatrically speaking, it is probably not a coincidence that dopamine related disorders, such as schizophrenia, addiction, ADHD and even autism are much more common in men, whereas the serotonin/norepinephrine linked anxiety and depressive disorders are more common in women. Of course dopamine is also associated with depression and opiates with addiction, and men get depressed and anxious while women have ADHD and autism. These are obviously not absolutes, just trends.
It also occurs to me that the differences in personality disorder diagnoses between men and women may relate to these neurotransmitter dominance differences as well. Many more men than women have the diagnosis of antisocial personality disorder - basically, sociopathy. A person with antisocial personality disorder generally lacks empathy for others, and often struggles with impulsivity, aggressive drives, and substance abuse. That combination will get you sociopathic behavior - stealing, murder, etc. Many more women than men are diagnosed with borderline personality disorder. Someone who has borderline personality generally has trouble with relationships and with containing feelings appropriately. He or she can struggle with rage and impulsive behavior also, but often the violence is directed at him or herself in suicide attempts or self-injurious behavior. A person with borderline personality disorder has empathy but has trouble applying it - he or she likely has boundary issues and has a hard time sitting with another person's pain without feeling it too much as his or her own. Thus he or she will feel very deeply for someone, then, when it gets to be too much, push him or her away. People with borderline personality also tend toward impulsive behavior and have a higher risk of substance abuse. Both borderline and sociopathic individuals are likely to have experienced abuse or neglect as children, but some have not and still have the disorder.
So if we run with the rather gross simplification that men are dopamine "left brain" dominant and women perhaps more balanced between the right and left hemispheres, you can see how troubles with regulating dopamine combined with an underactive social/empathy serotonin/norepinephrine system would get you sociopathy in a man. In a woman, the trouble in borderline personality seems to be more in the poor regulation of both - but perhaps more a deficiency of dopamine and a dysregulation of the social/empathy serotonin/norepinephrine side of things. Now these are all just my rampant speculations, but I'll keep my eye out for some actual proof.
And, of course, there are borderline men and antisocial women (think Sharon Stone in "Basic Instinct"). There is some question whether our social male/female constructs keep us from wanting to diagnose women with antisocial personalities, and perhaps the same social constructs, pressures, and expectations mean that an abused boy child has the risk of growing up into a sociopath, while an abused girl child has the risk of becoming a borderline. I'm inclined to believe some of that is going on, but that the biology is also at play. Of course, many abused children grow up without having personality disorders as well.
I'll finish up with a discussion of how our brains end up the way they do in the first place. The same gravimetric serotonin-driven vestibular systems that help us stay balanced are responsible for the migration and localization of the dopamine tracts as they are formed in early development. The theory goes that our bipedalism keeps us oriented differently than other primates or other animals in utero, leading to different gravity signals and more lateralized brain development (meaning our right and left brains are simply more different from each other than the right and left brains of any other animal). There are no known genes that act as master directors for this lateralization - it all seems to be related to epigenetic factors and serotonin availability in the right place at the right time.
Natural human movement, then, is what may make our brains human.
(once again I'm drawing from Previc's The Dopaminergic Mind in Human Evolution and History).
Also Previc's paper "The neuropsychology of 3-D space": http://www.ncbi.nlm.nih.gov/pubmed/9747184
Monday, November 29, 2010
Friday, November 26, 2010
Acne, Depression, Omega 3, Vitamins, and Minerals
In 2008 some folks from a Beverly Hills skin clinic wrote up a short paper in Lipids in Health and Disease called Acne vulgaris, mental health and omega-3 fatty acids: a report of cases (free full text). The experiment itself was an open-label trial of a mineral/omega-3 supplement on five patients, so useful only as a reason to do further research. But a lot of interesting science tidbits on acne, omega-3s, and minerals are noted in the article, so it's worth a peek.
Acne is a disease of civilization which, like depression, has increased over the last half century, especially in women. As we discussed in my blog post, Acne and Suicide, patients with acne are more likely to be depressed, angry, and suicidal. In fact, patients with acne struggle more with mental health issues than even patients with epilepsy or diabetes (according to a study comparing questionnaires between sufferers of acne and other general medical conditions).
Acne is accompanied by the overproduction of sebum, a waxy oil, in addition to inflammation, hormonal shifts, and infection. Inflammation is one of the earliest manifestations of the disease, particularly mediated by a leukotriene called LTB4. This inflammatory chemical helps up-regulate sebum production, and you might be interested to know that the omega 6 fatty acid derivative arachidonic acid is made into LTB4, while the omega 3 fatty acid EPA (from fish) inhibits the conversion of arachidonic acid to LTB4. A study of 1000 teenagers in North Carolina showed a lower incidence of pustules, acne cysts, and oily skin in those teenagers consuming the most fish. Another study showed that patients with acne ate low amounts of seafood. In my own clinical experience, young adults with acne have experienced a reduction in severity when they begin to supplement with fish oil (though it is not a complete cure, and doesn't seem to help everyone). However, many have a very noticeable improvement. There is a prescription drug, zileuton, that inhibits LTB4 and improves acne, but it would seem a fish prescription might be more practical.
Patients with acne, being in a state of systemic inflammation, also seem to have lower serum amounts of several vitamins and minerals, specifically zinc, vitamins A and C, and selenium. Studies of all these supplements - some administered topically, some orally, some both - seemed to show some benefit. In addition, EGCG from green tea has been "suggested to be helpful in acne due to its well documented anti-inflammatory and antioxidant activity."
Acne is also worse in people with poor control of blood glucose, and the supplement chromium is known to have some minor benefit in that area. There was one open label trial of 400 mcg of chromium daily that seemed to help.
So in this experiment, five patients (three males and two females, aged 18-23) were given a daily supplement with 1000 mg of EPA, 200 mg of EGCG, 15 mg of zinc gluconate, 200 mcg of selenium, and 200 mcg of chromium. They didn't use any new topical treatments or change their diets in any way. The number of pimples and amount of inflammation were noted with a standardized acne scale at the beginning, and measured again at the end of two months. In addition, each patient took a before and after test measuring mental, emotional, and social well-being.
The results? Four of the five had improvement in number of lesions, and all seemed to have a reduction of general skin inflammation. Sense of well-being improved 24% (with a range in the five patients of 20-27%). The authors thought this improvement might be due to the EPA, but since EPA seems less important in the brain than its sister fish oil, DHA, I'm prone to be skeptical. I wonder if the improvement might be due to the generalized reduction in inflammation.
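To make the arithmetic concrete, here is a minimal sketch of how a mean percent improvement like the reported 24% (range 20-27%) gets computed. The individual before-and-after scores below are hypothetical, invented purely to illustrate the calculation; the paper reports only the summary figures.

```python
# Hypothetical before/after well-being scores for five patients, invented
# only to show how a mean percent improvement is calculated. The paper
# itself reports just the summary: roughly 24% improvement, range 20-27%.
before = [50, 100, 50, 50, 100]
after = [60, 123, 62, 63, 127]

pct_change = [(post - pre) / pre * 100 for pre, post in zip(before, after)]
mean_change = sum(pct_change) / len(pct_change)

print([round(p, 1) for p in pct_change])  # [20.0, 23.0, 24.0, 26.0, 27.0]
print(round(mean_change, 1))              # 24.0
```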
All told, this tiny little open-label trial doesn't allow us to draw too many conclusions. Without a control and some more data points, we can only tuck the information away as something to look at further. Now, a healthy Paleolithic diet with organ meats and fish would provide the vitamins, minerals, and EPA (not sure about the EGCG), especially in an active hunter-gatherer who would consume and burn more calories, and therefore more nutrients, along the way. I feel saturated fat itself helps the skin (thinking of the adolescent Masai warriors, the beautiful people of Thailand, and my own experience starting with a more Cordain-inspired lower-fat paleo diet and a few months later switching to a higher-fat primal diet). My skin is much less sensitive and I'm not prone to breakouts at all with the absence of gluten and the increase in saturated fat of my primal-style diet. In fact, one of the ways I know I've been slipping a little too much (a birthday party here, a wedding to go to there) is if a little spot pops up.
Everyone benefits from an improvement in looks. One of the fastest ways to improve mental health in my clinical experience is to help someone successfully get into fat-burning mode and clear the skin. Clinical experience and common scientific sense are one thing. Real controlled trials are something else. Bring them on.
Thursday, November 25, 2010
Happy Thanksgiving! Chow Down on Dopamine Primer 2
Hope you are having a great time on your Turkey Day (assuming you celebrate it as the majority of my blog readership does). I baked a paleo pumpkin pie last night, and today I'll make (and eat) the crab stuffed mushrooms (finally, a use for the oregano plant we mostly killed in the fall) from the same Everyday Paleo page. We're having our turkey (I'm told it is organic free range, and if it isn't, I don't want to know) with some relatives on Saturday, so by then no doubt I will have baked another pie to bring with me to the gathering, only this time I'll likely add vanilla and coconut flakes to the butter pecan hazelnut crust. Yum! Gotta love a Thanksgiving where you don't even mentally count the fat grams. Just avoid the gluten...but the rolls and stuffing were never my favorites anyway. Give me meat and pie and I'll be happy.
Back to dopamine. My favorite neurotransmitter, and also (I'm assuming) the favorite neurotransmitter of Dr. Fred H. Previc, who wrote The Dopaminergic Mind in Human Evolution and History, which will be, once again, the major source for much of the information here, though there will be some from the psychiatry clinical trenches too (let's assume I did learn something in residency). Dopamine is special because it divides the humans from the beasts, as it were, and yet some of our more bestial qualities stem from chasing that dopamine feeling - via ultimate fighting, cocaine, speed, and also the reward pathways in binge eating, drinking, gambling, shopping, etc. etc. etc. Anything that involves competitiveness, drive, sex, planning, working memory, concentration, aggression... dopamine plays a role.
Dopamine is distributed quite differently on the different sides of the human brain, and it is speculated that this lateralization is responsible for how very human we are. The left brain (in almost all right-handed and most left-handed people) is responsible for language, linear reasoning, mathematics, that sort of thing, whereas the right side is usually responsible for intuition, holistic reasoning, some elements of music and speech intonation, etc. It's funny, really, how different the sides of the brain are. They look much the same. My right and left lungs look a little different, but they have the same function. My right and left feet do pretty much the same thing. But remove my left brain, and I'll be a different person. Amazingly, if one is young enough, half the brain can be scooped out and removed (if necessary, usually to control intractable seizures), leaving the child with pretty normal intellectual functioning (though motor functioning on one side is usually irrevocably lost). This finding (among others) led one prominent neuroscientist to pen a famous chapter called "Is Your Brain Really Necessary?"
The human brain has 100 billion neurons. Only 20,000 or so carry dopamine, and they do so along four major tracts. Keep in mind that the brain is a gorgeous and mysterious place with many wonderful names, like Rivendell or Brigadoon. The areas are named either anatomically (the dorsolateral prefrontal cortex) or because it reminded the anatomist of yore of some other important anatomical structure (the mammillary bodies). Or for some other obscure reason (the tegmentum? Oh, that's Latin for "covering." Okay!)
Dopamine is made in two little areas in the deep animal recesses of the brain - the substantia nigra and the ventral tegmental area. From these starting gates, the dopamine tracts reach out to various segments.
1) The nigrostriatal tract is important to neurologists. It brings dopamine from the substantia nigra to the striatum or "basal ganglia." These neurons are responsible for a lot of the motor control of the body. Death of dopaminergic neurons in the substantia nigra leads to the symptoms of Parkinson's disease - tremor, stiffness, loss of voluntary movement (someone with advanced Parkinson's may not be able to toss a ball to you, but if you toss a ball to him, he might be able to use different reflex motor brain tracts and catch it). This pathway is also affected in various choreas, such as the possibly wheat-related Huntington's chorea.
2) The mesolimbic pathway goes from the ventral tegmental area to the limbic system. The limbic system of the brain controls reward and emotion, and includes the hippocampus and the medial frontal cortex. This is the pathway that is thought to be responsible for addiction and psychosis.
3) The mesocortical pathway goes from the ventral tegmental area to the dorsolateral frontal cortex. This is the pathway responsible for planning, responsibility, prioritizing, motivation, and some elements of emotional response. This is one of the damaged areas in ADHD and depression.
4) The tuberoinfundibular pathway has my favorite name (though not as fun a name as the anterior inferior thoracoacromial artery). It's a dopamine pathway between the hypothalamus and the pituitary gland, and the most important part is that dopamine inhibits prolactin release. So inhibiting dopamine means prolactin increases, enabling breastfeeding and whatnot.
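For those who like their anatomy in a more structured form, here's a minimal sketch that simply reorganizes the four tracts above as a small Python data structure - the origins, targets, and roles are exactly those listed in the text, nothing new.

```python
# The four major dopamine tracts described above, organized as a plain
# dictionary. Origins, targets, and roles are taken from the text.
DOPAMINE_TRACTS = {
    "nigrostriatal": {
        "origin": "substantia nigra",
        "target": "striatum (basal ganglia)",
        "roles": ["motor control"],
        "clinical": ["Parkinson's disease", "choreas"],
    },
    "mesolimbic": {
        "origin": "ventral tegmental area",
        "target": "limbic system (hippocampus, medial frontal cortex)",
        "roles": ["reward", "emotion"],
        "clinical": ["addiction", "psychosis"],
    },
    "mesocortical": {
        "origin": "ventral tegmental area",
        "target": "dorsolateral frontal cortex",
        "roles": ["planning", "prioritizing", "motivation"],
        "clinical": ["ADHD", "depression"],
    },
    "tuberoinfundibular": {
        "origin": "hypothalamus",
        "target": "pituitary gland",
        "roles": ["inhibits prolactin release"],
        "clinical": ["prolactin rises when dopamine is blocked"],
    },
}

# The tracts that matter most in psychiatric terms:
psychiatric = [name for name, tract in DOPAMINE_TRACTS.items()
               if {"addiction", "psychosis", "ADHD", "depression"} & set(tract["clinical"])]
print(psychiatric)  # ['mesolimbic', 'mesocortical']
```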
Having fun yet? In psychiatric terms, the mesolimbic and mesocortical pathways are by far the most important. They have a lot to do with how we behave, and who we are. Now go eat some turkey and be happy.
Tuesday, November 23, 2010
Depression and Diabetes, Together Again
Waaay back in June I wrote up Depression 1, which referenced a few of the studies linking depressive disorders to metabolic syndrome and diabetes. Today a new study came out from some of our favorite epidemiologists, Frank Hu and crew. Dr. Hu actually wrote an editorial questioning the wisdom of population-wide nutritional advice that resulted in replacing saturated fat with refined carbohydrates. However, he also signed off on that monumentally silly paper about "low carb" meat eaters versus low carb veggie-lovers that everyone made fun of a few months back.
Today's paper is "Bidirectional Association Between Depression and Type 2 Diabetes Mellitus in Women" from the Archives of Internal Medicine. The researchers followed 65,381 women in the Nurses' Health Study from 1996 to 2006. Clinical depression was defined by having diagnosed depression or using antidepressants, and depressed mood was defined as having a high score on the 5 item Mental Health Index questionnaire. The nurses self-reported having type 2 diabetes which was validated by medical record review.
During the 10 years of follow-up, there were 2844 new cases of type 2 diabetes. Those with poor mood had a higher chance of developing diabetes. A total of 7415 women developed clinical depression, and women who were on insulin to control type 2 diabetes had a risk ratio of 1.53 for developing clinical depression. Overall, the data showed that diabetes increased the risk of depression, and depression increased the risk of diabetes, and that more severe depression and more severe diabetes increased the risk of the other illness even more. Countless covariates did not explain the bidirectional connection.
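To put those cohort numbers in rough perspective, here is a minimal back-of-the-envelope sketch using only the figures quoted above. It treats the counts as crude proportions over the whole follow-up and ignores person-time and the covariate adjustments the authors actually used, so it's only meant to give a feel for the magnitudes.

```python
# Crude arithmetic with the figures quoted above (Nurses' Health Study,
# 1996-2006). This ignores person-time, censoring, and covariate
# adjustment, so it only gives a rough sense of the magnitudes involved.
cohort = 65_381          # women followed
new_diabetes = 2_844     # incident type 2 diabetes cases
new_depression = 7_415   # incident clinical depression cases

print(f"Crude 10-year diabetes incidence:   {new_diabetes / cohort:.1%}")    # ~4.3%
print(f"Crude 10-year depression incidence: {new_depression / cohort:.1%}")  # ~11.3%

# What a risk ratio of 1.53 (depression among women on insulin) would
# mean against that crude baseline, purely for illustration:
baseline = new_depression / cohort
print(f"Illustrative risk at RR 1.53: {baseline * 1.53:.1%}")  # ~17.4%
```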
On a population level, 23.5 million American adults have diabetes (about 10%, though about 23% of those >=60). About 6.7% (14.8 million) of the US adult population has depression in any given year, with a lifetime incidence in women of around 20%. A previous meta-analysis (1) found that the odds of depression in the diabetic group were twice those of the nondiabetic comparison group. In a couple of papers, diabetes risk was increased by 60% in patients who had diagnosed depression. The paper from Frank Hu's group this week made some effort to figure out what comes first - depression or diabetes? Depression (atypical depression, that is) can make you crave carbs and leave you unmotivated to exercise. Having diabetes is usually pretty stressful too, putting you at greater risk for depression, and "depression may result from the biochemical changes directly caused by diabetes."
That last quote is perilously close to my theory, that metabolic syndrome and leptin/insulin derangements probably underlie both disorders. That's more "big picture" evolutionary medicine style thinking than I'm used to seeing in the discussions of these papers, but don't get too hopeful - it was only touched on in that one sentence. I discussed the underlying mechanisms of depression and metabolic syndrome and how they are connected in Stress is Metabolic Syndrome and Chronic Stress is Chronic Illness. These aren't really radical ideas. Oh well.
Monday, November 22, 2010
Dopamine Primer
Hi! Tonight a quick basic post for some necessary background. I will try to make it as painless as possible. But we need to examine dopamine more closely. Why? Well, dopamine may well be the secret to what makes us human. Meaning awfully bright, able to plan ahead, and able to resist impulses when necessary.
What is dopamine? It's a neurotransmitter. It controls communication in the brain - it's a chemical that can tell a neuron to fire off a signal or not, and it modulates those signals. Dopamine is ancient - found in lizard brains and every other animal along the evolutionary tree up to Homo sapiens. But humans have a great deal of dopamine, and over many generations we have evolved to have more and more.
Control of dopamine and where it ends up in the brain isn't just determined by straight-up Mendelian genetics. As I discussed in this post, our mothers' neurochemical environment had a lot to do with how our dopamine machinery migrates and works in our brains.
Another special thing about humans is our bipedalism. Being upright while mom is pregnant exposes our fetal brains to different vestibular environments than other primates, so the theory is this elevated the dopamine levels in the left hemisphere of most people's brains. I know. Go with it for a minute. It's just a theory.
Humans also eat a lot of meat and fish compared to other primates - meat and fish give us more dopamine precursors. More dopamine is also associated with greater competitiveness, aggression, and impulse control - one could see how that particular combination of traits would be selected for over human evolution.
Serotonin, another neurotransmitter, is our oldest one and the original antioxidant - but dopamine is what made humans so successful.
Now the biochemistry. Dopamine is a type of neurotransmitter called a catecholamine. Catecholamines have, not surprisingly, a catechol chemical group attached to an amine.
How do we get dopamine? We make it from what we eat. The precursor amino acid from the protein we eat is called tyrosine. Tyrosine becomes dopa via the enzyme tyrosine hydroxylase, and dopa becomes dopamine via the actions of dopa decarboxylase. (One more chemical reaction can turn dopamine into its best buddy neurotransmitter, norepinephrine, but more on that later.) As is the case with serotonin and its precursor tryptophan, tyrosine can cross the blood-brain barrier, but dopamine itself cannot. That means that the dopamine our brain needs must be manufactured by the dopamine machinery, from precursors, within the brain.
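To lay the synthesis chain out at a glance, here's a minimal sketch of the pathway described above as a list of (precursor, enzyme, product) steps. The enzyme for the last step, dopamine beta-hydroxylase, isn't named in the text; I've added it for completeness.

```python
# The catecholamine synthesis chain described above, as ordered
# (precursor, enzyme, product) steps. The first two enzymes are named in
# the text; dopamine beta-hydroxylase is added here for completeness.
SYNTHESIS = [
    ("tyrosine", "tyrosine hydroxylase", "dopa"),
    ("dopa", "dopa decarboxylase", "dopamine"),
    ("dopamine", "dopamine beta-hydroxylase", "norepinephrine"),
]

for precursor, enzyme, product in SYNTHESIS:
    print(f"{precursor} --[{enzyme}]--> {product}")

# Per the text: tyrosine crosses the blood-brain barrier, dopamine does not,
# so the brain's dopamine has to be made locally from the precursor.
```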
Still with me? What happens without dopamine, or with screwy or inefficient dopamine machinery? Well, in development this lack can cause mental retardation, as in the rare genetic disease PKU and in cretinism (a type of mental retardation caused by iodine deficiency). Dopamine problems are implicated in ADHD, Alzheimer's, Parkinson's, depression, bipolar disorders, and schizophrenia. Having too much dopamine in the wrong place can make you psychotic. Illicit drugs that dump loads of dopamine (or strongly inhibit its reuptake, which amounts to the same thing) include cocaine and methamphetamines - which is why high amounts of dopamine can cause euphoria, aggression, and intense sexual feelings.
We need dopamine in the right place at the right time in the right amounts. When it all comes together, we are the awesomest ape around. When it doesn't, problems ensue (not surprisingly). Dopamine is linked to everything interesting about metabolism, evolution, and the brain. There's a feast of scientific info out there. Hopefully we can make some sense of it.
Information from this post is taken mostly from The Dopaminergic Mind in Human Evolution and History by Fred Previc.
Sunday, November 21, 2010
Cleaning Up
In lieu of a new and interesting post, something possessed me to organize the place this morning. So up on the right there I now have some more advanced disclaimers and a brand new site map. It's not finished yet (right now I only have the first three months up - update: I'm current now - might be some fun reading for those of you newer to the site) and not pretty (a graphic designer I am not, and Blogger can be fairly unwieldy with the font sizes and formatting).
My youngest child is amusing herself by trying to eat her hand. Good thing she doesn't have that many teeth. On the positive side, she learned how to say "vitamin" today! I better stop now as I need a certain aura of mystery to maintain my shamanistic role as a psychiatrist. Though it is likely the last vestiges of my dignity were lost when I joined Twitter.
A couple of music links (right click to open in a new tab and you can enjoy them while you peruse the new map!)
The Limousines - "Internet Killed the Video Star" - The kids are disco dancing, they're tired of Rock and Roll. I try to tell them hey that drum machine ain't got no soul.
If you are of a more classical bent - an Aaron Copeland* classic, "Fanfare for the Common Man." Copland has a unique sound due to what he left out - the major 3rd.
* Thanks music Doc - It's Copland. Not sure why but spelling was never my strong suit.
One more for Thanksgiving week coming up - Vampire Weekend, "Holiday."
Friday, November 19, 2010
The Wild and Compelling Theory that Alzheimer's Dementia is Caused by Infectious Disease
As you recall, Alzheimer's Disease is a slowly progressive illness of neuronal degeneration. The cardinal symptoms are cognitive impairment and memory loss, but the condition eventually leads to death. I've dedicated many posts to Alzheimer's, which is a reflection of the amount of research out there. And for today's post, I'm going to begin with a nice review article from 2008, and over another post (or several) try to finish out with some more specific looks at the various papers.
Even if you don't find Alzheimer's that compelling, if infectious agents contribute to its pathology, then you have to open your mind to the idea that many neurodegenerative processes could be due to (or accelerated by) infection. Neurodegenerative diseases include many neurological illnesses, but also depression, bipolar disorder, schizophrenia, autism...
What is the argument against infection as an ongoing contributing factor? Well, where's the bug? Do a spinal tap - does the fluid grow any bacteria in culture? Are there white blood cells (a sign of active infection)? Is there an elevated level of protein (a sign of viral infection)? Over the last hundred years we've become pretty good at finding bugs. It's hard to imagine them hiding from us, even in the protected environment of the human skull. So if you bring up the idea of infections causing Alzheimer's disease to a physician friend and he or she scoffs at you - that's why. Where's the bug?
With that in mind, let's plunge forward into the review by Urosevic and Martins from the Journal of Alzheimer's Disease, "Infection and Alzheimer's Disease: The Apoe epsilon4 Connection and Lipid Metabolism."
The whole theory breaks down like so - there's a continuous, chronic infection supplying persistent live microorganisms, and their toxic products stimulate the host's (that's you) inflammation. The pathogen itself damages the neurons, and the brain's inflammatory response also damages the neurons. Remember that ApoE4 is the Alzheimer's-vulnerable variant of apolipoprotein E, the molecular key on the lipoprotein that invites cholesterol and triglycerides into the brain. As an added bonus, any infectious disease theory of Alzheimer's also has to explain why having ApoE4 makes you more likely to get Alzheimer's.
What infectious agents are we talking? Some viruses immediately come to mind, specifically herpes viruses. These little guys are exceedingly common and come in lots of different flavors, and are well known to hide out in nerve cells for the duration of the host's life (an easy example of "There's the bug."). HSV-1, associated with cold sores, infects people early in life and hangs out in the trigeminal ganglia (the nerve root of the trigeminal nerve that innervates a good part of the face). Some people get cold sores, some people don't, but those who do are more likely to be ApoE4 carriers. People infected with HSV-1 are also more likely to develop Alzheimer's. ApoE4 mice were more likely to carry invasive HSV-1 and have brain colonization of the virus.
Other viruses implicated include human herpes virus 6 (cause of roseola, a common childhood illness of high fever followed by a characteristic rash), HIV, hepatitis C, and cytomegalovirus (a cause of mononucleosis-like illness and fatigue symptoms). It is well known that HIV causes a form of AIDS dementia (which happens to be more common in carriers of ApoE4), so it would make sense that other common viruses that infiltrate the neurons might lead to other types of dementia.
All the common inflammation players (TNFalpha, IL-6, nitric oxide synthase) are involved in fighting off viral infections. We know these players have a role in Alzheimer's pathology and in depression and bipolar disorder. Interestingly, as we get older, our immune response becomes less aggressive, and it is perhaps then that the infectious agents hold sway, leading to Alzheimer's pathology. Other inflammation-mediated brain diseases occur at different developmental stages - late adolescence and early adulthood for schizophrenia, and infancy for autism.
There are also suspected bacterial causes. The "spirochetes" are a type of sneaky bacteria that are known to infect nerves (as in syphilis and Lyme disease). Some spirochetes that cause gum disease are found in the mouths of both Alzheimer's patients and healthy folks, but in the Alzheimer's patients, they are found in the brain more often than in healthy folks. Certain spirochetes have been found in the amyloid plaques in the brains of patients with Alzheimer's (once again - there's the bug).
Chlamydia pneumoniae is an intracellular bacterial pathogen also implicated in Alzheimer's dementia. Not surprisingly, it is better known as a common cause of pneumonia and other acute respiratory infections, but infected immune cells could presumably carry Chlamydia pneumoniae from the upper respiratory tract to the brain if the blood-brain barrier is compromised in some fashion. Chlamydia pneumoniae has been injected into mouse brains and caused amyloid deposits, which anyone will agree is suspicious behavior. Once again, ApoE4 comes up, as ApoE4 seems to allow for greater bacterial load and numbers of infected cells.
ApoE4 is not only a bad guy when it comes to Alzheimer's. Carriers are at greater risk of atherosclerosis, stroke, and poor recovery from head injury. And ApoE's role as a molecular marker and director of where lipoproteins go is key. Lipoproteins are how fat and antioxidants are carried throughout the body. It is probably not a coincidence that the major conditions leading to weight loss without any particular effort are infectious disease and Alzheimer's (along with cancer and melancholic depression). ApoE4 in particular is associated with poor clearance and recycling of lipoprotein particles. It's just not an efficient key. If everything is going along just fine, maybe we don't need an efficient key. Add stress or infection, though, and you have a greater need for metabolic efficiency. If you fall behind, garbage builds up.
The ability of a host (you, or me) to handle an infection depends on genetic, dietary, and other environmental factors such as age, stress, and immune status. Viruses and bacteria are sneaky - I daresay sneakier even than Homo sapiens, and the closeted nerve cells are a perfect place for an unnoticed infection to simmer for decades. Microorganisms can continuously release toxins, leading to chronic inflammation and damage. Treating a chronic infection, if it is found, could possibly be a useful way to fight a number of neurodegenerative diseases, even Alzheimer's.
Thursday, November 18, 2010
Thursday Already?
It has been a nutty week. And right now I'm cooking steak, and the 18-month-old won't let me put her down. But a few things have caught my attention:
Epilepsy's Big, Fat Miracle
A New York Times article about a parent with a kid on a ketogenic diet for epilepsy. Makes ketogenic diets sound completely bonkers, yet effective. I suppose that is the story of my blog.
The 32-year relationship between cholesterol and dementia from midlife to late life
Okay, get this. In animal and cell culture studies, high cholesterol is associated with amyloid beta deposition. YET when one reviews the human studies, the relationship between cholesterol levels and amyloid deposition becomes far more murky (declining cholesterol levels as we age are a risk factor). Also, high dietary cholesterol in RABBITS and genetically modified mice leads to greater amyloid pathology relative to controls!!! Should I eat the top round steak being simmered in grassfed ghee right now? Should I?
Well, in women, high cholesterol measured in 1968-69 was not associated with risk of dementia up to 32 years later. Also, a decrease in cholesterol levels over the 32-year follow-up period was associated with an increased risk of dementia.
Here's a money quote from the study: "Thus, the unintended decreases in cholesterol levels (e.g. not via medications or cholesterol-lowering diet) greater than expected due to aging may be more indicative of dementia risk than midlife cholesterol levels and may reflect underlying dementia processes. This pattern is observed for other dementia risk factors, such as BMI and blood pressure...consequences of the dementia prodrome such as apathy or reduced olfactory function may lead to decreased energy intake, which may also affect blood cholesterol levels."
Or, just maybe, cholesterol levels are a biomarker of some other process, so following cholesterol levels alone leads to confusing and contradictory information regarding dementia risk - and a cholesterol level that is low in absolute terms is usually bad news when it comes to the (human) brain.
It's exhausting, really, slaying the remnants of the lipid hypothesis in 2010.
Dear Conventional Wisdom Nutritional Information Purveyors: I know that allowing fat into the diet makes a big juicy steak that much less naughty. But you don't have to be naughty to have fun!! You can just enjoy a big juicy steak.
Seriously.
Monday, November 15, 2010
Selenium and Depression
The good news about selenium and the brain is that one can become familiar with the literature with full institutional access to PubMed and an hour or so of reading time. The bad news is that the mechanisms of selenium in the brain are rather Mysterious, so all we have are a few papers, some micronutrient supplementation trials, and some speculation.
All right. Let's talk pregnancy and depression for a minute. Little-known fact: slightly more women are depressed during pregnancy than after it (1). And if you combine ante- and post-natal depression statistics, this is what you get for the moms and kiddos: poor maternal self-care, increase in alcohol and drug use during pregnancy, decrease in seeking medical care during pregnancy, more pre-eclampsia, birth difficulties, preterm delivery, reduced breastfeeding, lower APGAR scores, poor sleep, failure to thrive, developmental delays, greater risk of illness in the baby, more behavioral problems, and, at 16 years, offspring of depressed mothers are almost five times more likely to suffer depression themselves.
There are any number of social and medical factors that are linked to perinatal depression, but let's focus a bit on the nutritional ones - links have been found with folate status, vitamin B12, calcium, iron, selenium, zinc, and omega 3s. Kaplan and colleagues, in a must-see literature review, found potential beneficial effects from B vitamins, vitamins C, D, and E, calcium, chromium, iron, magnesium, zinc, selenium, and choline on mood symptoms. (Real Food = Best Fetal Dinner). A very recent study (to which I, sadly, do not have full access) showed a significantly decreased Edinburgh Postnatal Depression Scale score (that's good) in pregnant women randomized to receive 100 mcg selenium daily from the first trimester until delivery.
Back to selenium - as you recall, it is a vital component of the selenoprotein glutathione peroxidase and is required for the synthesis and metabolism of thyroid hormones. And way back in 1991, Benton and Cook did a randomized controlled crossover trial of 100 mcg of selenium vs placebo in 50 people for 5 weeks, followed by a 6 month washout, then the crossover arm of the trial. Selenium supplementation was associated with increases in self-reported mood. This same paper tells us that when push comes to shove and selenium is deficient, the brain is the last place that selenium levels drop, suggesting that in the brain, selenium is Important. More recently, Gosney et al reviewed the effects of micronutrient supplementation on mood in nursing home residents, finding that no residents started out with insufficient serum levels of selenium, yet 8 weeks of 60 mcg selenium supplementation (included in a multivitamin/multimineral with 150 mcg iodine) was directly correlated with decreases in depression scores and increases in serum levels. The supplementation of these elderly people with selenium resulted in reduced serum T4 and increased serum T3, suggesting that the additional selenium helped the rather boring T4 become the metabolically active T3 and kick some serious sluggish metabolic and depression expletive deleted here. (Any of you with hypothyroidism on synthroid (T4) get a recommendation from your doctor to supplement iodine or selenium? Hmmm.)
In other studies, serum selenium level was associated with cognition in the elderly. In a 9-year follow-up of Alzheimer's patients, cognitive decline was associated with dropping selenium levels.
There's hardly enough data even to speculate, but I'll give it a whirl. Selenium is more like magnesium than zinc. I think most non-alcoholic, non-anorexic meat-eaters not on thiazide diuretics probably have enough zinc on board, though stress and yellow number five might make you waste it a bit faster than normal. But magnesium is low in pretty much everyone, as is selenium (if you aren't a coal miner or a fish in selenium-rich fertilizer run-off lakes). Selenium deficiency will masquerade as a somewhat subclinical (or clinical) hypothyroidism, with depression, fatigue, and grumpiness along for the ride. Replete the selenium and jazz up the T3. Happiness to follow. Not sure what that has to do with the brain holding onto selenium, but maybe that's where the glutathione comes in.
But I have a lot more reading to do before I jump into the thyroid. In the meantime - that multimineral ain't such a bad idea. Or seaweed and Brazil nuts (though Barkeater was right! According to the Internet, Brazil nuts have 1000 times more radium than other foods - not that much is supposedly retained in the body??) Or organ meats. Good fuel for everyone (6 months and older).
Sunday, November 14, 2010
Selenium!
Some of my most popular posts are about the minerals (exclamation point) and I'm guessing this one will be no exception. My least popular/commented upon posts are about ADHD. Not sure why. ADHD is way more common than say, schizophrenia. But I will give my readers what they want! Even though I don't get paid for the blog. Something is wrong with my business model here. I need to be more like those paleo entrepreneur geniuses with 4 hour work weeks and yachts.
So - selenium. This is a micronutrient found in the soil and in marine bivalves, and fixed by plants into more organically bioactive forms. Levels are extremely high in certain parts of China, Turkestan, and the Western United States. Levels are quite low in Finland, Scotland, New Zealand, and other parts of China, so that livestock must be supplemented or they will fall ill (1). According to some gardening message boards, soils here in New England aren't great, and Texas, where I grew up, is rather middling. And according to the various selenium sources, it is somewhat difficult to figure out how much you actually get from the soil, as areas vary widely and some forms of selenium are more bioavailable than others.
What does selenium do? Well, selenium is key to one of the body's master antioxidants, glutathione peroxidase. This complex keeps the delicate polyunsaturated fatty acids in our cell membranes from getting oxidized (rancid).
Selenium deficiency has been described in China as Keshan Disease, a type of heart problem (2). It is also thought to be a factor in gastrointestinal cancer, liver cancer, and prostate cancer. In addition, selenium in concert with iodine seems to be important to thyroid hormone production (specifically in the conversion of T4 to T3). As I've mentioned before, selenium deficiency seems to be one of the major dangers of a carelessly designed ketogenic diet, leading to death via cardiomyopathy and arrhythmia in children. Selenium deficiency was also deemed the cause of death (via cardiomyopathy) of adult patients on selenium-deficient TPN (total parenteral nutrition).
Selenium is a trace mineral. Appropriate amounts for humans are measured in micrograms rather than milligrams. People have died (from hypotension and cardiac muscle depression) via the ingestion of gram amounts of selenium, and toxicity occurs at milligram amounts. Sources indicate that the first symptoms of toxicity are hair and nail brittleness and a garlic smell on the breath. The cases of death I read about involved the ingestion of "gun bluing" (2% selenious acid), so don't drink gun bluing, as selenium poisoning sounds like one of the more horrific ways to die.
The bioavailability of selenium is best in plant-derived foods (grains, actually) and nuts, while the selenium in animal sources, marine sources, and mineral waters isn't as bioavailable. Fruits and vegetables, however, don't seem to have that much in the US, so if you avoid grains, the best sources are actually Brazil nuts (one ounce will get you over 500 micrograms, so take it easy and have just a few at a time). Organ meats are also high in selenium. 3 oz or so of tuna, beef, cod, or chicken breast will get you around 30 mcg (though it is less bioavailable), whereas a typical grain meal (pasta, oatmeal) will net you around 10-15 mcg. Though phytic acids in grains do interfere with mineral absorption, studies of women eating wheat from selenium-rich soils showed a nice elevation in plasma levels of selenium.
The adult US RDA is 55 micrograms (though a bit higher for pregnant or breastfeeding women) (3). The doctors Jaminet recommend 200 mcg daily, and the US upper tolerable limit is 400 mcg (the level above which those nails start to get brittle, you get irritable and tired, and that breath begins to be garlicky). On the other hand, Chinese studies of daily doses up to 1500 micrograms showed no adverse effects, nor did another study of 600 mcg daily.
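Since all those micrograms are easy to mix up, here's a quick back-of-the-envelope sketch in Python using the approximate per-serving numbers quoted above (rough illustrations, not verified lab values), mostly to show how fast a handful of Brazil nuts sails past the upper limit:

```python
# A rough daily selenium tally using the approximate per-serving values
# quoted above. These numbers are illustrative estimates, not lab data.
RDA_MCG = 55           # adult US RDA
UPPER_LIMIT_MCG = 400  # US tolerable upper intake level

FOODS_MCG = {
    "1 oz Brazil nuts (roughly 6 nuts)": 500,
    "3 oz tuna or beef": 30,
    "bowl of oatmeal": 12,
}

def selenium_report(meals):
    """Sum a day's selenium (in mcg) and compare it to the RDA and upper limit."""
    total = sum(FOODS_MCG[meal] for meal in meals)
    print(f"Total: {total} mcg (RDA {RDA_MCG} mcg, upper limit {UPPER_LIMIT_MCG} mcg)")
    if total > UPPER_LIMIT_MCG:
        print("Over the upper limit - ease up on the Brazil nuts.")
    return total

selenium_report(["3 oz tuna or beef", "bowl of oatmeal"])                    # ~42 mcg
selenium_report(["1 oz Brazil nuts (roughly 6 nuts)", "3 oz tuna or beef"])  # ~530 mcg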
In my next post I'll review the studies of selenium and mental health. In the meantime, eat a few Brazil nuts and cool it on the supplements if your breath gets garlicky or your nails start to look like this. (Actually, go see a doctor pronto if your hair falls out and your nails look like that.)
(another iPad post so forgive any bad spelling, and I'll add a few more links later!)
Saturday, November 13, 2010
Acne and Suicide
There have been many reports on acne, depression, and suicide over the years, mostly because a primary treatment for severe acne, isotretinoin (accutane), is known to increase the risk of depression and suicide. However, a new paper in the British Medical Journal, Association of suicide attempts with acne and treatment with isotretinoin: retrospective Swedish cohort study, adds a new twist.
The researchers followed 5756 people in Sweden from 1980-2001, before, during, and after accutane treatment for severe acne. Sure enough, admissions to the hospital for suicide attempts were increased in patients during and for up to six months after completion of accutane treatment. Three years after treatment, and for the next 12 years, attempted suicide rates were back to normal levels, so the treatment didn't seem to have permanent effects. The effect size was such that it would take 2300 people treated with accutane for 6 months to result in one additional first suicide attempt. In addition, the effect of accutane on suicide attempts seemed to be stronger in women than in men, and the risk was higher among women who received two or more treatment courses.
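To put that effect size in perspective (my arithmetic, not the paper's), a number needed to harm of 2300 works out to a very small absolute risk increase:

```python
# Converting the study's number needed to harm (NNH) into absolute terms.
# The NNH of 2300 comes from the paper; the rest is simple arithmetic.
nnh = 2300
absolute_risk_increase = 1 / nnh          # extra first attempts per person treated for 6 months
per_ten_thousand = absolute_risk_increase * 10_000

print(f"Absolute risk increase: {absolute_risk_increase * 100:.3f}%")
print(f"About {per_ten_thousand:.0f} extra first attempts per 10,000 people treated for 6 months")
# Absolute risk increase: 0.043%
# About 4 extra first attempts per 10,000 people treated for 6 months
```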
Here's the twist. Attempted suicide rates were also higher in the population in the months prior to treatment with accutane. And some (admittedly small) studies have shown that accutane leads to an improvement in anxiety and depression (1)(2) due to a clearing up of the disfiguring acne. And one wonders about the increased risk in women - women who undergo more than one course of treatment would have worse acne, and it is plausible that, as a cohort, young women would be more concerned about disfiguring acne than young men. And of course, we know that acne is a disease of civilization (the Kitavans and Masai are unafflicted, for example) and obviously associated with inflammation, just like psoriasis and obesity and those other diseases all independently associated with depression and anxiety. I have heard of a case of someone with years of unsuccessful standard medical acne treatment (though not accutane, as that is not routinely used in young women in the US due to the risk of birth defects) who responded well to Cordain's dietary cure and also had increased energy on the paleo-inspired diet.
On the other hand, there are very plausible biological brain mechanisms for accutane causing depression, and accutane has caused depression-like behaviors in studies of young adult mice (3). Various mechanisms include disrupting hippocampal neurons and their communication (4) and interfering with the serotonin machinery (5 - this link is rather interesting, as accutane seems to be rather the opposite of an SSRI in mice, increasing serotonin reuptake and increasing the postsynaptic serotonin receptors).
In addition, cillikat brought up some links between accutane, cholesterol, and vitamin D in her recent comment on this post. Though she mentioned that accutane lowered cholesterol, I found it to be a cause of increased cholesterol, LDL, and triglycerides in several studies (here is the first one, from 1983, I believe). But there is a link between accutane treatment and vitamin D. Accutane is basically a super-strong variety of vitamin A, and vitamins A and D work in concert as their receptor complexes attach together to do a variety of actions in the nucleus (6). Exceedingly high vitamin A intake could result in there being a ton of vitamin A receptor complexes swimming around and very few vitamin D complexes to join the party. This issue would be exacerbated by the typical advice to accutane users to stay out of the sun. Studies combining accutane and active vitamin D3 have used the synergistic receptor complex effect to try to kill cancer cells more effectively (7).
Vitamin D deficiency has links to depression. It is plausible that high levels of vitamin A from accutane would interfere with vitamin D doing its job and also contribute to the depression side effect, but to my knowledge this is all speculation.
But back to the original paper! What to do about acne? It shouldn't be ignored or dismissed as a non-serious condition, but treatment has to be carefully considered, and people suffering from acne carefully monitored for depression and suicidal thoughts. To me the obvious first choice for treatment would be dietary - however I am not in agreement with dermatologists (about a lot of things, actually), and it is a certainty that they know more about the skin than I do (though likely much less about traditional diets).
Thursday, November 11, 2010
ADHD and Obesity
Make a guess - are people with ADHD more likely to be obese or skinny?
I would have thought skinny, just on first hunch. Calories in = calories out, right? And a lot of folks with ADHD are very fidgety! Never sit still.
I was wrong. There are actually a number of studies linking ADHD to childhood and adulthood obesity. Another very large study came out a week or so ago in the International Journal of Obesity. I don't have that much time right now, so I won't comment on the methods as much as usual; suffice it to say that 12,000 or so people were followed from adolescence into adulthood, and records were kept of ADHD hyperactive and inattentive symptoms, along with waist circumference, BMI, blood pressure, physical activity, smoking, and alcohol. No one tracked any measure of diet, as I can't see how that would be pertinent to an obesity study...
The results - kids with ADHD symptoms in childhood had a 41% chance of being obese as adults, whereas kids without ADHD symptoms had only a 34% chance (still an appalling number - these are young adults, after all, with a mean age at the last wave of the study of 28.9 years). And the more ADHD symptoms you had, the more likely you were to be obese - those with three or more inattentive or hyperactive symptoms were 50% more likely to be obese as adults.
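For the number-minded, here is how those two percentages compare in absolute and relative terms (a small sketch using only the figures quoted above):

```python
# Comparing the adult obesity rates reported in the study (figures quoted above).
obese_with_adhd_symptoms = 0.41
obese_without_adhd_symptoms = 0.34

absolute_difference = obese_with_adhd_symptoms - obese_without_adhd_symptoms
relative_risk = obese_with_adhd_symptoms / obese_without_adhd_symptoms

print(f"Absolute difference: {absolute_difference * 100:.0f} percentage points")
print(f"Relative risk: {relative_risk:.2f}x")
# Absolute difference: 7 percentage points
# Relative risk: 1.21x
```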
The discussion is interesting. If you made it through my post on antidepressants and weight, you might have noted that antidepressants that jacked up the dopamine response (especially wellbutrin) seemed to make you skinnier over time. Well, ADHD is associated with a decrease in the ability of the brain to handle dopamine. Here's a pertinent quote from the study:
"Genetic association studies have found that ADHD and obesity are both associated with genes regulating dopamine availability.50 Furthermore, in two separate studies using Positron emission tomography with [11C]raclopride, Volkow et al.52 have shown that individuals with ADHD50 and those who are obese both show lower than normal dopamine (D2) receptor availability. This lower dopamine receptor availability could reflect the common dispositions in both ADHD and obesity."
I have some more theories. As we know, ADHD is associated with food additives and a genetic vulnerability to highly processed diets. It is possible that many (certainly not all!! ) kids who have symptoms of ADHD also are in that segment of the population who don't give a crap about their diets. Oh yeah, there's been a study about that. And I'm guessing, Twinkies aside, that highly processed food diets are associated with more obesity. A nonscientific perusal of grocery carts might give you that information.
And then, of course, there is the fact that having ADHD is stressful, and one might end up with depression or anxiety, and then be more vulnerable to obesity via the inflammation pathway.
The answer is always the same (low stress primal diet and lifestyle) because the causes all boil down to the same thing. Eat what you are designed to eat, exercise, and chill out when you get the chance. Enjoy family and friends. Live and love. When things are already deranged it may not be a cure, but in most cases it sure will help.
Primal Muse Podcast!
Very excited to announce Jamie Scott's podcast over on Carbohydrates Can Kill. I think the turning point happened when he realized that it would be incredibly difficult to burn off 5g per kg of recommended athlete carbohydrate intake per day and still keep a day job. Best, I think, to eat like a human, not a hummingbird, when it comes down to it. Great interview!
Tuesday, November 9, 2010
Antidepressants and Weight Gain or Loss
My focus in this blog is not on the standard medical treatments. You can come see me in clinic or go to your own psychiatrist for that. However, an exploration of antidepressants and weight gain can give us some insight into the processes of obesity and the brain, so I will dive on in. It is a more complicated subject than you might expect, as the mechanisms aren't always clear, and mood disorders themselves can cause weight gain or loss. However, the Journal of Clinical Psychiatry came out with a nice review in October, Antidepressants and Body Weight: A Comprehensive Review and Meta-Analysis.
I don't always care for the Journal of Clinical Psychiatry, mostly because they put out "supplements" bound exactly like the main journal which aren't necessarily peer-reviewed and can be high-end academic commercials for drug companies (you will notice that free registration will get you access to the supplements on their website, but you need to pay to read the journal!). If you aren't savvy, you might get tricked. Their supplements usually end up in my trash, but the main journal has some peer reviewed good stuff.
In the antidepressant/body weight review, the researchers screened over 3000 reports, finally settling on 116 that met certain eligibility requirements (published in a respectable journal, used therapeutic doses, weighed patients before and at the end of trials of at least 4 weeks, etc.). Then the researchers configured the data so that it could be more easily compared across the different trials (not always an easy or uncontroversial task), and came up with acute weight change data (trials of 4-12 weeks) and longer-term weight change trials (>4 months).
The results? In acute treatment, the older class of tricyclic antidepressants (with one exception) and mirtazapine were associated with more weight gain, while all SSRIs and bupropion (wellbutrin) were associated with weight loss. Placebo was associated with slight weight gain. In long term trials, the only one linked to significant weight loss was bupropion. The SSRIs except paxil were weight neutral (though citalopram had widely varying results), and paxil, mirtazapine, and amitriptyline (elavil) were associated with weight gain. Placebo slightly favored weight gain but was basically weight neutral also.
So what is going on? Well, paxil, mirtazapine, and amitriptyline all have something in common. They all have affinity for the histamine receptor and are anticholinergic. Alpha receptor blockers are also associated with weight gain, and mirtazapine and amitriptyline have that in spades. Drugs that cause weight loss have more affinity for dopamine and enhance serotonin function. Several drugs have a bit of everything (imipramine, for example, is anticholinergic, but also pro-dopamine, so it seems to be weight neutral overall).
Why does histamine promote weight gain? Well, the H1 receptor seems to activate AMP kinase (AMPK) in the hypothalamus (1). AMPK reverses the actions of leptin, the appetite-suppressing hormone, and AMPK may be activated by orexin, the appetite-inducing hormone. Clozaril, an antipsychotic medication known for its ability to cause huge weight gain, does not cause weight gain in mice that lack the H1 receptor. To make things even more complicated, another antipsychotic, zyprexa, also causes a lot of weight gain through the same mechanism. There's a type of zyprexa that dissolves in the mouth called zydis - same exact drug, only a lot of it may bypass the gut and simply be absorbed into the bloodstream in the mouth - and zydis doesn't seem to cause weight gain. This suggests that it is an interaction between these drugs and the gut that may be the real issue here. That interaction is poorly understood.
The SSRIs are interesting in that they seem to promote weight loss in the beginning, but (except for the strongly fat-inducing anticholinergic paxil) are weight neutral in the long term. Remember, when we are low in serotonin, we crave carbs, probably because a high carb diet helps us bring more tryptophan, the serotonin precursor, into our brains for conversion to serotonin. At the beginning of treatment, SSRIs seem to increase serotonin, which will decrease appetite and decrease carbohydrate cravings (possibly via orexin). After a few weeks, however, the post-synaptic serotonin receptors get sucked back into the cell, more or less reducing the overall serotonin effect. The timing of the receptor down-regulation matches the timing of when the medicine usually starts to take effect for depression or anxiety (one piece of evidence that it is not serotonin per se, but another effect of the medicine that is actually antidepressant or anti-anxiety - perhaps because they are anti-inflammatory and dampen the kynurenine pathway?). All of these medicines also have histamine and mild anticholinergic effects to some extent, so the serotonin weight loss effect may counterbalance the histamine.
At the far end on the weight loss side is bupropion, or wellbutrin, which isn't really a serotonin drug at all. It maximizes norepinephrine and dopamine, and has almost no histamine or anticholinergic effect at all. Bupropion has actually been proposed as a treatment for obesity, though given that it went generic several years ago, it may never be FDA approved for that indication. Bupropion isn't all giggles - it can cause anxiety, anger, and seizures.
I don't want to distract from the fact that if we live a healthy life, eat "real food," and take steps to reduce our stress, and exercise, we are less likely to get depressed in the first place. In addition, by avoiding the inflammation of depression and other mood disorders in combination with the SAD, we don't have to worry too much about weight gain, either. But, as I said at the beginning, understanding how these medicines affect weight gain and loss can give us some insights into how the brain (and possibly even the gut nervous system) works, and that is interesting. At least to me.
Saturday, November 6, 2010
Coenzyme Q10 and Parkinson's
In yesterday's post, I discussed some of the evidence that Parkinson's disease is possibly caused by defects in the energy efficiency of the mitochondria of susceptible (and poisoned) individuals. I don't mean to do a whole Parkinson's series or anything, but the post did bring up some interesting questions.
We already know that a ketogenic metabolism seems to increase energy efficiency in the brain. But what else might do the same? Well, researchers are already looking into some trials of a few compounds of interest - Coenzyme Q10 (also known as ubiquinone) and creatine. There are a few published animal studies, a few small clinical trials of CoQ10, and human trials are ongoing for both CoQ10 and creatine.
CoQ10 is an interesting vitamin/molecule. Yesterday I compared mitochondria (our cells' energy factories) to a ski resort. There are proton pumping complexes in the mitochondria that pump against a gradient, rather like the ski lift carrying skiers up the hill. Eventually the energy is used to transport electrons across membranes in the "electron transport chain" and ATP is created (skiers set free to fly down the hill). To beat the analogy to death, CoQ10 is a bit like the attendants at the top of the ski lift, making sure everyone gets off the lift okay, and guiding skiers between lifts. While small doses of CoQ10 didn't seem to help symptoms of Parkinson's, large doses (1200 mg) seemed mildly helpful, and even larger doses are being tried now.
Creatine is a compound made from amino acids, and basically helps the body make ATP more easily. According to the Wikipedia article, we make a lot of creatine from dietary amino acids, but about half of our creatine is taken directly from eating skeletal muscle, and the muscles of vegetarians are lower in creatine. In animal studies, supplementation with creatine (combined with CoQ10) was helpful for Parkinson's symptoms.
Anyone who reads "statin skeptic" literature knows that statins lower CoQ10 levels. A pubmed search reveals a lot of papers on the subject - questions arise as to whether statins cause heart failure due to CoQ10 depletion - the heart, like the brain, is energy-hungry, and CoQ10 depletion over time might damage the heart (the study I linked showed lower CoQ10 in statin users and worse heart failure in people with low CoQ10, but no link between statin use and worsening of heart failure...). There are also questions as to whether CoQ10 depletion contributes to ALS (Lou Gehrig's disease) and whether it causes the known (rare) statin side effect, diabetes.
So the obvious question is - would statins cause Parkinson's? This small study found that there didn't seem to be a link between Parkinson's progression and statin use, and this brief editorial notes that in population studies, higher LDL cholesterol levels are associated with lower risk of Parkinson's disease, but in the Rotterdam study, statin use seemed to have no correlation with Parkinson's disease risk, and other small studies showed the use of cholesterol-lowering drugs was associated with a decreased risk of Parkinson's. Turns out that serum cholesterol levels (and triglycerides) are the strongest determinant of serum CoQ10 levels - the reason being that CoQ10 rides along with the lipoprotein complexes in the body. Moderate alcohol use also seems to be associated with CoQ10 increases. And the best dietary sources of CoQ10? Meat, eggs, and certain vegetables such as broccoli. Dietary intake wasn't correlated that much with serum levels, though that could be because serum levels had more to do with how much cholesterol was floating around.
What can I make of this confusing mess? CoQ10 is a powerful antioxidant, one of whose jobs is likely to protect cholesterol and triglycerides as they are transported through the scary circulatory system. People with lots of inflammation will have high triglycerides, high cholesterol, and high CoQ10. The high CoQ10 is probably protective against Parkinson's disease (and perhaps a robust amount of cholesterol is too). Statins are also antioxidant and anti-inflammatory, so it is possible that those attributes make up for obliterating the cholesterol-making machinery and depleting the body's CoQ10, at least in the case of Parkinson's disease. The jury, though, is still out.
Friday, November 5, 2010
Brain Efficiency
I should preface this post with another post. If you have a minute and you don't mind, have a look at Your Brain on Ketones. The down low is that, for various reasons, a ketogenic diet (very low carb and high fat, or moderately low carb and high medium chain triglyceride, such as coconut oil) seems to allow our mitochondria (the cells' energy factories) to make energy more efficiently. This ability is less important in our muscles (unless you are an elite athlete), but in our brain, which uses a ton of energy and relies on energy-expensive ion gradients to function properly, efficiency is paramount. Never so much as when you are talking about a brain disorder, such as epilepsy, migraines, Alzheimer's, or, as in the case of the paper I'm referencing today, Parkinson's disease.
Parkinson's disease (PD) is another long-term degenerative brain disease, somewhat like Alzheimer's but with more prominent muscular symptoms and a different pathophysiology. In Parkinson's, the little cells that make dopamine, a major neurotransmitter, in the substantia nigra seem to slowly die off. Without dopamine fueling your brain you get tremor, dementia, muscular stiffness, depression, and eventually death. There are some genes known that predispose for Parkinson's, but most cases are idiopathic, meaning no one knows why the disease strikes a certain person and spares another.
Exposure to a certain pesticide, rotenone, and to welding does increase the risk of PD. Rotenone works by inhibiting proton pumping complex 1 in the mitochondria (okay, the biochemistry of the mitochondria is complicated but fun. The whole point of the mitochondria is to generate the gasoline of the body, ATP, and mitochondria do this by pumping protons up gradients via several complexes, "proton pumping complex 1", etc. Think of the proton pumping complexes as ski lifts carrying skiers up to the top of the hill, where they are set loose to glory in the thrill of gravity. So rotenone is a hater of snowy fun and shuts down the ski lift.) Another inhibitor of proton pumping complex 1 is MPTP, famous for being an adulterant in synthetic opiates and for causing immediate, irreversible Parkinson's disease by killing your dopamine-making neurons. Don't do drugs, kids.
Caffeine and tobacco use (both of which juice up energy efficiency) seem to reduce the risk of PD. So drink up that coffee! (Don't smoke.)
Our intrepid researchers took genetic data from 410 post-mortem brains from Parkinson's patients and "healthy" controls. They analyzed "6.8 million raw data points from nine genome-wide expression studies" (I *heart* geneticists), focusing on genes active within the substantia nigra. They found several gene sets that seemed to cluster in the PD folks, and many of these genes were part of the mitochondrial machinery called the electron transport chain. Part of the proton pumps. Part of the energy factories of the cells. These 95 energy-factory genes seemed to be "underexpressed" in Parkinson's sufferers. That means the substantia nigra was suffering an energy shortage, so the cells that make dopamine went kaput, out of gas.
The researchers found a second set of genes that seemed to be associated with Parkinson's. These genes affect glycolysis. (Que?) Remember, the brain can't run on ketones alone. Certain long nerve tendrils are too spindly to carry mitochondria. Those spindly bits need to run on pure glucose. Glucose becomes ATP directly via glycolysis.
Glycolysis turns glucose into pyruvate, yielding a net of two ATP along the way, and then pyruvate enters the mitochondria, where the citric acid cycle and the electron transport chain wring out far more ATP. Glycolysis is also part of how yeast and some anaerobic bacteria turn sugar into alcohol (called fermentation), so enjoy your biochemistry, preferably on the weekends.
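For the arithmetic-minded, the standard textbook bookkeeping (basic biochemistry, not something from this particular paper) is: glucose + 2 NAD+ + 2 ADP + 2 Pi -> 2 pyruvate + 2 NADH + 2 H+ + 2 ATP + 2 H2O. So glycolysis alone nets a measly 2 ATP per glucose (4 made, 2 invested up front), while fully burning that pyruvate in the mitochondria yields roughly 30 more. A neuron limping along on glycolysis alone is living on pocket change.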
Now we come to a third gene, PGC-1alpha. I know I've already exhausted you, but carry on! PGC-1alpha is a "master regulator of mitochondrial biogenesis and oxidative metabolism." It seems to control protein-folding and direct proteins where to go in the mitochondria. Underexpression of the PGC-1alpha gene was highly associated with Parkinson's Disease, and mitochondrial activity and ATP concentrations were severely decreased in the brain samples of Parkinson's patients.
All of these gene sets are associated with the energy efficiency of the brain cells, and were not only associated with Parkinson's, but also with Parkinson's precursor states, suggesting these are causative, not a result of whatever insult causes Parkinson's.
And then the researchers got very cute. They fabricated a virus to infect rat brains, a virus that causes overexpression of PGC-1alpha. Then they exposed the rats to rotenone, the pesticide that causes Parkinson's. With the extra PGC-1alpha, these rats seemed to be more resistant to Parkinson's than the average rat exposed to rotenone. Other researchers found that mice without any PGC-1alpha were more susceptible to the Parkinson's caused by MPTP.
And, finally, magnetic resonance spectroscopy of living Parkinson's patients shows that their brains seem to have more lactate on board than normal. Meaning their brains are struggling with the metabolism of glucose and are running on anaerobic pathways instead. When your brain seems to be running like fermentation bacteria, you have a big problem.*
It seems probable that the dopaminergic neurons of the substantia nigra need more ATP than most, so if your brain is inefficient, they suffer first. Therefore a variety of genetic susceptibilities in the electron transport chain, glycolysis, and basic energy creation in the brain presents as Parkinson's disease first and foremost.
I'll quote the article here: "If this hypothesis is valid, it would suggest that modulation of cellular energetics could be used to prevent or treat PD, and that monitoring cellular energetics could serve as a diagnostic tool."
The only way I know to modulate brain energetics is to avoid carbohydrates or drink down a lot of coconut oil. (You can only drink so much coffee, but I thought of some more ways to modulate energy use in the mitochondria, which will be the next post.)
Food for thought!
*There's lactate again. More on that in a future post too.
Thursday, November 4, 2010
Depression, Flu, and To Do
Why not a song to start? Right click and open in a new tab if you want to hear it - alternative rock and peppy! Pumped Up Kicks by Foster the People.
When I started this blog way back in June, I was worried about a lack of material. Paleo psychiatry? There's no data, I thought, or nothing solid enough worth writing about. Not a single controlled trial of any diet intervention (other than omega 3 pills or vitamins) that I'm aware of. It's no type II diabetes, where diet interventions are studied to the extreme. I figured I could throw out the basic info and write up a bit on the omega 3 fatty acid trials for depression and bipolar disorders, maybe speculate a bit on eating disorders, and then I'd be scraping for more. I'd likely move on to paleo psychology and talk about how preserving sociopathy in the gene pool keeps our species tough and ready to fight. Well. Here we are. Ninety-two blog entries later. New biology stuff arrives in my email and in the comments on a daily basis. My more immediate plans include wrapping up the Alzheimer's series with a post on some of the infectious-cause papers (though Alzheimer's stuff comes out every hour, so I doubt it will ever be wrapped up), digging into the thyroid, and more about mental illness and obesity/metabolic syndrome (huge topic), not to mention diving into sleep. In between all that, the papers come out, and keep coming. I'm like a kid in front of a fire hose.
Today another paper from the Journal of Affective Disorders: "Association of seropositivity for influenza and coronaviruses with history of mood disorders and suicide attempts." I like the Journal of Affective Disorders. It's edgy without being desperate. The Anna Sui of the psychiatry world.
The paper starts out with some depressing statistics - mood disorders are expected to be the second leading cause of global disease burden by 2030 (medical students - psychiatry is a promising field), and nearly 21 million adult Americans have a mood disorder. A mood disorder (such as major depression or bipolar disorder) is (obviously) a major risk factor for attempting suicide.
Guess when the "busy season" is for psychiatry. Not the summer - most everyone is happy in the summer. But when is everyone really depressed? If you guessed the winter holidays (northern hemisphere winter, that is), wrong again. We actually get the most emergency phone calls in September, October, the end of February, and March, when the amount of daylight is changing fastest from day to day. Fall is bad, but spring is much, much worse, and that's not something I would have predicted before residency. Beware the Ides of March.
Springtime is also the busy time for the flu. And seasonal suicide peaks overlap or closely follow seasonal peaks in epidemic influenza and upper respiratory viruses. (These peaks also happen during the yearly nadir of vitamin D levels, don't you know?). I've certainly noticed that my patients with depressive and other mood disorders tend to be down for a few weeks after recovering from a cold or flu. Apparently I'm not the only one who has noticed - back in 1892 Tuke wrote up a series of "post-influenza mania and depression" cases admitted to a London hospital.
So what did the researchers do? They took 257 subjects, mostly female, including 39 healthy controls. They were already involved in a study on environmental influences on exacerbation of mood disorders and suicidal behaviors. (It is perhaps not well known that all scientific research is in fact done on some thousand-odd college students and ex-hippies who are amenable to being recruited for study after study from the same medical center). These participants met criteria for major depressive disorder or bipolar disorder (or were controls), and answered questionnaires about their suicide history. Then they gave some blood, and the blood was examined for seropositivity for previous infection with several viruses, including influenza.
And guess what? People with a previous flu or coronavirus (a common cold virus) infection had a much higher chance of having major depressive disorder. We're talking p=0.004 and p<0.0001 for influenza A, influenza B, and coronavirus, so highly significant. Previous infection with influenza B was significantly associated with a previous suicide attempt as well (p=0.001), with an odds ratio of 2.53. Influenza B also seemed to increase the risk of having psychotic symptoms in mood disorder patients. Coronavirus and influenza A didn't seem to increase the risk of suicide or psychosis.
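To unpack what an odds ratio of 2.53 actually means, here's a tiny sketch. The cell counts below are invented purely to illustrate the arithmetic (the real numbers are in the paper), and the little helper function is mine, not the study's:

```python
# Odds ratio from a 2x2 table:
#                          suicide attempt   no attempt
#   influenza B positive         a                b
#   influenza B negative         c                d
# OR = (a/b) / (c/d) = (a*d) / (b*c)

def odds_ratio(a, b, c, d):
    """Odds of the outcome in the exposed group divided by the odds
    in the unexposed group."""
    return (a * d) / (b * c)

# Made-up counts chosen only so the arithmetic lands near the paper's 2.53:
print(odds_ratio(40, 60, 21, 80))  # ~2.54
```

In plain English: the odds of a past suicide attempt among the influenza B seropositive folks were about two and a half times the odds among the seronegative folks.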
Wow. Okay. You mean these "mental" diseases might be biological?!? The immune response to the flu and other viruses involves cell-mediated immunity and a flurry of cytokine activation. Turns out those same things are active in the biological cascade of depression. The same old interleukins and TNF-alpha are involved in everything. Cytokines tend to shunt tryptophan toward kynurenine rather than serotonin, and that is not so good for the mood, and even worse for suicide risk. Influenza and coronaviruses are also possibly active in the brain itself, but the evidence is a little sketchy.
I have to admit, during medical school and residency I never got my flu vaccine. I didn't remember ever getting the flu before, and I washed my hands plenty, so hey... once I had kids, I got my yearly shot. According to a presentation I attended back in April, 10-20% of adults will never get the flu. They are lucky, or especially immune. The other 80-90% of you - here's something to think about. And ratchet up that vitamin D. Can't hurt.
Wednesday, November 3, 2010
Omega 3 - Not the Cure for Alzheimer's (Again)
Another quick one - into my email this morning came this link:
Another study of Alzheimer's disease and omega 3 fatty acids. (That link goes to the JAMA "free" full-text article published this week, but I had to watch a Vytorin ad for a minute.)
400 some-odd elders with mild to moderate dementia were randomized to 2 grams of algal DHA daily or placebo for 18 months. They were followed with a bunch of neuropsych measures, and a subgroup was examined with serial spinal taps for CSF DHA (not surprisingly increased in the DHA group), and another subgroup was monitored via MRI for brain atrophy (no difference between treatment and control groups). So you can't say the researchers didn't give it their all.
Fish oil is cheap and sexy and biologically awesome, but it won't reverse the neuronal brain damage of full-blown dementia. Nor does it appear to slow it down (though some APOE4-negative DHA folks did better on some of the cognitive tests than the controls). I think it likely works far better (and makes more sense biologically) as a long-term preventative measure rather than as a treatment or cure. Fish eaters in epidemiological studies tend to eat fish for many years, or be in populations where eating fish is extremely common and lifelong, and dementia is less common than in non-fish eaters. Of course that's a correlation - but at least it is a biologically plausible one. Maybe we need a salmon clinical trial in lieu of an algal DHA one.
The adverse events were also interesting - how safe is fish oil with warfarin (a blood thinner)? These relatively high doses of DHA resulted in 3 people on warfarin having an INR (a measure of blood thinning) out of range once during the trial. One person on warfarin in the control group had an INR out of range. Since anyone on warfarin ought to have INRs measured regularly, a consistent fish oil dose should not pose a problem, I would think.
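For fun, here's how little information is in "3 versus 1 out-of-range INRs" - a quick Fisher's exact test, computed by hand. The paper's warfarin denominators aren't in front of me, so the 20-users-per-arm group sizes below are pure assumption, just for illustration:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact p-value for the 2x2 table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of every table with the same
    margins that is no more likely than the one observed."""
    row1, row2, col1 = a + b, c + d, a + c
    total = row1 + row2

    def p_table(k):  # P(top-left cell = k) under the null hypothesis
        return comb(row1, k) * comb(row2, col1 - k) / comb(total, col1)

    p_obs = p_table(a)
    lo, hi = max(0, col1 - row2), min(col1, row1)
    return sum(p_table(k) for k in range(lo, hi + 1) if p_table(k) <= p_obs + 1e-12)

# Hypothetical: 20 warfarin users per arm, 3 vs. 1 out-of-range INRs.
print(fisher_exact_two_sided(3, 17, 1, 19))  # ~0.6 - nowhere near significant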
If one is looking for a cure or significant improvement for Alzheimer's, I would bet more money on decreasing acute inflammation or improving energy efficiency in the brain via ketosis. This article was also linked in my email: New Drug may halt and even reverse the effects of Alzheimer's. The "new drug" is immune globulin with beta amyloid antibodies, administered IV. I have to find the actual article to take a gander, and I won't have time until later - so more updates then. It would be interesting if immune globulin helps Alzheimer's when the beta amyloid vaccine doesn't seem to do much at all (other than get rid of beta amyloid) - the dementia remains.
I don't know that there will be a longer term clinical trial of omega 3 fatty acids - such trials are expensive, and not sexy. We'll see.
Monday, November 1, 2010
ADHD and Mom's Serotonin Deficiency
A quickie post today on this study from October's Archives of General Psychiatry, "Attention-Deficit/Hyperactivity Disorder Symptoms in Offspring of Mothers With Impaired Serotonin Production."
If you haven't noticed, I like to be somewhat hardcore about the biology of what is going on. I use psychology all the time in my work, but psychology takes so much explaining that I don't write about it much. Also, there's a whole vocabulary of psychology that has to be learned before it is easy to explain. Though I do end up explaining psychology to people all the time, in appointments. But usually I have a frame of reference (something happening in a patient's life) to hang the psychology on. I don't have anything to hang on you. And psychology by nature is speculative, whereas biology - speculative, but less so. We've got DNA, after all, and neurotransmitters, and electrons and ions whizzing down a neuron.
Speaking of DNA, each of us has certain genes for our serotonin machinery. I've dealt quite a bit with serotonin before on this blog, especially here and here. (Check out our favorite Primal Muse Jamie Scott's complementary post here.) Quick review - serotonin is a neurotransmitter made from the somewhat rare protein amino acid tryptophan, and the absence of serotonin in the brain tends to make one aggressive, sad, and suicidal. More specifically, serotonin is heavily involved in neurodevelopment, neurogenesis, and neural migration (when our little brains are forming, serotonin helps direct our neurons to the right place at the right time, so to speak). Serotonin is also involved in the formation of platelets and in gut motility, but I'm kind of a brain girl, so that's where I focus my energy understanding this wee chemical.
To make serotonin from tryptophan, we need a couple of enzymes. First, tryptophan hydroxylase (with iron) converts tryptophan to 5-HTP, then a second enzyme turns 5-HTP into serotonin. It turns out we have two kinds of tryptophan hydroxylase (which everyone who's anyone calls TPH) - TPH1 and TPH2. TPH1 is found in the periphery and in the pineal gland (a teensy part of the brain that in part regulates sleep - serotonin can actually become melatonin, so TPH1 is important in sleep), whereas TPH2 is found in the neurons and does the major brain work of creating serotonin from our dietary tryptophan. Got it? Good.
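For orientation, the whole chain (standard biochemistry, filling in the enzyme the paper doesn't dwell on) runs: tryptophan -> 5-HTP (via TPH1 or TPH2, with iron) -> serotonin (via a decarboxylase enzyme that needs vitamin B6) -> and, in the pineal gland, onward to melatonin.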
So, in a mouse who is expecting a little mouse pup, it has been determined that maternal mouse serotonin is exceedingly important in the neural development of the mouse pup-to-be's little mouse brain. Weirdly enough, serotonin depletion in mouse mothers leads to dopamine depletion in their inattentive and impulsive mouse offspring. We haven't talked about dopamine all that much, so I should really get on that! Suffice it to say that a deficiency in dopamine can lead one to have symptoms of ADHD.
So what's going on in humans? Back to one of those northern European countries with socialized medicine and no fears of losing disability insurance with genetic examination - Norway. Random Norwegians aged 18-40 with ADHD and matched controls AND their agreeable families were sampled from the National Public Registry (which, for medical science, seems like a sparkling good idea, but if you are from Texas like me, you are suspicious of any National Public Registry of any kind. Wut are they usin' that infermation for, anyhow?) Controls, ADHD patients, and family members all had their DNA sequenced and special attention was paid to the genes for TPH1 and TPH2.
Results? Well, many mutations were found in the gene for TPH1. TPH1 is the tryptophan hydroxylase that is expressed in mom's reproductive parts and would be responsible for bathing the baby brain in serotonin. And it turns out that moms with serotonin machinery problems did have babies with more ADHD later in life. Dads with serotonin machinery problems had more kids with ADHD too, but not nearly as many as the moms did, suggesting the real issue occurs in brain development and mom's serotonin, just like in mice. Moms with TPH1 problems were more likely to be smokers, drinkers, and drug users too. Hmmm. The sample size was rather small, so we can't jump to too many conclusions, but the data as they stand are compelling. TPH1 issues in mom are also a "prime candidate" for schizophrenia, autism, and Tourette's in offspring. As if moms need to feel even more guilt. Sorting out the effects of maternal alcohol and drug use versus TPH1 genetic status would take larger studies.
Our brains are complex, and we need everything just so during our development. So be nice to those moms-to-be out there. And no smoking!