
Monday, August 30, 2010

Your Brain on Ketones

Ketogenic diets have been prescribed for seizures for a long time.  The actual research diets used in the past were pretty dismal and seemed to involve drinking a lot of cream and eating a lot of mayonnaise.  At Johns Hopkins, pediatric patients were admitted to the hospital for a 48-hour fast and then given eggnog (minus the rum and sugar, I'm guessing) until ketosis was achieved (usually took about 4 days).  In addition, ketogenic diets were calorie-restricted to just 75-90% of what would be considered a child's usual calorie intake, and often they were fluid-restricted too (1)!  If we're talking soybean oil mayonnaise, you could see how someone could get into trouble with mineral deficiencies and liver problems pretty quickly.

To understand "dismal,"  some of the latest research showed that a "modified Atkins protocol" was just as good as the classic ketogenic diet, and so much more liberating, as the patients were allowed up to 10 grams of carbohydrates daily, and they didn't begin with the fast, and they weren't calorie restricted (2)(3).  While the classic ketogenic diet was 4:1:1 fat to carbs to protein.  If you use MCT oil for 50% of your calories (have to add it in slowly though to prevent vomiting, diarrhea, and cramping!), you could increase the carbohydrates and proteins to a 1.2:1:1 fat:carb:protein and still get the same numbers of magical ketones circulating.  And while "MCT oil" sounds nice and yummy when it is gorgeous coconut milk, this MCT Oil 100% Pure 32 fl.oz doesn't look quite as appetizing, especially when that is going the be half of what you eat for the foreseeable future (4).   You can see why researchers consider ketogenic diets (especially the original versions) to be extremely difficult and unappetizing (they were), whereas seasoned low-carbers (who have a bit of a different idea what a ketogenic diet is) will find that attitude ridiculous, especially when you compare a ketogenic diet to the side effects of some anti-epileptic medications.

So it looks like modified Atkins (very very low carb, but not zero carb) and a preponderance of MCT is the same, ketone-wise, for the brain as the classic cream-heavy ketogenic diet.  And what does it mean to have a ketogenic brain?  Before, we talked about protons, but now I'm going to examine neurotransmitters and brain energy more closely.  Specifically, glutamate and GABA (5).

If you recall, GABA is the major inhibitory neurotransmitter in the mammalian nervous system.  Turns out, GABA is made from glutamate, which just happens to be the major excitatory neurotransmitter.  You need them both, but we seem to get into trouble when we have too much glutamate.  Too much excitement in the brain means neurotoxicity, the extreme manifestation of which is seizures.  But neurological diseases as varied as depression, bipolar disorder, migraines, ALS, and dementia have all been linked in some way to neurotoxicity.

Glutamate has several fates, rather like our old buddy tryptophan.  It can become GABA (inhibitory), or aspartate (excitatory and, in excess, neurotoxic).  Ketogenic diets seem to favor glutamate becoming GABA rather than aspartate.  No one knows exactly why, but part of the reason has to do with how ketones are metabolized, and how ketosis favors using acetate (acetoacetate is one of the ketone bodies, after all) for fuel.  Acetate becomes glutamine, an essential precursor for GABA. 

Here's the confusing part.  A classic ketogenic diet had three major components which were thought to contribute to the anti-seizure effect.  First, it was calorie restricted.  Just calorie restricting epileptic monkeys (no matter what the macronutrient ratios) reduces seizure frequency (and increases longevity).  Second, it was acidic, and the extra protons themselves could block proton-sensitive ion channels, or the ketone bodies or fats themselves could affect the neuron membranes, making them harder to excite.  (For the biochem geeks out there, ketones or fats seem to affect ATP-sensitive K+ ion channels, making hyperpolarization easier to maintain.)  Third, it lowered glucose levels.  And lower glucose is associated with a higher seizure threshold (that's good - one doesn't want to easily have a seizure!) and less neuronal excitability.  Gads.  Doesn't sound to me like glucose really is the preferred fuel for the brain after all.

And now let's really get down to the mitochondrial level.  Mitochondria are the power plants of our cells, where all the energy is produced (as ATP).  Now, when I was taught about biochemical fuel-burning, I was taught that glucose was "clean" and ketones were "smoky."  That glucose was clearly the preferred fuel for our muscles for exercise and definitely the key fuel for the brain.  Except here's the dirty little secret about glucose - when you look at the amount of garbage left over in the mitochondria, it is actually less efficient to make ATP from glucose than it is to make ATP from ketone bodies!  A more efficient energy supply makes it easier to restore membranes in the brain to their normal states after a depolarizing electrical energy spike occurs, and means that energy is produced with fewer destructive free radicals left over.

Umph.  What does it all mean?  Well, in the brain, energy is everything.  The brain needs a crapload of energy to keep all those membrane potentials maintained - to keep pushing sodium out of the cells and pulling potassium into the cells.  In fact, the brain, which is only 2% of our body weight, uses 20% of our oxygen and 10% of our glucose stores just to keep running.  (Some cells in our brain are actually too small (or have tendrils that are too small) to accommodate mitochondria (the power plants).  In those places, we must use glucose itself (via glycolysis) to create ATP.)  When we change the main fuel of the brain from glucose to ketones, we change amino acid handling.  And that means we change the ratios of glutamate and GABA.  The best responders to a ketogenic diet for epilepsy end up with the highest amount of GABA in the central nervous system.

One of the things the brain has to keep a tight rein on is the amount of glutamate hanging out in the synapse.  Lots of glutamate in the synapse means brain injury, or seizures, or low level ongoing damaging excitotoxicity as you might see in depression.  The brain is humming along, using energy like a madman.  Even a little bit more efficient use of the energy makes it easier for the brain to pull the glutamate back into the cells. And that, my friends, is a good thing.

Let me put it this way.  Breastmilk is high in fat.  Newborns (should) spend a lot of time in ketosis, and are therefore ketoadapted.  Being ketoadapted means that babies can more easily turn ketone bodies into acetyl-CoA and into myelin.  Ketosis helps babies construct and grow their brains.  (Update - I looked more into this specifically, and it seems that babies are in mild ketosis, but very young babies seem to utilize lactate as a fuel in lieu of glucose also - some of these were rat studies, though - and the utilization of lactate also promotes the same use of acetyl-CoA and gives the neonates some of the advantages of ketoadaptation without being in heavy ketosis.)


We know (more or less) what all this means for epilepsy (and babies!).  We don't precisely know what it means for everyone else, at least brain-wise.  Ketosis occurs with carbohydrate restriction, MCT oil use, or fasting.  Some people believe that being ketoadapted is the ideal - others will suggest that we can be more relaxed, and eat a mostly low sugar diet with a bit of intermittent fasting thrown in to give us periods of ketosis (though in general I don't recommend intermittent fasting for anyone with an eating disorder).  Ketosis for the body means fat-burning (hip hip hooray!).  For the brain, it means a lower seizure risk and a better environment for neuronal recovery and repair.

Sunday, August 29, 2010

Saturday, August 28, 2010

Family Tree

(Right-click to open this lovely version of Mark O'Connor's Appalachia Waltz in a new tab)

That song reminds me of home.  Oddly enough, it was a home where I never lived, though we visited the family land in the mountains of western North Carolina every other summer.

There's something you say when you meet someone new in the southern U.S.  "Who are your people?"  Most of the time, if you go back far enough, you find that your new acquaintance is some sort of cousin so many times removed.  Our family trees expand exponentially as we trace our ancestors back, yet there were far fewer people then, so we are all pretty closely related.

My great (times eight) uncle is Daniel Boone.  He was a famous American frontiersman, known best for leading settlers through the Cumberland Gap and founding one of the first towns west of the Appalachian mountains.  If you've ever been hiking in the southern Appalachians, you will know how impenetrable the "rhododendron hell" is.  Daniel Boone was a hunter and trapper, leaving his family on long treks for years at a time.  He was the inspiration for James Fenimore Cooper's hero "Hawkeye" Natty Bumppo in The Last of the Mohicans.  One can assume he ate a lot of red meat along the way.  He lived until 1820, and was 85 when he died.  His niece, Anna Boone, my direct ancestor, lived to be nearly 100.

In the accounts of Lewis and Clark, they describe batting wild game out of the way just so they could get through the woods of the Pacific Northwest.  Some stories of frontiersmen on the northern U.S. rivers describe their boats being lifted by the huge schools of fish.  For most of human history when our population was small, food was probably plentiful for much of the year.  And if it wasn't, humans just moved to where it was.  I like The Journey of Man: A Genetic Odyssey by Spencer Wells, and how he traces human migration out of Africa, mostly along the coastlines, until we finally reached the New World.  The only modern remaining hunter-gatherers are peoples pushed to fringe environments, and they don't make a likely template for what the majority of our ancestors experienced, though they do tell us just how ingenious and adaptable the human race is.

We are too many, now, but I still have hope for my children.  We swung from trees and ate fruit.  We walked the savanna and ran down game.  We built mighty ships and made war, and created fertilizer from the bones of dinosaurs, and we who are too many eat those bones.

We are ingenious and adaptable, but we must listen to history.  The story of human health, of depression and psychosis, of agriculture gives us clues all along the way.  But we must listen.



(Daniel Boone image courtesy Wikipedia)

Friday, August 27, 2010

Ketogenic Diets and Bipolar Disorder 2

Yesterday I made a brief introduction to the topic of ketogenic diets and bipolar disorder.  Today I want to discuss some of the issues raised in that post a little more thoroughly.

First let's talk a bit about how nerves work.  It's pretty cool, really, but involves a little biochem and physics, so bear with me.  Now a picture, courtesy of the US government and Wikipedia:


Nerve impulses and signals travel along the nerve fibers via electricity.  How that happens is that the extracellular and intracellular levels of ions are maintained at very different concentrations.  Inside neurons, the sodium concentration is about 10 mM, but outside, it is 130 mM or more - rather like there are a bunch of balls stored in a container on top of a hill.  Open a little door on the side of the container, and the balls come pouring out and down the hill.  Potassium is the opposite - levels are very high inside the cells, and quite low outside (1).

Neurons have plasma membranes like other cells in our bodies.  Those membranes are somewhat like a tarp that has been oiled on both sides.  Charged ions such as sodium and potassium can't get through unless they go through special ion pumps that are located in the cell membranes.  The sodium pump, in fact, may use up to 50% of the energy in our brains (2)! 

The result of all these shenanigans is that our nerve cell membranes are left somewhat negatively charged (-75 mV, in fact).  The neurotransmitters (such as serotonin, norepinephrine, dopamine, acetylcholine, glutamate, GABA, etc.) work by changing these membrane potentials in various ways.  Neurotransmitters can open ion channels, allowing sodium to enter the cell and causing a wave of electrical impulse that travels along the neuron.  Neat!
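(For those who like to see the physics, here is a minimal sketch - my own illustration, not from the references - of the Nernst equation, which tells you the voltage each ion "wants" given its inside and outside concentrations.  The sodium numbers are the ones quoted above; the potassium numbers are typical textbook values, since the post only says "high inside, low outside.")

```python
import math

def nernst_mV(c_out, c_in, z=1, temp_c=37.0):
    """Equilibrium potential in millivolts for an ion of valence z,
    given its extracellular (c_out) and intracellular (c_in) concentrations."""
    R, F = 8.314, 96485.0              # gas constant, Faraday constant
    T = 273.15 + temp_c                # body temperature in kelvin
    return 1000.0 * (R * T) / (z * F) * math.log(c_out / c_in)

# Sodium: ~130 mM outside, ~10 mM inside (the figures quoted above)
print(round(nernst_mV(130, 10)))   # about +69 mV: sodium "wants" to rush in
# Potassium: ~4 mM outside, ~140 mM inside (assumed textbook values)
print(round(nernst_mV(4, 140)))    # about -95 mV: potassium "wants" to leak out
```

The resting membrane sits near -75 mV, in between those two numbers but much closer to potassium's, because the resting membrane is far more permeable to potassium than to sodium.  When a neurotransmitter opens sodium channels, the membrane potential gets yanked toward sodium's +69 mV, and that is the depolarizing spike.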

When the electrical impulse (or "action potential") reaches the end of the neuron, the "presynaptic terminal,"  neurotransmitters are released into the space between the nerve cells, called the synapse.  At this part of the neuron, calcium is the important ion (though sodium plays a role too).  The electrical impulse (originally mediated by sodium at the dendrite) that traveled down the nerve causes extracellular calcium to pour into the cell, which then leads to the release of the neurotransmitters into the synapse, which then can affect communication with the next neuron.  Voila!  Your neurons have now sent messages to one another.  Yee haw.  The sodium and calcium membrane potential craziness can be set back to baseline by the transport of potassium, so everything is all set for a new signal to be sent. 


It's Friday.  I know.  But it's important to understand the above to some extent to understand why a ketogenic diet might change the ionic environment in our brains.

Before we get to a ketogenic diet, let's look at lithium, carbamazepine, and valproic acid (depakote), all medications that have anti-seizure and mood stabilizing properties.  Lithium is especially interesting, because, if you recall, it looks a lot like sodium, so much so that our kidneys can become confused.  Seems our brains can too.  In rats treated with lithium, the lithium displaces the intracellular sodium in the neurons, and overall sodium is decreased.  The changed sodium gradient may be central to the mood-stabilizing effects of lithium.  (You doctors out there will be squinting at me right now - hey, lithium isn't an anti-seizure med!  Well, actually, it used to be used as one.  Most psychiatrists will know that we can get away with somewhat lower doses of many medicines for mood control than neurologists need for seizures - an exception to this is depakote - sometimes.  Lithium can be horribly toxic at the levels needed to control seizures, so it is never used for seizures now.  But it does have anti-seizure potential.)  Carbamazepine is a little mysterious, but one of its definite effects is to inhibit the voltage-sensitive sodium channels.  (Lamotrigine, another anti-seizure and mood stabilizing drug, does something very similar.)  Valproic acid (depakote) has a whole load of actions, and can increase GABA, making it a pretty good anti-anxiety med.  Another thing it does is decrease the rapid-fire ability of the spazziest neurons, probably by inhibiting the sodium channels.

Get the picture here?  All these meds can be life-saving if you have bad bipolar disorder or seizures, but they can all be pretty toxic and have a host of side effects.  But all of them work (effectively) as insulators in the brain, decreasing the ability of the neurons to send out out-of-whack sodium messages leading to neurotoxic calcium overload.  This calcium overload is speculated to be the cause not only of seizures, but also migraines and bipolar symptoms, which is why anti-seizure meds can be used to treat all three conditions.

Enter the ketogenic diet.  Ketogenic diets (severely carbohydrate restricted diets) result in ketone bodies (made from fat) being used by the brain as fuel in lieu of glucose.  The ketone bodies, acetoacetate and beta-hydroxybutyrate, are acidic.  That simply means that there will be extra H+ protons hanging out, compared to a non-ketotic brain.  Well, protons can be pumped into neurons in exchange for sodium, acting a little bit like lithium.  And a few extra protons outside the cell do all sorts of interesting things, such as reduce the excitability of the neurons and reduce the activity of the excitatory neurotransmitters.  Protons seem to block the calcium channels at the NMDA receptors, for example (3).  GABA (the inhibitory, anti-seizure neurotransmitter) is also increased in the brain on ketogenic diets, along with many other neurotransmitter changes (4).

Sounds good!  Well, some intrepid doctors in Israel had a bipolar patient who didn't respond that well to medication, and after discussion with the patient and family, it was decided to try a ketogenic diet (5).  The patient fasted for 48 hours and began what is described as a "classic" ketogenic diet for two weeks.  Oddly, she didn't have any urinary ketones.  After two weeks, the doctors added MCT oil (coconut maybe?  It doesn't specify), which low-carbers will also know can induce a more reliable ketogenic state.  The patient was pretty gung-ho on the diet, and the doctors made note that her compliance was good for the month the diet was tried.  She showed no clinical improvement, no loss of weight, no urinary ketosis, and no changes in liver function.  Seems odd that she wouldn't get ketosis while fasting or on a strict ketogenic diet, but perhaps that's why it didn't work.  I'll discuss a second case below.

On the internet, one can find a number of anecdotes about people improving bipolar symptoms with ketogenic or low carb diets.  But, as I mentioned yesterday, there are absolutely no systematic scientific studies.  The last thing you would want to do is, all on your own, ditch your meds and try a home-made ketogenic diet without anyone's help.  Ketogenic diets can have pretty bad side effects - constipation, menstrual irregularities, elevated serum cholesterol (if you care) and triglycerides (not good), hemolytic anemia, elevated liver enzymes, kidney stones, and gall stones.  Up to 15% of kids on a ketogenic diet will get changes in cardiac conduction, which put them at higher risk for death (again, this was thought to be mediated via selenium deficiency).  Valproic acid + ketogenic diet seemed to worsen the side effects (6).  Now some of these ketogenic diet studies were done at the height of the low-fat era, and likely designed by some pretty lipophobic nutritionists.  And, like Atkins, many of the original ketogenic diets had no regard for the omega6:omega3 ratio.  Here's a case where Atkins made manic psychosis a lot worse - I wonder about that, as arachidonic acid (an omega 6 metabolite) administration has also been shown to worsen psychosis.  In that case, the patient was on valproic acid also, and that may have been part of the problem.  A natural sort of ketogenic diet (think Inuit) would probably have a lot fewer of these complications and side effects.

But it makes you think, doesn't it?  Many of our ancestors probably spent many a winter in ketosis, and other times lack of food or long fasts would have brought on brief periods of ketone body use in the brain too.   Maybe our brains work better if we spend time in ketosis.  Speculation, of course, but not an unimportant question to research further.

Thursday, August 26, 2010

Ketogenic Diets and Bipolar Disorder 1

Bipolar disorder is a challenging illness, with various clinical presentations. In "type one," people struggle with alternating symptoms of mania and depression. In "type two," depression is the primary state, with the occasional rare bit of hypomania. By mania, I mean increased energy, increased sexuality, religiosity, insomnia, and excessive spending. It feels great right up until reality comes knocking on the door. Medication has been proven to be helpful in decreasing the number of manic and depressive episodes. Typically, the medications are anti-seizure medicines also, such as depakote, lamotrigine, and carbamazepine.

Most low-carb followers will know that ketogenic (extremely low carb) diets have been used to treat epilepsy for a hundred years. Would the same diets be useful in bipolar disorder (1)?

Now is the time to add a special disclaimer. No ketogenic diet has ever been systematically studied with respect to bipolar disorder. If you have bipolar disorder, please discuss any thoughts you might have from reading this article with your personal physician. The one case study I could find in the literature showed *no* benefit from a ketogenic diet (4:1:1 fat:protein:carb). Bipolar disorder is an illness I wouldn't want to face alone.

In the literature for epilepsy, patients were encouraged to fast for 12-36 hours to promote ketosis, and then to follow a dietary plan with less than 20 g carbohydrate daily (or even lower, in most research ketogenic diets). In doing so, their brains would be flooded with ketones, which, most importantly, promote extracellular acidosis. There are several seizure medicines (such as gabapentin) that are no good for bipolar disorder. When scientists look closely, they find that only the seizure medicines that promote a reduced extracellular sodium concentration are helpful in bipolar disorder. Ketosis does exactly that.

It all looks great on paper, but as I mentioned earlier, when well-intentioned doctors tried to use a ketogenic diet to treat resistant bipolar symptoms, they came up with a big zero (2). And while ketogenic diets are definitely safe short term, whenever we begin to talk about treating epilepsy or bipolar disorder, we are talking about long-term strict compliance. No cookies. No bread. One would need to work with a knowledgeable nutritionist, and there are reports of patients on ketogenic diets for epilepsy dying of selenium deficiency (3). (My paleo perspective - you are eschewing your vegetables, so eat your organ meats!)

In the next post I will review the case studies above more closely, and delve more deeply into the biochemistry of sodium ions and the NMDA receptor. I know you can hardly wait!  (It's here - be sure to check out Ketogenic Diets and Bipolar Disorder 2)

Tuesday, August 24, 2010

Yoga (ba) GABA

One of my standard recommendations for people who have anxiety is to practice yoga. There are controlled clinical trials demonstrating its efficacy (1)(2), and who hasn't felt that amazing sense of serenity and well-being after a particularly good yoga class? Part of treating anxiety is to bring people down from the state of constant alertness, to help someone be comfortable in his or her own skin for once. Yoga, through the process of holding (sometimes) uncomfortable positions and breathing through the stress, is a direct physical practice for allowing yourself to get through a psychological stress. Eastern ways of viewing stress, as a wave you allow to crash over you rather than as something you can fight, can be helpful in teaching people how to cope.

All well and good. Binge drinking can help you relax (short term) and teaches you how to deal with discomfort in a very physical way too. But I don't recommend it for anxiety. Binge drinking and heavy alcohol use will often make anxiety much worse.

Why does yoga help and a flood of alcohol hurt? Well, the money is on GABA. Gamma-aminobutyric acid is a neurotransmitter I've made brief mention of before. GABA is the chief inhibitory neurotransmitter in the mammalian nervous system. It cools things off and chills things out. People with depression and anxiety have been shown to have low amounts of GABA in their cerebrospinal fluid. MRI spectroscopy has been used to estimate the amount of GABA in people who are depressed, and the levels are low compared to controls (3).

Activate the GABA receptors in the brain with ambien, Xanax, a glass of wine, and you get relaxed and sleepy. When these substances are constantly in the brain and then rapidly withdrawn, you suddenly have overexcited GABA receptors and you can get unfortunate side effects such as insomnia, anxiety, and seizures.

Medicines often used for anxiety or seizures (or both) increase GABA itself. These treatments include depakote, lithium, SSRIs, tricyclic antidepressants, and a treatment for very resistant psychosis and depression, electroconvulsive therapy (shock treatments). And researchers at Boston University Medical School have found increases in brain GABA (measured via MRI spectroscopy) with yoga (3).

The study I have full access to is a pilot study of 8 experienced yoga practitioners and 11 comparison subjects. None had any history of psychiatric illness or seizures. The yoga people were asked to do 60 minutes of yoga postures and breathing. The control group was asked to read quietly for an hour. Then everyone got an MRI spectroscopic examination. Fun! The people who did yoga ended up with a 27% increase in GABA in their noggins compared to controls. Since GABA is relaxing and anti-anxiety, that's good! Turns out, though, that the yoga peeps had lower baseline GABA, and since the study was so small, it just so happened that all the women in the yoga group were in the luteal phase of the menstrual cycle (the second, "premenstrual" half, after ovulation), while the women in the control group happened to be in the follicular stage. Turns out that our GABA levels are lower in the "premenstrual" half of the cycle (so give me that glass of wine, honey....).

Recently, the same group at BU did a second, somewhat larger study (4) comparing walkers and yoga practitioners. (I won't have access to the full text of this study for 3 months unless I go to the actual medical library.  Uncool, Journal of Alternative and Complementary Medicine!) Again, healthy people were studied, not anyone with psychiatric illness. This time, 19 yoga practitioners and 15 walkers did yoga or walked for an hour three times a week for twelve weeks. The yoga practitioners reported improved mood and anxiety compared to the controls, and MRIs showed increased GABA in the thalamus of the yoga practitioners compared to the walkers.

Yoga isn't Paleolithic. I don't see our distant ancestors practicing downward facing dog. But yoga combines physical activity with forced acute attention on the present. Lose your focus in tree pose, and you lose your balance. In my mind, yoga and other mindful meditation practices emulate, to some respect, the focus and attention we had to have while hunting and gathering. We couldn't be thinking about the mortgage or Uncle Phil getting drunk at last year's Christmas party. We had to be focused on the trail and the prey.

There are many ways to add mindfulness and a present focus to our everyday lives, and growing evidence that it is good for our brains to do so.

Sunday, August 22, 2010

Zombieland

Back in the day, we ate a lot of brains.  Stands to reason.  All animals come with one, after all.  And we certainly wouldn't leave behind such a great source of important fat.  

Have you seen the movie Zombieland?  I highly recommend it if you're into gore and fun, and while the dietary advice isn't necessarily paleo, the exercise discussions take a functional fitness turn... (link is not "safe for work," as they say).

Zombies might be lacking in variety with their chosen food, but they certainly wouldn't be lacking in micronutrients and omega 3s!  Brains are also an especially rich source of phospholipids, one of which, phosphatidylserine, was mentioned in a comment on Friday's post by Sue.  She seems to have had luck with it helping her joint pain and fatigue.  Terrific!  But why?

Well, phospholipids are found in many foods, but the highest concentrations are in brains and seafood.  When one looks back at different hunter-gatherers roaming the world, they tended to eat a lot of seafood, or a lot of large land-roving mammals, or both.  It would make sense that today we might have a lot less phospholipid intake compared to our evolutionary past.  In fact, today's foods contain about 1/3 the amount of phospholipids they did even at the beginning of the 20th century (1).

Research in phospholipids was heating up in the 80s and 90s, but then a little illness came along called mad cow disease, and since the major source of phospholipids for supplements was cow brain, things slowed down for a while until an alternative soy source was found.  Not surprisingly, the soy-sourced supplement is somewhat different than the animal-sourced one, but it looks, from perusing PubMed, like almost all the latest research was done with the soy version.  If you are not fond of soy, krill oil combines omega 3 and phospholipids, and since krill (or the algae they eat) are the food for the marine animals from which many ancient humans got their phospholipids, it would certainly be a more evolutionarily pedigreed source than soy.

But what does the research show?  Do we suffer as human beings because we've greatly reduced our phospholipid intake?  Well, in sports performance studies, phospholipids can help reduce pain and speed up recovery (1).  And supplementation can result in a statistically significant improvement in your golf shot (2).  This study of memory and cognition in the elderly didn't show any improvement using the soy-derived versions (3), though other earlier studies showed positive effects.  But the most intriguing part of the research is the finding that ingestion of phospholipids reduces increases in ACTH and cortisol in response to stress (4).

I've always wondered why we modern humans are considered so "stressed."  I mean, sure, we are probably way more stressed than the majority of our ancestors, who worked obtaining food 17 hours a week and otherwise hung out and told stories and played games.  But the most accepted pathophysiologic model for major depressive disorder and other mental illness is the stress diathesis model.  Meaning stress combined with genetic vulnerability changes your brain and causes your symptoms.  There's a lot of research support for this model and it makes a great deal of sense.  BUT.  Mental illness has been increasing over the 20th and 21st centuries, especially depression.  Maurizio Fava said in a lecture that it has been increasing by 10% with each generation since the 1950s.  That is HUGE.  We know this (in America at least) from epidemiological catchment studies (5) done since the beginning of the 20th century.

But are we really more and more stressed?  In the first 50 years of the 20th century, there were two huge world wars.  Millions of people died from huge flu epidemics, and when my mother was a child, there was still constant fear of polio.  By the 60s we were worried about global nuclear annihilation.  Sure, now I have to remember 40 different passwords and traffic is pretty rotten, and we worry about terrorism and relatives with chronic illness and young men and women are still fighting wars, but is that more stressful than what families faced in the last century?

I don't think stress has changed so much, at least from the last century to now.  Agricultural humans have always been unhealthy and stressed, and I don't see how increases in cardiovascular disease and mental illness over the past 100 years could be explained *strictly* by a stress (cortisol) model.

I contend (as many do) that the MAJOR change in the last 100 years has been our industrialized diets.  Agriculture is one thing, and not good for human health (though it did beef up human fertility).  But industrialization of the food supply, I believe, is the primary causative factor in our modern physical diseases and our modern decline in mental health.

And here we have a bit of evidence that may bring diet and stress together at last.  Phospholipid supplementation, in a few studies, decreases our stress response, especially to emotional stress.  (No one had a clear idea why, unfortunately, in the studies I looked at, but as usual, I'll keep an eye out!)  Imagine day after day of munching on mammal brains or Atlantic herring, rich in phospholipids, and thus (if one believes the research) having a blunted hormonal response to emotional and physical stressors, compared to our relatively phospholipid-deficient diets of today.  Modern disease pathology is all about the cortisol, as much as it is all about the insulin.

We are built for eating brains and seafood.  The farther we stray from our ancestral diets, the more we seem to suffer.

(Thanks to Sue and Geoff and other commenters for your thoughts and ideas on these topics!)

Friday, August 20, 2010

Chronic Stress is Chronic Illness - Wherefore Art Thy Regulatory Mechanisms?

So here's how mammals roll.  We perceive a threat, and then our brains jump into action.  First step - the brain's fear center, the amygdala, and the brainstem stimulate the paraventricular nucleus (PVN) to secrete corticotropin-releasing hormone (CRH).  CRH moseys on down to the pituitary gland, where CRH stimulates the production and release of adrenocorticotropic hormone (ACTH).  ACTH is released into the bloodstream on the fast track to the adrenal glands, where it tells the adrenal glands to pump out the steroid hormones.  Woo hoo!

But we can't jack ourselves up forever.  All bodily systems have innate negative feedback loops.  We eat a huge meal, and leptin regulates our appetite at the next meal to keep us at our body set point, as long as we don't fall too far from the evolutionary paradigm when it comes to what we eat, or we don't have certain kinds of medication or a metabolic disorder screwing up the system.

In the brain, the hippocampus (there's that part of the brain again!) and the left prefrontal cortex can send inhibitory signals to the PVN, telling it to lay off producing the CRH.  No CRH, no ACTH.  No ACTH, no adrenal gland production of steroids.  That would presumably shut down the acute stress response.  What is the signal to tell the hippocampus and the prefrontal cortex to shut off the steroid hormone tap?  Cortisol!  That's the negative feedback - get too much cortisol, we shut off production.  Pretty nifty. 

But here's the problem.  As we found out yesterday, prolonged cortisol exposure damages the hippocampus.  We just aren't designed to cope with prolonged stress forever.  And when we damage the hippocampus, we damage our negative feedback loop (1)(2).  The shut-off valve is broken, so we continue to release cortisol.

We all have different baseline stress levels.  Research suggests that up to 62% of our stress hormone activity levels are inherited.  Also, chronic dysregulation of the HPA axis and stress hormones in childhood will affect how the brain develops, causing increased vulnerability to further stress throughout life (3). 

(A sobering aside here - when I was in medical school in the late 90s, the first teenagers were showing up in the diabetes clinic with type II diabetes - insulin resistance and metabolic syndrome may have begun years earlier.  Before that, type II diabetes was called "adult-onset diabetes." Hyperglycemia damages the brain and raises cortisol levels.  We can't escape the conclusion that this may well be chemically similar to exposing a child's developing brain to repeated emotional trauma.  There's a bit of literature on type I diabetes and an increased incidence of psychological problems - but usually it is explained as an adolescent trying to exert independence with a risky illness and risky medicine (insulin).  I found this abstract on type II diabetes but I don't have access to the paper.)

Can diet change cortisol?  Is a crappy diet a source of chronic stress?  Well, cortisol can change diet in a group of socially housed female rhesus monkeys.  The stressed (subordinate) monkeys chose to eat more calories than the dominant monkeys (4).  Leptin administration seems to decrease subordinate monkey cortisol levels, but makes no difference in the dominant ones (5).  Dominant females also develop less atherosclerosis than subordinate monkeys (6).  (Note to self - foster good karma so that you do not have to come back as a subordinate female rhesus monkey.)

In human women, calorie restriction and dieting caused an increase in cortisol levels (7).  But longer studies of men showed that losing fat can actually lower cortisol levels (this would make sense, if losing fat decreases insulin resistance and hyperglycemia).  Here's an interesting study comparing hormones and metabolism in 16 women with PCOS (polycystic ovarian syndrome, which is often accompanied by insulin resistance).  They ate a high fat "western" meal or an isocaloric low fat, high fiber meal, and then the researchers measured blood glucose, insulin, testosterone, and cortisol for the next 6 hours.  The low fat meal resulted in glucose levels significantly higher and insulin levels twice as high as the high fat meal for several hours afterward (oops! Why do we recommend low fat, high fiber diets for insulin resistance again?).  But both meals reduced cortisol equally.

But the heart of metabolic dysfunction is in the immune system.  Inflammation.  We know that chronic stress and acute stress can cause immune reactions and inflammation in addition to the whole stress hormone cascade (in fact, the inflammation is thought to be caused by the hormonal cascade) (8).  We also know that high protein meals and tryptophan increase the stress hormone response.  Would an inflammatory protein (like casein) cause a cortisol response?  In this study, yes and no.  The more allergenic casein caused less of a stress hormone response than the supposedly less allergenic casein hydrolysate.  When the same proteins were administered to the same men through their noses (don't snort casein at home, kids!), there was no difference in hormonal response, suggesting that the signal to activate the stress hormones occurs in the gut, not the blood.  These were normal weight, healthy men, too, with no history of mental disorders and not on any medication.  Would there be a difference in someone who is chronically stressed?

The diet and inflammation area is in the midst of a flurry of research right now.  Hooray!  I'll keep an eye out for more info.  Right now we can say that a diet predisposing one to metabolic syndrome (via hyperglycemia) will definitely increase stress hormones, and the whole metabolic picture includes hippocampal damage and increasing risk of depression and anxiety.  We also know that chronic stress will likely predispose you to metabolic syndrome.  Does chronic exposure to an allergenic protein like wheat, or a high level of inflammatory linoleic acid (omega 6), directly cause a chronically increased stress hormone response (and therefore metabolic syndrome and depression and anxiety via the robustly evidenced stress hormone route)?  I'll look further.

Thursday, August 19, 2010

Stress is Metabolic Syndrome

In a previous post I described a little bit about the HPA axis. That's hypothalamic-pituitary-adrenal axis, or master glands of stress and how they rule the body and the world. In today's post I want to explore a little more about how stress affects our metabolism and our minds.

Just to review, we have a stress response system in case something dangerous happens. And it works great for that kind of situation - send out a wave of stress hormone, and a grandmother can lift the car off the trapped toddler. We can run faster, have better stamina, bleed less. What actually happens is that glucocorticoids (cortisol) and epinephrine (adrenaline) are released from the adrenal glands, and these hormones have a wide variety of effects on the body - increasing our cardiovascular capacity, decreasing our immune function, and increasing our ability to mobilize energy. Our own personal temporary superhero juice.

Acute stress has effects in the brain, too (1). Glucocorticoids bind to receptors in particular regions of the brain that encode memory (the hippocampus and the amygdala), so that years later, we can often remember stressful events as if they happened yesterday. This mechanism is part of the basis of flashbacks for people with PTSD.

And then there's chronic stress. Exposure to adrenaline and cortisol on a chronic basis can have disastrous effects on the body and brain. It is thought to contribute to the pathology of cardiovascular disease, high blood pressure, the spreading of cancer, immune system problems, and type I and type II diabetes. Chronic stress is of course also thought to cause, in part, many anxiety disorders and depressive disorders.

Cushing's syndrome is a disease caused by excess cortisol. Imaging studies have shown that people with Cushing's syndrome have a shrunken hippocampus. People with PTSD and depression also tend to have a shrunken hippocampus. As I've described before, the hippocampus is the epicenter of how depression is toxic to the brain.

It's hard to study neurochemical goings-on in the brains of living people, as we have the tedious tendency to be using our brains (though sometimes it is not obvious). For that reason, animal models are often used for experiments to really find out what is going on in a depressed or anxious brain. And the typical way to induce depression in a rat, for example, is to expose it to stress. And sure enough, the rat will begin to have changes in weight, disrupted sleep cycles, altered HPA axis function, and neurological changes in the hippocampus and the amygdala. Lithium and some antidepressants seem to protect the brains from the stress in animal models, reducing the amount of neurological changes in the brains. Chronic stress also causes an increase in glutamate in rat brains, which is one of the mechanisms we explored in Depression 2 - Inflammation Boogaloo. Humans will have elevated glutamate in their spinal fluid if they have anxiety.


All right. Blah blah blah. Stress makes you depressed. News flash! But here's where it gets interesting. Because, turns out, people with diabetes have many of the same neurological and morphological changes in the brain as depressed and anxious people do. Hyperglycemia accelerates brain aging and causes irreversible neuron loss in the hippocampus.

Cortisol seems to cause insulin resistance not only in the muscles and liver (which most paleo-minded folks will be aware of), but also in the hippocampus. Diabetics with poor glucose control have elevated cortisol, too. It's all a disastrous circle of sugary hormonal bodily terror. In an insulin-sensitive person, increases in insulin levels cause our cells to whip out the GLUT4 transporter, so that glucose is sucked from the blood into the muscle and fat. Cortisol seems to wreck the function of the GLUT4 transporters to some extent, leaving increasing levels of glucose floating around in the blood. Once you have hyperglycemia, you begin to favor oxidation over antioxidation. In the brain, glutathione (supreme antioxidant) levels are decreased in the hippocampus, leading to the rule of toxic glutamate. NMDA receptor activity seems to increase. Not favorable to a healthy state of mind. All these changes decrease synaptic plasticity and repair. Exercise, estrogen, and antidepressants seem to be protective against this effect.

Now let's talk about the insulin receptor itself. The cerebellum, hypothalamus, and hippocampus all have insulin receptors, and insulin itself seems to be involved in mood states and cognition. If someone is insulin-resistant, adding insulin will help a person think more clearly. Insulin sensitivity seems to be associated with appropriate movement of glucose transporters GLUT4 and GLUT8 in the hippocampus to the cell membrane and the endoplasmic reticulum. Jamie Scott had a recent post on the cognitive effects of insulin resistance and diabetes.

Okay. Well, none of these findings are a huge surprise. We knew stress was bad. We knew insulin resistance was bad. Experimental animal models show that pharmacological interventions can reverse or slow down some of the damage. But hey, wouldn't it be better to avoid the chronic stress and insulin resistance in the first place? Better for the brain, better for the body.

Modern life requires work before play. Productivity before relaxation. Ability to afford the time to exercise before one actually gets the time to exercise. Hey, that's life, but take it to the automaton extreme (and add processed food and partially hydrogenated fat), and it rots our brains and ruins our bodies. But we will spend and spend until we have no more credit left. We need to change some of the incentives out there. Chronic, expensive disease is on the back burner, and it will bankrupt us.

Tuesday, August 17, 2010

Love and Opium

First a few interesting tidbits from the news. The stories reference presentations of data at a conference, so I don't have more specific information. But what there is has some interest from an inflammation/chronic disease perspective:

Childhood stress leads to adult ill health, studies say (BBC)

"A series of studies suggest that childhood stress caused by poverty or abuse can lead to heart disease, inflammation, and speed up cell ageing...In one study, researchers from the University of Pittsburgh looked at the relationship between living in poverty and early signs of heart disease in 200 healthy teenagers. They found that those from the worst-off families had stiffer arteries and higher blood pressure.

Another study presented at the conference showed that childhood events such as the death of a parent or abuse can make people more vulnerable to the effects of stress in later life and even shorten lifespan. Researchers at Ohio State University looked at a group of older adults - some of whom were carers for people with dementia. They measured several markers of inflammation in the blood which can be signs of stress, as well as the length of telomeres - protective caps on the ends of chromosomes which have been linked to age-related diseases.
The 132 participants also answered a questionnaire on depression and past child abuse and neglect. A third reported some sort of physical, emotional or sexual abuse during childhood. Those who did face adversity as children had shorter telomeres and increased levels of inflammation even after controlling for age, care-giving status, gender, body mass index, exercise and sleep."

I'd like to know how they studied the inflammation, but these studies have another piece of information to add to the big puzzle - psychological stress leading to inflammation leading to chronic disease being a likely scenario.

One more interesting news item from the New York Times - a bunch of neuroscientists go camping to study the effects of turning off all the digital stimuli on the brain. Can neuroscientists just relax in the wild? Either they can't, or they wanted to figure out how to write off the camping trip on their taxes...

But now on to opium! In psychiatry we have a whole recipe book of diagnoses called the DSM-IV-TR. The original DSM was derived from an army handbook used by psychiatrists in WWII, some derived from handbooks developed by the Germans from their observations in the late 19th century, and a lot of the rest derived from psychoanalytic thinking. In the DSM-I (1952), there were two kinds of illnesses, for the most part, psychosis and neurosis. Psychotic illnesses were defined by a break from reality (as in paranoid or religious delusions in schizophrenia or manic psychosis), and neurotic illnesses were considered to be reactions to psychological stressors and events.

There is also currently a category of illness that has to do with coping skills and temperament called the "personality disorders." It's not a particularly good term and I wish they had thought of another - "I'm sorry, the diagnosis is that your personality is disordered" is not a particularly useful approach to helping people.

For the longest time, it was thought that psychotic illnesses were more genetic/organic, and neurotic illnesses (such as depressive illness, or post-traumatic stress disorder) were reactions to stress and more amenable to treatment by psychotherapy. A type of personality disorder called "borderline personality disorder" was an exception to the neurotic rule - those afflicted tended to unravel and even appear to be psychotic while receiving the old-fashioned on the couch free association type of therapy called psychoanalysis. That's where the name "borderline" came from in the first place - it was thought to be on the "borderline" between psychosis and neurosis.

What is borderline personality disorder? It describes a type of temperament and coping, usually in women but found in men also, where someone is highly sensitive, prone to dramatic relationships, depression, anxiety, addiction, eating disorders, and self-injurious behavior such as cutting. It is very common, with nearly 6% of the population affected. Unlike depression which tends to come and go over the years, personality disorder symptoms are more stable and chronic, though for most people, borderline symptoms do tend to get better over the decades as we live and learn. It most often develops in someone who was abused as a child, but people can have it without ever being abused. Usually it happens in those cases when there is a mismatch of temperament between parent and child. More modern types of therapy can be helpful for the symptoms, but you can only imagine what it must have been like to have borderline personality disorder and to feel unsure and anxious, free associating on the couch while your therapist said very little back in the psychoanalytic days. That kind of therapy would be like re-experiencing the neglect and abuse of childhood in its own way, and that is why psychoanalysis made borderline personality disorder worse. Ultimately, borderline and some of the other personality disorders can get better as people learn to feel worthy and loved.

But, like everything else, we've discovered that even the personality disorders have biological underpinnings. I'm not sure why people continue to be surprised by these findings - it all happens in our bodies, and is thus mediated by biochemistry. In the case of borderline personality disorder, a paper and editorial in this month's American Journal of Psychiatry explore a link between borderline symptoms and opiate receptors.

We all have opiate receptors. They are activated by our natural endorphins, and can help with pain relief and relaxation. Opiate receptors are also activated by opiates, derived from the opium poppy - morphine, oxycodone, heroin, vicodin, percocet, etc. etc. etc. There are opiate activators found in certain varieties of food, most notably wheat (the exorphins) and milk (beta casein A1). We can increase our own endorphin activity through several behaviors - exercise, binging, binging and purging, and self-injury. (While self-injury is a risk factor for eventual suicide, in general people do not engage in cutting as a suicide attempt, but rather the painful act relieves anxiety and focuses psychic pain on a physical level). The placebo effect is also thought to be mediated through activation of the endorphin system (1).

In the paper, scientists measured how an opiate binder called [11C]carfentanil showed up in the brains of living borderline patients with a history of self-injury and in normal controls. They found pretty significant differences between the two groups, suggesting that the patients with borderline personality disorder who self-injure have differences in their opiate systems. Other studies have shown that people who engage in self-injurious behavior such as cutting have lower baseline levels of endorphins in the blood and differences in their endorphin genes compared to non-injurers.

Our endorphins regulate many of our social interactions, and almost anything we do to self-soothe, from childhood on, will activate our endorphin system. A certain subset of people, self-injurers in particular, will have less ability to self-soothe that seems to be genetically mediated, so they may go to more desperate measures (binging, addiction, self-injury) in an attempt to feel better. The same endorphin system deficit can explain some of the social problems that people with borderline personality disorder experience.

There are many levels of speculation to engage in at this point. The deficits run in families, and anyone can see how anxious, addiction-prone families can lead to less than optimal conditions for a growing child trying to find his or her way. Epigenetics may well play a role. Add chronic stress and inflammation, poor health, and SAD - there's a whole recipe for generation after generation of biologically mediated mental distress. Fortunately, as we develop more understanding of the underpinnings of these conditions, we can start helping people with specific and sensible treatments.

Sunday, August 15, 2010

More about sunlight, food, and serotonin

The problem with mucking about with our biochemistry is that you are never really sure what exactly is going to happen. For most substances there is a range of acceptable amounts, though the best range may depend upon levels of something else (zinc and copper, for example, or vitamins D3, A, and K2).

Serotonin is a tricky one to figure out. Too high, and we get confusion, high blood pressure, and possibly even psychosis, aggression, stroke and death. Too low, and we get anxiety, violence, suicide, and insomnia. It is obviously important to keep the brain levels within a nice healthy range, and the moment we start changing things up (as with an SSRI like Prozac), the body starts changing the number of serotonin post-synaptic receptors. Homeostasis in a nutshell.

But serotonin has some natural rhythms that could be useful to understand. First off, there is a definite summer/winter variation. Meyer's group in Canada used a PET scan on 88 healthy drug-naive individuals, and found that the serotonin transporter that shuttles serotonin out of the brain was highest during the winter, and lowest during the summer. The researchers felt that the most likely brain trigger to explain the variation was sunlight, though humidity also seemed to play a role (1). (Just reading the introduction to Jackson's Melancholia and Depression, and in the time of Hippocrates, 5th century B.C., melancholia was associated with black bile, autumn, and cold/dry weather).

Here in the northeastern US, there has already been a noticeable decrease in daylight compared to midsummer (humph. Back in Texas we would just be settling in for our second three months of summer). And, sure enough, the number of unhappy phone calls to my office has increased accordingly. This happens every single fall and spring with the changing light.

Why would our brains shuttle serotonin out for the winter? I don't know. Maybe it has to do with seasonal variations in food supply. Serotonin also signals satiety - perhaps we were better off eating more than we ought in the winter when we could get our hands on food, and it wasn't such an issue in the summer when food was likely more abundant. That's a wild guess, really. But I'm sure there's a reason.

There is also a carbohydrate/protein signal for serotonin. The actual mechanism is messy, but let's give it a whirl (2 - Thanks for the link, Jamie!).

If you recall, tryptophan is the dietary amino acid we need to make serotonin. The best source is meat, but when we eat meat, we get a mix of all kinds of amino acids, and since tryptophan is the least abundant, when it competes with all the other amino acids for admission into the brain, it tends to lose out. So a high protein, low carbohydrate meal will leave your plasma full of tryptophan but your brain a little low.

Until you add some carbohydrate. Here's the messy part. Unlike some other amino acids, tryptophan is mostly carried around in the blood bound to another protein, albumin. Eat carbs - insulin is triggered, and amino acids are taken out of the blood and pulled into the muscle. Except the mostly-bound tryptophan is immune to insulin's siren call. And the brain transporter for tryptophan doesn't care if tryptophan is bound to albumin or floating free. All of a sudden, there is more tryptophan hanging out in the blood compared to the other amino acids, and tryptophan is first in line into the brain for once. From there, it is made into serotonin, and we feel good and relaxed and full and sleepy, at least for a couple of hours until the signal shuts off. Then we crave more carbohydrate.

So what does it all mean? Rob Faigin and others have postulated that having obscene amounts of sugar and carbohydrate over long periods of time can max out our serotonin machinery, leaving us unhappy, carb-craving, and depressed. Anti-low carb diet folks will claim that without carbohydrates, we will not get tryptophan into the brain and we will be depressed. Data has been mixed, with some studies showing high amounts of long term sugar consumption having no effect, some having quite a robust effect on aggression and mood, and there is also the rather infamous study of low carb diets showing more depression after two years (though the low carb diet group started off with twice as many people who were on antidepressant medication).

Some people try to bypass the whole thing and take 5-HTP, which is the immediate precursor to serotonin and can get into the brain pretty easily without the tryptophan-insulin shenanigans. A couple of food-mood books recommend this strategy, others are against it - there have been only two acceptable trials of 5-HTP, and the results were mixed. 5-HTP is not found in the food supply, so it may be safer to take l-tryptophan itself (l-tryptophan was banned in the US in 1990 due to contamination at the Japanese chemical plant that made it). 5-HTP should not be combined with migraine or antidepressant medication, or with too much vitamin B6. If you have a lot of vitamin B6 and supernatural amounts of 5-HTP in the liver, you can manufacture quite a bit of serotonin to be released into the bloodstream, and you risk giving yourself the symptoms of carcinoid syndrome. (Carcinoids are serotonin-secreting tumors that give you flushing, high blood pressure, and valve calcification. Fen-phen was a serotonin-related medicine too.) Serotonin in the periphery cannot get into the brain - only 5-HTP and tryptophan can.

Confused yet? Wait until you hear that there are seven subfamilies of serotonin receptors, with subtypes of each subfamily, all of which can be up- or down-regulated depending on circumstance.

Yup, complicated.

At the risk of being a hypocrite after yesterday's post, I do take some supplements myself. All are designed to more closely mimic an evolutionary milieu of nutrient-rich food and plenty of sunlight. In that vein I take vitamin D3, a mineral/multivitamin, omega 3 capsules depending on my food for the day, and extra magnesium, as the one issue I had with primal eating was cramps in my feet. Ouch! I haven't yet developed a taste for organ meats, which leaves me needing the multi and minerals, I'm sure. Now this is not what you should take, it is only what I take. Aside from those, I rely on food (grass-fed beef, wild-caught Alaskan salmon, pastured or omega 3 eggs, organic poultry, pastured butter, ghee, extra virgin olive oil, coconut milk and oil, some fruit, and tons of vegetables from the local farm - oh, and some macadamia nuts, or the rare Larabar). Some special paleo micronutrient extras include a sprinkle of dulse or kelp, and Celtic salt. Again - that is, for the most part, what I eat. You eat whatever you want :)

Saturday, August 14, 2010

Theory of Mind and Evolutionary Psychiatry

We are human because we are social.  There is some debate as to why we became quite so social, but the predominant theory is that when we left the forests for the savanna, a larger group offered better protection from predators.  Living in large groups (on Facebook and elsewhere we tend to have 150 people we keep fairly decent track of) required many changes in our brains that no primate had needed before.  We were required to understand things from another person's point of view - not just empathy, which is understanding how another feels and is common to all primates and many animals, but actually being able to picture ourselves in someone else's dirty bare feet and figure out what he or she would do, and why.  Understanding the intentions and dispositions of others and reflecting upon our own mental state is vital for survival in a large group.  We developed moral codes, social rules, and language.  Language is most interesting because quite a bit of our speech is not plain, but metaphor, requiring years of learning to understand the context and the culture.

The size of the neocortex relative to body size in primates is directly related to the usual size of the family group or tribe of that kind of primate.  We have the biggest neocortex, and the biggest tribes.  The only exception to this rule is gorillas, who live in relatively small groups, but have rather large brains.   The size of the neocortex is also directly proportional to the lifespan of primates - we have the longest.  The size of the social group of primates is also directly correlated with the length of time we are considered "juvenile," and in humans, that extended period of youth means that we are also selected for longevity, with a post-reproductive lifespan of several decades (1).

Deception requires understanding the "theory of mind" in another.  Self-deception may be the most advanced trait of all (right up until it becomes a little too much self-deception) - as someone who has no understanding of their own unacceptable wishes will appear to be more trustworthy and sincere.  

Evolutionary Psychiatry was born because I want to figure out how to heal.  I know how to make symptoms better.  I spent eight years in medical school and residency learning how to make symptoms better.  But the population is getting sicker and sicker, and the only reasonable theoretical paradigm I could find to explain the core dietary and psychological problems was an evolutionary one.  I have no interest in self-deception, however temporarily adaptive it may be.

There is an excellent evidence-based textbook on diet and Western disease (Food and Western Disease: Health and nutrition from an evolutionary perspective), but Dr. Lindeberg makes no reference to psychiatric issues in his book that I recall.  Most evolutionary psychiatry texts (such as Textbook of Evolutionary Psychiatry: The origins of psychopathology) have quite a bit of biology, but are, frankly, more psychologically oriented. 

One cannot ignore psychology.  It can outflank biology sometimes, it takes years to master, decades to perfect, and I use it every single day at work.  But biology usually has the upper hand.  And what are we doing, anyway, merely sending people off to get coping skills when their neurons are starving for cholesterol or omega 3 fatty acids or zinc or magnesium or serotonin or dopamine?  The growing biological evidence is that much of mental illness is autoimmune in nature.  I noted in a previous post that therapy is anti-inflammatory, but we have so much knowledge and so many tools.  No one gets to my office without going through a great deal of suffering - would you ever seek out the help of a psychiatrist if you didn't absolutely have to?  Everyone deserves a full-spectrum approach.

I looked at a number of popular press "food and mood" type books out there.  Check out the top ten on Amazon, and I assure you, I have read them all.  While many have some interesting information, they all have serious limitations and are, for the most part, scarcely evidence-based.  The best two are Primal Body-Primal Mind: Empower Your Total Health The Way Evolution Intended (...And Didn't) and Depression-Free, Naturally: 7 Weeks to Eliminating Anxiety, Despair, Fatigue, and Anger from Your Life, but both leave you with the feeling you need 20-40 supplement pills a day just to make it through, and the latter has a whole section on the ridiculous "blood-typing" diet that has been thoroughly debunked, seriously straining the credibility of the entire book.  That's not good enough for me, or for my patients.  My goal is to present credible, practical, evidence-based information day after day.

Our ancestors ate food, not supplements.  On a public health level, I'm trying to get away from supplements as much as I am trying to decrease the need for prescription medications.  I blog about interesting medicines and supplements because it teaches me the biochemistry and the connection to dietary vitamin deficiency or inflammation.  On a personal level, supplements and medications may be efficacious.  But understanding supplements is not my mission. 

Five years ago I could not have done this.  I have a toe in academia, but scarcely a toe.  I have a busy practice, the responsibilities of a business, two very young children and a husband, and no ability to visit the physical medical library more than a few times a year.  I have no research assistants other than you, the blogging community.  And yet there is no moral excuse for not writing this blog.  I need to know how to heal, and I've trained for too long and know too much to be satisfied with merely mitigating the symptoms.  A wonderful therapist is an invaluable person, but his or her special gift is to allow and to facilitate others to figure out their own path - "don't just do something, sit there."  I have only a limited capacity to do therapy, as I am always trying to "do."  Perhaps it is youth, hubris, or naivete, but I would really like to heal. 

We are human because we are social.  Online access and a wonderful, worldwide educated community actually interested in this ridiculous little niche of evolutionary psychiatry has made this blog far better than it would have been with my own resources and knowledge.

Please, keep commenting.  Please challenge and question.  Being social makes us human.  Knowledge and education may save our humanity.

And for my weekly conceit - Delibes, the Flower Duet from Lakme....

Friday, August 13, 2010

The Evolution of Serotonin

Jamie Scott's Midwinter Blues post was the fourth of four (unplanned) complementary fructose/fructan posts on Primal Muse and Evolutionary Psychiatry. Here's the whole "series" in case you missed out:

1)FODMAPS (on Primal Muse)
2)IBS, Fructose, Depression, Zinc, and Women (on Evolutionary Psychiatry)
3)Dietary Strategies for Fructose Malabsorption (on Evolutionary Psychiatry)
4)Midwinter Blues (on Primal Muse)

Anyway, I found Jamie's last post both extremely interesting and incredibly expensive.  He linked this article about sunlight and serotonin, which turned out not to be a paper, but rather the first chapter of the Handbook of the Behavioral Neurobiology of Serotonin. I know most of you might find a new pair of Vibrams or Hermes perfume or The Primal Blueprint Cookbook: Primal, Low Carb, Paleo, Grain-Free, Dairy-Free and Gluten-Free an irresistible purchase. I have a weakness for impenetrable neurological biochemistry tomes. Unfortunately my institutional access did not grant me the rights to the chapter (which could be purchased for $31), but I wasn't going to pay $31 for a single chapter when I could spend, say, 5X as much on the whole book. (Hopefully Mr. Dr. Deans is not reading this post...)  The good news is I get points on my credit card, so eventually I'll have enough to get a second iRobot 560 Roomba Vacuuming Robot, Black and Silver. Or maybe even an iRobot 330 Scooba Floor-Washing Robot.  Until then, I have my Handbook about Serotonin, and we'll start with chapter one: Evolution of Serotonin, Sunlight to Suicide.

Biochemistry is all about energy and chemical reactions. And on planet Earth, most of the business of life occurs with chemical reactions driven by sunlight. The very first photosynthetic reactions occurred with a molecule known as an "indole" - a benzene ring with a pyrrole ring attached.

It's not obvious, but that molecule has lots of electrons whizzing around, and the carbon atom at position 3 of the indole ring is extremely reactive and will lose electrons very easily.  Add light to specific types of indoles, and energy is made available as molecular hydrogen is released.  It is postulated that this reaction first occurred on Earth 3 billion years ago.  This is the beginning of light-based life.

Tryptophan, the dietary amino acid precursor to serotonin (and the rarest amino acid in the diet, according to Primal Body-Primal Mind: Empower Your Total Health The Way Evolution Intended (...And Didn't)), is an indole.  It happens to be the most fluorescent amino acid under blacklight.  Tryptophan absorbs light energy, and is a vital amino acid for photosynthesis in sea bacteria, algae, and plants.  The process of photosynthesis creates oxygen from water, using the energy from the electrons whizzing around on tryptophan, among other things.  This creation of oxygen by photosynthesis changed our planet's atmosphere and made our lives possible.

Plants evolved a specific energy factory within their cells called a chloroplast, whose function is to capture light for energy, and to create tryptophan.  Chloroplasts are where chlorophyll lives.  Tryptophan is also made in all primitive unicellular organisms and plant systems.  Animals (like humans) do not make tryptophan and must obtain it through diet.  The best source for humans is the meat of other animals, though tryptophan remains tough to get into the brain, as it is the least abundant amino acid in muscle tissue, and it has to compete with all the other, more abundant large neutral amino acids (tyrosine, phenylalanine, leucine, isoleucine, valine, and methionine) for the large neutral amino acid transporter.  Everyone is waiting in the lobby for the elevator, and only a few can ride to the top at a time. 

Serotonin is made from tryptophan only in mast cells and neurons.  However, cells in every single organ have special uptake proteins to capture circulating serotonin from the blood. 

Other derivative molecules of tryptophan include melatonin (important in the sleep-wake cycle), psilocybin (the active molecule in psychedelic mushrooms), ergotamine, yohimbine (a traditional aphrodisiac), and LSD.  Most of these compounds are active in the human brain because they can stimulate serotonin receptors.  A tryptophan-based compound called auxin affects cell growth in plants, enabling shoots and leaves to extend out towards the light.

In order to make serotonin, tryptophan has to go through a couple of enzymatic reactions.  First, tryptophan hydroxylase (with its cofactor tetrahydrobiopterin) makes tryptophan into 5-HTP.  Then aromatic amino acid decarboxylase (plus vitamin B6 and zinc) makes 5-HTP into 5-HT (serotonin).  Serotonin is degraded by MAO.

Tryptophan hydroxylase may be the oldest enzyme to attach oxygen to other molecules.  Since oxygen is generally quite reactive and toxic biochemically, this was an early way to safely get rid of excess oxygen created by photosynthesis in primitive organisms.  The light receptors in the human (and other animal) retina are very similar to serotonin receptors and are thought to have first evolved a billion years ago.  Serotonin is the oldest neurotransmitter, and the original antioxidant.  There are 20 different serotonin receptors in the human brain, and serotonin receptors are found in all animals, even sea urchins. 

In other animals, serotonin is involved in swimming, stinging, feeding modulation, maturation, and social interaction.  In general, it is thought of as a growth factor for animal brains.  In humans, deficiency of serotonin is implicated in autism, Down's syndrome, anorexia, anxiety, depression, aggression, alcoholism, and seasonal affective disorder.  Light therapy and serotonin-increasing medications are both effective treatments for depression that occurs with low levels of sunlight.   Light exposure increases serotonin in humans, and serotonin levels are lowest in midwinter, and higher on bright days no matter what time of year.  10,000 lux light therapy decreases suicidal ideation.

Tryptophan is an important amino acid, most readily available from animal sources (vegetable sources such as pumpkin seeds contain phytic acid which may inhibit its absorption), and its many important  derivative molecules work best with plenty of sunlight.  Think of it as your own little bit of photosynthesis.  Sometimes we're not as different from cyanobacteria as we'd like to think...

Thursday, August 12, 2010

Western Diet and ADHD

Thanks again to Dr. Hale for pointing out a new study on diet and ADHD. And thanks again to Australia for studying diet and psychiatric issues observationally, though a prospective controlled dietary trial (from any country) would be nice every now and then. Throw us psychiatrists a bone. I know we aren't cardiologists and don't make the big $$ with procedures, but we're important too. Ahem.

The design of this study is rather interesting. In 1991, researchers recruited a bunch of pregnant women in and around Perth, Australia, to allow their children to be followed for medical science. 2900 or so agreed, and the moms/families answered questionnaires at 18 weeks pregnant, 34 weeks pregnant, birth, and at 1, 2, 3, 5, 8, 10, and 14 years of age. By the 14 year mark, 1,860 of the original cohort of 2868 live births had sent in their SASE with answers about diet, sociodemographic, and lifestyle factors. Each caregiver was asked if their child had been diagnosed with ADHD by a health professional, and trained research assistants followed up with the healthcare professional to confirm the diagnosis, including the ADHD subtype (inattentive, hyperactive, or combined). In the end, the diagnosis or lack thereof was confirmed for 1634 adolescents.

Diet was evaluated via a food frequency questionnaire - which is always a bit of a shot in the dark, but the questionnaire was confirmed both by the kid's primary caregiver and by a visit with a research assistant. The researchers divided the diets into two patterns - "Western" and "Healthy." A "Western" diet had higher total fat, saturated fat, refined sugars, and sodium with lesser amounts of omega 3, fiber, and folate. (Alas - poor sat fat is lumped in with refined sugars and processed food (sodium) YET AGAIN). The "Healthy" pattern had high omega 3, fiber, and folate and low saturated fat, refined sugars, sodium, and total fat.

Tons of numbers were crunched. Maternal smoking, poverty, alcohol, stressful events during pregnancy and early childhood, physical activity... many, many, many numbers were crunched. In the end, 10% of the boys and 3% of the girls had a diagnosis of ADHD. Adolescents with a "Western" dietary pattern were a little more than twice as likely to have ADHD as those with a "Healthy" pattern. Three or more "stressful events" during pregnancy were also associated with ADHD in boys, but not in girls. Kids who exercised outside school hours were less likely to have ADHD.
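(If you're wondering where a "twice as likely" figure comes from in a study like this, it is usually an odds ratio. Here's a quick sketch of the arithmetic with invented counts - not the study's actual data.)

```python
# How an odds ratio is computed from a 2x2 table.
# The counts below are invented for illustration; they are NOT from the Perth cohort.

def odds_ratio(cases_western, controls_western, cases_healthy, controls_healthy):
    """Odds of ADHD on a 'Western' pattern divided by odds on a 'Healthy' pattern."""
    return (cases_western / controls_western) / (cases_healthy / controls_healthy)

# Hypothetical: 80 of 800 'Western'-pattern teens with ADHD,
#               35 of 834 'Healthy'-pattern teens with ADHD.
print(round(odds_ratio(80, 720, 35, 799), 2))  # ~2.54, i.e. "a little more than twice the odds"
```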

Specific food groups with increased risk for ADHD included fast food, processed meats, red meat, high fat dairy products (ice cream?), soft drinks, and crisps (chips, I think they mean!!).

Now some discussion.

First, Evolutionary Psychiatry would like to point out that this is an observational study, with all the limitations therein. Good for speculation, but not good for recommendations. Fortunately, we at Evolutionary Psychiatry love to speculate.

Let's review some other data about ADHD and nutrition. Lower plasma omega 3 and higher omega 6 have been observed in kids with ADHD (1). ADHD symptoms have also been linked to food additives and food colorings - which would be more plentiful in a "Western" diet - in a prospective, randomized controlled trial. A lack of micronutrients (such as zinc) has also been associated with ADHD (and supplementation was found to be helpful in an intervention trial) (2)(3). All these issues would be associated with a Western (but not a paleolithic-style) dietary pattern.

One more important thing to consider. ADHD is highly genetic (perhaps related to inherited inefficiency with dopamine receptors in the frontal lobes). I'm an adult psychiatrist - often I will get an adult coming in saying, "My son was just diagnosed with ADHD, and you know, I think I've had it my whole life too." Eating "healthy" requires some planning and organization (though it is not that difficult once you get used to the dos and don'ts). When ADHD affects whole families, fast food can be an easier solution than cooking and preparing meals at home.

Tuesday, August 10, 2010

Circles of SAM-E

Figuring out biochemistry is rather like watching a group of kids playing on the playground. Lots of kids bounce off each other, swing from jungle gyms for no apparent reason, and toss balls around. In the end, energy was consumed and fun was had.

One of the ball-carriers in our livers and brains is the vitamin co-factor SAM-E (pronounced "Sammy"). Otherwise known as S-adenosylmethionine, SAM-E has a cute little sulfur moiety on it (the "S") that can carry around biochemical balls known as methyl groups (-CH3). You're probably really excited to learn that carrying around methyl groups is an extremely important job, and SAM-E is the go-to running back, the Earl Campbell or the pre-marijuana Ricky Williams of the vitamin crew.

SAM-E is important in depression because, along with folate, B6, and B12, it helps create three of the major neurotransmitters from their dietary amino acids - dopamine, norepinephrine, and serotonin. We can make SAM-E, but without enough essential vitamins (including B6 and B12) and the amino acid methionine, we can't make enough of it or recycle it. And studies have shown that people with depression tend to have lower serum levels of SAM-E.

A new study and editorial about SAM-E came out in this month's Green Journal (the American Journal of Psychiatry - which I'm embarrassed to say I can never remember the proper name of because we always call it the Green Journal). The research team is from the Mass General - George Papakostas, David Mischoulon, Jonathan Alpert, and Maurizio Fava. Full disclosure: back (a while ago!) when I was a chief resident, I had the opportunity to sit in on their weekly research meetings. They were doing some initial planning about SAM-E, but at the time they were mostly focused on the STAR*D trial. They were always pleasant, professional, and would toss brilliant ideas back and forth - it was quite invigorating just to be in the room, though we were all stuffed into a small space for the number of people, with research assistants sitting on tables around the perimeter, and several of the key players (Alpert or Mischoulon) doing the same if they arrived late!

Anyway, this current study, "SAMe Augmentation of Serotonin Reuptake Inhibitors for Antidepressant Nonresponders With Major Depressive Disorder: A Double-Blind, Randomized Clinical Trial," is pretty similar to drug industry trials of pharmaceuticals, though it was funded by an NIH grant, and the SAM-E and matching placebo were freely provided by Pharmavite (who makes SAM-E, I assume). They used otherwise healthy depressed adults with no history of bipolar disorder, no risk of getting pregnant during the trial, no psychosis, and no substance abuse who were already taking antidepressants. This is an augmentation trial - as we know from every antidepressant study in history and STAR*D (Sequenced Treatment Alternatives to Relieve Depression), antidepressants (no matter what variety) help about 30% of people feel pretty normal, 30% of people feel a bit better, and 40% of people feel the same or worse. SAM-E has a bunch of trials (between the IV and oral trials, I think I count 20!) showing it has similar efficacy to antidepressants, mostly tricyclics. Because SAM-E has a different mechanism than the standard antidepressants, it's important to see how the two might work together. Alone, they are all pretty "meh" unless you are in the lucky 30%. Together, you might start to approximate the body and brain environment that we would have with the types of stress, exercise, and nutrition for which we evolved, without the excess inflammation and autoimmune reactions that cause depression in the first place... sigh. (I think these trials are interesting because they tell us a bit about the biochemistry of depression. And medicines can help! I've seen it many times. But if we don't find and address the possible dietary and environmental and psychological causes, adding medication can be a bit like pissing into the wind, if you pardon the expression. Just like with diabetes.)

In this preliminary study of 73 depressed individuals already on antidepressants, SAM-E augmented the antidepressant effect pretty well, and without side effects, for the most part (the most common ones are upset stomach and diarrhea, like any vitamin). SAM-E (dosed at 800mg twice daily by the end of the study) dropped the Hamilton depression score by 10 points from baseline, while the placebo-treated group dropped only 6 points. I know that doesn't sound too exciting - but there are only a few FDA-approved medications for the treatment of resistant depression, and they can have some pretty horrible side effects, and they didn't work as well as SAM-E did in this particular study. There are reasons that it is not quite fair to compare the studies of other medicines to this SAM-E study, and they are all duly discussed in the editorial (which appears to be free online and linked above!).

Sweet! Take a vitamin (basically) and boost your antidepressant! SAM-E has also had some studies showing that it can protect the liver and reduce joint pain in arthritis. Win-win! What are the down sides?

Well, like any antidepressant worth its salt, SAM-E can cause mania and anxiety. In fact, I would feel remiss recommending that anyone start it without at least talking it over with their doctor or therapist, and having a family member watch for signs of mania. Theoretically, if you take SAM-E and you don't take enough of the co-factors, you can end up with a lot of homocysteine lying around. Homocysteine is a marker for heart disease, but that doesn't stop some articles I read from suggesting that SAM-E could increase your risk of heart disease by increasing homocysteine if you don't take a multivitamin with it. (SIGH. Correlation does not equal... ah, never mind). I would be more concerned that it wouldn't work as well without the necessary vitamin cofactors - so take a multi with it! Also, it's expensive. The 800 mg twice a day dose would cost $111 a month at this national chain (though some people might be able to get away with lower doses). For that, you could get a prescription antidepressant and a couple of visits to a therapist covered by insurance. Also, like any over the counter supplement, "buyer beware." Some products are sold by milligram weight rather than by milligram dose - you wouldn't want to spend big bucks for half the amount you thought you were getting! In 2000, early tests of SAM-E products by an independent laboratory showed that half of the products had less SAM-E than claimed. By 2007, it seems the companies had cleaned up their act, and testing of 11 brands showed they were what they claimed to be.
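(Where does a number like $111 come from? A back-of-the-envelope version, with a hypothetical tablet size and per-tablet price standing in for whatever the chain actually charges:)

```python
# Rough monthly cost at the study's final dose of 800 mg twice daily.
# The tablet size and per-tablet price are assumptions for illustration only.
dose_mg_per_day = 800 * 2            # 1600 mg/day
tablet_mg = 400                      # assumed tablet size
price_per_tablet = 0.93              # hypothetical price in dollars

tablets_per_day = dose_mg_per_day / tablet_mg            # 4 tablets/day
monthly_cost = tablets_per_day * 30 * price_per_tablet   # ~$112/month
print(f"{tablets_per_day:.0f} tablets/day, roughly ${monthly_cost:.0f} a month")
```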

And, of course, no one knows the long term side effects of piling on SAM-E.

Better that we avoid all this in the first place by building an all-star vitamin football team from a terrific, organic and locally grown, grassfed and pastured, all-natural foods diet and a clean, anti-inflammatory playing field from childhood. But sometimes the works are already a bit gunked up. We have some professional cleaning solutions, and though those have some downsides, they may not be as much of a downside as leaving the works gunked up.