
Friday, December 31, 2010

Closing Out 2010

For many, like my beloved Texas Longhorns, 2010 was a difficult year. I started the year in a plateau, having gained and lost (and gained and lost) quite a few pounds during and after pregnancies for my two little girls. Nutrition had been an interest of mine for many years, however. I had read Good Calories, Bad Calories and the works of Michael Pollan, and for years had been avoiding too much fake processed food and vegetable oil. But it wasn't until I found the idea of a paleolithic-style diet that everything began to have focus, and made perfect sense. I started out with a nutritionist who had a paleolithic bent (but a distaste for fat and a fondness for brown rice and oatmeal nonetheless). I would describe his plan as Loren Cordain meets Body For Life. I followed his plan for 3 months, losing all the excess baby weight and then some, then discovered The Primal Blueprint, where paleo meets Good Calories, Bad Calories, and Food and Western Disease (paleo meets academia). I added back the fat, ditched the post-workout brown rice and oatmeal, and through a perusal of the forums found the wonderful resources of the paleo and traditional foods blogs. The rest is history.

None of that had much to do with psychiatry. But with my biochemistry knowledge and front lines experience with mental illness, it seemed an obvious route to take with a blog. As a population, we are long-lived and expensive and sick. It didn't seem to me that hunter-gatherers could have been so afflicted and survive very long. And many of my patients (and myself) had the look - the extra pounds, the flushed skin, the thin hair - something was wrong. Very wrong.

It is not that hard to find inflammation. A new commenter on an older post noted that it seems to be in fashion to blame everything on inflammation. And to some extent I agree - when I was pregnant, all the little discomforts (loose joints, weight gain, bleeding gums) were blamed on "hormones." Inflammation is a big word, covering a vast landscape of biochemical processes. But it is a place to start, beyond psychology, beyond stress. Inflammation is where vegetable oil meets the modern stressful life. And that is where psychopathology lies as well.

There are a number of ways to attack psychopathology. Therapy, exercise, proper sleep, and avoidance of addictive poisons. And here I focus on food, and parse the details. This is science, so definitive answers are fewer than we would like. But this is science, so we can question dogma.

Fortunately, I have all my friends to parse with me. And that is the greatest blessing from 2010. Not the skinny jeans, the clear skin, or the vibrant hair (though those are quite nice). The greatest blessing is the community. My old friend Dan from medical school, gorgeous Jamie and Julianne all the way in New Zealand. Stephan in the northwest US, Dr. BG and Aaron Blaisdell in southern California. Steve Parker in Arizona. Paul Jaminet a few miles away in Cambridge. Melissa in NYC.  Thanks to   Leslie Irish Evans and Robert Su for the podcast opportunities! Andrew on his boat. Enigmatic paleo rock star Kurt Harris in the midwest. Dr. Eades and Richard Nikoley. The commenters and facebook friends and twitter community who are all ready to look at a new angle and take apart a new or old idea.  Tear it apart.  That's what leads us forward.

Evolutionary medicine in the 21st century is a breathtaking enterprise. I can be a doctor and use my powers for good. My plans for the blog include (yes, at long last) the thyroid, delving more into phospholipids and the now-famous choline, and looking more into sleep. I'll endeavor along the way to keep up with the latest news and papers (though I still have a day job, a husband, and two adorable little girls). I'm looking forward to a gluten-free January, and more pairs of skinny jeans. Mostly I follow the winds, and my nose, and my gut, which I suppose is more or less what my ancestors did.

It is nice to be able to use one's powers for good. Happy New Year.

Wednesday, December 29, 2010

Your Brain On Creatine

Thank you, good paleo fairy, Melissa McEwan. By my count, next time I'm in NYC I owe you a half dozen NorCal Margaritas. And thank you also former primal muse who has now moved over to That Paleo Guy (I should just send you a margarita machine). I now have in my little hands several papers on feeding creatine to vegetarians (1)(2)(3). Seems that cognition researchers (and athletic performance researchers) simply love giving vegetarians creatine - a practice that might seem curious until you look at these facts:

1) Creatine is an amino acid derivative found only in animal flesh, most abundantly in skeletal muscle (like steak). It is not an essential nutrient, as we can synthesize it from other amino acids found also in plant foods, but as with converting the plant-based omega 3 fatty acid ALA to the marine animal based omega 3 fatty acid DHA, the synthesis is inefficient. It is known that vegetarians have lower tissue amounts of creatine (measured directly via muscle biopsy) than omnivores (4).

2) Why should we care if we have creatine? Well, if you recall, our cells run on energy supplied by ATP. Whether we fuel up with glucose or ketones, eventually those raw materials get transformed into ATP, which as it is broken down powers all sorts of energy-requiring processes. We will obviously burn through ATP faster in our muscles when we are running or jumping or performing various feats of strength, but we also burn through ATP faster when we are using our noggins for something a bit complicated. Our little brain (the size of your two fists held together) burns through 20% of the energy we use each day, primarily to maintain the ion gradients that allow our neurons to charge up and discharge to communicate information.

3) Creatine can bind to phosphate (P) to make phosphocreatine, and this acts as a "buffer" to make ATP lickety-split. Turns out we can make ATP 12 times faster using phosphate reserves from phosphocreatine than by using oxidative phosphorylation and a whopping 70 times faster than making ATP de novo. When we think hard, brain levels of phosphocreatine can drop pretty acutely while ATP levels stay constant, showing that we can bust into that reserve to keep our thinking sharp. In short, creatine improves brain efficiency.
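To put a textbook equation to that buffer (this is standard biochemistry, not anything specific to these papers), the creatine kinase reaction hands phosphocreatine's phosphate to ADP to regenerate ATP on demand:

```latex
\text{phosphocreatine} + \text{ADP} + \text{H}^{+}
\;\overset{\text{creatine kinase}}{\rightleftharpoons}\;
\text{creatine} + \text{ATP}
```

The reaction runs in reverse when ATP is plentiful, restocking the phosphocreatine reserve for the next bout of hard thinking.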

So let's look at these papers, shall we? In both of the cognition study papers, healthy college students were recruited (colleges being a good source of both research volunteers and vegetarians) and divided into creatine or placebo supplementation groups. The British study compared vegetarian and vegan young women to omnivores; the Australian study used only vegetarians and vegans, but had a crossover design (all subjects got both placebo and creatine along the way). Both studies used various measures of cognitive and memory testing (number of words you can remember from a list read to you, how many F or P words you can say in 2 minutes, how many numbers you can repeat backwards from a string of numbers read to you, recognizing strings of three even or odd numbers in a series of numbers read at 100 per second). The British study added a measure of reaction time (subjects had to press a button corresponding to a light as fast as they could once it was lit). The Australian study was 6 weeks, the British study was 5 days, and both used 5g creatine monohydrate as the supplement and dextrose (glucose) as the control.

Because glucose administration has been shown to (immediately) increase cognitive performance (5), all the cognitive testing was done fasted and on a day with no supplementation.

The results? First off, everyone, vegetarian or omnivore, on placebo or creatine in the British study did worse the second time around on the memory tests (maybe they got bored?). But in the British study, the omnivores on creatine performed about the same as the omnivores on placebo (for athletic performance, omnivores have been shown to benefit from supplementing a maximum of 20 grams a day at first, then a maintenance 2-5 grams per day), suggesting that we animal flesh eaters have a physiologically appropriate amount of phosphocreatine reserve in the brain for interesting tasks such as pushing buttons in response to light stimuli and complicated mental tasks that involve the prefrontal cortex and the hippocampus.

The vegetarians in the creatine group did much better than the vegetarians in the placebo group on the second battery of tests involving word recall and measures of variability of reaction times. Simpler mental tasks didn't improve in the vegetarians or the omnivores, suggesting, interestingly enough, that complicated thinking burns more energy than uncomplicated thinking (so do smart people burn more calories?? I'm not aware of any research to that effect, in fact I thought there wasn't much of a difference, but we'll look into it...). On some of the measures, vegetarians scored higher than omnivores at baseline, by the way, and in general the memory test scores did not differ between the two groups at baseline - the vegetarians just seemed to benefit much more from creatine supplementation.

In the Australian study (using only vegans and vegetarians), creatine supplementation had a significant positive effect on working memory (using backwards digit span) and intelligence measures requiring processing speed. Various cognitive tasks that were worse in the placebo vegetarians compared to creatine vegetarians are similar to those that are affected in ADHD, schizophrenia, dementia, and traumatic brain injury. In addition, people with the ApoE4 allele, and therefore more vulnerable to developing Alzheimer's, seem to have lower brain levels of creatine.

There. Simple - when we are not being simple, we do better with creatine.

Except there are a few wee wrinkles. It turns out that creatine supplementation seems to have an effect on glucose regulation (3)(6). Weirdly, the first study shows a higher glucose response to an oral glucose tolerance load (in vegetarians), and the second study (in young athletically active males) shows a smaller area under the oral glucose tolerance test curve (that's good - it shows increased glycemic control) with creatine supplementation. But if we consider the fact that a ready supply of glucose in the short term can improve cognitive performance, the British investigators wondered if creatine supplementation increased glucose in vegetarians, thus increasing cognitive performance. They didn't bother to measure the glucose in the subjects, though, so who knows. In the Australian study, glucose was measured in the fasting subjects; specific levels were not reported in the paper, but it didn't seem that anyone had a high level.
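For the curious, "area under the curve" here is usually just the trapezoidal rule applied to the sampled glucose values. A minimal sketch, with made-up illustration numbers rather than data from either study:

```python
# Trapezoidal area under an oral glucose tolerance test curve.
# Times (minutes) and glucose values (mmol/L) are hypothetical.
def auc_trapezoid(times, values):
    pairs = list(zip(times, values))
    return sum((t1 - t0) * (v0 + v1) / 2.0
               for (t0, v0), (t1, v1) in zip(pairs, pairs[1:]))

times = [0, 30, 60, 90, 120]
glucose = [4.8, 8.1, 7.2, 6.0, 5.1]
print(auc_trapezoid(times, glucose))  # smaller area = tighter glycemic control
```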

In addition, creatine in the tissue doesn't necessarily equal creatine in the brain - however, animal studies have shown that problems with the creatine transporter into the brain show up as cognitive problems similar to those of the unsupplemented human vegetarians (compared to their supplemented vegetarian brethren). However, it is likely that the synthesis and transport mechanisms are upregulated in vegetarians (as they have low levels of creatine), so creatine might pack more punch early on for the veggies, until levels become saturated.

Well, I'm not all that interested in supplementing with creatine. But I am interested in continuing to eat steak, and in having the most efficient energy reserves available for my brain.

And another question - if low levels of creatine can contribute to Parkinson's, are vegetarians more vulnerable to Parkinson's? I'm not sure of the provenance of this very interesting document I found on the internet, but it seems to be written by vegetarian physician and advocate Joel Fuhrman (though his suggested food pyramid does allow for animal products at the small tippy top) about two case studies of vegans developing Parkinson's, blaming low levels of DHA, with Dr. Fuhrman then peddling his vegan DHA product. But maybe creatine deficiency could be implicated? Who knows? I'll be eating plenty of meat just to be safe.




Tuesday, December 28, 2010

The Vulnerable Substantia Nigra

Blogging while out of town has proved more difficult than I thought. For one thing, we have free babysitting, so we've obviously been going out at night rather than staying in. Also, I am mostly limited to the iPad, which isn't as easy to blog from as a normal computer (yes, cry me a river, but true nonetheless). And I was planning on blogging about the creatine and vegetarians paper from the British Journal of Nutrition. However, it turns out my institution doesn't have access to the full text, and I really don't want to shell out $45 for a single paper that tells me to eat meat. I already eat meat, and if you want your brain to be tip top, probably best you do too, or supplement, supplement, supplement with that growing list (B12, zinc, taurine, creatine, carnosine, etc. etc.)

But anyway. A few weeks ago, Dr. Aaron Blaisdell, who I'm told will have access to all the best parties at the Ancestral Health Symposium, was kind enough to send me this paper - "Oxidant stress evoked by pacemaking in dopaminergic neurons is attenuated by DJ-1." The paper is a bit technical, but it hearkens back to a previous blog entry, Brain Efficiency. In that entry I talked about how Parkinson's Disease comes about when the dopamine-making neurons of the substantia nigra (thanks, Ned) poop out for some reason. No dopamine in the substantia nigra, and you get stiffness, dementia, tremor - Parkinson's Disease. Parkinson's is another one of those diseases that seems to be increasing faster than we might expect for the aging population. It is postulated that oxidative stress causes the problems (oxidative stress means burn-out, basically - too much gas for too long, too much build-up of the toxic byproducts of making energy). The burn-out happens in the mitochondria, the energy factories of the cells (which makes sense). But no one knows why the mitochondria in the substantia nigra would be more vulnerable than the mitochondria in other cells.

Increasing the efficiency of the mitochondria with certain supplements (such as coenzyme Q and creatine), which are also available from the meat and organs of animal foods, is currently being investigated as a treatment for Parkinson's Disease. This new paper has some evidence for a mechanism explaining why the mitochondria in the substantia nigra are so vulnerable as to be the canaries in the coal mine.

In the paper, researchers investigated some mouse substantia nigra(s?) and found that those particular neurons have some interesting properties. They seem to pulse in energy output, rather like a pacemaker of some sort. The pacemaking requires a lot of energy, as the cells have to let go of their energy and then build it up again at regular intervals. Since they burn through more energy doing this pacemaking than other dopamine-making neurons in neighboring brain areas, they seem to be more vulnerable to excess oxidative stress. So more vulnerable to burn-out resulting in Parkinson's Disease.

The solution (or, better stated, the possible prevention) is, of course, always pretty much the same. Eat a diet of nutrient-rich foods and avoid poisons that will stress your brain. Say no to excess fructose, wheat, omega 6 fatty acids, and fake, processed foods. I have to say that going out into the real world on this vacation (not my kitchen or pantry) shows me once more just how ubiquitous the poisons are. We checked out some "pizza topping" cheese-like substance in a bag right next to the real cheese; it looked like mozzarella, but was actually soybean oil, corn starch, and potato starch. Ick! And guess what - that mayonnaise "with olive oil" is still mostly soybean oil. Avoid!

Coenzyme Q rides around the body in your cholesterol carriers, so sufficient cholesterol is important. We can make creatine, but when we eat it, we get it mostly from muscle flesh. Vegetarians are low in creatine.

We have a certain design spec. It is remarkably flexible, yet in the post-industrial age we have managed to scribble far, far outside the lines of what our bodies consider food. Once again, straying too far for too many meals is really not a good idea.

Friday, December 24, 2010

Secrets of the Synapse

Merry Christmas Eve, y'all! I'm in Texas and therefore blogging in a Texas accent currently. Also, I ate some Tex-Mex, which aside from the vegetable oils and corn and cheese and beans is totally paleo. It is too much to hope that the restaurant we went to used the traditional lard.

But life goes on, the year comes to a close, and earlier this month a lovely "Brief Communication" was published in Nature Neuroscience. Now Nature Neuroscience is some hard core brain journaling. I like to think I know more about the brain than the average soul, but when I read the titles of the articles in Nature Neuroscience, I understand the gist of about half of them.

This paper, "Characterization of the proteome, diseases, and evolution of the human synaptic density" is well worth a squint or two. Here's a full text link. What it comes down to is that the researchers were able to find the actual proteins and their associated genes linked to all sorts of neurological disease via human brain sampling and a rather amazing use of free online databases. It's the wikipedia of neuroscience, without the amateur editing, described in a stunning three pages.

Basically, the researchers took brain neocortex samples from 9 adults and used some advanced chemical sorting techniques to identify all the proteins in the sample, which was calibrated to be a sample of the nerve synapse (specifically the post-synaptic density). Just as all of our DNA together is called the "genome," all the proteins identified together are called a "proteome."

The 748 proteins found in all three replications of the experiment were recorded into a freely available online database. Then the data were compared to the Online Mendelian Inheritance in Man database, which has information about genetic diseases from linkage studies. Linkage studies are usually done comparing siblings and parents with genetic diseases. If there is enough available data, linkage studies will give you sections of chromosomes, and in some cases, even specific genes associated with the diseases (here's a nice mini-primer on the difference between linkage and association genetic studies - my previous post on migraines reviewed an association genetic study).

In short, our researchers compared the data and found 269 diseases resulting from mutations in 199 genes. 133 of these diseases specifically affect the nervous system (80% central, 20% peripheral). Alzheimer's, Parkinson's, Huntington's diseases and disorders resulting in mental retardation, movement disorders and ataxia, epilepsy, and many rare diseases were all scooped up in this analysis.

Breaking down the data further, the scientists found 21 neural phenotypes (a phenotype is the expressed trait - we each carry genes for eye color, but our phenotype is the eye color itself). A phenotype for mental retardation represented 40 genes, while 20 genes represented spasticity. The large number of genes in these sets is thought to mean that the post-synaptic area of the human brain is exceptionally important in these disorders.
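The paper's statistics aren't spelled out here, but the standard way to ask whether 40 genes out of a 748-protein set is "a lot" is an over-representation (hypergeometric) test. A sketch - the genome-wide totals below are my own illustrative guesses, not the paper's numbers:

```python
# Is a phenotype's gene set over-represented in the postsynaptic
# proteome? Standard hypergeometric (over-representation) test.
from scipy.stats import hypergeom

genome_genes = 20000     # assumed total protein-coding genes (illustrative)
psd_genes = 748          # proteins found in all replications (from the post)
phenotype_genes = 400    # hypothetical genes linked to the phenotype genome-wide
overlap = 40             # phenotype genes that land in the PSD set

# Probability of seeing an overlap this large or larger by chance alone
p = hypergeom.sf(overlap - 1, genome_genes, phenotype_genes, psd_genes)
print(f"p = {p:.2e}")  # a tiny p suggests the PSD is enriched for these genes
```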

The researchers didn't stop there (we're still on the first page of the paper!). The next step was to compare the human data to the much more specific set of mouse data (more specific because we do all sorts of enlightening but gruesome experiments on mice that we are not able to do on humans). From that they were able to find specific sets of genes related to actual cellular morphology linked to certain, especially important "enriched" phenotypes. The enriched phenotypes, associated with lots of neuronal functioning and disease, include components of known very important signaling mechanisms (the NMDA receptor and associated proteins, for example).

Next the human neural coding sequences were compared with those of various primates and mice using the dN/dS ratio. This analysis compares the rate of protein-changing (nonsynonymous) substitutions in specific gene sets (the post-synaptic neuron genes from humans and chimpanzees, for example) to the rate of silent (synonymous) substitutions, which serves as a baseline for the expected rate of genetic change over time. Humans and mice diverged 90 million years ago, yet the post-synaptic neuronal genetic dN/dS ratio was "very significantly" less than the variation between the entire human and mouse genomes (we're talking a p value of 10 to the -148). Human neuronal genes were, not surprisingly, also very significantly similar to the primate genes studied compared to the whole human and various primate genomes (with similarly minuscule and thus highly significant p values). Mice and rats diverged from each other 20 million years ago, and their post-synaptic genes are also much more similar to each other than you would expect. All this means that the forces of evolution have conserved these important genes over millions of years, meaning they had better work just so, or your offspring won't survive.
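For reference, the ratio itself (standard molecular evolution, not something invented for this paper):

```latex
\omega = \frac{d_N}{d_S}
       = \frac{\text{nonsynonymous substitutions per nonsynonymous site}}
              {\text{synonymous substitutions per synonymous site}}
```

A ratio near 1 means neutral drift, above 1 means positive selection, and far below 1 - as for these post-synaptic genes - means purifying selection: protein-changing mutations were weeded out.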

The scientists also compared the conservation of these post-synaptic genes to the genes of other areas of the brain (which are also highly conserved), and found the post-synaptic genes were more conserved than the other brain sets, and also more conserved than other genes for basic cellular components (such as the endoplasmic reticulum, the mitochondria, and the nucleus). Highly interconnected "hub" proteins were also more conserved than other post-synaptic genes, showing that the structure of the synapse seems to mediate the evolutionary conservation of the gene sequences involved.

The human [post-synaptic density sampled for this experiment] has a high degree of molecular complexity, with over 1000 proteins... combinations of proteins regulate the phenotypes of over 130 brain diseases. It is possible, indeed likely, that the proteins identified represent an overall synaptic parts list, with subsets of synapses containing subsets of these proteins... Our data provide a valuable resource and template for investigating human synapse function and suggest new diagnostic and therapeutic approaches.


I'll say. It might also show us that animal studies involving the synapse can give us fairly accurate information related to humans, especially compared to information obtained from dietary studies or the like. The information from this study is a holiday present to neuroscientists everywhere. And it shows us the vast potential of comparing existing databases to new data to create working protein maps of what is actually going on in the synapse or other biologic systems.

Amazing. Nearly miraculous. One way or another, the secrets of our complex brains will eventually be revealed.


Tuesday, December 21, 2010

Alzheimer's and HDL

There are a number of December papers I've been wanting to blog about, but other things came up along the way.  But here I have a moment before the children come home, and the house is quiet except for the cats... I ought to be wrapping presents... but here is a new Alzheimer's paper to add more confusion to the cholesterol and Alzheimer's information.  Overall, the data have suggested that high cholesterol in midlife and low cholesterol in late life both increase risk for the development of Alzheimer's in late life. 

As usual the lipid hypothesis holds sway, and when one reads the analysis in the literature, one gets the idea that the high cholesterol in midlife is likely a causative factor, whereas low cholesterol in late life is "part of the disease process."  (For example, Grandfather must be eating less because he is getting demented, so his cholesterol drops - for years prior to severe symptoms that might legitimately lead to eating less.)  Of course I'm of the opinion that cholesterol is good for the brain.  But not too many people listen to me.  Or else The Primal Blueprint Cookbook: Primal, Low Carb, Paleo, Grain-Free, Dairy-Free and Gluten-Free would be in the top 5 bestselling cookbooks rather than number 5 on the Physicians Committee for Responsible Medicine's Worst Cookbooks of the Year. (Hah - they even have the huevos to remark upon the silly "low carb animal" vs. "low carb vegetable" study that came out earlier this year as evidence for avoiding Mark's book!!)  I think I might pick up a few more books on that list of the worst five, frankly.  They look as if they contain many delicious recipes.

Okay, back to the paper.  Did you know that more than 50% of the adult US population has high cholesterol?  And 1% of people ages 60-69 will develop Alzheimer's, increasing to more than 60% of those over 95.  "There is evidence that cholesterol alters the degradation of the amyloid precursor protein" (supposedly bad) but "cholesterol depletion induces [Alzheimer's Disease]-type injuries in cultured hippocampal slices" (now that sounds quite bad).  Overall, the observational studies linking dyslipidemias to Alzheimer's have been inconsistent (with the typical take from the literature being what I discussed in the 2nd paragraph above).

Well, high LDL and low HDL have been linked in the past to vascular dementia.  Vascular dementia is kind of the brain version of heart disease.  Arteries get clogged up with plaque (different plaque, actually, than amyloid plaque), things get blocked, mini-strokes occur, and someone gets gradually demented as the amount of damage builds up. Anyhoo, Alzheimer's is a bit different - neuronal amyloid plaque builds up, then you get inflammation and tau tangles, and neuron damage and death.

The authors of this study wanted to examine a cohort of people after the start of widespread use of lipid-lowering agents (primarily statins) in the 1990s.  So these folks in "Northern Manhattan" were recruited from the Medicare rolls in 1999-2001, baseline measures of general health and cognitive function were gathered, and they were followed up every 18 months or so.  Out of 1130 individuals who completed the study, 101 were diagnosed with AD - 89 with probable AD and 12 with possible AD (while Alzheimer's Dementia can only be definitively diagnosed via autopsy, there is a characteristic style of progression of memory loss that makes the performance on certain cognitive tests a decent way of diagnosing the disease).

The mean age of onset of the disease was about 83.  Higher HDL (especially over 56 mg/dl) was protective after adjusting for all sorts of things, including age, ApoE4 status, sex, education, ethnic group, and even vascular risk factors and lipid-lowering treatment.  Interestingly, higher total cholesterol levels and higher LDL cholesterol levels were also protective through all the adjustments, though they became nonsignificant once the last two - lipid-lowering treatment and vascular risk factors - were adjusted for.  (In this cohort, high insulin levels were a strong risk factor also.)

The authors put forth the "HDL as garbage trucks" hypothesis of HDL cleaning the cholesterol out, and they postulate the following: "High-density lipoprotein cholesterol might also be linked with small-vessel disease by ...interaction with with APOE and heparan sulfate proteoglycans in the subendothelian space of cerebral microvessels.  Thus, a low HDL-C level could precipitate AD through a cerebralvascular pathway."  (Or how about this theory, which is my own little edit - high HDL-c is associated with low amounts of inflammation and that is the secret to a healthy brain). And more discussion in the paper is about how the study of lipids and Alzheimer's has been confusing all along, and certainly low HDL is associated with stroke, so maybe also some linked pathology is responsible in AD... and previous cohorts with higher total cholesterol in midlife blah blah...  this paper twists and turns about so many times I get confused.

Don't be confused.  Let go of the lipid hypothesis.  Think inflammation.  It's the immune system, not the liver, that is trying to kill us.  Then everything becomes clear, and things start making sense again.  Peace.

Saturday, December 18, 2010

Compulsion (with opera)

In this post I mean to review the basic neurobiology of compulsive behavior (uh oh, 90% of you just turned off your computers and started watching past episodes of "Top Chef, Just Desserts" - and how could Morgan not have won, anyway?), and also figure out why Dr. Garner's presumably serotonin-deficient, hair-pulling mice began scratching and removing hair MORE on a serotonin-promoting mouse chow diet with added glucose and tryptophan.

To understand what might be happening, we have to back up and talk about obsessions and compulsions and where they exist in the brain.  For everything we are has a place in the brain (right click in new tab for a song).

Compulsions mean performing unpleasantly repetitive or seemingly unnecessary acts in order to prevent perceived negative consequences ("step on a crack, break your mother's back").  Scratching, hair pulling and twisting do fit under that definition, as people often perform the behavior to stave off anxiety.  Impulsivity is the predisposition toward rapid, thoughtless acts without regard for the eventual negative consequences (such as gambling or uncontrolled aggression), and hair-pulling (trichotillomania) and picking may also fall under this category. At first glance, compulsion and impulsivity may seem opposites, but they run in parallel neuronal tracts, and people with symptoms of one will often have symptoms of the other.

The compulsive and impulsive psychiatric diseases are among the most highly heritable diseases in psychiatry.  Obsessive compulsive disorder, ADHD, Tourette's, autism, and substance abuse disorders indisputably run in families, suggesting (even more than for other psychiatric disorders) physical brain differences between those susceptible to the disorders and those who are not.  Children with autism, for example, are likely to have tics or repetitive physical movements consistent with Tourette's or OCD, or impulsivity consistent with ADHD, or both.  Those obsessively driven to drink or gamble are also more likely to be impulsive as well.  There are all sorts of subcategories of different kinds of obsessive and impulsive behavior that exist along different neuronal tracts - the complexity, therefore, is immense.

However, one can conceptualize the brain as doing two things with respect to impulsive behavior and compulsions.  On one hand, the brain has tracts that promote these behaviors - these tracts generally run from the central, more "primal" parts of the brain up to the outer "civilized" cortex.  The cortex tends to work in the opposite direction, inhibiting compulsions and impulsivity.  Therefore one can become vulnerable to these behaviors via two means - either by problems that increase the "primal" signals from the center of the brain, or by problems that block the "civilizing" influence of the outer shell of the brain.

In terms of neurotransmitters, serotonin deficiency is thought to be responsible in part for anxiety driven behaviors, obsessions, skin-picking, hair-pulling, etc.  Dopamine deficiency in the cortex is responsible for impulsivity (such as in ADHD).  However, dopamine excess is thought to be responsible for motor tics, tapping in autism, for example, and other obsessive physical acts.

But let's bring it back to Garner's mice.  They are genetically prone to hair-pulling behavior (Jaminet postulates this is due to a predominant Th1 immune/inflammation response to infection), and the hair-pulling worsened with increasing serotonin turnover in the brain, despite the "general rule" that anxiety and picking/hair-pulling behaviors are due to serotonin deficiency.

(In the mood for more opera?  If so, how about this classic?)

But not so fast.  There may not be as much of a mystery as we like to think.  In his paper, Garner brought up the fact that SSRIs (selective serotonin reuptake inhibitors), while used to treat trichotillomania, can also induce skin-picking behaviors in some.  His idea (I'm stretching, a bit) by the end of the paper was that this particular strain of mice would be a bad candidate for SSRI treatment.  He may be right, but he may be wrong, because increasing serotonin turnover in the brain via diet is not the same as administering an SSRI.

The problem with thinking we know anything about the human brain is that the brain is immensely complex.  Remember - a hundred billion neurons with 10,000 interconnections EACH that fire up to 1000 times per second.  And when we strictly limit ourselves to serotonin, we are talking 20 different flavors of receptors, each doing subtly different things.  Garner's mouse diet increased serotonin turnover overall in the brain, presumably jacking up the general serotonin signal.  Well, there are mouse models for everything, and some have shown that increasing the activation of the serotonin 2C receptor (5-HT2C receptor) increases compulsive behavior, just like in Garner's mice.

SSRIs work a little differently than just blanket administration of serotonin in the brain.  SSRIs result in the downregulation of the post-synaptic receptors, among them 5-HT2C.  Therefore they will (in general) shift the overall signal away from compulsion, perhaps relieving symptoms.  SSRIs will also tend to favor serotonin signaling through another receptor, the serotonin 1A receptor, and activating this receptor tends to reduce anxiety and compulsions.

(more music)

Psychiatrists use a whole host of SSRIs (prozac, paxil, zoloft, celexa, lexapro, etc. etc., yawn yawn) but only one serotonin receptor activator (buspar, a 5-HT1A partial agonist, which may well be one of the most useless medicines we have, in my experience).  For better or worse, and for all their limitations, SSRIs do a better job of modulating the anxiety/compulsion symptoms than straight-up administration of serotonin (via a tryptophan-promoting diet or tryptophan supplements).

Again, the medicines are not my interest in this blog, but our knowledge of how they work, and the animal models, give us clues as to how our brains work, and that I find interesting.  Are our brains better off with a diet of whole foods with just enough omega 3s, not too many omega 6s, no weird industrial designer pseudofood or toxic plant proteins?  Most likely. And Garner's mice tell us that straying from the design parameters of the diet our brain likes may well cause major problems.  And that sugar and tryptophan supplementation may cause major problems, if we share genetic vulnerabilities with his mice.

That is one theory.  I also question whether kynurenine and inflammation in general were escalated by Garner's mice's experimental diet.  From my comments on Dr. Jaminet's post:

Just want to add to my speculation about increasing inflammation (which I think is the more likely scenario) – tryptophan is [a] precursor for both serotonin and kynurenine. The former helps modulate major brain circuitry communication, the latter seems to encourage excitatory communication, and in excess will be neurotoxic. Kynurenine is elevated in cases of inflammation. SSRIs increase serotonin initially, but then after 2 weeks, the post-synaptic receptors are downregulated, so the overall effect is not so much to increase serotonergic transmission, but to make what transmission there is go through more efficiently. SSRIs also seem to favor the metabolic pathway of tryptophan to serotonin rather than kynurenine.

It is possible that by massively increasing the tryptophan uptake into the brain in the context of the baseline inflammatory diet and a genetically vulnerable mouse population, there were increases in kynurenine, leading to neurotoxicity and mouse psychopathology. I'm not a big fan of l-tryptophan and 5-HTP for that reason – I know you are not a fan due to tryptophan's role in the infectious theory of neurotoxicity. Since this scenario probably occurs in the human brain as well, it makes one wonder about our USDA diets.

So, take your pick.  But there are many plausible methods by which increasing tryptophan can increase picking/hair-pulling behaviors, and more and more reasons to eat well from the beginning.

Thursday, December 16, 2010

Scratchy

Purdue University and its mice hit the nutrition news this week with the work of Dr. Joseph Garner and his team. Their paper, "Nutritional up-regulation of serotonin paradoxically induces compulsive behavior," was published in Nutritional Neuroscience, and prompted several reports and tweets rather like this one. Jad supplied me with a link in the comments to That Tapeworm Ate Your Depression, and Jamie emailed me the abstract as well. No one was going to let Mark Sisson get the jump on me this time!

And, to be sure, this paper and the work are pretty neat. As far as I know, it may be the only paper showing a definitive development of psychopathology with an adjustment of diet. So that's a big deal!

A little background - serotonin is a neurotransmitter in part responsible for calm, happiness, and whatever the opposite of wanting to kill yourself is - contentment and serenity with living in your own skin, I would say. I talked about serotonin in several blog posts:

The Evolution of Serotonin
More About Sunlight, Food, and Serotonin

As did Jamie:
Midwinter Blues
More Serotonin
Brain Dump on Serotonin


It is common knowledge that eating carbs will increase serotonin levels in the brain. Basically, carbohydrates increase the brain's ability to import tryptophan, the amino acid precursor to serotonin, through the blood brain barrier. It shouldn't surprise you that eating tryptophan can also increase the transport of tryptophan through the blood brain barrier. Well, Dr. Garner and team searched the literature and quantified the whole thing, and figured out that if one increases the carbohydrate:protein ratio a certain amount and increases tryptophan a certain amount, serotonin creation in the brain goes lickety split, zoom zoom - and if you do that, hey, maybe you get a calm happy la-la land of serotonin peace coma, rather like Thanksgiving afternoon after turkey, mashed potatoes, and pie.
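The usual way to quantify that trick (the classic Fernstrom and Wurtman formulation, simplified by me) is a competition ratio: tryptophan shares its blood brain barrier transporter with the other large neutral amino acids (LNAAs), so brain uptake tracks the plasma ratio:

```latex
\text{brain Trp uptake} \;\propto\;
\frac{[\text{Trp}]_{\text{plasma}}}{\sum [\text{LNAA}]_{\text{plasma}}}
\qquad \text{LNAA} = \{\text{Tyr, Phe, Leu, Ile, Val}\}
```

Carbohydrate raises the ratio indirectly (insulin shuttles the competing LNAAs into muscle), and dietary tryptophan raises the numerator directly. Garner's treatment diet, as we'll see below, does both at once.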

Instead he got scratchy mice. But I'm skipping to the end. Enter the mice - happy mice in their standard cages on their standard mice chow control diet or treatment diet, in a double-blinded (actually, they only called it blinded, as maybe the mice knew, but couldn't tell anyone) crossover trial.

Read about the control diet and try not to gag too much:


Casein 24%
Soybean Oil 10%
Cornstarch 52.3%
Sucrose 5%
Fiber (cellulose) 4%
Vitamin and mineral mix
Choline 0.2%

Overall it was 24% protein and 57.3% carbohydrate, and only 10% fat, mostly polyunsaturated omega 6 vegetable oils. Kind of the USDA dream diet, really - skim milk derivative and vitamin-enriched low sugar corn cereal, kids! (It is noted that all the mice gained weight when allowed to eat ad libitum during this experiment, though the treatment mice gained more than the controls).

The treatment diet had an increased carbohydrate to protein ratio with a little extra tryptophan, and a big bolus of sugar in the form of dextrose (which is just glucose - thank you, Jim; maltose, not dextrose, is the glucose-glucose disaccharide):

Casein 12%
Methionine 0.4%
Tryptophan 0.9%
Soybean Oil 10%
Cornstarch 30%
Dextrose 33%
Sucrose 5%
Fiber (cellulose) 4%
Vitamin and mineral mix
Choline 0.2%
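A quick back-of-the-envelope comparison of the two chows, tallying the percentages quoted above (my arithmetic, not the paper's):

```python
# Carbohydrate:protein ratios of the two mouse chows, from the
# percentages by weight quoted above (macronutrients only).
control = {"casein": 24.0, "cornstarch": 52.3, "sucrose": 5.0}
treatment = {"casein": 12.0, "methionine": 0.4, "tryptophan": 0.9,
             "cornstarch": 30.0, "dextrose": 33.0, "sucrose": 5.0}

CARBS = ("cornstarch", "dextrose", "sucrose")
PROTEIN = ("casein", "methionine", "tryptophan")

def carb_protein_ratio(diet):
    carbs = sum(diet.get(k, 0.0) for k in CARBS)
    protein = sum(diet.get(k, 0.0) for k in PROTEIN)
    return carbs / protein

print(f"control:   {carb_protein_ratio(control):.1f}")    # ~2.4
print(f"treatment: {carb_protein_ratio(treatment):.1f}")  # ~5.1
```

So the treatment chow more than doubles the carbohydrate:protein ratio, and the refined sugar fraction jumps from 5% to 38%, on top of the added tryptophan.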

These mice have a little issue, in that they engage in a behavior called "barbering," meaning they pull out their own hair and the hair of their cage mates (they pull out their own hair in a particular pattern, and their cage mates' in a different pattern, so the hair-pulling can be differentiated). Humans with a rather common (3-4% of women) behavior called trichotillomania also compulsively twist hair until it breaks and pull out hair, eyebrows, and eyelashes. It is thought that trichotillomania is due to a serotonin deficiency, so it would make sense to test this hypothesis by using a mouse model and a diet designed to increase serotonin in the brain.

So the mice were fed their diets, the amount of barbering was measured, the amount of scratching was measured during a "spray test," and then the mice were decapitated; the brains were put on ice and eventually homogenized and analyzed for serotonin and other neurotransmitter amounts.

As expected, treatment diet increased whole brain metabolites of serotonin and decreased the ratio of serotonin to the metabolite - consistent with increased serotonin synthesis and metabolism. Dopamine metabolites were also reduced, consistent with the general principle that when serotonin is increased, dopamine is suppressed.

However, the treatment diet, which definitely increased serotonin turnover, actually increased barbering behavior. Scratching scores were doubled. In addition, a deadly skin infection seemed to plague the treatment mice, especially the female mice. Basically, a diet low in protein and fat and high in sugar led to hair-pulling, scratching, and death via skin infection in this mouse model of trichotillomania.

So, what the heck is going on? If trichotillomania is caused by low serotonin, why would increasing serotonin metabolism cause more picking? SSRIs, which also affect serotonin, are used to treat trichotillomania. However, any psychiatrist in practice will know that SSRIs can also induce skin-picking behavior in vulnerable individuals that will go away once the SSRI is withdrawn.

Skin-picking and grooming are basic primate behaviors. Observe any group of monkeys or chimps and they seem to spend a lot of time grooming each other. These activities are thought to be mediated by serotonin. High amounts of dopamine can also cause compulsive tapping (as in OCD or autism) and tic behaviors. It is interesting that women, who seem to be more vulnerable to serotonin pathology, are four times as likely to have trichotillomania as men, but men, who are more vulnerable to dopamine pathology, have much higher rates of tapping and tics.

I conceptualize serotonin and dopamine levels in the brain as see-saws. Sometimes pushing the see-saw one way with diet or a pharmacologic agent will result in the see-saw becoming balanced; sometimes pushing it will unbalance it further. So some folks with trichotillomania will improve with an SSRI, and other folks with no picking problems will start to pick when given an SSRI. And these mice, apparently, do not do well with increased serotonin turnover. Of course one has to wonder what such a diet would do to our psychopathology.

More on the neurobiology of compulsions and trichotillomania in the next post. If I get through the articles. Reading about trichotillomania always makes me itch.



Monday, December 13, 2010

That Tapeworm Ate Your Depression

I'm a little embarrassed that Mark Sisson got to this one before I did.  But I'm sure he has several minions to scan the literature for him, whereas I have a few loyal friends and fellow bloggers.  Here it is, though, a new paper from the Archives of General Psychiatry, "Inflammation, Sanitation, and Consternation: Loss of Contact With Coevolved, Tolerogenic Microorganisms and the Pathophysiology and Treatment of Major Depression."

And I have to admit, after reading Mark's little blurb, I went to the paper expecting to be annoyed.  There are a lot of versions of the hygiene hypothesis (basically the idea that our environments are too clean) that make it sound as if your mom is a crazy germophobe and that's why you have asthma. Which doesn't make sense, because that remote control your kid is chewing on has about a billion microbes on it.  Also, you will often hear that "children just aren't exposed to childhood infections anymore" as we have vaccines and smaller family sizes and antibacterial soap.  But the typical childhood infections such as chicken pox, whooping cough, diphtheria, etc. are all as modern as eating grains, and were established in humans as we developed higher population densities and domesticated animals, so lack of exposure to those bugs wouldn't necessarily mess with our evolved immune system (also, there is some (association) evidence that exposure to common viruses increases inflammation and may increase our risk for depression).  That particular version of the hygiene hypothesis is dealt a death blow by the fact that inner city kids rife with childhood infections have the highest rates of asthma, much higher than isolated rural kids living out in the country with all the ragweed (1).

But I set aside my preconceptions and took a look at the paper, and thank goodness I did, because it is epic, amazing, and brilliant.  All psychiatrists, psychologists, and other doctors download it now if you have access and have a look.  It even includes the Dobzhansky quote "nothing in biology makes sense except in the light of evolution."

So here we go.  I've made a point before that depression is a result of inflammation.  Specifically, depression is associated with higher levels of IL-6, TNF-alpha, NF-kappa-B activity, and a host of other pro-inflammatory mediators.  Medically healthy individuals with depression and a history of early life stress mount a larger inflammatory response to laboratory psychosocial stressors than do nondepressed controls.  The prevalence of major depressive disorders is increasing in all age cohorts, but especially in younger people, and countries transitioning to be part of the developed world experience increasing rates of depression along the way.  One would hypothesize, then, that something environmental in the modern world makes us vulnerable to depression (and other inflammatory diseases of civilization, such as MS, inflammatory bowel disease, type I diabetes, asthma, etc.).

"Overwhelming data demonstrate the prevalence of helper T cell type I...mediated autoimmune and inflammatory bowel and Th2 mediated allergic/asthmatic conditions have increased dramatically in the developed world during the 20th century, with increases in immune-mediated disease incidence in the developing world during the same period closely paralleling the adoption of first world lifestyles." 

Asthma, hay fever, type I diabetes, inflammatory bowel disease, and multiple sclerosis have all increased 2-3 fold in the developed world in the last 60 years.  Many of these conditions are highly comorbid with major depressive disorder.

I've focused on a pro-inflammatory diet as a hypothetical cause for increasing depression (along with obesity and the other diseases of civilization).  The vast majority of the depression literature, I would say, has focused on the pro-inflammatory aspects of a stressful modern life (which I contend isn't necessarily more stressful than life was 60 years ago, or 800 years ago, during the Bubonic Plague, for example).  This paper focuses on "the loss of a microbial modulated immunoregulation" of our Th1 and Th2 immune cells.

Quick review - childhood viral infections tend to mobilize type I T helper cells.   Since Th1 cells seem to balance and modulate the Th2 cells, one might expect that a lack of Th1 activation due to a sanitized environment would lead to naughty Th2 cells running rampant, causing asthma and allergy and the like.  That makes sense, except naughty Th1 cells seem to cause other autoimmune issues, like Crohn's disease, and the incidence of Crohn's disease has increased steadily along with asthma and allergy.   In fact, "most follow up studies have failed to show an association between childhood infection and increased autoimmune and/or atopic conditions in the modern world while continuing, in general, to find correlations between a first-world lifestyle and increases in these conditions."

But humans have been living with some microorganisms and parasites for much longer than with the childhood infectious diseases of the last 10,000 years of agriculture.  These ubiquitous organisms seemed to keep Th1 and Th2 cells busy without causing problems; in other words, the "old friends" germs "induced and maintained an adaptive level of immune suppression."  Or:

"the mammalian genome does not encode for all functions required for immunological development, but rather that mammals depend on critical interactions with their microbiome (the collective genomes of the microbiota) for health."

What are these organisms?  First off are the pseudocommensals, saprophytic mycobacteria that are found in mud and untreated water and on unwashed food.  They don't colonize the body, apparently, but historically were known to pass through it in large quantities.

A bunch of commensal species are known to inhabit our gut, among them Bacteroides, Lactobacilli, and Bifidobacteria.  And finally, the helminths (internal parasites, such as tapeworms) are the third member of the triad of "old friends."

There is a whole body of literature dedicated to animal studies showing how exposure to these "old friends" reduces autoimmune and inflammatory conditions, and even cancer.  A sugar molecule from Bacteroides species protected against colitis and corrected distorted immune system development in germ-free mice.  Prebiotics known to increase Bifidobacteria in the rodent gut reduced serum concentrations of cytokines such as TNF-alpha and IL-6.  "Metabolic products from gut microbiota reduce inflammation in animal models of a variety of human autoimmune and allergic disorders, as well as in [test tube] preparations of human [immune cells]."  The health of the human gut microbiome has been shown to impact varied physiologic processes such as pain sensitivity, sleep, and metabolism (all of which are abnormal, by the way, in major depressive disorder).  A parasitic worm, Schistosoma mansoni, can make a friendly phospholipid for us, phosphatidylserine.  Exposure to a pseudocommensal organism, M vaccae, reduced serum TNF-alpha concentrations over a three month period compared to placebo (in humans and human monocyte cell lines).  Recall that TNF-alpha is increased in depression, and antidepressants reduce TNF-alpha - it does make one wonder if these "old friends" have antidepressant effects.

Without constant exposure to these immune modulating "old friends,"  it is plausible that modern humans are at risk for mounting inappropriate inflammatory responses, leading to many of those undesirable diseases of modern civilization, including depression.  I wonder if using inappropriate food, such as vast quantities of fructose, could destabilize the gut microbes and be part of the inflammatory process.  One could further postulate that exposing depressed individuals to "old friends" could act as a treatment.

Gut-depression links are already well known - psychological stress in humans is associated with reduced fecal Lactobacilli, and individuals with major depressive disorders had fragments of gut bacteria inappropriately floating around in their blood, suggesting the presence of leaky guts.  One small study showed that giving people a prebiotic that favors Bifidobacteria reduced anxiety in patients with irritable bowel (2), and another 2 month placebo-controlled study showed that lactobacillus treatment reduced anxiety (but not depression) in people with chronic fatigue (3).  Probiotic treatment did not reduce depressive symptoms in chronic fatigue patients in another small study, but it did improve some cognitive symptoms that are common in major depressive disorder (4).  M vaccae was administered to patients with renal cell cancer, reducing serum IL-2 and some depression symptoms (5), and in another larger study, killed M vaccae reduced depression and anxiety symptoms in lung cancer patients receiving chemotherapy (6).

There is a long way to go before we start feeding people dirt and worms as an evidence-based strategy for treating depression.  But... the ideas are intriguing, based in common sense, and scientifically sound.  People with the "short" genetic form of the serotonin transporter, for example, are known to be more vulnerable to major depressive disorder, and they are also more vulnerable to known forms of depression caused by inflammation, such as depression caused by interferon-alpha treatment.  These findings link genetic vulnerability to environmental inflammatory factors to depressive symptoms.  Priming the body with known anti-inflammatory modulators should help depression.  Even if it might not seem that... tasty.

Saturday, December 11, 2010

The Monolith

Where evolutionary psychiatry meets history and anthropology is where we become modern humans. It is a tricky question when that happened, because as best as we can tell, we have been genetically modern for the past 200,000 years, yet we didn't have beads and art and tool advances and religious icons and all those uniquely human attributes until 60-80,000 years ago.  Here's a song (right click in new tab) to get you thinking on it.

It is hard to imagine you and me and our neighbors sitting around twiddling our thumbs for 120,000 years - surely we would have carved a bead or two and perfected the spear along the way. But there's no evidence we did anything of the sort until some folks in southern Africa started munching on shellfish. All modern humans are descended from those southern Africans who later migrated up to the Middle East. And anthropologists who followed the trail of our ancestors found the first evidence of widespread consumption and transport of shellfish.

Shellfish are rich in iodine and omega 3 fatty acids. We've discussed the omega 3s at some length, but now let's look at iodine. Iodine is needed to make thyroid hormone, which in turn stimulates the enzyme tyrosine hydroxylase, which is an essential step in the making of dopamine, that neurotransmitter responsible, perhaps, for us being all too human. In addition, the omega 3 fatty acids increase dopamine receptor binding and dopamine levels.

Gagneux et al note that there appears to have been a huge increase in T3 (thyroid hormone) with the advent of modern humans (chimps, for example, have much higher levels of transthyretin, which binds thyroid hormone and keeps it inactive). It is noted that both chimpanzees and Neanderthals had superficial features in common with developmentally iodine-deficient humans (large femurs, extended brows, and shorter stature). Humans have larger thyroids than chimps, whereas chimps have larger adrenal glands (brain dopamine is essential for inhibiting the systemic arousal caused by activation of the adrenal glands).

Shellfish aren't the only explanation for the rise of modern humans. After all, plenty of animals eat crabs and the like, and they don't fly airplanes or... blog. Another (even more speculative) theory suggests that increased human intelligence via marine animal consumption led to longer lifespan, which led to increased populations and increased socialization, communication, and migration. The competitive stresses and achievement drives to out-perform one's neighbor would have plausibly helped to select for brains with more dopamine - though, as you may recall, there don't seem to be any specific genes for dopamine lateralization. And, once again, we are genetically and physically very similar to our ancestors of 200,000 years ago (except our brains are a little smaller). I'll let Previc explain: "Cultural and dietary influences on dopamine, transmitted prenatally, would have been passed on and enhanced in successive generations and thereby rendered a permanent part of our inheritance. Even when humans moved inland and no longer relied as much on aquatic fauna, their dopaminergically mediated cultural achievements were self-sustaining."  It was just around the time of this human mind "Big Bang" that we went through a population bottleneck.  All of us are descended from a few thousand people from 65-70,000 years ago - Previc doesn't mention this bottleneck in his book, but the timing is mighty suspicious.  It might be that only the humans who were able to fully utilize our super dopamine tracts were able to survive whatever crisis rocked our species back then.

So our ancestors did not need a monolith to spark our leap forward - we needed some clam-digging and crab-catching. Iodine and omega 3s enhanced our dopamine-dependent traits - enhanced working memory, cognitive flexibility, the capability of thinking in temporal and spatial distance, creativity, and increased mental speed. This dopamine upgrade in our processing skills enabled art, advanced tool-making, language (not just speech, which seems to have evolved earlier), and long-distance exchange. It made us modern and uniquely capable of wonder and destruction on a scale known only before to nature itself.


Thursday, December 9, 2010

Pleasure, Pain, Wheat, and Psychopharm

There is something of an addiction theme out in the blogosphere today.  Mark Sisson is talking about kicking the junk food habit, and Dr. BG is talking politics and CRACK.  I thought I would throw in a little neuromapping of pleasure and pain, as hey, apparently folks like a bit of neuroanatomy (who'd a thunk?) with their grain-free Evolutionary Psychiatry.  Also, there's a new medicine for weight loss, called "Contrave," that will likely be approved by the FDA, and how it works has everything to do with subverting and modulating pleasure.  So let's dive in.

My source for today's post is mostly the clever Dr. John J. Medina, who writes a "Molecules of the Mind" column for Psychiatric Times.  Finally, someone with a geekier column than "Evolutionary Psychiatry." 

Pleasure in the brain is primarily mediated through the neurotransmitter dopamine.  Neurons in the ventral tegmental area (the starting post for those dopamine tracts I talked about a few days ago) respond to sex, drugs, rock n' roll, and food, and communicate with other neurons in the nucleus accumbens, and with a third neural network in the amygdala and the ventromedial prefrontal cortex.  These are all segments of the "medial dopamine tracts" I reviewed at some length a few days ago.  It may be of interest that those of you who derive pleasure from other people's pain (schadenfreude) experience your mischievous pleasure here as well.

Pain is experienced in the aptly named "cortical pain network," a set of brain regions somewhat separated from the pleasure centers.  The different areas of the pain network collect sensory pain (such as a splinter in one's finger) and emotional pain.  Typically, experiments used to isolate pain circuitry in the brain involve "aversive stimuli" such as electric shock.

So it turns out that social pleasure and pain have hijacked these evolutionary circuits of pleasure and pain, so that if you score a date with that hot chick and share a high five with your frat brothers, or you get arrested for that meth lab you are running in the basement of the chemistry building at school, you will experience the pleasure and pain in the same areas of the brain where you experience sex and electric shock.  It hurts to be human, sometimes.  If you feel that you are fairly treated and are feeling cooperative, your pleasure centers are stimulated.  If you grieve the loss of a loved one, your cortical pain network lights up.

And what about opiates or wheat exorphins or binge eating?  Turns out those activities stimulate the dopamine reward centers of the nucleus accumbens.  That new weight loss drug is a combination of two older drugs, naltrexone and bupropion (Wellbutrin).  Naltrexone is a straight-up opiate blocker.  It is FDA-approved to reduce cravings for alcohol, but to be honest I use it more often for people trying to kick an oxycodone or heroin habit.  You have to be off opiates for a couple of weeks or risk an exceedingly uncomfortable precipitated withdrawal, and once you are on naltrexone, your opiate tolerance will drop like a rock, so if you go back to using like you used to, you could easily overdose.  But naltrexone isn't a controlled substance, and if you take your medicine as prescribed and try to use opiate drugs on top of it, those drugs won't work.  There's nothing like a pharmacologic lock on the opiate system to help out an opiate addict.  Naltrexone has been studied in binge eating and in gambling, and for some people, it seems to help.  I've used it on a couple of occasions, with some success, for sleep eating with extreme carb (grain) cravings, and also in patients with celiac disease who can't seem to kick the wheat habit.  Naltrexone use requires monitoring of the liver, and typical side effects include upset stomach.  Both of those are less noxious than a heroin habit in my opinion, but every case is different, and risks and benefits must be discussed for each situation.

Bupropion (Wellbutrin), the other drug in Contrave, was discussed in Antidepressants and Weight Gain or Loss.  It helps keep the dopamine systems humming along without interruption, so you don't necessarily need that lift from vegetable oil laden fast food french fries.  Wellbutrin can cause seizures, irritability, insomnia, and anxiety - not a bucket of laughs by any means - but all things considered it has among the fewest side effects of any antidepressant.

I'm not clear that any insurance company in Massachusetts will pay for a combination pill of two medicines you can likely prescribe separately for less cost.  And I hope that my readers understand that I think a paleolithic (or, if you are a conservative sort, Mediterranean) style diet (Dr. Parker reminds me that I mean carbohydrate-restricted versions of these for most people trying to lose fat) should be attempted, along with proper exercise, before we hand out pills to lose weight.  In the studies, Contrave resulted in about 5% weight loss - roughly 10 pounds for a 200-pound person.  That's good for improvement of some health conditions, such as diabetes or hypertension, but it won't make you the star of a Hydroxycut commercial by any means.

Addiction is tough.  Those who suffer need all the help they can get.  Sometimes that means the judicious use of pharmacology.  Most of us can get by with a bit of knowledge and (if we're lucky) some self-restraint.

Tuesday, December 7, 2010

Your Brain Loves Cholesterol (Don't Go Too Low)

Sometimes this blog writes itself.  Today I was sitting around minding my business, spreading happiness and serenity, when I got an email from the Amazing Jamie Scott, who sent me a link to this paper: Diabetes and Insulin in Regulation of Brain Cholesterol Metabolism.  And then my receptionist handed me my mail, and the top story of this month's Psychiatric Times is "Statins, Cholesterol Depletion, and Mood Disorders:  What's the Link?"

I'm beginning to feel less small and alone in the world.

Let's start with the Psychiatric Times article.  Statins, as most biochem nerds will know, are pharmacologic inhibitors of HMG-CoA reductase, which is the key rate-limiting enzyme in the biosynthesis of cholesterol.  So our livers, doing their best to kill us off with heart disease, make cholesterol like mad fiends, while a statin will slow that pesky liver down, lowering serum cholesterol, and allowing us to live forever.*  Or something like that.

But throwing a monkey wrench into the cholesterol machinery has some... issues.  For one thing, it seems to ruin the binding and G-protein coupling of the serotonin 1A receptors (1).  (Total random aside - at my medical school we had several professors of biochemistry who had Nobel Prizes to their name - among them Brown and Goldstein for their elucidation of the metabolism of cholesterol, and Gilman, who discovered the G-protein.  And here they are, together at last on my heretical blog.)  That's probably not the best thing to do - decreasing the ability of the serotonin 1A receptors to work can lead to anxiety and irritability.  At the same time, there seem to be other changes in the actions of receptors in the context of "chronic cholesterol depletion" (I bet you would never find that phrase in 'Cardiology Times').  As we know, low serum cholesterol is associated with violence, accidents, and suicide.

Now it is my pleasure to introduce Dr. James Lake, a psychiatrist and chair of the APA's Caucus on Complementary and Integrative Medicine (that's mainstream medicine talk for "woo."  I emailed one of my residency mentors about my blog a few months ago as I had some questions for him - he wrote back after reading some of the entries and said I was the "alternative Harvard Mental Health Letter" - I'm still not sure if that was a positive or negative comment).  Anyway, Dr. Lake seems to have drunk some of the same kool-aid that I have, as he recommends that depressed patients with elevated cholesterol aim not to go lower than a total cholesterol of 160.  How very reasonable!

He's also been hanging out with Dr. Beatrice Golomb, who has studied data from a few websites and run some surveys of her own.  In an analysis of 324 emails of people taking statins who were bothered enough to go out of their way to email "https://www.statineffects.com/" or "http://www.askapatient.com/," 30% reported mood changes such as depression, irritability, and anxiety.  When patients who complained about statin side effects were asked survey questions, 65% of 843 endorsed increased anxiety or irritability and 32% reported an increase in depressive symptoms.  Golomb published a case series of 6 patients who self-referred with irritability or short temper on statins (including "homicidal impulses, threats to others, and road rage") - in 100% of cases, stopping the statin cured the symptoms, and 4 of the 6 had renewal of the symptoms with a statin rechallenge. 
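
For the numerically inclined, here's a minimal sketch of the rough 95% confidence intervals around those survey percentages (Python is my choice purely for illustration; the normal-approximation formula is my assumption, and the counts are back-calculated from the reported percentages):

import math

# Rough 95% confidence interval for a proportion, using the
# normal approximation: p +/- 1.96 * sqrt(p * (1 - p) / n).
def approx_ci(count, n, z=1.96):
    p = count / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, p - half, p + half

samples = [
    ("mood changes (324 emails)", round(0.30 * 324), 324),
    ("anxiety/irritability (843 surveyed)", round(0.65 * 843), 843),
    ("depressive symptoms (843 surveyed)", round(0.32 * 843), 843),
]

for label, count, n in samples:
    p, lo, hi = approx_ci(count, n)
    print(f"{label}: {p:.0%} (95% CI {lo:.0%} to {hi:.0%})")
# Prints roughly: 30% (25% to 35%), 65% (62% to 68%), 32% (29% to 35%)

Even with intervals that tight, of course, these only capture sampling noise - they can't correct for the self-selected sample, which is the point of the caveat below.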

Granted, these are all people who complained of symptoms in the first place, so it is hardly a random sampling, and the case series could represent the nocebo effect.  But when I looked at the PDR for Crestor a few months ago (I can't seem to find it again on the internet, but if I do I will link it), I didn't find anything on irritability or anxiety, and depression was only mentioned briefly as an "aftermarket" side effect - meaning the Crestor folks didn't find those to be side effects in their carefully controlled studies, but there are now some reports of depressed mood in the general public after release of the medicine.  That just doesn't quite square with my own experience.  I've had several situations clinically where withdrawing the statin resulted in immediate improvement of anxiety, depression, and/or irritability in treatment-resistant patients.  Of course that could be nocebo too, but nocebo and placebo effects tend to wear off after about 3 months, and I've seen patients improve after 2 years, and the biological mechanisms seem plausible.  Well, definitive answers will wait for another day.

Statins improve mortality for middle-aged men who have known heart disease, have had a stroke, or have high levels of inflammatory markers.  If you don't meet those particular criteria, statins will give you no mortality benefit.

But let's not be so negative - a literature review involving statins and mental health "found no statistically significant effect" of low cholesterol on psychological well-being.  However, there may be a difference among the different statins.  Simvastatin can readily cross the blood-brain barrier, whereas pravastatin really can't.  Golomb tested 1016 healthy men and women for 6 months with simvastatin, pravastatin, or placebo.  Those on simvastatin reported significantly worse sleep and, if sleep was impaired, worsening aggression.  It was felt that statins that cross the blood-brain barrier inhibited serotonin production.  Interesting.

Now onto Jamie's paper - which is a study of diabetic mice.  The researchers found that insulin-deficient diabetic mice had a reduction in a major regulator of cholesterol metabolism, leading to a reduction in brain cholesterol synthesis and lower synaptic cholesterol content (that's bad).  The decline in brain cholesterol production happened in cases of insulin depletion OR hyperglycemia in various mouse models of type I and type II diabetes, but not in obese (but normoglycemic) or insulin-resistant (but normoglycemic, meaning high levels of circulating insulin were needed to keep the blood sugars normal) mice.  Type I and type II diabetes can both lead to CNS complications, including delirium and faster than "normal" decline in cognitive functioning.  Diabetics, as we know, have higher rates of depression and Alzheimer's.  The brain contains 25% of the cholesterol in the body, and much of it is made right in the brain.  Therefore diabetes produces a "global suppression of the enzymes of cholesterol synthesis and their master transcriptional regulator, SREBP-2 in the brain... [which alters] neuronal and physiological function."  A lot of this action occurs in the hypothalamus, which is a major point of control of the endocrine system, appetite, and energy balance.

Sometimes it all comes together.  The bottom line?  Eat whole real food, not tons of sugar and linoleic acid or wheat.  Don't get diabetes if you can help it.  Don't let anyone or anything suck the cholesterol out of your brain.  Once things get out of whack, they can continue to be out of whack in all sorts of disastrous ways for quite a while.

* not really

Genius and Madness

In the Dopamine Primer 2, I covered the four major dopamine tracts in the brain.  Today I'll break down the two most important ones to psychiatrists and anyone interested in the fabulous story of human evolution and greatness, the mesolimbic and the mesocortical pathways.

For simplicity's sake (as "mesolimbic" and "mesocortical" don't have a whole lot of meaning unless you are versed in neuroanatomy, and also because the mesolimbic system ends in the cortex, so it is also mesocortical, and that's hella confusing), I'm going to call them the "medial" and "lateral" dopamine tracts respectively.  The medial tracts go from the center, primitive, animal parts of the brain up to the emotional centers of the brain, and then to the front part of your brain (literally the center of your forehead, more or less).  The lateral tracts go from the center, primitive, animal parts of the brain up around the outside and end up more by your eyeballs (more or less).

Both tracts carry dopamine, but the tracts are responsible for somewhat different human behaviors.

The lateral tracts are responsible for:
Future-orientation in predicting events
Strategic thinking
Rational, abstract thought
Focus and control
Emotional detachment

Someone who has an optimal amount of dopamine in the lateral system is going to be self-contained, practical, self-confident, and able to forgo immediate gratification in order to ensure greater reward later on.  He or she might be the perfect person to bring with you on an expedition somewhere.  However, an extreme "lateral dopamine" type wouldn't be the one you might confide in with emotional problems.  Also, on that expedition, if you break your leg and are no longer practical to keep around, it might be just a little too easy for him or her to leave your burdensome self there in the wilderness.  So the dark side of dominant lateral tracts would be grandiosity, ruthlessness, and sociopathy.

The medial tracts (more emotional in nature rather than rational thought) are responsible for:
Action
Aggression
Future-orientation in exploration (motivation and drive)
Creativity (along with paranormal experiences and psychosis)
Hyperactivity and impulsive behaviors
Euphoria and pleasure-seeking

A more medial dopamine personality might be a bit wacky, impulsive, and free-thinking.  A hippie or an artist.  Not particularly good at planning, but often compelling, creative, and interesting.  Maybe not the first person you would want to take into the wilderness, but perhaps capable of intuitive leaps of logic that could get you out of a real jam.  And as it is the serotonin/norepinephrine right-brain tracts that are more responsible for emotional sensitivity and understanding social cues, the medial dopamine dominant personality may not be particularly empathetic, and might wander off and leave you alone in the wilderness if he or she thinks of something better to do.  The dark side of medial dopamine dominance would be psychosis, paranoia, and irresponsibility.  (Keep in mind that these are all generalizations - all the tracts interact in complex ways, so there is rarely any such thing as a pure "medial dopamine personality.")

Too much excess in the medial dopamine tracts leads to madness: irrational thought, paranoia, loose thought associations, psychosis.  In just enough excess, it is creative genius.  Families with schizophrenic members are also more likely to include highly creative individuals.  And many people considered geniuses also suffered madness, such as Nobel Prize winner John Nash, a schizophrenic who had this to say when asked how a mathematician devoted to logic and proof could believe that extraterrestrials were sending him messages:  "Because... the ideas I had about supernatural beings came to me the same way that my mathematical ideas did."  Unfortunately, since schizophrenia is ultimately an inflammatory neurodegenerative disease, it is unlikely that a genius suffering from schizophrenia could maintain brilliance for more than a few adult decades.

It is an interesting observation of Previc's, in The Dopaminergic Mind in Human Evolution and History, that these two tracts seem to line up with Freud's separation of the human mind into the ego and the id (the id here being the primitive drives of sex and pleasure-seeking, loose thought processes, and impulses of the medial dopamine tracts, whereas the ego weighs risks and benefits before jumping into any particular course of action - a lateral dopamine action.  Freud's superego, or the conscience, is more likely associated with those socially directed serotonin/norepinephrine right-brained tracts).

Estrogen has a tendency to inhibit dopamine (allowing for a greater dominance of social empathy, balance and those norepinephrine/serotonin pathways), whereas testosterone will tend to enhance dopamine.  In the agricultural past, dopamine dominance has allowed for greatness and male dominance - Previc lists the following famous men in history and their "dopaminergic traits":

Alexander the Great - high intelligence, visionary, motivated, risk-taking, self-confident, but also grandiose, ruthless, restless, and paranoid.

Columbus - intelligent, visionary, motivated, self-confident, risk-taking, but also grandiose, ruthless, and restless.

Newton - extremely high intelligence, visionary, self-confident, motivated, but obsessive, lacking empathy and social skills, ruthless, paranoid, and neglectful of personal hygiene from time to time.

Napoleon - intelligent, visionary, motivated, risk-taking, self-confident, but with delusions of grandeur, ruthlessness, and restlessness.

Einstein - extremely high intelligence, visionary ideas, high motivation, self confidence, but had obsessiveness, lack of empathy and social skills, grandiosity, and personal hygiene neglect.

Which brings me to the present day, where in some respects things have flip-flopped between men and women, at least in America.  According to a recent article in Time Magazine about the "Sheconomy," young single urban women outearn young single urban men, and while 35% of women aged 25-29 have a college degree, only 27% of men do.  There is something going on (socially, culturally, environmentally?  I could barely hazard a guess) that is making it easier for women to remain focused on long-term productive educational goals (a dopamine trait) in their youth.  Women now own 1/3 of the businesses in America.  As women still pay a major career penalty for having kids, and previous generations still control the Fortune 500, women have not cracked the higher echelons quite yet.  But the Generation Y numbers could indicate that it is only a matter of time.  And women still have the estrogen/serotonin/norepinephrine advantage of being better able to read social cues, which is a helpful trait in sales, business, and even politics if coupled with enough grandiosity.

In an agricultural world, vision, ruthlessness and sociopathy allowed for one man to grab all power at the expense of his neighbors, creating kings.  In a post-industrial world, the hyperdopamine advantage may not be quite so simple.

Saturday, December 4, 2010

Rant

Pubmed is awesome. It makes us all into research giants from home. I've even made my own username and password in "My NCBI" and programmed in some automatic searches, so the first Saturday of every month (today, as a matter of fact), Pubmed sends me some emails about the latest research pertaining to my keywords of interest. Today I received several emails, one of which contained a link to this piece of crap review paper from some unfortunate primate researchers in Oregon, "Perinatal Exposure to High-Fat Diet Programs Energy Balance, Metabolism and Behavior in Adulthood."

In my endless optimism, I think, hey, maybe here is a paper that discusses some neurobiological programming related to the quality of the diet. That would be awesome! But no! Do me a favor and take a gander at the abstract. Here, I'll copy a couple of sentences for you: "Evidence from a variety of animal models including rodents and nonhuman primates indicates that exposure to maternal high-fat diet (HFD) consumption programs offspring for increased risk of adult obesity. Hyperphagia and increased preference for fatty and sugary foods are implicated as mechanisms for the increased obesity risk." The takeaway point of the article is that if you eat a "high fat" diet in pregnancy, you will have fat, depressed offspring.

Now this is a review paper. I read the whole thing. There is next to no information on the actual macronutrient content of any of the diets studied (in rodents and primates). However, there is a seeming equation of "junk food diets" with "high fat diets" that sets my teeth on edge. And I've read enough Hyperlipid to know that a typical research "high fat" diet for rodents is composed of sugar and Crisco or some other trans fat industrial nightmare.

So if one were, say, a journalist, without knowing the context, who had the same "My NCBI" keywords programmed in that I do, one would find this review paper and pen some misinformed news item telling everyone to avoid fat while pregnant because it will make your kids fat and depressed. So pregnant women will eat their whole wheat toast with an apologetic smear of margarine and consider themselves healthy, and wonder why they fail the oral glucose tolerance test at 24-28 weeks.

JUNK FOOD does not equal HIGH FAT DIET.

Junk food is loaded with vegetable fat, grains, and sugar, all cheap commodities to make cheap, industrial, disgusting "food."

If you load up on junk food during your pregnancy, studies of rodents and nonhuman primates indicate that your kids will be more likely to be obese and crave junk food, and will be depressed.

It is not about the macronutrients. Not for children with fresh pancreases and livers. It's about the quality. Give them nutrient-rich, fresh, non-processed amazing food. Forget about whether it is high fat or high carb. It doesn't matter. Just don't poison them with omega 6, trans fats, grains, and industrial fructose.

Well, I have to finish cleaning up before the babysitter comes over so my husband and I can enjoy a proper holiday party. How is it that a 29-pound 18-month-old can undo my organization much faster than I can organize? #entropy.

Friday, December 3, 2010

Brain Efficiency, Pediatric Edition

Back in November, I wrote a post titled Brain Efficiency that detailed some of the links between mitochondrial dysfunction and Parkinson's Disease.  The mitochondria are the energy powerhouses of the cells, cranking out ATP (cell gasoline) to keep pace with all our cells need to do. 

Classically, mitochondrial dysfunction was felt to be relatively rare, and it was usually investigated in cases of chronic fatigue, unexplained muscle weakness, that sort of thing - we medical types are a literal lot.  "Goodness, you have no energy?  Maybe we should see if your cells can make energy."   

However, my readers know that that pile of gelatin quivering between your ears is one of the most energy-hungry parts of the human body.  It is 5% of our body weight, but accounts for 20% of our metabolism - gram for gram, about four times the body's average energy consumption.  Therefore, any genetic predisposition to dysfunctional mitochondria may show up as a brain problem.

Earlier this week, this article was published in JAMA: "Mitochondrial Dysfunction in Autism."  This is the first study to examine the function of mitochondria in a well-defined population of children with autism, a disorder that strikes in infancy to early childhood and can result in poor social skills, developmental delay, and stereotypical repetitive movements, among other symptoms.  It was a small study - 10 children diagnosed with full spectrum autistic disorder and 10 controls - but they examined everything, soup to nuts, as it were.

The children and their cells were examined for problems with mitochondrial DNA, the actual energy-generating capacity of their actual mitochondria (lymphocytes* were put on ice and immediately taken to a lab to measure the respiration!), and for signs of leftover metabolic garbage hanging around.  The results were pretty remarkable. 


Cell respiration (the mitochondrial capacity to take glucose (or ketones!) and oxygen and turn them into energy) can be measured by the amount of fuel input and the output of the byproducts of respiration.  I'm not the most mechanical of people, but I imagine measuring cell respiration is something like the automotive equivalent of measuring horsepower.  Some of us have Ferraris, others Ford Festivas, and most will be somewhere in between.  The autistic kids had lower average NADH oxidase activity - their average was 4.4 (95% CI, 2.8-6) as opposed to 12 (95% CI, 8-16) in the control kids, and the majority of the autistic kids had levels below the range of the control kids.  The mitochondria of the autistic kids seemed to putter along, compared to the more zippy mitochondria of the control kids.
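
For the quantitatively minded, here's a minimal back-of-the-envelope sketch (in Python, purely for illustration; treating the reported confidence intervals as symmetric and roughly normal is my assumption, and the paper's raw data would be the real test) of just how far apart those two group averages sit:

import math

# Recover a standard error from a symmetric 95% confidence interval:
# half-width = 1.96 * SE, so SE = (upper - lower) / (2 * 1.96).
def se_from_ci(lower, upper, z=1.96):
    return (upper - lower) / (2 * z)

autistic_mean, autistic_se = 4.4, se_from_ci(2.8, 6.0)   # values from the study
control_mean, control_se = 12.0, se_from_ci(8.0, 16.0)

# z-statistic for the difference of two independent group means
diff = control_mean - autistic_mean
se_diff = math.sqrt(autistic_se**2 + control_se**2)
print(f"difference = {diff:.1f}, SE = {se_diff:.2f}, z = {diff / se_diff:.1f}")
# Prints roughly: difference = 7.6, SE = 2.20, z = 3.5

A z of about 3.5 means the group averages are separated by well over three standard errors, which fits the observation that most of the autistic kids fell below the entire control range.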

Now let's look at some metabolic byproducts of slow or inefficient cell respiration.  Pyruvate levels were higher in the autistic kids than in controls (0.23 vs 0.08 - pyruvate should be relatively low if your mitochondria are efficiently processing oxygen and glucose), and 8 out of the 10 autistic kids had pyruvate levels higher than any of the controls.  This finding matched the decreased pyruvate dehydrogenase activity found in the autistic kids.  Levels of hydrogen peroxide were also higher in the autistic kids.

At a genetic level, the kids with autism had a lot more copies of mitochondrial DNA in their cells.  (Our mitochondria probably evolved from energy-producing bacteria that another ingenious and cheeky cell gobbled up long, long ago to create its own internal power plant.  Therefore our mitochondria, within our big old animal cells, have their own DNA, called "mitochondrial DNA."  This mDNA (also sometimes called mtDNA) is inherited from our mother, and her mother, and her mother, etc., back to that first precocious gobbling cell, as we get all our cell organelles from our mother's egg and only a bit of good old human DNA from Dad.)  An average human mitochondrion has 2-10 copies of its DNA hanging around.  Five of the autistic kids had more mDNA than expected, and two of the autistic kids also had deletions in certain areas of their mitochondrial DNA.

So what does all this data mean?  It was felt, all told, that the autistic children had cells that were in a state of chronic oxidative stress.  This would explain not only the respiration issues, but also the higher copies of mitochondrial DNA, made either due to errors from free radical damage, or to compensate for the inefficient mitochondria.  But don't jump to conclusions.  We don't know if mitochondrial dysfunction is the cause of autism, or one of the myriad effects.  Maybe the kids were born with Ferraris, but adulterants in the fuel cause them to run like Ford Festivas.  It also makes sense - since we know the brain needs efficient mitochondria motoring along to keep all those ion gradients that power thinking online - that inherited defects in mitochondria could leave one more vulnerable to developmental insults and problems as the brain forms.

This finding could relate to modern diets and habits in all sorts of ways.  It occurs to me that one stand-out epidemiologic link to increased rates of autism is in kids whose moms had gestational diabetes, so presumably higher glucose, insulin, and other neuronal hardships for the developing baby.  We also know that ketosis helps mitochondrial efficiency and promotes neurogenesis and neuronal repair - vitamin D also has a role in promoting neuronal repair.    And inflammation in general would require mitochondria to be in tip top shape to keep up with the metabolic requirements and clean-up. 

I love it when a little more information comes along in real time to add a piece to the puzzle.

*Lymphocytes are cells of the immune system, easily sampled from a simple blood test, compared to painful muscle biopsies or scary brain biopsies.  Since lymphocytes use equal amounts of glycolysis and oxidative phosphorylation to make energy, it was felt they would be a fair tissue to use to measure the oxidative capacity of the cells of the autistic children versus the control children.