Angela A Stanton, PhD

Angela A Stanton, PhD, is a neuroeconomist who studies how hormonal variations in the brain affect behavior, chronic pain, and decision-making. She lives in Southern California. Her current research focuses on the causes, prevention, and treatment of migraine without the use of medicines.

A migraineur herself, she advanced her discovery in part by experimenting on herself.

She found the cause of migraine at the ionic level: a disruption of electrolyte homeostasis resulting from genetic mutations of insulin and glucose transporters and of voltage-gated sodium and calcium channels. Such mutations cause major shifts in a migraine brain that do not occur in a non-migraine brain. A non-migraineur can handle electrolyte changes on autopilot; a migraineur must always be on manual guard for such changes to maintain electrolyte homeostasis.

The book Fighting The Migraine Epidemic: How To Treat and Prevent Migraines Without Medicines - An Insider's View explains why we have migraines, how to prevent them, and how to stay migraine (and medicine) free for life.

Because of the success of the first edition and of new research and findings, she is now finishing the 2nd edition. The 2nd edition aims to be the “holy grail” of migraine, incorporating everything known at the moment along with some hypotheses. It includes an academic research section with suggestions for further research. The book is fully referenced, so that interested readers can follow up on her statements and so that the citations can spark further research.

While working on the 2nd edition of the book, she also published academic articles:

"Migraine Cause and Treatment," Mental Health in Family Medicine, November 23, 2015 (open access)
"Functional Prodrome in Migraines," Journal of Neurological Disorders, January 22, 2016 (open access)
"Are Statistics Misleading Sodium Reduction Benefits?," Journal of Medical Diagnostic Methods, February 3, 2016 (open access)
"A Comment on Severe Headache or Migraine History Is Inversely Correlated With Dietary Sodium Intake: NHANES 1999-2004," Headache, July 19, 2016, DOI: 10.1111/head.12861 (not open access; membership required to read it)

Dr. Stanton received her BSc in Mathematics at UCLA, her MBA at UCR, her MS in Management Science and Engineering at Stanford University, and her PhD in Neuroeconomics at Claremont Graduate University, and she earned fMRI certification at Harvard University Medical School's Martinos Center for Neuroimaging, where she experimented with neurotransmitters in human volunteers.

For relaxation, Dr. Stanton paints and takes photographs. Follow her on Twitter at @MigraineBook.

Genetics and the Migraine Brain: Mutation, Adaptation, or Variance?


Fitness and survival are by nature estimates of past performance.

George Wald

Reconsidering Genetic Mutations: What is Baseline?

Each person possesses a set of genes that belongs to that person only. It is not possible to state what the “baseline” human genetic makeup is; we can only look at phenotypes (observable characteristics or traits) and at the genetic makeup of what, on average, we are today. We use the average of these genetic commonalities to establish the baseline from which each person differs to some degree. How we interpret what we end up with by such averaging is the question of this paper. In my view, the genetic commonality of today's humans is not a baseline, since treating it as one ignores everything that came before the current form; and if an earlier form shows up as a result of atavism, we consider that a mutation (in a negative sense) away from the “norm” that needs to be medically treated. There is something wrong with this.

What counts as baseline in genetics is an important question. It determines how we view genetic differences, e.g., whether those differences represent negative changes, positive changes, or no change at all. This question came up in my research while looking at genotype and phenotype variances associated with migraines. Migraine is considered a neurological disorder and is treated with dangerous drugs that do not cure migraine but only try to reduce its symptoms. These medications often leave migraineurs with permanent side effects.

Migraineurs are very sensitive to their environment because their sensory organs (nose, eyes, ears, touch, taste) are more sensitized. Such hypersensitive sensory organs and the associated sensitivity (1) are considered disorders in childhood and phantom hallucinations in adulthood (2) that need medical treatment. Yet these are not hallucinations. Hypersensitive sensory organs are more sensitive to stimulation and detect smells, sounds, sights, tastes, and touches more vividly than organs without such neurons. Unfortunately, migraine also comes with disabling prodromes, pain, and postdromes that place it in the disorder category. While research focuses on medicines to reduce or prevent the pain, why not focus on the cause of and reason for the pain? After all, this highly sensitized brain must have yielded evolutionary benefits to have become the 3rd most prevalent illness in the world, representing 15% of the (diagnosed) population.

Over the many years of my migraine research, I have been able to teach thousands of migraineurs (myself included) to induce, abort, and prevent migraines without the use of any medicines. If a condition can deliberately be induced, aborted, and prevented, it cannot be a disease. It certainly comes with a host of genetic differences from those without migraines, but differences need not be disorders. This prompted me to ask a very important chicken-or-egg question: did the hypersensitive, “on alert” brain of migraineurs come first, or the less sensitive brain of non-migraineurs? This question does not aim to establish what is normal, but it certainly aims to establish the baseline. If the baseline brain is the hypersensitive brain (my hypothesis), then why not learn to feed it the nutrients it needs so that it experiences no pain?

We consider evolutionary adaptations to be those genetic changes that improve the survival chances of the organism. A brain that smells, hears, and sees better confers a survival advantage. In ancient times, it was important to smell a lion in the bush from far away, or to sense weather changes from a shift in barometric pressure or the movement of a single leaf, even if the actual change was distant in time or place. The hypersensitivity of the migraine brain is similar to the hypersensitivity of the mammalian brain, where not being alert is equal to becoming food. Alertness is not just mammalian; all wild creatures are hypersensitive to their environment. Based on these mammalian facts, my theory is that the brain of migraineurs is the original brain, and that those without migraines had the good fortune to adapt to changing life through a series of genetic variances that became adaptations. Does genetics support my hypothesis?

Mutation, Adaptation, Variant, Trait, and Epigenetics

In the GeneCards database, one can search for the many genes associated with traits and health conditions. However, a gene may be associated with many unrelated traits and health conditions. There are several genes connected to migraines—each of these genes has a different representation in those without migraines. The trick is to understand what the differences may mean and how those manifest themselves in a population.

The word mutation is often used to describe genetic changes, although change has a neutral meaning whereas mutation does not. “A Mutation occurs when a DNA gene is damaged or changed in such a way as to alter the genetic message carried by that gene.” (source) A mutation is permanent, and it is either damage or adaptation. Damage, of course, is a mutation with negative consequences for the organism. Adaptation is a long-term change in the behavior, physiology, and structure of an organism such that it becomes suited to its environment, with the acquired traits passed on to future generations. An adaptation is thus an improvement in the fitness of the organism; adaptations imply evolutionarily beneficial changes to the species. Major changes are made up of many small changes that accumulate over time and collectively create an adaptation or a mutation.

These small changes, variations, occur all the time in each of us. Many modern variations (also called variances or variants) lead to negative consequences, because they may modify the fitness of the species without becoming an adaptation or a mutation. Variances are substitutions of a single nucleotide in the DNA sequence (adenine A, cytosine C, guanine G, or thymine T), such as an A changed to a C. Variances are often referred to as variants or SNPs in the medical literature. A single variant may or may not bring about any change, depending on its inheritance mechanism: in some conditions a single SNP can make a difference (autosomal dominant), while a recessive one may not. A collection of SNPs may lead to specific traits that are observable as physical attributes (such as hair color or height).

Many modern health conditions, such as diabetes mellitus (type 2), are made up of many thousands of SNPs and are passed to future generations without ever becoming an adaptation or a mutation. This is a health concern that evolves as a result of epigenetic forces acting on the SNPs. Epigenetics greatly influences SNPs and can cause an endless variety of changes based on the chemicals available in the gene's environment. Epigenetics need not exert its influence from the external environment; it can equally (and more likely) do so from internal stressors, such as insulin resistance in response to carbohydrate consumption by those who are sensitive to or cannot digest glucose or fructose.

The Case of Red Hair: Mutation? Adaptation? Variance?

The MC1R gene variant, which some of us carry, is important for the red hair trait. Is red hair a mutation, an adaptation, a variant, or none of the above? If MC1R existed in our ancient relatives that are now extinct, such as the Neanderthals, then it cannot be considered a mutation in modern humans, since it appeared before modern humans existed. As it turns out, the Neanderthals had the MC1R red hair variant, and thus the MC1R variant is part of our human baseline (3). Red hair does not come from the MC1R gene alone: variants of 1,994 genes participate in the creation of red hair (4). What matters here is the length of time that has passed since the first known appearance of this gene's variation, together with the number of variances it takes to create red hair. Older traits that were present in our ancient relatives, and are still with us today, have very few variances; they are stable.

The question then is as follows: was MC1R a prehistoric mutation some of us still carry? If so, its classification as a mutation in modern human genetics is incorrect. MC1R existed before modern humans, and therefore it is not a mutation but a baseline trait. Those who do not carry the MC1R variant are perhaps the ones who mutated, adapted, or changed. Red hair is still found today, but no medical authority suggests that we should medicate against red hair.

It appears that the more ancient a trait, the slower its variance rate (fewer variants), indicating that the state closer to baseline is the one with fewer variants: traits that are more ancient have fewer variants either because they are more stable or because they represent a sub-population with a common ancestor. While this statement may find many opponents, the EDAR gene study, which searched for why one East Asian sub-population has different traits from the rest of East Asia, noted that it

“is striking that there are only 35 nonsynonymous variants in our entire list of candidates. Based on the genomic coverage of the 1000G data, we estimate that there are no more than 38 candidate causal nonsynonymous SNPs in the 412 candidate selected regions we analyzed… These data suggest that only a minority of recent adaptations are due to amino-acid changes and that regulatory changes are likely to play a dominant role in recent human evolution.” (5)

The EDAR gene, a single gene, with its SNPs effected a multitude of changes in a population, providing the East Asian sub-population with strong hair, smaller breasts, different sweat glands, and a host of other differences over 10,000 years ago. In this case, a single functional genetic variant was deleted from the genome, and that brought about a major change acting on many traits. A variant like this is neither a mutation nor an adaptation. On the other hand, there are conditions in which hundreds or thousands of SNPs are required to bring about a single change, such as cleft palate (1,855 genes), possibly because this is an undesired trait for fitness and many dominoes must fall into place to inherit the entire set of variances that ends in cleft palate.

This strongly suggests that modern diseases (Alzheimer's, cancer, etc.) are not easy to replicate genetically and pass on to future generations. To do so, epigenetics must push its force of influence on many genes to produce variants. Epigenetics can become an undesirable force through, for example, what we eat, how much sleep we get, or what medicines we take. Our modern diet is oversaturated with sugar and grains, placing giant pressure on our immune system; neither sugar nor grain metabolism was part of our ancient, original set of genes, so they force variances that can be random. The ensuing health conditions make the roulette wheel spin, stopping anywhere and causing SNP-based changes.

Modern diseases have a very large number of SNPs that are disadvantageous, so they cannot be adaptations (cancer, Parkinson's), whereas adaptive changes have few SNPs (source). This makes a lot of sense: changes that are advantageous to the species need to be stable in order to pass easily from generation to generation, whereas changes that are disadvantageous would be selected with a very large number of SNPs, such that passing on the trait becomes difficult. For example, research found that the lactose tolerance adaptation occurred in several human groups around the globe about 3,000-7,000 years ago, independent of whether a particular group drank milk or not. Lactose tolerance (lactase tolerance) and lactase persistence have only 20 and 26 SNPs, respectively. Contrast this with breast cancer and its 7,487 SNPs, a modern disadvantageous condition best not passed on to future generations.
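To make the counting heuristic concrete, here is a minimal sketch in Python. The SNP counts are the ones quoted in this article (from GeneCards searches at the time of writing); the ranking itself is only an illustration of the reasoning, not an analysis tool.

```python
# Under the heuristic above, fewer associated SNPs suggests an older,
# more stable trait. Counts are the ones quoted in this article.
snp_counts = {
    "lactose tolerance": 20,
    "lactase persistence": 26,
    "breast cancer": 7487,
}

# Sort ascending: the smaller the count, the closer to "baseline"
# the trait is under this heuristic.
for trait, count in sorted(snp_counts.items(), key=lambda kv: kv[1]):
    print(f"{trait:20s} {count:5d} SNPs")
```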

With all these complications, how do we establish whether a trait is ancient or modern? And how do we know whether it is a mutation, a variance, or an adaptation? This is a much more complex question than it appears, and the answer is not straightforward at all.

Genetic Variance and Genetic Baseline

There are conflicts within the theory of variants when it comes to establishing the default, the baseline. For example, a study suggests that brown eyes are the default, from which the changes went in the following direction:

brown (904) ==> green (781) ==> blue (1092)

where SNP numbers are in parentheses. Note that the number of SNPs would hint that green eyes may have been the default; going by SNP counts, the most stable trait would set the baseline and give rise to variances from there on:

green (781) ==> brown (904) ==> blue (1092).

The interpretation of what an SNP means is not clear to everyone, nor is it clear how to interpret SNPs when some point to one thing and others point to another. The only thing we can reliably state is that the SNP count hints at the age of the variance; we cannot draw any other conclusions.

Is the Migraine Brain a Mutation or the Baseline?

Returning to the subject of my interest, the similar traits of migraineurs: is migraine a mutation (as it is referred to, and medicated as a result), or is it the baseline of the human brain, inherited from the brains of our ancient ancestors? My answer is that migraine is the baseline, because of its small number of variances: migraine has 1,293 SNPs. You need to be a member of the Facebook migraine group to appreciate how much migraineurs are like siblings. Sometimes we run surveys.

Two recent surveys:

  1. Ehlers-Danlos Syndrome (EDS), which is associated with hypermobility and vascular differences, was a fascinating find. EDS is very rare, found in only 1 out of 5,000 (0.02%) people in the US. Yet over 60% of the surveyed migraineurs have at least one type of EDS (there are several types). EDS has 121 SNPs, and the overlap with migraine SNPs is significant: 43% (52 out of 121) of EDS SNPs are also found among the migraine SNPs. The conditions must be related, perhaps through a common ancestor.
  2. Raynaud's syndrome is also relatively rare; it is present in 1 out of 20 people (5%) in the US. Yet over 70% of the migraineurs surveyed have Raynaud's. The SNP count for Raynaud's is 88, and 68% of the SNPs in Raynaud's are part of the migraine SNPs. The two conditions must therefore also be related, perhaps through a common ancestor here too.

EDS and Raynaud’s also overlap with each other; 16% of the SNPs in Raynaud’s are also in EDS and 12% of the EDS SNPs are found in Raynaud’s. It appears that the two conditions are also related and both are related to migraine. While I can certainly explain the reason for the evolutionary benefit of the genetic adaptations for both Raynaud’s and EDS, I have a different point to make here.
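For readers who want to reproduce the arithmetic, here is a minimal sketch of how such overlap percentages are computed. The rsIDs below are hypothetical placeholders; a real comparison would use the SNP lists pulled from GeneCards for each condition.

```python
# Percent of the SNPs in set `a` that also occur in set `b`.
def overlap_pct(a: set, b: set) -> float:
    return 100.0 * len(a & b) / len(a)

eds = {"rs101", "rs102", "rs103", "rs104"}       # hypothetical EDS SNPs
migraine = {"rs102", "rs103", "rs201", "rs202"}  # hypothetical migraine SNPs

print(f"{overlap_pct(eds, migraine):.0f}% of EDS SNPs are also migraine SNPs")
# With the counts reported above: 100 * 52 / 121 is approximately 43%.
```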

Note that Raynaud's phenomenon has 88 SNPs, EDS 121, and migraine 1,293. These are small, stable numbers (blue eyes have 1,092!), indicating how ancient these conditions are and that they may indeed be related to one another. After my lengthy introduction, this is my point: Raynaud's, EDS, and migraine are not mutations but variances that over time turned into adaptations, which in ancient times served a benefit. The percentage of people afflicted with migraine is too large, and they share too many similar traits, for migraine to be considered a random mutation. If the hypersensitive migraine brain is an adaptation that serves an evolutionary benefit, why do we medicate it rather than feed it the way an ancient brain would have been fed? In the ancient times when the migraine brain, EDS, and Raynaud's were important and beneficial, there were no soft drinks, smoothies, cakes, breads, and pastas on every corner. We need to respect who we are and eat for our brain.

Conclusion

Migraine is specific to a population subgroup with ancient traits that were essential in our ancient past. Today, many of the benefits of a migraine brain are troublesome (hypersensitive sensory organs that use more voltage), and migraine is associated with glucose (6) and fructose intolerance that lead to migraineurs' metabolic syndrome (7-10). By removing glucose and fructose from the migraineur's diet, migraine can be completely prevented. We need to re-evaluate the practice of medicating migraineurs to downregulate their hypersensitive sensory organs. Migraine pain is fully preventable without any medicines by simply feeding this ancient brain ancient types of food!

 


This article was published originally on April 6, 2017. 

References

  1. Schwedt TJ (2013) Multisensory Integration in Migraine. Curr Opin Neurol 26(3):248-253.
  2. Coleman ER, Grosberg BM, & Robbins MS (2011) Olfactory hallucinations in primary headache disorders: Case series and literature review. Cephalalgia 31(14):1477-1489.
  3. Lalueza-Fox C, et al. (2007) A Melanocortin 1 Receptor Allele Suggests Varying Pigmentation Among Neanderthals. Science 318(5855):1453-1455.
  4. Weizmann Institute of Science (2017) GeneCards: The Human Gene Database. Weizmann Institute of Science, www.genecards.org.
  5. Grossman SR, et al. (2013) Identifying Recent Adaptations in Large-Scale Genomic Data. Cell 152(4):703-713.
  6. Mohammad SS, Coman D, & Calvert S (2014) Glucose transporter 1 deficiency syndrome and hemiplegic migraines as a dominant presenting clinical feature. Journal of Paediatrics and Child Health 50(12):1025-1026.
  7. Salmasi M, Amini L, Javanmard SH, & Saadatnia M (2014) Metabolic syndrome in migraine headache: A case-control study. Journal of Research in Medical Sciences: The Official Journal of Isfahan University of Medical Sciences 19(1):13-17.
  8. Sachdev A & Marmura MJ (2012) Metabolic Syndrome and Migraine. Frontiers in Neurology 3:161.
  9. Bhoi SK, Kalita J, & Misra UK (2012) Metabolic syndrome and insulin resistance in migraine. The Journal of Headache and Pain 13(4):321-326.
  10. Guldiken B, et al. (2009) Migraine in Metabolic Syndrome. The Neurologist 15(2):55-58.

Ancient Tribes to Modern Babies: Whole Milk Is a Common Denominator


Three seemingly random, disparate observations merged into one coherent thought in my mind the other day. The first observation came from a 4-part Amazon Prime documentary series titled “Tribes”. The second was a sentence that stopped me in my tracks:

In the 1950s Arild Hansen showed that in humans, infants fed skimmed milk developed the essential fatty acid deficiency. It was characterized by an increased food intake, poor growth, and a scaly dermatitis (Wikipedia).

I bumped into this as part of my research during the third observation, which was my discovery that monounsaturated fatty acids, such as those in olive and avocado oil, however important and healthy we are told they are for us to eat, are neither essential nor particularly healthy.

These three seemingly independent points connect at a very important aspect of our lives: what food is important and/or essential for us humans to eat, what and how our ancestors ate, and the puzzling sentence about babies experiencing two of the most common modern conditions: overeating and scaly dermatitis. And we know that these conditions also afflict a large percentage of the adult population.

I start by reviewing the four documentaries I watched, then discuss the fatty acids, and conclude with the babies.

Tribes and Milk

The documentary visited four still-existing native tribes living in their ancestral ways on their ancestral lands today, retaining as much of their traditional culture as possible. Each tribe is in Africa, in or near Kenya. These are the Pokot (Children of the Nile in Bogoria), El Molo (Phantoms of the Lake), Turkana (Vampires of the Desert), and Rendille (Shadow Hunters).

Of these four tribes, none practices agriculture. They are nomadic or semi-nomadic, and three out of the four herd their own animals, including goats and cattle; one tribe also herds camels. None of the tribes is permitted to hunt, except that the El Molo are allowed to fish, and if they catch a crocodile, they can eat it. The El Molo do not herd.

Of these four tribes, I saw only the Pokot consume any carbohydrates (cornflour cooked with water), and those were consumed only by the women and only on one night, when the circumcision of a 15-year-old boy takes place and the women are not permitted to eat meat. None of the other tribes consumed any plants, fruits, vegetables, or nuts under any other circumstances, although they did make herbal concoctions for the sick to consume.

Of the four tribes, the El Molo live next to a salt-water lake where nothing grows, so their staples are fish and the occasional crocodile. They do some trading with the herding tribes; in one scene, dried fish was exchanged for milk.

The herding tribes live exclusively off meat, blood, and milk. In fact, their daily staple is milk. They eat meat only when they slaughter their own animals, which typically occurs at times of celebration. They also drink blood by letting the animals bleed without killing them, taking only as much blood as they need for sustenance.

Why Milk?

Milk is a most nutritious food: this is why babies suckle. Milk contains all essential nutrients, vitamins, and minerals. Babies that nurse on real mother's milk grow fast, are “chubby” healthy, and receive plenty of protection against various diseases.

The energy cost for human brain growth, which reaches 70% of all the energy the mother ploughs into her fetus during the brain growth spurt, is guaranteed by fat stores unique to humans amongst the primates.

If you ask, most modern doctors will tell you that milk is not healthy. Most people will tell you that drinking milk as an adult is unnatural and that, in nature, milk drinking does not continue past infancy. The list of anti-milk tirades is endless. And this seems to be supported by the numbers: according to statistics, 65-70% of adults become lactose intolerant as they grow into adulthood. So it was fascinating for me to watch these four tribes, in which all members, from newborns to the very old, lived off milk.

It is easy to accept that tolerance for lactose declines once someone stops drinking mother's milk, because keeping enzymes ready to work when there is no need for them is an inefficient allocation of energy that could be spent elsewhere in the body. But what if the problem lies somewhere else, such as in the cow milk protein variant A1 (associated with a mutated cow gene), which often causes gas, bloating, and other discomforts attributed to lactose intolerance? The milk these tribes consume contains only the A2 milk protein, which does not trigger any discomfort in humans. Might the secret of lactose intolerance simply be intolerance of the A1 protein?

Essential Versus Important

Genomic research has shown that DHA is responsible for the transcription of over 170 genes required in brain development.

Before I jump into expanding on the subject of the essential nutrients in milk, and why whole milk in particular is of the highest value, I need to cover the confusing meaning of “essential” when we discuss nutrients. In everyday parlance, “essential” means “we must have it or else”, but when it comes to nutrition, we have been bombarded with all kinds of confusing messages. For example, we have been told that eating whole grains is essential and eating meat is not. Yet over the history of human evolution, the consumption of grains appeared as a staple only in the last approximately twelve thousand years, when humans started to develop agricultural practices, while the consumption of meat had been the staple for millions of years. So it is not only illogical but absolutely incorrect to suggest that something humans evolved on, and that helped them become who they are today, is not essential, let alone harmful.

The true meaning of “essential” in nutrition is “you must eat it because the body cannot create it”. That is all. Let me elaborate with a couple of examples. When something is extremely important for survival, evolutionary processes have ensured that we can make it ourselves. So, for example, cholesterol and saturated fatty acids are so important that the human body has created and retained the ability to make them come rain or shine, no matter what we eat.

Basically, the ability to make cholesterol and saturated fatty acids is what allowed humans to survive on an otherwise nutritionally weak grain diet. Although grains have plenty of nutrients in them, these nutrients are not made available to humans by our digestive processes. Human digestion is perfect for eating animal products but incapable of digesting plant fiber; yet whatever nutrients are in plants are locked in the fiber.

The human body is also extremely efficient, and unnecessary functions are removed. Thus, nutrients that humans have always had access to in their diet, such as omega-6 and omega-3 (polyunsaturated) fatty acids, our body does not create, and we must consume them, albeit in very small amounts. According to the NIH, the recommended daily allowance (RDA) of omega-3 fatty acids (O3) is 1.1-1.6 g for adults, varying by age and gender. The omega-6 fatty acid (O6) RDA is 11-17 g a day, also varying by age and gender. However, there is no agreement on the ratio of O6 to O3. While the RDAs appear to suggest a ratio of over 10:1, many experts disagree and recommend anywhere between 1:1 and 4:1. Hence we have no consensus on how much of either of these essential fatty acids we must consume, though we all agree that the amount is minimal.

What we do know is that animal fatty acids, particularly from grass-fed and well-cared-for animals, contain ample amounts of O6 and O3. Plants, on the other hand, do not contain much O3, and whatever little they do contain is in a precursor (ALA) form, which cannot be used by humans without conversion. Humans are unable to convert ALA to DHA, the animal form, efficiently: studies show that the rate of ALA conversion to DHA is between 0% and 4% in young men and up to 9% in young women. With age, this conversion rate decreases further.
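To put those conversion figures in perspective, here is a back-of-the-envelope sketch. The 1.6 g intake is the upper NIH RDA figure mentioned above; the conversion rates are the ones quoted from the studies, and the exact numbers are illustrative only.

```python
# DHA yield from a given ALA intake at the quoted conversion rates.
ala_intake_g = 1.6  # upper NIH RDA figure for adult omega-3 (ALA)

for label, rate in [("young men, low end", 0.00),
                    ("young men, high end", 0.04),
                    ("young women, high end", 0.09)]:
    dha_mg = ala_intake_g * rate * 1000
    print(f"{label:22s} -> about {dha_mg:4.0f} mg DHA from {ala_intake_g} g ALA")
```

Even at the most favorable quoted rate, 1.6 g of plant ALA yields well under 150 mg of DHA, which is why the animal forms come out as the clear winner in the next paragraph.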

Only one type of O6 and one type of O3 fatty acid are essential. For O6 we must consume linoleic acid; for O3 we have a choice of three alternatives: ALA (plant form, precursor to DHA, discussed in the previous paragraph), EPA (animal form, precursor to DHA), or DHA (the final animal form). So the animal form of O3 is the clear winner: it is found in fish in large amounts, and to some degree in all animal meat, eggs, and milk, particularly organic milk.

Milk As a Nutritious Food

If you ask me about milk as food, I will tell you that milk is a super-food, rich in essential fatty and amino acids (proteins), low in carbohydrates, and a rich source of electrolytes, vitamins, and minerals; I call milk the “perfect meal”. I often have just milk (sometimes a bit of cream added) as a whole meal.

Finding information on the milk types these tribes drank is not easy in any Western database.

The camel milk data in the table below is an average across the many camel types listed. For goat milk and for whole cow milk, the data was taken from the USDA database.

Milk (100 g)    | Camel | Mature Human | Cow, Whole | Goat, Whole
Fat             | 3.83  | 4.38         | 3.25       | 4.14
Protein         | 3.64  | 1.03         | 3.15       | 3.56
Carbs (lactose) | 4.68  | 6.89         | 4.8        | 4.45

While this table does not show the O6:O3 ratio, it does show that milk from farm animals is an excellent source of nutrients and is quite similar in nutrient content to mature human milk. Note also that mature human milk contains more fat than any other milk listed in the table. Fat is healthy, and it enables newborn babies' fast brain development. Since the brain is mostly fat and cholesterol, drinking high-fat milk with high cholesterol is a great source of nutrients for human babies.

Some Milk Problems

With all the good things I have noted about milk, it can also have downsides. Some babies are lactose intolerant, and as a result they are unable to nurse. Obviously, in cases like this, milk needs to be substituted with an appropriate and nutritionally equivalent fluid.

Some babies, as well as adults, complain of a stuffed-up feeling in the head, sinus pressure, or perhaps phlegm building up. These are reactions to the mutated milk protein in cow milk. Human, camel, and goat milk all contain the ancient A2 milk protein, but most cows around the Westernized world today produce milk that also contains a mutant A1 protein, so their milk protein profile is A1/A2. Milk cartons and bottles are usually stamped A1/A2 so you can tell what you are buying.

Should you find that you cannot tolerate this mutant protein, try milk that contains only the A2 milk protein. To this date, all goat milk is still A2, and it is readily available everywhere. In addition, all Guernsey and some Jersey cows produce A2-protein milk. There is also a brand called A2, which carries only A2 milk.

Raw milk contains many more nutrients than pasteurized milk, but it comes with health risks and is not legal in every state. Look around, consider trying A2-protein milk, and enjoy the healthy dose of fatty acids you receive.


The article was published originally on September 21, 2020. 


Are Vegan Diets Heart Healthy? A Case Study


What is the Best Diet?

I have been working with tens of thousands of migraine sufferers from all over the world for nearly 8 years. During this time, a great many migraineurs have learned to become migraine free by following nutritional and lifestyle changes, that is, the Stanton Migraine Protocol. The process of recovery starts with understanding the migraineur's health condition prior to implementing any changes. For instance, I request that all new members provide me with a specific list of blood tests.

Over the years, I have looked through a tremendous number of blood tests. Two of the standard items on the test list are fasting blood glucose and the lipid panel, a.k.a. cholesterol. More often than not, migraineurs need to coax their doctors into ordering these basic tests, because doctors are suspicious of anything their patients initiate.

So imagine my interest when one of my migraineurs suddenly popped into my Facebook group to post a 7-year history of her blood glucose and lipid panel results. What instantly caught my attention was how clearly her test results documented her stages of dietary change: first on the SAD (Standard American Diet); then on a vegan diet, dropping all meat and dairy; and finally on a very strict version of the vegan diet with 80% carbohydrates, 10% protein, and 10% fat. Let me show you what I mean.

Blood Glucose and Cholesterol Refresher

Most people are quite familiar with what cholesterol and blood glucose tests are, but there is confusion about what they really mean.

Blood glucose is often measured because it is an easy way (even at home) to check how one's body handles carbohydrates. Carbohydrates turn into glucose, and all plants and their processed derivatives contain carbohydrates. So foods such as bread, cereal, pasta, pizza, all fruits, and all vegetables (including salads, beans, and nuts) convert to glucose. Of course, sugar, honey, and any syrups and sweetened drinks all show up as an increase in blood glucose.

The ideal blood glucose range is 70-99 mg/dL at all times, though there is some allowance given for a short period of time after consuming a meal. It is recommended to never go past 140 mg/dL. Fasting blood glucose, tested on typical laboratory blood tests, is based on 10-12 hours of fasting, meaning not eating anything prior to the test, not even vitamins.

As already mentioned, fasting blood glucose between 70 and 99 mg/dL is healthy; between 100 and 125 mg/dL is considered prediabetic; and 126 mg/dL or over is diabetic (see here).
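As a quick reference, these bands are easy to encode; here is a minimal sketch (the cutoff values are the ones cited above, and the function name is merely illustrative).

```python
# Fasting blood glucose bands, values in mg/dL.
def classify_fasting_glucose(mg_dl: float) -> str:
    if mg_dl < 70:
        return "below the ideal range"
    if mg_dl <= 99:
        return "healthy"
    if mg_dl <= 125:
        return "prediabetic"
    return "diabetic"

print(classify_fasting_glucose(90))   # healthy
print(classify_fasting_glucose(112))  # prediabetic
print(classify_fasting_glucose(128))  # diabetic
```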

Cholesterol is a bit more difficult to interpret. The standard lipid panel that is supposed to test cholesterol actually cannot give any real information about cholesterol itself; the test lacks the kind of sensitivity that would be needed to measure cholesterol.

Cholesterol is a most essential substance in the human body, to the point that if we do not get enough of it from our food, we make what we need on our own. Since Wikipedia has a great description of everything you ever wanted to know about cholesterol, I discuss only the very basics here.

Cholesterol is a waxy substance that gives integrity to cellular membranes, is the building block of our hormones, and also makes up a very large part of our brain. Over 25% of all the cholesterol in your body is in your brain (see here), and cholesterol together with fat makes up over 65% of the brain (see here).

In the lipid panel blood test we can distinguish several elements: HDL, LDL, triglycerides, total cholesterol, and often some residual factors.

  • HDL: not cholesterol but a lipoprotein ball, a vessel, which contains cholesterol that it picks up from cellular activities. It is the “cholesterol recycler” and so is known as the “good cholesterol”, even though it is not cholesterol.
  • LDL: not cholesterol but also a lipoprotein ball, a vessel, which carries cholesterol to cells for support. In addition to cholesterol, it also carries triglycerides and all the fat-soluble vitamins, such as vitamin A, vitamin E, vitamin D3, and vitamin K2, all of them essential for our health. LDL is referred to as the “bad cholesterol”, though note that it carries important fat-soluble vitamins, so it cannot be all that bad.
  • Triglycerides: not cholesterol but fat, created by the liver as a form of storage for excess energy we have consumed, converted primarily from excess carbohydrates.
  • Total cholesterol: the sum of the good cholesterol HDL, the bad cholesterol LDL, and one-fifth of the triglycerides. Total cholesterol is not very meaningful; it may be high simply because the good cholesterol is high, and so it is often flagged for no reason.

You may have noticed that none of the lipid panel components is really cholesterol; this fact plays a large part in the general confusion about cholesterol.
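The total cholesterol arithmetic just described is simple enough to write down; here is a minimal sketch (values in mg/dL, and the function name is illustrative only).

```python
# Total cholesterol as reported on a standard lipid panel:
# HDL + LDL + one-fifth of triglycerides, all in mg/dL.
def total_cholesterol(hdl: float, ldl: float, triglycerides: float) -> float:
    return hdl + ldl + triglycerides / 5.0

# Example with round numbers: 60 + 120 + 100/5 = 200 mg/dL.
print(total_cholesterol(60, 120, 100))  # 200.0
```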

The current generally accepted guidelines recommend that for optimal health the following should be met:

  • LDL should be <100 mg/dL.
  • HDL should be >40 mg/dL for males and >50 for females.
  • Triglycerides should be <150 mg/dL.

There may be some variations on the above, depending on the laboratory where the tests are done.

From Good to Bad to Ugly

Start: The SAD Diet

Let us now follow the actual test results of our migraine group member through her dietary changes. Her starting test results (3/10/2014), while she was on the SAD diet, were:

  • Fasting blood glucose: 90 mg/dL, perfect.
  • HDL: 67, so way above minimum recommended, very good.
  • LDL: 139, slightly higher than recommended.
  • Triglycerides: 60, well within the recommended range, very good.

Just looking at these data, she was actually quite healthy at this time. The modern cardiovascular risk analysis is based on the triglycerides/HDL ratio (see here). This only works in mg/dL, so if your cholesterol test results are in a different unit, they need to be converted first. The cardiovascular risk is scored as follows:

  • Ideal: 0.6 to under 1
  • Good: 1 to under 2
  • Barely acceptable: 2 to under 3
  • Heart attack is waiting at the door: over 3

Our migraineur’s score was 60/67 = 0.895. As you can see, on the SAD diet her risk score was well within the best score possible. Nevertheless, her doctor—a cardiologist—still recommended a change to a plant-based (vegan) diet, seemingly based on the slightly elevated LDL test result.
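For those who want to check the arithmetic on their own results, here is a minimal sketch of the ratio and banding described above (inputs in mg/dL; the function name is illustrative).

```python
# Triglycerides/HDL risk score with the banding used in this article.
def cv_risk(triglycerides: float, hdl: float) -> tuple:
    score = triglycerides / hdl
    if score < 1:
        band = "ideal"
    elif score < 2:
        band = "good"
    elif score < 3:
        band = "barely acceptable"
    else:
        band = "heart attack waiting at the door"
    return score, band

score, band = cv_risk(60, 67)    # the SAD-diet values above
print(f"{score:.3f} -> {band}")  # 0.896 -> ideal
```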

Change in Diet

The Vegan Diet #1

It is difficult today not to hear and read everywhere how great the plant-based diet is. In fact, some European countries do not allow children to bring animal-based products for their school snacks (see here), and others have simply dropped all meat options from school canteens (see here). In the US too (and in 40 other countries) there is “Meatless Monday”, and there is also a huge push elsewhere for increased prices and taxes on meat products, referred to as a “sin tax”.

Vegetarianism and veganism are strongly encouraged in most of the Western world. People assume that they must then be good for health, but is that really so? Or is the benefit of the plant-based diet a myth?

Our migraineur followed her cardiologist's direction to a T. She dropped all meat and dairy and turned vegan... and then... well... her blood tests showed something unexpected.

Her values from the first, 4-year vegan period are as follows:

Date       | HDL  | LDL   | Triglycerides | CV Risk (triglycerides/HDL)
3/18/2016  | 56   | 113   | 105           | 1.875
7/20/2016  | 61   | 134   | 103           | 1.688
10/20/2016 | 64   | 152   | 131           | 2.047
5/17/2018  | 71   | 162   | 139           | 1.958
1/10/2019  | 65   | 176   | 119           | 1.831
Average    | 63.4 | 147.4 | 119.4         | 1.88

Since she started the vegan diet in order to improve (reduce) her LDL, let’s compare her cholesterol results on the SAD diet against the vegan diet for which I averaged the 5 tests:

Type  | HDL  | LDL   | Triglycerides | CV Risk (triglycerides/HDL)
SAD   | 67   | 139   | 60            | 0.895
Vegan | 63.4 | 147.4 | 119.4         | 1.88

As you can see, every marker that should have gone down (LDL and triglycerides) increased and the one expected to increase (HDL) dropped while following the vegan diet. Her cardiovascular risk score increased and was no longer in the ideal range, though it was still good.

Her fasting blood glucose was also recorded, and she shared her 5 test results:

Date       | Fasting blood glucose in mg/dL (ideal 70-99 mg/dL)
3/18/2016  | 88
7/20/2016  | 113
10/20/2016 | 116
5/17/2018  | 122
1/10/2019  | 122
Average    | 112.2

Her fasting blood glucose increased from the ideal 90 mg/dL she had during the SAD diet to a prediabetic average of 112.2 mg/dL over 4 years of the vegan diet. So her fasting blood glucose, too, moved exactly opposite to what the doctor was hoping to see.

The Vegan Diet #2

Consequently, the cardiologist told the migraineur that she needed to double down, because the markers had moved in the undesired direction. So the migraineur restricted her fat and protein to the absolute minimum of 10% each and increased her carbohydrates to 80% of her calories. She followed this extreme vegan way of eating religiously for nearly 3 years. Here are her 5 cholesterol test results for this period:

Date       | HDL  | LDL   | Triglycerides | CV Risk (triglycerides/HDL)
4/25/2019  | 70   | 173   | 111           | 1.586
5/15/2020  | 42   | 134   | 121           | 2.88
8/14/2020  | 46   | 134   | 121           | 2.63
12/4/2020  | 51   | 172   | 155           | 3.039
3/14/2021  | 52   | 146   | 158           | 3.0385
Average    | 52.2 | 151.8 | 133.2         | 2.552

So let us now compare her cholesterol on all three diets: SAD, vegan, and extreme vegan:

Type          | HDL  | LDL   | Triglycerides | CV Risk (triglycerides/HDL)
SAD           | 67   | 139   | 60            | 0.895 (ideal)
Vegan         | 63.4 | 147.4 | 119.4         | 1.88 (good)
Extreme Vegan | 52.2 | 151.8 | 133.2         | 2.552 (barely acceptable)

Wow! Everything was heading in the wrong direction! Her HDL, which should have increased, decreased further after she went nearly fat and protein free. Her LDL and triglycerides were supposed to decrease, but both increased, dramatically so on the extreme vegan diet. By far the worst news, though, was her cardiovascular risk score, which went from a completely normal level on the SAD to approaching serious risk of cardiovascular damage and possible heart attack! And all this happened by switching from a really unhealthy SAD diet to a “very healthy” vegan diet?

And what about her blood glucose after she tightened her belt and moved to the extreme vegan diet?

Date       | Fasting blood glucose in mg/dL (ideal 70-99 mg/dL)
4/25/2019  | 128 ==> Diabetic!
5/15/2020  | 104
8/14/2020  | 111
12/4/2020  | 115
3/14/2021  | 124
Average    | 116.4

Her fasting blood glucose went from excellent to bad to worse as she moved to the vegan and then the extreme vegan diet: from 90 mg/dL to 112.2 mg/dL to 116.4 mg/dL, meaning from healthy to prediabetic to worse prediabetic (closer to diabetic). On the extreme vegan diet she even recorded a clearly diabetic fasting blood glucose value (128 mg/dL) at one point.

These numbers, taken together, summarize the outcome of the three different ways of eating over the seven-year period of testing.

Over the 7 years of vegan dieting, a completely healthy individual turned into a sick person with type 2 diabetes and the risk of a potential heart attack. And all this on the advice of her cardiologist. Lovely.

Conclusion

This migraineur joined my migraine group a couple of months ago and started the carnivore diet approximately 5 weeks ago. In the migraine group we run a Kraft in situ mimic test: the migraineur measures blood glucose and blood ketones fasted and pre-meal, and then, starting 30 minutes after the last bite of food, takes blood samples every 30 minutes for 5 hours. On her first test, shortly after she joined, she showed all the signs of prediabetes. Only a couple of weeks into the carnivore diet, her fasting blood glucose was back to normal at 89 mg/dL.
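For clarity, the sampling schedule of that mimic test can be sketched out as follows; the timing values are the ones described above, and the code is only an illustration of the protocol, not a clinical tool.

```python
# Two baseline readings, then one reading every 30 minutes for 5 hours,
# starting 30 minutes after the last bite of food.
schedule = ["fasted", "pre-meal"] + [
    f"{m} min after last bite" for m in range(30, 301, 30)
]
print(len(schedule), "readings:", schedule)  # 12 readings in total
```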

On the carnivore diet she has cut out all plants and consumes only animal foods high in protein and animal fats. She now eats meat, poultry, fish, seafood, organs, animal fats, dairy, and lots of eggs. The only non-animal foods that my group members who are advised to follow the carnivore diet can eat are mushrooms (fungi) and seaweed (algae), as these are not plants.

As for this migraineur, it is too soon to see major improvements in her health after 7 years of serious damage. Her fasting blood glucose is already back to normal, and we have yet to see her cholesterol results. Based on my experience with over ten thousand migraineurs in my Facebook migraine groups, and the many blood tests I have seen over the years, the expectation is a complete reversal: her HDL will be high again, her triglycerides low again, and with that her cardiovascular risk score will drop back to normal. Her LDL will likely increase, because it will be carrying lots of fat-soluble vitamins and minerals that are found only in animal products.

I hope that this shows you the importance of eating food appropriate to the human species, which includes, as it always has during human evolution, lots of animal products. The vegan diet is clearly not the way most people's health will improve. There might be some individuals for whom the vegan diet is harmless, but it is clearly dangerous for the majority of people.

Don’t follow a diet because your friends do. Unfortunately, you also have to be skeptical of the advice of many healthcare professionals. Follow the data and how you feel. Eat the way that is healthy for you.



Electrolyte Balance With Different Low Carb Diets


Over recent years, a number of new diets have become popular among health-conscious individuals. The most popular include the low carb high fat (LCHF) diet, as well as the ketogenic and carnivore diets. In this video interview I discuss how diet affects electrolyte balance. Electrolytes, minerals like sodium and potassium, are essential “cofactors” for enzymes involved in metabolism. They are essential, meaning we cannot make them on our own; we have to ingest them through diet. Each of these new diets differs in the electrolyte cofactors it makes available. In addition, each of these new ways of eating modifies how much insulin we use. Insulin holds onto sodium, so as we reduce our insulin need, we also reduce sodium. Since electrolytes affect everything from cell firing to heart rhythm, it is important to consider whether one's chosen diet provides sufficient quantities of these critical elements.

I have written about the electrolytes sodium and potassium on several occasions, see here and here and here. I have also written about how thirst is an indicator of electrolyte balance or imbalance, see here. In the video, we discuss salt thirst from a different perspective, along with sodium requirements on different diets. The US Dietary Guidelines recommend 2,300 mg of sodium (sodium is 40% of salt) for the general population and 1,500 mg of sodium for the elderly and those with heart disease. This is in contrast to research showing that such low sodium intake actually increases mortality; accordingly, the ideal amount is around 3,500-4,500 mg of sodium a day for all populations. Another study found that 4,000-6,000 mg of sodium was ideal for non-hypertensive individuals.

Salt, Edema, Thirst, Insulin, and Hydration

In diets more heavily focused on animal proteins, managing hydration can become difficult without proper electrolyte support. Edema, or swelling, is a common symptom of improper electrolyte balance and dehydration. Because these diets reduce circulating insulin, and as a result also sodium and water levels, the body becomes dehydrated. To conserve water and salt, the body stores them in the tissues, leading to edema. Rather than reducing water and salt, an increase in both is required to prevent edema. Thirst may also follow, but drinking water alone is often an inappropriate response.

I also cover protein to some extent in this interview. Much is yet to be clarified about the ideal amount of protein. I have written about protein before; you will find it here.

I hope you enjoy this video discussion.

Keto & Carnivore: Electrolytes, Water Retention & More

 


This article was first published in April 2020.


Pregnancy Toes – What Sugar Does to Feet


Pregnancy toes are really swollen feet and swollen toes. The name stuck in my mind because one of my daughters-in-law is pregnant, and I was sent a photo from her winter vacation: flip-flops in the snow, with a winter coat. She was not able to put her boots on because of her swollen feet (swollen even in the cold!).

I did not think much about it until she came to visit me yesterday and I noticed the flip-flops and her chubby toes. She had “pregnancy toes” again, she said. Then it all suddenly became clear. I asked her: did you by any chance have any sugar today? And she said, “As a matter of fact, yes!”

I reached for the salt pills that I use for my migraines, as do all members of my migraine group on Facebook, and handed her one. I really should have photographed what happened, but I did not think the effect was going to be so fast and so big. Less than 15 minutes after she took the salt pill with a glass of water, her toes went back to normal. We ended up laughing it away. Had she known this, she could have worn her boots in the snow after all!

So what did her pregnancy toes have to do with sugar and salt, you may ask? Previously, I quoted an important paragraph from the Harrison's Manual of Medicine, which I repeat here:

…serum Na+ falls by 1.4 mM for every 100-mg/dL increase in glucose, due to glucose-induced H2O efflux from cells. (page 4)

The above means that glucose (a component of sugar) and sodium (a component of salt) are in an inverse relationship. As you increase sugar, sodium drops, and water is sucked out of your cells by sugar like a giant Slurpee machine. The water then collects on the outside of your cells rather than the inside, thereby dehydrating your cells and at the same time making your body swell. Edema is often associated with too much salt, but in fact it is too much sugar. Being always thirsty is associated with type 2 diabetes, but it is also associated with not having enough salt in the body, since without salt the cells cannot get hydrated.
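To see what the quoted relationship implies in numbers, here is a minimal worked sketch. The 1.4 mM per 100 mg/dL figure is the one quoted from Harrison's above; the function and its example inputs are illustrative only, not clinical guidance.

```python
# Predicted serum Na+ (mM) after a rise in blood glucose (mg/dL),
# using the quoted rate of ~1.4 mM drop per 100 mg/dL glucose increase.
def expected_serum_sodium(baseline_na_mm: float,
                          baseline_glucose_mg_dl: float,
                          new_glucose_mg_dl: float) -> float:
    glucose_rise = (new_glucose_mg_dl - baseline_glucose_mg_dl) / 100.0
    return baseline_na_mm - 1.4 * glucose_rise

# Example: glucose rising from 90 to 190 mg/dL lowers Na+ by ~1.4 mM.
print(expected_serum_sodium(140.0, 90.0, 190.0))  # 138.6
```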

In light of this fragile balance between sodium and glucose in the blood, are we treating pregnancy edema, gestational diabetes, and other maternity complications the way we should? Consider that with pre-eclampsia (gestational hypertension), women are told not to eat salt. You can see what happens when we reduce sodium: glucose increases, and we also induce an ionic imbalance. This ionic-level imbalance is visible (like the swollen toes) and may lead to further complications. There are two problems we face here. First, if she does not eat salt, her sodium-potassium pumps cannot work; this may cause migraines and headaches, as I often see in my migraine group. Second, given the fragile see-saw action of glucose and sodium, if she stops eating sodium, her glucose may increase, causing swelling. This is an interesting theory to ponder, and one that merits research.

Sodium and Glucose Work Together

Salt breaks up in the body into sodium and chloride. Sodium attracts water and holds onto it inside the cells. It keeps chloride outside of the cells to ensure proper voltage and electrolyte balance with the aid of potassium. When you eat sugar, the glucose part of it removes the water from the cells via osmotic channels that are too narrow for the sodium ions to exit. Thus, one ends up with a ton of water outside the cells with sodium inside hugging a tiny amount of water. Swelling occurs as the water leaves the cells but remains between cells.

Given the inverse nature of glucose and sodium in the blood, if one is swollen as a result of too much sugar, eating salt will take the water back from the sugar and move it back into the cells, as it did for my daughter-in-law's pregnancy toes. What is important in this information is this:

  1. If you feel swollen after eating sweets, you need to eat salt and drink a bit of water to reduce your swelling.
  2. If you have Type 2 Diabetes or are hypoglycemic, eating a salty meal can give you a major sugar crash and land you in the hospital!
  3. Eating any quantity of sugar will dehydrate your cells and make you run to the toilet every 30 minutes.

Because glucose takes water out of the cells, the edema that follows increases extra-cellular water and causes swelling in the body. This extra-cellular water needs to be reabsorbed into the circulation in order to be eliminated by the kidneys. For reabsorption, sodium is necessary: without sodium, the cells cannot operate their voltage-gated sodium pumps, so the gates cannot open to grab glucose, take it into the cells, and bring the water back in. I think you can already see the contradiction in the logic of reduced salt: the mom-to-be is told not to eat salt; this causes extra-cellular water and swelling; clearing that water requires salt so it can be reabsorbed into her cells and cleared by the kidneys; yet salt is exactly what she is not allowed to eat. This way, ionic-level balance is not possible, and chain reactions with negative consequences may occur. She may have protein leaching into her urine, extra-hard kidney work, and a whole long chain of complex events may kick in, making pregnancy a rather unpleasant experience and risking the health of the fetus.

Without salt, it is very hard for the body to return this extra-cellular water to circulation; it may take days, taxing the kidneys with the volume of water leaving and increasing pressure on the blood vessels from the outside, causing high blood pressure (BP). However, as the volume of water finally leaves the body, blood pressure drops. When a pregnant woman's blood pressure drops as a result of all that water leaving, her dehydrated blood cells carry less oxygen, which means reduced oxygen for both her and the baby.

When mothers are told to reduce salt intake, glucose increases, which raises blood pressure rather than lowering it. A similar phenomenon happens in gestational diabetes. In gestational diabetes (and gestational hypoglycemia as well), the sugar level is unstable, running either too high or too low, respectively. Should the mother-to-be eat a salty pickle (as cravings always seem to dictate pickles), she may end up in a major sugar crash and in the hospital for immediate treatment.

The balance between sodium and glucose is very fragile and changes extremely quickly, as you could see with my daughter-in-law's feet. Interestingly, we now also know that salt does not increase blood pressure but sugar does, and so a reduced-salt diet automatically increases blood pressure because of the inverse glucose-sodium connection and sugar's dehydrating properties. Reduced salt also increases triglycerides (Graudal, 2011), causing a lot of problems for people with preexisting heart conditions. So by reducing the salt intake of mothers-to-be, are we creating diabetic mothers and/or babies? Babies have been born with type 2 diabetes!

Is it possible that we are giving pregnant women the wrong advice about salt and sugar? It is an interesting question to pose, and further research is badly needed. Knowing that salt and sugar are in inverse proportion in the blood, one may suggest eating them together. In fact, eating them together is a much better idea than eating sugar alone. It is best not to eat sugar at all, but if you must eat sugar, consider eating salt too.

Sources:

Graudal N, et al. Effects of low sodium diet versus high sodium diet on blood pressure, renin, aldosterone, catecholamines, cholesterol, and triglyceride. Cochrane Database Syst Rev. 2011 Nov 9;(11).

This article was published originally on Hormones Matter on February 15, 2015. 

Cymbalta: A Neurological Damage Machine

A friend shared a story with me earlier today—it was a quote from something amazing that one of her friends wrote. She told me: “I watched my friend become a shadow of himself and it was scary thinking this young, big, strong guy was going to die.”

The story took my breath away and I asked if I could share it. I was granted permission; I am providing no names or places, only the experience. There are several reasons for publishing this story. We need to call attention to a huge knowledge gap in our healthcare industry. This knowledge gap also killed my mother: even though I had the requisite knowledge to prevent her death, the physicians ignored me, and the medicine killed her anyway. There was nothing I could do against the force of an industry that doesn’t care, where doctors are permitted complete ignorance. In this story, the participants suspected that it was the medicine, but they were told repeatedly by their doctors that their suspicions were nonsense.

I should note that there are several lawsuits against the makers of Cymbalta. Some have already settled, while others are still ongoing.

The Story

“I am sharing this story because I firmly believe that my husband is not a medical anomaly. He is not the only person to get rare side effects from the medication Cymbalta (or any medication for that matter). My hope is that someone reading this may recognize some of these symptoms that are not associated with Cymbalta [on the label] and just maybe they won’t lose nearly a decade of their lives chasing an undiagnosable disease.

In late 2011, my husband injured his back. In January 2012, he was prescribed an antidepressant, Cymbalta, for the associated nerve pain and headaches he had (off-label use). Shortly after taking the Cymbalta, he developed a facial tic and uncontrollable muscle movements in his right shoulder and arm.

It was assumed by the doctors and many neurologists that rather than having injured himself, he had a neurological problem, and the “injury event” was the beginning of the disease. As time progressed, his condition deteriorated. Every MRI, brain scan, blood test, spinal tap, sleep study, and every other test performed came back normal. The “disease” affected every part of his nervous system: his brain, spinal cord, and nerves.

We were told that most known neurological diseases affected only one branch of the nervous system. The closest we got to a diagnosis was when one neurologist said he believed that my husband had one of three rare, progressively degenerative neurological diseases that could only be diagnosed post-mortem with brain biopsy. We were told to expect the disease to continue to progress at the rate it had been and that he would likely develop early onset dementia.

He suffered from debilitating dystonia. His muscles would pull and twist so hard in the wrong way that he often dislocated his shoulder, elbow, or wrist. He suffered from two types of seizures: grand mal and focal. He would go for weeks to months at a time without being mentally aware of the life going on around him. There were times he didn’t recognize me and other times he had no idea where he was.

He went through periods where he slept 75% of the time. He often fainted during these episodes of mental confusion. He recalled nothing of these periods after they occurred. He occasionally would get both auditory and visual hallucinations. At times his left eye would wander. He developed a condition called nystagmus where both his eyes would vibrate rapidly from side to side. He would occasionally wake up paralyzed and remain that way for hours all the while being completely aware. He was often wholly dependent on me.

I asked every neurologist we saw about the Cymbalta, but each one assured me his symptoms were not associated with the medication. After years of seeing specialists, my husband had given up on finding a name for what he had. His last straw was a neurologist telling him he was somehow creating all the symptoms in his head and he should see a hypnotherapist. In 2017, I finally asked our primary care doctor about the Cymbalta and he suggested we taper him off and find out.

He was tapered onto another antidepressant, as he was understandably depressed at this point. Prior to stopping the Cymbalta, he would get multiple rapid-onset sharp headaches a day that caused bleeding from his right ear, eye, and nostril. The first week off the Cymbalta he didn’t suffer from a single one of those headaches. Very slowly, over the course of the following year and a half, every single symptom disappeared.

The drug companies are not aware of every symptom associated with a particular medication when it is released to the public. If the pharmaceutical companies do not know all the side effects, then the specialists meeting with patients certainly don’t know to look for them. Today we are beyond grateful that my husband has seen such dramatic improvements, but we can never get back the lost years.

As we now look into the next phase, I see so clearly all that has been lost. The emotional toll on our little family has been enormous, from the years of memories that were never made to the financial hardships that illness creates. We can never get those years back, and that makes me especially sad for my children. We are in the very early stages of dealing with the consequences of the illness. He has nerve damage along the right side of his body that will require a great deal of physical therapy. He does not remember much of the last 8 years. He has spotty memories of the kids as they grew older. When he looks in the mirror, he does not recognize himself, as he expects to see his younger self.

I recently reported all the side effects to the FDA. I’m hopeful that sharing our story now will help someone before they too lose years of their lives. Every medication has side effects, and many are not known.”

WARNING: Do not quit Cymbalta or any medication cold turkey or without the permission, knowledge, and instruction of your medical provider. Some medicines, including Cymbalta, have a severe “discontinuation syndrome” (withdrawal), which in some cases can even be fatal.

I hope this story took your breath away as it did mine. My intention in sharing it is to spark changes in how medicines are tested for adverse reactions, how they are prescribed, and how much doctors should be required to know about the medicines they prescribe.

To Do’s For The Medical Industry

  1. Medicines are often very harmful, regardless of whether they are prescription or over-the-counter. Admit the danger and prescribe with a warning attached.
  2. People who have adverse reactions should take a few minutes out of their lives and report every single adverse effect they find to the FDA here. The FDA cannot read minds. The FDA needs all people to report every little thing they find wrong with whatever medicine, vitamin, supplement, or herb they take.
  3. Medicines are often prescribed off-label. An off-label prescription is a medicine prescribed for something other than what it was tested and FDA-approved for. This means that people who receive an off-label prescription are unwitting experimental subjects who never gave consent.
  4. Patients who receive off-label drugs are not always told that the drugs they get are off-label, and even when they are told, they often don’t know what that means. Prescribing a medicine off-label should be illegal.
  5. Doctors should know the adverse effects of the medicines they prescribe at least as well as they know the names of the drugs they memorized in order to prescribe them.
  6. Doctors should be required to take a course each year that updates them on all adverse effects—real adverse effects that real people report to the FDA and not only those that the pharmaceutical companies report after their clinical trials.
  7. In clinical trials, people are selected based on the least likelihood of adverse reactions. Therefore, the pharmaceutical companies, by definition, underestimate the side effects of every medicine they create. Clinical trials should be required to include subjects who represent the general population, with all their health conditions and interacting medicines. All adverse reactions must be included in the adverse reactions list.
  8. Clinical trials remove people in an evaluation phase who develop any adverse reaction to the drugs under testing. The adverse reactions for which these subjects were dismissed are often ignored and are not reported as part of the final adverse effects that go on the label of the medicine. All adverse reactions should be reported from all parts of the clinical trial, and all should be noted as adverse reactions on the label.
  9. Doctors should be required to pass a test on adverse reactions to each new drug every single year, where the test is created and overseen by an independent expert group that collects all reported adverse effects from the FDA database.

This is not a comprehensive list; it is only a start. Feel free to send me your recommendations and I will update the article.

This article was published originally on February 28, 2019. 

Hydration, Thirst, and Drinking Water

Most of us equate the expression “hydrate extra” with drinking more water but, unfortunately, this is incorrect. In any online dictionary, “to hydrate” means to create “…a substance that is formed when water combines with another substance…” In other words, water alone is not a hydrating fluid; it must be combined with something to become one. We do not have water in our body on its own; we have a substance we call electrolyte. I have written substantially on the topic of hydration, mixing water with minerals, as part of the protocol that prevents migraines. However, a new problem has surfaced: when should one drink water? Several articles with water-drinking instructions have recently been published on the internet. Most of these articles consider it bad practice to drink water when one is not thirsty and recommend drinking only when thirsty. There are several serious flaws with this argument.

Sweat

The first flaw is that most research is aimed at athletes, but athletes are not representative of the majority of the population. Furthermore, athletes should not be drinking plain “water” to hydrate: water cannot be absorbed by cells without adequate sodium to hold onto it. When athletes sweat, the content of sweat is not water but electrolyte. Many sports drinks aim at rehydrating athletes, but their sugar or sugar-substitute content defeats the purpose; consider how much sports drink one needs to drink to replace the content of an athlete’s sweat, then add up the sugar. One teaspoon of sugar is 4 grams of carbs. An average serving of a typical sports drink provides between 14 and 54 grams of carbs, all sugar, which converts to 3.5 to 13.5 teaspoons of sugar per serving. Drinking sugar substitutes is even worse, because sugar substitutes fool the body into acting as if it were receiving glucose, so insulin spikes, but there is no glucose. This creates an insulin overflow in the blood, causing you to become hungry! Sugar substitutes may lead to obesity and metabolic syndrome, and sports drinks with sugar substitutes actually reduce muscle energy.
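
A quick back-of-the-envelope check of that teaspoon conversion, using the 4 grams-per-teaspoon figure above; the sketch is illustrative only and not tied to any particular product:

```python
# Convert a sports drink's carbohydrate load to teaspoons of sugar,
# using the 4 g-per-teaspoon figure cited above.
GRAMS_PER_TEASPOON = 4.0

def carbs_to_teaspoons(carb_grams: float) -> float:
    """Return the teaspoon-of-sugar equivalent of a carbohydrate load."""
    return carb_grams / GRAMS_PER_TEASPOON

# The range of carbs per serving quoted above: 14 g to 54 g.
for grams in (14, 54):
    print(f"{grams} g of carbs = {carbs_to_teaspoons(grams):.1f} teaspoons of sugar")
# Output:
# 14 g of carbs = 3.5 teaspoons of sugar
# 54 g of carbs = 13.5 teaspoons of sugar
```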

Moreover, anything that converts to glucose in the body removes both water and sodium from the cells (1), so drinking a mix of sugar, sodium, and water (salt is the form in which sodium is available to us), as in a typical sports drink, is worse than not drinking anything at all. Many athletes have smartened up and drink pickle brine rather than water. Pickle brine is great, assuming the brine is made of salt and water and not vinegar. Vinegar is fermented from ethanol (alcohol), so drinking the brine of vinegar-processed pickles will actually dehydrate you further. Look for pickles made with salt rather than vinegar.

Best Hydrating Fluid

Whole milk is an ideal hydrating fluid because it has a perfect electrolyte balance of sodium, potassium, water, sugar (lactose), calcium, phosphorus, magnesium, and protein. Whole milk is THE perfect electrolyte. Some athletes drink water and take salt pills (also called electrolyte pills). That is also a great option, particularly since the pills are easy to carry around and can be taken when needed.

The second flaw in the argument of “drink water when thirsty” is that many people feel thirst after eating sugar, which is when drinking water is least advisable. Since about half of sugar converts to glucose, and glucose pulls water and sodium out of the cells (1), if one is thirsty after eating sugar and drinks water, the metabolic process will remove more water from the cells. This can cause edema. Although most articles today blame salt for causing edema, the opposite is true.

While sodium retains water inside the cell, glucose removes water and sodium from the cell, forcing the water to be retained in the extracellular space (2). Eating salt when one has edema actually reduces the edema, because the sodium brings water back into the cell. A previous article demonstrates how this works.

The problem with most studies that blame salt for water retention is that no study has ever controlled for both salt and sugar at the same time in the same experiment. All the studies I could find looked only at the effects of salt on the body, regardless of the amount of sugar, water, or protein the subjects had consumed before the experiment. Since the body can easily be tipped out of balance and is never in a vacuum for a pristine controlled experiment, one cannot say with certainty that one element makes a particular change without looking at what else is affecting the body. No such studies exist except in my migraine group, where we control for all variables. We found that being thirsty often means the person does not have enough salt to keep water where it belongs (3). A migraineur should never drink plain water when she is thirsty, particularly not if carbohydrates were consumed.

The final problem with drinking water only when thirsty concerns people who have type 2 diabetes: they are always thirsty. Being thirsty can be a sign of diabetes mellitus and not of a need for more water.

Should you wait until you are thirsty before you drink?

Absolutely not, and drinking water alone will certainly not get you hydrated. How much water you should drink is a question I will address in another article. The minimum of 8 glasses of water a day is a myth; people vary in size, age, and activity, implying that each person needs a different amount of water. There are many online water calculators that go into the details of weight, climate, activity, altitude, health, pregnancy, nursing, etc. For each person, the amount of water, and thus the hydration need (not just water), will differ, and for that hydration level you need to make sure you drink adequate amounts of water as part of your hydration protocol. A rough sketch of such a calculation follows below.
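
As a rough illustration of what those online calculators do (not a medical tool), here is a minimal sketch. It assumes the commonly cited heuristic of roughly 30-35 mL of water per kilogram of body weight per day; the activity adjustment is likewise an assumption for illustration only:

```python
# Minimal sketch of a per-person daily water estimate, assuming the commonly
# cited ~30-35 mL/kg/day heuristic; real calculators also adjust for
# climate, altitude, pregnancy, health status, etc.
def daily_water_liters(weight_kg: float, active_minutes: float = 0.0) -> float:
    """Estimate daily water need in liters (illustrative heuristic only)."""
    base_ml = weight_kg * 33                  # midpoint of the 30-35 mL/kg range
    activity_ml = active_minutes / 30 * 350   # assumed ~350 mL per 30 min of exercise
    return (base_ml + activity_ml) / 1000

print(f"{daily_water_liters(70):.1f} L")      # ~2.3 L for a sedentary 70 kg adult
print(f"{daily_water_liters(70, 60):.1f} L")  # ~3.0 L with an hour of exercise
```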

Sources

  1. Longo DL, Fauci AS, Kasper DL, Hauser SL, Jameson JL, Loscalzo J. Harrison’s Manual of Medicine 18th Edition. New York: McGraw Hill Medical; 2013.
  2. Millar T. Biochemistry Explained: A Practical Guide to Learning Biochemistry. Reprint ed. CRC Press; 2002.
  3. Stanton AA. Fighting The Migraine Epidemic: A Complete Guide: How To Treat & Prevent Migraines Without Medicine.

This article was published previously on Hormones Matter in July 2015. Minor updates were added.

 


If Ketosis Is Only a Fad, Why Are Our Kids in Ketosis?

Our Kids Are in Ketosis

Some time ago, I published an article on Medium titled “Our Kids Are In Ketosis”. The article turned many heads and has been shared thousands of times on social media. It has also been cited by an MD in a YouTube video here. So what is the fuss about?

Various ways of eating are labelled “diet fads” by many people, including many doctors, scientists, and nutritionists. A fad is defined as “an intense and widely shared enthusiasm for something, especially one that is short-lived and without basis in the object’s qualities; a craze” (Google dictionary). Some diets should be labelled fads because it is impossible to maintain them long term. The fruitarian diet, for example, is one such fad; humans cannot maintain it safely long term, primarily because one may die from starvation, pancreatic cancer, or insufficient nutrients on a diet that contains only fruits. In addition to the demand placed on insulin to remove the glucose oversupply from the blood, fruitarian diets lack the essential amino acids and essential fatty acids needed for survival.

Some may suggest that the Standard American Diet (SAD), also referred to as the Western Diet, is a fad. The majority of the US population consumes the SAD diet with dire outcomes: heart disease, obesity, type 2 diabetes, non-alcoholic fatty liver disease, cancer, Alzheimer’s disease, multiple sclerosis, Parkinson’s disease, and nearly all (if not all) autoimmune diseases, along with a host of other non-communicable diseases. While on the scale of a human life we may think of the SAD diet as sustainable, it clearly is not. It has been on the menu for over 50 years, and it feels awkward calling it a fad, even though it most certainly meets the criterion of being “without basis in the object’s qualities”.

“Let food be thy medicine, and let medicine be thy food”

This wise sentence is attributed to Hippocrates. As ancient as this statement is, it still holds true today, if only we followed its wisdom. I am a follower of another Hippocratic saying, with a slight modification: “You are not what you eat but how you metabolize what you have eaten!” You will find this sentence as my intro (mission statement) on my FB page. The distinction between what we eat and how we metabolize what we have eaten is essential; it defines who we are.

Understanding Diet and Nutrition

In order to understand how nutrition affects us and what we should be eating, much needs to be understood about the human digestive and metabolic systems. While there is a lot of discussion today about the importance of carbohydrates in our diet, particularly about consuming many daily servings of complex carbohydrates such as whole grains, starchy vegetables, and fruits, this was not always our understanding. For example, when I grew up, carbohydrates were understood to cause obesity(1). Most of my childhood was spent on a relatively low carbohydrate diet, and that was normal for Europe at that time. Cereals, whole grain or otherwise, didn’t exist, nor did I grow up drinking orange juice and eating pancakes with syrup in the morning. In those years, the role of carbohydrates in metabolic disease was clearly understood, and metabolic disease was very rare.

I grew up eating lots of animal meat: pork, bacon, beef, eggs, and poultry, including duck and geese; organ meats, including liver, kidneys, bone marrow, heart, gizzards, pork ears, snouts, hooves, and chicken feet; even chicken blood, and pork and beef blood in sausages such as blood sausage, still sold in many places; all cooked in pork lard, duck fat, butter, or beef tallow; and lots of fatty fish. We also ate a lot of dairy: milk, cheeses, and various cultured milks. Of course, we drank milk that could be placed in the window for a day; by the next day it was kefir, without any added culture and without rotting. Our staples were potatoes, beans, rice, onions, and apples that lasted through February in cold storage. Other fruits and vegetables were only seasonal, lasting a very short time.

We didn’t have processed foods or processed oils of any kind. While some canned foods and sweetened drinks certainly existed, those were seldom consumed. There was no such thing as daily dessert. If I was a good girl, maybe once a month I was taken to a special pastry place and I could pick one dessert. If you think my family was poor and hence I was eating this way, you are wrong. I was a privileged child. I spent several of my childhood summers in the South of France eating more seafood, but basically the same exact diet. The European diet, when I grew up, was a cholesterol-rich, fat-heavy, low-carbohydrate diet.

It is important to note that while cholesterol levels have been consistently dropping over the past several decades(2), a result of the recommended SAD diet full of vegetable oils, the rates of cardiovascular and metabolic diseases have been increasing over the same period, indicating a possible inverse relationship. This is promptly ignored by medicine, and there is a continual push toward the consumption of lower-cholesterol foods, such as whole grains, fruits and vegetables, and vegetable oils, in spite of several studies showing the above-mentioned inverse relationship and suggesting that higher cholesterol extends healthy lifespan(3,4). The reason I mention cholesterol is that a reduced-carbohydrate diet, such as the one I grew up on, is necessarily heavy in saturated fat and cholesterol.

Yet, during those years in Europe, metabolic diseases (such as non-alcoholic fatty liver, cardiovascular disease (CVD), stroke, and type 2 diabetes) were rare. This article shows the confusion. In it, the authors attribute the rate of CVD, which was very low in France at the time, to a “lag”, the assumption being that the heavy saturated-fat-filled foods would eventually catch up with the French and they would develop CVD. Interestingly, the French had eaten heavy saturated fat for centuries before this study, so the catching up is a silly theory aiming to explain away the French Paradox. And the catching up is still going on without any success, though now that American junk food is available to the French, it may be only a matter of time.

Ketosis

When one consumes a low carbohydrate, high fat diet, even with high protein and some beans from time to time, the person will be in a state of ketosis. In ketosis, the body can use both glucose and fat (both consumed and stored fat) as its fuel. Fat is converted to ketones, chiefly β-hydroxybutyrate (BHB). This state is reached by diets rich in animal fats, with medium amounts of protein and little or no carbohydrate. Rather than hindering health, ketosis supports and protects it(5), as can be seen in the treatment of epilepsy(6). Ketosis is the basic human metabolic process, as discussed further below.

Carbohydrates Versus Fats

Some time ago, I bumped into the book Metabolic Regulation: A Human Perspective by Keith N. Frayn (3rd edition), and in sections 2.1 and 2.2 I came upon two very unusual passages that made me jump up.

The sections state the following:

“…fatty acids are usually a preferred fuel (over glucose) for skeletal muscle… Fatty acid release is very effectively switched off by insulin, so muscles no longer have the option of using fatty acids… The characteristics of individual cells or tissues ‘set the scene’ for metabolic regulation. For instance, the metabolic characteristics of the liver mean that it will inevitably be able to take up excess glucose from plasma, whereas other tissues cannot adjust their rates of utilization so readily… The brain, in contrast, has a pathway for utilizing glucose at a rate that is relatively constant whatever the prevailing glucose concentration, a very reasonable adaptation since we would not want to be super-intelligent only after eating carbohydrate, and intellectually challenged between meals.”

I was surprised. All the literature that advises on healthy nutrition tells us to consume a ton of carbohydrates. The USDA Dietary Guidelines recommend MyPlate (which replaced the food pyramid; see the history of the USDA guidelines here), which is mostly carbohydrates, with a little bit of meat or fish, eggs, dairy, and some oil on top. Basically, most of the diet, over 60 percent, is supposed to come from carbohydrates, such as whole grains, fresh fruits, vegetables, nuts, and seeds. Those are all carbs. The paragraph from the book was intriguing to me because it tells us that we don’t need to eat carbs, contrary to the USDA dietary recommendations.

Skeletal muscles actually prefer to use fat rather than glucose! We have a very carbohydrate-centric view of nutrition today, much more so than when I grew up, yet our muscles, and some other organs too, such as the heart(7,8), actually prefer fat as fuel, not glucose. Why, then, are we eating so many carbohydrates?

The Fuels of Our Body

Few professionals, even within the field of nutrition, realize that glucose is not our primary fuel and ketones are not our backup fuel. The human body has no primary or backup fuel: it has three fuels. Some of these fuels are toxic if they stay in the blood too long in larger amounts than needed, so the body must remove them fast. I presume that the urgency of glucose use, meaning the urgency of removing glucose from the blood, is misleading: it makes people think that if the body switches to burning glucose the moment it is provided, glucose must be the primary fuel; but this is an incorrect assumption. It implies that the first task that has to be done is the preferred task, which is seldom the case. For example, assume you trip on your way to a dinner party, hit your head on the cement, and now see double. Although your urgent task now is to head to the emergency room to check for a concussion or other serious damage, the emergency visit is not your preferred task. It just suddenly became a priority you need to tend to immediately, before you join the dinner party.

It is true that when we eat carbohydrates, the body must stop everything and remove the excess glucose from the blood. The maximum comfort level for glucose in the blood is 99 mg/dL (5.5 mmol/L), and if we eat carbohydrates in amounts that exceed this, it is literally an emergency: the body must remove the excess glucose. For reference, 99 mg/dL amounts to about one teaspoon of glucose in the entire 5 liters of blood. Extra glucose in the blood is toxic. However, having to remove glucose immediately does not make it the primary fuel.
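
Both the teaspoon figure and the unit conversion are easy to verify with simple arithmetic; a minimal sketch, assuming the ~5 liters of blood stated above and the standard 18 (mg/dL per mmol/L) conversion factor for glucose:

```python
# Verify: 99 mg/dL across ~5 liters of blood is roughly one teaspoon of glucose.
GLUCOSE_MG_PER_DL = 99
BLOOD_VOLUME_DL = 50          # 5 liters = 50 deciliters

total_mg = GLUCOSE_MG_PER_DL * BLOOD_VOLUME_DL
total_g = total_mg / 1000
print(f"Total glucose in blood: {total_g:.2f} g")  # 4.95 g, about one teaspoon

# Cross-check the unit conversion: mg/dL -> mmol/L (glucose molar mass ~180 g/mol,
# so divide mg/dL by 18 to get mmol/L).
mmol_per_l = GLUCOSE_MG_PER_DL / 18.0
print(f"{GLUCOSE_MG_PER_DL} mg/dL = {mmol_per_l:.1f} mmol/L")  # 5.5 mmol/L
```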

The three fuels our body can use are alcohol, providing 7 kcal per gram; glucose, providing 4 kcal per gram; and fat, providing 9 kcal per gram. While alcohol is a viable fuel for the human body, it creates many adverse effects and causes damage, so it is not recommended as fuel, though alcoholics often use it as such. Given that alcohol is the most dangerous of the fuels(9), if our body is provided all three fuels simultaneously, alcohol will be burned first. Yet we never assume that alcohol is our primary fuel.

How the Body Prioritizes Fuels

Priority can be assigned for many reasons. One can give priority to something preferred, but if that means something harmful has to wait to be removed, there is a risk of disease. If this were the method our body chose for prioritization, we might not have remained alive long enough to become who we are today. One can prioritize by the amount of energy each fuel gives, using up first the one that gives the most energy and saving the rest for later. But, again, if the rest causes damage while waiting to be used, this has to be avoided. One can also prioritize by removing the most toxic fuel first and saving the least toxic fuel for last.

If our body prioritizes based on preventing damage by burning the fuel in order of toxicity, then the order would be:

  1. Alcohol
  2. Glucose
  3. Fat

If priority would be given to those fuels that provide the most energy, the order would be:

  1. Fat
  2. Alcohol
  3. Glucose

If the priority is based on the USDA dietary guidelines, then the order would be (alcohol as fuel is usually not mentioned anywhere):

  1. Glucose
  2. Fat

What we actually see happening is that the body burns fuels in order of how much damage they cause in the blood. The most dangerous fuel in the blood is alcohol. Its effects can even be fatal, so clearing it from the body is necessarily task #1.

The urgency with which the body must respond to a fuel depends on the harm that fuel can cause.

Glucose is the second most dangerous fuel in the human body after alcohol. Fructose, a sugar found in table sugar, honey, and most fruits, turns into ethanol (an alcohol), causing damage, and glucose, referred to as blood sugar, is toxic to our nerves and may cause eye damage, type 2 diabetes, and cardiovascular disease (including stroke).

By contrast, fat is a safe fuel, to the point that it is usually stored in our body (something many of us wish to have less of). It is not a toxic substance, and at 9 kcal per gram it provides the most energy. Alcohol inhibits the burning of glucose and fat, making it the mandatory first fuel to burn. So the actual order of fuel priority (illustrated in the sketch after the list below) is:

  1. Alcohol
  2. Glucose
  3. Fat
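
To make the competing orderings concrete, here is a minimal sketch that sorts the same three fuels both ways. The toxicity ranks simply encode the argument above (alcohol most dangerous in blood, fat least) and are not measured values:

```python
# The three fuels, with energy density (kcal/g, as given above) and a
# toxicity rank encoding the article's argument: 1 = most dangerous in blood.
FUELS = {
    "alcohol": {"kcal_per_g": 7, "toxicity_rank": 1},
    "glucose": {"kcal_per_g": 4, "toxicity_rank": 2},
    "fat":     {"kcal_per_g": 9, "toxicity_rank": 3},
}

by_energy = sorted(FUELS, key=lambda f: FUELS[f]["kcal_per_g"], reverse=True)
by_toxicity = sorted(FUELS, key=lambda f: FUELS[f]["toxicity_rank"])

print("Most energy first:", by_energy)    # ['fat', 'alcohol', 'glucose']
print("Most toxic first: ", by_toxicity)  # ['alcohol', 'glucose', 'fat'] - what the body does
```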

So why was I so surprised to read that some organs prefer to use fat for fuel? We read and hear so much about carbohydrates as the most important fuel, the kinds of carbohydrates we should be eating, and how much, yet we can learn very little about the other important fuels: alcohol and fat. As the surprising paragraph above shows, the other fuels are often more important. In particular, if we look at the human lifespan, other fuel needs become stunningly obvious. I think we have forgotten that babies are born in ketosis and their primary fuel is ketones.

Human Fuel from Conception

Indeed, babies are born in ketosis(10), and mother’s milk is low in carbohydrate and high in fat (LCHF), which keeps the newborn in ketosis all through nursing. Breast milk contains more glucose over time, as the child matures. At full maturation (about one month), its nutritional content per cup (8 oz) is 10.8 g fat, 2.53 g protein, and 16.9 g carbohydrates (in the form of lactose), for a total energy of 174.92 kcal; it also contains 87.5 g water. Setting the water aside and looking at the caloric contribution of the macronutrients only, this glass of mature breast milk is 55.57% fat, 5.79% protein, and 38.65% carbohydrates (as lactose, so not free sugar). In terms of fatty acid composition, it contains 4.942 g saturated fat, 4.079 g monounsaturated fat, and 1.223 g polyunsaturated fat, which in percentages is 48.24% saturated, 39.82% monounsaturated, and 11.94% polyunsaturated. One must agree that babies are not fed poison by their mothers and that Nature didn’t provide toxic nursing milk. In fact, we can see that babies grow very rapidly while nursing, and we know from studies that babies who are nursed have a better chance of survival, grow healthier and faster, and their brains develop better (see here and here).
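
The percentages above can be reproduced from the gram amounts with the kcal-per-gram figures used earlier in this article (9 for fat, 4 for glucose) plus the standard 4 kcal/g for protein; a quick sketch to verify:

```python
# Reproduce the mature-breast-milk percentages quoted above.
# Grams per 8 oz cup, as given in the text.
fat_g, protein_g, carb_g = 10.8, 2.53, 16.9

# Calories via the standard 9/4/4 kcal-per-gram factors.
fat_kcal, protein_kcal, carb_kcal = fat_g * 9, protein_g * 4, carb_g * 4
total_kcal = fat_kcal + protein_kcal + carb_kcal
print(f"Total: {total_kcal:.2f} kcal")  # 174.92 kcal
for name, kcal in [("fat", fat_kcal), ("protein", protein_kcal), ("carbs", carb_kcal)]:
    print(f"{name}: {kcal / total_kcal:.2%}")  # 55.57%, 5.79%, 38.65%

# Fatty acid breakdown as a share of total fatty acids (grams, as given).
sat, mono, poly = 4.942, 4.079, 1.223
total_fa = sat + mono + poly
print(f"saturated {sat/total_fa:.2%}, mono {mono/total_fa:.2%}, poly {poly/total_fa:.2%}")
# saturated 48.24%, mono 39.82%, poly 11.94%
```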

As babies grow, they retain metabolic flexibility, meaning they stay in ketosis for periods of time that vary with age and how often they are fed, and they enter the carbohydrate metabolic process as they eat(10-12).

The figure below is very telling. On the vertical axis is β-hydroxybutyrate (BHB), the level of ketones measured in blood; on the horizontal axis, time passing between feedings, in hours and days. The graph includes people of all age groups and both sexes. Officially, a person fed with carbohydrate-containing food has zero BHB in her/his blood. As time passes, for instance if the person fasts for a blood test or a medical procedure, BHB slowly increases in the blood. The clinically accepted level of BHB that is not considered ketosis is up to 0.3 mmol/L, which healthy individuals can easily reach as a result of such fasting. Anything above a chronically maintained 0.3 mmol/L means the person is in ketosis.

Figure 1. George F. Cahill Jr.: time passing without food and ketosis level by age and gender. From Annu. Rev. Nutr. 2006. 26:1–22; doi:10.1146/annurev.nutr.26.061505.111258

Note that babies are born with BHB levels of 0.5 mmol/L or higher and remain in ketosis until they are fed, which cannot happen immediately after birth, since the mother has no milk yet; the baby remains in ketosis until the mother’s breast releases milk. It has been suggested that the baby is in ketosis because it is not being fed(10), meaning starvation, but this is questionable, since babies remain in ketosis even after being fed (more on this later). The human brain uses 20% of the energy the body generates; in infants, the need for energy arises very quickly, within approximately 20 minutes, as shown in Figure 1. There are other factors as well, such as the brain being over 80% fat and cholesterol(13), with its cholesterol alone representing 20-25% of the body’s total cholesterol (here).

Thus ketosis, in one scenario, is a state into which our metabolism reverts when food is not immediately available on demand (starvation). On the other hand, studies of the fetus in utero show that the fetus is in ketosis even though it is never under nutritional duress(14,15), and there are ketones in the placenta(16) as well. It is therefore very difficult to argue that ketones are a backup fuel used only in starvation if the fetus is in ketosis from time to time even without starvation. Clearly, being in ketosis provides some benefits that are not achievable through the glucose metabolic process. I venture to suggest that the ketogenic and glucogenic metabolic processes have distinct functions, each benefiting us in its own way.

Concluding Thoughts

To conclude on the importance of ketosis, let me offer some anecdotal evidence. In the Facebook migraine groups I manage (see here and here), all migraineurs are asked to run a special blood glucose/blood ketone test that starts with fasting and premeal measurements and continues for 5 hours postprandial. There are many children with migraines whose parents are in the groups; the children range in age from 2 to 18, so parental help is required. Thus, over time, I have had the opportunity to evaluate the 5-hour blood test results of children of all ages; I think the youngest so far was 5 years old and the oldest 16. I have yet to see a blood ketone test of a child anywhere in this age group that does not show ketosis both before and after a meal, even if the meal includes fruits and dairy.

I suppose ignorance is bliss. Few doctors or researchers have the opportunity I have to measure the blood ketones of children of various ages for five hours postprandial, plus fasting and premeal measures; therefore, most don’t realize just how much our kids are in ketosis.

Thus, while today in most countries around the world any type of food is just a short walk/drive away 24/7, and we need not experience hunger and starvation, our children are still in ketosis 24/7. Shouldn’t that tell us something about the importance of ketosis?

References

  1. Dawkins, M. Carbohydrates and Adiposity. British medical journal 1, 719-720, doi:10.1136/bmj.1.5385.719 (1964).
  2. Farzadfar, F. et al. National, regional, and global trends in serum total cholesterol since 1980: systematic analysis of health examination surveys and epidemiological studies with 321 country-years and 3.0 million participants. The Lancet 377, 578-586, doi:10.1016/S0140-6736(10)62038-7 (2011).
  3. Ravnskov, U. High cholesterol may protect against infections and atherosclerosis. QJM: An International Journal of Medicine 96, 927-934, doi:10.1093/qjmed/hcg150 (2003).
  4. Kaysen, G. A. et al. Lipid levels are inversely associated with infectious and all-cause mortality: international MONDO study results. Journal of lipid research 59, 1519-1528, doi:10.1194/jlr.P084277 (2018).
  5. DuBroff, R. & de Lorgeril, M. Fat or fiction: the diet-heart hypothesis. BMJ Evidence-Based Medicine, bmjebm-2019-111180, doi:10.1136/bmjebm-2019-111180 (2019).
  6. Berghoff, S. A. et al. Dietary cholesterol promotes repair of demyelinated lesions in the adult brain. Nature communications 8, 14241-14241, doi:10.1038/ncomms14241 (2017).
  7. Park, T.S., Yamashita, H., Blaner, W. S. & Goldberg, I. J. Lipids in the heart: a source of fuel and a source of toxins. Current Opinion in Lipidology 18, 277-282, doi:10.1097/MOL.0b013e32814a57db (2007).
  8. Noh, H.-L., Yamashita, H. & Goldberg, I. J. Cardiac Metabolism and Mechanics are Altered by Genetic Loss of Lipoprotein Triglyceride Lipolysis. Cardiovascular Drugs and Therapy 20, 441-444, doi:10.1007/s10557-006-0633-1 (2006).
  9. Van de Wiel, A. Diabetes mellitus and alcohol. Diabetes/Metabolism Research and Reviews 20, 263-267, doi:10.1002/dmrr.492 (2004).
  10. Cahill, G. F. Starvation in man. N Engl J Med 282, doi:10.1056/nejm197003052821026 (1970).
  11. Storlien, L., Oakes, N. D. & Kelley, D. E. Metabolic flexibility. Proceedings of the Nutrition Society 63, 363-368, doi:10.1079/PNS2004349 (2007).
  12. Cahill, G. F., Jr. Fuel Metabolism in Starvation. Annual Review of Nutrition 26, 1-22, doi:10.1146/annurev.nutr.26.061505.111258 (2006).
  13. Chang, C. Y., Ke, D. S. & Chen, J. Y. Essential fatty acids and human brain. Acta Neurologica Taiwanica 18, 231-241 (2009).
  14. Herrera, E. & Amusquivar, E. Lipid metabolism in the fetus and the newborn. Diabetes/Metabolism Research and Reviews 16, 202-210, doi:10.1002/1520-7560(200005/06)16:3<202::AID-DMRR116>3.0.CO;2-# (2000).
  15. Herrera, E. Lipid metabolism in pregnancy and its consequences in the fetus and newborn. Endocrine 19, 43-55, doi:10.1385/endo:19:1:43 (2002).
  16. Orczyk-Pawilowicz, M. et al. Metabolomics of Human Amniotic Fluid and Maternal Plasma during Normal Pregnancy. PloS one 11, e0152740-e0152740, doi:10.1371/journal.pone.0152740 (2016).


 

Image credit: Baby, by Bill McConkey. Creative Commons Attribution (CC BY 4.0): https://creativecommons.org/licenses/by/4.0
