How The Mineral Crisis Happened

Context is powerful. A lot has happened in the last century, more than can fit into this blog post, but we will do our best to lay out some of the key points that explain how we got to where we are now.

Before we dive in, let’s step back and look at the big picture. In general, life is good. Humans have advanced faster and further in the last century than in the hundreds of thousands of years preceding it. Our advancement has led to longer life spans, energy abundance, and the ability to connect over the internet and instantly find knowledge at our fingertips. A full account is outside the scope of this post, but it is worth knowing how far we have come. If you want to learn more, there is a fascinating book called The Rational Optimist.

So with all these advancements, why do 1 in 4 adults suffer from a diagnosable mental disorder? (1) More importantly, why are the leading causes of death cardiovascular diseases, cancers, digestive diseases, and diabetes? (2) These diseases were almost nonexistent at the turn of the 20th century. Could something have caused both the rise in these illnesses and the growing obesity epidemic? Or is it simply that we can diagnose diseases better than ever?

According to a Harvard article on U.S. obesity trends, “in 1990 obese adults made up less than 15% of the population… Today, nationwide, roughly two out of three U.S. adults are overweight or obese (69 percent), and one out of three is obese (36 percent).” Even more alarming, children are becoming more and more obese. (3) (4)

Often in history, decisions made with good intentions end up causing as much harm as good, or more. During the 20th century, there were substantial changes in agriculture, diet, and medicine. These changes were not all bad. They led to the end of many illnesses and the proliferation of food. They saved many lives and allowed our world to rebuild and grow after two world wars. Unfortunately, they have also led us to the current health and medical crisis.

Part of the problem is that many of these changes were based not on science but on politics and big business. They were decided not by doctors and scientists but by politicians and lawyers. As a result, we have become a nation led to believe that a low-fat, highly processed diet is not only healthy but the path to wellness. Nothing could be further from the truth.

Let us dive in and better understand how changes in agriculture, diet & food, and medicine have led to the current crisis and mineral deficiency.


Agriculture & Farming:

Until the beginning of the 20th century, farmland was rich with minerals and free of toxic man-made chemicals. It was also owned and managed primarily by local farmers. But it was at this time that large companies acquired the majority of these farms. They implemented new technologies focused on maximizing crop yields, along with other modalities that quickly depleted and damaged the soils.

One example is that growing and harvesting methods changed. Instead of using proper crop rotation and age-old fertilizers, which replenished and maintained the soils’ mineral-rich content, modern farming relies on artificial chemical fertilizers, pesticides, and other unnatural agents to boost crop yields. Sadly, these negatively impacted the soil, resulting in impaired nutrient uptake in plants, notably of copper and magnesium.

There was an upside: these methods allowed us to begin to beat the starvation and malnutrition the country faced. Two world wars and government loans led to these methods becoming pervasive. (5)

Besides a massive increase in crop production, the ability to grow crops on land that had previously been unfarmable had an interesting side effect: it dramatically decreased the grazing land available for cattle, which moved cattle from free-range grazing to being raised in pens. This was the origin of today’s industrial meat production. (6)

Another invention was nitrogen-phosphorus-potassium (NPK) fertilizer, which became and remains widely used. NPK, unfortunately, blocks the uptake of copper by plants. (7)

Another deadly invention, glyphosate (commonly known as Roundup), is a chemical chelator that binds and removes minerals such as calcium, magnesium, manganese, copper, and zinc. According to Stephanie Seneff, PhD, one of the world’s foremost experts on the dangers glyphosate poses to human health, its capacity to chelate at low pH means that nothing stops glyphosate’s chelating action in soil, plants, animals, or humans. Glyphosate also inhibits the production of ceruloplasmin, a key copper protein in our body.

The result? There has been an 80% loss of copper in crop soil over the last century. And it is only getting worse. (8)

In addition to farming, the ways goods are transported drastically changed. Refrigeration became mainstream, along with canning, painting, and coating vegetables and fruits. These technologies allowed for year-round availability and global shipping, but they also contributed to the depletion of minerals: plants continue to pull in nutrients until they are ripe, so produce picked early for transport never develops its full nutrient content.

Are you starting to see this issue in front of us? Now let us look at what happened to our diet & food.

Diet & (the Emergence of Processed) Foods:

In the 20th century, the way we looked at food changed forever. Due to a series of events that started in the 1920s, the way we measured, discussed, and researched food and nutrition became focused solely on the nutrients contained in foods (or, to be precise, the recognized nutrients we look for in foods).

As popularized by Michael Pollan in his book, In Defense of Food: An Eater’s Manifesto, nutritionism was born.

“Nutritionism is a paradigm that assumes that it is the scientifically identified nutrients in foods that determine the value of individual foodstuffs in the diet. In other words, it is the idea that the nutritional value of a food is the sum of all its individual nutrients, vitamins, and other components.” (9)

Not the whole foods themselves.

The issue is that this makes it easy to blur and obfuscate the differences between whole and processed foods. As you can imagine, the food industry played a significant role, pushing scientists and researchers to villainize nutrients rather than specific foods or types of food.

As in agriculture, modern food processing methods became commonplace. These technologies include pasteurizing foods, supplementing them, and replacing ingredients with refined sugars and hydrogenated oils. Unfortunately, one result of these changes was the loss of critical nutrients (retinol, magnesium, and copper).

Additionally, vitamins and other supplements were added to foods. In subsequent articles, we will address the dangers and issues that arise from these artificial substances. These vitamins and supplements were predominantly manufactured from chemicals and are often different from the forms the vitamins take in nature. And they often do more harm than good. For a history of vitamins, you can read more here.

Again, this was not all bad. The innovations cured or remediated many illnesses and diseases and alleviated famine. But as with many things, good intentions can still lead to severe consequences.

In the 1940s, inorganic iron filings were added to our food system via enriched flour and grain-based products. There was a lot of belief and confusion around the why and the how much. In fact, in 1969, the Food and Drug Administration (FDA) increased the recommended amount of iron in many foods by 50%. They wanted to increase it more, but dozens of scientists testified against this. Nonetheless, to this day, excessive iron is still found in most processed foods containing grains. Scarier still, iron and folate are often found in these foods at twice the listed amount. Add the fact that people tend to eat double the recommended serving size of cereal, and you can imagine how much excess iron and folate are being consumed through cereal alone. (10)

We will dive deep into the importance of and issues with iron throughout these blogs, especially in our blog on Iron 101. For now, it’s worth noting that iron is the 4th most common element on this planet. Iron is highly oxidizing (i.e., it causes our body to rust); it is the metal that ages us when left unchecked. (11) And our body has a very sophisticated system for recycling, storing, and keeping iron in balance, with little need for additional iron. Thus, contrary to the FDA, we do not want or need excessive iron.

In the 1950s, two of the most dangerous ideologies were born on top of nutritionism. In 1955, President Dwight David “Ike” Eisenhower had a heart attack. His cardiologist, the famed Dr. Paul Dudley White, villainized the president’s high-fat, high-cholesterol diet and put him on a low-fat, low-cholesterol diet instead. (The fact that Eisenhower had smoked four packs of cigarettes a day until 1949 didn’t seem to cross anyone’s mind as a likely cause.) When Ike ran for president again in 1956, the low-fat diet was credited for his recovery and ability to return to work. What never got publicized is that Ike hated his low-fat diet. He felt hungry all the time even as he gained weight and his cholesterol continued to rise. He also continued to have heart attacks: six more after leaving office, the last and fatal one in 1969.

The supposed link between saturated fats and heart disease was a marketing windfall for the food industry. Suddenly they had a demand for a product that had been pretty unpopular up until then: margarine. (12)

The problem was that Dr. White based his advice on a faulty and manipulated study called the Seven Countries Study by Ancel Keys (13,14). Keys set out to prove a hypothesis that a fat-rich diet was responsible for heart disease and overall poor health.

The study was so named because it examined the diets of populations in Finland, Greece, Japan, Italy, the Netherlands, Yugoslavia, and the U.S. From this study, Keys claimed proof that cholesterol levels were strongly related to heart disease mortality. Unfortunately, the study was highly flawed because, among other things, Keys buried findings from 15 other countries that contradicted his desired outcome. Sadly, and to many people’s detriment, Keys remained adamant that dietary saturated fats caused elevated cholesterol levels, which in turn caused heart disease.

The logic went like this: saturated fats, such as those found in butter, meat, cheese, and eggs, raised serum cholesterol in laboratory animals and humans. Because cholesterol is a major component of atherosclerotic plaques, and early studies had linked high serum cholesterol levels to heart disease, saturated fat must therefore cause heart disease. (15)

Nothing could be further from the truth. Recent evidence and studies have called the risks of fats into question. A paper published in the January 29, 2015, edition of the BMJ’s Open Heart examined the data on fat and cardiovascular disease that was available to U.S. and U.K. regulatory committees when the 1980 and 1984 guidelines were issued (16).

“The analysis revealed that the six randomized controlled trials available back then did not provide sufficient evidence that cutting total fat or saturated fat intake reduces deaths from heart disease. The authors conclude that the ‘dietary advice not merely needs review; it should not have been introduced.'”

In 1957, the American Heart Association (AHA) joined forces with Keys (17). Soon, AHA spokespersons took to television to warn the American public about the dangers of butter, eggs, bacon, and other saturated fat food sources in relation to heart disease. The government quickly followed suit, issuing federal guidelines recommending that a low-fat diet be followed in order to prevent heart disease.

Every five years since the late 1970s, the USDA has issued a set of dietary guidelines for Americans. And every five years, predictably, the food industry rises up to make sure that eating less of anything isn’t in those guidelines. Politics and lobbying shape these dietary recommendations far more than actual science does. This lobbying eventually produced the food pyramid. So how has this misinformation affected us?

As a result, “From 1909–1999, consumption of soybean oil in the United States increased by more than 1,000-fold per person and margarine consumption increased 12-fold, whereas consumption of butter and lard decreased by about four-fold each (18).”

Rather than protect people, this advice has produced a nation of overweight, sick, and obese citizens.

“[If] foods are understood only in terms of the various quantities of nutrients they contain,” Gyorgy Scrinis wrote, then “even processed foods may be considered to be ‘healthier’ for you than whole foods if they contain the appropriate quantities of some nutrients.” (19)

No idea could be more welcome to manufacturers of processed foods, which surely explains why they have supported, if not outright driven, the nutritionism bandwagon.

Before we wind up this section, let’s lightly touch on cholesterol and retinol.

Without diving in too deep, here are a few key points on cholesterol. Every day, our liver produces around 1,000 mg of cholesterol. Cholesterol is the most important component and structural unit of cell membranes. It is vital for a normally functioning nervous system. It is the building block our body uses to manufacture numerous hormones. It also serves as an effective and powerful anti-inflammatory agent. You can read more about cholesterol in Cholesterol 101 here.

Last, but certainly not least, retinol, the bioavailable form of vitamin A, is a fat-soluble vitamin found only in animal-sourced foods, such as liver, oily fish, cheese, butter, heavy cream, and egg yolks. As you can imagine, retinol has largely disappeared from our diets because of the modern low-fat diet.

Retinol plays an essential role in body growth, energy production, immune function, vision, and reproductive health. And retinol is a key player in the Mineral Cure Diet. You can read about Retinol 101 here.

Now let’s look at the third cause of our mineral depletion and imbalance: modern medicine.

Modern Pharmaceutically Driven Medicine:

At the dawn of the 20th century, doctors trained and practiced medicine in diverse ways. It is estimated that at least 25% of them practiced holistic health care that emphasized diet and nutrition as primary preventative and therapeutic treatments.

At this time, a movement funded by Andrew Carnegie, one of the wealthiest men in the world, set out to improve and revise medical education. The result was the closure of many medical schools and the refocusing of the remaining ones on narrow, symptom-centered, drug-based treatment regimens. Lost in the reorganization and revamping were whole-patient care and other treatment options, like diet and nutrition.

The problem partially stems from this reinvention of the medical system. At the time, the change made sense: acute, infectious diseases were the leading causes of death.

Treatment in these cases was relatively simple: the patient developed pneumonia, went to see the doctor, received an antibiotic (once they were invented), and either got well or died. One problem, one doctor, one treatment.

However, as lifespans extended by decades and our food, diet, and agriculture changed, new diseases arose along with multiple complications. The one-problem, one-treatment model could not keep up.

Modern Western, or orthodox, medicine focuses on acute treatment of the patient’s symptoms through prescription medications, surgical operations, various forms of therapy, and radiation. The advantage of this modern system is that it is ideal for urgent situations that require serious or immediate care. Additionally, modern pharmaceuticals can quickly alleviate many negative symptoms, allowing people to resume their daily lives with minimal interruption or discomfort.

However, modern Western medicine often treats only the symptoms without addressing the root cause of the issue. Additionally, the long-term effects of many medications can take a huge toll on the body over time. Because pharmaceuticals in far too many situations don’t fix the underlying issues, they are prescribed for the rest of a patient’s life. Sadly, as a result, prescription drugs have become the third, or at best the fourth, leading cause of death. (20) (21)

It is not just the focus on prescription drugs and symptoms; the conventional medical system now makes it next to impossible for a primary care provider (PCP) to offer much more than a one-off solution. The average patient visit with a PCP lasts about 10 to 12 minutes, and the average PCP has about 2,500 patients on their roster. If a patient has multiple chronic conditions, takes several medications, and presents with new symptoms, it is nearly impossible to provide quality care in a 10-minute visit.

Things are only getting worse. Chronic disease is shortening our lifespan, destroying our quality of life, bankrupting governments, and threatening the health of future generations. And unfortunately, conventional medicine continues to fail to address this crisis adequately.

Six in 10 Americans now suffer from chronic disease, and four in 10 have multiple chronic conditions. Chronic disease is responsible for seven of 10 deaths each year. The rate of chronic disease in kids more than doubled between 1994 and 2006. Ninety percent of the $3.5 trillion we spend on healthcare in the United States each year treats chronic disease and mental health conditions. (22)

Many pharmaceuticals are made using petrochemicals. Many contain some form of fluoride, and many contain iron oxides. These ingredients, and the drugs’ very chemical function, either directly or indirectly deplete magnesium and copper or add to the iron load in the body.

Thus, while the intent behind modern pharmaceuticals and the medical system may have been spot-on at the turn of the 20th century, the system no longer serves or protects us. It has even been argued that the modern Western medical system is to blame for our current health care crisis. (23)

So what can we do?

Recent statistics suggest that more than 85 percent of chronic disease is caused by environmental factors like diet, behavior, environmental toxins, and lifestyle. (24)

We believe, and intend to show, that through education and through diet and lifestyle changes focused on ancestral concepts, you can cure your mineral imbalance, remove the other bad stuff, take control of your life, and make all the difference for yourself.


  8. Ben Edwards, MD, based on the pioneering and enduring work of McCance and Widdowson, The Composition of Foods, published by the Institute for Food Research
  16. Harcombe, Z., et al., 2015. Open Heart.
  17. Page et al., 1957. Atherosclerosis and the Fat Content of the Diet. Circulation 16:163–178.
  18. Blasbalg, T. L., et al., 2011.
