Alcohol addiction and the THIQ Hypothesis

This is a summary of research results, a compilation of observations of mine, all tossed together with speculations based on my education and expertise. Frankly, I’m quite convinced of the conclusions, enough so that I intend to pursue biochemical and genetic research on this. There’s enough here to spend a lifetime on, and I intend to, unless I can get real results sooner. I find it a fascinating study in the operation of science, as much as in addictions.

It starts some years ago with dead people. In a fairly well known (among those in the addiction field) story, a researcher was looking into brain structures using dead people at a coroner’s office. She knew about the particular changes that occur with chronic opiate use. While examining the brains in question, she remarked to the coroner that it was surprising that so many of their subjects were junkies. The coroner replied that these were in fact winos who suffered from all the signs of chronic alcoholism in all their body tissues, and none were shown to be addicted to opiates. So began the research into the chemical similarities between alcohol and heroin addiction. Among the results of this was the fact that there was a general atrophy of endorphin receptors. Somehow, these receptors, those stimulated by either endorphins (endogenous morphine) or the plant kingdom’s real stuff, were getting wiped out. Examining them showed that they were being plugged by a molecule which fit into the receptor, but was dissimilar enough that it was not being removed.

This chemical is tetrahydroisoquinoline. It is a normally formed breakdown product of the monoamine neurotransmitters. Monoamine oxidase attacks these neurotransmitters once they’ve done their job, removes them from the receptors, and disassembles them for reuptake into the neuron for recycling. These breakdown products are normal, but occur only in very small amounts. In the presence of acetaldehyde, the first breakdown product of ethyl alcohol, and in fact a product of burning tobacco, monoamines break down much more frequently into this chemical.

THIQ - tetrahydroisoquinoline

When acetaldehyde is present, THIQ forms. It gets plugged into endorphin receptors, and stimulates them. This is a primary agonist action. However, the part of the molecule which protrudes from the receptor is *not* shaped like the neurotransmitter it acts as, and the monoamine oxidase cannot remove it or break it down. It gets stuck in the receptor, preventing it from being used again. This is a secondary antagonist action — a permanent one. With more and more endorphin receptors being taken out of action, the person begins to feel the lack. They feel the need to return to the previous balance. They have already trained their brain to know that using alcohol (or tobacco) relieves this need. So they use some more. Once enough receptors are taken out by THIQ, and the person attempts to correct it, an escalating spiral has started.

This THIQ hypothesis made big noise when it was first introduced, a little over 5 years ago. At first, there were studies which showed that the hypothesis was flawed. It fell out of favor. Then, more studies showed that the reaction would in fact take place, and it came back in. Finally, in the absence of corroborating evidence in biological systems, it fell out of favor again.

Last summer, out comes a claim from a researcher named Mele. He had been hired by a tobacco company to do studies on, well, I’m not really sure. But what he ended up doing was showing that rats preferred to take acetaldehyde. Since it’s nasty stuff, they won’t drink it, so the apparatus used included IV injection. Once habituated to nicotine, rats would press a bar 12 times on average for water. They would press 4 times that for water and nicotine. But they would press it 10 to 40 times that for water, nicotine and acetaldehyde. Since this appeared to substantiate the claim that tobacco was addictive (recall acetaldehyde is in tobacco smoke) the company decided not to publish his results (so goes his fairly well substantiated claim). The corroborating evidence for acetaldehyde’s role in mediating *some forms* of addiction apparently exists, and in fact existed 10 years ago, when he did this work.

What else supports this? The long standing claim that genetics plays a part. Alcohol breaks down by the action of alcohol dehydrogenase on the alcohol, removing a hydrogen, forming acetaldehyde. This toxin is supposed to be removed quickly by the action of acetaldehyde dehydrogenase (away with another hydrogen) forming acetate and water. If, in this two step process, there is either relatively too much alcohol dehydrogenase, or relatively too little acetaldehyde dehydrogenase, a build up of acetaldehyde will occur. The levels of both of these enzymes are genetically determined. It would seem that if the gap between these were too large the person would be more prone to addiction.
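This two-step bottleneck can be sketched as a toy rate model. The rate constants below are invented stand-ins for the genetically determined enzyme levels, not measured values; the point is only to show how a relative deficiency of the second enzyme lets the intermediate accumulate.

```python
# Toy model of the pathway described above:
#   ethanol --alcohol dehydrogenase--> acetaldehyde
#   acetaldehyde --acetaldehyde dehydrogenase--> acetate + water
# k_adh and k_aldh are hypothetical enzyme activities.

def peak_acetaldehyde(k_adh, k_aldh, ethanol=1.0, dt=0.01, steps=2000):
    """Euler-step the two reactions and return the peak acetaldehyde level."""
    acetaldehyde, peak = 0.0, 0.0
    for _ in range(steps):
        produced = k_adh * ethanol * dt        # step 1: remove a hydrogen
        cleared = k_aldh * acetaldehyde * dt   # step 2: remove another
        ethanol -= produced
        acetaldehyde += produced - cleared
        peak = max(peak, acetaldehyde)
    return peak

# Matched enzyme levels clear the toxin quickly; a relative
# acetaldehyde dehydrogenase deficiency lets it build up.
balanced = peak_acetaldehyde(k_adh=1.0, k_aldh=5.0)
deficient = peak_acetaldehyde(k_adh=1.0, k_aldh=0.5)
assert deficient > balanced
```

The same sketch covers the other case in the text: raising `k_adh` instead of lowering `k_aldh` produces the same build-up, since only the gap between the two matters.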

Yet the Japanese are very often extremely deficient in aldehyde dehydrogenase, hence their tendency to turn red, sweat, get cramps, etc., when they take alcohol. This effect is so common that it’s called the ‘oriental flush’. This is precisely the effect seen when someone drinks on Antabuse – disulfiram – which blocks the action of aldehyde dehydrogenase. It appears there is a range of disparity between the enzymes which allows the acetaldehyde to build up to the point that THIQ is formed, but not so much that the person suffers too much of this effect when they drink.

Many studies have been done on alcohol metabolism. Most have shown that the metabolism of alcohol does not correlate with incidence of alcoholism. Not in racial groups, and not in individuals. My contention is that they’ve studied the wrong thing. They looked at the metabolism of alcohol, *not* the metabolism of acetaldehyde. Alcohol provides the primary action sought by users, and may in itself be habit forming, but I claim that it is the acetaldehyde which mediates the addiction. My field of personal interest is in studying what appears to be an extremely high rate of alcoholism in Native Americans. Incidence rates in 1980 were estimated at 70%. Those that had it suffered adverse consequences in all body systems more so than most other groups, and much sooner.

Native Americans are a genetic group coming from three separate ancestral groups from the orient. It is likely that their genetic makeup, with respect to the markers for acetaldehyde production, falls pretty much at the optimum for THIQ production. What’s it going to take to prove this? First, following up on Mele’s work to substantiate it. Ksir at Wyoming (co-author of probably the best drug use and abuse textbook on the market) is in the process of doing this. Getting the rats to take acetaldehyde is apparently the sticking point.

Second, redoing the alcohol metabolism studies with measurement of acetaldehyde levels over time. Last, determining the genetic markers for acetaldehyde production, and correlating them with incidence rates in genetic groups, and with individuals who have been shown to be particularly susceptible to alcohol addiction. #1 being under way, I intend to work on #2. #3 will come once a report comes from the Human Genome Project that it has isolated the genotypes which determine acetaldehyde dehydrogenase production. We know the specific enzymes and locations of production, so it won’t be too tough a job. Getting it past the ethics committees will be tougher.

As I said, some of this is speculative. But too many pieces of the puzzle fit too well for it to be entirely wrong. If it all turns out, it will give us a genetic marker for testing to determine if a person is at high risk to develop addiction to anything which produces acetaldehyde. This will give the person the chance at informed decision making. And given the expected advances in genetic medicine, it should be possible to correct the enzyme levels if not in individuals, then in their offspring, and so reduce their risk. Addiction would still exist. But some cases of it could be prevented either by high risk persons choosing not to use, or physical reduction of the risk factor. I welcome comments, particularly thoughtful comments, regarding the background, logic and conclusions here. I am, at the root of it all, interested in discovering the true nature of addictions, or as much of it as possible. This is the end of Dynasor’s writing.

The short story is: Rats were placed in one cage with water and an alcohol mixture. The rats chose to drink the water, completely ignoring the alcohol mixture.

The rats were taken out of the cage. THIQ was surgically implanted in their brains and they were placed back in the cage. Now they drank the alcohol, completely ignoring the pure water, until they died.

Such is the nature of our disease. Surgically implanting the THIQ completely bypasses all social factors, so that whether or not we went to church, got potty trained, came from a split family, were sexually abused, or any other number of social factors matter not one whit. It is completely a matter of whether or not the brain manufactures THIQ. If it does (and we drink at all), we become alcoholic.

If the brain does not manufacture THIQ we cannot become alcoholic, even if we drink a freight train load.

by Dennis McClain-Furmanski,

The Brain And Alcohol: Still a Mystery

The brain is still a big scientific mystery, and the effects of alcohol on a developing brain are even murkier.

Scientists say alcohol alters how individual brain cells operate and how entire regions of the brain function.

It reduces the density of grey and white matter and shrinks the brain itself. In very extreme cases, the brain looks almost mushy.

As pediatrician and FASD (Fetal Alcohol Spectrum Disorder) expert Sterling Clarren likes to say, “The big, important message is that alcohol does not affect a small piece of the brain. It affects everything.”

Alcohol is a teratogen, a substance that causes birth defects.

TERATO: from the Greek meaning monster

GEN: from the Greek meaning make

TERATOGEN: monster-maker

Alcohol can shrink the corpus callosum, a thick band of fibres in the brain’s core that connects the right and left hemispheres, the logical and emotional halves of the brain.

If your corpus callosum is stunted, it’s difficult for the two sides of the brain to talk to each other, resulting in problems storing and retrieving information, paying attention and problem-solving. Some people with severe FASD don’t have a corpus callosum at all.

Alcohol can shrink the cerebellum, which is in charge of processing inputs from other areas of the brain to co-ordinate motor and thinking skills. If it gets damaged, fine motor skills can be ruined.

Individual brain cells can be mangled by alcohol, damaging everything from the regulation of gene expression, to the way cells interact, to the growth and survival of neural stem cells — the basic regenerating cells of the brain.

The glial cells — the security blankets that protect and feed each neuron — can also be profoundly affected.

There is no known safe level of alcohol a woman can drink while pregnant without harming her baby.

Binge drinking or chronic alcoholism present the bigger risk.

But women who drink only occasionally could also damage their babies, depending on the development stage of the fetus and other factors such as nutrition, stress, smoking, how fast a woman metabolizes alcohol and old-fashioned genetics.

Not every child whose mother drank during pregnancy will suffer brain defects. Half emerge unscathed.

Genetics play a huge and mysterious part in determining which child will be most affected by alcohol in utero.

Even among twins, one might have a genetic tick that modulates the effect of alcohol while the other will suffer brain damage.

The worst effects occur during the first trimester. Organs are starting to form, so adding an alcohol “insult,” as researchers call it, can cause congenital heart problems, kidney damage and brain defects.

It’s also in the first trimester that the classic FASD face forms.

In the second and third trimester, alcohol will affect how the brain matures, especially its wiring.


Sources: Dr. Albert Chudley, and various academic papers, including Foetal Alcohol Spectrum Disorders and Alterations in Brain and Behaviour by Consuelo Guerri, Alissa Bazinet and Edward Riley (Alcohol and Alcoholism, 2009)

By: Staff Writer,

The Chemicals In Your Cosmetics

Sodium lauryl sulfate is an effective degreaser used to clean oil stains from the floor of my mechanic’s repair shop; what’s it doing in my toothpaste and my daughter’s bubble bath? And, why is the long-known carcinogen nitrosamine, banned in Canada and the European Union, still a common ingredient in my mascara, concealer, sunless tanning lotion and baby shampoo?

The simple answer is that the U.S. Food and Drug Administration still doesn’t bother to regulate anything it dismisses as cosmetics — any products used topically — despite the growing science showing how easily poisons and pollutants can be absorbed through the skin. Since the 1930s, the only thing the FDA regulates is the accuracy of the labeling on cosmetics.

As long as manufacturers list in gory detail the witches’ brew of industrial chemicals, heavy metals, and toxic substances they blend into your eye cream or face wash, they are free to dump whatever they want into your epidermis.

As consumers, we are left to defend ourselves armed only with unintelligible ingredient labels and confusing news reports about what parts per billion of something can cause cancer or Alzheimer’s. Americans are taking their bodies on a magical mystery tour full of chemicals and heavy metal toxins by way of basic grooming habits.

Just a little Googling reveals that every day we are exposed through personal care products to more than 10,000 nasty chemicals banned elsewhere in the world. Everything from lip balm to hand lotion is filled with stuff we wouldn’t dream of putting in our stomachs. Instead, we eagerly spread it over the largest organ of the body — ensuring effective absorption and exposure to a daily dose of illness-inducing and cancer-causing garbage. The American medicine cabinet has become a virtual Love Canal of hidden industrial waste that wouldn’t be allowed anywhere else.

For example, the Environmental Protection Agency requires workers to wear protective gloves, clothing, and goggles when handling chemicals like Diazolidinyl Urea and Propylene Glycol when they manufacture your favorite antiperspirant. The EPA warns workers against skin contact with these chemicals because they are known to cause brain, liver, and kidney abnormalities — in concentrations lower than those found in off-the-shelf stick deodorants. By contrast, you are not even given a fair warning by the deodorant industry as it encourages you to apply these very same poisons to your naked underarms every morning.

Okay, so according to Washington it’s every woman for herself, but ever try to read the ingredients of your shampoo? I mean the ingredients that are actually listed? Good luck even pronouncing isobutylparaben. And if “fragrance” is involved you’ll never actually get the straight story. Fragrance is protected as a trade secret and up to 200 suspect ingredients can be buried in there with no call-out.

In a recent Congressional hearing the head of the FDA’s Center for Food Safety and Applied Nutrition, Stephen Sundlof, waved the white flag when he said, “The law as it is currently written allows virtually anything to be incorporated into a cosmetic.” This lack of oversight means that consumers actually know very little about what makes up their make-up. And there is little rigor to the enforcement of existing policies: only nine out of tens of thousands of chemicals have been banned in the U.S., compared to 11,000 so far in the E.U. Even more alarming is the fact that only 11 percent of ingredients used by Americans in personal care products have even been reviewed for safety — by anyone.

So, what have the Europeans and Canadians figured out that we have not? For one, their governments don’t rely on a voluntary reporting system to monitor product safety. Incidents — from adverse reactions to longitudinal health surveys — are made public by law. Under decades-old U.S. law, cosmetics companies are not required to publicly submit information on the safety of their products so, surprise, they don’t. And the toothless FDA relies almost solely on the Cosmetic Ingredient Review (CIR), the industry’s self-policing safety panel, for its product safety data. European regulators do their own safety research and reporting.

While the poets may consider your body a wonderland, the truth is it’s more likely a wasteland of built-up toxins that would earn perpetrators federal jail time if they dumped it into any canal other than the alimentary.

What we need is a green movement for the human body. Improving consumer protections against “body dumping” must start with the FDA. Fortunately, even with a regulation-averse Congress, much of the FDA’s powers are interpreted internally. There are numerous administrative steps the FDA can take without Congress butting in — if it is so motivated by public alarm. You can contact your regional FDA office and make some noise. Several good organizations under the banner of the Campaign for Safe Cosmetics — including the Environmental Working Group and Health Care Without Harm — have been banging the drum in Washington, but they need our help to be effective.

It seems our city sewers have more protections than we do. As a creative alternative, perhaps we could declare ourselves micro-dumps and ask for protections under the EPA. Or we might seek relief from broader protections granted to us under the Occupational Safety and Health Administration (OSHA). Hazmat-clad technicians could scan our ditty bags for offending lipstick and hand creams.

One has to wonder if all this would be different if men wore makeup and a tad more product in their hair.

Source: The Huffington Post,  Estelle Hayes is a Silicon Valley journalist and blogger.

Memory Loss Can Be Caused By Over-The-Counter Drugs

Did you know that common over-the-counter drugs or prescriptions can cause memory loss and cognitive impairment?

Mild cognitive impairment is a common, age-linked condition that is often an early sign of Alzheimer’s disease. Its cardinal symptom is forgetfulness or impairment of short-term memory.

Numerous drugs have been shown to produce mild cognitive impairment (MCI). They may create or aggravate Alzheimer’s-type symptoms.

(NOTE: You should NOT stop taking medications without first consulting your physician.)

Most of the drugs that cause MCI have a property called “anti-cholinergic.” They inhibit activity of the neurotransmitter acetylcholine, which plays a critical role in memory and cognitive function.

Here’s the problem: only a few of these drugs are officially classified as anti-cholinergic. The official anti-cholinergic drugs are mostly used for relieving intestinal cramps or bladder irritability and are labeled “anti-spasmodic.” They’re at the top of the list below.

But there are 17 additional types of drugs used for many other purposes that may also have anti-cholinergic effects. The list includes commonly used drugs like antihistamines, acid blockers and antidepressants. Unfortunately, many doctors and pharmacists are unaware of the anti-cholinergic properties of these medications.

In an address to the American Academy of Neurology at the 60th Annual Meeting, Dr. Jack Tsao, associate professor of neurology at Uniformed Services University in Bethesda, Maryland, said, “… a lot of medicines that are not advertised as anti-cholinergic in nature actually have anti-cholinergic properties.” Dr. Tsao and his colleagues followed a group of nuns and clergy from the Rush Religious Orders for about eight years and found an accelerated rate of cognitive decline in those who began using anti-cholinergic drugs.

Several published studies have also shown that people taking drugs with hidden anti-cholinergic effects are at increased risk for MCI.

It is likely that these drugs have additive effects: the more anti-cholinergic drugs a person takes at one time, the greater the risk of side effects.

Because the list is long and includes drugs used for many different purposes, it is possible for an individual’s total burden of anti-cholinergic drug activity to be much higher than expected.
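The additive-burden idea can be illustrated with a small sketch. The drugs and scores below are invented for the example; real clinical anti-cholinergic burden scales exist, but these numbers are not taken from any of them.

```python
# Hypothetical anti-cholinergic burden scores (illustration only).
burden_scores = {
    "diphenhydramine": 3,   # antihistamine with strong activity
    "ranitidine": 1,        # acid blocker with weak activity
    "cyclobenzaprine": 2,   # muscle relaxant
}

def total_burden(regimen):
    """Sum the scores of every listed drug in a person's regimen."""
    return sum(burden_scores.get(drug, 0) for drug in regimen)

# Each drug alone looks minor; taken together, the burden adds up.
print(total_burden(["ranitidine"]))                                        # 1
print(total_burden(["diphenhydramine", "ranitidine", "cyclobenzaprine"]))  # 6
```

This is why a regimen assembled one prescription at a time, each individually reasonable, can carry a much higher total anti-cholinergic load than anyone intended.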

In addition, advanced age is associated with increased susceptibility to anti-cholinergic drugs because of a reduction in acetylcholine activity with age.

If you’re concerned about MCI in yourself or someone you know, check the list of drugs below to see if medication might be contributing to the problem.

Drugs with Anti-cholinergic Properties

Some of these are available without prescription and may be found alone or combined with other drugs, especially in over-the-counter cold and headache remedies. Don’t just rely on the product’s name. Check all ingredients. Bring this information to your doctor. Do not discontinue the use of any prescription drug without your doctor’s approval.

Antispasmodics: used to relieve intestinal cramps or bladder symptoms, these are also found in numerous over-the-counter and prescription combination products used for colds and coughs, with various brand names:

o Atropine
o Belladonna (Donnatal and others)
o Clidinium (Quarzan)
o Dicyclomine (Bentyl and others)
o Flavoxate (Urispas)
o Glycopyrrolate (Robinul)
o Hyoscyamine (Levsin, NuLev, Cystospaz and many others)
o Oxybutynin (Ditropan and others)
o Solifenacin (VesiCARE)
o Propantheline (ProBanthine and others)
o Scopolamine (Transderm-Scop and others)
o Tolterodine (Detrol)
o Trospium (Regurin and others)

Antihistamines: these are used in numerous over-the-counter and prescription products alone or in combination with other drugs for relieving symptoms of allergies, colds, dizziness or improving sleep:

o Azatadine (Optimine and others)
o Chlorpheniramine (Chlor-Trimeton and others)
o Clemastine (Contac, Tavist and others)
o Cyproheptadine (Periactin)
o Desloratadine (Clarinex and others)
o Dimenhydrinate (Dramamine and others)
o Diphenhydramine (Benadryl and many others)
o Doxylamine (Unisom and others)
o Hydroxyzine (Atarax, Vistaril)
o Loratadine (Claritin and others)
o Meclizine (Antivert and others)
o Pyrilamine

Note: Fexofenadine (Allegra) and cetirizine (Zyrtec) are antihistamines without anti-cholinergic effects, but may cause sedation.

Antacids: these are histamine H2 antagonists, used to relieve heartburn and stomach pain. For more on acid suppressing drugs, see my article “Stomach Acid and the Future of Health Care”:

o Cimetidine (Tagamet)
o Famotidine (Pepcid)
o Nizatidine (Axid)
o Ranitidine (Zantac)

Note: Although these drugs have relatively weak anti-cholinergic activity, their use is associated with MCI in older adults.

Antidepressants: used to treat depression and related mood disorders:

o Amitriptyline (Elavil and others)
o Amoxapine (Asendin)
o Citalopram (Celexa)
o Clomipramine (Anafranil)
o Desipramine (Norpramin)
o Doxepin (Sinequan and others)
o Duloxetine (Cymbalta)
o Escitalopram (Lexapro)
o Fluoxetine (Prozac)
o Imipramine (Tofranil)
o Lithium
o Nortriptyline (Pamelor, Aventyl)
o Paroxetine (Paxil and others)
o Protriptyline (Vivactil)

Muscle relaxants:

o Carisoprodol (Soma and others)
o Chlorzoxazone (Parafon Forte and others)
o Cyclobenzaprine (Flexeril and others)
o Methocarbamol (Robaxin and others)
o Orphenadrine (Norflex and others)

Antiarrhythmics: used to treat cardiac arrhythmias:

o Digoxin
o Disopyramide (Norpace and others)
o Procainamide
o Quinidine (Quinaglute and others)

Antiemetics: used to suppress nausea or vomiting:

o Promethazine (Phenergan and others)
o Prochlorperazine (Compazine and others)
o Trimethobenzamide (Tigan)

Antipsychotics: used for severe psychiatric disorders:

o Chlorpromazine (Thorazine and others)
o Clozapine (Clopine and others)
o Mesoridazine (Serentil)
o Olanzapine (Zyprexa)
o Promazine
o Quetiapine (Seroquel)
o Thioridazine (Mellaril)

Antiparkinsonian: used in the treatment of Parkinson’s disease and related disorders:

o Amantadine (Symmetrel)
o Benztropine (Cogentin)
o Biperiden (Akineton)
o Procyclidine (Kemadrin)
o Trihexyphenidyl (Artane and others)

Other drugs:

These drugs were shown to have anti-cholinergic effects at high concentration. They may exert clinically significant anti-cholinergic side effects when used at high doses or in people with impaired kidney function or a heightened susceptibility to anti-cholinergic side effects:

o Amoxicillin (an antibiotic)
o Carbamazepine (Tegretol, a drug for controlling seizures or chronic pain)
o Celecoxib (Celebrex, an anti-inflammatory pain reliever)
o Cephalexin (Keflex, an antibiotic)
o Diazepam (Valium, a tranquilizer)
o Diphenoxylate (Lomotil, a drug for diarrhea)
o Fentanyl (Duragesic, a narcotic pain reliever)
o Furosemide (Lasix, a diuretic used for fluid retention)
o Hydrocodone (a narcotic pain reliever, found in Vicodin)
o Lansoprazole (Prevacid, a proton pump inhibitor, used to reduce stomach acid)
o Levofloxacin (Levaquin, an antibiotic)
o Metformin (Glucophage, a drug that reduces blood sugar, used by diabetics)
o Phenytoin (Dilantin, a drug for controlling seizures)
o Temazepam (Restoril, a sleeping pill)
o Topiramate (Topamax, a drug used for preventing migraine headaches)

A medication does not have to be swallowed or injected to exert systemic effects.

Anti-cholinergic eye drops may affect the brain. They are used to dilate the pupils. These include:

o Cyclopentolate
o Homatropine
o Tropicamide

Anti-cholinergic Herbs: Numerous herbs and natural products have anti-cholinergic effects and may be more hazardous than medications. Here are those that have been studied the most:

o Amanita muscaria (fly agaric)
o Amanita pantherina (panther mushroom)
o Arctium lappa (burdock root)
o Atropa belladonna (deadly nightshade)
o Cestrum nocturnum (night blooming jessamine)
o Datura metel (yangjinhua, used in traditional Chinese remedies)
o Datura suaveolens (angel’s trumpet)
o Datura stramonium (jimson weed)
o Hyoscyamus niger (black henbane)
o Lantana camara (red sage)
o Phyllanthus emblica (Indian gooseberry)
o Solanum carolinense (wild tomato)
o Solanum dulcamara (bittersweet)
o Solanum pseudocapsicum (Jerusalem cherry)

Learn more about herbs, traditional uses and side effects in my Herb Guide

In addition to memory loss and cognitive impairment, anti-cholinergic drugs may cause nervousness, confusion, disorientation, hallucinations, restlessness, irritability, dizziness, drowsiness, blurred vision and light sensitivity.

Know What You Are Taking

You should know everything that you or people in your family are taking: drugs and supplements and their potential side effects and interactions. If cognitive impairment is a problem and you’re taking one or more of the substances listed above, what you’re taking may be a cause or contributor.

The Raw Truth About Raw Vegan Diets: A Primal Perspective

Raw vegan diets are all the rage these days.  Advocates claim that a diet composed exclusively of raw plant foods will support optimal health, protect animals, and save the planet.

The raw truth is that raw vegan diets don’t support health for most people, for a very simple reason:  Humans are not adapted to a raw vegan diet.  For that matter, not even our closest primate relative, the chimpanzee, is adapted to a raw vegan diet.

Let’s take a critical look at the rationales and effects of raw vegan diets.

Raw Rationale

David Wolfe, raw food advocate, wrote a book entitled Nature’s First Law (Don’t buy it!).  In this book, he suggests that the “first law” of nature is that food is raw; if it’s not raw, it’s not food.   He derives this “law” from the observation that no animal other than humans cooks anything before eating it.  Thus, humans break the “law.”

Let’s try more reasoning like Wolfe’s:

  • No other animal uses language, therefore humans should not use language.
  • No other animal wears clothes, therefore humans should not wear clothes.
  • No other animal makes violins, therefore humans should not make violins.
  • No other animal writes sonnets, therefore humans should not write sonnets.

These people apparently have not noticed that by their reasoning, we humans probably should not be, well, human.

And, particularly pertinent to the raw food lifestyle, no other animal uses metal knives, blenders, dehydrators, grinders, or juicers, therefore humans should not process foods with any of these items either.  Yet raw foodists seem plenty happy to apply the knife, high speed blending, mechanical grinding, and juice extraction to food.

Flimsy Analogies

Wolfe and other raw foodists also are fond of using analogies like this:  “If you set fire to your house, it does not improve the house, it destroys it.  Therefore, we can conclude that applying fire to food can only destroy, not improve it.”

Raw vegan blogger Steve Pavlina states it this way:

“Incidentally, if you want to see what happens to protein when you cook it, pluck a hair off your head and put a flame under it. Cooked protein becomes a sticky mess that doesn’t digest well at all. Raw plant foods provide all the protein we need, in the right form for easy assimilation.”

Wolfe and Pavlina have constructed straw man arguments against cooking by conflating it with incineration.

Perhaps their starved brains can’t see this, but incinerating a house or human hair with a direct flame is, well, not quite the same thing as cooking.   Cooks don’t light food on fire, they use finesse to capture and employ radiant heat arising from flames to alter the physical properties of the foods.  Strictly speaking, it is not the fire that they use, it is the heat.

Does cooking make food less valuable?  On the contrary, skillfully applied heat dramatically increases the nutritional value of plant foods.

Hedren et al. performed an experiment designed “to develop an in vitro digestion method to assess the impact of heat treatment, particle size and presence of oil on the accessibility (available for absorption) of alpha- and beta-carotene in carrots.”  Their methods:

“Raw and cooked carrots were either homogenized or cut into pieces similar to chewed items in size. The carrot samples, with or without added cooking oil, were exposed to an in vitro digestion procedure. Adding a pepsin-HCl solution at pH 2.0 simulated the gastric phase. In the subsequent intestinal phase, pH was adjusted to 7.5 and a pancreatin-bile extract mixture was added. Carotenoids released from the carrot matrix during the digestion were extracted and quantified on high-performance liquid chromatography (HPLC).”

Their results:

“Three percent of the total beta-carotene content was released from raw carrots in pieces. When homogenized (pulped) 21% was released. Cooking the pulp increased the accessibility to 27%. Addition of cooking oil to the cooked pulp further increased the released amount to 39%.”
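Restating those release figures as fold-increases over chewed raw pieces makes the progression easier to see; the percentages are the ones quoted in the results above.

```python
# Beta-carotene released in each Hedren et al. preparation (percent of total).
released = {
    "raw pieces": 3,
    "raw pulp": 21,
    "cooked pulp": 27,
    "cooked pulp + oil": 39,
}

baseline = released["raw pieces"]
for prep, pct in released.items():
    print(f"{prep}: {pct}% released, {pct / baseline:.0f}x the raw pieces")
```

Cooking the pulp gives 9 times the release of chewed raw pieces, and adding oil pushes that to 13 times.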

Look, although they treated the finely chopped (chewed-simulation) carrots with an HCl solution at pH 2.0, only 3 percent of ß-carotene was released.  Don’t you think this tells us something?  For example, maybe our gut isn’t equipped to adequately digest raw plants as they occur in nature?

The human gut can’t extract high amounts of anything from raw carrots or similar vegetables for one principal reason:  our bodies do not produce cellulase, the enzyme required to break down the cellulose walls of the plant cells that contain all the nutrients supplied by fibrous plants.

Almost all of the valuable nutrients in plants occur inside the cells of the plants.  These cells have walls composed of cellulose.  Lacking any enzyme to break down this cell wall, humans must use other means to open the cells to extract the nutrients.  Outside modern industrialized nations, most people apply heat to the food, which causes the juice in the cells to expand and burst the cells open, making their contents more available for absorption.

Thus, carrots cooked to a soft texture deliver 9 times as much β-carotene as chewed raw carrots.  This experiment explains why raw foodists love and need their blenders and juicers.  Let’s say some raw food person eschews machinery.  Here’s the data:

  • One hundred grams of raw carrot contains 16,706 IU of potential vitamin A activity in the form of carotenes.
  • A human requires about 1000 mcg of retinol equivalent (RE) activity daily from food.
  • If only chewing the carrots, a human will extract about 3% of the carotenes, or 501 IU.
  • 10 IU of β-carotene from plant foods provides one RE.

From this we can conclude that 100 g of raw carrot provides only 50 RE.  Since a human requires about 1000 RE daily, he would have to eat 20 x 100 g, or 2 kg/4.4 pounds of carrots daily to even have a chance of meeting his vitamin A needs.
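If you want to check that chain of arithmetic, here it is in a few lines of Python (all figures come from the bullets above; the variable names are my own):

```python
# Vitamin A yield from chewed raw carrots, using the figures above.
IU_PER_100G = 16706   # potential vitamin A activity in 100 g raw carrot
RAW_RELEASE = 0.03    # 3% of carotenes released by chewing alone
IU_PER_RE = 10        # 10 IU of beta-carotene ~ 1 retinol equivalent (RE)
DAILY_RE = 1000       # approximate daily requirement

re_per_100g = IU_PER_100G * RAW_RELEASE / IU_PER_RE
grams_needed = 100 * DAILY_RE / re_per_100g

print(round(re_per_100g))             # ~50 RE per 100 g
print(round(grams_needed / 1000, 1))  # ~2.0 kg of carrots per day
```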

Since the typical person eats only 3-5 pounds of food daily, he would have to eat nothing but carrots.  Two kilos of carrots supply only 820 calories, assuming that we can extract 100% of the available calories from raw plants, or only about 24 calories if we extract calories from chewed carrots at the same 3% rate at which we extract β-carotene (the safer assumption, since all the sugars in carrots are also locked up in the same indigestible cells).

So this guy had better be prepared to spend a lot more time eating to meet his 2500 calorie daily requirement.  How about at least 6 kilos of carrots daily, and possibly 200 kilos, to meet that energy need?
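The calorie side works out the same way; a quick sketch, again using only the figures from the text:

```python
# Daily carrot tonnage needed for 2500 kcal, best and worst case.
KCAL_PER_KG = 820 / 2   # 2 kg of raw carrots ~ 820 kcal at 100% extraction
RAW_RELEASE = 0.03      # the same 3% release rate measured for beta-carotene
DAILY_KCAL = 2500

best_case_kg = DAILY_KCAL / KCAL_PER_KG                   # 100% extraction
worst_case_kg = DAILY_KCAL / (KCAL_PER_KG * RAW_RELEASE)  # 3% extraction

print(round(best_case_kg, 1))  # ~6.1 kg/day
print(round(worst_case_kg))    # ~203 kg/day
```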

So the raw fooders fall back on their blenders, which increase the nutrient delivery by about 7 times (from 3% to 21% release).  Now you need only about 285 g/0.6 lbs of carrots to get enough β-carotene to have a chance at adequate vitamin A production.  The caloric delivery soars to about 86 calories per kilo. Now we’re making some progress.
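Running the 21% release figure for homogenized carrots through the same conversion used above:

```python
# Blended raw carrot: how much for ~1000 RE of vitamin A activity?
IU_PER_100G = 16706
BLENDED_RELEASE = 0.21   # 21% released when homogenized (Hedren et al.)
IU_PER_RE = 10
DAILY_RE = 1000

re_per_100g = IU_PER_100G * BLENDED_RELEASE / IU_PER_RE  # ~351 RE per 100 g
grams_needed = 100 * DAILY_RE / re_per_100g

print(round(grams_needed))   # ~285 g/day
```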

Perhaps you can begin to see why some people rave about weight loss achieved when they gorge on raw foods.  The caloric delivery can be so low, you may as well be fasting.

Back to the vitamin A: all of this assumes that he was not one of the approximately 45% of people who don’t convert carotenoids to vitamin A at all. It appears that over the course of human evolution, the activity of β-carotene 15,15′-dioxygenase (the enzyme needed to convert β-carotene to retinol) has declined substantially. Hickenbottom et al. found that 45% of 11 men tested did not convert β-carotene to vitamin A (retinol); Lin et al. found the same in women; and Leung et al. identified gene polymorphisms contributing to this variability in carotene conversion capacity.

Now apply this to calcium.  Take a raw vegetable with a fairly high calcium content, such as collards, which may contain up to 250 mg of calcium per 100 g.  Like the β-carotene in carrots, this mineral lies inside the cells of the collards, surrounded by cellulose that we can’t digest.  This means we might extract only 3% of the calcium from raw collards, i.e. 7.5 mg per 100 g raw.  A human requires about 750 mg of calcium daily, so he’d have to eat 10 kilos of raw collards daily to get adequate calcium.  Let’s get chewing!
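And the collard arithmetic, for completeness (figures from the paragraph above):

```python
# Calcium from raw collards under the same 3% extraction assumption.
CA_MG_PER_100G = 250   # upper figure for calcium in 100 g raw collards
RAW_RELEASE = 0.03
DAILY_CA_MG = 750

extracted_mg = CA_MG_PER_100G * RAW_RELEASE  # mg actually available per 100 g
kg_needed = 0.1 * DAILY_CA_MG / extracted_mg

print(round(extracted_mg, 1))  # 7.5 mg per 100 g
print(round(kg_needed))        # ~10 kg of collards per day
```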

Actually, one might rightly question whether we can call a food that has gone through a juice extractor, blender, or grinder of any sort “raw,” “uncooked,”  or even “unheated.”

All of these devices treat the food with friction, and friction always generates heat, the key element of the “cooking” that raw foodists so passionately attack.  One might call the products minimally heated, but they are heated nonetheless.

Moreover, no other animal has to use blenders to get adequate nutrition from its raw food diet.  If you vegan raw fooders discard cooking because “no other animal does it,” shouldn’t you also discard blenders and juicers for the same reason?  Perhaps your starved brains can’t comprehend contradiction?

If your raw food diet only works if you use a blender, I have to wonder what you think your raw food ancestors did without those blenders, only invented in the 20th century.  So far as I know, no archaeological dig has found blenders or juicers in early human tool kits.  Can you imagine stone age women trying to juice carrots by grinding them against rocks?   Not quite optimal foraging.

I’ve a few more things to say about the delusions of vegan raw fooders.  Until next time, have a steak.