The year 1956 stands out in my mind for a variety of reasons, the most important being (at least for me) that it was the year I was born. It also marked the year of the only 'perfect game' ever thrown in a baseball World Series. Music fans might remember that it was the year Elvis Presley entered the United States music charts for the first time, with 'Heartbreak Hotel.'
1956 was also the year a scientist named Roger Williams published a book called Biochemical Individuality, which attempted to relate inherited individual distinctions to nutritional requirements. Although Williams was no small figure in medicine (at the University of Texas at Austin he had discovered pantothenic acid, one of the critical B vitamins, and had published scores of articles detailing some of the most basic biochemical discoveries), Biochemical Individuality attracted little, if any, attention from the medical community. This was probably because, as Jeffrey Bland speculates in his book Genetic Nutritioneering, Williams expressed many of his ideas in biochemical terms, with which the doctors of his day were far less comfortable than their counterparts are now.
How prescient is the following passage:
“The existence in every human being of a vast array of attributes which are potentially measurable (whether by present methods or not), and often uncorrelated mathematically, makes quite tenable the hypothesis that practically every human being is a deviate in some respects.”
It’s a strange choice of words, but the word deviate in this context signifies a turning away from the normal or a variance of some sort. Of course, we tend to think of the word more as a term for individuals who deviate from some sort of social norm; but norms are norms.
Williams was certainly deviating from conventional medical wisdom. Nobody at the time was looking at peculiar and individual aspects of nutrition that might be predicted genetically. More importantly, in 1956 there was nothing like the enormous genetic industry and technology that exists today; it had been only three years since James Watson and Francis Crick deduced the basic structure of DNA (deoxyribonucleic acid), the double helix that contains the genetic instructions specifying the biological development of all cellular forms of life.
Thus when Williams wrote of attributes "which are potentially measurable (whether by present methods or not)," he was taking an amazingly large step into the future.
So Williams' phrase "often uncorrelated mathematically" should probably be reinterpreted to mean "we can't see the connections because of our current puny computational abilities." Nowadays we link computers together into vast clusters and process data at a speed and accuracy that boggles the mind. It was just this type of muscular computing that allowed scientists like Craig Venter and his firm Celera Genomics to help crack the human genome in record time. Today, the combination of gene sequencing and supercomputers is a day-to-day event in hundreds of laboratories worldwide, and it is a prime part of a vast new field called bioinformatics.
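To appreciate just how routine that kind of correlation hunting has become, consider a small Python sketch. Everything in it is hypothetical: the "measurements" are simulated numbers standing in for the sort of biochemical attributes Williams had in mind, and the cohort and panel sizes are arbitrary.

    # Purely illustrative: simulated data standing in for measured
    # "biochemical attributes" across a hypothetical cohort of subjects.
    import numpy as np

    rng = np.random.default_rng(1956)
    n_subjects, n_attributes = 500, 200
    measurements = rng.normal(size=(n_subjects, n_attributes))

    # One call computes all 19,900 pairwise correlations among the
    # 200 attributes; milliseconds of work on an ordinary laptop.
    corr = np.corrcoef(measurements, rowvar=False)

    # Count attribute pairs whose correlation clears a chosen threshold.
    threshold = 0.2
    linked = np.argwhere(np.triu(np.abs(corr) > threshold, k=1))
    print(f"{len(linked)} attribute pairs correlate above |r| = {threshold}")

A survey like this, every attribute checked against every other, would have been a lifetime of hand calculation in 1956; today it is an afterthought.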
In 1956, nutrition science was still in its infancy, concerned mostly with deficiency diseases such as pellagra and anemia, and with making sure that we all ate "balanced meals." No link had yet been drawn between diet and cholesterol, or between cholesterol and hardening of the arteries, and medical journals often featured cigarette ads on their back pages. Ulcers were often treated by telling the patient to drink copious amounts of milk, the so-called "Sippy diet." In other words, nutritional thinking at the time was predominantly disease-based, which is odd, since almost everything we do with food has absolutely nothing to do with disease. This resulted in what my friend and colleague Jonathan Wright used to call "The Association Diets."
This is not to say that pieces of the puzzle weren’t evident, or that intelligent people were not already beginning to ask the right questions. It’s just that the questions could only be based on what was thought to be known, and what was known was not very much.
I can remember taking a computer class in high school (already well into the 1960s) where we were taught to diligently inscribe a series of punch cards with a number 2 pencil. These were then collated and fed to a machine the size of a large refrigerator, which hacked and coughed for a while before finally yielding a half-page printout of a list of fifty prime numbers.
Unless, of course, you had the misfortune to have penciled in the wrong box, in which case you just started all over again. It was a frustrating experience, one which led a young colleague of mine, in a rage, to place one of the cards on the floor and stomp on it repeatedly with his shoed foot before sending it on to the card reader (probably producing the first computer virus), a trick many of us would repeat when similarly frustrated. Your home computer can do these functions in microseconds, and the software to do it is considered so basic that it is usually packaged for free with the operating system.
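For the record, here is what that refrigerator-sized machine's labors amount to today, as a minimal Python sketch (the function name first_primes is my own invention, purely for illustration):

    # The punch-card-era exercise redone: the first fifty primes, found by
    # trial division against the primes already collected.
    def first_primes(count):
        primes = []
        candidate = 2
        while len(primes) < count:
            # candidate is prime if no smaller prime divides it evenly
            if all(candidate % p for p in primes):
                primes.append(candidate)
            candidate += 1
        return primes

    print(first_primes(50))  # finishes in well under a millisecond

No cards, no card reader, and nothing to stomp on.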