I've relocated my active blogs to a new dedicated site:
- Personal Genomics (n=1)
- Ask Dr. D'Adamo
- Blood Type and Nutrition
- Science and Culture
- Disco Hospital
- A Furious Kind of Calm
- History of Brooklyn
This blog will stay active as an archive of my earlier postings. All the other active bloggers will continue to work from this site. My thanks to them, and to all the readers who have been so dedicated, helpful and supportive over the years.
The class I currently teach in generative medicine uses a content system called Blackboard. Blackboard allows me to upload material and pose questions to a forum-like discussion area. One of my students, upon reading the assignment in the textbook, made the following comments:
You talk about the division between classical science and naturopathic science, which you equate, respectively, with reductionism and emergence (p. 30). Do they necessarily have to oppose one another, and can they not coexist? And does not naturopathic medicine incorporate some degree of reductionism, and classical science some degree of emergence?
I guess this goes back to the old question of: can't we all just get along? But, further, isn't it sort of imperative that we not categorize conventional versus naturopathic medicine in such black-and-white terms? Or maybe it is a more useful distinction than I'm discerning?
If, indeed, scientific reductionism is dead (p. 20) and the biomedical community is unaware, how do you best suggest we, as NDs (or future NDs), start to make inroads into that community to convince them that the idea of emergence/holism/a generative approach is worth substantively incorporating into the larger paradigm of "modern" medicine?
I would argue that there is a global groundswell of desire among consumers of healthcare for this generative approach, but it might be up to us as practitioners of naturopathic medicine to bring it mainstream. But the path to that end is not clear. At the end of the day, not only do we have to all get along, but we need to understand what the other is saying.
To which I replied:
If we define 'dead' as having lived a life with purpose, and perhaps even being so lucky as to exhaust that purpose, then reductionism is quite dead in the sense of being 'not alive'. [Which leads to the question: if an idea has no purpose, hence no life, does it even get to die?]
There will always be a reason to think in reductionist terms when the facts do indeed fit the scenario. IMHO there will always be opportunities for non-complex thinking (and indeed one should seize them whenever one can).
My position is that, as a profession, we are perhaps running the risk of being overly seduced by the simplicity of fitting our oeuvre to the existing allopathic framework. In essence we will be moving into a neighborhood in which the prior occupants have already sucked out the life and are themselves moving on to new areas.
Moreover in doing so we may well be creating a nascent culture of new dogmatists, apparatchiks who insist on only dealing with issues on these terms. If that was not bad enough, this then runs the risk of creating its own response element, its own duality, such that a second subculture results that does the exact opposite, accepting facts a priori.
So, what about this generative medicine idea? As you so astutely point out, the goal is to blend both the complex-systems approach with the mechanistic-reductionist approach, point being that we, as naturopaths, should have a pretty good feel for where the work needs to be done and how to go about doing it. Perhaps this duality is itself a power law: we may be using an 80% reductionist formula to discern 20% of our total causalities. Certainly systems-complexity-network (SCN) medicine comprises only a small fraction of current biomedical information analysis. Generative Medicine, as I see it, should resolve that duality.
No matter what, the informational chasm does indeed lie within complexity, as does any future potential for understanding and treating the life process itself.
Like they say, if you really want to learn something, teach it.
I write a weekly email to the ND students on my shift at the University of Bridgeport Health Science Center. If, in the course of working with a patient, a concept arises that appears to require more in-depth knowledge, I often specify certain research articles for the students to read in preparation for the next shift. This is not typical of clinic shifts, and my shift is thought to be among the more demanding. Despite (or perhaps because of) this, a place on the shift rotation is always in high demand.
Recently the discussion came round to two concepts related to both cancer and inflammation: endoplasmic reticulum stress (ER stress), which results in poorly manufactured proteins; and the subsequent unfolded protein response (UPR), which occurs because of the danger these improperly folded proteins pose to the cell.
After providing the students with 6-7 key studies, I began to suspect that they might need some cheering-up. So I appended this little ditty to the email:
'A polymath is someone who is interested in everything, and nothing else.'
A polymath (Greek πολυμαθής, polymathēs, "having learned much") is a person whose expertise spans a significant number of different subject areas. In less formal terms, a polymath (or polymathic person) may simply be someone who is very knowledgeable. Most ancient scientists were polymaths by today's standards.
How do YOU propose to become a polymath? What might you need or observe?
1. Self-confidence.
Firstly, and this might be obvious, there must occur a huge leap in self-confidence. If you are easily intimidated by learning new things, or think you are somehow less smart than others, becoming polymathic will not be easy.
The learning-intimidated state is overcome by taking the first few baby steps in developing a new appreciation of just what you are capable of being aware of. This might be as simple as arriving at the conclusion that, since you are (at the very least) in control of your own life, who else is better capable?
I once had to do an interview with a certain rather famous scientist.
Well, of course I got all bent out of shape. The guy is, like, one of the smartest people on the planet. Etc. etc. etc. string theory, etc. etc. etc. skeptic, etc. etc. etc.
As I got more and more twisted up about it, I went to dinner at an old friend's house. These people were once our next-door neighbors, and we've kept in touch over the years, despite (or perhaps because of) the fact that we don't have very much in common. On hearing my lament, the husband listened and then simplified the whole thing for me by identifying the one basic truism:
'He may be smart, but he don't know what you know.'
Anyway, we couldn't get a time that worked for both of us and the interview never took place, but I did learn something about myself.
2. Rush/Don't rush.
How often do we use time to avoid something? To just get 'it' over with? However, far from telling you to slow down, I'm suggesting that you rush with a purpose: not just to 'get it over with,' but rather to 'just get to the end of it.' Then, instead of simply moving on, go back and revisit that notion that caught your eye. Getting to the end of something lets the brain 'pin the four corners' of the concept and set aside space for conceptualization and context.
This observation has an important metaphor about disease and health buried within it.
Health is often like a car speeding down the highway: scenery flying by, but just barely noticed. Windows up. AC on. Favorite tunes playing. The conversation centering on some arcane subject. But then the timing belt breaks. Now we are going zero miles per hour, perched beside a forlorn stretch of highway. Initially we can only think about getting out of there ASAP. But soon other senses intervene and we begin to mesh with this new reality. Perhaps a tall copse of Shepherd's Purse. The sound of a small stream heard but not seen. The alluring shade of a nearby tree.
Disease as metaphor, disease as teacher.
3. Quantity has a quality all of its own.
Think of it like this: if you were pouring a concrete floor, it would be rather silly to start in one corner, pour a one-foot-square area, wait for it to cure and dry, and then move on to the next square. Not only would it be inefficient, but the floor itself would have very little structural integrity.
What should we do instead?
We'd pour 'skim coats' over the entire area, perhaps in several layers, trying to cover as much of it as possible. Now, if I were insecure about my 'footing' and had to choose between standing on a one-foot square of concrete or on the first layer of a thin skim coat covering the entire area, I might opt for the apparent security of the fully completed one-foot square (and indeed, most testing is done on a person's ability to make the 'most perfect' one-foot square). Trouble is, I'd be standing on a square of concrete that doesn't allow me to move anywhere else.
That's the problem with developing polymathic knowledge: there is an initial 'awkward stage' that many people find troubling, especially if they are insecure, or have been led to believe that all things taught to them must have immediate 'meaning'. However, there is a fix for this and a few of you have already figured it out:
Wonder is often compared to the emotion of awe, but, unlike awe, wonder inspires joy rather than fear or respect.
So you might say that wonder is curiosity tinged with the prospect of potential joy.
Thus we have two choices with this week's assignments. We can moan about the amount of reading on what (after all) is just a clinic shift; start with the first article, tediously plow through it in workmanlike fashion, and then move on to the next, and the next.
This will take hours.
Or we can isolate the key concepts (in this instance I've supplied them: molecular chaperones and the process of N-glycosylation), gain a cursory understanding of these concepts, and then skim through the articles, 'wondrously' luxuriating in the areas that catch our attention and/or fit our model of the big picture. Each iteration (or skim coat) deepens knowledge and allows for the creation of more and more interconnectedness.
So, as you might gather, far from being onerous, polymathic behavior is a labor-saving device.
I'm a believer in 'Fat Tony Science.'
I imagine that Fat Tony is some sort of quasi-questionable figure, or maybe just a street-wise person, like so many of the people I observed growing up in Brooklyn. Fat Tony Science attempts to inject a healthy dose of street smarts into the decision tree by looking beyond simple facts and assumptions to things like ulterior motives and interpersonal realpolitik. Fat Tony Science becomes especially handy when parsing patient histories.
Here's an example:
You flip a coin 99 times and every toss comes up heads. What are the odds of it coming up tails on the next toss?
Little old lady at the slots: 'It will come up tails; it's bound to after so many heads.'
The statistics textbook: 'The same odds as always, 50-50.'
Fat Tony: 'It will come up heads, since only a rigged coin will produce 99 consecutive heads.'
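Fat Tony's hunch can actually be made quantitative. As a sketch of my own (not part of the original exchange), assume a rigged coin always lands heads and grant riggedness even a one-in-a-million prior probability; Bayes' rule then shows that 99 straight heads makes the rigged hypothesis a near-certainty:

```python
# Bayesian sketch: how likely is the coin rigged after 99 straight heads?
# Illustrative assumptions: "rigged" means the coin always lands heads,
# and we give riggedness a tiny prior probability of one in a million.

prior_rigged = 1e-6
prior_fair = 1.0 - prior_rigged

# Likelihood of observing 99 consecutive heads under each hypothesis
like_fair = 0.5 ** 99      # astronomically small (~1.6e-30)
like_rigged = 1.0          # certain, if the coin is two-headed

# Bayes' rule: posterior probability that the coin is rigged
posterior_rigged = (like_rigged * prior_rigged) / (
    like_rigged * prior_rigged + like_fair * prior_fair
)

print(posterior_rigged)
```

Even with that vanishingly small prior, the posterior lands essentially at 1.0: the fair-coin likelihood is so tiny that the street-smart answer wins.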
The year 1956 stands out in my mind for a variety of reasons, the most important being (at least for me) that it was the year I was born. It also marked the year of the only ‘perfect game’ ever thrown in a baseball World Series. Music fans might remember that it was the year that Elvis Presley entered the United States music charts for the first time, with 'Heartbreak Hotel.'
1956 was also the year a scientist named Roger Williams published a book called Biochemical Individuality, which attempted to relate inherited individual distinctions to nutritional requirements. Although Williams was no small figure in medicine (at the University of Texas at Austin he had discovered pantothenic acid, one of the critical B vitamins, and had published slews of articles detailing some of the most basic biochemical discoveries), Biochemical Individuality attracted little, if any, attention from the medical community. This was probably due to the fact that, as Jeffrey Bland speculates in his book Genetic Nutritioneering, Williams expressed many of his ideas in biochemical terms, with which doctors of the time were far less comfortable than doctors are today.
How prescient is the following phrase:
“The existence in every human being of a vast array of attributes which are potentially measurable (whether by present methods or not), and often uncorrelated mathematically, makes quite tenable the hypothesis that practically every human being is a deviate in some respects.”
It’s a strange choice of words, but the word deviate in this context signifies a turning away from the normal or a variance of some sort. Of course, we tend to think of the word more as a term for individuals who deviate from some sort of social norm; but norms are norms.
Williams was certainly deviating from conventional medical wisdom. Nobody at the time was looking at peculiar and individual aspects of nutrition that might be predicted genetically. More importantly, in 1956 there wasn’t anywhere near the enormous genetic industry and technology that exists today; it had been only three years since James Watson and Francis Crick deduced the basic structure of DNA (deoxyribonucleic acid), the double helix that contains the genetic instructions specifying the biological development of all cellular forms of life.
Thus, when Williams talked of attributes “potentially measurable (whether by present methods or not),” he was taking an amazingly huge step into the future.
So Williams’ phrase “often uncorrelated mathematically” should probably be reinterpreted to mean “we can’t see the connections because of our current puny computational abilities.” Nowadays we link supercomputers together into vast neural networks and process data at a speed, and with an accuracy, that just boggles the mind. It was just this type of muscular computing that allowed scientists like Craig Venter and his firm Celera Genomics to help crack the human genome in record time. Today, the combination of gene sequencing and supercomputers is a day-to-day event in hundreds of laboratories worldwide, and is a prime part of a vast new field called bioinformatics.
In 1956, nutrition science was still in its infancy, concerned mostly with deficiency diseases such as pellagra and anemia, and with making sure that we all ate “balanced meals.” There was no link between diet and cholesterol, or between cholesterol and hardening of the arteries, and medical journals often featured cigarette ads on their back pages. Ulcers were often treated by telling the patient to drink copious amounts of milk, the so-called “Sippy diet.” In other words, nutritional thinking at the time was predominantly disease-based, which is odd, since almost everything we do with food has absolutely nothing to do with disease. This resulted in what my friend and colleague Jonathan Wright used to call “The Association Diets.”
This is not to say that pieces of the puzzle weren’t evident, or that intelligent people were not already beginning to ask the right questions. It’s just that the questions could only be based on what was thought to be known, and what was known was not very much.
I can remember taking a computer class in high school (already well into the 1960’s) where we were taught to diligently inscribe a series of punch cards with a 'number 2' pencil. The cards were then collated and fed to a machine the size of a large refrigerator, which hacked and coughed for a while before finally yielding a half-page printout of a list of fifty prime numbers.
Unless, of course, you had the misfortune to have penciled in the wrong box, in which case you just started all over again. It was a frustrating experience, and it led one of my young colleagues, in a rage, to place one of the cards on the floor and stomp on it repeatedly with his shoed foot before sending it on to the card reader, probably producing the first computer virus; a trick many of us would repeat when similarly frustrated. Your home computer can do these calculations in microseconds, and the software to do them is considered so basic that it usually ships free with the operating system.
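For comparison, here is a minimal sketch (in Python, my choice of language; the original class would have used whatever the card reader spoke) that produces that same half-page printout of fifty primes essentially instantly:

```python
# The 1960s punch-card assignment, redone: the first fifty prime numbers.
# Simple trial division against the primes found so far; a modern machine
# finishes this in well under a millisecond.

def first_n_primes(n):
    """Return the first n prime numbers by trial division."""
    primes = []
    candidate = 2
    while len(primes) < n:
        if all(candidate % p != 0 for p in primes):
            primes.append(candidate)
        candidate += 1
    return primes

print(first_n_primes(50))
```

No collating, no coughing refrigerator, and a mis-typed character costs you a keystroke rather than a fresh card.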