HIV/AIDS Skepticism

Pointing to evidence that HIV is not the necessary and sufficient cause of AIDS

The debilitating distraction of “HIV”

Posted by Henry Bauer on 2008/12/21

Every now and again, Martin chides me for writing about “HIV” (which doesn’t exist), “infection” (which doesn’t occur), and the like. My standard response has been that I don’t know how to write about HIV/AIDS doings without using the terminology that everyone’s familiar with. In my book, I tried to address the issue by saying that by HIV I would always mean, “Whatever it is that HIV tests detect”, but that repeating this every time, or always putting scare quotes around “HIV”, would get tiresome, for readers as well as the writer. I also used “F(HIV)”, for “frequency of positive HIV-tests”, instead of “the prevalence of HIV” so as not to entrench belief in the existence of an infectious agent.

In principle, I’ve recognized that Martin is right in pointing out that it’s not just terminology, because with every use of the terms (HIV, infection, AIDS, etc.), we absorb as well as disseminate something of the mistaken view. In practice, I haven’t known how to avoid doing this.

I’ve come to appreciate even more the force of Martin’s essential point through grappling, over the last few months, with the interpretation of data on deaths from “HIV disease”. The difficulties I was having stemmed, to an appreciable degree, from having my mind infected with a subterranean notion that “HIV” means something, indeed something specific — even as I was, on the conscious level, describing “HIV-positive” as analogous to a fever, not signifying anything specific.

That analogy with fever, for which I’m grateful to Christian Fiala, is indeed an excellent one, concise and easy for people to grasp immediately without further explanation. Like all analogies, though, it isn’t more than an analogy, and can’t encompass all the characteristics of “HIV” — most particularly, that while fevers signal something out of the ordinary, even if not necessarily a serious health challenge, “HIV-positive” may signify nothing at all out of the ordinary, in the sense that “HIV-positive” may not be worth thinking or worrying about any more than, say, having a cold, waking up with an aching joint, just having been vaccinated against flu, or being pregnant.

My research into HIV-associated matters had been stimulated by the unbelievable assertion, cited by Harvey Bialy, that in the mid-1980s teen-aged females applying for military service tested HIV-positive as frequently as their male peers. My book recounts what I found about the demography of positive “HIV”-tests: the regular variation with sex, age, race, and geography demonstrates that “HIV-positive” isn’t contagious or infectious. The variations between social groups demonstrate that “HIV-positive” has something to do, at least sometimes, with health challenge or immune-system reaction, albeit not necessarily any serious threat to health — in groups where one expects to find relatively poor health or manifest illness, the average frequency of positive “HIV”-tests tends to be greater. I even suggested that “HIV-positive” might mean something different with different people: since only a few of the “HIV” proteins, and not always the same ones, are required for a test to be pronounced “positive”, perhaps there are some hidden specificities — maybe “HIV-positive” among gay men detects different substances than “HIV-positive” among pregnant women, say (and in neither case are the detected substances necessarily a cause for concern, anything “out of the ordinary”).

I hadn’t looked seriously into death statistics until about a year ago, when Sharon Stone told Larry King that AIDS is “the fourth leading killer of women in America”. Of course that isn’t the case; it isn’t even in the top ten [World AIDS Day: Sharon Stone on Larry King, sharing urban legends (or celebrity facts), 22 December 2007]. However, the data revealed some interesting variations by race and age, so I looked at those in more detail [“HIV Disease”, 28 December 2007; How to test theories (HIV/AIDS theory flunks), 7 January 2008]. I noticed the peculiarity that black Americans are not only more prone to test “HIV-positive” but also survive that condition to a greater age than white Americans. Though I recognized that as another count against HIV/AIDS theory, I was mind-infected by “HIV signifies something” and didn’t take this to the conclusion that now seems so obvious.

Periodically I would come back to the remarkable fact that people aged around 35-45 always test HIV-positive more frequently than older as well as younger adults or teenagers, and cite it as confirmation of the demographics showing that “HIV” isn’t an infection [for example, “HIV demographics further confirmed: HIV is not sexually transmitted”, 26 February 2008]. I re-emphasized that “HIV” and “AIDS” are two separate things [Unraveling HIV/AIDS, 8 March 2008] — thereby illustrating the mind-infection that Martin kept warning me about; I ought to have remained aware that “HIV” isn’t “a thing” at all. A few weeks later [“HIV Disease” is not an illness, 19 March 2008], I had come to realize that the death statistics in themselves show that “HIV disease” isn’t an illness, because the greatest risk of death is among 35-45-year-olds, whereas all other illnesses, diseases, and “natural causes” too bring the greatest risk of death at older ages, the risk increasing roughly exponentially with age from the teens or twenties upward. I even recognized an implication of the fact that the age distributions of “HIV-positive” and of “HIV disease” deaths virtually superpose — namely, that there’s no “latent period” and no evidence that HAART has been of benefit; or rather, evidence that HAART has NOT been life-extending. But I didn’t grasp this further reminder that “HIV” isn’t “a thing”.
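
To make that contrast concrete, here is a minimal numerical sketch — toy parameters chosen purely for illustration, not real surveillance figures — comparing an exponentially rising risk of death (the Gompertz-type pattern that ordinary causes of death follow with adult age) against a hump-shaped risk curve peaking near age 40, the pattern just described for “HIV disease”:

```python
# Illustrative sketch only: toy parameters, not real surveillance data.
import math

def exponential_risk(age, base=0.0005, rate=0.085):
    """Gompertz-type pattern: risk of death grows exponentially with age,
    as it does for ordinary illnesses and "natural causes"."""
    return base * math.exp(rate * (age - 20))

def hump_risk(age, peak=40.0, width=10.0, top=0.004):
    """Hypothetical hump-shaped pattern peaking in the 35-45 range."""
    return top * math.exp(-((age - peak) / width) ** 2)

print(f"{'age':>4}  {'exponential':>12}  {'hump-shaped':>12}")
for age in range(20, 81, 10):
    print(f"{age:>4}  {exponential_risk(age):>12.5f}  {hump_risk(age):>12.5f}")
```

The exponential column keeps climbing into old age, while the hump-shaped column collapses after the mid-forties — and no genuine illness, disease, or “natural cause” behaves like that second column.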

I’ve even commented on cognitive dissonance [for example, “HIV/AIDS illustrates cognitive dissonance”, 29 April 2008] — in others, that is, while not seeing what was staring me in the face, because I was mind-infected with the term “HIV”, as though “HIV” were a “thing”. I’d even been warned against that sort of mistake in many encounters with philosophers, for whom “reification” is a well-recognized fallacy: imagining there is “a thing” just because a name, a term, has been invented.

It was the egregious claim that HAART had saved millions of life-years that brought me back to looking at death statistics [HIV/AIDS scam: Have antiretroviral drugs saved 3 million life-years?, 6 July 2008]. I noted the peculiarity that all this life-saving and life-extending had left the average age of death from “HIV disease” at around 40 — but apparently I wasn’t yet able to tie this in with the fact that “HIV” isn’t “a thing”. I wasn’t yet able to see that the disjunction between low mortality and average age of death [More HIV/AIDS GIGO (garbage in and out): “HIV” and risk of death, 12 July 2008] is obviously to be expected, because “HIV” isn’t “a thing”.

I returned to the strange fact that the age of maximum likelihood of testing “HIV-positive” is always about the same as the age of maximum likelihood of dying from “HIV disease” [How “AIDS Deaths” and “HIV Infections” vary with age — and WHY, 15 September 2008], and was finally set on a productive line of thought by noticing the stark disjunction between mortality from “HIV disease” and the average age of death from “HIV disease” [HAART saves lives — but doesn’t prolong them!?, 17 September 2008]. But I was still in the mind-frame of arguing against latent periods and HAART benefits [No HIV “latent period”: dotting i’s and crossing t’s, 21 September 2008].

A re-statement of these matters in “Poison in South Africa” [26 October 2008] aroused comments from defenders of the HIV/AIDS faith that spurred me to carry out some laborious calculations I’d been putting off. The age distribution of people living with AIDS (PWAs) was like that of people tested for “HIV”, and like that of deaths among PWAs. Finally I recognized that the disjunction between mortality and age of death arises because both are based on “HIV” — but “HIV” isn’t “a thing”, and you can’t classify PWAs or deaths on such a basis.

Take ANY group of people, apply “HIV” tests, and the frequency of positive tests will be at a maximum in the age range 35-45 or so. There are indications that the range may be a bit different for females than for males, and for people of different racial ancestries, but those differences — if indeed there are any — seem to be small.
Take ANY group of people, HEALTHY OR ILL, do “HIV” tests, and the frequency of positive tests will be at a maximum in the age range 35-45 or so. I had pointed this out in my book, with data from blood donors, gay men, heterosexuals at STD clinics, soldiers, sailors, marines, and Job Corps members — in all racial groups and in both sexes. In other words, “HIV-positive” has nothing specifically to do with illness or with death.
That can be difficult to bear in mind, in part because of the habit of thinking of “HIV” as “a thing”; in part because the likelihood of positive “HIV”-tests does vary with physiological condition, and some illnesses are associated with a high probability of “positive” “HIV” tests. But never forget that many non-illnesses, like pregnancy, are also associated with a high probability of “positive” “HIV” tests: “HIV-positive” has nothing specifically to do with illness or with death.

The confusion came about because Gallo et al. were looking for things that might be common to victims of AIDS, who were very ill people (high likelihood of “positives”) and who happened to be of average age in the mid-to-upper thirties (the age, in any group, of maximum probability of “positive” tests). What they came up with was an artefact: a sort of thermometer that is particularly prone to detect fever in certain physiological conditions, and that is also particularly likely to read “fever” by mistake in certain other physiological conditions, especially in people aged about 35-45.
It’s hard to ingrain that firmly in one’s thinking, and keep it at the forefront of one’s mind, after being used to imagining that “HIV” is “a thing”.

So it took me “longer than otherwise necessary” to grasp what the disjunction of death ages and mortality rates illustrates: mortality rates are reported for the population of “people with AIDS”, but that population has nothing specifically to do with illness or death, because inclusion in the group has as its sine qua non a positive “HIV” test, which signifies nothing specific about illness or risk of death. I kept thinking of “the median age of death” as pointing to a particular life expectancy, a lack of benefit from HAART, generally a conundrum for HIV/AIDS theory — while the straightforward meaning is simply this:
Take ANY group of people, apply “HIV” tests, and the frequency of positive tests will be at a maximum in the age range 35-45 or so.
Take those people who have been mistakenly diagnosed as infected by “the ‘HIV’ thing”, and their age distribution will likewise peak in the range 35-45 or so.
Take ANY group of people who have just died FOR ANY REASON, carry out “HIV” tests on the cadavers, and the frequency of positive tests will be at a maximum in the age range 35-45 or so.

It’s just meaningless to compare median age of death, in any group categorized by “HIV tests”, with mortality among that group, because “HIV” has nothing to do with risk of death. That’s why the attempt to compare those things revealed a stark disjunction, with different “relationships” between death age and mortality at different times — up to 1986/87, from then to 1992, discontinuity at 1992/93, different again to 1996, another discontinuity at 1996/97, different “relationship” again after that.
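
As a sanity check on that statement, here is a toy simulation — invented numbers, not CDC statistics — of what happens when membership in a group is determined by an age profile peaking near 40: the median age at death in the group stays near 40 whether overall mortality is high or low, because the two quantities are decoupled:

```python
# Toy simulation with invented numbers (not CDC statistics): group
# membership follows an age profile peaking near 40, and death strikes
# members with an age-independent probability, mimicking a marker that
# says nothing about risk of death.
import random

random.seed(1)

def median_death_age(mortality, n=100_000):
    """Sample members from a 35-45-peaked age profile, let each die with
    the same probability regardless of age, return median age at death."""
    death_ages = []
    for _ in range(n):
        age = random.gauss(40, 8)        # membership peaks near age 40
        if random.random() < mortality:  # death unrelated to age here
            death_ages.append(age)
    death_ages.sort()
    return death_ages[len(death_ages) // 2]

for mortality in (0.20, 0.05, 0.01):
    print(f"mortality {mortality:.0%}: median age at death "
          f"≈ {median_death_age(mortality):.1f}")
```

Whether mortality is 20% or 1%, the median age at death hovers around 40 — it mirrors the age profile of the group, not the lethality of any supposed disease.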

—–

So, Martin: thanks for your periodic reminders, thanks for not giving up on me. I think I may finally have grasped the point. Not that it will necessarily make it easier to write about this stuff without using misleading terms, but maybe I’ll be able to make the meanings of what I write less misleading.

Best holiday wishes!

True Believers of HIV/AIDS: Why Do They Believe Despite the Evidence?

Posted by Henry Bauer on 2008/10/30

A correspondent sent the following, asking whether it might be a relevant comment on one of the Nobel Prize posts. I think it’s more than that: it gets to the root of the problem that Rethinkers and Skeptics face — how to entice the indoctrinated public media and the committed mainstreamers to pay attention to the evidence that disproves HIV/AIDS theory. Andy D. wrote:

“I can find but three possible explanations for the ‘Establishment’s’ most arrogant and condescending behavior and unsubstantial, propagandistic websites and media appearances:
1. They are very well aware of the inconsistencies, problems and failings of HIV-AIDS-theory and their horrible implications regarding AIDS politics and medication, and find some overriding self-interested reason to continue to uphold what they know is wrong; or
2. They are unwilling to look critically at a theory they have established and promoted; or
3. They regard all ‘dissident’ propositions as so silly — what they call ‘moon-is-green-cheese’ pseudoscience — that they require no disproof.

I’ve seen again and again with honest scientists that they are happy to discuss and argue about their theses. Esteemed, intelligent and highly informed people like Peter Duesberg, Etienne de Harven, Heinz Ludwig Sänger, Kary Mullis or yourself should not be treated like nagging students asking the same stupidly absurd questions over and over again.”

I touched on one aspect of an explanation for all this in “HIV/AIDS Illustrates Cognitive Dissonance” [29 April 2008]: human psychology is such that true believers simply cannot grasp the implications of evidence that contradicts their belief. Andy’s questions spurred me to think about all this anew. How do people become true believers in the first place? If one could answer that question, it might also point to ways of helping people change their mistaken beliefs.

Human beings are actually raised to be true believers. As babies and children, we are persuaded, urged, or disciplined in various ways to accept what our parents and our teachers tell us. Children are delightfully curious and questioning, but at first they lack the background information to argue effectively against what they’re told. By and large, too, what children are told makes sense and works out in practice: “Don’t touch that hot stove!” and innumerable other commands, when ignored, prove themselves to have been good ones. So we tend to grow up with confidence in what our elders tell us, and as adults we readily substitute for parents and elders the “experts”, the “authorities”, the “Establishment”.

When we encounter someone who believes very differently than we do, we tend to be puzzled: “How could anyone believe that?!”

The answer is simple: They had different parents and teachers, and later they listened to different “experts” and “authorities”.

So to ask, “How could anyone believe that?!”, is the wrong question. The right question is, “How does anyone come not to accept what they’ve been told, what everyone around them ‘knows’?” (I’ve written more along these lines in Science or Pseudoscience: Magnetic Healing, Psychic Phenomena, and Other Heterodoxies, especially p. 47 ff. and p. 207 ff.).

When it comes to supposedly factual matters, textbooks and undergraduate courses emphasize learning what — according to the authorities — has already been found out and is already understood. There’s a significant difference here between “scientific” matters and non-scientific ones. If humanists and scientists can be persuaded to discuss their differing approaches to college teaching, it turns out that the scientists have a rather naïve view of their mission as one of transferring reliable, accredited information, whereas the humanists tend to emphasize the nurturing of critical thought. One indication of the difference is that science courses tend to be sequenced in linear hierarchy: students must take general chemistry before specialized inorganic, organic, and physical chemistry, and they must take some math and physics before physical chemistry, and so on. By contrast, great swaths of “upper-level” courses in the humanities have few if any prerequisites (more about this in To Rise above Principle: The Memoirs of an Unreconstructed Dean, p. 140).

So scientists and doctors, already trained by parents and earlier teachers to believe what they’re told, become even further accustomed during their “education” — more correctly, their indoctrination — to accept contemporary “knowledge” and beliefs. Once graduated and credentialed, as professionals and practitioners, to those habits of intellectual conformity there are added weighty practical considerations: straying from orthodox paths can incur serious, even disabling damage to one’s career and livelihood.

It isn’t that doctors and scientists “go along” cynically with beliefs and practices that they recognize as wrong or unsound. At best, when they’re conscious of some disparity between what they do and “what’s right”, they rationalize: for example, that they can do more to correct matters by “working within the system” than by becoming whistle-blowers. More usually, though, like other humans, they presume that, because their inherent desire is to do the right thing, they cannot be doing anything fundamentally wrong. That’s the basis of “cognitive dissonance”: psychological mechanisms common to all human beings can render us incapable of discerning facts that disprove our beliefs. I highly recommend the book by Thomas Gilovich, How We Know What Isn’t So: The Fallibility of Human Reason in Everyday Life (Free Press, 1991), for an excellent and very readable discussion of various ways in which we can fool ourselves into not seeing facts that contradict our beliefs; we are simply oblivious to them.

In science and medicine as much as in everyday life, human beings want to “fit in”. We are social animals and want to be part of a group, and that applies on intellectual issues as much as in other matters. The highly creative astrophysicist Thomas Gold described the intellectual conformity in scholarship and research as an expression of “the herd instinct”, illustrating it by the furious opposition he encountered over his suggestions about the mechanism of hearing (about which he later proved to have been right) and the origin of petroleum (about which he may yet turn out to be right) — see “New ideas in science”, Journal of Scientific Exploration 3 [1989] 103-12. The histories of science and of medicine are replete with instances of great breakthroughs that were desperately resisted by the mainstream “authorities” for as long as possible (the concise essay about this by Bernard Barber remains well worth reading: “Resistance by scientists to scientific discovery”, Science, 134 [1961] 596-602).

That desperate resistance is a consequence of cognitive dissonance and the herd instinct. True believers have reached their beliefs not by considering the evidence but by taking things on faith from the authorities. When they are challenged, it threatens not only their belief but also their self-image — it would expose their lack of critical thought — and their membership of the herd: if they came to see that the belief is mistaken, they would also have to become outsiders. All that is unacceptable in the extreme, and is therefore resisted by every available means. But true believers cannot respond substantively, because they haven’t arrived at their beliefs in that manner; they have taken matters on faith and don’t even know what the evidence pro and con is. So the desperate resistance typically takes the form of personal attacks, character assassination, guilt by association, and the like; see “Dissenting from HIV/AIDS theory” and “Questioning HIV/AIDS: Morally Reprehensible or Scientifically Warranted?”

A quite general corollary of cognitive dissonance and the herd instinct is that a significant number of counter-intuitive breakthroughs have been made by people who were outsiders rather than specialists in the relevant field; for references and discussion, including counter-examples, see T. F. Gieryn & R. F. Hirsh, “Marginality and innovation in science”, Social Studies of Science 13 (1983) 87-106. The standard dismissal of Rethinkers by HIV/AIDS dogmatists — that the Rethinkers haven’t themselves done hands-on HIV/AIDS research — has no basis in empirical fact or in the history of science.

These matters are highly pertinent for Rethinkers, or in general for anyone and any group that aims to bring down an established paradigm. A direct lesson is that it’s unusual for human beings to question what they have been taught to believe, because of the psychological mechanisms — ranging from entirely unconscious to barely conscious — that conspire to safeguard us from “seeing” anything that might raise doubts. A bitter extrapolation from this is to recognize how enormously difficult it is to persuade someone else that their beliefs are provably wrong:

“It is difficult enough to reach a personal, informed view on matters over which controversy rages; there is little chance that the true believers or true disbelievers can be converted. ‘The most we can hope to achieve is to make the credulous more skeptical, and the skeptical more open-minded’” — p. 218 in Science or Pseudoscience: Magnetic Healing, Psychic Phenomena, and Other Heterodoxies, citing Arthur C. Clarke, whose words on this subject are well worth attending to; see the Introduction and Epilogue in Arthur C. Clarke’s World of Strange Powers (ed. John Fairley and Simon Welfare, G. P. Putnam’s Sons, 1984).

——————

So, Andy: My view is that we should never be surprised when adherents of mainstream views seem impervious to even the plainest evidence. That’s NORMAL! And it’s so in science as much as in any other human activity. Most of us are still taught in school, college, and university that science is objective and that scientists care only about learning the truth; but science isn’t done that way — it’s a complicated human activity. For a relatively brief discussion, see Scientific Literacy and the Myth of the Scientific Method; for a comprehensive account, I recommend John Ziman, Real Science.

As to HIV/AIDS specifically, it’s extraordinarily unlikely that the dogma will be abandoned because of research or publication or critical thinking or re-thinking within the mainstream. Much more likely, it will be overturned under pressure from outside sources: perhaps political, because of the inordinate, disproportionate, and unproductive expenditures; perhaps legal, if enough “HIV-positive” people damaged by “antiretroviral therapy” win enough and sufficiently important court actions; or perhaps, again legal, if someone charged with transmitting HIV manages to bring the court to look at the scientific evidence; or if someone prominent enough among black leaders comes to realize that people of African ancestry are being disproportionately subjected, without good reason, to toxic medications; or if someone powerful enough in the major media becomes so interested as to actually look into the facts. Otherwise, I fear, the mainstream will just continue to fiddle with new medications, gradually continuing to make the treatments less toxic, and gradually extending the life-span of HAART-treated people to an average beyond the present middle forties. If that is the case, then it may take a horribly long time before the death toll from antiretroviral drugs becomes so obvious and widely known that the established view is finally held to public account.
