HIV/AIDS Skepticism

Pointing to evidence that HIV is not the necessary and sufficient cause of AIDS

Archive for December, 2008

Living with HIV; Dying from What?

Posted by Henry Bauer on 2008/12/10

Recent comments by “Köpek Burun” (= “dog nose”?? “snout”? Menganito?!) about “Poison in South Africa” [26 October 2008] inevitably referred back to “HAART saves lives — but doesn’t prolong them!?” [17 September 2008]. In my response, I referred to calculations I’ve been working on of the age distributions of PWAs (“People living With AIDS”) and of the deaths among them. Since they are so pertinent to that discussion, I need to post these calculations even though my full analysis isn’t finished yet.

Age distributions of PWAs can be calculated from the data in Table 2 of “HAART saves lives”. The (average) number of PWAs during a given year results from adding new diagnoses in that year to survivors at the end of the previous year; those survivors can be calculated from the total number of diagnoses minus deaths up to and including that year. Calculating total numbers of PWAs for each year is straightforward, and the results were given in Table 3 of “HAART saves lives”. However, for the age distribution of PWAs in each year, one must take into account that survivors from a given year will be, on average, a year older in the following year. For example, some of the survivors from the age range 20-29 in 1990 will be in the range 30-39 in 1991. I made the assumption of a symmetrical distribution within each age range — in other words, I represented the data by histograms defined by the age ranges in which the data were reported. A number of trial calculations using more elaborate curve fitting showed that this did not make a significant difference to the results, presumably because the chief variable of interest is the median age and most of the cases fall in the middle age ranges (where the cases-vs.-age curves are steepest, the numbers of cases are so much smaller than in the middle age ranges that small errors there hardly affect the calculated median age).
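
For readers who want to check the bookkeeping, here is a minimal sketch of that calculation in Python. It is my own reconstruction under the histogram assumption described above; the age-bin edges, the function names, and the data layout are illustrative placeholders, not the actual reporting categories.

```python
import numpy as np

# Illustrative age-bin edges (placeholders, not the published groupings).
BIN_EDGES = np.array([0, 13, 20, 30, 40, 50, 60, 120])

def age_forward(counts, edges=BIN_EDGES):
    """Age a binned distribution forward by one year: under the uniform
    (histogram) assumption, the slice of each bin lying within one year of
    its upper edge moves into the next bin."""
    shifted = np.zeros(len(counts))
    for i, n in enumerate(counts):
        width = edges[i + 1] - edges[i]
        moved = n / width                  # cases within 1 year of the bin's top
        if i + 1 < len(counts):
            shifted[i] += n - moved
            shifted[i + 1] += moved
        else:
            shifted[i] += n                # open-ended top bin keeps everyone
    return shifted

def median_age(counts, edges=BIN_EDGES):
    """Median age of a binned distribution, interpolating within the bin."""
    counts = np.asarray(counts, dtype=float)
    half = counts.sum() / 2
    cum = np.cumsum(counts)
    i = int(np.searchsorted(cum, half))
    below = cum[i - 1] if i > 0 else 0.0
    return edges[i] + (half - below) / counts[i] * (edges[i + 1] - edges[i])

def pwa_by_year(diagnoses, deaths):
    """diagnoses, deaths: {year: array of counts per age bin}.
    Returns {year: estimated age distribution of PWAs during that year}.
    Survivors start at zero, so the series should begin at the start of
    the epidemic."""
    pwa = {}
    survivors = np.zeros(len(BIN_EDGES) - 1)
    for year in sorted(diagnoses):
        current = age_forward(survivors) + diagnoses[year]   # PWAs during the year
        pwa[year] = current
        survivors = np.maximum(current - deaths[year], 0.0)  # carried into next year
    return pwa
```

Applying median_age() to each year's PWA distribution, and to the corresponding deaths, gives the two median-age series compared in Table I below.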

Another complication is that the age ranges in which deaths and diagnoses, respectively, were reported are not the same for the years 1993-98. For those years, the age ranges for the deaths were converted to those used for the diagnoses, again using the histogram model; that this did not introduce drastic errors was verified by the fact that the re-calculated median ages remained within 1% of the initial ones.
The reason for the dual death reports for 1998, allowing comparison with both earlier and later years, was given in the notes to Table 2 in “HAART saves lives”. For purposes of comparison over the whole period 1982 to 2004, the most appropriate values for 1998 are presumably the averages of those dual numbers, namely 39.4 for the median age of PWAs, 41.2 for the median age of deaths, 1.8 years for the interval between them, and 4.5% for the death rate. All of those fall smoothly into the progressions from 1982 to 2004.
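
The conversion of death counts from one set of age ranges to another (the 1993-98 complication just described) can be done under the same uniform-within-bin assumption, by redistributing each source bin in proportion to its overlap with each target bin. A minimal sketch, again my own and with placeholder bin edges rather than the published groupings:

```python
import numpy as np

def rebin(counts, src_edges, dst_edges):
    """Redistribute binned counts from src_edges to dst_edges, assuming a
    uniform spread of cases within each source bin."""
    counts = np.asarray(counts, dtype=float)
    out = np.zeros(len(dst_edges) - 1)
    for i, n in enumerate(counts):
        lo, hi = src_edges[i], src_edges[i + 1]
        for j in range(len(dst_edges) - 1):
            # length of overlap between source bin i and target bin j
            overlap = max(0.0, min(hi, dst_edges[j + 1]) - max(lo, dst_edges[j]))
            out[j] += n * overlap / (hi - lo)
    return out

# Example: deaths reported in 25-34 / 35-44 bins, redistributed to 20-29 / 30-39 / 40-49
print(rebin([120, 200], src_edges=[25, 35, 45], dst_edges=[20, 30, 40, 50]))
# -> [ 60. 160. 100.]
```

The redistribution conserves the total count; only the placement of cases within each bin is assumed.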

Table I

[Table I image: median ages of PWAs and of deaths, the interval between them, and annual death rates, 1982-2004]

I had begun this work to probe the effect of HAART, but I realized eventually that these death statistics speak directly to the issue of whether HIV causes AIDS, well beyond merely demonstrating that HAART doesn’t extend life. From 1982 to 2004, the death rate (last column in Table I) declined from 65% to 2.8%; yet the difference between the median age of the population of PWAs and their median age of death increased only from about 7 months (0.6 years) to about 22 months (1.8 years). That’s a stark contradiction. The median age of death in a population is a measure of the average life-span. If the median age of the existing population is within a couple of years of that life span, then the death rate must be enormous; but here the mortality in recent years is small while the life span is a mere 22 months greater than the median age of the population. (For comparison, in the general United States population, where annual mortality is well under 1%, the median age of the living is in the mid-30s while the median age at death is close to 80.)

The contradiction means that the basis for classifying as “PWA” is not the same as what determines death; those who are dying are in some manner atypical within the PWA population. What typifies PWAs, though, is being HIV-positive. Therefore something other than being HIV-positive distinguishes those who are dying from those who are not dying.

At least for the most recent decade, HAART seems the obvious “missing link”. Let’s assume that those few percent of PWAs who are dying have not been getting HAART. Then PWAs who survive, who are benefiting from HAART, would be getting older, and the median age of the PWA population would be steadily increasing IN CONTRAST to the median age of those dying, which would continue to be that typical for untreated PWAs. The data show no such thing. During the HAART era, the median ages of death and of PWAs drift upwards in tandem, with no discernible change in the magnitude of the difference, 1.7-1.9 years. Most striking, the median age of surviving PWAs remains below, not above, the median age of death.
In any case, the same contradiction between median-age differences and mortality rates applies in the years 1982 to 1996. The only resolution for this conundrum is to recognize that what determines PWA status — namely, being HIV-positive — isn’t what determines death among PWAs. In other words, HIV doesn’t cause death.

*******************************

There’s another, independent, aspect of “HIV disease” deaths that speaks against HIV as a cause of death. Several times [for example, “How ‘AIDS Deaths’ and ‘HIV Infections’ vary with age — and WHY”, 15 September 2008]  I’ve remarked on the peculiarity that the death rate for “HIV disease” is at a maximum roughly at ages 35-45, something like the prime years of adulthood.

All other diseases show the very opposite: death rates are at their lowest among young-to-middle-aged adults, high among very young children, and increase progressively at ages beyond middle age. For instance, in “’HIV Disease’ is not an illness” [19 March 2008], Table B shows all-cause mortality lowest among young teens and increasing with age (very roughly, doubling in each higher decade); Table C shows a similar variation for cerebrovascular diseases; the “Health, United States” (HUS) reports from the National Center for Health Statistics display this type of variation with age for every type of illness. For influenza, here’s a graphical representation:

Figure I
[from a poster presentation, “Death and Aging in the Time of Influenza: United States, 1960-2002”, by Nobuko Mizoguchi, MPH/MPP, Department of Demography, University of California at Berkeley]

[link: MizoguchiPoster.pdf]


[Figure I image: influenza mortality rates by age, United States, 1960-2002]

“HIV disease” is entirely different; see for instance Figures 2a and 2b in “No HIV ‘latent period’: dotting i’s and crossing t’s” [21 September 2008], or any of the tables for deaths from HIV disease in the HUS reports, e.g. the one reproduced as Table D in “’HIV Disease’ is not an illness” [19 March 2008]; see also Table 2 in “HAART saves lives — but doesn’t prolong them!?” [17 September 2008]. “AIDS” deaths, be it in absolute numbers or in rates, are at a maximum around age 40 ± 5.

In my view, this alone already gives the lie to claims that HIV is fatally pathogenic. No matter what the origin of an attack on the living human organism may be, the tendency to succumb and die increases steadily with age. What could it be about “HIV” that allows older people to resist its ravages better than people in their prime middle years?

The data can’t be explained away speculatively as something about ages at which people most likely get “infected”, because EVERY age distribution having to do with HIV or with AIDS peaks in those same years: positive HIV tests, new AIDS diagnoses, deaths from “HIV disease”, median age of all PWAs (see Table I above). Age distributions for deaths and for positive HIV tests superpose, as illustrated graphically in “How ‘AIDS Deaths’ and ‘HIV Infections’ Vary with Age — and WHY” [15 September 2008] and in “No HIV ‘latent period’: dotting i’s and crossing t’s” [21 September 2008].

Yet another way to illustrate this is to calculate age-specific, PWA-specific death rates; see Table II below. The usual way of reporting death rates (as in the HUS reports) is per 1,000 or per 100,000 of the whole population in the given age group. But one might try to gain further insight into why HIV is so peculiar by looking at what proportion of PWAs in each age group die each year (deaths in that year in that age range divided by the number of PWAs in that age range in the same year); a sketch of that calculation follows Table II. I haven’t yet done the calculation for every year, because the salient overall conclusion seems obvious enough:

Table II

[Table II image: age-specific death rates among PWAs (deaths as a proportion of PWAs in each age range), selected years]
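
For what it’s worth, here is a minimal sketch of that age-specific, PWA-specific calculation, again in Python and again my own reconstruction; it assumes the same binned data layout and the hypothetical pwa_by_year() from the earlier sketch.

```python
import numpy as np

def age_specific_death_rates(deaths, pwa):
    """deaths, pwa: {year: array of counts per age bin}.
    Returns {year: deaths as a fraction of estimated PWAs, per age bin}."""
    rates = {}
    for year in sorted(deaths):
        d = np.asarray(deaths[year], dtype=float)
        p = np.asarray(pwa[year], dtype=float)
        with np.errstate(divide="ignore", invalid="ignore"):
            r = d / p
        rates[year] = np.where(p > 0, r, np.nan)   # leave empty bins undefined
    return rates
```

Multiplying by 100 expresses these fractions as the percentages discussed below.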

These numbers are much more sensitive than the median ages to the various assumptions made in calculating age distributions of PWAs, so the variations are less smooth and only clear major differences should be regarded as reliable. The crucial point is quite clear, though: how little variation there is between the death rates in the various age ranges in any given year. In 1999, for example, about the same proportion of PWAs aged 35-44 died as among those aged ≥65 or among those of intermediate age. In the years 2002-2004, a smaller proportion of PWAs aged ≥55 died than of younger PWAs aged 35-54.

This makes no sense, if PWA, “living with AIDS”, means suffering from a fatal illness that is only temporarily staved off by continual antiretroviral treatment. Older people should succumb more readily than younger people.

The only death statistics that show maximum rates among younger adults are accidents, homicide, suicide: what one might call lifestyle hazards, not biological health challenges. That accords with the hypothesis — for which there is much supporting evidence — that AIDS in the early 1980s was an epiphenomenon of the fast-lane lifestyle practiced by small groups of gay men; look back at Tony Lance’s essay on intestinal dysbiosis.

I offer another speculation as to a possible cause of death that would not discriminate much by age. A highly toxic chemical poison that’s likely to kill within a few years would probably kill old and young people at comparable rates. AZT and other antiretroviral drugs would fit that bill.

But I don’t want to conclude on so speculative a note. The fact that deaths from HIV or AIDS are at a maximum at ages 35-45 shows that those deaths are not the result of an infectious disease, or for that matter of any natural illness. The fact that the median age of the PWA population has been steadily lower (within about two years) than the median age of death among those people, while the mortality has declined enormously over two decades, proves that whatever caused the deaths is not what defines the category “PWA” — i.e., HIV doesn’t cause death, HIV doesn’t cause AIDS. A fortiori, the data show that HAART doesn’t extend life: the interval between the median age of the PWA population and the median age of deaths among them has held steady at 1.7-1.9 years throughout the HAART era, and surviving PWAs are not living longer than those who die.

The only explanation that satisfies all the data is that testing HIV-positive is an artefact as regards illness or death. Testing HIV-positive is just a marker of some sort of physiological response to a variety of challenges.

Posted in antiretroviral drugs, HIV absurdities, HIV does not cause AIDS, HIV skepticism, HIV tests, HIV varies with age, HIV/AIDS numbers | 60 Comments »

Recreational HIV drugs

Posted by Henry Bauer on 2008/12/08

Getting high on HIV drugs in S Africa [Alka Marwaha, BBC News, 8 December 2008]

“Anti-retroviral drugs used to treat HIV/Aids are being bought and smoked by teenagers in South Africa to get high. Reports suggest that the drugs are being sold by patients and even healthcare staff for money. . . . Aids patients themselves have been found smoking the drugs instead of taking them as prescribed. . . . Smoking the pills has a hallucinogenic and relaxing effect. . . . ‘When you look at them, just a few seconds after taking it, they are in another world’ . . . . The children do not know where they are and they stop making sense. . . . It had now become a national problem in South Africa . . . . ‘people who are healthy, that are taking this medication are exposing themselves to potential side-effects of these drugs’”.

Posted in antiretroviral drugs | 9 Comments »

Collateral damage from HIV/AIDS

Posted by Henry Bauer on 2008/12/06

Enormous harm has been caused by the mistaken view that “HIV-positive” signifies infection with a fatal retrovirus that can only be held in check by highly toxic medication to be administered until the patient dies.

By now, millions of people have been subjected to this iatrogenic damage, including some unknown but large number of babies whose mitochondria (central to cellular energy processes) have been irreparably debilitated. We know of people who tested positive only because of anti-tetanus shots, or flu vaccination, or surgical procedures, or many other conditions having nothing to do with a putative immune-system-destroying virus, and those people suffered long periods of ill health and low quality of life until they stopped taking the antiretroviral drugs and regained something like their previous state of sound health.

Physical harm to innumerable people is not, though, the only collateral damage from this medical pseudo-science. Sociopolitical harm is no negligible aspect of this tragedy. For example:

Discrimination against gay men:
Russia Mayor Links HIV To Gay Rights
“Just days after the world presented a united front against HIV during Monday’s 20th anniversary of World AIDS Day, Moscow mayor Yuri Luzhkov has linked HIV to the gay rights movement . . . . Luzhkov, speaking at a conference in Moscow titled “HIV/AIDS in Developed Countries”, said that his administration would continue to ban the progress of gay and lesbian rights, citing the notion that greater visibility for the gay community was responsible for an increase in HIV in Moscow. ‘We have banned, and will ban, the propaganda of sexual minorities’ opinions because they can be one of the factors in the spread of HIV infection,’ he said.”
[Admittedly, this is not the only threat to freedom of speech in present-day Russia]

Panic in schools:
How much harm has been done to how many people and to which social interactions and to what degree, by the announcement of possible HIV infections in a St. Louis school, can never be known:

Too early to know if Mo. school had HIV outbreak
ST. LOUIS (AP)— Six weeks after someone with HIV said dozens of students at a St. Louis high school might have been exposed to the virus, it remains unclear whether an outbreak has occurred.
Missouri health authorities say preliminary October test results for St. Louis County show two new cases of HIV among people 24 and under.
It isn’t clear whether those cases are even connected to Normandy High School, where students were tested voluntarily in late October. An infected person told county health officials that as many as 50 teens might have been exposed to the virus that causes AIDS.
The county plans a second round of HIV testing in January. Antibodies to the virus can take three to six months to appear. A final assessment isn’t expected for at least six months.”

As I said when reporting on the initial publicity from Normandy High School:
“Perhaps the best way of instilling fear and producing mass hysteria is by innuendo and vague suspicions, being unspecific and secretive”.

Here, six weeks later, the uncertainty is predicted to persist for at least another six months, during which time students and parents primarily, but teachers and officials too, will be on tenterhooks, wondering who might have unknowingly contracted the fatal virus; after all, as I cited earlier, “The Health Department also will not say how any exposure might have occurred”.

In a previous “footnote” to the story,  I could unfortunately already illustrate — as now, once again — that “further ‘news’ and rumors . . . will be leaking out from those ignorant, panicked, ‘everything is normal’, school administrators and health officials in St. Louis.”

Racist attitudes:
“Derailing a disease: With new infections here far outpacing the national average, routine HIV testing should be a priority” [Houston Chronicle, 4 December 2008]

“Unfortunately, the human immunodeficiency virus continues its insidious spread in the population. Earlier this week Houston Health Department officials released a grim set of figures to mark World AIDS Day: About 1,700 people became infected with the virus in Harris County in 2006, nearly twice the national rate for new cases. A disproportionate number of those cases occurred among blacks and Hispanics” [emphasis added].
Despite all the high-falutin talk about removing stigma and not blaming victims, how could the continuing stories that Blacks and Hispanics are disproportionately affected by this supposedly sexually transmitted disease not fuel racist beliefs about irresponsible behavior by minorities, particularly in sexual matters?

Breaking up of relationships:
“Hellsing wrote [commenting on the Houston story above]:
When I found out my former husband had a few girlfriends, I got tested immediately. I also had an attorney to call and another residence in which to move while the divorce went through.”

“The urban legend of ‘the down-low’ has brought about circumstances where any woman who tests HIV-positive and who has ever slept with a black man automatically attributes that condition to him, without further ado and without any corroborating evidence. ‘My fault was that I slept with my husband’ (now her ex-husband), says one black woman, who tested HIV-positive when she was pregnant . . . . ‘I let my guard down with the wrong person,’ says yet another . . . . A 20-year-old was ‘the victim of unprotected sex with a guy she thought was her soulmate’ . . . . It seems more than likely that some black men have found themselves unjustly judged guilty of practicing the down-low, and that otherwise stable or potentially long-lasting relationships have thereby been disrupted” [pp. 246-7 in The Origin, Persistence and Failings of HIV/AIDS Theory ]

Imprisoning innocents:
Around the world,  an increasing number of individuals are in jail, declared guilty of infecting others with something that is not transmissible.

Bringing science and medicine into ill repute:
For the time being, it is only a relatively small number of people who are aware of how drastically medical practice and medical science have gone astray. When the knowledge becomes widespread, the exact nature of the fallout can hardly be predicted, but it will certainly be enormously consequential. It may well do for medical science, and even for science generally, something like what Enron did for energy de-regulation and what the present global financial meltdown is doing for the world’s way of trading, banking, and trying to regulate economies.

———————–

Altogether, HIV/AIDS theory has been responsible for disasters both individual and social, including professional and career damage to the few scientists and doctors who refused to accept the official view. Once the realization becomes sufficiently widespread that the theory is not only wrong but was never even a well-supported hypothesis, there will be further calamities befalling innumerable people and institutions, some no doubt well deserved but many of them afflicting people who simply trusted authorities that they had no reason not to trust.

It seems pertinent to repeat this from an earlier post:

This thing is going to be studied long after our time. . . .
Because this is a major historical event
that is going to be studied for 100 years —
how the United States gave AIDS to the world

—   Charles A. Thomas

Posted in antiretroviral drugs, experts, HIV and race, HIV does not cause AIDS, HIV in children, HIV risk groups, HIV skepticism, HIV tests, Legal aspects, prejudice, sexual transmission | 3 Comments »

Institutionalizing conflicts of interest

Posted by Henry Bauer on 2008/12/02

A fellow scientist of my generation likes to describe us as “dinosaurs”, and periodically accuses me of naivety if I slip into suggesting that facts win out in the end or that scientific ideals and traditional ethics have not been completely abandoned.

I guess it’s true that I’ve written and continue to write as though there are people out there who share my disbelief at, for example, the brushing aside of conflicts of interest as only “apparent” (see “Consequences of misconduct in science”).  And there ARE people who share my attitude, call them naïve and unrealistic if you wish: there’s Sheldon Krimsky, Science in the Private Interest: Has the Lure of Profits Corrupted Biomedical Research? (Rowman & Littlefield, 2003); there’s Andrew Stark, Conflict of Interest in American Public Life (Harvard, 2000); there are Centers for Ethics, and periodicals devoted to ethics in research. Plenty of academics are aware of the sad fact that science and medicine, both research and patient care, have been pervasively infiltrated in a way that might even be called corrupting.

Researchers and administrators of research, however, seem oblivious. Well into the 1970s and even the 1980s, universities were at least trying to apply some brakes. We had to make formal application if we consulted more than half a day per week, and if our remuneration exceeded some modest amount. We were not permitted to run a business that was in any way connected with our academic responsibilities. We took for granted the burden of offering our professional advice as to the publishability of manuscripts or the qualifications of candidates for jobs or promotions. When we traveled to present invited seminars or to advise academic institutions, we didn’t expect honoraria in addition to having our expenses covered — and we felt unusually appreciated when we received honoraria equivalent to a few hours of our annual salary. We regarded it as an exceptional perk — comparing ourselves to so many other people — that our university salaries were paid on a 9-month or 10-month basis, permitting us to teach or do research for an extra 20% or so of annual remuneration. I recall being shocked, in the early 1980s, when professors of English were asking for remuneration for reading the book manuscripts of candidates for tenure.

What a different world it is, just a couple of decades later. A misguided Director of the National Institutes of Health dropped certain restrictions on outside income, with predictably disgusting consequences (David Willman, Los Angeles Times, 7 December 2003: “Stealth merger: Drug companies and government medical research”, p. A1; “Richard C. Eastman: A federal researcher who defended a client’s lethal drug”, p. A32; “John I. Gallin: A clinic chief’s desire to ‘learn about industry’”, p. A33; “Ronald N. Germain: A federal lab leader who made $1.4 million on the side”, p. A34; “Jeffrey M. Trent: A government accolade from a paid consultant”, p. A35; “Jeffrey Schlom: A cancer expert who aided studies using a drug wanted by a client”, p. A35.)

Just as with political lobbying, we Americans seem able to euphemize, ignore, and even defend practices that in other lands we would be quick to recognize as plain corruption. What set off this tirade was a news item in the Chronicle of Higher Education, 20 June 2008, p. 13: “To lure top scientists, NIH raises pay for some peer reviewers”, by Jeffrey Brainard. Here are a few extracts:

“The National Institutes of Health plans a major increase in the money it provides to long-serving peer reviewers . . . . Some will receive $250,000 for six years . . . . Under the current terms of $200 per day, such scientists would net only about $6000 after six years”.
[Peanuts! Coffee money! But, after all, this is in addition to their salaries wherever they happen to be working; their pay isn’t cut just because they’re away from the office or the lab. And that $200 per day is in addition to expenses, of course, for travel, food, and accommodation; expenses that can be and often are padded a little.]

“But the largesse . . . . would benefit only a few hundred of the several thousand scientists who help evaluate grants of the institutes. . . . Traditionally, many scientists have willingly reviewed applications, though the fees they have been paid fell well short of the value of the time commitment required: at the NIH, 40 to 80 hours of preparation for each day-and-a-half meeting” — Right. I’ve known quite a few people who have served in this way. (Serving as an academic dean teaches quite a lot about human nature.) Those who spent anything like that amount of preparatory time did it because of their sense of responsibility and don’t need extra money, while those who expect the money and will not otherwise serve will also not spend that amount of time on it.

“’In the end, peer review is only as good as the quality of the people doing it,’ said Elias A. Zerhouni, the NIH’s director”.
Yes, indeed. We need honest, conscientious people who do these things because their profession is a vocation, a calling, not just a way to earn a living, and certainly not a way to acquire wealth.
[Zerhouni continued,] “I think you get what you pay for”.
And there you have it.
— Want medical care? The more you pay, the better care you’ll get. But didn’t we used to think that was a dreadful situation, when behind the Iron Curtain one had to give bribes and tips to get proper care?
— Want education for yourself or your children? The more you can pay, the better education they will get. But isn’t there some sort of consensus still that every American child should get every educational opportunity they can benefit from?
— Want honest evaluation of research? You’d better pay for it, especially to people who don’t need the money because they earn so much already.
It reminds me of the philosopher (I don’t recall whether it was Mort Sahl or Bob Newhart or Tom Lehrer, certainly one of their ilk) responding to questions from students: “And, of course, if you raise my pay, I’ll even give them correct answers”.

But it’s not all gravy, we’re told. “The $250,000 compensation [lovely choice of word] will be awarded as an ‘administrative supplement’ to existing research grants”, so the recipients can use it at will: “They will keep only some of the money, as salary — the underlying grants also typically finance research equipment and laboratory assistants”.

And of course this administrative supplement is in addition to the $200 daily honoraria.

“NIH leaders rejected, though, a controversial proposal by a peer-review task force that would have capped at five the number of research grants that any one scientist could hold, in order to spread dollars among more grant applicants, including younger ones”.
[An earlier piece in the Chronicle had mentioned that scientists are on average 42 years of age before they get their first NIH grant. Got to keep those young Turks in their place, kowtowing as “postdoctoral fellows” to us experienced gurus; otherwise, who could we get to actually do the work in our labs?]

Robert Merton, founding sociologist of science, long ago identified the “Matthew Effect”:

For unto every one that hath shall be given, and he shall have abundance: but from him that hath not shall be taken away even that which he hath.
—Matthew 25:29, King James Version.

It’s not new in science; it’s just become as egregious as Credit Default Swaps and other scams. I usually resist the notion that there exists a self-interested, self-serving Establishment, be it in government or in education or in research. But facts are stubborn things, as they say, and sometimes my naivety bows to them. Dr. Zerhouni and the other “NIH leaders” have certainly provided us with some very stubborn, unpalatable facts.

Posted in experts, Funds for HIV/AIDS, uncritical media | Leave a Comment »