HIV/AIDS Skepticism

Pointing to evidence that HIV is not the necessary and sufficient cause of AIDS

CDC versus CDC: Which Data to Believe?

Posted by Henry Bauer on 2008/08/15

I’ve commented critically, on numerous occasions, in many connections, on the fallacy of accepting outputs from computer models as though they were reliable data. I’ve also noted on several occasions that the so-called “Surveillance Reports” published by the Centers for Disease Control and Prevention (CDC) have increasingly — since the late 1990s — featured estimates rather than reported numbers (for example, see Table 33, below, from The Origin, Persistence and Failings of HIV/AIDS Theory, and the following pages in the book).

Another egregious example of estimates taking the place of reported numbers turned up as I was looking into information about deaths from “AIDS” (= “HIV disease”). That led me to remember that bureaucracies are ill suited to doing, assessing, managing, or reporting matters scientific: bureaucracies are not good at self-criticism; internal disagreements are wherever possible hidden from outsiders and settled by political rather than scientifically substantive negotiations. That’s part of the reason why 21st-century science is becoming riddled with knowledge monopolies and research cartels.

The Centers for Disease Control and Prevention is a sizeable bureaucracy. Some 16 units report to the Director.

Within the Coordinating Center for Infectious Diseases reside four National Centers, for:
— Immunization and Respiratory Diseases (NCIRD)
— Zoonotic, Vector-Borne, and Enteric Diseases (NCZVED)
— HIV/AIDS, Viral Hepatitis, STD, and TB Prevention (NCHHSTP)
— Preparedness, Detection, and Control of Infectious Diseases (NCPDCID)

NCHHSTP houses a variety of programs under 6 “topics”:
— HIV/AIDS
— Sexually Transmitted Diseases
— Viral Hepatitis
— Tuberculosis
— Global AIDS
— BOTUSA (Botswana-USA).
[That “HIV/AIDS” and “Sexually Transmitted Diseases” are separate “topics” does not, regrettably, mean that the CDC has now acknowledged that HIV/AIDS is not sexually transmitted.]

Within (presumably) the “HIV/AIDS” topic is the Division of HIV/AIDS Prevention, which has published HIV/AIDS Surveillance Reports.

Within the Coordinating Center for Health Information and Service (CCHIS) reside three National Centers:
— Health Statistics (NCHS)
— Public Health Informatics (NCPHI) (has 5 divisions)
— Health Marketing (NCHM)
[For anyone who is not squeamish about bureaucratic and PR jargon, I recommend highly the explanation of what “health marketing” is (and if you can explain what the explanation means, please let me know)]

Evidently the publishers of the HIV/AIDS Surveillance Reports are quite a few bureaucratic steps away from the National Center for Health Statistics, which publishes the National Vital Statistics Reports (NVSR) and annual summaries of Health, United States (HUS). Perhaps that explains why the data in the Surveillance Reports differ so much from those in NVSR and HUS.

Take the instance of deaths in 2004 from “HIV disease”.

NVSR 56 #5, 20 November 2007, using “information from all death certificates filed in the 50 states and the District of Columbia”, lists by age group (in its Table 1) the numbers of recorded deaths, and the death rates per 100,000, for the ten leading causes of death in each group. “Human immunodeficiency virus (HIV) disease” appears as one of those ten leading causes only between ages 19 and 54. It lists 160 deaths among 20-24-year-olds, 1468 among ages 25-34, 4826 among ages 35-44, and 4422 among ages 45-54.

However, numbers for some of the other age groups can be calculated because the death rates for them are supplied in Health, United States, 2007 — With Chartbook on Trends in the Health of Americans (National Center for Health Statistics, Hyattsville, MD: 2007). Appendix I confirms what is said in NVSR: “Numbers of . . . deaths from the vital statistics system represent complete counts . . . . Therefore, they are not subject to sampling error”. Table 42 [also featured in an earlier post, “HIV DISEASE” IS NOT AN ILLNESS, 19 March 2008] is for deaths from HIV disease:

* Rates based on fewer than 20 deaths are considered unreliable and are not shown.

(Note again, under the heading of Table 42, “Data are based on death certificates”.)

These rates allow calculation of the actual numbers of HIV-disease deaths for age groups from 5 through 84 years of age (column F, Table I below): because the NVSR gives not only numbers but also the corresponding rates for each age group, the factor connecting rate and number can be calculated (column D). (The factor is independent of the particular disease but varies with age: it reflects how many individuals in the whole population fall within that age group.) Together with the numbers already given in NVSR, this yields numbers of deaths for the whole range from 5 to 84 years of age (column G).
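For readers who want to follow the arithmetic, here is the calculation as a short sketch. The figures below are illustrative stand-ins, not the actual NVSR/HUS table entries (those appear in the tables reproduced above):

```python
# Sketch of the rate-to-count calculation described above.
# All numbers here are illustrative, NOT the actual NVSR/HUS figures.

# For an age group where NVSR lists both a death count and the matching
# rate per 100,000 (for any of the ten leading causes), the connecting
# factor is simply the group's population in units of 100,000:
nvsr_count = 4826        # deaths from some leading cause (illustrative)
nvsr_rate = 11.1         # corresponding rate per 100,000 (illustrative)
factor = nvsr_count / nvsr_rate   # column D: population / 100,000

# HUS Table 42 supplies HIV-disease rates even for age groups where HIV
# is not among the ten leading causes; the factor recovers the count:
hus_hiv_rate = 2.5       # HIV-disease rate per 100,000 (illustrative)
hiv_deaths = hus_hiv_rate * factor   # columns F/G: reconstructed count
print(round(hiv_deaths))
```

The factor cancels out of the choice of disease, which is why a rate for one cause of death plus a count-and-rate pair for another cause, within the same age group, suffices.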

Now compare those numbers with the estimates published in Table 7 of HIV/AIDS Surveillance Report, volume 18, “Cases of HIV infection and AIDS in the United States and Dependent Areas, 2006”, presenting data “reported to CDC through June 2007”:

For 2004, here is a comparison of the numbers from these two sources within CDC:

The estimates from the CDC are on average 21% greater than the actually recorded numbers. Moreover, the error varies with age group in a remarkably regular way, one that exaggerates the median age of death by more than 3 years.
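The comparison can be sketched in a few lines. The NVSR counts below are the ones quoted earlier; the “estimates” column is hypothetical, since the actual Table II figures appear only in the table image above:

```python
# Estimate-vs-count comparison, sketched with the NVSR counts quoted in
# the text and HYPOTHETICAL stand-ins for the CDC estimates.
nvsr_counts   = {"25-34": 1468, "35-44": 4826, "45-54": 4422}
cdc_estimates = {"25-34": 1600, "35-44": 5700, "45-54": 5800}  # hypothetical

for group, reported in nvsr_counts.items():
    excess = 100 * (cdc_estimates[group] - reported) / reported
    print(f"ages {group}: estimate exceeds count by {excess:+.0f}%")
```

A discrepancy that grows steadily with age, as in this hypothetical pattern, is exactly what shifts the apparent median age of death upward.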

Now, Table 7 in the Surveillance Report does have this caveat, in small print in a footnote to the Table: “These numbers do not represent reported case counts. Rather, these numbers are point estimates, which result from adjustments of reported case counts. The reported case counts have been adjusted for reporting delays and for redistribution of cases in persons initially reported without an identified risk factor, but not for incomplete reporting” [emphasis added]. Incomplete reporting for 2004 should hardly be a problem, however, in a publication that presents data “reported to CDC through June 2007”; nor would incomplete reporting vary with age group in this remarkably regular manner; it would be more random.

Such “adjustments” 3 and 4 years after the event are no rarity in these CDC HIV/AIDS publications. For example, deaths “reported” for the 1980s were “adjusted” downwards in wholesale fashion more than half a dozen years later, thereby obscuring the fact that the earlier data had shown deaths to have been leveling off; see Table 33, p. 221 in The Origin, Persistence and Failings of HIV/AIDS Theory:

Note how “reported” deaths for the years through 1986 somehow decreased dramatically between the 1988 report and the 1989 report. Such re-writing of historical facts will be familiar to students of the former Soviet Union, but it is not normally found in scientific publications.

At any rate, CDC unapologetically—indeed, without admitting it or drawing attention to it—routinely publishes considerably revised “estimates”; for example (Table III), for deaths in 2002 as given in the 2005 and 2006 Surveillance Reports. Table 7 in the 2006 Report does not warn that numbers for as far back as 2002 are different from those for the same years in the 2005 Report.

The Technical Notes do warn: “Tabulations of deaths of persons with AIDS (Table 7) do not reflect actual counts of deaths reported to the surveillance system. Rather, the estimates are based on numbers of reported deaths, which have been adjusted for delays in reporting”.

The estimates may be based on reported deaths; but if so, then they are very loosely based on them indeed, since they differ by as much as 38% in some age groups, see Table II above. That adjustments from one year to the next are so similar in percentage terms for the various age groups (Table III); that the differences between actual counts and “estimates” vary in such regular fashion with age (Table II); and that the numbers given are “point estimates” all indicate that the estimates are arrived at by means of some sort of overarching algorithm, computer model, or graphical representation, with—presumably—periodic adjustment of some of the assumptions or parameters defining the model. However, when estimates, no matter how derived, are claimed to be “based on numbers of reported deaths”, one expects that the mode of estimating will be progressively refined over the years to bring the estimates closer to the actual numbers. That has evidently not been the case here: estimated “data” for deaths for 2004 are shockingly different from the reports based on death certificates (Table II).
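To make concrete what “adjustment for reporting delays” can mean in practice, here is one common form such an adjustment takes. This is an assumption on my part; the Surveillance Report does not publish its actual algorithm:

```python
# One plausible form of a reporting-delay adjustment -- an assumption on
# my part, since the Surveillance Report's algorithm is not published.
# Reported counts are divided by an assumed fraction of deaths taken to
# have been reported so far:
reported_deaths = 4000          # hypothetical count for one age group
assumed_completeness = 0.79     # hypothetical model parameter

point_estimate = reported_deaths / assumed_completeness
print(round(point_estimate))
```

Note that a small revision of the assumed parameter moves the “point estimate” by hundreds of deaths, which would explain how numbers for years long past keep changing from one Report to the next.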

Once again—or rather, as usual—HIV/AIDS “researchers” imply greater accuracy than is warranted. The “point estimates” in Table III differ from year to year by a couple of percent, so the numbers should never be written to more than 3 significant figures. When they differ from actual numbers by as much as they do in Table II, even two significant figures give a false impression.

The overall description at the beginning of the Surveillance Report is also misleading: “Data are presented for cases of HIV infection and AIDS reported to CDC through June 2007. All data are provisional.” Nothing here about “estimates”, and the reader who scans without careful attention to fine-print footnotes and Technical Notes could easily believe—given that figures are quoted to four and five significant figures—that these really are “reported” “data”, not computer garbage-output emanating from invalid models. Nor are readers referred to NVSR or HUS; the only mention of either is in the Technical Notes and does not refer to Table 7: “The population denominators used to compute these rates for the 50 states and the District of Columbia were based on official postcensus estimates for 2006 from the U.S. Census Bureau [24] and bridged-race estimates for 2006 obtained from the National Center for Health Statistics [25].”

Why would one publish estimates when actual numbers are reported by a sibling unit in the same bureaucracy? After all, death certificates are a legal requirement, and information from them should be as trustworthy as demographic data ever can be. Is it coincidental that the HIV/AIDS specialists always overestimate?

Posted in HIV varies with age, HIV/AIDS numbers | 7 Comments »


Posted by Henry Bauer on 2008/02/07

Statistics may seem to be plain information, but often they are issued in order to send messages. So the choice of numbers is naturally determined by what the desired message is.

Want to show that the HIV/AIDS epidemic is even worse than we thought?

“AIDS advocacy groups say the new figures [to be released by the Centers for Disease Control and Prevention] will put the number of Americans infected with the AIDS virus each year close to 50 percent higher than previous estimates, at 55,000 instead of 40,000”
[WASHINGTON (Reuters) “Under 1 percent of U.S. adults have HIV: report”, by Maggie Fox; 29 January 2008]

Want to show that the epidemic is not as bad as we thought?

“The CDC has estimated in the past that more than 1 million Americans in total are infected with the human immunodeficiency virus that causes AIDS.”
“In 1999 to 2006, the prevalence of HIV infection among adults aged 18-49 years in the civilian noninstitutionalized household population of the United States was 0.47 percent . . . . [which means] anywhere between 447,000 people and 841,000 people, with 618,000 the middle number. . . . . [according to] the National Center for Health Statistics . . . . The agency’s snapshot of HIV infection in the United States shows the rate continues to be stable. . . . The report covers adults aged 18 to 49 and only people living in households — not prisoners, the homeless or patients in institutions” [emphases added]

Want to show that there’s still ample cause for concern, even though the HIV/AIDS epidemic is not as bad as we thought?

Even if overall the numbers are stable—“We can say the prevalence is basically stable in this U.S., household-based population”—it’s always possible to pick out sectors where there are grounds for grave concern:

“We do see the disparities by race/ethnicity . . . confirms other surveys that show black men are far more likely than other Americans to be infected. . . . Black men aged 40 to 49 had the highest rate of infection, at close to 4 percent” [emphasis added].
Also, “2% of non-Hispanic blacks were HIV-positive, compared with 0.23% of whites and 0.3% of Mexican-Americans” (Kaiser Health Disparities Report: A Weekly Look At Race, Ethnicity And Health, 31 January 2008).


Want to show that we must leave these important matters to the experts, because lay people can’t handle such complicated calculations?

Just try to put all the official numbers about HIV/AIDS together. The result will be great respect for the experts’ ability to remain unruffled in the face of blatant contradictions. For instance, the number of new infections is worse than we thought, more like 55,000 annually. Therefore the total number living with HIV/AIDS should have been rising at 55,000 per year. Yet we’ve just had official assurance that the rate of HIV among Americans has been stable for something like a decade. What’s been happening to those 55,000 new cases annually?

A natural thought is that there have been 55,000 deaths from HIV disease per year. But if you look at the official statistics, you get the following (for 2004, from National Vital Statistics Report, 56 #5, 20 November 2007):

Deaths from HIV disease: 5608 white Americans, 7271 black Americans, 84 Native Americans, 100 Asian Americans; together these total 13,063. (By the way: “HIV disease was also briefly among the top five killers for the black population during the 1990s” [emphasis added]).

Or by ethnicity: 1758 Hispanic deaths, 11,195 non-Hispanics: total is 12,953. (Again: “HIV disease was one of the top five causes of death for the Hispanic population in the mid-1990s but quickly dropped out of this group”.)

[13,063 minus 12,953 equals 110 people who are neither Hispanic nor non-Hispanic, apparently, but who died anyway.
The report does note, in very fine print, that reporting of ethnicity and race can be somewhat inconsistent, but annoying little inconsistencies like 110 apparently missing people could all be avoided if these reports didn’t claim more accuracy for the numbers than they deserve. Since uncertainties exist in the reporting of ethnicity and race, how about not giving numbers to 5 significant figures and just publishing “13,000”? That would be more honest as well as less annoying. Perhaps statisticians employed by federal agencies could take a short course in the use of “significant figures”—see MATHEMATICAL AND STATISTICAL LIES ABOUT HIV/AIDS, 2 December 2007.]
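The totals, the 110 “missing” people, and the rounding I am recommending can all be checked in a few lines. The counts are the NVSR figures quoted above; the two-significant-figure rounding is my own sketch of the practice suggested:

```python
from math import floor, log10

# A minimal significant-figures rounder -- a sketch of the rounding the
# post recommends (13,063 and 12,953 both become "about 13,000").
def round_sig(x, sig=2):
    """Round x to `sig` significant figures."""
    if x == 0:
        return 0
    return round(x, -int(floor(log10(abs(x)))) + (sig - 1))

# The two NVSR breakdowns quoted above:
by_race      = [5608, 7271, 84, 100]   # white, black, Native American, Asian
by_ethnicity = [1758, 11195]           # Hispanic, non-Hispanic

print(sum(by_race))                      # 13063
print(sum(by_ethnicity))                 # 12953
print(sum(by_race) - sum(by_ethnicity))  # the 110 "missing" people
print(round_sig(sum(by_race)), round_sig(sum(by_ethnicity)))  # 13000 13000
```

At two significant figures the two breakdowns agree exactly, and the phantom 110 vanish.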

Anyway: About 13,000 a year die from HIV disease. About 55,000 newly contract it each year. Therefore the total number living with HIV/AIDS should be rising at the rate of about 42,000 per year. Yet the total number of Americans living with HIV/AIDS has remained stable since the mid-1980s, at about 1 million (references cited in the Preface, pp. 1-2, of The Origin, Persistence and Failings of HIV/AIDS Theory)—or maybe less, if you use numbers from the National Household Survey, above, rather than from the Centers for Disease Control and Prevention.
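The back-of-envelope arithmetic is simple enough to lay out explicitly (the 20-year span is my rough figure for mid-1980s to mid-2000s):

```python
# The accumulation arithmetic from the paragraph above, using the
# official round numbers quoted in the text.
new_infections_per_year = 55_000
deaths_per_year = 13_000

net_growth = new_infections_per_year - deaths_per_year
print(net_growth)        # 42,000 more people living with HIV each year

# Over roughly 20 years of claimed stability, that net growth would
# nearly double the supposedly stable total of ~1 million:
print(net_growth * 20)   # 840,000
```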

Perhaps, then, those 42,000 disappearing mysteriously each year represent people who eventually revert to HIV-negative after once having tested HIV-positive?
But of course the orthodox view is that seroreversion is exceedingly rare.
On the other hand, the evidence is that seroreversion is far from rare—see HIV “INFECTION” DISAPPEARS SPONTANEOUSLY, 22 January 2008.


With deaths from AIDS or “HIV disease”, you can also pick just about any number:

The Centers for Disease Control and Prevention likes to be on the safe side with its estimates. Where the National Vital Statistics Report (above) states 13,063 or 12,953 reported HIV/AIDS deaths for 2004, the Centers for Disease Control and Prevention estimates 18,099 for the same year (HIV/AIDS Surveillance Report, volume 17, revised June 2007).

Leave aside this discrepancy between 13,000 and 18,000 and consider only the methodology for a moment. It indicates that I should moderate my criticism, above, about reporting 13,063 and 12,953 when all that’s relatively reliably known is “about 13,000”. I had been tempted to repeat it concerning the estimate of 18,099 which should obviously have been given instead as “about 18,000”. But then it occurred to me that these estimates originate in computer programs, whose capabilities stretch to a much larger number of digits than a mere 5. Quite likely the computer spat out not 18,099 but something like 18,098.783. Instead of criticizing the resident experts, I should congratulate them for rounding up the estimate to the nearest person.

Posted in experts, HIV absurdities, HIV/AIDS numbers | 1 Comment »