Results 1 - 20 of 149
1.
Clin Infect Dis ; 36(12): 1577-84, 2003 Jun 15.
Article in English | MEDLINE | ID: mdl-12802759

ABSTRACT

Data regarding the care and management of human immunodeficiency virus (HIV)-infected patients provided by infectious diseases (ID)-trained physicians, compared with data for care and management provided by other specialists, are limited. Here, we report results of a self-administered survey sent to 317 physicians (response rate, 76%) in 4 metropolitan areas of the United States who were identified as providing care to disadvantaged HIV-infected patients. ID-trained physicians who responded that they strongly agreed or somewhat agreed that they had enough time to care for their HIV-infected patients were more likely than were non-ID-trained physicians to provide therapy-adherence counseling. Physicians with ≥50 patients in care and ID-trained physicians were less likely to always discuss condom use and risk reduction for HIV transmission. Factors significantly associated with referring rather than treating HIV-infected patients with hypertension or diabetes included having <50 patients in care, being an ID-trained physician, and practicing in a private practice. These results suggest the need for targeted physician training on the importance of HIV transmission prevention counseling, increasing the duration of patient visits, and improving strategies for generalist-specialist comanagement of HIV-infected patients.


Subject(s)
HIV Infections/therapy , Medicine , Physicians , Practice Patterns, Physicians' , Referral and Consultation , Specialization , Antiretroviral Therapy, Highly Active , Counseling , Empathy , Humans
2.
Genes Immun ; 3(5): 295-8, 2002 Aug.
Article in English | MEDLINE | ID: mdl-12140749

ABSTRACT

The CCR5-Δ32 genotype is known to influence HIV-1 transmission and disease. We genotyped 1301 US women of various races/ethnicities participating in the HIV Epidemiologic Research Study. None was homozygous for CCR5-Δ32. The distribution of heterozygotes was similar in HIV-1-infected and uninfected women. Thirty-seven (11.8%) white women, 28 (3.7%) black/African American (AA) women, seven (3.3%) Hispanic/Latina women, and one (6.6%) woman of another race/ethnicity were heterozygous. The frequency of heterozygotes differed among sites for all races combined (P = 0.001). More heterozygotes were found among AA women in Rhode Island (8.9%) than at the other sites (3.1%) (P = 0.02), while heterozygosity in white women was most common in Maryland (28.6%) (P = 0.025). These regional differences could be accounted for by racial admixture in AAs, but not in whites. Regional variations should be considered when studying host genetic factors and HIV-1 in US populations.
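
For context on why zero Δ32 homozygotes is expected here, a minimal Hardy-Weinberg sketch in Python, using only the heterozygote counts reported above (37 + 28 + 7 + 1 = 73); the rare-allele approximation q ≈ het/2 is our own simplification:

```python
# Hardy-Weinberg check: for a rare allele, heterozygote frequency ~ 2q,
# so q ~ het_freq / 2 and the expected number of homozygotes is q^2 * N.
def expected_delta32_homozygotes(n_women: int, het_freq: float) -> float:
    """Expected CCR5-Δ32/Δ32 homozygotes under Hardy-Weinberg equilibrium."""
    q = het_freq / 2       # approximate Δ32 allele frequency
    return q ** 2 * n_women

# 73 heterozygotes (37 + 28 + 7 + 1) among the 1301 women in the abstract.
print(expected_delta32_homozygotes(1301, 73 / 1301))  # ~1 expected homozygote
```

With an allele frequency near 2.8%, roughly one homozygote would be expected among 1301 women, so observing none is unremarkable.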


Subject(s)
Genetic Variation , HIV Infections/genetics , HIV Infections/immunology , Receptors, CCR5/genetics , Adolescent , Adult , Base Sequence , Black People/genetics , Case-Control Studies , DNA/genetics , Female , Genotype , HIV Infections/epidemiology , HIV-1 , Heterozygote , Hispanic or Latino/genetics , Humans , Middle Aged , United States/epidemiology , White People/genetics
3.
Int J Cancer ; 94(5): 753-7, 2001 Dec 01.
Article in English | MEDLINE | ID: mdl-11745473

ABSTRACT

The purpose of our study was to identify the types and rates of cancers seen in high-risk human immunodeficiency virus (HIV)-infected and HIV-uninfected women. From 1993 to 1995, 1,310 women enrolled at four urban U.S. research sites in the HIV Epidemiology Research Study and were interviewed biannually to identify interval diagnoses and hospitalizations until study closure in March 2000. Cancer incidence data were collected through abstraction of medical records and death certificates. Of 871 HIV-infected and 439 HIV-uninfected women, 85% had a history of smoking and 50% a history of injection drug use. For our analysis, 4,180 person-years were contributed by HIV-infected women and 2,308 person-years by HIV-uninfected women. HIV-infected women had 8 non-Hodgkin's lymphomas, 5 invasive cervical cancers (ICC), 1 Kaposi's sarcoma, and 12 non-AIDS-defining cancers, including 4 lung cancers, compared with 4 cancers in HIV-uninfected women, including 1 lung cancer (all cancers, 6.22/1,000 person-years vs. 1.73/1,000 person-years; p = 0.01). CD4+ cell counts were above 200/mm³ in all women with ICC. HIV-infected women with lung cancer were young smokers (mean age, 40 years), and all died within 6 months of diagnosis. Lung cancer occurred at twice the rate in HIV-infected vs. uninfected women in the cohort and severalfold above the rate expected in age- and race-matched women in U.S. national data (incidence relative risk, 6.39; 95% confidence interval, 3.71-11.02; p < 10⁻⁷). The frequent occurrence of cervical and lung cancers has important implications for the counseling (cigarette cessation), screening (Pap smears), and care of women with HIV infection as they live longer because of current antiretroviral therapies.
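
The comparison against age- and race-matched national rates is an observed/expected ratio with an exact Poisson confidence interval. A minimal sketch of that calculation; the observed and expected counts below are invented for illustration, not taken from the study:

```python
from scipy.stats import chi2

def smr_exact_ci(observed: int, expected: float, alpha: float = 0.05):
    """Observed/expected ratio with an exact Poisson (chi-square) CI."""
    lo = chi2.ppf(alpha / 2, 2 * observed) / 2 / expected if observed else 0.0
    hi = chi2.ppf(1 - alpha / 2, 2 * (observed + 1)) / 2 / expected
    return observed / expected, lo, hi

# Hypothetical: 8 cancers observed where age/race-matched national rates
# predict 1.9; the counts are invented, not the study's.
print(smr_exact_ci(8, 1.9))   # ratio ~4.2 with its exact 95% CI
```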


Subject(s)
HIV Infections/complications , Neoplasms/epidemiology , Adolescent , Adult , CD4 Lymphocyte Count , Female , Humans , Incidence , Lung Neoplasms/epidemiology , Middle Aged , Prospective Studies , Uterine Cervical Neoplasms/epidemiology
4.
Clin Infect Dis ; 33(12): 2055-60, 2001 Dec 15.
Article in English | MEDLINE | ID: mdl-11700576

ABSTRACT

The impact of protease inhibitors (PIs) on emergency department (i.e., emergency room [ER]) visits and hospitalizations was examined among a cohort of human immunodeficiency virus (HIV)-infected and high-risk women followed up in the HIV Epidemiology Research Study (HERS) from 1993 through 1999. The rates of hospitalization and ER visits were measured as a function of recent or current PI use, age, race, transmission risk category, HERS site, baseline CD4 cell count, and baseline virus load; the PI effect was estimated separately by baseline CD4 cell count. In the HERS, PI use was strongly associated with lower rates of ER visits and hospitalizations for patients with baseline CD4 cell counts of <200 cells/mm³ (for hospitalizations: rate ratio [RR], 0.54; 95% confidence interval [CI], 0.33-0.89; for ER visits: RR, 0.38; 95% CI, 0.24-0.61). Other factors associated with increased hospitalization and ER use included a history of injection drug use, low CD4 cell counts, and high virus loads.


Subject(s)
Emergencies , HIV Infections/drug therapy , HIV Protease Inhibitors/therapeutic use , Adult , CD4-Positive T-Lymphocytes/immunology , Cohort Studies , Female , HIV Infections/immunology , Hospitalization , Humans , Longitudinal Studies , Outcome Assessment, Health Care
5.
AIDS Patient Care STDS ; 15(9): 473-80, 2001 Sep.
Article in English | MEDLINE | ID: mdl-11587633

ABSTRACT

Anemia is common during human immunodeficiency virus (HIV) infection and is associated with increased mortality. We conducted a study to examine the impact of highly active antiretroviral therapy (HAART) on anemia in a multicenter cohort of HIV-positive women, the Human Immunodeficiency Virus Epidemiology Research (HER) Study. Among women receiving HAART (n = 188), non-HAART monotherapy or combination antiretroviral therapy (ART) (n = 111), or no reported treatment (n = 62), the prevalence of anemia (hemoglobin <120 g/L) at baseline was 38.3%, 36.9%, and 43.6%, respectively (p = 0.58), and at 1-year follow-up was 26.1%, 36.9%, and 45.2%, respectively (p = 0.01); mean hemoglobin at baseline was 125 ± 16, 122 ± 16, and 122 ± 18 g/L, respectively (p = 0.29), and at 1-year follow-up was 128 ± 14, 123 ± 16, and 119 ± 20 g/L, respectively (p < 0.0001). Adjusted linear regression models showed that HAART was associated with an increase in hemoglobin of 0.20 g/L per month (p = 0.007). After 1 year of treatment, HAART was associated with a 32% reduction in anemia among HIV-infected women (p = 0.01), whereas there was no significant change in the prevalence of anemia among those on non-HAART ART or those with no reported treatment. HAART is associated with a large reduction in anemia among HIV-infected women.


Subject(s)
Anemia/epidemiology , Anemia/etiology , Antiretroviral Therapy, Highly Active , HIV Infections/drug therapy , Adult , Baltimore/epidemiology , Cohort Studies , Cross-Sectional Studies , Female , HIV Infections/complications , Hemoglobins/metabolism , Humans , Michigan/epidemiology , New York City/epidemiology , Prevalence , Rhode Island/epidemiology , Women's Health
6.
JAMA ; 284(21): 2727-32, 2000 Dec 06.
Article in English | MEDLINE | ID: mdl-11105177

ABSTRACT

CONTEXT: Despite scientific uncertainties about effectiveness, wearing back belts in the hope of preventing costly and disabling low back injury in employees is becoming common in the workplace. OBJECTIVE: To evaluate the effectiveness of using back belts in reducing back injury claims and low back pain. DESIGN AND SETTING: Prospective cohort study. From April 1996 through April 1998, we identified material-handling employees in 160 new retail merchandise stores (89 required back belt use; 71 had voluntary back belt use) in 30 states (from New Hampshire to Michigan in the north and from Florida to Texas in the south); data collection ended December 1998, and median follow-up was 6½ months. PARTICIPANTS: A referred sample of 13,873 material-handling employees provided 9377 baseline interviews and 6311 (67%) follow-up interviews; 206 (1.4%) refused the baseline interview. MAIN OUTCOME MEASURES: Incidence rate of material-handling back injury workers' compensation claims and 6-month incidence rate of self-reported low back pain. RESULTS: Neither frequent back belt use nor a belt-requirement store policy was significantly associated with back injury claim rates or self-reported back pain. Rate ratios comparing back injury claims of those who reported wearing back belts usually every day and once or twice a week vs those who reported wearing belts never or once or twice a month were 1.22 (95% confidence interval [CI], 0.87-1.70) and 0.95 (95% CI, 0.56-1.59), respectively. The respective odds ratios for low back pain incidence were 0.97 (95% CI, 0.83-1.13) and 0.92 (95% CI, 0.73-1.16). CONCLUSIONS: In the largest prospective cohort study of back belt use, adjusted for multiple individual risk factors, neither frequent back belt use nor a store policy that required belt use was associated with reduced incidence of back injury claims or low back pain.


Subject(s)
Back Injuries/prevention & control , Back Pain/prevention & control , Occupational Diseases/prevention & control , Protective Clothing , Workplace/standards , Adult , Back Injuries/epidemiology , Back Pain/epidemiology , Female , Humans , Male , Occupational Diseases/epidemiology , Prospective Studies , Protective Clothing/statistics & numerical data , Regression Analysis , United States , Workers' Compensation , Workplace/statistics & numerical data
7.
Scand J Work Environ Health ; 26(5): 406-13, 2000 Oct.
Article in English | MEDLINE | ID: mdl-11103839

ABSTRACT

OBJECTIVES: This study determined the impact of misclassification due to using job titles as surrogate variables for physical work exposures to assess confounding in a study of the preventive effect of back belts on back injury. The authors present retail merchandise data that quantify misclassification from residual confounding by physical work exposures on injury rate ratios when available administrative job titles are used. METHODS: Job title and direct observation data on 134 workers were used to calculate the percentage to which the job-title-adjusted rate ratio for back injury accounts for confounding by the true physical work exposures, awkward postures, and heavy weight handling. Workers' compensation data, an estimate of the effect of back belts from the literature, and the percentage of adjustment of the rate ratio due to the job title variable were used to calculate the magnitude of bias from the rate ratio adjusted for job title. RESULTS: The job title variable was found to have sensitivities of 97% and 85% and specificities of 68% and 58% for awkward postures and heavy weight handling, respectively. The magnitude of confounding bias remaining for the back-injury rate ratio when the job title surrogate was used was 24% for postures and 45% for heavy weight handling. CONCLUSIONS: The administrative job title performed poorly in this setting; residual confounding was sufficient to bias the rate ratio from 2.0 to 1.3. The effect of additional sources of misclassification and the need for better exposure measures than job title are discussed.
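
The residual-confounding mechanism described above can be illustrated with a small simulation: when a null belt effect is confounded by true physical exposure, adjusting for an imperfect job-title surrogate leaves the rate ratio stranded between the crude and the truly adjusted values. The 85% sensitivity and 58% specificity (the heavy-weight-handling values) are from the abstract; the prevalence, belt-use, and injury-rate parameters are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# True physical-exposure confounder C; job-title surrogate S observed with
# sensitivity 0.85 and specificity 0.58 (from the abstract).
C = rng.random(n) < 0.30
S = np.where(C, rng.random(n) < 0.85, rng.random(n) > 0.58)
belt = rng.random(n) < np.where(C, 0.60, 0.30)  # belt use tracks true exposure
rate = np.where(C, 0.04, 0.02)                  # injury rate depends on C only; belts null
cases = rng.poisson(rate)                       # one person-year per worker

def mh_rate_ratio(cases, exposed, strata):
    """Mantel-Haenszel rate ratio for person-time data (1 PY per person)."""
    num = den = 0.0
    for s in np.unique(strata):
        m = strata == s
        a, b = cases[m & exposed].sum(), cases[m & ~exposed].sum()
        t1, t0 = (m & exposed).sum(), (m & ~exposed).sum()
        num += a * t0 / m.sum()
        den += b * t1 / m.sum()
    return num / den

crude = (cases[belt].sum() / belt.sum()) / (cases[~belt].sum() / (~belt).sum())
print(f"crude RR(belt)         ~ {crude:.2f}")                          # confounded upward
print(f"S-adjusted (job title) ~ {mh_rate_ratio(cases, belt, S):.2f}")  # partial adjustment only
print(f"C-adjusted (true)      ~ {mh_rate_ratio(cases, belt, C):.2f}")  # ~1.0 (belts truly null)
```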


Subject(s)
Back Injuries/prevention & control , Bias , Musculoskeletal Diseases/prevention & control , Occupational Diseases/prevention & control , Work , Confounding Factors, Epidemiologic , Humans , Lifting , Models, Theoretical , Musculoskeletal Diseases/epidemiology , Occupational Diseases/epidemiology , Pilot Projects , Workers' Compensation/statistics & numerical data
8.
Am J Ind Med ; 38(1): 40-8, 2000 Jul.
Article in English | MEDLINE | ID: mdl-10861765

ABSTRACT

BACKGROUND: The association between slip- and fall-related injuries and environmental temperature was examined for mostly enclosed (inside vehicles, machinery, or buildings), outdoor (outside, not enclosed), and enclosed/outdoor jobs in the coal mining industry to see whether differences existed among the three work locations, which had varying exposure to cold temperatures. METHODS: Temperature data from the National Climatic Data Center and injury data from the Mine Safety and Health Administration were evaluated from 1985-1990 for seven states. Proportionate methods were used to examine the relationship between slips and falls and temperature. RESULTS: Proportionate injury ratios for slip- and fall-related injuries increased as temperature declined for all three work locations. The proportion of slip- and fall-related injuries that occurred while running/walking increased with declining temperature, with the ground outside as the most common source of these injuries. CONCLUSIONS: Outside movement becomes a greater hazard at freezing temperatures for workers in all locations, not just outdoor workers. Any intervention methods geared toward reducing injury incidents facilitated by cold weather must also be directed toward workers who spend time in more enclosed locations.
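
The proportionate method referred to above compares the share of slip- and fall-related injuries within each temperature band against the share across all bands combined. A minimal sketch with invented counts:

```python
# Proportionate injury ratio (PIR): slip/fall share in a temperature band
# relative to the slip/fall share overall. Counts below are invented for
# illustration; the calculation matches the abstract's description.
injuries = {            # band: (slip/fall injuries, all injuries)
    "below freezing": (240, 800),
    "32-60 F":        (300, 1600),
    "above 60 F":     (260, 1800),
}
total_sf = sum(sf for sf, _ in injuries.values())
total_all = sum(al for _, al in injuries.values())
overall_prop = total_sf / total_all

for band, (sf, al) in injuries.items():
    pir = (sf / al) / overall_prop
    print(f"{band}: PIR = {pir:.2f}")   # PIR > 1 means over-representation
```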


Subject(s)
Accidental Falls/statistics & numerical data , Accidents, Occupational/statistics & numerical data , Coal Mining/statistics & numerical data , Cold Temperature , Occupational Health , Confidence Intervals , Female , Humans , Incidence , Injury Severity Score , Male , Probability , Registries , Risk Factors , United States/epidemiology
9.
Am J Epidemiol ; 150(8): 825-33, 1999 Oct 15.
Article in English | MEDLINE | ID: mdl-10522653

ABSTRACT

Risk factors for work-associated strain or sprain back injuries were investigated in a cohort of 31,076 material handlers from 260 retail merchandise stores in the United States. The workers studied were those with significant material-handling responsibilities: daily lifting and movement of merchandise. Workers in jobs with the greatest physical work requirements had an injury rate of 3.64 per 100 person-years versus 1.82 in workers with lesser work requirements. The unadjusted injury rate for males was 3.67 per 100 person-years compared with 2.34 per 100 person-years for females, but the excess for males was confounded by higher physical work requirements for men in the stocker/receiver job category. The injury rate ratio for short versus long duration of employment was 3.53 (95% confidence interval: 2.90, 4.30); for medium versus long duration of employment, it was 1.38 (95% confidence interval: 1.18, 1.62). The elevated rate ratios were maintained when the data were stratified by subsets with different rates of turnover. The results suggest that workers with the greatest physical work requirements and those with the shortest duration of employment are at the highest risk of back injuries. However, selection forces causing worker turnover within this cohort of active workers are not well characterized and have the potential to bias the measures for time-related factors such as duration of employment.
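
The duration-of-employment contrast is a standard incidence rate ratio on person-time with a log-scale Wald interval. A sketch of that calculation; the case and person-year counts are assumptions chosen only to land near the reported 3.53 (95% CI: 2.90, 4.30):

```python
import math

def rate_ratio_ci(cases1, py1, cases0, py0, z=1.96):
    """Incidence rate ratio with a large-sample (log-scale) 95% CI."""
    rr = (cases1 / py1) / (cases0 / py0)
    se = math.sqrt(1 / cases1 + 1 / cases0)   # SE of log(RR) for Poisson counts
    return rr, rr * math.exp(-z * se), rr * math.exp(z * se)

# Hypothetical counts: short- vs long-duration employees.
print(rate_ratio_ci(cases1=400, py1=11_000, cases0=150, py0=14_550))
```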


Subject(s)
Back Injuries/epidemiology , Occupational Diseases/epidemiology , Sprains and Strains/epidemiology , Accidents, Occupational/statistics & numerical data , Adult , Back Injuries/etiology , Cohort Studies , Female , Humans , Incidence , Lifting , Male , Occupational Diseases/etiology , Occupations/statistics & numerical data , Poisson Distribution , Prospective Studies , Risk Factors , Sprains and Strains/etiology , Time Factors , United States/epidemiology
10.
Int J Occup Environ Health ; 5(2): 79-87, 1999.
Article in English | MEDLINE | ID: mdl-10330506

ABSTRACT

This study examined biomechanical stressor variables (physical work exposures) in relation to job title, gender, and back-belt status in 134 retail store workers. The principal concerns were to quantitatively describe physical work exposures and to determine the degrees to which these quantitative variables correlated with job title and with the use of back belts. An additional objective was to assess the inter-rater reliability of the observation method. The systematic observation method employed was based on a modification of the PATH (Postures, Activities, Tools, and Handling) measurement method. Chi-square analysis indicated that the frequencies of bent or twisted postures followed the pattern of unloaders > stockers > department managers. For weight handled per lift, lower, or carry, the pattern was unloaders > department managers > stockers. The mean lifting frequencies per hour were 35.9 for department managers, 48.8 for stockers, and 137.4 for unloaders. Back-belt-wearing percentages were higher for unloaders (63%) compared with stockers (48%) and department managers (25%). Back-belt-wearing workers had higher levels of biomechanical stressor variables, including arm position, twisting, weight handled, and number of lifts per hour. Kappa statistics ranged from 0.5 to 0.63, a level of adequate or good reliability beyond chance. The method employed in this study is applicable in studies that require only fairly crude distinctions among biomechanical stressor variables. Nevertheless, this level of distinction may be sufficient when implementing intervention studies and control strategies for many material-handling-intensive jobs.
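
The inter-rater reliability figures quoted above are kappa statistics, i.e., agreement beyond chance between two observers. A minimal Cohen's kappa sketch; the 2x2 agreement table is invented for illustration:

```python
import numpy as np

def cohens_kappa(table: np.ndarray) -> float:
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    table = table / table.sum()
    p_observed = np.trace(table)                                 # diagonal = agreement
    p_chance = (table.sum(axis=0) * table.sum(axis=1)).sum()     # marginal products
    return (p_observed - p_chance) / (1 - p_chance)

# Hypothetical agreement table for one posture item: two observers rating
# the same 100 work-sampling intervals (counts invented).
table = np.array([[40, 10],
                  [ 8, 42]], float)
print(f"kappa = {cohens_kappa(table):.2f}")
```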


Subject(s)
Back Injuries/prevention & control , Ergonomics , Occupational Diseases/prevention & control , Risk Assessment/methods , Biomechanical Phenomena , Female , Humans , Male , Observer Variation , Posture , Protective Devices , Reproducibility of Results , Risk Factors , West Virginia
11.
J Acquir Immune Defic Syndr Hum Retrovirol ; 17(4): 345-53, 1998 Apr 01.
Article in English | MEDLINE | ID: mdl-9525436

ABSTRACT

OBJECTIVES: The study's objectives were to determine the size and duration of benefits of early versus delayed versus late treatment with zidovudine (ZDV) on disease progression and mortality in HIV-infected patients, and whether patients rapidly progressing before ZDV treatment had a different outcome from those not rapidly progressing before ZDV. DESIGN: The design was an inception cohort of 1003 HIV-infected patients. One hundred seventy-four of the 1003 patients were treated before CD4 counts fell to <400 × 10⁶/L ("early treatment"); 183 of 1003 patients were treated after CD4 counts fell to <400 × 10⁶/L but before clinical disease developed ("delayed treatment"); and 646 of the 1003 patients had either been treated after clinical disease developed or had not been treated at all by the end of follow-up ("late treatment"). Outcomes were progression to clinical HIV disease and mortality. RESULTS: The relative risk (RR) of progression for early versus delayed treatment was 0.58 (p < .03), and the durability of ZDV benefits on progression was estimated at no more than 2.0 years; however, this estimate had wide confidence intervals. The RR of progression for delayed versus late treatment was 0.54 (p < .0001), and the durability of ZDV benefits was estimated at 1.74 years; this estimate had narrow confidence intervals. Survival was better for early versus delayed treatment (RR = 0.55), but this difference was not statistically significant. In the subgroup of patients with more rapid CD4 decline prior to ZDV therapy, significant benefits on progression were observed for early versus delayed ZDV therapy (RR = 0.42, p = .02) and delayed versus late ZDV therapy (RR = 0.51; p = .0004). Duration of benefit was estimated to be 4.5 years (early versus delayed) and 1.7 years (delayed versus late). For patients with less rapid pre-ZDV decline in CD4 levels, a significant progression benefit was observed for delayed versus late therapy (RR = 0.50; p = .02). Duration of benefit in this subgroup was estimated to be 1.8 years. No significant benefit was found for early versus delayed treatment (RR = 1.12) in the less rapid pre-ZDV CD4 cell decline subgroup. CONCLUSIONS: Early treatment compared with delayed treatment was associated with a sizable reduction in HIV progression, but the duration of benefits was estimated to last only about 2 years. Delayed treatment compared with late treatment with ZDV was associated with substantial reduction of progression, but this reduction was also clearly limited in duration. Benefits on progression and mortality for the early treatment group were heavily dependent on the pre-ZDV CD4 slope. In the subgroup of patients with the most rapid pre-ZDV CD4 cell declines, the duration of benefit was much longer, possibly as long as 4 years.


Subject(s)
Anti-HIV Agents/therapeutic use , HIV Infections/drug therapy , Military Personnel , Zidovudine/therapeutic use , Adult , Anti-HIV Agents/administration & dosage , CD4 Lymphocyte Count , Cohort Studies , Disease Progression , Female , Follow-Up Studies , HIV Infections/mortality , Humans , Male , Proportional Hazards Models , Time Factors , United States , Zidovudine/administration & dosage
13.
JAMA ; 268(4): 495-503, 1992.
Article in English | MEDLINE | ID: mdl-1619741

ABSTRACT

OBJECTIVE: To assess the extent of the human immunodeficiency virus (HIV)/acquired immunodeficiency syndrome (AIDS) epidemic in the District of Columbia and demonstrate an approach to monitoring HIV infection and projecting AIDS incidence at a community level. DESIGN: Backcalculation methods to reconstruct HIV incidence from AIDS incidence in subgroups. Results were compared with directly measured HIV seroprevalence in selected sentinel populations: childbearing women, civilian applicants for military service, and hospital patients admitted for conditions unrelated to HIV infection. RESULTS: Between the start of the epidemic in 1980 and January 1, 1991, one in 57 District of Columbia men aged 20 to 64 years was diagnosed with AIDS. Unlike the plateau projected for the nation, AIDS incidence for the District of Columbia was projected to increase by 34% between 1990 and 1994. Models of HIV infection incidence suggested two broad epidemic waves of approximately equal size. The first occurred in men who have sex with men and peaked during the period from 1982 through 1983. The second began in the mid-1980s in injecting drug users and heterosexuals. We estimated that among District of Columbia residents aged 20 to 64 years, 0.3% of white women, 2.9% of white men, 1.6% of black women, and 4.9% of black men were living with HIV infection as of January 1, 1991. These estimates are broadly consistent with survey data: among black childbearing women in their 20s, HIV prevalence doubled to 2% between the fall of 1989 and the spring of 1991; from military applicant data, we estimated that over 5% of black men born from 1951 through 1967 were HIV-positive; in the sentinel hospital, HIV prevalence rates among male patients aged 25 to 34 years were 11.3% in white men and 16.9% in black men. CONCLUSION: Backcalculation and surveys yielded quantitatively consistent estimates of HIV prevalence. Many injecting drug users and heterosexuals in the District of Columbia were infected after January 1, 1986. Similar monitoring of the epidemic in other localities is needed to focus efforts to reduce the incidence of HIV transmission.
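
Backcalculation treats observed AIDS incidence as the convolution of past HIV infections with the incubation-period distribution and recovers the infection curve by constrained deconvolution. A sketch of the idea using nonnegative least squares; the gamma incubation distribution and the toy AIDS series are assumptions, not the study's inputs:

```python
import numpy as np
from scipy.optimize import nnls
from scipy.stats import gamma

years = np.arange(1980, 1991)                          # 11 calendar years
aids = np.array([2, 8, 30, 80, 160, 260, 380, 500, 610, 700, 760], float)

# Incubation distribution (assumed): gamma with mean ~10 years.
inc = gamma(a=4.0, scale=2.5)

# F[j, i] = P(infection in year i is diagnosed with AIDS in year j).
n = len(years)
F = np.array([[inc.cdf(j - i + 1) - inc.cdf(j - i) if j >= i else 0.0
               for i in range(n)] for j in range(n)])

infections, _ = nnls(F, aids)   # nonnegative least-squares deconvolution
for y, h in zip(years, infections):
    print(f"{y}: estimated HIV infections ~ {h:.0f}")
```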


Subject(s)
Acquired Immunodeficiency Syndrome/epidemiology , Disease Outbreaks , HIV Infections/epidemiology , Urban Health/statistics & numerical data , Acquired Immunodeficiency Syndrome/ethnology , Adult , Black People , District of Columbia/epidemiology , Female , HIV Infections/ethnology , HIV Seroprevalence/trends , Humans , Incidence , Male , Middle Aged , Military Personnel , Models, Statistical , Population Surveillance , Pregnancy , Substance Abuse, Intravenous/complications
14.
Article in English | MEDLINE | ID: mdl-1517964

ABSTRACT

HIV-infected individuals in both early and late stages of HIV disease were evaluated over 2 years to assess temporal trends and determinants of disease progression. The Walter Reed (WR) staging system was used to categorize patients into an early-stage cohort (WR Stages 1 and 2, N = 1183) and a late-stage cohort (WR Stage 5, N = 260) based on the initial clinical evaluation. Progression was defined as the occurrence of Stage 5 disease or beyond for the early cohort and Stage 6 disease or beyond for the late cohort. The cumulative incidence of progression was 15.7% (137 events) for the early-stage cohort and 53.7% (85 events) for the late-stage cohort. Baseline CD4+ T-lymphocyte (T4) count was the most significant marker of progression: 26% of WR Stage 1 or 2 patients with T4 lymphocytes below 500/mm³ progressed, compared with 12% with T4 lymphocytes at or above 500/mm³. In late-stage individuals, 83% with T4 lymphocytes under 200/mm³ progressed, compared with 27% with T4 lymphocytes at or above 200/mm³. Older age was associated with progression in both early- and late-stage groups. Differences in the rates of disease progression were not significant between blacks and whites or between men and women. Two-year rates of progression among the late-stage patients dropped from 78% to 47% between 1986 and 1988. This contrasted with progression rates in the early-stage cohort, which remained stable: 18% for those entering follow-up in 1986 and 17% for those entering follow-up in 1988. These data indicate a significant slowing of HIV disease progression rates and mortality rates among individuals with late-stage disease that is temporally associated with the increased availability and use of therapies. With control of T4 lymphocyte count, age, and calendar time, neither gender nor race was significantly associated with progression in either early- or late-stage patients.


Subject(s)
HIV Infections/physiopathology , HIV-1 , Adolescent , Adult , Aged , Child , Cohort Studies , Enzyme-Linked Immunosorbent Assay , Female , HIV Infections/epidemiology , Humans , Incidence , Male , Middle Aged , Military Personnel , Proportional Hazards Models
15.
N Y State J Med ; 91(11): 479-82, 1991 Nov.
Article in English | MEDLINE | ID: mdl-1771042

ABSTRACT

A total of 10,722 members of the US Army Reserve and Army National Guard residing within the five boroughs of New York City (NYC) were tested for antibody to human immunodeficiency virus (HIV). The crude prevalence in NYC was found to be 10.6 per 1,000, compared with a national prevalence of 1.6 per 1,000. The highest prevalences were found among Blacks and Hispanics, among males, and among those aged 30-39 years. Patterns of infection were found to be similar to those found among heterosexual and drug-abusing populations. Prevalence was highest in the Bronx and varied by less than twofold over all boroughs. Comparisons with reported cases of acquired immunodeficiency syndrome (AIDS) in New York City indicated differences in the demographic and geographic characteristics of the current epidemic of AIDS and HIV infection among members of the Army Reserve components.


Subject(s)
HIV Infections/epidemiology , HIV Seroprevalence , Military Personnel , Adult , Ethnicity , Female , Humans , Male , Middle Aged , New York City/epidemiology , Sex Factors
16.
Am J Public Health ; 81(10): 1280-4, 1991 Oct.
Article in English | MEDLINE | ID: mdl-1928526

ABSTRACT

BACKGROUND: A natural history study of human immunodeficiency virus (HIV) disease was carried out among 1575 HIV-infected US Army men and 6220 demographically similar uninfected soldiers. Inpatient morbidity occurring up to 8 years prior to the date of HIV infection diagnosis among those men who became HIV infected was evaluated for both groups. METHODS: Incidence density rates were calculated for hospital admissions. Poisson regression was used to assess the trend in hospital admissions among those subsequently diagnosed with HIV infection. Prevalence ratios for discharge diagnoses were also calculated. RESULTS: Sixteen diagnoses/diagnosis categories occurred significantly more frequently among subsequently HIV-diagnosed individuals than among those who remained uninfected. Among these were hepatitis B and abscess of the anal/rectal region (6 to 8 years prior to HIV infection diagnosis); unspecified viral infection, enlarged lymph nodes, and syphilis (3 to 5 years prior to HIV infection diagnosis); and diagnoses suggestive of acute retroviral syndrome (1 to 2 years prior to HIV infection diagnosis). CONCLUSIONS: Data such as these may provide useful information to HIV surveillance efforts regarding patterns of morbidity experienced prior to HIV infection, as well as to health care providers regarding patients at high risk for becoming infected with HIV.


Subject(s)
HIV Infections/epidemiology , Military Personnel , Adult , HIV Infections/diagnosis , Hospitalization , Humans , Male , Morbidity , Risk Factors , United States
18.
Am J Epidemiol ; 133(12): 1210-9, 1991 Jun 15.
Article in English | MEDLINE | ID: mdl-2063829

ABSTRACT

The authors conducted a population-based study to attempt to estimate the effect of human immunodeficiency virus type 1 (HIV-1) seropositivity on Armed Services Vocational Aptitude Battery test scores in otherwise healthy individuals with early HIV-1 infection. The Armed Services Vocational Aptitude Battery is a 10-test written multiple aptitude battery administered to all civilian applicants for military enlistment prior to serologic screening for HIV-1 antibodies. A total of 975,489 induction testing records containing both Armed Services Vocational Aptitude Battery and HIV-1 results from October 1985 through March 1987 were examined. An analysis data set (n = 7,698) was constructed by choosing five controls for each of the 1,283 HIV-1-positive cases, matched on five-digit ZIP code, and a multiple linear regression analysis was performed to control for demographic and other factors that might influence test scores. Years of education was the strongest predictor of test scores, raising an applicant's score on a composite test nearly 0.16 standard deviation per year. The HIV-1-positive effect on the composite score was -0.09 standard deviation (99% confidence interval -0.17 to -0.02). Separate regressions on each component test within the battery showed HIV-1 effects between -0.39 and +0.06 standard deviation. The two Armed Services Vocational Aptitude Battery component tests felt a priori to be the most sensitive to HIV-1-positive status showed the least decrease with seropositivity. Much of the variability in test scores was not predicted by either HIV-1 serostatus or the demographic and other factors included in the model. There appeared to be little evidence of a strong HIV-1 effect.


Subject(s)
Aptitude Tests/statistics & numerical data , HIV Seropositivity , HIV-1/immunology , Military Personnel , Aptitude Tests/methods , Confounding Factors, Epidemiologic , Female , HIV Antibodies , Humans , Male , Regression Analysis , United States
19.
JAMA ; 265(13): 1709-14, 1991 Apr 03.
Article in English | MEDLINE | ID: mdl-2002572

ABSTRACT

Because soldiers in the US Army are recurrently tested for the presence of antibody to the human immunodeficiency virus (HIV), HIV seroconversion rates can be directly measured. From November 1985 through October 1989, 429 HIV seroconversions were detected among 718,780 soldiers who contributed 1,088,447 person-years of follow-up time (HIV seroconversion rate, 0.39 per 1000 person-years). Period-specific seroconversion rates declined significantly from 0.49 per 1000 person-years (November 1985 through October 1987) to 0.33 per 1000 person-years (November 1987 through October 1988) to 0.29 per 1000 person-years (November 1988 through October 1989). The HIV seroconversion risk among active-duty soldiers was significantly associated with race/ethnic group, age, gender, and marital status. Based on these trends, we estimate that approximately 220 soldiers (95% confidence interval, 160 to 297 soldiers) were infected with HIV during 1989 and 1990, with potentially fewer in future years.


Subject(s)
HIV Seropositivity/epidemiology , Military Personnel , Adolescent , Adult , Female , Follow-Up Studies , HIV Seropositivity/ethnology , Humans , Male , Marriage , Middle Aged , Multivariate Analysis , Naval Medicine , United States/epidemiology
20.
Article in English | MEDLINE | ID: mdl-1684387

ABSTRACT

We modeled the decline of CD4+ T-lymphocytes (T4 cells) in HIV-infected individuals with a continuous-time Markov process. The model partitions the HIV infection period into six progressive T4-cell count intervals (states), followed by a seventh state: a definitive HIV-infection end point, i.e., AIDS diagnosis or Walter Reed stage 6 (opportunistic infections). The Markov model was used to estimate the state-specific progression rates from data as functions of important progression cofactors. We applied the model to data on 1,796 HIV-positive individuals in the U.S. Army. The estimated mean waiting time from seroconversion to when the T4-cell count persistently drops below 500/mm³, but is greater than 349/mm³, is 4.1 years, and the waiting time to a T4-cell count of less than 200/mm³ is estimated at 8.0 years. The estimated rate of T4-cell decline was higher for HIV-infected individuals with initially high numbers of T4 cells, but the estimated rate of decline remained relatively uniform once the T4-cell count dropped persistently below 500/mm³. The opportunistic infection incubation period, i.e., the time from seroconversion to opportunistic infection diagnosis, is estimated at 9.6 years. Age is found to be an important cofactor. The estimated mean opportunistic infection incubation periods are 11.1, 10.0, and 8.9 years for the youngest (≤25 years old), middle (26-30 years old), and oldest (>30 years old) age groups, respectively. (ABSTRACT TRUNCATED AT 250 WORDS)
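
The waiting times reported above follow from the generator matrix of a progressive continuous-time Markov chain: with the transient T4 states ordered, the expected times to absorption solve -Q t = 1. A minimal sketch; the per-state sojourn times are assumptions tuned only to echo the reported 4.1-, 8.0-, and 9.6-year landmarks:

```python
import numpy as np

# Six transient T4-count states followed by an absorbing endpoint
# (opportunistic infection / WR stage 6). Mean sojourn times are assumed.
sojourn = np.array([2.1, 2.0, 2.0, 1.9, 0.9, 0.7])   # mean years per state
rates = 1 / sojourn                                  # exit rate from each state
n = len(rates)

Q = np.zeros((n, n))                                     # transient part of generator
Q[np.arange(n), np.arange(n)] = -rates                   # diagonal: -exit rate
Q[np.arange(n - 1), np.arange(n - 1) + 1] = rates[:-1]   # progression to next state

t = np.linalg.solve(-Q, np.ones(n))   # expected years to endpoint from each state
print(f"seroconversion to endpoint: {t[0]:.1f} years")                  # ~9.6
print("cumulative years to enter states 2-6:", np.round(np.cumsum(sojourn)[:-1], 1))
```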


Subject(s)
Acquired Immunodeficiency Syndrome/pathology , CD4-Positive T-Lymphocytes/pathology , Leukocyte Count , Markov Chains , Acquired Immunodeficiency Syndrome/physiopathology , Adult , Age Factors , CD4 Antigens , Humans , Time Factors