Results 1 - 20 of 24
1.
PLoS Med ; 11(8): e1001709, 2014 Aug.
Article in English | SDG | ID: biblio-1026131

ABSTRACT

Poor sanitation is thought to be a major cause of enteric infections among young children. However, there are no previously published randomized trials to measure the health impacts of large-scale sanitation programs. India's Total Sanitation Campaign (TSC) is one such program that seeks to end the practice of open defecation by changing social norms and behaviors, and providing technical support and financial subsidies. The objective of this study was to measure the effect of the TSC implemented with capacity building support from the World Bank's Water and Sanitation Program in Madhya Pradesh on availability of individual household latrines (IHLs), defecation behaviors, and child health (diarrhea, highly credible gastrointestinal illness [HCGI], parasitic infections, anemia, growth).


Subject(s)
Humans , Male , Female , Child, Preschool , Child , Adolescent , Cluster Analysis , Defecation/physiology , Diarrhea/prevention & control , Anemia/etiology , Anemia/epidemiology , Rural Population/statistics & numerical data , Toilet Facilities/statistics & numerical data , Environmental Hazards , Gastrointestinal Diseases/prevention & control , India
2.
Water Sci Technol ; 52(8): 133-42, 2005.
Article in English | MEDLINE | ID: mdl-16312960

ABSTRACT

We conducted a search to identify all English language papers (published between 1 January 1985 and 26 June 2003) with evidence on the effectiveness of water, sanitation and hygiene interventions in developing countries, in which diarrhoea morbidity in non-outbreak conditions was reported. A total of 39 studies were identified as relevant after an initial review of over 2000 titles. Data were extracted and, where possible, combined using meta-analysis to provide a summary estimate of the effectiveness of specific interventions, including water supply and water treatment. Most of the interventions (including multiple interventions, hygiene and water quality) were found to significantly reduce the levels of diarrhoeal illness, with the greatest impact being seen for hygiene and household treatment interventions (after removal of studies classed as poor quality). Sanitation interventions could not be assessed as only a single study suitable for meta-analysis was identified.
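The pooling step described here, combining per-study effect estimates into a summary estimate, is typically done on the log scale with inverse-variance weights. Below is a minimal DerSimonian-Laird random-effects sketch in Python; the relative risks and confidence intervals are made up for illustration and are not the studies in this review.

```python
import math

# Hypothetical per-study relative risks with 95% CIs (illustrative only,
# not the studies in this review).
studies = [(0.65, 0.45, 0.94), (0.72, 0.55, 0.94), (0.81, 0.60, 1.09)]

# Work on the log scale; each SE is recovered from the CI width.
y = [math.log(rr) for rr, lo, hi in studies]
se = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for _, lo, hi in studies]
w = [1 / s**2 for s in se]  # fixed-effect (inverse-variance) weights

# DerSimonian-Laird estimate of the between-study variance tau^2.
ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
q = sum(wi * (yi - ybar)**2 for wi, yi in zip(w, y))
c = sum(w) - sum(wi**2 for wi in w) / sum(w)
tau2 = max(0.0, (q - (len(y) - 1)) / c)

# Random-effects weights and the pooled relative risk with its 95% CI.
w_re = [1 / (s**2 + tau2) for s in se]
mu = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
se_mu = math.sqrt(1 / sum(w_re))
print("pooled RR %.2f (95%% CI %.2f to %.2f)"
      % (math.exp(mu), math.exp(mu - 1.96 * se_mu), math.exp(mu + 1.96 * se_mu)))
```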


Subject(s)
Developing Countries , Diarrhea/epidemiology , Diarrhea/prevention & control , Hygiene , Sanitation/methods , Water Purification/methods , Water Supply/analysis , Humans , Water Supply/standards
3.
Cochrane Database Syst Rev ; (4): CD005482, 2005 Oct 19.
Article in English | MEDLINE | ID: mdl-16235406

ABSTRACT

BACKGROUND: Although antiretroviral treatment (ART) has led to a decline in morbidity and mortality of HIV-infected patients in developed countries, it has also presented challenges: an increased pill burden; adherence to treatment; development of resistance and treatment failure; drug toxicities; and the rising cost of HIV treatment and care. These issues stimulated interest in the short-term and long-term consequences of discontinuing ART, providing support for research in structured treatment interruptions (STI). Structured treatment interruptions involve taking supervised breaks from ART; they are defined as one or more planned, cyclical interruptions of ART whose timing is pre-specified, attempted in monitored clinical settings in eligible participants. STI have generated hopes of reducing drug toxicities, costs, and total time on treatment in HIV-positive patients. The first STI was attempted in a patient in Germany, who later permanently discontinued treatment; this successful anecdotal case report led to several trials of STI worldwide. OBJECTIVES: The objective of this systematic review was to assess the effects of structured treatment interruptions of antiretroviral therapy in the management of chronic suppressed HIV infection, using all available high-quality studies. SEARCH STRATEGY: Nine databases covering the period from January 1996 to March 2005 were searched. Bibliographies were scanned and experts in the field were contacted to identify unpublished research and ongoing trials. Two reviewers independently extracted data and evaluated study eligibility and quality; disagreements were resolved in consultation with a third reviewer. Data from 33 studies were included in the review. SELECTION CRITERIA: STI is a planned experimental intervention with pre-specified timing. We included all available intervention trials in HIV-infected patients, with or without control groups: 18 randomized and non-randomized controlled trials and 15 single-arm trials. Single-arm trials were included because these pilot studies made significant contributions to the early development and refutation of hypotheses about STI. DATA COLLECTION AND ANALYSIS: The included trials varied in study participants and methodology and reported inconsistent measures of effect. Because of this heterogeneity we did not attempt to meta-analyse them; results were tabulated and a qualitative systematic review was done. MAIN RESULTS: STI strategies were classified as either timed-cycle or CD4-guided. In a timed-cycle strategy, a predetermined period of fixed duration (e.g., one week or one month) off ART was followed by resumption of ART, with close monitoring of CD4 and viral load levels and predetermined criteria for interruption and resumption. Timed-cycle STI fell out of favor after many studies reported development of resistance; moreover, these studies reported no significant immunological or virological benefits and no reduction in toxicities. In a CD4-guided strategy, ART was interrupted for variable durations guided by CD4 levels, and participants with high nadir CD4 levels qualified for this approach. A reduction in costs of ART, a reduction in resistance mutations, and better tolerability were reported for the CD4-guided strategy; however, concerns were also raised about its long-term safety with respect to immunological, virological, and clinical outcomes. AUTHORS' CONCLUSIONS: Timed-cycle STI have not been proven safe in the short term. Although CD4-guided STI have shown favorable short-term outcomes, the long-term safety, efficacy, and tolerability of this strategy have not been fully investigated. Based on the studies we reviewed, the evidence to support the use of timed-cycle or CD4-guided STI as a standard of care in the management of chronic suppressed HIV infection is inconclusive.


Subject(s)
Antiretroviral Therapy, Highly Active/methods , HIV Infections/drug therapy , Adult , Anti-Retroviral Agents/administration & dosage , Chronic Disease , Drug Administration Schedule , Humans , Randomized Controlled Trials as Topic
4.
Epidemiol Infect ; 129(2): 315-23, 2002 Oct.
Article in English | MEDLINE | ID: mdl-12405100

ABSTRACT

This manuscript extends our previously published work (based on data from one clinic) on the association between three drinking-water treatment modalities (boiling, filtering, and bottling) and diarrhoeal disease in HIV-positive persons by incorporating data from two additional clinics collected in the following year. We conducted a cross-sectional survey of drinking water patterns, medication usage, and episodes of diarrhoea among HIV-positive persons attending clinics associated with the San Francisco Community Consortium. We present combined results from our previously published work in one clinic (n = 226) and data from these two additional clinics (n = 458). In this combined analysis we employed logistic regression and marginal structural modelling of the data. The relative risk of diarrhoea for 'always' vs. 'never' drinking boiled water was 0.68 (95% CI 0.45-1.04), and for 'always' vs. 'never' drinking bottled water it was 1.22 (95% CI 0.82-1.82). Drinking filtered water was unrelated to diarrhoea [1.03 (95% CI 0.78-1.35) for 'always' vs. 'never' drinking filtered water]. Adjustment for confounding did not have any notable effect on the point estimates (0.61, 1.35, and 0.98 for boiled, bottled, and filtered water, respectively, as defined above). The risk of diarrhoea was lower among those consuming boiled water, but this finding was not statistically significant. Because of these findings, the importance of diarrhoea in immunocompromised individuals, and the limitations of cross-sectional data, further prospective investigations of water consumption and diarrhoea among HIV-positive individuals are needed.
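The two modelling strategies mentioned, ordinary logistic regression and a marginal structural model fitted by inverse-probability-of-exposure weighting, can be sketched roughly as follows with statsmodels. The file and column names are hypothetical stand-ins, and the binary 'always vs. never' exposure coding is a simplification, not the study's actual variables.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical survey extract: 'diarrhoea' (0/1), 'boiled' (1 = always
# drinks boiled water, 0 = never), plus confounders 'cd4' and 'on_arv'.
df = pd.read_csv("water_survey.csv")  # hypothetical file

# 1) Ordinary logistic regression, adjusting for confounders directly.
X = sm.add_constant(df[["boiled", "cd4", "on_arv"]])
fit = sm.GLM(df["diarrhoea"], X, family=sm.families.Binomial()).fit()
print(np.exp(fit.params["boiled"]))  # adjusted odds ratio

# 2) Marginal structural model: first model the probability of exposure
# given confounders, then weight each subject by the inverse of the
# (stabilized) probability of the exposure actually received.
ps_X = sm.add_constant(df[["cd4", "on_arv"]])
ps = sm.GLM(df["boiled"], ps_X, family=sm.families.Binomial()).fit().predict(ps_X)
p_marg = df["boiled"].mean()
sw = np.where(df["boiled"] == 1, p_marg / ps, (1 - p_marg) / (1 - ps))

# The exposure-outcome model in the weighted pseudo-population estimates
# a marginal (population-averaged) effect.
msm = sm.GLM(df["diarrhoea"], sm.add_constant(df[["boiled"]]),
             family=sm.families.Binomial(), freq_weights=sw).fit()
print(np.exp(msm.params["boiled"]))
```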


Subject(s)
Diarrhea/epidemiology , Diarrhea/etiology , HIV Infections , Water Purification/methods , Adolescent , Adult , Aged , Aged, 80 and over , CD4 Lymphocyte Count , California/epidemiology , Child , Confounding Factors, Epidemiologic , Cross-Sectional Studies , Female , Humans , Logistic Models , Male , Medical Records , Middle Aged , Risk Factors , San Francisco/epidemiology , Water Supply
5.
Epidemiol Infect ; 128(1): 73-81, 2002 Feb.
Article in English | MEDLINE | ID: mdl-11895094

ABSTRACT

In a cross-sectional survey of 226 HIV-infected men, we examined the occurrence of diarrhoea and its relationship to drinking water consumption patterns, risk behaviours, immune status and medication use. Diarrhoea was reported by 47% of the respondents. Neither drinking boiled water nor drinking filtered water was significantly associated with diarrhoea (OR = 0.5 [0.2, 1.6] and 1.2 [0.6, 2.5], respectively), whereas those who drank bottled water were at increased risk for diarrhoea (OR = 3.0 [1.1, 7.8]). Overall, 47% always or often used at least one water treatment. Of the 37% who were very concerned about drinking water, 62% had diarrhoea and 70% always or often used at least one water treatment. An increase in CD4 count was protective only for those with a low medication-associated risk of diarrhoea (OR = 0.6 [0.5, 0.9]). A 30% attributable risk of diarrhoea was estimated for those with high medication risk compared with those with low medication risk. The significant associations between concern about drinking water and diarrhoea, and between concern about drinking water and water treatment, suggest awareness that drinking water is a potential transmission pathway for diarrhoeal disease. At the same time, we found that a significant portion of diarrhoea was associated with sources not related to drinking water, such as medication usage.


Subject(s)
Diarrhea/etiology , HIV Infections/complications , Immunocompromised Host , Risk-Taking , Water Supply , Adult , Cross-Sectional Studies , Diarrhea/epidemiology , Drinking Behavior , Drug-Related Side Effects and Adverse Reactions , Humans , Male , Middle Aged , Odds Ratio , Risk Factors
6.
Emerg Infect Dis ; 7(6): 1004-9, 2001.
Article in English | MEDLINE | ID: mdl-11747729

ABSTRACT

Advances in serologic assays for Cryptosporidium parvum have made serology an attractive surveillance tool. The sensitivity, specificity, and predictive value of these new assays for surveillance of immunocompromised populations, however, have not been reported. Using stored serum specimens collected for the San Francisco Men's Health Study, we conducted a case-control study with 11 clinically confirmed cases of cryptosporidiosis. Based on assays using a 27-kDa antigen (CP23), the serum specimens from cases had a median immunoglobulin G (IgG) response following clinical diagnosis (1,334) and a median net response (433, the change in IgG from baseline) that were significantly higher than the respective control values (329 and -32; Wilcoxon p = 0.01). Receiver operating characteristic curves identified a cutoff of 625 U as providing the optimal sensitivity (0.86 [0.37, 1.0]) and specificity (0.86 [0.37, 1.0]) for predicting Cryptosporidium infection. These data suggest that the enzyme-linked immunosorbent assay technique can be an effective epidemiologic tool for monitoring Cryptosporidium infection in immunocompromised populations.
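The cutoff-selection step, scanning candidate cutoffs and keeping the one with the best sensitivity/specificity trade-off (Youden's J), can be sketched as follows in Python; the toy arrays are illustrative, not the study data.

```python
import numpy as np

# Toy ELISA IgG responses (arbitrary units) and case status (1 = confirmed
# cryptosporidiosis); illustrative values only, not the study data.
igg    = np.array([1334, 980, 720, 640, 610, 520, 410, 330, 300, 150])
status = np.array([   1,   1,   1,   1,   0,   1,   0,   0,   0,   0])

best = None
for cutoff in np.unique(igg):
    pred = igg >= cutoff                 # call "infected" at or above the cutoff
    sens = (pred & (status == 1)).sum() / (status == 1).sum()
    spec = (~pred & (status == 0)).sum() / (status == 0).sum()
    j = sens + spec - 1                  # Youden's J statistic
    if best is None or j > best[0]:
        best = (j, cutoff, sens, spec)

print("cutoff %d U: sensitivity %.2f, specificity %.2f" % best[1:])
```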


Subject(s)
AIDS-Related Opportunistic Infections/immunology , Antibodies, Protozoan/blood , Cryptosporidiosis/immunology , Cryptosporidium parvum/immunology , AIDS-Related Opportunistic Infections/blood , AIDS-Related Opportunistic Infections/epidemiology , AIDS-Related Opportunistic Infections/parasitology , Adult , Animals , Case-Control Studies , Cryptosporidiosis/blood , Cryptosporidiosis/epidemiology , Cryptosporidiosis/parasitology , Humans , Male , Middle Aged , Prospective Studies , San Francisco/epidemiology
8.
Epidemiol Infect ; 127(3): 535-43, 2001 Dec.
Article in English | MEDLINE | ID: mdl-11811888

ABSTRACT

To study whether African-Americans are less likely than whites to present with cryptosporidiosis as an AIDS-defining condition (ADC), a case-control study was conducted using a large, population-based surveillance registry of AIDS patients in California. Data from January 1980 through June 1999 were analysed using risk factor stratification and multivariate logistic regression to evaluate confounding by other risk factors such as gender, injection drug use (IDU), CD4 counts, age and sexual orientation. Cases included 1,373 subjects with cryptosporidiosis as an ADC, and controls included 97,419 subjects with other ADCs. The results indicate a significantly lower risk of presentation with cryptosporidiosis as an ADC among African-Americans compared with whites (OR vs. whites = 0.5, 95% CI 0.4, 0.7). Additionally, there is evidence that heterosexuals are less likely than homosexual/bisexual males to present with cryptosporidiosis (OR = 0.5, 95% CI 0.4, 0.7). Our analyses also suggest a decreasing risk with increasing age. The possibility that there may be biologic factors or differential lifetime exposures that account for the difference between the racial/ethnic groups merits further investigation.


Subject(s)
AIDS-Related Opportunistic Infections/epidemiology , Black or African American , Cryptosporidiosis/epidemiology , Population Surveillance , White People , AIDS-Related Opportunistic Infections/ethnology , Adolescent , Adult , Aged , California/epidemiology , Case-Control Studies , Cryptosporidiosis/ethnology , Female , Heterosexuality , Humans , Logistic Models , Male , Middle Aged , Registries , Risk Factors , Substance Abuse, Intravenous
9.
JAMA ; 284(11): 1417-24, 2000 Sep 20.
Article in English | MEDLINE | ID: mdl-10989405

ABSTRACT

CONTEXT: Chorioamnionitis has been implicated in the pathogenesis of cerebral palsy, but most studies have not reported a significant association. Cystic periventricular leukomalacia (cPVL) is believed to be a precursor of cerebral palsy in preterm infants. OBJECTIVES: To determine whether chorioamnionitis is associated with cerebral palsy or cPVL and to examine factors that may explain differences in study results. DATA SOURCES: Searches of MEDLINE (1966-1999), Index Medicus (1960-1965), Doctoral Dissertation Abstracts On-Line (1861-1999), bibliographies, and online conference proceedings (1999) were performed for English-language studies with titles or abstracts that discussed prenatal risk factors for cerebral palsy or cPVL. STUDY SELECTION: Of 229 initially identified publications, meta-analyses were performed on studies that addressed the association between clinical (n = 19) or histologic (n = 7) chorioamnionitis and cerebral palsy or cPVL in both preterm and full-term infants. Inclusion criteria were: presence of appropriate exposure and outcome measures, case-control or cohort study design, and provision of sufficient data to calculate relative risks (RRs) or odds ratios with 95% confidence intervals (CIs). Studies evaluating risk of cerebral palsy following maternal fever, urinary tract infection, or other maternal infection were collected, but not included in the meta-analysis. DATA EXTRACTION: Information from individual studies was abstracted using standardized forms by 2 independent observers blinded to authors' names, journal titles, and funding sources. DATA SYNTHESIS: Using a random effects model, clinical chorioamnionitis was significantly associated with both cerebral palsy (RR, 1.9; 95% CI, 1.4-2.5) and cPVL (RR, 3.0; 95% CI, 2.2-4.0) in preterm infants. The RR of histologic chorioamnionitis and cerebral palsy was 1.6 (95% CI, 0.9-2.7) in preterm infants, and histologic chorioamnionitis was significantly associated with cPVL (RR, 2.1; 95% CI, 1.5-2.9). Among full-term infants, a positive association was found between clinical chorioamnionitis and cerebral palsy (RR, 4.7; 95% CI, 1.3-16.2). Factors explaining differences in study results included varying definitions of clinical chorioamnionitis, extent of blinding in determining exposure status, and whether individual studies adjusted for potential confounders. CONCLUSION: Our meta-analysis indicates that chorioamnionitis is a risk factor for both cerebral palsy and cPVL.


Subject(s)
Cerebral Palsy/etiology , Chorioamnionitis/complications , Leukomalacia, Periventricular/etiology , Cerebral Palsy/epidemiology , Female , Humans , Infant, Newborn , Infant, Premature , Leukomalacia, Periventricular/epidemiology , Pregnancy , Risk Factors
10.
Lifetime Data Anal ; 6(3): 237-50, 2000 Sep.
Article in English | MEDLINE | ID: mdl-10949861

ABSTRACT

In disease registries there can be a delay between the death of a subject and the reporting of this death to the data analyst. If researchers use the Kaplan-Meier estimator and implicitly assume that subjects whose deaths have yet to be reported are still alive, i.e. are censored at the time of analysis, the Kaplan-Meier estimator is typically inconsistent. Assuming censoring is independent of failure, we provide a simple estimator that is consistent and asymptotically efficient. We also provide estimates of the asymptotic variance of our estimator and simulations that demonstrate the favorable performance of these estimators. Finally, we demonstrate our methods by analyzing AIDS survival data. This analysis underscores the pitfalls of not accounting for delay when estimating the survival distribution and suggests a significant reduction in bias by using our estimator.
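One simple way to see the correction (not necessarily the authors' exact estimator): if the reporting-delay distribution is known, each reported death can be up-weighted by the inverse probability that it would have been reported by the analysis date. A minimal simulation sketch, with every quantity hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
C = 1000.0  # calendar day of the analysis

# Simulated registry (all quantities hypothetical): calendar entry day,
# true survival time, and the delay between death and its report.
n = 20000
entry = rng.uniform(0, 600, n)
surv = rng.exponential(300, n)
delay = rng.exponential(150, n)
reported = entry + surv + delay <= C  # the analyst only sees these deaths

def F_delay(u):
    # Reporting-delay CDF, assumed known here; in practice it must be
    # estimated from the (right-truncated) observed delays.
    return 1 - np.exp(-np.maximum(u, 0) / 150.0)

def s_naive(t):
    # Treats unreported deaths as alive (censored): biased upward.
    full = entry + t <= C  # subjects with at least t days of potential follow-up
    return 1 - (full & reported & (surv <= t)).sum() / full.sum()

def s_corrected(t):
    # A death by t is seen only with probability F_delay(C - entry - surv),
    # so each seen death gets the inverse of that probability as its weight.
    full = entry + t <= C
    seen = full & reported & (surv <= t)
    w = 1.0 / F_delay(C - entry[seen] - surv[seen])
    return 1 - w.sum() / full.sum()

print("true S(365)    %.3f" % np.exp(-365 / 300))
print("naive          %.3f" % s_naive(365.0))
print("corrected      %.3f" % s_corrected(365.0))
```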


Subject(s)
Acquired Immunodeficiency Syndrome/mortality , Registries , Information Management , Models, Statistical , Survival Analysis , Time Factors , United States/epidemiology
11.
Pediatrics ; 105(2): E19, 2000 Feb.
Article in English | MEDLINE | ID: mdl-10654979

ABSTRACT

OBJECTIVE: The recommended dosing frequency of oral penicillin for the treatment of acute streptococcal tonsillopharyngitis has long been 3 to 4 times daily. In 1994, treatment guidelines included twice-daily (BID) dosing for the first time, a recommendation that could significantly increase the ease of compliance. This meta-analysis was performed to determine whether overall cure rates differed between BID or once-daily (QD) versus more frequent dosing schedules in the treatment of streptococcal tonsillopharyngitis. DATA SOURCES: Candidate studies for this meta-analysis included all clinical trials of therapy for streptococcal tonsillopharyngitis published through August 1998 and identified using Medline, Dissertation Abstracts, conference proceedings, and bibliographies of all retrieved articles. STUDY SELECTION: A study was eligible for inclusion if it was a randomized clinical trial that compared the efficacies of different dosing frequencies of 10-day penicillin or amoxicillin in the treatment of streptococcal tonsillopharyngitis. Of the 30 articles initially identified, 6 studies met eligibility criteria. OUTCOME MEASURE: The measure of interest was the difference in proportion cured between the BID or QD dosing group and the comparison group with more frequent dosing. RESULTS: The results of this analysis suggest that BID dosing of 10-day penicillin is as efficacious as more frequent dosing regimens in the treatment of streptococcal tonsillopharyngitis. This result also holds true in a subgroup analysis confined to pediatric cases and does not vary with total daily dose of the regimen. QD dosing of penicillin is associated with a cure rate that is 12 percentage points lower than more frequent dosing (95% confidence interval: 3-21). In contrast, this decreased efficacy is not found with QD dosing of amoxicillin. CONCLUSIONS: This meta-analysis supports current recommendations for BID dosing of penicillin in treating streptococcal tonsillopharyngitis. QD penicillin is associated with decreased efficacy and should not be used. Simplified regimens of amoxicillin of shorter duration or of less frequent dosing should be further investigated.
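For reference, the outcome measure used here is a simple difference in cure proportions between dosing arms; its large-sample confidence interval follows the usual binomial formula (generic, not tied to this review's data):

$$
\hat d = \hat p_1 - \hat p_2,
\qquad
\mathrm{SE}(\hat d) = \sqrt{\frac{\hat p_1(1-\hat p_1)}{n_1} + \frac{\hat p_2(1-\hat p_2)}{n_2}},
\qquad
\text{95\% CI: } \hat d \pm 1.96\,\mathrm{SE}(\hat d)
$$

With the reported 12-percentage-point difference for QD penicillin, a standard error of roughly 4.6 points reproduces the stated 95% CI of 3 to 21.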


Subject(s)
Amoxicillin/administration & dosage , Penicillins/administration & dosage , Pharyngitis/drug therapy , Streptococcal Infections/drug therapy , Streptococcus pyogenes , Administration, Oral , Drug Administration Schedule , Humans , Pharyngitis/microbiology , Streptococcal Infections/microbiology , Tonsillitis/drug therapy , Tonsillitis/microbiology , Treatment Outcome
12.
J Air Waste Manag Assoc ; 49(4): 454-62, 1999 Apr.
Article in English | MEDLINE | ID: mdl-10232060

ABSTRACT

OBJECTIVES: Approximately 1,100 communities in the United States have combined sewer and stormwater systems whose capacity may be exceeded during moderate or heavy rainfall. Outflows may occur that can deposit water with varying concentrations of the components of sewage onto public areas, where contact with residents or workers is possible, potentially resulting in a range of adverse health effects. This study proposes and applies three analytic methods to evaluate the impact of such outflows on public health. METHODS: The work attendance records of 449 U.S. Postal Service letter carriers in Sacramento, CA, were reviewed to determine the frequency of sick leave use (as a surrogate measure of illness) in relation to rainfall and potential exposure to sewage-contaminated outflows in two distinct groups of letter carriers from October 1, 1992, through April 30, 1993. Rainfall was a surrogate measure of outflows from the combined sewer and stormwater system because no information existed about the extent of exposure of these (or any other) workers. One group of letter carriers delivered mail exclusively within the area served by the combined sewer and stormwater collection system; the second group delivered mail exclusively outside this area, where sewage and stormwater collection are separated. The first approach to the assessment of the data was a description of the temporal relationship between rainfall patterns and absentee rates in the two groups of workers. The second approach used logistic regression modelling with varying lags between rainfall and sick leave usage. The third approach used Poisson regression analysis of the entire study period to examine the differential impact of rainfall on the two groups. RESULTS: The descriptive analyses detected no relationship between rainfall and sick leave use. The logistic regression analyses detected evidence (as measured by the interaction coefficient in logistic models) of increased use of sick leave by the letter carriers within the combined-system area at lag periods of one, four, and five days after rainfall; these estimates were not, however, statistically significant (p > 0.05). The Poisson regression analysis showed no evidence of a differential impact of rainfall on the two groups (incidence rate ratio = 1.19 in both groups for periods of rain versus no rain). CONCLUSIONS: These three methods can be used to investigate the public health impact of combined-system outflows. Ideally, however, these approaches would be applied to prospectively collected surveillance data relying on direct measurements of exposure and illness rather than surrogate variables.
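The third approach, Poisson regression with a rain-by-area interaction and an offset for the number at risk, might look like the following sketch (statsmodels; the file and column names are hypothetical):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical daily panel: one row per group per day, with the count of
# sick-leave absences, the number of carriers at risk, a (possibly lagged)
# rain indicator, and an indicator for the combined-system delivery area.
df = pd.read_csv("sick_leave_daily.csv")  # hypothetical file

# Poisson model with an offset for the number at risk. The interaction
# term asks whether rain shifts absence rates differently in the
# combined-system area than in the separated-system area.
m = smf.glm("absences ~ rain * combined_area", data=df,
            offset=np.log(df["carriers_at_risk"]),
            family=sm.families.Poisson()).fit()

# exp(coefficients) are rate ratios; an interaction term near 1 means no
# differential impact of rain, consistent with the equal incidence rate
# ratios (1.19) reported for the two groups.
print(np.exp(m.params))
```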


Subject(s)
Models, Statistical , Public Health , Sewage , Water Pollution/adverse effects , Female , Humans , Infections/epidemiology , Male , Rain , Refuse Disposal/methods , Regression Analysis , Retrospective Studies , Sick Leave
13.
West J Med ; 170(3): 156-60, 1999 Mar.
Article in English | MEDLINE | ID: mdl-10214102

ABSTRACT

To estimate the prevalence and predictors of hepatitis C virus (HCV) infection among inmates, a cross-sectional survey was conducted in 1994 among inmates entering six reception centers of the California Department of Corrections. Discarded serum samples were tested for antibodies to human immunodeficiency virus (HIV), HCV, and hepatitis B core antigen, and for hepatitis B surface antigen (HBsAg). Of 4,513 inmates in this study, 87.0% were men and 13.0% were women. Among male inmates, 39.4% were anti-HCV-positive; by race/ethnicity, prevalence was highest among whites (49.1%). Among female inmates, 53.5% were anti-HCV-positive; prevalence was highest among Latinas (69.7%). In addition, HIV seroprevalence was 2.5% among men and 3.1% among women, and HBsAg prevalence was 2.2% (men) and 1.2% (women). These data indicate that HCV infection is common among both men and women entering prison. The high seroprevalence of anti-HCV among inmates may reflect an increased prevalence of high-risk behaviors and should be of concern to the communities to which these inmates will be released.


Subject(s)
Hepatitis C/epidemiology , Prisoners , Adult , California/epidemiology , Cross-Sectional Studies , Female , Humans , Male , Prevalence , Retrospective Studies
14.
Neurology ; 51(2): 411-8, 1998 Aug.
Article in English | MEDLINE | ID: mdl-9710012

ABSTRACT

OBJECTIVE: The objective of this study is to estimate the risk of subarachnoid hemorrhage produced by oral contraceptive use. METHODS: Studies published since 1960 were identified using MEDLINE, Cumulated Index Medicus, Dissertation Abstracts On-line, and bibliographies of pertinent articles. Two independent reviewers screened published cohort and case-control studies that evaluated the risk of subarachnoid hemorrhage associated with oral contraceptives. Eleven of 21 pertinent studies met predefined quality criteria for inclusion in the meta-analysis. Relative risk (RR) estimates comparing subarachnoid hemorrhage risk in oral contraceptive users with that in nonusers were extracted from each study by two independent reviewers. Study heterogeneity was assessed by design type, outcome measure (mortality versus incidence), exposure measure (current versus ever use), prevailing estrogen dose used, and control for smoking and hypertension. RESULTS: The overall summary RR of subarachnoid hemorrhage due to oral contraceptive use was 1.42 (95% CI, 1.12 to 1.80; p = 0.004). When the two study results failing to control for smoking were excluded from the analysis, a slightly greater effect was seen, with an RR of 1.55 (95% CI, 1.26 to 1.91; p < 0.0001). In the six studies controlling for smoking and hypertension the RR was 1.49 (95% CI, 1.20 to 1.85; p = 0.0003). High-estrogen oral contraceptives appeared to impart a greater risk than low-dose preparations in studies controlling for smoking, but the difference was not significant (high-dose RR, 1.94; 95% CI, 1.06 to 3.56; low-dose RR, 1.51; 95% CI, 1.18 to 1.92). CONCLUSIONS: This meta-analysis of observational studies suggests that oral contraceptive use produces a small increase in the risk of subarachnoid hemorrhage.


PIP: Both case-control and cohort studies have evaluated the risk of subarachnoid hemorrhage (SAH) among oral contraceptive (OC) users and identified relative risks as low as 0.5 and as high as 6.5. To determine whether OC use is indeed a risk factor for SAH after accounting for the variability in study designs and results, a meta-analysis was conducted of the 11 salient independent studies included in the research literature. The summary estimate of effect for all studies was a relative risk (RR) of 1.42 (95% confidence interval (CI), 1.12-1.80). There was a trend toward smaller RRs in the most recent studies, presumably as a result of decreases in the estrogen dose of modern OCs. In the 6 studies that controlled for both smoking and hypertension, the summary RR was 1.49 (95% CI, 1.20-1.85). Only 2 of the 11 studies found a protective effect of current OC use on SAH risk, and it was nonsignificant. Taken together, these studies support a weak positive association between OC use and SAH risk. In the US, an additional 430 patients each year with OC-related SAH would be expected. For most women, the SAH risk is inconsequential in evaluating the decision about OC use. However, for women at high risk of SAH due to unruptured aneurysms, a strong positive family history, smoking, or hypertension, it may be advisable to consider alternative contraceptive methods until more data are available.


Subject(s)
Contraceptives, Oral/adverse effects , Subarachnoid Hemorrhage/chemically induced , Case-Control Studies , Cohort Studies , Humans , Risk Factors , Treatment Outcome
15.
Epidemiology ; 9(3): 255-63, 1998 May.
Article in English | MEDLINE | ID: mdl-9583416

ABSTRACT

We combined information on the temporal pattern of disease incidence for the 1993 cryptosporidiosis outbreak in Milwaukee with information on oocyst levels to obtain insight into the epidemic process. We constructed a dynamic process model of the epidemic with continuous population compartments, using reasonable ranges for the possible distribution of the model parameters, and then explored which combinations of parameters were consistent with the observations. A poor fit of the March 1-22 portion of the time series suggested that a smaller outbreak occurred before the March 23 treatment failure, beginning sometime on or before March 1. This finding suggests that had surveillance systems detected the earlier outbreak, up to 85% of the cases might have been prevented. The same conclusion was obtained independent of the model by transforming the incidence time series data of Mac Kenzie et al.; this transformation is based on a background monthly incidence rate for watery diarrhea in the Milwaukee area of 0.5%. Further analysis using the incidence data from the onset of the major outbreak, March 23, through the end of April resulted in three inferred properties of the infection process: (1) the mean incubation period was likely to have been between 3 and 7 days; (2) there was a necessary concurrent increase in Cryptosporidium oocyst influent concentration and a decrease in treatment efficiency of the water; and (3) the variability of the dose-response function in the model did not appreciably affect the simulated outbreaks.
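For intuition only, a continuous-compartment outbreak model of the general kind described, with a transient treatment failure raising the waterborne force of infection, can be written in a few lines. The structure and every parameter below are invented for illustration; this is not the published model or its fitted values.

```python
import numpy as np
from scipy.integrate import odeint

# Toy compartments: S susceptible, E incubating, I symptomatic. The force
# of infection lumps together oocyst concentration in finished water and
# treatment efficiency, and jumps during a hypothetical failure window.
def force(t):
    return 0.03 if 22 <= t <= 32 else 1e-4  # invented values

def deriv(y, t, incubation=5.0, illness=7.0):
    s, e, i = y
    new_inf = force(t) * s                 # waterborne, not person-to-person
    return [-new_inf,
            new_inf - e / incubation,      # mean incubation ~5 days (toy)
            e / incubation - i / illness]  # mean symptomatic period ~7 days

t = np.linspace(0, 60, 601)
y = odeint(deriv, [1.6e6, 0.0, 0.0], t)    # city-scale susceptible pool
print("peak symptomatic prevalence: %d people" % y[:, 2].max())
```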


Subject(s)
Computer Simulation , Cryptosporidiosis/epidemiology , Disease Outbreaks , Models, Biological , Water Microbiology , Water Supply , Animals , Cryptosporidium/pathogenicity , Diarrhea/microbiology , Disease Transmission, Infectious , Humans , Incidence , Time Factors , Wisconsin
16.
Am J Epidemiol ; 146(2): 115-27, 1997 Jul 15.
Article in English | MEDLINE | ID: mdl-9230773

ABSTRACT

The authors investigated quarterly trends in survival after the diagnosis of Pneumocystis carinii pneumonia for 19,607 patients in California in the decade from January 1, 1983, through December 31, 1992. Subjects included all cases for whom P. carinii pneumonia was the initial (and only) acquired immunodeficiency syndrome (AIDS)-defining diagnosis as reported to the California human immunodeficiency virus/AIDS surveillance registry. There was a period of rapidly improving survival from approximately June 1986 until April 1988, coincident with the widespread introduction of antiretroviral therapy (zidovudine) and the institution of P. carinii pneumonia prophylaxis (with cotrimoxazole and pentamidine). There was no evidence, however, of meaningful improvements in survival for these patients after that period. The associations of several covariates (risk transmission group, gender, race/ethnicity, certainty of P. carinii pneumonia diagnosis, age, region of residence, availability of CD4 count, and level of CD4 count) with survival were also studied, both by proportional hazards regression and by recursive partitioning (i.e., tree-based) survival analysis. The availability of a CD4 count (regardless of its level) was the single factor most strongly associated with survival (median survival 36 months among those with and 14 months among those without reported CD4 counts, p < 0.05). Data from this large, population-based surveillance registry of AIDS in California suggest that, despite earlier improvements in survival after the diagnosis of P. carinii pneumonia, the long-term survival of these patients remains poor (39% alive 2 years after diagnosis) and that no improvement in survival has occurred since 1988.
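The two survival-analysis tools named here, proportional hazards regression and median survival by subgroup, can be sketched with the lifelines package. The data file and column names below are hypothetical stand-ins for a registry extract, not the study's variables.

```python
import pandas as pd
from lifelines import CoxPHFitter, KaplanMeierFitter

# Hypothetical registry extract: survival in months from PCP diagnosis,
# a death indicator, and covariates including whether a CD4 count was
# ever reported (the strongest factor in the abstract).
df = pd.read_csv("pcp_survival.csv")  # hypothetical file

# Proportional hazards regression over the covariates of interest.
cph = CoxPHFitter()
cph.fit(df[["months", "died", "cd4_reported", "age", "female"]],
        duration_col="months", event_col="died")
cph.print_summary()  # hazard ratios with 95% confidence intervals

# Median survival by CD4-count availability, analogous to the 36- vs
# 14-month comparison in the abstract (those numbers come from the
# actual registry, not from this sketch).
km = KaplanMeierFitter()
for flag, grp in df.groupby("cd4_reported"):
    km.fit(grp["months"], event_observed=grp["died"])
    print(flag, km.median_survival_time_)
```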


Subject(s)
AIDS-Related Opportunistic Infections/mortality , Pneumonia, Pneumocystis/mortality , AIDS-Related Opportunistic Infections/drug therapy , AIDS-Related Opportunistic Infections/immunology , Adult , Anti-HIV Agents/therapeutic use , CD4 Lymphocyte Count , California/epidemiology , Female , Humans , Male , Middle Aged , Mortality/trends , Pneumonia, Pneumocystis/drug therapy , Pneumonia, Pneumocystis/immunology , Proportional Hazards Models , Registries , Survival Analysis , Survival Rate
17.
Am J Epidemiol ; 144(9): 807-16, 1996 Nov 01.
Article in English | MEDLINE | ID: mdl-8890659

ABSTRACT

The authors reviewed the medical records of 194 human immunodeficiency virus (HIV)-positive patients newly diagnosed with cryptosporidiosis and all 3,564 patients with newly diagnosed acquired immunodeficiency syndrome (AIDS) at San Francisco General Hospital for the period 1986-1992. The study was designed to address three questions: 1) How do AIDS patients who present with cryptosporidiosis differ from other patients with AIDS? 2) What factors are associated with survival among AIDS patients with newly diagnosed cryptosporidiosis? 3) Does a diagnosis of cryptosporidiosis affect survival after AIDS diagnosis? A total of 194 cases of cryptosporidiosis among HIV-infected patients were identified during the study period. Of the 194 patients, 109 (56%) had no prior diagnosis of AIDS; these 109 patients represented 3.1% of the 3,564 newly diagnosed cases of AIDS in the same period. Among the 134 patients with CD4 T-lymphocyte counts performed within 3 months of Cryptosporidium diagnosis, 34 (25%) had CD4 counts greater than 209 cells/µl. In a multivariate conditional logistic regression model, the incidence of Cryptosporidium was related to ethnicity (for blacks vs. whites, matched odds ratio (OR) = 0.15, 95% confidence interval (CI) 0.03-0.73), CD4 count (for a CD4 count of < or = 53 cells/µl vs. > 53 cells/µl, matched OR = 12.60, 95% CI 4.01-39.61), and age (for a 10-year increase, matched OR = 0.51, 95% CI 0.27-0.98). Two factors measured at the time of Cryptosporidium diagnosis were identified as being independently associated with survival (p < 0.001) in the proportional hazards model: CD4 count < or = 53 cells/µl versus > 53 cells/µl (relative hazard = 6.18, 95% CI 2.99-12.76) and hematocrit < or = 37% versus > 37% (relative hazard = 2.27, 95% CI 1.22-4.22). The median durations of survival in the four subgroups of Cryptosporidium-infected patients defined by these two variables differed significantly from each other (range, 204-1,119 days). Cryptosporidiosis as an initial AIDS-defining diagnosis was associated with an elevated relative hazard of death in comparison with other AIDS-defining diagnoses (relative hazard = 2.01, 95% CI 1.38-2.93). These data identify the groups of HIV-infected individuals at risk for presentation with symptomatic Cryptosporidium infection; the distinct survival patterns among subgroups of those patients already infected with this parasite; and the survival of AIDS patients with newly diagnosed cryptosporidiosis relative to patients with other AIDS-defining conditions. Such information is necessary for the design of prospective studies, the development of prophylactic strategies, the evaluation of candidate therapies, and the provision of prognostic information to patients.


Subject(s)
AIDS-Related Opportunistic Infections/mortality , Acquired Immunodeficiency Syndrome/mortality , Cryptosporidiosis/mortality , AIDS-Related Opportunistic Infections/epidemiology , Acquired Immunodeficiency Syndrome/diagnosis , Adult , CD4 Lymphocyte Count , Case-Control Studies , Cryptosporidiosis/epidemiology , Cryptosporidiosis/etiology , Female , HIV Infections/complications , HIV Infections/mortality , Humans , Incidence , Male , Risk Factors , San Francisco/epidemiology , Survival Analysis
18.
Am J Epidemiol ; 142(11): 1221-30, 1995 Dec 01.
Article in English | MEDLINE | ID: mdl-7485069

ABSTRACT

To evaluate the impact of older age (> 50 years old) on survival in late-stage human immunodeficiency virus (HIV) disease, the authors analyzed 846 HIV-infected patients at the San Francisco Veterans Affairs Medical Center from 1987 to 1992. The median age was 42 years with 171 (20.2%) subjects aged 50 or more years. Survival was measured from the date of initial lymphocyte testing (median CD4 count, 223 cells/mm3) until death or censoring. Compared with those aged less than 40 years, and after multivariate proportional hazards adjustment for other significant determinants of survival (CD4 percentage, CD8 count, hematocrit, and prior acquired immunodeficiency syndrome diagnosis), there was no difference in survival for those aged 40-49 years, but there was a trend toward decreased survival in those aged 50-59 years (relative hazard = 1.32, 95% confidence interval 0.90-1.94) and in those aged 60 or more years (relative hazard = 1.56, 95% confidence interval 0.99-2.46). The impact of older age on mortality in HIV disease is, however, less than the impact of age on overall mortality in the United States. Accordingly, while older HIV-infected patients do have a somewhat poorer survival, this risk need not be too highly emphasized in individual patients; older patients deserve aggressive management.


Subject(s)
HIV Infections/mortality , Adult , Age Factors , Aged , CD4 Lymphocyte Count , Cohort Studies , HIV Infections/immunology , Humans , Middle Aged , Mortality , Multivariate Analysis , Proportional Hazards Models , Survival Analysis
19.
Medicine (Baltimore) ; 74(4): 176-90, 1995 Jul.
Article in English | MEDLINE | ID: mdl-7623653

ABSTRACT

The importance of group B streptococcus (GBS) as a cause of serious infectious disease among adults is not widely appreciated. In adults, the modes of acquisition and transmission are unknown, and because most hospital-based studies of GBS bacteremia in adults consist of small numbers of patients, the clinical spectrum of disease is not well described. Our retrospective study reviews the clinical features, antimicrobial therapy, and risk factors for mortality of 32 adult patients (18 women and 14 men) with GBS bacteremia and compares the proportions of isolates from the different beta-hemolytic streptococcal serogroups. We found that 39% of isolates from adult blood cultures were group B, a frequency nearly identical to that of group A streptococcal bacteremia. Most (66%) adult patients were more than 50 years old. Primary bacteremia was the most frequent clinical diagnosis, occurring in 7 (22%) of 32 patients. Nonhematologic cancer was the most frequently associated condition (25%), and 19% of the patients had diabetes mellitus. The overall mortality rate was 31% and was significantly associated with increasing age. Our results are compared with those from a review of the 5 previous comparable studies and demonstrate that GBS bacteremia is a serious infection in adults, with increased mortality related to advancing age.


Subject(s)
Bacteremia/epidemiology , Streptococcal Infections/epidemiology , Streptococcus agalactiae , Adolescent , Adult , Age Factors , Aged , Bacteremia/microbiology , Bacteremia/mortality , Child , Female , Humans , Male , Middle Aged , Streptococcal Infections/microbiology , Streptococcal Infections/mortality , Streptococcus agalactiae/isolation & purification
20.
N Engl J Med ; 330(5): 370-1, 1994 Feb 03.
Article in English | MEDLINE | ID: mdl-8277970