Results 1 - 20 of 34
1.
Cancer Epidemiol ; 45: 26-31, 2016 Dec.
Article in English | MEDLINE | ID: mdl-27687075

ABSTRACT

BACKGROUND: Ascertaining incident cancers is a critical component of cancer-focused epidemiologic cohorts and of cancer prevention trials. Potential methods for cancer case ascertainment include active follow-up and passive linkage with state cancer registries. Here we compare the two approaches in a large cancer screening trial. METHODS: The Prostate, Lung, Colorectal and Ovarian (PLCO) cancer screening trial enrolled 154,955 subjects at ten U.S. centers and followed them for all-cancer incidence. Cancers were ascertained by an active follow-up process involving annual questionnaires, retrieval of records and medical record abstracting to ascertain and confirm cancers. For a subset of centers, linkage with state cancer registries was also performed. We assessed the agreement of the two methods in ascertaining incident cancers from 1993 to 2009 in 80,083 subjects from six PLCO centers where cancers were ascertained both by active follow-up and through linkages with 14 state registries. RESULTS: The ratio (times 100) of confirmed cases ascertained by registry linkage compared to active follow-up was 96.4 (95% CI: 95.1-98.2). Of cancers ascertained by either method, 86.6% and 83.5% were identified by active follow-up and by registry linkage, respectively. Of cancers missed by active follow-up, 30% occurred after subjects were lost to follow-up and 16% were reported but could not be confirmed. Of cancers missed by the registries, 27% were not sent to the state registry of the subject's current address at the time of linkage. CONCLUSION: Linkage with state registries identified a similar number of cancers as active follow-up and can be a cost-effective method to ascertain incident cancers in a large cohort.
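
As a rough guide to the agreement statistics quoted above, the sketch below computes the registry-to-active ratio (times 100) and the share of cases captured by each method from counts of cases confirmed by active follow-up only, by registry linkage only, and by both; the counts are illustrative placeholders, not the trial's actual tallies.

```python
# Illustrative counts only (not PLCO data): cases confirmed by active
# follow-up alone, by registry linkage alone, and by both methods.
active_only, registry_only, both = 1200, 950, 6000

total_either = active_only + registry_only + both
by_active = active_only + both
by_registry = registry_only + both

print("registry vs. active ratio (x100):", round(100 * by_registry / by_active, 1))
print("% of cases found by active follow-up:", round(100 * by_active / total_either, 1))
print("% of cases found by registry linkage:", round(100 * by_registry / total_either, 1))
```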


Subject(s)
Neoplasms/epidemiology; Registries; Adult; Aged; Cohort Studies; Female; Follow-Up Studies; Humans; Incidence; Male; Middle Aged
2.
Am J Epidemiol ; 165(8): 874-81, 2007 Apr 15.
Article in English | MEDLINE | ID: mdl-17244633

ABSTRACT

Volunteers for prevention or screening trials are generally healthier and have lower mortality than the general population. The Prostate, Lung, Colorectal, and Ovarian Cancer Screening Trial (PLCO) is an ongoing, multicenter, randomized trial that randomized 155,000 men and women aged 55-74 years to a screening or control arm between 1993 and 2001. The authors compared demographics, mortality rates, and cancer incidence and survival rates of PLCO subjects during the early phase of the trial with those of the US population. Incidence and mortality from PLCO cancers (prostate, lung, colorectal, and ovarian) were excluded because they are the subject of the ongoing trial. Standardized mortality ratios for all-cause mortality were 46 for men, 38 for women, and 43 overall (100 = standard). Cause-specific standardized mortality ratios were 56 for cancer, 37 for cardiovascular disease, and 34 for both respiratory and digestive diseases. Standardized mortality ratios for all-cause mortality increased with time on study from 31 at year 1 to 48 at year 7. Adjusting the PLCO population to a standardized demographic distribution would increase the standardized mortality ratio only modestly to 54 for women and 55 for men. Standardized incidence ratios for all cancer were 84 in women and 73 in men, with a large range of standardized incidence ratios observed for specific cancers.
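
The standardized mortality ratios above follow the usual indirect-standardization recipe: expected deaths come from applying reference (US) age-specific rates to the cohort's person-years, and SMR = 100 x observed/expected. A minimal sketch with made-up person-years, rates, and death count, not PLCO figures:

```python
# SMR by indirect standardization; every number below is illustrative.
person_years = {"55-59": 120_000, "60-64": 150_000, "65-69": 110_000, "70-74": 70_000}
us_rate_per_100k = {"55-59": 900.0, "60-64": 1400.0, "65-69": 2200.0, "70-74": 3500.0}
observed_deaths = 3200

expected = sum(person_years[a] * us_rate_per_100k[a] / 100_000 for a in person_years)
smr = 100.0 * observed_deaths / expected
print(f"expected deaths = {expected:.0f}, SMR = {smr:.0f}")  # 100 = standard population
```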


Subject(s)
Health Status; Mass Screening; Neoplasms/epidemiology; Voluntary Programs; Aged; Colorectal Neoplasms/epidemiology; Female; Humans; Lung Neoplasms/epidemiology; Male; Middle Aged; Ovarian Neoplasms/epidemiology; Prostatic Neoplasms/epidemiology; United States/epidemiology
3.
Urology ; 58(4): 561-6, 2001 Oct.
Article in English | MEDLINE | ID: mdl-11597539

ABSTRACT

OBJECTIVES: To characterize the role of demographic and clinical parameters in the measurements of prostate-specific antigen (PSA), free PSA (fPSA), and percent free PSA (%fPSA). METHODS: This was a cohort study of volunteers to a randomized screening trial. A central laboratory determined PSA and fPSA for the Prostate, Lung, Colorectal and Ovarian (PLCO) Cancer Screening Trial. A baseline evaluation of free and total PSA was done for 7183 white, black, Asian, Hispanic, and other male volunteers, aged 55 to 74 years. Comparisons were made across racial and ethnic groups and across a set of clinical parameters from a baseline questionnaire. RESULTS: The median levels of serum PSA were less than 2.1 ng/mL in each age-race grouping of the study participants. The levels of free and total PSA were higher in black (n = 868, 12%) participants than in white (n = 4995, 70%) and Asian (n = 849, 11.8%) participants. Individuals who identified themselves as ethnically Hispanic (n = 339, 4.7%) had median PSA levels higher than whites who were not Hispanic. The free and total PSA levels increased with age, particularly among men 70 to 74 years old. However, the %fPSA levels showed less variation among the four racial groups or by age. The free and total PSA levels were higher among those who had a history of benign prostatic disease. CONCLUSIONS: Demographic (age and race/ethnicity) and clinical (history of benign prostatic disease) variables had a moderate effect on the measures of PSA and fPSA and very little effect on %fPSA.
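
Percent free PSA (%fPSA) in this abstract is the conventional quantity free PSA divided by total PSA, expressed as a percentage; the values in the toy call below are arbitrary, not study measurements.

```python
def percent_free_psa(total_psa_ng_ml: float, free_psa_ng_ml: float) -> float:
    """%fPSA: free PSA as a percentage of total PSA."""
    return 100.0 * free_psa_ng_ml / total_psa_ng_ml

print(percent_free_psa(total_psa_ng_ml=2.0, free_psa_ng_ml=0.5))  # 25.0
```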


Subject(s)
Mass Screening/standards; Prostate-Specific Antigen/blood; Prostate-Specific Antigen/genetics; Aged; Asian People/genetics; Black People/genetics; Cohort Studies; Hispanic or Latino/genetics; Hispanic or Latino/statistics & numerical data; Humans; Male; Middle Aged; Prostate-Specific Antigen/standards; Sensitivity and Specificity; White People/genetics
4.
Biometrics ; 57(2): 389-95, 2001 Jun.
Article in English | MEDLINE | ID: mdl-11414561

ABSTRACT

The model that specifies that cancer incidence, I, is the convolution of the preclinical incidence, g, and the density of time in the preclinical phase, f, has frequently been utilized to model data from cancer screening trials and to estimate such quantities as sojourn time, lead time, and sensitivity. When this model is fit to the above data, the parameters of f as well as the parameter(s) governing screening sensitivity must be estimated. Previously, g was either assumed to be equal to clinical incidence or assumed to be a constant or exponential function that also had to be estimated. Here we assume that the underlying incidence, I, in the study population (in the absence of screening) is known. With I known, g then becomes a function of f, which can be solved for using (numerical) deconvolution, thus eliminating the need to estimate g or make assumptions about it. Since numerical deconvolution procedures may be highly unstable, however, we incorporate a smoothing procedure that produces a realistic g function while still closely reproducing the original incidence function I upon convolution with f. We have also added the concept of competing mortality to the convolution model. This, along with the realistic preclinical incidence function described above, results in more accurate estimates of sojourn time and lead time and allows for estimation of quantities related to overdiagnosis, which we define here.
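
In the abstract's notation, the convolution model links observed clinical incidence I to preclinical incidence g and the sojourn-time density f; written out in a standard form consistent with that description (not quoted from the paper):

```latex
I(t) \;=\; \int_{0}^{t} g(s)\, f(t - s)\, \mathrm{d}s
```

With I treated as known and f parametrized, g is recovered from this relation by regularized numerical deconvolution rather than being assumed constant, exponential, or equal to clinical incidence.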


Subject(s)
Mass Screening; Neoplasms/diagnosis; Clinical Trials as Topic/statistics & numerical data; Colorectal Neoplasms/epidemiology; Colorectal Neoplasms/prevention & control; France/epidemiology; Humans; Incidence; Models, Statistical; Neoplasms/epidemiology; Neoplasms/mortality; Neoplasms/prevention & control; Predictive Value of Tests; Reproducibility of Results; Sensitivity and Specificity; Time Factors
5.
J Theor Biol ; 207(2): 129-43, 2000 Nov 21.
Article in English | MEDLINE | ID: mdl-11034825

ABSTRACT

The overwhelming proportion of colorectal carcinomas are believed to originate as adenomatous polyps (adenomas), and the identification and removal of adenomas is an important component of colorectal cancer prevention efforts. Mathematical modeling of adenomas can increase our understanding of the natural history and biology of adenomas and colorectal cancer and can help in the effort to devise optimal prevention and screening strategies. Here we adapt the multi-stage model of carcinogenesis to the problem of the development and growth of adenomas. We show that, using plausible values for the biological parameters, the model can fit various aspects of adenoma data including adenoma prevalence by age, the size distribution of adenomas, clustering of adenomas within individuals and the correlation between distal and proximal adenomas. Explaining the clustering of adenomas within individuals, as well as other findings, requires heterogeneity in risk in the population; we show how such heterogeneity can be related to the distribution of biological parameters in the population. The model can also be adapted to account for adenoma development in two major syndromes related to colorectal cancer, familial adenomatous polyposis and hereditary non-polyposis colorectal cancer.
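
For orientation, the multi-stage framework adapted here is usually introduced through the classical Armitage-Doll approximation: k rare rate-limiting events with rates lambda_1, ..., lambda_k give an age-specific hazard of roughly

```latex
h(t) \;\approx\; \frac{\lambda_1 \lambda_2 \cdots \lambda_k \, t^{\,k-1}}{(k-1)!}
```

This is only the textbook form; the paper's adaptation to adenoma initiation and growth, and the heterogeneity invoked to explain clustering, amount to letting such stage parameters vary across individuals rather than using a single fixed hazard.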


Subject(s)
Adenoma/genetics; Colorectal Neoplasms/genetics; Models, Genetic; Adenoma/pathology; Adenomatous Polyposis Coli/genetics; Adenomatous Polyposis Coli/pathology; Adult; Aging; Apoptosis; Cell Death; Colorectal Neoplasms/pathology; Colorectal Neoplasms, Hereditary Nonpolyposis/genetics; Colorectal Neoplasms, Hereditary Nonpolyposis/pathology; Humans; Likelihood Functions; Mutation; Prognosis
7.
Am J Ind Med ; 38(2): 115-26, 2000 Aug.
Article in English | MEDLINE | ID: mdl-10893504

ABSTRACT

BACKGROUND: An elevated risk of lung cancer among workers in chromate production facilities has previously been reported. This excess risk is believed to be the result of exposure to hexavalent chromium. There have been mixed reports about whether trivalent chromium exposure is also associated with an excess lung cancer risk. Previous studies of measured hexavalent chromium exposure and lung cancer risk have not examined cigarette smoking as a risk factor. METHODS: A cohort of 2,357 workers first employed between 1950 and 1974 at a chromate production plant was identified. Vital status of the workers was followed until December 31, 1992. Work histories of cohort members were compiled from the beginning of employment through 1985, the year the plant closed. Annual average exposure estimates, based on historical exposure measurements, were made for each job title in the plant for the years 1950-1985. These exposure estimates were used to calculate the cumulative hexavalent chromium exposure of each member of the study population. Following closure of the plant, settled dust samples were collected and analyzed for hexavalent and trivalent chromium. The trivalent/hexavalent concentration ratios in each plant area were combined with historic air-sampling data to estimate cumulative trivalent chromium exposure for each individual in the study cohort. Smoking status (yes/no) as of the beginning of employment and clinical signs of potential chromium irritation were identified from company records. RESULTS: Cumulative hexavalent chromium exposure showed a strong dose-response relationship for lung cancer. Clinical signs of irritation, cumulative trivalent chromium exposure, and duration of work were not found to be associated with a risk of lung cancer when included in a proportional hazards model with cumulative hexavalent chromium exposure and smoking. Age-specific data on cumulative hexavalent chromium exposure, observed and expected numbers of lung cancer cases, and person-years of observation are provided. CONCLUSIONS: Cumulative hexavalent chromium exposure was associated with an increased lung cancer risk; cumulative trivalent chromium exposure was not. The excess risk of lung cancer associated with cumulative hexavalent chromium exposure was not confounded by smoking status. The current study offers the best quantitative evidence to date of the relationship between hexavalent chromium exposure and lung cancer. Am. J. Ind. Med. 38:115-126, 2000. Published 2000 Wiley-Liss, Inc.
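
A proportional hazards analysis of the kind described, with cumulative hexavalent chromium exposure and smoking as covariates, can be sketched as below. The data are simulated and the column names invented, so this illustrates only the modeling step, not the study's actual exposure reconstruction or time-dependent handling.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter  # assumes the lifelines package is available

rng = np.random.default_rng(0)
n = 500
cum_cr6 = rng.gamma(2.0, 0.5, n)   # hypothetical cumulative Cr(VI), mg/m3-years
smoker = rng.binomial(1, 0.6, n)   # smoking status at hire (1 = yes)

# Simulate follow-up times whose hazard rises with exposure and smoking.
hazard = 0.01 * np.exp(0.8 * cum_cr6 + 0.5 * smoker)
time = rng.exponential(1.0 / hazard)
event = (time < 40).astype(int)    # administrative censoring at 40 years

df = pd.DataFrame({"years": np.minimum(time, 40.0), "lung_ca_death": event,
                   "cum_cr6": cum_cr6, "smoker": smoker})
cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="lung_ca_death")
cph.print_summary()                # hazard ratios per unit of each covariate
```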


Subject(s)
Chemical Industry; Chromium; Lung Neoplasms/epidemiology; Occupational Diseases/epidemiology; Humans; Occupational Exposure; Proportional Hazards Models; Risk Factors
8.
Am J Ind Med ; 38(2): 127-31, 2000 Aug.
Article in English | MEDLINE | ID: mdl-10893505

ABSTRACT

BACKGROUND: Several reports of workers in chromate production and chromeplating have indicated that exposure to hexavalent chromium is associated with skin and nasal irritation. METHODS: A cohort of 2,357 workers first employed between 1950 and 1974 at a chromate production plant was identified. Clinical findings of irritation were identified by a physician as a result of routine examinations or visits to the medical clinic by members of the cohort. Percentages of the cohort with various clinical findings, the time from hire to occurrence of the first finding, and the mean and median annual hexavalent chromium (measured as CrO(3)) concentration for the job title where the clinical finding first occurred were determined. A proportional hazards model was used to evaluate the relationship between hexavalent chromium exposure and first occurrence of each of the clinical findings. RESULTS: Nasal irritation and nasal ulceration were the most common clinical findings reported, occurring in more than 60% of the cohort. The average time to first occurrence of these findings was less than 3 months, whereas the time to first occurrence of the other findings ranged from 10 to 22 months. Median exposure to hexavalent chromium at the time of occurrence for most of the findings was about 20 microg/m(3). The proportional hazards model indicated that ulcerated nasal septum, irritated skin, and perforated eardrum were significantly associated with ambient hexavalent chromium exposure; all clinical findings with the exception of conjunctivitis and irritated skin were associated with the calendar year of hire, with the risk being lower as the calendar year of hire became more recent. Annual average ambient hexavalent chromium concentrations generally dropped in the plant over the period of the study. CONCLUSIONS: Workers in the chromate production plant in this study experienced a variety of nasal and skin irritations. Irritated and ulcerated nasal septa, in particular, were quite common clinical findings, occurring in over 60% of the cohort, and they occurred in relatively short periods of time, less than 3 months from date of hire. Annual average concentrations of chromium may not be a good predictor of clinical findings of irritation. Am. J. Ind. Med. 38:127-131, 2000. Published 2000 Wiley-Liss, Inc.


Subject(s)
Chemical Industry; Chromium; Nose Diseases/chemically induced; Ulcer/chemically induced; Female; Humans; Male; Proportional Hazards Models
9.
Epidemiology ; 11(3): 297-303, 2000 May.
Article in English | MEDLINE | ID: mdl-10784247

ABSTRACT

Several studies have found an increased risk of colorectal cancer associated with a family history of colorectal cancer. Some studies, although not all, have also suggested that family history of colorectal cancer may be a risk factor for adenomatous polyps. Hereditary nonpolyposis colorectal cancer is a known genetic syndrome predisposing to colorectal cancer. The hypothesis of this paper is that the preponderance of the genetic or familial risk for colorectal cancer in the United States is mediated by hereditary nonpolyposis colorectal cancer. To test this hypothesis, I have incorporated what is known about hereditary nonpolyposis colorectal cancer into a genetic model that generates probabilities of family clustering of colorectal cancer. Using this model, which assumes that all familial risk for colorectal cancer is due to hereditary nonpolyposis colorectal cancer, the expected relative risks for colorectal cancer (and adenomas) associated with given types of family histories were calculated. The relative risks predicted by the model fairly closely matched the results found in the literature, especially those reported from a large cohort study. As observed in several studies, the model predicts that relative risks decrease sharply with age. In contrast to the elevated risk for colorectal cancer, the model predicts no elevated risk for adenomas associated with family history of colorectal cancer.
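
The core step, translating a single susceptibility gene into familial relative risks, can be illustrated with a toy Monte Carlo version: a rare dominant allele with high penetrance is transmitted within families, and the risk to someone with an affected sibling is compared with the population risk. The allele frequency and penetrances below are made-up placeholders, and the sketch omits the age dependence and HNPCC-specific structure of the paper's model.

```python
import numpy as np

rng = np.random.default_rng(1)
n_fam = 1_000_000
q = 0.002               # hypothetical mutant allele frequency
risk_carrier = 0.60     # hypothetical lifetime CRC risk for carriers
risk_noncarrier = 0.05  # hypothetical lifetime CRC risk for non-carriers

parents = rng.binomial(2, q, size=(n_fam, 2))     # mutant-allele copies per parent

def child_is_carrier():
    transmitted = rng.binomial(1, parents / 2.0)  # each parent passes one allele
    return transmitted.sum(axis=1) > 0

sib1, sib2 = child_is_carrier(), child_is_carrier()
risk = lambda carrier: np.where(carrier, risk_carrier, risk_noncarrier)
affected1 = rng.random(n_fam) < risk(sib1)
affected2 = rng.random(n_fam) < risk(sib2)

population_risk = affected2.mean()
risk_given_affected_sib = affected2[affected1].mean()
print("sibling relative risk:", risk_given_affected_sib / population_risk)
```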


Subject(s)
Colonic Neoplasms/genetics; Colorectal Neoplasms, Hereditary Nonpolyposis/epidemiology; Cluster Analysis; Germ-Line Mutation; Humans; Incidence; Models, Statistical; Risk
10.
J Expo Anal Environ Epidemiol ; 8(2): 187-206, 1998.
Article in English | MEDLINE | ID: mdl-9577750

ABSTRACT

Data from several studies suggest that concentrations of dioxins rose in the environment from the 1930s to about the 1960s/70s and have been declining over the last decade or two. The most direct evidence of this trend comes from lake core sediments, which can be used to estimate past atmospheric depositions of dioxins. The primary source of human exposure to dioxins is through the food supply. The pathway relating atmospheric depositions to concentrations in food is quite complex, and accordingly, it is not known to what extent the trend in human exposure mirrors the trend in atmospheric depositions. This paper describes an attempt to statistically reconstruct the pattern of past human exposure to the most toxic dioxin congener, 2,3,7,8-TCDD (abbreviated TCDD), through use of a simple pharmacokinetic (PK) model which included a time-varying TCDD exposure dose. This PK model was fit to TCDD body burden data (i.e., TCDD concentrations in lipid) from five U.S. studies dating from 1972 to 1987 and covering a wide age range. A Bayesian statistical approach was used to fit TCDD exposure; model parameters other than exposure were all previously known or estimated from other data sources. The primary results of the analysis are as follows: (1) use of a time-varying exposure dose provided a far better fit to the TCDD body burden data than did using a dose that was constant over time; this is strong evidence that exposure to TCDD has, in fact, varied during the 20th century, (2) the year of peak TCDD exposure was estimated to be in the late 1960s, which coincides with peaks found in sediment core studies, (3) the modeled average exposure dose during these peak years was estimated at 1.4-1.9 pg TCDD/kg-day, and (4) modeled exposure doses of TCDD for the late 1980s of less than 0.10 pg TCDD/kg-day correlated well with recent estimates of exposure doses around 0.17 pg TCDD/kg-day (recent estimates are based on food concentrations combined with food ingestion rates; food is thought to explain over 90% of total dioxin exposure). This paper describes these and other results, the goodness-of-fit between predicted and observed lipid TCDD concentrations, the modeled impact of breast feeding on lipid concentrations in young individuals, and sensitivity and uncertainty analyses.
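
The reconstruction rests on a simple pharmacokinetic idea: body burden integrates a time-varying intake subject to first-order elimination. A minimal one-compartment sketch follows; the half-life, body-fat fraction, and the bell-shaped intake curve peaking in the late 1960s are assumed illustrative values, not the fitted quantities from the paper.

```python
import numpy as np

half_life_years = 7.5                      # assumed adult TCDD half-life
k = np.log(2) / half_life_years            # first-order elimination rate (1/yr)

years = np.arange(1930, 1990)
peak_year, width, peak_dose = 1968, 12.0, 1.7         # intake peak, pg TCDD/kg-day
dose = peak_dose * np.exp(-0.5 * ((years - peak_year) / width) ** 2)

burden = 0.0                               # pg TCDD per kg body weight
for d in dose:                             # yearly Euler steps
    burden += 365.0 * d - k * burden       # one year's intake minus elimination

lipid_fraction = 0.25                      # assumed fraction of body weight that is lipid
print("approx. 1989 lipid TCDD: %.0f pg/g lipid" % (burden / (lipid_fraction * 1000.0)))
```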


Subject(s)
Environmental Exposure/analysis; Models, Statistical; Polychlorinated Dibenzodioxins/analysis; Adolescent; Adult; Age Factors; Aged; Animals; Bayes Theorem; Child; Child, Preschool; Female; Humans; Infant; Infant, Newborn; Male; Middle Aged; Milk/chemistry; Polychlorinated Dibenzodioxins/adverse effects; Polychlorinated Dibenzodioxins/pharmacokinetics; Sex Factors; Soil Pollutants/analysis
11.
Biol Cybern ; 73(2): 129-37, 1995 Jul.
Article in English | MEDLINE | ID: mdl-7545011

ABSTRACT

Synchronous firing of a population of neurons has been observed in many experimental preparations; in addition, various mathematical neural network models have been shown, analytically or numerically, to contain stable synchronous solutions. In order to assess the level of synchrony of a particular network over some time interval, quantitative measures of synchrony are needed. We develop here various synchrony measures which utilize only the spike times of the neurons; these measures are applicable in both experimental situations and in computer models. Using a mathematical model of the CA3 region of the hippocampus, we evaluate these synchrony measures and compare them with pictorial representations of network activity. We illustrate how synchrony is lost and synchrony measures change as heterogeneity amongst cells increases. Theoretical expected values of the synchrony measures for different categories of network solutions are derived and compared with results of simulations.
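
As a concrete, much simpler example of a synchrony measure built only from spike times, the function below scores, for each ordered pair of cells, the fraction of one cell's spikes that fall within a small window of some spike of the other, and averages over pairs. It is a generic coincidence index for illustration, not one of the specific measures developed in the paper.

```python
import numpy as np

def coincidence_synchrony(spike_trains, window_ms=5.0):
    """Average over ordered cell pairs of the fraction of spikes in one train
    lying within window_ms of some spike in the other train."""
    trains = [np.asarray(t, dtype=float) for t in spike_trains]
    scores = []
    for i, a in enumerate(trains):
        for j, b in enumerate(trains):
            if i == j or a.size == 0 or b.size == 0:
                continue
            nearest = np.min(np.abs(a[:, None] - b[None, :]), axis=1)
            scores.append(np.mean(nearest <= window_ms))
    return float(np.mean(scores))

# Two nearly synchronous cells plus one offset cell (spike times in ms)
print(coincidence_synchrony([[10, 110, 210], [12, 108, 213], [60, 160, 260]]))
```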


Subject(s)
Models, Neurological; Nerve Net/physiology; Action Potentials/physiology; Animals; Computer Simulation; Cybernetics; Hippocampus/physiology; Humans; In Vitro Techniques; N-Methylaspartate/metabolism; Periodicity; alpha-Amino-3-hydroxy-5-methyl-4-isoxazolepropionic Acid/metabolism
12.
J Comput Neurosci ; 1(1-2): 39-60, 1994 Jun.
Article in English | MEDLINE | ID: mdl-8792224

ABSTRACT

We have developed a two-compartment, eight-variable model of a CA3 pyramidal cell as a reduction of a complex 19-compartment cable model [Traub et al, 1991]. Our reduced model segregates the fast currents for sodium spiking into a proximal, soma-like, compartment and the slower calcium and calcium-mediated currents into a dendrite-like compartment. In each model periodic bursting gives way to repetitive soma spiking as somatic injected current increases. Steady dendritic stimulation can produce periodic bursting of significantly higher frequency (8-20 Hz) than can steady somatic input (< 8 Hz). Bursting in our model occurs only for an intermediate range of electrotonic coupling conductance. It depends on the segregation of channel types and on the coupling current that flows back-and-forth between compartments. When the soma and dendrite are tightly coupled electrically, our model reduces to a single compartment and does not burst. Network simulations with our model using excitatory AMPA and NMDA synapses (without inhibition) give results similar to those obtained with the complex cable model [Traub et al, 1991; Traub et al, 1992]. Brief stimulation of a single cell in a resting network produces multiple synchronized population bursts, with fast AMPA synapses providing the dominant synchronizing mechanism. The number of bursts increases with the level of maximal NMDA conductance. For high enough maximal NMDA conductance synchronized bursting repeats indefinitely. We find that two factors can cause the cells to desynchronize when AMPA synapses are blocked: heterogeneity of properties amongst cells and intrinsically chaotic burst dynamics. But even when cells are identical, they may synchronize only approximately rather than exactly. Since our model has a limited number of parameters and variables, we have studied its cellular and network dynamics computationally with relative ease and over wide parameter ranges. Thereby, we identify some qualitative features that parallel or are distinguished from those of other neuronal systems; e.g., we discuss how bursting here differs from that in some classical models.
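
The two-compartment structure can be summarized schematically: somatic and dendritic membrane potentials each obey a current-balance equation, with the electrotonic coupling current g_c(V_d - V_s) scaled by the fraction of membrane area assigned to each compartment (p for the soma, 1 - p for the dendrite). This is a schematic of the model class described above, under assumed notation, not a transcription of the paper's equations or parameter values.

```latex
C_m \frac{dV_s}{dt} = -I_{\mathrm{leak}} - I_{\mathrm{Na}} - I_{\mathrm{K\text{-}DR}}
  + \frac{g_c}{p}\,(V_d - V_s) + \frac{I_s}{p}
\qquad
C_m \frac{dV_d}{dt} = -I_{\mathrm{leak}} - I_{\mathrm{Ca}} - I_{\mathrm{K\text{-}AHP}} - I_{\mathrm{K\text{-}C}}
  + \frac{g_c}{1-p}\,(V_s - V_d) + \frac{I_d}{1-p}
```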


Subject(s)
Dendrites/physiology; Hippocampus/physiology; Neural Networks, Computer; Pyramidal Cells/physiology; Animals; Membrane Potentials/physiology; Time Factors
13.
Am J Dis Child ; 145(7): 779-81, 1991 Jul.
Article in English | MEDLINE | ID: mdl-2058610

ABSTRACT

As part of a national telephone survey regarding health events associated with out-of-home child care, data regarding poisonings and injuries were collected. Of 171 reported poisonings, none occurred during out-of-home child care. The rate of injury during out-of-home child care was 1.69 per 100,000 child-hours compared with 2.66 for home care. Overall injury rates were slightly higher for children who attended out-of-home child care than for those who did not. This occurred because children who attended out-of-home child care had a higher injury rate during home care than did the children who did not attend out-of-home child care at all. Although out-of-home child care may carry an increased risk of infectious disease relative to home care, it does not appear to carry an increased risk of injury and, in fact, may confer a lower risk.


Subject(s)
Accidents, Home/statistics & numerical data; Child Care/statistics & numerical data; Poisoning/epidemiology; Wounds and Injuries/epidemiology; Accidents/statistics & numerical data; Child Day Care Centers/statistics & numerical data; Child, Preschool; Humans; Infant; Prevalence; United States/epidemiology
14.
Pediatrics ; 87(1): 62-9, 1991 Jan.
Article in English | MEDLINE | ID: mdl-1984620

ABSTRACT

The risk of respiratory and other illnesses in children (age groups: 6 weeks through 17 months, 18 through 35 months, and 36 through 59 months) in various types of day-care facilities was studied. Children considered exposed to day care were those who were enrolled in day care with at least one unrelated child for at least 10 hours per week in each of the 4 weeks before the interview; unexposed children were not enrolled in any regular child care with unrelated children and did not have siblings younger than 5 years of age receiving regular care with unrelated children. Although an increased risk of respiratory illness was associated with attending day care for children in all three age groups, this risk was statistically significant only for children 6 weeks through 17 months of age (odds ratio = 1.6; 95% confidence interval = 1.1 to 2.4) and children 18 through 35 months of age who had no older siblings (odds ratio = 3.4; 95% confidence interval = 2.0 to 6.0). In contrast, day-care attendance was not associated with an increased risk of respiratory illness in children 18 through 35 months of age with older siblings (odds ratio = 1.0). For children aged 6 weeks through 17 months, the exposure to older siblings was associated with an increased risk of respiratory illness; however, for children aged 36 through 59 months, older siblings were protective against respiratory illness. In addition, for the children in each age group currently in day care, increased duration of past exposure to day care was associated with a decreased risk of respiratory illness.(ABSTRACT TRUNCATED AT 250 WORDS)


Subject(s)
Child Day Care Centers; Respiratory Tract Infections/epidemiology; Chickenpox/epidemiology; Child, Preschool; Communicable Diseases/epidemiology; Family; Humans; Incidence; Infant; Regression Analysis; Risk Factors; Socioeconomic Factors; Surveys and Questionnaires; United States/epidemiology
15.
Am J Epidemiol ; 130(6): 1187-98, 1989 Dec.
Article in English | MEDLINE | ID: mdl-2556026

ABSTRACT

Between May 25 and July 5, 1986, an epidemic of acute hemorrhagic conjunctivitis affected an estimated 47% of the population on American Samoa. Coxsackievirus A24 variant was isolated from 18 of 22 patients. This is the first documented outbreak of acute hemorrhagic conjunctivitis due to coxsackievirus A24 variant outside of Southeast Asia and the Indian subcontinent. When this outbreak was compared with an outbreak on the island in 1981-1982 caused by enterovirus 70, conjunctival hemorrhage or injection and the severity of hemorrhage were less prevalent among cases in 1986, while upper respiratory and systemic symptoms were more common. Residents of traditional housing had significantly higher attack rates (48%) than residents of government housing (23%). Serum specimens collected from the residents of Samoa in 1985, before the outbreak, unexpectedly revealed the presence of neutralizing antibodies against coxsackievirus A24 variant. The presence of these antibodies correlated with protection against coxsackievirus A24 variant infection in this outbreak.


Subject(s)
Conjunctivitis, Acute Hemorrhagic/epidemiology; Coxsackievirus Infections; Disease Outbreaks/statistics & numerical data; Adolescent; Adult; Child; Child, Preschool; Conjunctivitis, Acute Hemorrhagic/microbiology; Conjunctivitis, Acute Hemorrhagic/physiopathology; Enterovirus/isolation & purification; Epidemiologic Methods; Female; Housing; Humans; Independent State of Samoa; Infant; Male; Middle Aged; Recurrence; Seasons
16.
Ann Neurol ; 26(5): 592-600, 1989 Nov.
Article in English | MEDLINE | ID: mdl-2817835

ABSTRACT

To determine whether neurological and neuropsychological abnormalities are associated with clinical manifestations of human immunodeficiency virus type 1 (HIV-1) infection in men who do not have acquired immunodeficiency syndrome (AIDS), we performed a historical prospective and cross-sectional study. One hundred HIV-1 seropositive homosexual or bisexual men, of whom 26 had AIDS-related complex, 31 had generalized lymphadenopathy, and 43 had no signs or symptoms of HIV-1 infection, and 157 HIV-1 seronegative men were enrolled from a cohort of 6,701 men who were originally recruited between 1978 and 1980 for studies of hepatitis B virus infection. Evaluation included medical history, physical examination, and neuropsychological tests. Of 26 HIV-1 seropositive subjects with AIDS-related complex, 11 (42%) reported neurological, cognitive, or affective symptoms compared with 30 (19%) of 157 HIV-1 seronegative subjects (relative risk = 2.2, p = 0.02). On neuropsychological testing, subjects with AIDS-related complex performed at a significantly lower level than the HIV-1 seronegative group (p = 0.001). A significantly higher percentage of subjects with AIDS-related complex (8[31%]of 26) than HIV-1 seronegative subjects (19 [12%] of 157) had abnormal results on two or more neuropsychological tests (rate ratio = 2.5, p = 0.03). Symptoms and impairment on neuropsychological tests were correlated only within the group who had AIDS-related complex. Subjects with generalized lymphadenopathy and subjects who had no signs or symptoms of HIV-1 infection were not different from HIV-1 seronegative subjects with respect to symptoms or performance on neuropsychological tests.(ABSTRACT TRUNCATED AT 250 WORDS)


Subject(s)
AIDS Dementia Complex/physiopathology; Acquired Immunodeficiency Syndrome/complications; HIV Seropositivity/physiopathology; AIDS Dementia Complex/immunology; Acquired Immunodeficiency Syndrome/immunology; Acquired Immunodeficiency Syndrome/physiopathology; Adult; HIV Antibodies; HIV Seropositivity/immunology; HIV Seropositivity/psychology; Humans; Male; Neuropsychological Tests
17.
Lancet ; 2(8669): 961-5, 1989 Oct 21.
Article in English | MEDLINE | ID: mdl-2571872

ABSTRACT

A 32-nm small round structured virus (SRSV), possibly related to the Snow Mountain agent (SMA), was implicated as the cause of recurrent outbreaks of gastroenteritis on a cruise ship. There was no identifiable relation to food or water consumption, but the risk of gastroenteritis among passengers who had shared toilet facilities was twice that of those who had a private bathroom and the rate of illness was related to the number of passengers sharing a communal restroom (ie, with one or more toilets): contaminated bathrooms may be an important vehicle for person-to-person spread of this enteric agent. In each cabin, index patients who had vomited in their cabins were more likely to have had cabinmates who subsequently became ill than were index patients who had not vomited. These epidemiological findings implicate vomitus in the transmission of viral gastroenteritis and they are consistent with the transmission of viral agents by airborne droplets or person-to-person contact. New strategies for prevention of viral gastroenteritis should include protection against environmental contamination by viruses in airborne droplets or vomitus.


Subject(s)
Disease Outbreaks; Gastroenteritis/etiology; Travel; Virus Diseases/transmission; Acute Disease; Case-Control Studies; Florida; Gastroenteritis/epidemiology; Humans; Norwalk virus/isolation & purification; Recurrence; Risk Factors; Ships; Surveys and Questionnaires; Viruses, Unclassified/isolation & purification; Vomiting/microbiology
18.
MMWR CDC Surveill Summ ; 38(1): 1-21, 1989 Aug.
Article in English | MEDLINE | ID: mdl-2505046

ABSTRACT

The primary purpose of the annual report on rabies surveillance is to assist local and state public health officials in the planning of rabies control programs and to guide health professionals in evaluating the need for rabies postexposure prophylaxis in patients who are exposed to animals that may be rabid. In 1988, a total of 4,724 cases of animal rabies were reported by 47 states, the District of Columbia, and Puerto Rico, similar to the total (4,729) for 1987. No human cases of rabies were reported. The South Atlantic, South Central, North Central, and Middle Atlantic states reported 81% of the cases. Pennsylvania, Texas, California, Maryland, and Virginia each reported over 300 rabid animals. Delaware (61 cases), New Mexico (15), Alaska (34), Connecticut (8), and South Carolina (127) each reported an increase in animal rabies cases greater than or equal to 100% in 1988 compared with 1987. Smaller but significant increases also were reported from Florida (66% increase), Pennsylvania (68%), and Georgia (40%). Eighty-eight percent of rabies cases were in wild animals, and 12% were in domestic animals. Skunks, raccoons, and bats accounted for 82% of all rabid animals. Cats became the most commonly reported domestic species for the first time since reporting to CDC began in 1960. The most effective methods of reducing the number of people exposed to rabies are to educate the public to avoid unfamiliar, especially wild, animals and to vaccinate susceptible pets against rabies. Rabies vaccination programs should target cats as well as dogs. Two cases of imported canine rabies emphasized the need to educate travelers of the risk of canine rabies in developing countries. Caution should be used when pets are imported from these countries.


Subject(s)
Rabies/veterinary; Animals; Animals, Domestic; Animals, Wild; Canada; Humans; Mexico; Population Surveillance; Puerto Rico; Rabies/epidemiology; United States
19.
N Engl J Med ; 320(21): 1372-6, 1989 May 25.
Article in English | MEDLINE | ID: mdl-2716783

ABSTRACT

Between January 12 and February 7, 1987, an outbreak of gastroenteritis affected an estimated 13,000 people in a county of 64,900 residents in western Georgia. Cryptosporidium oocysts were identified in the stools of 58 of 147 patients with gastroenteritis (39 percent) tested during the outbreak. Studies for bacterial, viral, and other parasitic pathogens failed to implicate any other agent. In a random telephone survey, 299 of 489 household members exposed to the public water supply (61 percent) reported gastrointestinal illness, as compared with 64 of 322 (20 percent) who were not exposed (relative risk, 3.1; 95 percent confidence interval, 2.4 to 3.9). The prevalence of IgG to cryptosporidium was significantly higher among exposed respondents to the survey who had become ill than among nonresident controls. Cryptosporidium oocysts were identified in samples of treated public water with use of a monoclonal-antibody test. Although the sand-filtered and chlorinated water system met all regulatory-agency quality standards, sub-optimal flocculation and filtration probably allowed the parasite to pass into the drinking-water supply. Low-level cryptosporidium infection in cattle in the watershed and a sewage overflow were considered as possible contributors to the contamination of the surface-water supply. We conclude that current standards for the treatment of public water supplies may not prevent the contamination of drinking water by cryptosporidium, with consequent outbreaks of cryptosporidiosis.
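
The relative risk and confidence interval quoted above can be checked directly from the survey counts in the abstract (299/489 exposed versus 64/322 unexposed), using the standard large-sample interval for a log relative risk:

```python
import math

ill_exposed, n_exposed = 299, 489      # exposed to the public water supply
ill_unexposed, n_unexposed = 64, 322   # not exposed

rr = (ill_exposed / n_exposed) / (ill_unexposed / n_unexposed)
se_log_rr = math.sqrt(1/ill_exposed - 1/n_exposed + 1/ill_unexposed - 1/n_unexposed)
lo, hi = (rr * math.exp(s * 1.96 * se_log_rr) for s in (-1, 1))
print(f"RR = {rr:.1f} (95% CI {lo:.1f} to {hi:.1f})")   # RR = 3.1 (95% CI 2.4 to 3.9)
```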


Subject(s)
Cryptosporidiosis/epidemiology; Disease Outbreaks; Water Microbiology; Water Pollution; Water Supply; Epidemiologic Methods; Filtration; Gastroenteritis/epidemiology; Georgia; Humans; Telephone
20.
JAMA ; 260(22): 3281-5, 1988 Dec 09.
Article in English | MEDLINE | ID: mdl-3184416

ABSTRACT

We reviewed national mortality data for 1973 through 1983 to assess the importance of diarrheal diseases as a cause of preventable childhood death in the United States. An average of 500 children aged 1 month to 4 years died each year with diarrhea reported as the cause of death. These diarrheal deaths were most common among children who were younger than 1 year of age, black, and living in the South, and were most common during the winter. In Mississippi, review of fatal cases of diarrhea identified maternal factors--black race, young age, unmarried status, low level of education, and little prenatal care--to be most associated with diarrheal death in the child. Fifty percent of these deaths occurred after a child had reached a medical facility. Our findings suggest that diarrheal deaths may be preventable and that targeted interventions could contribute to improved child survival in the United States.


Subject(s)
Diarrhea/mortality; Age Factors; Birth Weight; Child, Preschool; Diarrhea/epidemiology; Diarrhea/prevention & control; Diarrhea, Infantile/mortality; Female; Humans; Infant; Male; Risk Factors; Seasons; Sex Factors; Socioeconomic Factors; United States/epidemiology