Results 1 - 20 of 75
1.
Fish Shellfish Immunol ; 30(6): 1209-22, 2011 Jun.
Article in English | MEDLINE | ID: mdl-21463691

ABSTRACT

The inception of ecological immunology has led to an increase in the number of studies investigating the impact of environmental stressors on host immune defence mechanisms. This in turn has led to an increased understanding of the importance of invertebrate groups for immunological research. This review discusses the advances made within marine invertebrate ecological immunology over the past decade. By surveying the environmental stressors tested, the immune parameters typically investigated, and the species that have received the greatest level of investigation, this review provides a critical assessment of the field of marine invertebrate ecological immunology. In highlighting the methodologies employed within this field, the review outlines our current inability to understand the true ecological significance of any immune dysfunction caused by environmental stressors. Additionally, a number of examples are provided in which studies successfully demonstrate a measure of immunocompetence through alterations in disease resistance and organism survival to a realized pathogenic threat. Consequently, this review highlights the potential to advance our current understanding of the ecological and evolutionary significance of environmental-stressor-related immune dysfunction. Furthermore, the potential for the advancement of our understanding of the immune system of marine invertebrates, through the incorporation of newly emerging and novel molecular techniques, is emphasized.


Subject(s)
Ecosystem , Immunity, Cellular/immunology , Immunity, Humoral/immunology , Immunity, Innate/immunology , Invertebrates/immunology , Stress, Physiological/immunology , Animals , Antimicrobial Cationic Peptides/immunology , Marine Biology , Oceans and Seas , Phagocytosis/immunology , Respiratory Burst/immunology , Species Specificity
2.
J Dairy Sci ; 89(9): 3408-12, 2006 Sep.
Article in English | MEDLINE | ID: mdl-16899673

ABSTRACT

Dairy cattle with clinical mastitis caused by Escherichia coli exhibit a wide range of disease severity, from mild, with only local inflammatory changes of the mammary gland, to severe, with significant systemic derangement. The present study was designed to examine the relationship between serotype and virulence genes of E. coli mastitis isolates, different levels of systemic disease severity, and farm from which the E. coli strain was obtained. One hundred twenty-three E. coli milk isolates were obtained from cows with clinical mastitis of varying systemic disease severity from 6 different farms. No predominant serotype was identified by farm or by systemic disease severity; however, the most frequent serotype, O158:NM (n = 3), was isolated from cows in the moderate severity group. Virulence genes evaluated were identified infrequently and were not associated with systemic disease severity. Evaluation of genetic similarity showed no clustering assigned by farm or mastitis severity based on systemic disease signs. We concluded that a high degree of genotypic variability is characteristic of E. coli strains causing clinical mastitis within and between different farms and systemic severity groups, and that specific cow factors probably play a more important role in determining systemic disease severity.


Subject(s)
Escherichia coli Infections/veterinary , Escherichia coli/genetics , Escherichia coli/pathogenicity , Mastitis, Bovine/microbiology , Virulence Factors/genetics , Animals , Cattle , DNA Primers/chemistry , Escherichia coli/classification , Escherichia coli/isolation & purification , Escherichia coli Infections/microbiology , Female , Genotype , Milk/microbiology , Polymerase Chain Reaction/methods , Serotyping/methods , Serotyping/veterinary , Severity of Illness Index
3.
J Am Vet Med Assoc ; 227(1): 132-8, 2005 Jul 01.
Article in English | MEDLINE | ID: mdl-16013549

ABSTRACT

OBJECTIVE: To compare the frequency of isolation, genotypes, and in vivo production of major lethal toxins of Clostridium perfringens in adult dairy cows affected with hemorrhagic bowel syndrome (HBS) versus left-displaced abomasum (LDA). DESIGN: Case-control study. ANIMALS: 10 adult dairy cattle with HBS (cases) and 10 adult dairy cattle with LDA matched with cases by herd of origin (controls). PROCEDURE: Samples of gastrointestinal contents were obtained from multiple sites during surgery or necropsy examination. Each sample underwent testing for anaerobic bacteria by use of 3 culture methods. The genotype of isolates of C perfringens was determined via multiplex polymerase chain reaction assay. Major lethal toxins were detected by use of an ELISA. Data were analyzed with multivariable logistic regression and chi-square analysis. RESULTS: C perfringens type A and type A with the beta2 gene (A + beta2) were the only genotypes isolated. Isolation of C perfringens type A and type A + beta2 was 6.56 and 3.3 times as likely, respectively, in samples from cattle with HBS as in samples from cattle with LDA. Alpha toxin was detected in 7 of 36 samples from cases and in 0 of 32 samples from controls. Beta2 toxin was detected in 9 of 36 samples from cases and 0 of 36 samples from controls. CONCLUSIONS AND CLINICAL RELEVANCE: C perfringens type A and type A + beta2 can be isolated from the gastrointestinal tract with significantly greater odds in cattle with HBS than in herdmates with LDA. Alpha and beta2 toxins were detected in samples from cows with HBS but not from cows with LDA.


Subject(s)
Bacterial Toxins/isolation & purification , Cattle Diseases/microbiology , Clostridium Infections/veterinary , Clostridium perfringens/isolation & purification , Gastrointestinal Hemorrhage/veterinary , Abomasum/abnormalities , Animals , Bacterial Toxins/biosynthesis , Bacterial Toxins/classification , Case-Control Studies , Cattle , Clostridium Infections/microbiology , Clostridium perfringens/classification , Clostridium perfringens/metabolism , Gastrointestinal Hemorrhage/microbiology , Genotype , Logistic Models , Multivariate Analysis , Phylogeny
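The odds ratios reported above (6.56 and 3.3) come from comparing positive and negative samples in a 2x2 case-control table. A minimal sketch of that calculation, including the Haldane-Anscombe 0.5 correction that a zero cell makes necessary (the abstract's control group had 0 alpha-toxin-positive samples); the cell counts below are illustrative, since the abstract does not report the underlying tables:

```python
def odds_ratio(a, b, c, d, correction=0.5):
    """Odds ratio for a 2x2 table:
         a = case samples positive,    b = case samples negative,
         c = control samples positive, d = control samples negative.
    Applies the Haldane-Anscombe 0.5 correction when any cell is zero,
    which would otherwise make the ratio undefined."""
    if 0 in (a, b, c, d):
        a, b, c, d = a + correction, b + correction, c + correction, d + correction
    return (a * d) / (b * c)

# Illustrative cells only: 7/36 case samples positive, 0/32 control
# samples positive, as for alpha toxin in the abstract.
print(round(odds_ratio(7, 29, 0, 32), 1))  # 16.5 with the correction
```

With no zero cell the correction is skipped and the ratio is the plain cross-product (a*d)/(b*c).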
4.
J Exp Bot ; 53(371): 1163-76, 2002 May.
Article in English | MEDLINE | ID: mdl-11971927

ABSTRACT

Barley traits related to salt tolerance are mapped in a population segregating for a dwarfing gene associated with salt tolerance. Twelve quantitative trait loci (QTLs) were detected for seven seedling traits in doubled haploids from the spring barley cross Derkado x B83-12/21/5 when given saline treatment in hydroponics. The location of QTLs for seedling growth stage (leaf appearance rate), stem weight prior to elongation, and tiller number are reported for the first time. In addition, four QTLs were found for the mature plant traits grain nitrogen and plot yield. In total, seven QTLs are co-located with the dwarfing genes sdw1, on chromosome 3H, and ari-e.GP, on chromosome 5H, including seedling leaf response (SGa) to gibberellic acid (GA(3)). QTLs controlling the growth of leaves (GS2) on chromosomes 2H and 3H and emergence of tillers (TN2) and grain yield were independent of the dwarfing genes. Field trials were grown in eastern Scotland and England to estimate yield and grain composition. A genetic map was used to compare the positions of QTLs for seedling traits with the location of QTLs for the mature plant traits. The results are discussed in relation to the study of barley physiology and the location of genes for dwarf habit and responses to GA.


Subject(s)
Chromosome Mapping/methods , Gibberellins/pharmacology , Hordeum/genetics , Quantitative Trait, Heritable , Adaptation, Physiological/drug effects , Adaptation, Physiological/genetics , Genetic Markers , Genotype , Hordeum/drug effects , Hordeum/growth & development , Phenotype , Plant Leaves/drug effects , Plant Leaves/genetics , Plant Leaves/growth & development , Plant Roots/drug effects , Plant Roots/genetics , Plant Roots/growth & development , Plant Shoots/drug effects , Plant Shoots/genetics , Plant Shoots/growth & development , Salts/pharmacology
5.
Health Care Manag Sci ; 4(4): 281-7, 2001 Dec.
Article in English | MEDLINE | ID: mdl-11718460

ABSTRACT

Using the Diagnostic Cost Group (DCG) model developed from a national sample, we examine biased selection among one fee-for-service (FFS) plan, one preferred provider organization, and several health maintenance organizations (HMOs) in Massachusetts. The proportions of enrollees in low-risk groups are higher in the HMO plans and lower in the FFS plan. The average age in the FFS plan is 9 years greater than that in the HMO plans. Actual premiums are not consistent with risk levels among HMO plans, resulting in gains in some HMO plans and losses in others as high as 20% compared to expected expenses as computed by the DCG model.


Subject(s)
Fee-for-Service Plans/statistics & numerical data , Health Benefit Plans, Employee/statistics & numerical data , Health Maintenance Organizations/statistics & numerical data , Insurance Selection Bias , State Government , Costs and Cost Analysis , Diagnosis-Related Groups/economics , Fee-for-Service Plans/economics , Fees and Charges , Health Benefit Plans, Employee/economics , Health Maintenance Organizations/economics , Massachusetts , Models, Statistical , Risk Adjustment , United States
6.
Inquiry ; 38(3): 299-309, 2001.
Article in English | MEDLINE | ID: mdl-11761357

ABSTRACT

This paper explores explanations for why few private employers have adopted formal risk adjustment. The lack of data, the challenges of using highly imperfect signals, and the absence of market power are not compelling explanations. Alternative strategies that reduce selection problems are clearly important. The central argument is that U.S. health markets are not in equilibrium, but rather are changing rapidly. Since many agents (consumers, employers, health plans, and providers) do not currently demand formal risk adjustment, it is not surprising that adoption has been slow. Recent changes in health plan markets may change the demand and accelerate future adoption.


Subject(s)
Health Benefit Plans, Employee/economics , Managed Competition/economics , Private Sector/economics , Risk Adjustment/statistics & numerical data , Actuarial Analysis , Data Collection , Diagnosis-Related Groups/economics , Diffusion of Innovation , Fees and Charges , Health Care Sector/trends , Humans , Insurance Selection Bias , Social Change , United States
7.
Health Serv Res ; 36(6 Pt 2): 180-93, 2001 Dec.
Article in English | MEDLINE | ID: mdl-16148968

ABSTRACT

OBJECTIVE: To examine and evaluate models that use inpatient encounter data and outpatient pharmacy claims data to predict future health care expenditures. DATA SOURCES/STUDY DESIGN: The study group was the privately insured under-65 population in the 1997 and 1998 MEDSTAT MarketScan® Research Database. Pharmacy and disease profiles, created from pharmacy claims and inpatient encounter data, respectively, were used separately and in combination to predict each individual's subsequent-year health care expenditures. PRINCIPAL FINDINGS: The inpatient-diagnosis model predicts well for the low-hospitalization under-65 populations, explaining 8.4 percent of future individual total cost variation. The pharmacy-based and inpatient-diagnosis models perform comparably overall, with pharmacy data better able to split off a group of truly low-cost people and inpatient diagnoses better able to find a small group with extremely high future costs. The model that uses both kinds of data performed significantly better than either model alone, with an R2 value of 11.8 percent. CONCLUSIONS: Comprehensive pharmacy and inpatient diagnosis classification systems are each helpful for discriminating among people according to their expected costs. Properly organized and in combination, these data are promising predictors of future costs.


Subject(s)
Drug Utilization/statistics & numerical data , Health Care Costs/trends , Health Expenditures/trends , Health Status Indicators , Hospitalization/statistics & numerical data , Models, Econometric , Risk Assessment/methods , Adolescent , Adult , Child , Child, Preschool , Diagnosis-Related Groups/statistics & numerical data , Drug Prescriptions/classification , Drug Prescriptions/statistics & numerical data , Female , Forecasting/methods , Humans , Infant , Infant, Newborn , Insurance Claim Review , Male , Middle Aged , Pharmacies/economics , Pharmacies/statistics & numerical data , United States
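The 8.4 percent and 11.8 percent figures above are R2 values: the share of individual cost variation explained by a model's predictions. A minimal sketch of that statistic on toy data (the cost values are invented, not drawn from the MEDSTAT sample):

```python
def r_squared(actual, predicted):
    """Fraction of variance in actual costs explained by predictions:
    R2 = 1 - SS_residual / SS_total."""
    mean = sum(actual) / len(actual)
    ss_tot = sum((y - mean) ** 2 for y in actual)
    ss_res = sum((y - p) ** 2 for y, p in zip(actual, predicted))
    return 1 - ss_res / ss_tot

# Toy next-year costs and model predictions (hypothetical values).
actual    = [500, 800, 1200, 9000, 300]
predicted = [700, 900, 1000, 6000, 600]
print(round(r_squared(actual, predicted), 3))  # 0.835
```

On real individual-level cost data, where a few very expensive people dominate the variance, values near 0.1 are typical, which is why 11.8 percent counts as a strong result here.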
8.
Health Serv Res ; 36(6 Pt 2): 194-206, 2001 Dec.
Article in English | MEDLINE | ID: mdl-16148969

ABSTRACT

OBJECTIVE: To examine the value of two kinds of patient-level data (cost and diagnoses) for identifying a very small subgroup of a general population with high future costs that may be mitigated with medical management. DATA SOURCES: The study used the MEDSTAT MarketScan® Research Database, consisting of inpatient and ambulatory health care encounter records for individuals covered by employee-sponsored benefit plans during 1997 and 1998. STUDY DESIGN: Prior cost and a diagnostic cost group (DCG) risk model were each used with 1997 data to identify 0.5-percent-sized "top groups" of people most likely to be expensive in 1998. We compared the distributions of people, cost, and diseases commonly targeted for disease management for people in the two top groups and, as a benchmark, in the full population. PRINCIPAL FINDINGS: The prior-cost- and DCG-identified top groups overlapped by only 38 percent. Each top group consisted of people with high year-two costs and high rates of diabetes, heart failure, major lung disease, and depression. The DCG top group identified people who are both somewhat more expensive ($27,292 vs. $25,981) and more likely (49.4 percent vs. 43.8 percent) than the prior-cost top group to have at least one of the diseases commonly targeted for disease management. The overlap group average cost was $46,219. CONCLUSIONS: Diagnosis-based risk models are at least as powerful as prior cost for identifying people who will be expensive. Combined cost and diagnostic data are even more powerful and more operationally useful, especially because the diagnostic information identifies the medical problems that may be managed to achieve better outcomes and lower costs.


Subject(s)
Ambulatory Care/statistics & numerical data , Diagnosis-Related Groups/statistics & numerical data , Disease Management , Health Benefit Plans, Employee/statistics & numerical data , Health Care Costs/statistics & numerical data , Hospitalization/statistics & numerical data , Models, Econometric , Risk Assessment/methods , Adolescent , Adult , Aged , Ambulatory Care/economics , Benchmarking , Child , Child, Preschool , Chronic Disease/economics , Chronic Disease/therapy , Databases as Topic , Diagnosis-Related Groups/economics , Diagnosis-Related Groups/trends , Female , Forecasting/methods , Health Care Costs/trends , Hospitalization/economics , Humans , Infant , Infant, Newborn , Male , Middle Aged , Sensitivity and Specificity , United States
9.
Int J Technol Assess Health Care ; 16(3): 799-810, 2000.
Article in English | MEDLINE | ID: mdl-11028135

ABSTRACT

OBJECTIVES: Guidelines for colorectal cancer screening and surveillance in people at average risk and at increased risk have recently been published by the American Gastroenterological Association. The guidelines for the population at average risk were evaluated using cost-effectiveness analyses. METHODS: Since colorectal cancers primarily arise from precancerous adenomas, a state transition model of disease progression from adenomatous polyps was developed. Rather than assuming that polyps turn to cancer after a fixed interval (dwell time), such transitions were modeled to occur as an exponential function of the age of the polyps. Screening strategies included periodic fecal occult blood test, flexible sigmoidoscopy, double-contrast barium enema, and colonoscopy. Screening costs in 1994 dollars were estimated using Medicare and private claims data, and clinical parameters were based upon published studies. RESULTS: Cost per life-year saved was $12,636 for flexible sigmoidoscopy every 5 years and $14,394 for annual fecal occult blood testing. The assumption made for polyp dwell time critically affected the attractiveness of alternative screening strategies. CONCLUSIONS: Sigmoidoscopy every 5 years and annual fecal blood testing were the two most cost-effective strategies, but with low compliance, occult blood testing was less cost-effective. Lowering colonoscopy costs greatly improved the cost-effectiveness of colonoscopy every 10 years.


Subject(s)
Colorectal Neoplasms/diagnosis , Colorectal Neoplasms/economics , Mass Screening/economics , Aged , Aged, 80 and over , Cost-Benefit Analysis , Decision Trees , Disease Progression , Female , Humans , Male , Mass Screening/methods , Middle Aged , Population Surveillance , Practice Guidelines as Topic , Risk Factors
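The "cost per life-year saved" figures quoted above are incremental cost-effectiveness ratios: extra cost of a screening strategy over the comparator, divided by the life-years it adds. A minimal sketch under hypothetical inputs (the abstract reports only the final ratios, $12,636 and $14,394, not the per-person costs and life-years behind them):

```python
def cost_per_life_year(strategy_cost, baseline_cost,
                       strategy_life_years, baseline_life_years):
    """Incremental cost-effectiveness ratio: dollars per life-year saved,
    comparing a screening strategy against a baseline (e.g. no screening)."""
    extra_cost = strategy_cost - baseline_cost
    extra_life_years = strategy_life_years - baseline_life_years
    return extra_cost / extra_life_years

# Hypothetical per-person figures: screening adds $600 of lifetime cost
# and 0.05 life-years over no screening.
print(round(cost_per_life_year(1500, 900, 20.05, 20.0)))  # 12000
```

Because the denominator is a small difference in life expectancy, the ratio is sensitive to modeling assumptions such as the polyp dwell time the abstract flags.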
10.
J Exp Bot ; 51(342): 9-17, 2000 Jan.
Article in English | MEDLINE | ID: mdl-10938791

ABSTRACT

The development of new barleys tolerant of abiotic and biotic stresses is an essential part of the continued improvement of the crop. The domestication of barley, as in many crops, resulted in a marked truncation of the genetical variation present in wild populations. This process is significant to agronomists and scientists because a lack of allelic variation will prevent the development of adapted cultivars and hinder the investigation of the genetic mechanisms underlying performance. Wild barley would be a useful source of new genetic variation for abiotic stress tolerance if surveys identify appropriate genetic variation and the development of marker-assisted selection allows efficient manipulation in cultivar development. There are many wild barley collections from all areas of its natural distribution, but the largest are derived from the Mediterranean region. The results of a range of assays designed to explore abiotic stress tolerance in barley are reported in this paper. The assays included: sodium chloride uptake in wild barley and a mapping population, effects for delta 13C and plant dry weight in wheat aneuploids, effects of photoperiod and vernalization in wild barley, and measurements of root length in wild barley given drought and nitrogen starvation treatments in hydroponic culture. There are examples of the use of wild barley in breeding programmes, for example, as a source of new disease resistance genes, but the further exploration of the differences between wild barley and cultivars is hampered by the lack of good genetic maps. In parallel to the need for genetic studies there is also a need for the development of good physiological models of crop responses to the environment. Given these tools, wild barley offers the prospect of a 'goldmine' of untapped genetic reserves.


Subject(s)
Crops, Agricultural/genetics , Hordeum/genetics , Chlorides/metabolism , Hordeum/metabolism , Sodium/metabolism
11.
J Exp Bot ; 51(342): 19-27, 2000 Jan.
Article in English | MEDLINE | ID: mdl-10938792

ABSTRACT

This article presents some current thinking and objectives in applying molecular markers to abiotic stress tolerance. Barley has been chosen for study because it is an important crop species as well as a model for genetic and physiological studies; its well-studied genetics and physiology make it an excellent candidate in which to devise more efficient breeding methods. Abiotic stress work on cultivated gene pools of small grain cereals frequently shows that adaptive and developmental genes are strongly associated with responses. Developmental genes have strong pleiotropic effects on a number of performance traits, not just abiotic stresses. One concern is that much of the genetic variation for improving abiotic stress tolerance has been lost during domestication, selection and modern breeding, leaving pleiotropic effects of the selected genes for development and adaptation. Such genes are critical in matching cultivars to their target agronomic environment, and since there is little leverage in changing these, other sources of variation may be required. In barley, and many other crops, greater variation for abiotic stresses exists in primitive landraces and related wild species gene pools. Wild barley, Hordeum spontaneum C. Koch, is the progenitor of cultivated barley, Hordeum vulgare L., and is easily hybridized to H. vulgare. Genetic fingerprinting of H. spontaneum has revealed genetic marker associations with site-of-origin ecogeographic factors and also experimentally imposed stresses. Genotypes and collection sites have been identified which show the desired variation for particular stresses. Doubled haploid and other segregating populations, including landrace derivatives, have been used to map genetically the loci involved. These data can be used in molecular breeding approaches to improve the drought tolerance of barley. One strategy involves screening for genetic markers and physiological traits for drought tolerance, and for the associated problem of drought-relief-induced mildew susceptibility in naturally droughted fields of North Africa.


Subject(s)
Adaptation, Physiological , Hordeum/physiology , DNA Fingerprinting , Genes, Plant , Genetic Variation , Hordeum/genetics , Quantitative Trait, Heritable
12.
J Exp Bot ; 51(342): 41-50, 2000 Jan.
Article in English | MEDLINE | ID: mdl-10938794

ABSTRACT

To integrate the complex physiological responses of plants to stress, natural abundances (delta) of the stable isotope pairs 15N/14N and 13C/12C were measured in 30 genotypes of wild barley (Hordeum spontaneum C. Koch.). These accessions, originating from ecologically diverse sites, were grown in a controlled environment and subjected to mild, short-term drought or N-starvation. Increases in total dry weight were paralleled by less negative delta 13C in shoots and, in unstressed and droughted plants, by less negative whole-plant delta 13C. Root delta 15N was correlated negatively with total dry weight, whereas shoot and whole-plant delta 15N were not correlated with dry weight. The difference in delta 15N between shoot and root varied with stress in all genotypes. Shoot-root delta 15N may be a more sensitive indicator of stress response than shoot, root or whole-plant delta 15N alone. Among the potentially most productive genotypes, the most stress-tolerant had the most negative whole-plant delta 15N, whether the stress was drought or N-starvation. In common, controlled experiments, genotypic differences in whole-plant delta 15N may reflect the extent to which N can be retained within plants when stressed.


Subject(s)
Hordeum/physiology , Carbon Isotopes , Genotype , Hordeum/genetics , Nitrogen Isotopes , Plant Roots , Plant Shoots
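The delta values in the abstract above follow the standard per-mil natural-abundance convention: delta = (R_sample / R_standard - 1) x 1000, where R is the heavy-to-light isotope ratio. A sketch of that formula; the reference ratios used here are the commonly cited standards (atmospheric N2 for 15N/14N, VPDB for 13C/12C), which are assumptions not stated in the abstract itself:

```python
def delta_per_mil(r_sample, r_standard):
    """Per-mil deviation of a sample's isotope ratio from a reference
    standard; negative values mean depletion in the heavy isotope."""
    return (r_sample / r_standard - 1) * 1000

# Assumed reference ratios (standard values, not from the abstract):
R_AIR_N2 = 0.0036765   # 15N/14N of atmospheric N2
R_VPDB_C = 0.0111802   # 13C/12C of the VPDB standard

# Illustrative sample ratio for a C3 plant, which typically sits near
# -28 per mil for delta 13C.
print(round(delta_per_mil(0.0108672, R_VPDB_C), 1))  # -28.0
```

"Less negative delta 13C" in the abstract thus means the sample ratio moved closer to the standard, the usual signature of reduced discrimination against 13C under water stress.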
13.
Health Policy Plan ; 15(1): 76-84, 2000 Mar.
Article in English | MEDLINE | ID: mdl-10731238

ABSTRACT

This paper assesses the demand effects of a cost recovery and quality improvement pilot study conducted in Niger in 1993. Direct user charges and indirect insurance payments were implemented in government health care facilities in different parts of the country, and were preceded or accompanied by quality changes in these facilities. Decision-making by patients is modelled as a three-stage process of reporting an illness, seeking treatment and choice of provider; and multinomial nested logit techniques are used to estimate the parameters of the decision-tree. Overall, the results give a reasonably favourable impression of the policy changes. In neither case is there evidence of serious reductions in access or increases in cost. Particularly notable is that despite an increase in formal user charges, the observed decline in rates of visits is statistically insignificant, suggesting the success of measures to improve quality of health care in public facilities. The observed increase in the probability of formal visits in the district with indirect payments is also striking. Both contrast with the control region of Illela, where neither user charges were introduced nor were any efforts made to improve quality. The data suggest that higher utilization of formal care, probably due to improvements in quality, outweighed the decrease in utilization that may have come about due to introduction of cost recovery, so that the net effect of the policy changes was an increase in utilization. Quality considerations appear to be important in ensuring the long-term success of cost sharing.


Subject(s)
Health Care Costs , Health Policy , Health Services Needs and Demand , Quality of Health Care , Data Collection , Drug Prescriptions , Income , Logistic Models , Niger , Probability
14.
Health Care Financ Rev ; 21(3): 7-28, 2000.
Article in English | MEDLINE | ID: mdl-11481769

ABSTRACT

The Diagnostic Cost Group Hierarchical Condition Category (DCG/HCC) payment models summarize the health care problems and predict the future health care costs of populations. These models use the diagnoses generated during patient encounters with the medical delivery system to infer which medical problems are present. Patient demographics and diagnostic profiles are, in turn, used to predict costs. We describe the logic, structure, coefficients and performance of DCG/HCC models, as developed and validated on three important databases (privately insured, Medicaid, and Medicare) with more than 1 million people each.


Subject(s)
Cost Allocation/methods , Diagnosis-Related Groups/economics , Health Expenditures/statistics & numerical data , Managed Care Programs/economics , Medicaid/economics , Medicare/economics , Models, Econometric , Adolescent , Adult , Aged , Child , Child, Preschool , Demography , Eligibility Determination , Female , Humans , Infant , Male , Middle Aged
15.
Health Care Financ Rev ; 21(3): 93-118, 2000.
Article in English | MEDLINE | ID: mdl-11481770

ABSTRACT

The Balanced Budget Act (BBA) of 1997 required HCFA to implement health-status-based risk adjustment for Medicare capitation payments for managed care plans by January 1, 2000. In support of this mandate, HCFA has been collecting inpatient encounter data from health plans since 1997. These data include diagnoses and other information that can be used to identify chronic medical problems that contribute to higher costs, so that health plans can be paid more when they care for sicker patients. In this article, the authors describe the risk-adjustment model HCFA is implementing in the year 2000, known as the Principal Inpatient Diagnostic Cost Group (PIPDCG) model.


Subject(s)
Capitation Fee/statistics & numerical data , Diagnosis-Related Groups/economics , Medicare Part C/economics , Models, Econometric , Risk Adjustment/economics , Adolescent , Adult , Aged , Centers for Medicare and Medicaid Services, U.S. , Child , Child, Preschool , Demography , Female , Humans , Infant , Infant, Newborn , Male , Medicaid/economics , Middle Aged , United States
16.
Mol Cell Probes ; 13(1): 61-5, 1999 Feb.
Article in English | MEDLINE | ID: mdl-10024434

ABSTRACT

Dichelobacter nodosus is the causative agent of ovine foot rot, a disease that is a constant economic burden for many Western sheep ranches. Vaccination is one method of treating foot rot. A higher and more specific immune response is observed when monovalent vaccines are used to treat foot rot, as compared to multivalent vaccines, which incorporate all 10 major New Zealand D. nodosus serogroups. There is no single assay for specifically identifying and grouping D. nodosus for the purpose of incorporating only the desired serogroup(s) in a vaccine. A polymerase chain reaction (PCR)-based assay was used to specifically identify and group D. nodosus from a foot rot lesion. Identification and grouping was determined by predicted fragment size analysis and nucleotide sequence information. The PCR approach vastly improves the accuracy in identifying and grouping D. nodosus from a foot rot lesion.


Subject(s)
Dichelobacter nodosus/isolation & purification , Foot Rot/microbiology , Gram-Negative Bacterial Infections/veterinary , Polymerase Chain Reaction , Sequence Analysis, DNA , Sheep Diseases/microbiology , Animals , Bacterial Vaccines , Dichelobacter nodosus/classification , Dichelobacter nodosus/genetics , Gram-Negative Bacterial Infections/microbiology , Serotyping , Sheep , Species Specificity , Vaccination/veterinary
17.
J Health Econ ; 17(5): 537-55, 1998 Oct.
Article in English | MEDLINE | ID: mdl-10185511

ABSTRACT

Reimbursement incentives influence both the intensity of services and who is treated when patients differ in severity of illness. The social optimum is compared to the private Cournot-Nash solution for three provider strategies: creaming (over-provision of services to low-severity patients), skimping (under-provision of services to high-severity patients), and dumping (the explicit avoidance of high-severity patients). Cost-based reimbursement results in overprovision of services (creaming) to all types of patients. Prospectively paid providers cream low-severity patients and skimp on high-severity ones. If there is dumping of high-severity patients, then there will also be skimping.


Subject(s)
Economic Competition , Patient Transfer/economics , Physician Incentive Plans/economics , Reimbursement, Incentive/statistics & numerical data , Fraud , Health Care Sector , Health Services Accessibility/economics , Health Services Misuse/economics , Health Services Needs and Demand/economics , Health Services Needs and Demand/statistics & numerical data , Insurance Coverage , Models, Statistical , Prospective Payment System/statistics & numerical data , Quality of Health Care , Severity of Illness Index , United States
18.
J Am Vet Med Assoc ; 212(11): 1751-6, 1998 Jun 01.
Article in English | MEDLINE | ID: mdl-9621884

ABSTRACT

OBJECTIVE: To identify clinical signs, physical examination findings, results of diagnostic tests, treatments administered, and clinical outcome of neonatal foals with enterocolitis associated with Clostridium perfringens infection. DESIGN: Retrospective study. ANIMALS: 54 neonatal foals. RESULTS: Most foals had acute onset of obtunded mentation, colic, or diarrhea and developed leukopenia, neutropenia, an abnormally high number of band neutrophils, toxic WBC, and hypoproteinemia within 24 hours after admission, despite high serum IgG concentrations (> 800 mg/dl). Abdominocentesis and abdominal radiography of some foals revealed exudative peritonitis and gaseous distention of the small and large intestine, respectively. Cytologic examination of feces revealed spores or gram-positive rods in 8 of 10 foals. The most common genotypes of C perfringens isolates were types A and C, alone or in combination. Treatment did not alter the mortality rate for most foals that had a positive culture for C perfringens type C. Of 54 foals, 29 (54%) that had C perfringens-associated enterocolitis died. Foals that had a culture that yielded C perfringens had higher sepsis scores, IgG concentrations, and mortality rates, compared with the overall hospital population of neonatal foals. CLINICAL IMPLICATIONS: Foals less than 7 days old that have enterocolitis associated with C perfringens infection, especially type C, have a guarded prognosis. Cytologic examination of feces to determine spore counts and detect rods may be a means for early identification of C perfringens infection. Polymerase chain reaction assays to determine genotype are important for designing preventive treatment regimens.


Subject(s)
Clostridium Infections/veterinary , Clostridium perfringens , Enterocolitis/veterinary , Horse Diseases/microbiology , Animals , Animals, Newborn , Anti-Bacterial Agents/therapeutic use , Bacteremia/microbiology , Clostridium Infections/diagnosis , Clostridium Infections/microbiology , Clostridium perfringens/classification , Clostridium perfringens/genetics , Clostridium perfringens/isolation & purification , Enterocolitis/diagnosis , Enterocolitis/microbiology , Feces/microbiology , Genotype , Horse Diseases/diagnosis , Horse Diseases/therapy , Horses , Intestine, Small/microbiology , Rectum/microbiology , Retrospective Studies
19.
J Am Vet Med Assoc ; 209(5): 962-6, 1996 Sep 01.
Article in English | MEDLINE | ID: mdl-8790550

ABSTRACT

OBJECTIVE: To determine the validity of a 5-antigen ELISA for detection of tuberculosis in cattle and Cervidae. DESIGN: Cross-sectional observational study. SAMPLE POPULATION: Serum samples collected from 5,304 cattle in 23 herds and 1,441 Cervidae in 12 herds. PROCEDURE: Discriminant analysis was used to determine the linear combination of antigens that accurately predicted the true Mycobacterium bovis infection status of the most animals. The resulting classification functions then were used to calculate the percentage of animals that were correctly classified (ie, sensitivity and specificity). The kappa statistic was calculated to evaluate different combinations of test results. RESULTS: Of the 23 cattle herds, 4 dairy and 2 beef herds were considered infected. Of the 12 Cervidae herds, 5 were considered infected. For cattle, the specificity and sensitivity of ELISA, using the discriminant function, were 56.4 and 65.6%, respectively. For Cervidae, the specificity and sensitivity of ELISA, using the discriminant function, were 78.6 and 70.0%, respectively. CLINICAL IMPLICATIONS: Results suggest that the 5-antigen ELISA would not be a good test for tuberculosis, especially in cattle, if used alone. However, when results of the ELISA and tuberculin test were interpreted in parallel, sensitivity of the combination was greater than sensitivity of either test alone. Similarly, when results of the 2 tests were interpreted in series, specificity of the combination was greater than specificity of either test alone.


Subject(s)
Antigens, Bacterial/blood , Cattle Diseases/diagnosis , Deer , Enzyme-Linked Immunosorbent Assay/veterinary , Tuberculin Test/veterinary , Tuberculosis, Bovine/diagnosis , Animals , Cattle , Cattle Diseases/blood , Cattle Diseases/epidemiology , Cross-Sectional Studies , Enzyme-Linked Immunosorbent Assay/methods , Enzyme-Linked Immunosorbent Assay/standards , Female , Male , Mycobacterium bovis/immunology , Reproducibility of Results , Sensitivity and Specificity , Tuberculin Test/methods , Tuberculin Test/standards , Tuberculosis, Bovine/blood , Tuberculosis, Bovine/epidemiology , United States/epidemiology
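The parallel and series results in the conclusion above follow from the standard formulas for combining two diagnostic tests, under an assumption of conditional independence that the study does not verify. A sketch using the cattle ELISA figures from the abstract; the tuberculin test's own sensitivity and specificity are not reported there, so the 0.80/0.96 values are hypothetical:

```python
def combine_parallel(se1, sp1, se2, sp2):
    """Parallel interpretation: positive if EITHER test is positive.
    Sensitivity rises, specificity falls (assumes independent tests)."""
    return 1 - (1 - se1) * (1 - se2), sp1 * sp2

def combine_series(se1, sp1, se2, sp2):
    """Series interpretation: positive only if BOTH tests are positive.
    Specificity rises, sensitivity falls (assumes independent tests)."""
    return se1 * se2, 1 - (1 - sp1) * (1 - sp2)

# ELISA in cattle (from the abstract): Se=0.656, Sp=0.564.
# Tuberculin test Se=0.80, Sp=0.96 are hypothetical stand-ins.
se_par, sp_par = combine_parallel(0.656, 0.564, 0.80, 0.96)
se_ser, sp_ser = combine_series(0.656, 0.564, 0.80, 0.96)
print(round(se_par, 3), round(sp_ser, 3))
```

Whatever the second test's true values, these formulas reproduce the abstract's qualitative finding: parallel combination beats either test's sensitivity, series combination beats either test's specificity.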
20.
J Health Econ ; 15(3): 257-77, 1996 Jun.
Article in English | MEDLINE | ID: mdl-10159442

ABSTRACT

In response to a change in reimbursement incentives, hospitals may change the intensity of services provided to a given set of patients, change the type (or severity) of patients they see, or change their market share. Each of these three responses, which we define as a moral hazard effect, a selection effect, and a practice-style effect, can influence average resource use in a population. We develop and implement a methodology for disentangling these effects using a panel data set of Medicaid psychiatric discharges in New Hampshire. We also find evidence for the form of quality competition hypothesized by Dranove (1987).


Subject(s)
Hospitals, Psychiatric/economics , Medicaid/organization & administration , Prospective Payment System/statistics & numerical data , Adolescent , Adult , Diagnosis-Related Groups/economics , Female , Health Services Research , Hospitals, Psychiatric/statistics & numerical data , Humans , Length of Stay , Male , Medicaid/economics , Mental Disorders , Middle Aged , Models, Economic , New Hampshire , Patient Admission/statistics & numerical data , United States