2.
Curr Anthropol ; 49(6): 963-77; discussion 977-91, 2008 Dec.
Article in English | MEDLINE | ID: mdl-19391442

ABSTRACT

Osseous manifestation of infectious disease is of paramount importance to paleopathologists seeking to interpret ancient health, but the relationships among infectious agent exposure, development of disease, and skeletal involvement are complex. The outcome of an exposure strongly depends on multiple factors, including ecology, diet, nutrition, immune function, and the genetics of pathogen and host. Mycobacterial diseases are often studied in ancient remains but also are especially influenced by these factors; individual and population differences in severity and course are apparent following onset of active disease. The osteological record for these diseases represents the complex interplay of host and pathogen characteristics influencing within- and among-individual skeletal lesion prevalence and distribution. However, many of these characteristics may be assessed independently through the archaeological record. Here, we explore the contributions of dietary protein and iron to immune function, particularly the course and outcome of infection with Mycobacterium tuberculosis. We emphasize how nutrition may influence the dissemination of bacilli to the skeleton and subsequent formation of diagnostic lesions. We then generate models and hypotheses informed by this interplay and apply them to four prehistoric New World areas. Finally, discrepancies between our expectations and the observed record are explored as a basis for new hypotheses.


Subject(s)
Diet/history , Mycobacterium tuberculosis , Nutritional Status , Paleopathology/history , Tuberculosis, Pulmonary/history , Dietary Proteins/history , History, Ancient , History, Medieval , Humans , Iron, Dietary/history , Mycobacterium tuberculosis/immunology , Mycobacterium tuberculosis/isolation & purification
3.
Am J Phys Anthropol ; 128(2): 252-72, 2005 Oct.
Article in English | MEDLINE | ID: mdl-15795886

ABSTRACT

This paper presents three distinct models for the development of acquired anemia: iron-deficiency anemia produced by the inadequate intake and/or absorption of iron, the anemia of chronic disease (ACD) caused by the body's natural iron-withholding defense against microbial invaders, and megaloblastic anemia caused by insufficient intake and/or absorption of vitamin B(12) or folic acid. These etiological models are used to interpret the distribution and etiology of anemia among adult individuals interred at the Medieval Gilbertine Priory of St. Andrew, Fishergate, York (n = 147). This bioarchaeological analysis not only uncovered a strong relationship between decreasing status and increasing prevalence of anemia for both men and women, but also identified clear sex-based differences at this site. Within the high-status group, blood and iron loss as a result of rampant parasitism likely produced an environment ripe for the development of iron-deficiency anemia, while the parasitic consumption of vitamin B(12) may have caused occasional cases of megaloblastic anemia. As status decreases, the interpretation of anemia becomes more complex, with megaloblastic anemia and ACD emerging as viable, potentially major contributors to the anemia experienced by low-status people at St. Andrew's. Apart from status effects, women (especially young women) are disproportionately affected by anemia when compared to men within their own status group and, on average, are also more likely to have experienced anemia than their male peers from other status groups. This suggests that high iron-demand reproductive functions helped to make iron-deficiency anemia a chronic condition in many women's lives irrespective of their status affiliation.


Subject(s)
Anemia, Iron-Deficiency/history , Anemia, Megaloblastic/history , Age Factors , Anemia, Iron-Deficiency/epidemiology , Anemia, Iron-Deficiency/etiology , Anemia, Megaloblastic/epidemiology , Anemia, Megaloblastic/etiology , England/epidemiology , Female , History, Medieval , Humans , Iron, Dietary/history , Male , Models, Statistical , Prevalence , Sex Factors
6.
Arch Latinoam Nutr ; 49(3 Suppl 2): 7S-10S, 1999 Sep.
Article in Spanish | MEDLINE | ID: mdl-10971830

ABSTRACT

This is a non-comprehensive overview of the past 50 years of research on iron metabolism and of the methodology currently available for diagnosing iron deficiency and assessing its effects on human health. In the 1940s, iron absorption was determined chemically: the amount of iron absorbed was calculated as the difference between dietary iron and excreted iron. The other method used to assess dietary iron was hemoglobin repletion. In the 1970s, the measurement of plasma ferritin was an important contribution to the study of iron metabolism, allowing assessment of both iron deficiency and iron overload. In the same decade, the extrinsic and intrinsic labeling methodology was an important advance. The 1970s and 1980s were years in which scientists focused on identifying inhibitors of iron absorption, namely coffee, tea, calcium, zinc, and fiber. The 1980s and 1990s were characterized by emerging knowledge of iron absorption from single foods, whole meals, and complete diets, and by the favorable effects of food iron fortification in developing countries. The effects of iron excess on overall health and on myocardial infarction in developed countries were also studied.


Subject(s)
Anemia, Iron-Deficiency/history , Iron Deficiencies , Iron, Dietary/history , Anemia, Iron-Deficiency/diagnosis , Anemia, Iron-Deficiency/prevention & control , History, 20th Century , Humans , Intestinal Absorption , Iron/metabolism , Iron, Dietary/pharmacokinetics