Results 1 - 20 of 37
1.
Nutrients ; 14(19)2022 Sep 30.
Article in English | MEDLINE | ID: mdl-36235726

ABSTRACT

Vitamin D metabolism differs among human populations because our species has adapted to different natural and cultural environments. Two environments are particularly difficult for the production of vitamin D by the skin: the Arctic, where the skin receives little solar UVB over the year; and the Tropics, where the skin is highly melanized and blocks UVB. In both cases, natural selection has favored the survival of those individuals who use vitamin D more efficiently or have some kind of workaround that ensures sufficient uptake of calcium and other essential minerals from food passing through the intestines. Vitamin D scarcity has either cultural or genetic solutions. Cultural solutions include consumption of meat in a raw or boiled state and extended breastfeeding of children. Genetic solutions include higher uptake of calcium from the intestines, higher rate of conversion of vitamin D to its most active form, stronger binding of vitamin D to carrier proteins in the bloodstream, and greater use of alternative metabolic pathways for calcium uptake. Because their bodies use vitamin D more sparingly, indigenous Arctic and Tropical peoples can be misdiagnosed with vitamin D deficiency and wrongly prescribed dietary supplements that may push their vitamin D level over the threshold of toxicity.


Subject(s)
Vitamin D Deficiency , Vitamin D , Calcium , Calcium, Dietary , Carrier Proteins , Diet , Humans , Inuit , Vitamins
2.
Br Dent J ; 2021 Feb 11.
Article in English | MEDLINE | ID: mdl-33574577

ABSTRACT

INTRODUCTION: Intraligamentary local anaesthesia (ILA) with articaine is described as an effective alternative to inferior alveolar nerve block (IANB) for extraction of posterior teeth in the mandible, with a reduced risk of complications. AIM: To investigate ILA with 4% articaine and a conventional syringe as the sole anaesthetic method for tooth extractions in the posterior mandible. MATERIALS AND METHODS: All consecutive posterior mandibular teeth requiring extraction, subject to exclusion criteria, were recruited to the study between 2002 and 2017 in one London NHS and private dental practice. Four percent articaine was given slowly by ILA with a conventional syringe at two lingual and two buccal points adjacent to each tooth. All extraction procedures were performed flapless. Heavily broken-down teeth (n = 43) were extracted by sectioning of roots, guttering and elevation with luxators, using socket preservation techniques. Demographic, quantitative and qualitative data were collected at initial appointments and at reviews for up to 15 years. RESULTS: The median age was 64 years (interquartile range 17). The 272 teeth extracted were mandibular molars and second premolars, removed due to periodontal disease (34%), irreversible pulpitis (29%) or posterior tooth fracture (27%). Second molars accounted for the largest share of extractions (44%), followed by first molars (29%), second premolars (17%) and third molars (10%). Sufficient anaesthesia was achieved within five minutes for all extractions, and procedures lasted less than 30 minutes. Patients reported that extraction using ILA was quicker than expected and painless, with limited anaesthesia of tissues other than the teeth being extracted. Numeric rating scale (NRS) pain scores (0-10) were all below 3. No complications were recorded. CONCLUSION: The ILA technique is effective for a broad range of posterior tooth extractions in the mandible within certain clinical parameters. It mitigates risks associated with repeated IANB, including nerve injury and cardiovascular disturbances. This is the largest study of its kind and was conducted in primary care.

3.
Perspect Biol Med ; 63(4): 591-601, 2020.
Article in English | MEDLINE | ID: mdl-33416798

ABSTRACT

Many pathogens, especially fungi, have evolved the capacity to manipulate host behavior, usually to improve their chances of spreading to other hosts. Such manipulation is difficult to observe in long-lived hosts, like humans. First, much time may separate cause from effect in the case of an infection that develops over a human life span. Second, the host-pathogen relationship may initially be commensal: the host becomes a vector for infection of other humans, and in exchange the pathogen remains discreet and does as little harm as possible. Commensalism breaks down with increasing age because the host is no longer a useful vector, being less socially active and at higher risk of death. Certain neurodegenerative diseases may therefore be the terminal stage of a longer-lasting relationship in which the host helps the pathogen infect other hosts, largely via sexual relations. Strains from the Candida genus are particularly suspect. Such pathogens seem to have co-evolved not only with their host population but also with the local social environment. Different social environments may have thus favored different pathogenic strategies for manipulation of human behavior.


Subject(s)
Fungi/pathogenicity , Host-Pathogen Interactions/physiology , Neurodegenerative Diseases/epidemiology , Neurodegenerative Diseases/microbiology , Social Environment , AIDS Dementia Complex/epidemiology , AIDS Dementia Complex/microbiology , Aging/physiology , Autism Spectrum Disorder/epidemiology , Autism Spectrum Disorder/microbiology , Candida/pathogenicity , Humans , Multiple Sclerosis/epidemiology , Multiple Sclerosis/microbiology , Sexually Transmitted Diseases/microbiology , Symbiosis
4.
J Prosthet Dent ; 119(6): 935-941, 2018 Jun.
Article in English | MEDLINE | ID: mdl-28969914

ABSTRACT

STATEMENT OF PROBLEM: Indirect restorations are an important treatment in dental practice, but long-term survival studies are lacking. PURPOSE: The purpose of this retrospective study was to report the outcomes of indirect restorations followed up annually for up to 50 years in a dental practice. MATERIAL AND METHODS: A retrospective survival study was undertaken at a mixed National Health Service (NHS)/private dental practice in London, UK. Data were collected for restorations placed between 1966 and 1996 by 1 experienced operator. It was a requirement that patients had been followed up annually with clinical and radiographic examinations for up to 50 years. Patients were enrolled on a strict preventive policy and had excellent oral hygiene. Oral hygiene, restoration location, sensitivity, occlusion, and other details (preparation design, taper, cement used) were recorded. Restoration outcome was recorded as successful and surviving, unknown, or failed. The data were summarized descriptively. Kaplan-Meier survival curves and hazard curves were used to assess the survival of crowns and the probability of failure over time. RESULTS: A total of 223 restorations were placed in 47 patients between 1966 and 1996 and reviewed annually for up to 50 years (until 2016). These restorations included 154 metal-ceramic crowns (101 posterior and 53 anterior), 25 posterior gold crowns, 22 anterior ceramic veneers, and 22 anterior ceramic crowns. All restorations were in occlusion. The mean survival for metal-ceramic crowns was estimated as 47.53 years (95% confidence interval [CI]: 45.59-49.47 years). Failures in metal-ceramic crowns (n=6, 3.9%) were due to periapical periodontitis. The remaining restoration types had 100% survival at 50 years. CONCLUSIONS: This study showed that the survival of crowns and veneers is high over 50 years in clinical practice with annual follow-up and good oral hygiene. The proportion of teeth with loss of vitality, confirmed clinically and with radiographs, was minimal.
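
As a worked illustration of the Kaplan-Meier analysis this abstract describes, here is a minimal sketch in Python using the lifelines library; the durations and failure flags below are hypothetical stand-ins rather than the study's records, and the choice of library is an assumption (the authors do not name their software).

```python
# Minimal Kaplan-Meier sketch (hypothetical data, not the study's records).
# Each restoration contributes (years observed, 1 = failed / 0 = censored).
from lifelines import KaplanMeierFitter

durations = [50, 50, 48, 47, 50, 12, 23, 50, 50, 36]  # years of follow-up
observed  = [0,  0,  1,  1,  0,  1,  0,  0,  0,  1]   # 1 = failure event seen

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=observed, label="metal-ceramic crowns")

print(kmf.survival_function_)     # estimated S(t) at each observed time
print(kmf.median_survival_time_)  # median survival time, if reached
```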


Subject(s)
Dental Veneers , Prosthesis Failure , Crowns , Female , Follow-Up Studies , Humans , London , Male , Middle Aged , Oral Hygiene , Retrospective Studies
5.
PLoS One ; 12(12): e0190238, 2017.
Article in English | MEDLINE | ID: mdl-29284020

ABSTRACT

In women, red hair is associated with pain sensitivity. This condition, and perhaps others, seems facilitated by the combination of being red-haired and female. We tested this hypothesis by questioning a large sample of Czech and Slovak respondents about the natural redness and darkness of their hair, their natural eye color, their physical and mental health (24 categories), and other personal attributes (height, weight, number of children, lifelong number of sexual partners, frequency of smoking). Red-haired women did worse than other women in ten health categories and better in only three, being particularly prone to colorectal, cervical, uterine, and ovarian cancer. Red-haired men showed a balanced pattern, doing better than other men in three health categories and worse in three. Number of children was the only category in which both male and female redheads did better than other respondents. We also confirmed earlier findings that red hair is naturally more frequent in women than in men. Of the 'new' hair and eye colors, red hair diverges the most from the ancestral state of black hair and brown eyes, being the most sexually dimorphic variant not only in population frequency but also in health status. This divergent health status may have one or more causes: direct effects of red hair pigments (pheomelanins) or their by-products; effects of other genes that show linkage with genes involved in pheomelanin production; excessive prenatal exposure to estrogen (which facilitates expression of red hair during fetal development and which, at high levels, may cause health problems later in life); the evolutionary recentness of red hair and a corresponding lack of time to correct negative side effects; or genetic incompatibilities associated with the allele Val92Met, which seems to be of Neanderthal origin and is one of the alleles that can cause red hair.


Subject(s)
Eye Color , Hair Color , Health Status , Sex Factors , Czech Republic , Female , Humans , Male , Slovakia
6.
J Gen Psychol ; 142(4): 238-52, 2015.
Article in English | MEDLINE | ID: mdl-26649923

ABSTRACT

Two experiments were conducted to examine whether recognition memory for information and/or its source is influenced by confirmation bias. During Phase 1, subjects were shown a summary of the issue of gun control and asked to indicate a position on the issue. During Phase 2, 12 abstracts (Experiment 1) or social media posts (Experiment 2) were shown, one at a time. Posts in Experiment 2 were attributed to either friends or strangers. Participants indicated whether they wanted to read a more extensive version of each abstract (Experiment 1) or post (Experiment 2). Phase 3 was the memory phase: 32 abstract titles (Experiment 1) or posts (Experiment 2) were shown one at a time, and participants indicated, yes or no, whether they recognized each title or post from the previous phase. Recognition memory for information that supported the participants' viewpoint was higher than for opposing information.


Subject(s)
Memory , Recognition, Psychology , Adolescent , Adult , Female , Friends/psychology , Humans , Male , Social Media , Young Adult
7.
Dent Hist ; 60(2): 63-9, 2015 Jul.
Article in English | MEDLINE | ID: mdl-26399148

ABSTRACT

A history of Ronald Gain's dental practice is described, including his service during the Second World War. An account is given of the bomb damage in and around the practice in Peckham Rye.


Subject(s)
Dentists/history , World War II , History, 20th Century , Humans , London , Military Dentistry/history
8.
BMC Health Serv Res ; 15: 114, 2015 Mar 21.
Article in English | MEDLINE | ID: mdl-25888922

ABSTRACT

BACKGROUND: Successful implementation of new methods and models of healthcare, to achieve better patient outcomes and safe, person-centered care, depends on the physical environment in which that care is provided. Decisions concerning healthcare architecture are therefore critical: the built environment affects people and work processes for many years and requires a long-term financial commitment from society. In this paper, we describe and suggest several strategies (critical factors) to promote shared decision-making when planning and designing new healthcare environments. DISCUSSION: This paper discusses challenges and hindrances observed in the literature and drawn from the authors' extensive experience in planning and designing healthcare environments. An overview is presented of the challenges of, and new approaches to, a process that involves the mutual exchange of knowledge among various stakeholders. Additionally, we discuss design approaches, which should be encouraged, that balance the influence of specific and local requirements with general knowledge and evidence. We suggest a shared decision-making and collaborative planning and design process among representatives from healthcare, the construction sector and architecture, based on evidence and end-users' perspectives. If carefully and systematically applied, this approach will support and develop a framework for creating high-quality healthcare environments.


Subject(s)
Decision Making , Health Facility Environment , Interior Design and Furnishings , Quality Improvement , Cooperative Behavior , Delivery of Health Care , Humans , Patient-Centered Care , Quality of Health Care/standards , Self Care
9.
Evol Psychol ; 13(1): 230-43, 2015 Mar 06.
Article in English | MEDLINE | ID: mdl-25748943

ABSTRACT

Through its monopoly on violence, the State tends to pacify social relations. Such pacification proceeded slowly in Western Europe between the 5th and 11th centuries, being hindered by the rudimentary nature of law enforcement, the belief in a man's right to settle personal disputes as he saw fit, and the Church's opposition to the death penalty. These hindrances began to dissolve in the 11th century with a consensus by Church and State that the wicked should be punished so that the good may live in peace. Courts imposed the death penalty more and more often and, by the late Middle Ages, were condemning to death between 0.5 and 1.0% of all men of each generation, with perhaps just as many offenders dying at the scene of the crime or in prison while awaiting trial. Meanwhile, the homicide rate plummeted from the 14th century to the 20th. The pool of violent men dried up until most murders occurred under conditions of jealousy, intoxication, or extreme stress. The decline in personal violence is usually attributed to harsher punishment and the longer-term effects of cultural conditioning. It may also be, however, that this new cultural environment selected against propensities for violence.


Subject(s)
Homicide/history , Social Control, Formal , Violence/history , Europe , History, 15th Century , History, 16th Century , History, 17th Century , History, 18th Century , History, 19th Century , History, 20th Century , History, Medieval , Homicide/legislation & jurisprudence , Humans , Violence/legislation & jurisprudence
11.
J Int AIDS Soc ; 16: 18586, 2013 Sep 10.
Article in English | MEDLINE | ID: mdl-24029015

ABSTRACT

INTRODUCTION: The provision of HIV treatment and care in sub-Saharan Africa faces multiple challenges, including weak health systems and attrition of trained health workers. One potential response to these challenges has been to engage community health workers (CHWs). METHODOLOGY: A systematic literature search for quantitative and qualitative studies describing the roles and outcomes of CHWs in HIV care in sub-Saharan Africa, published from database inception to December 2012, was performed in the following databases: PubMed, PsychINFO, Embase, Web of Science, JSTOR, WHOLIS, Google Scholar and SAGE journals online. Bibliographies of included articles were also searched. A narrative synthesis approach was used to analyze common emerging themes on the roles and outcomes of CHWs in HIV care in sub-Saharan Africa. RESULTS: In total, 21 studies met the inclusion criteria, documenting a range of tasks performed by CHWs. These included patient support (counselling, home-based care, education, adherence support and livelihood support) and health service support (screening, referral, health service organization and surveillance). CHWs were reported to enhance the reach, uptake and quality of HIV services, as well as the dignity, quality of life and retention in care of people living with HIV. The presence of CHWs in clinics was reported to reduce waiting times, streamline patient flow and reduce the workload of health workers. Clinical outcomes appeared not to be compromised, with no differences in virologic failure or mortality between patients under community-based care and those under facility-based care. Despite these benefits, CHWs faced challenges related to lack of recognition, remuneration and involvement in decision making. CONCLUSIONS: CHWs can clearly contribute to HIV service delivery and strengthen human resource capacity in sub-Saharan Africa. For their contribution to be sustained, CHWs need to be recognized, remunerated and integrated into wider health systems. Further research focusing on the comparative costs of CHW interventions and on successful models for mainstreaming CHWs into wider health systems is needed.


Subject(s)
Community Health Workers , HIV Infections/drug therapy , HIV Infections/prevention & control , Health Services Research , Africa , Africa South of the Sahara , Humans
12.
PLoS One ; 8(1): e53285, 2013.
Article in English | MEDLINE | ID: mdl-23326406

ABSTRACT

We tested whether eye color influences perception of trustworthiness. Facial photographs of 40 female and 40 male students were rated for perceived trustworthiness. Eye color had a significant effect, the brown-eyed faces being perceived as more trustworthy than the blue-eyed ones. Geometric morphometrics, however, revealed significant correlations between eye color and face shape. Thus, face shape likewise had a significant effect on perceived trustworthiness but only for male faces, the effect for female faces not being significant. To determine whether perception of trustworthiness was being influenced primarily by eye color or by face shape, we recolored the eyes on the same male facial photos and repeated the test procedure. Eye color now had no effect on perceived trustworthiness. We concluded that although the brown-eyed faces were perceived as more trustworthy than the blue-eyed ones, it was not brown eye color per se that caused the stronger perception of trustworthiness but rather the facial features associated with brown eyes.


Subject(s)
Eye Color/physiology , Trust , Adult , Face , Female , Form Perception/physiology , Humans , Male , Middle Aged , Young Adult
13.
Memory ; 21(3): 408-16, 2013 Apr.
Article in English | MEDLINE | ID: mdl-23075232

ABSTRACT

We examined how certain personality traits might relate to the formation of suggestive memory over time. We hypothesised that compliance and trust relate to initial acceptance of misinformation as memory, whereas fantasy proneness might relate to integration of misinformation into memory after longer intervals (relative to the time of exposure to misinformation). Participants watched an excerpt from a movie (the simulated eyewitness event). They next answered a recall test that included embedded misinformation about the movie. Participants then answered a yes/no recognition test. A week later, participants answered a second yes/no recognition test about the movie (each yes/no recognition test included different questions). Before both recognition tests, participants were warned about the misinformation shown during recall and were asked to base their answers on the movie excerpt only. After completing the second recognition test, participants answered questions from the Neuroticism Extroversion Openness Personality Inventory-3 (McCrae, Costa, & Martin, 2005) and the Creative Experiences Questionnaire (Merckelbach, Horselenberg, & Muris, 2001). While compliance correlated with misinformation effects immediately after exposure to misinformation, fantasy-prone personality accounted for more of the variability in false recognition rates than compliance after a 1-week interval.


Subject(s)
Individuality , Mental Recall , Personality , Recognition, Psychology , Suggestion , Adult , Female , Humans , Male , Personality Inventory , Photic Stimulation , Time Factors
15.
Int J Circumpolar Health ; 71: 18001, 2012 Mar 19.
Article in English | MEDLINE | ID: mdl-22456053

ABSTRACT

Vitamin D deficiency seems to be common among northern Native peoples, notably Inuit and Amerindians. It has usually been attributed to: (1) higher latitudes that prevent vitamin D synthesis most of the year; (2) darker skin that blocks solar UVB; and (3) fewer dietary sources of vitamin D. Although vitamin D levels are clearly lower among northern Natives, it is less clear that these lower levels indicate a deficiency. The above factors predate European contact, yet pre-Columbian skeletons show few signs of rickets, the most visible sign of vitamin D deficiency. Furthermore, because northern Natives have long inhabited high latitudes, natural selection should have progressively reduced their vitamin D requirements. There is in fact evidence that the Inuit have compensated for decreased production of vitamin D through increased conversion to its most active form and through receptors that bind more effectively. Thus, when diagnosing vitamin D deficiency in these populations, we should not use norms that were originally developed for European-descended populations, who produce this vitamin more easily and have adapted accordingly.


Subject(s)
Indians, North American , Inuit , Vitamin D Deficiency/ethnology , Adaptation, Physiological , Canada/epidemiology , Humans , Vitamin D/blood , Vitamin D/toxicity
16.
Bioresour Technol ; 105: 15-23, 2012 Feb.
Article in English | MEDLINE | ID: mdl-22178488

ABSTRACT

An economic analysis was performed on treatment options for pig manure in Ireland. Costs were based on a 500-sow integrated pig farm producing 10,500 m³ of manure per year at 4.8% dry matter. The anaerobic digestion of pig manure and grass silage (1:1, volatile solids basis) was unviable under the proposed tariffs, with costs at €5.2 m⁻³ manure. Subsequent solid-liquid separation of the digestate would cost an additional €12.8 m⁻³ manure. Treatment of the separated solid fraction by composting and of the liquid fraction by integrated constructed wetlands would add €2.8 and €4.6 m⁻³ manure, respectively, to the treatment costs. The cost analysis showed that the technologies investigated are currently not cost-effective in Ireland. Transport and spreading of raw manure, at €4.9 m⁻³ manure (15 km maximum distance from farm), is the most cost-effective option.
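
To make the cost arithmetic concrete, the sketch below scales the reported €/m³ figures to the farm's stated annual output of 10,500 m³; the option groupings follow the abstract, but the annual totals are illustrative back-of-envelope values rather than figures from the paper.

```python
# Annual treatment cost per option = (€/m³ rate) x (10,500 m³ manure per year).
# Rates are the per-m³ figures reported in the abstract; totals are derived here.
ANNUAL_MANURE_M3 = 10_500

cost_per_m3 = {
    "anaerobic digestion (AD)": 5.2,
    "AD + solid-liquid separation": 5.2 + 12.8,
    "AD + separation + composting + wetlands": 5.2 + 12.8 + 2.8 + 4.6,
    "transport and spreading of raw manure": 4.9,
}

for option, rate in cost_per_m3.items():
    print(f"{option}: EUR {rate * ANNUAL_MANURE_M3:,.0f} per year")
```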


Subject(s)
Manure/analysis , Refuse Disposal/methods , Anaerobiosis , Animals , Costs and Cost Analysis , Equipment Design , Hot Temperature , Ireland , Methane/analysis , Models, Economic , Nitrogen/analysis , Poaceae , Renewable Energy , Swine
17.
J Gastroenterol Hepatol ; 25(7): 1276-80, 2010 Jul.
Article in English | MEDLINE | ID: mdl-20594255

ABSTRACT

BACKGROUND AND AIM: Prisoners have a high prevalence of injection drug use (IDU) and chronic hepatitis C (CHC) infection. Treatment of CHC in these patients is effective; however, their long-term outcomes following treatment are unknown. We determined the durability of a sustained virological response (SVR) in prisoners treated for CHC. METHODS: Patients were treated as part of routine clinical practice with interferon (IFN) and ribavirin. A retrospective review of medical records and a computerized pathology system was performed for clinical and laboratory information. RESULTS: Seventy-four prisoners (70 males, mean age 34 years, IDU in 55%) were evaluable for an SVR over a 12-year period to December 2008; the mean follow-up period was 1243 days. Genotype 1, 2, 3 and 6 infection was present in 18, 3, 38 and 3 patients, respectively; the genotype was unknown in 12. Three of the 52 patients biopsied had cirrhosis. Standard IFN was administered to 25 (34%; 11 with ribavirin), and 49 received pegylated IFN and ribavirin; one did not complete treatment, and two had breakthrough relapses. An end-of-treatment response was achieved in 57 patients and an SVR in 53; 14 were non-responders. Five male patients, four of whom had unknown genotypes and were treated with standard IFN alone, relapsed late following SVR (9%). Five patients, all treated with pegylated IFN and ribavirin, were reinfected (one prior to and four following SVR). CONCLUSIONS: Prisoners are often successfully treated for CHC. However, this retrospective study indicates a high (17%) prevalence of late recurrence of viremia, likely a reflection of reinfection due to ongoing risk-taking behavior.
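
The reported percentages can be reconstructed from the counts in the abstract, assuming the 53 SVR patients as the denominator (the abstract does not state the denominator explicitly):

```python
# Reconstruction of the abstract's 9% and 17% figures (assumed denominator: 53 SVR patients).
svr = 53
late_relapse = 5          # relapsed late following SVR
reinfected_after_svr = 4  # of the 5 reinfections, 4 occurred after SVR

print(f"late relapse rate: {late_relapse / svr:.0%}")                                  # ~9%
print(f"late recurrence of viremia: {(late_relapse + reinfected_after_svr) / svr:.0%}")  # ~17%
```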


Subject(s)
Antiviral Agents/therapeutic use , Drug Users , Hepatitis C, Chronic/drug therapy , Prisoners , Substance Abuse, Intravenous/complications , Adult , Drug Therapy, Combination , Drug Users/statistics & numerical data , Female , Hepacivirus/genetics , Hepatitis C, Chronic/diagnosis , Hepatitis C, Chronic/epidemiology , Hepatitis C, Chronic/transmission , Humans , Interferons/therapeutic use , Male , Prevalence , Prisoners/statistics & numerical data , RNA, Viral/blood , Recurrence , Retrospective Studies , Ribavirin/therapeutic use , South Australia/epidemiology , Substance Abuse, Intravenous/epidemiology , Time Factors , Treatment Outcome , Viral Load
18.
Am J Psychol ; 123(2): 221-30, 2010.
Article in English | MEDLINE | ID: mdl-20518438

ABSTRACT

Three experiments were conducted to determine whether the standard Implicit Association Test (IAT) could be used to distinguish truthful from deceitful witnesses. We anticipated that IAT effects would be greater after lying. Participants were asked to answer questions with incorrect answers (the lie condition) or correct answers (the truthful condition); a third group of participants was not interviewed (the control group). Participants then took the IAT, in which they were asked to associate correct and incorrect answers with positive or negative attributes. Results demonstrated that standard IAT effects were greater after lying than after truth telling, but only when attribute labels were clearly and explicitly linked to positive and negative affect. Theoretical implications are considered.
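
For readers unfamiliar with how an "IAT effect" is quantified, below is a minimal sketch of the conventional D-score (Greenwald, Nosek, and Banaji, 2003); the abstract does not specify the scoring algorithm this study used, so this is illustrative only, with hypothetical reaction times.

```python
# Conventional IAT D-score: difference in mean latency between incompatible
# and compatible blocks, divided by the SD of all latencies pooled across both.
import statistics

def iat_d_score(compatible_rts, incompatible_rts):
    """D = (mean incompatible RT - mean compatible RT) / pooled SD of all trials."""
    pooled_sd = statistics.stdev(compatible_rts + incompatible_rts)
    return (statistics.mean(incompatible_rts) - statistics.mean(compatible_rts)) / pooled_sd

# Hypothetical reaction times in milliseconds:
compatible = [620, 580, 650, 700, 610]
incompatible = [840, 790, 900, 760, 880]
print(f"D = {iat_d_score(compatible, incompatible):.2f}")  # larger D = larger IAT effect
```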


Subject(s)
Lie Detection/psychology , Psychomotor Performance , Reaction Time , Word Association Tests/statistics & numerical data , Female , Humans , Male , Psychometrics/statistics & numerical data , Reproducibility of Results , Truth Disclosure
19.
Evol Psychol ; 8(3): 376-89, 2010 Jul 23.
Article in English | MEDLINE | ID: mdl-22947807

ABSTRACT

Over the last 10,000 years, the human genome has changed at an accelerating rate. The change seems to reflect adaptations to new social environments, including the rise of the State and its monopoly on violence. State societies punish young men who act violently on their own initiative. In contrast, non-State societies usually reward such behavior with success, including reproductive success. Thus, given the moderate to high heritability of male aggressiveness, the State tends to remove violent predispositions from the gene pool while favoring tendencies toward peacefulness and submission. This perspective is applied here to the Roman state, specifically its long-term effort to pacify the general population. By imperial times, this effort had succeeded so well that the Romans saw themselves as being inherently less violent than the "barbarians" beyond their borders. By creating a pacified and submissive population, the empire also became conducive to the spread of Christianity, a religion of peace and submission. In sum, the Roman state imposed a behavioral change that would over time alter the mix of genotypes, thus facilitating a subsequent ideological change.


Subject(s)
Roman World/history , Social Behavior/history , Social Control, Formal/methods , Social Environment , Violence/prevention & control , Aggression , Christianity/history , History, Ancient , Humans , Rome