Results 1 - 2 of 2
1.
Camb Q Healthc Ethics ; 31(1): 119-130, 2022 01.
Article in English | MEDLINE | ID: mdl-35049457

ABSTRACT

The amount of patient data available to healthcare practitioners is growing rapidly, and this increase is becoming a problem, as practitioners are often unable to fully survey and process the data relevant to the treatment or care of a patient. Consequently, there are currently several efforts to develop systems that can aid healthcare practitioners in reading and processing patient data and, in this way, provide them with a better foundation for decision-making about the treatment and care of patients. There are also efforts to develop algorithms that provide suggestions for such decisions. However, the development of these systems and algorithms raises several concerns related to the privacy of patients, the patient-practitioner relationship, and the autonomy of healthcare practitioners. The aim of this article is to provide a foundation for understanding the ethical challenges related to the development of a specific form of data-processing system, namely clinical algorithms.


Subject(s)
Morals, Privacy, Algorithms, Decision Making, Delivery of Health Care, Humans
2.
JMIR Form Res ; 5(8): e17971, 2021 Aug 09.
Article in English | MEDLINE | ID: mdl-34383666

ABSTRACT

BACKGROUND: As a preamble to an attempt to develop a tool that can aid health professionals at hospitals in identifying whether a patient may have an alcohol abuse problem, this study investigates opinions and attitudes among both health professionals and patients about using patient data from electronic health records (EHRs) in an algorithm screening for alcohol problems.

OBJECTIVE: The aim of this study was to investigate the attitudes and opinions of patients and health professionals at hospitals regarding the use of previously collected data in developing and implementing an algorithmic helping tool in the EHR for screening for inexpedient alcohol habits. In addition, the study aims to analyze how patients would feel about asking and being asked about alcohol by staff, based on a notification in the EHR from such a tool.

METHODS: Using semistructured interviews, we interviewed 9 health professionals and 5 patients to explore their opinions and attitudes about an algorithm-based helping tool and about asking and being asked about alcohol usage when given a reminder from this type of tool. The data were analyzed using an ad hoc method consistent with close reading and meaning condensation.

RESULTS: The health professionals were both positive and negative about a helping tool grounded in algorithms. They were optimistic about the potential of such a tool to save time by providing a quick overview, if it was easy to use, but noted on the negative side that this type of helping tool might take away the professionals' instinct. The patients were overall positive about the helping tool, stating that they would find it beneficial for preventive care. Some of the patients expressed concerns that the information provided by the tool could be misused.

CONCLUSIONS: When developing and implementing an algorithmic helping tool, the following aspects should be considered: (1) making the helping tool as transparent in its recommendations as possible, avoiding black boxing, and ensuring room for professional discretion in clinical decision making; and (2) including and taking into account the attitudes and opinions of patients and health professionals in the design and development process of such an algorithmic helping tool.
