Results 1 - 20 of 128
1.
Pharm Stat ; 2024 Jul 10.
Article in English | MEDLINE | ID: mdl-38987217

ABSTRACT

Chemistry, manufacturing, and control (CMC) statisticians play a key role in the development and lifecycle management of pharmaceutical and biological products, working with their non-statistician partners to manage product quality. Information used to make quality decisions comes from studies, where success is facilitated through adherence to the scientific method. This is carried out in four steps: (1) defining an objective, (2) design, (3) conduct, and (4) analysis. Careful consideration of each step helps to ensure that a study conclusion and its associated decision are correct. This can be a development decision related to the validity of an assay or a quality decision such as conformance to specifications. Importantly, all decisions carry risk. Conventional statistical risks such as Type 1 and Type 2 errors can be coupled with their associated impacts to manage patient value as well as development and commercial costs. The CMC statistician brings focus to managing risk across the steps of the scientific method, leading to optimal product development and a robust supply of life-saving drugs and biologicals.
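The Type 1 and Type 2 risks described above can be made concrete with a short calculation. A minimal sketch, assuming a hypothetical lot-release test in which a lot is accepted when its measured potency meets an acceptance limit; every number below (means, standard deviation, limit) is invented for illustration:

```python
from statistics import NormalDist

spec_mean = 100.0    # true mean potency of a conforming lot (illustrative)
shifted_mean = 97.0  # true mean potency of a non-conforming lot (illustrative)
sigma = 2.0          # assay standard deviation (illustrative)
limit = 98.0         # acceptance limit: accept the lot if the measurement >= limit

# Type 1 error: probability of rejecting a lot that actually conforms.
type_1 = NormalDist(mu=spec_mean, sigma=sigma).cdf(limit)

# Type 2 error: probability of accepting a lot that does not conform.
type_2 = 1 - NormalDist(mu=shifted_mean, sigma=sigma).cdf(limit)

print(f"Type 1 (producer's risk): {type_1:.3f}")  # ~0.159
print(f"Type 2 (consumer's risk): {type_2:.3f}")  # ~0.309
```

Tightening the acceptance limit trades one risk for the other, which is exactly the coupling of statistical risk with patient value and cost that the abstract describes.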

2.
Stud Hist Philos Sci ; 106: 186-195, 2024 Jul 18.
Article in English | MEDLINE | ID: mdl-39029139

ABSTRACT

Abraham Flexner's 1910 report on medical education is widely regarded as a watershed moment in the history of modern medicine in the US and beyond. Most commentators focus on its administrative and managerial impact, even though Flexner dedicated a sizeable portion of his report to a theoretical account of the kind of medicine that he sought to implement. Close attention to these sections reveals a surprisingly coherent account of medicine that, based on a Deweyan Pragmatist philosophy of science, unites scientific investigator and medical practitioner in a new experimental paradigm of science. This grounding allows Flexner to develop an account that goes beyond a mere epistemic redefinition of medicine, providing the profession with a social, cultural, and ethical identity that avails itself of the extremely wide purview that Dewey granted to modern science. Due to the subsequent narrowing of philosophy of science to a delimited academic subdiscipline, these broad Pragmatist philosophical commitments at the roots of Flexner's scientific medicine have remained a largely unexplored intellectual legacy.

3.
Ecol Evol ; 14(6): e11483, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38826168

ABSTRACT

The anther-smut host-pathogen system has provided extensive insights into the evolutionary ecology of disease resistance, transmission modes, host shifts, pathogen specialization, and disease evolution in metapopulations. It has also led to unexpected insights into sex ratio distorters, sex chromosome evolution, and transposable elements in fungi. In addition, anther-smut disease played a major role in Linnaeus' germ theory and in the correspondence on parasitic castration between Darwin and Becker, one of the first female botanists. Here, we explicitly highlight some of the realities of the process of science, using an unusual autobiographical approach to describe how we came to collaborate on this system in the 1980s. Using perspectives from our different career stages, we present a surprising narrative that could not be deduced from merely reading the published papers. While our work was grounded in previous ecological and evolutionary theory, it was as much the product of empirical failures and intellectual roadblocks as of a progressive scientific method. Our experiences illustrate not only the "human dimension of science" but, more importantly, show that linear sequences of hypothesis testing do not necessarily lead to new study systems and new ideas. We suggest there is a need to re-evaluate the scientific method in ecology and evolution, especially where the challenge is to engage in a productive dialog between natural history and theory.

4.
PNAS Nexus ; 3(4): pgae112, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38560527

ABSTRACT

Scientific, medical, and technological knowledge has transformed our world, but we still poorly understand the nature of scientific methodology. Science textbooks, science dictionaries, and science institutions often state that scientists follow, and should follow, the universal scientific method of testing hypotheses using observation and experimentation. Yet scientific methodology has not been systematically analyzed using large-scale data and scientific methods themselves, as it is viewed as not easily amenable to scientific study. Using data on all major discoveries across science, including all Nobel Prize and major non-Nobel Prize discoveries, we address the question of the extent to which "the scientific method" is actually applied in making science's groundbreaking discoveries and whether we need to expand this central concept of science. This study reveals that 25% of all discoveries since 1900 did not apply the common scientific method (all three features): 6% of discoveries used no observation, 23% used no experimentation, and 17% did not test a hypothesis. Empirical evidence thus challenges the common view of the scientific method; adhering to it as a guiding principle would constrain the development of many new scientific ideas and breakthroughs. Instead, assessing all major discoveries, we identify a general, common feature to which the method of science can be reduced: making all major discoveries has required using sophisticated methods and instruments of science, including statistical methods, particle accelerators, and X-ray methods. Such methods extend our mind and generally make observing, experimenting, and testing hypotheses in science possible, doing so in new ways and ensuring their replicability. This provides a new perspective on the scientific method, embedded in our sophisticated methods and instruments, and suggests that we need to reform and extend the way we view the scientific method and discovery process.

5.
Can J Public Health ; 2024 Mar 13.
Article in English | MEDLINE | ID: mdl-38478215

ABSTRACT

Biostatistics is foundational to public health research and Canada has a history of high impact contributions both in seminal methodological advances and in the rigorous application of methods for the design or analysis of public health studies. In this article, we provide a brief and personal review of selected contributions from Canadian biostatisticians to fields such as survival and life history analysis, sampling, clinical trial methodology, environmental risk assessment, infectious disease epidemiology, and early work on prediction. We also provide a brief look forward at the upcoming needs and future directions of biostatistical research.



6.
Biosystems ; 238: 105179, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38492627

ABSTRACT

Ervin Bauer was the only biologist who recognized that the best way to develop theoretical biology on an equal footing with theoretical physics was to follow the method that has ensured the great successes of modern theoretical physics: the general method of science. Following this method, he succeeded in finding the universal principle of biology. From this principle he managed to derive all the basic equations of biology (those of metabolism, reproduction, growth, and responsiveness) and successfully explained all the fundamental phenomena of life. In this paper, I introduce Bauer's theoretical biology and discuss whether he understood it within the framework of the modern physical worldview or in a broader framework. I point out that the theoretical biology of Ervin Bauer is the first to go beyond the physical worldview, to establish a deeper, biological worldview, and thus to represent a major advance in our understanding of the nature of life, with a significance even greater than that of the Copernican turn. In clarifying the difference between the living and the non-living, it is important to consider the difference between machines and living organisms. It is well known that machines are the manifestation of a dual control: globally, their behavior is controlled by their given structure, while locally, it is governed by the physical laws. Based on Bauer's theoretical biology, it is pointed out that living organisms manifest a three-level causality; the 'additional', biological level corresponds to the autonomous, time-dependent control of their structures.


Subject(s)
Biology , Physics
8.
MethodsX ; 11: 102417, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37881625

ABSTRACT

Today's students face new challenges that demand high levels of intelligence and meta-thinking skills. Science-based educational pedagogies like STEM, the 5E's, and discovery-based education have earned a strong reputation for nurturing children's reasoning and critical thinking skills. However, they need to become more open and conceptual in order to better prepare students for the challenges of modern life. Philosophical and scientific-based educational models currently dominate the educational arena, and both have inherent values that would benefit learners. It is therefore useful to examine both methods to develop a teaching framework that fosters higher-order thinking, metacognition, and problem-solving skills. The objective of this paper is to delve into the essence of the philosophy for children (P4C) method in comparison with the scientific method and their impact on students' learning. Using the six reasoning strands, I systematically compare the two models for strengths and similarities. Throughout this comparison, I aim to maintain objectivity by drawing on references and practical experience, avoiding any undue bias in favor of one model over the other. Subsequently, by applying the Trompenaars Hampden-Turner™ dilemma reconciliation model, I propose the Scie-losophy model, which reconciles the two methods for the benefit of the learner and the greater good of society.
•Currently, philosophical and scientific-based education models dominate the educational arena. Both models have inherent values that would benefit learners.
•Adopting one approach above the other will yield less than optimal results. I strongly discourage a saturated educational system in which one of the two methods is used exclusively.
•I propose a model that reconciles the two methods for the benefit of the learner and the interest of society.

10.
Heliyon ; 9(10): e20237, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37842628

ABSTRACT

Few things have impacted our lives as much as science and technology, but how we developed science and civilisation is one of the most challenging questions that has not yet been well explained. Attempting to identify the central driver, leading scientists have highlighted the roles of culture, cooperation, and geography. They thus focus on broad factors that are important basic preconditions but that we cannot directly influence. To better address the question, this paper integrates evidence from evolutionary biology, cognitive science, methodology, archaeology, and anthropology. The paper identifies 9 main preconditions necessary for contemporary science, which include 6 main preconditions for civilisation. Using a kind of quasi-experimental research design, we observe that some cultures (experimental groups) met the preconditions while other cultures (control groups) did not. Among the preconditions, we explain how our mind's evolved methodological abilities (to observe, solve problems, and experiment) have directly enabled acquiring knowledge about the world and collectively developing increasingly sophisticated methods (such as mathematics and more systematic experimentation) that have enabled science and civilisation. We have driven the major revolutions throughout our history - the palaeolithic technological and agricultural revolutions and later the so-called scientific, industrial and digital revolutions - by using our methodological abilities in new ways and developing new methods and tools, i.e. through methodological revolutions. Viewing our methods as the main mechanism through which we have directly developed scientific and technological knowledge, and thus science and civilisation, provides a new framework for understanding science and the history of science.
Viewing humans as homo methodologicus, using an expanding methodological toolbox, provides a nuanced explanation of how we have been directly able to meet our needs, solve problems and develop vast bodies of technological and scientific knowledge. By better understanding the origin and foundations of science, we can better understand their limits and, most importantly, how to push those limits. We can do so especially by addressing the evolved cognitive constraints and biases we face and improving the methods we use.

11.
MethodsX ; 11: 102367, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37732291

ABSTRACT

Big data has launched a new way of producing science and research around the world. Owing to the explosion of data available in scientific databases, combined with recent advances in information technology, researchers now have at their disposal new methods and technologies that facilitate scientific development. Considering the challenges of producing science in a dynamic and complex scenario, the main objective of this article is to present a method, aligned with recently developed tools to support scientific production, based on steps and technologies that help researchers achieve their objectives efficiently and effectively. Applying this method, the researcher can use science mapping and bibliometric techniques with agility, taking advantage of an easy-to-use solution with cloud computing capabilities. By applying the "Scientific Mapping Process", the researcher can generate strategic information for result-oriented scientific production, moving assertively through the main steps of research and boosting scientific discovery in the most diverse fields of investigation.
•The Scientific Mapping Process provides a method and a system to boost scientific development.
•It automates science mapping and bibliometric analysis from scientific datasets.
•It facilitates the researcher's work, increasing assertiveness in scientific production.
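At its core, the kind of science mapping this abstract describes rests on simple co-occurrence counts over bibliographic records. A minimal sketch, assuming toy records in which each paper is reduced to a set of author keywords (the papers and keywords below are invented for illustration):

```python
from collections import Counter
from itertools import combinations

# Toy bibliographic records: each paper reduced to its author keywords.
papers = [
    {"scientific method", "philosophy of science"},
    {"scientific method", "bibliometrics", "science mapping"},
    {"bibliometrics", "science mapping"},
    {"science mapping", "big data"},
]

# Keyword frequencies (node weights in a science map).
freq = Counter(kw for p in papers for kw in p)

# Keyword co-occurrence counts (edge weights in a science map).
edges = Counter(pair for p in papers for pair in combinations(sorted(p), 2))

print(freq["science mapping"])                      # appears in 3 papers
print(edges[("bibliometrics", "science mapping")])  # co-occur in 2 papers
```

In a keyword map, node sizes correspond to `freq` and edge widths to `edges`; dedicated bibliometric tools automate the same counting at database scale.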

12.
An. R. Acad. Nac. Farm. (Internet) ; 89(3): 379-386, Juli-Sep. 2023.
Article in Spanish | IBECS | ID: ibc-226792

ABSTRACT



The gap between predictability and comprehensibility threatens the entire scientific project, because mathematical models of processes, fed by enormous amounts of data of very diverse origin, provide exceptionally precise results but, at the same time, hide the explanation of the processes. The knowledge of "what we know" of ontology is as relevant in science as that of "how we know" and "how much we know" of epistemology. Artificial intelligence (AI) involves the scientific understanding of the mechanisms underlying intelligent thought and behavior, as well as their embodiment in machines trained by their creators to reason in a conventional sense. Its "weak" formulation refers to the use of complex computer programs, designed to complement or assist human reasoning in solving or completing complex problems of calculation, system maintenance, recognition of all types of images, design, analysis of data patterns, etc., many of which would be practically intractable using conventional procedures; but all this without including human sentient or ethical capabilities, which would be the subject of a, for now, non-existent "strong" AI that would equal or even exceed human sentient intelligence. The popularization of "generative" AI, developed to create content (text, images, music, or videos, among many other areas) from previous information, is helping to consolidate the popular misconception that current AI exceeds human-level reasoning, and it exacerbates the risk of transmitting false information and negative stereotypes to people. The language models of artificial intelligence do not work by emulating a biological brain; they are based on the search for logical patterns in large databases drawn from diverse sources, which are not always up to date or purged of falsehoods, errors, or conceptual or factual biases, both involuntary and self-serving. And the AI used in science is not exempt from these limitations and biases.
A particularly sensitive issue is the possibility of using generative AI to write, or even invent, scientific articles that go unnoticed by the peer reviewers of the most prestigious scientific journals in the world, which points to an even deeper problem: peer reviewers often do not have the time to review manuscripts thoroughly for red flags and, in many cases, also lack adequate computing resources and specialized training.


Subject(s)
Humans , Artificial Intelligence/trends , Biological Ontologies , Knowledge , Medicine
13.
J Microbiol Biol Educ ; 24(2)2023 Aug.
Article in English | MEDLINE | ID: mdl-37614897

ABSTRACT

Undergraduate microbiology students are exposed to the theory of the scientific method throughout their coursework, but laboratory course curricula often focus on technical skills rather than fully integrating scientific thinking into the competencies addressed. Here, we have designed a six-session inquiry-based laboratory (IBL) curriculum for an upper-level microbiology laboratory course that fully involves students in the scientific process, using bacterial conjugation as the model system and including both online discussions and in-person laboratory sessions. The student learning objectives focus on the scientific method, experimental design, data analysis, bacterial conjugation mechanisms, and scientific communication. We hypothesized that students would meet these learning objectives after completing the IBL; we tracked student learning with pre- and post-IBL quizzes and assessed the structure of the IBL with the Laboratory Course Assessment Survey. Overall, our results show that this IBL produces positive student learning gains.

14.
Ecol Evol ; 13(7): e10255, 2023 Jul.
Article in English | MEDLINE | ID: mdl-37408635

ABSTRACT

The Structure of Scientific Revolutions by Thomas Kuhn has influenced scientists for decades. It focuses on a progression of science involving periodic, fundamental shifts (revolutions) from one existing paradigm to another. Embedded in this theory is the concept of normal science: scientists work within the confines of established theory, a process often compared to a type of puzzle-solving. This Kuhnian aspect of scientific research has received little attention relative to the much-scrutinized concepts of revolutions and paradigms. We use Kuhn's normal science framework to reflect on the way ecologists practice science. This involves a discussion of how theory dependence influences each step of the scientific method, specifically, how past experiences and existing research frameworks guide the way ecologists acquire knowledge. We illustrate these concepts with ecological examples, including food web structure and the biodiversity crisis, emphasizing that the way one views the world influences how that person engages in scientific research. We conclude with a discussion of how Kuhnian ideas inform ecological research at practical levels, such as influences on grant funding allocation, and we make a renewed call for the inclusion of the philosophical foundations of ecological principles in pedagogy. By studying the processes and traditions of how science is carried out, ecologists can better direct scientific insight to address the world's most pressing environmental problems.

15.
Respir Care ; 68(8): 1180-1185, 2023 08.
Article in English | MEDLINE | ID: mdl-37041024

ABSTRACT

An understanding of the research process is an essential skill for designing a study and developing the research protocol. Poor study design can lead to fatal flaws in research methodology, ultimately resulting in rejection for publication or limiting the reliability of the results. Following the steps of the research process and devising the research question and hypothesis prior to study initiation can avoid common problems encountered with research questions and study design. Formulating the research question is the first step in the research process and provides the foundation for framing the hypothesis. Research questions should be feasible, interesting, novel, ethical, and relevant (FINER). Application of the FINER criteria can assist with ensuring the question is valid and will generate new knowledge that has clinical impact. Utilization of the population, intervention, comparison, and outcome (PICO) format helps to structure the question as well as refine and narrow the focus from a broad topic. The hypothesis is derived from the research question and is used to determine the experiments or interventions that will answer the question. The aim of this paper is to provide guidance for developing research questions and forming a testable hypothesis through application of the FINER criteria and the PICO process.
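The PICO format lends itself to a simple template. A minimal sketch, assuming a hypothetical `PICOQuestion` helper; the class and the example question about prone positioning are invented for illustration, not taken from the article:

```python
from dataclasses import dataclass

@dataclass
class PICOQuestion:
    """One research question structured by its four PICO components."""
    population: str
    intervention: str
    comparison: str
    outcome: str

    def as_question(self) -> str:
        # Render the components as a single answerable question.
        return (f"In {self.population}, does {self.intervention}, "
                f"compared with {self.comparison}, affect {self.outcome}?")

q = PICOQuestion(
    population="adults with ARDS",
    intervention="prone positioning",
    comparison="supine positioning",
    outcome="28-day mortality",
)
print(q.as_question())
```

Filling the four fields forces the narrowing step the abstract describes: a broad topic ("positioning in ARDS") becomes a focused, testable question.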


Subject(s)
Research Design , Humans , Reproducibility of Results
16.
Respir Care ; 68(9): 1309-1313, 2023 09.
Article in English | MEDLINE | ID: mdl-37072162

ABSTRACT

Surveys provide evidence for the social sciences on knowledge, attitudes, and other behaviors and, in health care, help quantify qualitative research and assist in policymaking. A survey-based research project asks questions of individuals and, from the answers, the researcher can generalize the findings from a sample of respondents to a population. This overview can therefore serve as a guide to conducting survey research that provides answers for practitioners, educators, and leaders, but only if the right questions and methods are used. The main advantage of surveys is their economical access to participants online; a major disadvantage is the low response rate in most situations. Online surveys have many limitations that should be anticipated before conducting a survey and then described after the survey is complete. Any conclusions and recommendations are to be supported by evidence in a clear and objective manner. Presenting evidence in a structured format is crucial, but well-developed reporting guidelines are needed for researchers who conduct survey research.


Subject(s)
Research Design , Humans , Surveys and Questionnaires
17.
Synthese ; 201(4): 146, 2023.
Article in English | MEDLINE | ID: mdl-37073305

ABSTRACT

In the autumn of 1959, Arne Naess and J. L. Austin, both pioneers of empirical study in the philosophy of language, discussed their points of agreement and disagreement at a meeting in Oslo. This article considers the fragmentary record that has survived of that meeting, and investigates what light it can shed on the question of why the two philosophers apparently found so little common ground, given their shared commitment to the importance of data in the study of language. Naess and Austin held different views about two significant aspects of the relationship between scientific method and philosophical investigation. The first aspect concerns the nature of experimental data; Naess used the statistical analysis of data collected from non-philosophical informants while Austin advocated deliberation leading to agreement over usage by a few skilled experts. The second aspect relates to their respective attitudes to the role of theory in philosophical inquiry, attitudes which drew on discussions of scientific method, and its relevance to philosophy, from the early decades of the twentieth century. This article traces the evidence for these views on scientific method in Naess's and Austin's respective published work, and in the record of their Oslo meeting. It concludes with a brief overview of opinions about scientific method manifest in the decades since that meeting in various branches of linguistics. These opinions speak to the enduring importance of attitudes to scientific method in relation to our study and understanding of human language.

18.
J Microbiol Biol Educ ; 24(1)2023 Apr.
Article in English | MEDLINE | ID: mdl-37089220

ABSTRACT

Engaging undergraduate biology majors may present challenges for educators disseminating science concepts in standard lecture formats. Moreover, animal behavior courses teaching ethology often require the use of live animals, field excursions, or student-developed projects, which can be time-consuming or require financial investment and may not be well suited to being taught online. Developing in-class activities that allow students to use self-discovery when generating their own observational data, work in groups, and practice hands-on science may therefore ameliorate these challenges for faculty teaching animal behavior content. To this end, I developed a straightforward, engaging in-class activity in which students scan images available on the species-identification platform iNaturalist to generate their own ethograms (catalogs of behaviors) for local state species. Students successfully described behaviors across a variety of animal taxa (reptiles, mammals, birds, and insects) when generating their own ethograms and data, and they actively discussed how this activity furthered their understanding of ethograms, their importance to animal behavior, and how animals allocate time among behaviors. This activity can be modified for use in both introductory and upper-level coursework in organismal biology and can incorporate data analysis, graphing, or presentation skill sets for science majors.
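The data-analysis extension mentioned at the end of the abstract can be as simple as turning an ethogram's observation records into a time budget. A minimal sketch with invented scan-sample data; the behavior categories and counts are illustrative only:

```python
from collections import Counter

# Invented scan-sample records: the behavior noted at each scan interval.
scans = ["foraging", "resting", "foraging", "vigilance", "foraging", "resting"]

# Time budget: proportion of scans spent in each behavior category.
counts = Counter(scans)
budget = {behavior: n / len(scans) for behavior, n in counts.items()}
print(budget)  # e.g. foraging 0.5, resting ~0.33, vigilance ~0.17
```

Students can then graph `budget` as a bar chart and compare time budgets across species or taxa, which exercises the data analysis and presentation skills the activity targets.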

19.
Physiol Biochem Zool ; 96(1): 1-16, 2023.
Article in English | MEDLINE | ID: mdl-36626844

ABSTRACT

Krogh's principle states, "For such a large number of problems there will be some animal of choice, or a few such animals, on which it can be most conveniently studied." The downside of picking a question first and then finding an ideal organism on which to study it is that it will inevitably leave many organisms neglected. Here, we promote the inverse Krogh principle: all organisms are worthy of study. The inverse Krogh principle and the Krogh principle are not opposites. Rather, the inverse Krogh principle emphasizes a different starting point for research: start with a biological unit, such as an organism, clade, or specific organism trait, then seek or create tractable research questions. Even the hardest-to-study species have research questions that can be asked of them: Where does it fall within the tree of life? What resources does it need to survive and reproduce? How does it differ from close relatives? Does it have unique adaptations? The Krogh and inverse Krogh approaches are complementary, and many research programs naturally include both. Other considerations for picking a study species include extreme species, species informative for phylogenetic analyses, and the creation of models when a suitable species does not exist. The inverse Krogh principle also has pitfalls. A scientist who picks the organism first might choose a research question not really suited to the organism, and funding agencies rarely fund organism-centered grant proposals. The inverse Krogh principle does not call for all organisms to receive the same amount of research attention. As knowledge continues to accumulate, some organisms (models) will inevitably have more known about them than others. Rather, it urges a broader search across organismal diversity to find sources of inspiration for research questions and the motivation needed to pursue them.


Subject(s)
Adaptation, Physiological , Animals , Phylogeny , Phenotype
20.
Big Data ; 11(3): 199-214, 2023 06.
Article in English | MEDLINE | ID: mdl-34612727

ABSTRACT

Although confirmatory modeling has dominated much of applied research in medical, business, and behavioral sciences, modeling large data sets with the goal of accurate prediction has become more widely accepted. The current practice for fitting predictive models is guided by heuristic-based modeling frameworks that lead researchers to make a series of often isolated decisions regarding data preparation and cleaning that may result in substandard predictive performance. In this article, we use an experimental design to evaluate the impact of six factors related to data preparation and model selection (techniques for numerical imputation, categorical imputation, encoding, subsampling for unbalanced data, feature selection, and machine learning algorithm) and their interactions on the predictive accuracy of models applied to a large, publicly available heart transplantation database. Our factorial experiment includes 10,800 models evaluated on 5 independent test partitions of the data. Results confirm that some decisions made early in the modeling process interact with later decisions to affect predictive performance; therefore, the current practice of making these decisions independently can negatively affect predictive outcomes. A key result of this case study is to highlight the need for improved rigor in applied predictive research. By using the scientific method to inform predictive modeling, we can work toward a framework for applied predictive modeling and a standard for reproducibility in predictive research.
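A factorial design like the one described can be enumerated directly from its factor levels. A minimal sketch in which the six factor names follow the abstract but the specific levels are invented for illustration, so the toy grid below does not reproduce the study's actual 10,800-model design:

```python
from itertools import product

# Six data-preparation and model-selection factors; levels are illustrative.
factors = {
    "numeric_imputation": ["mean", "median", "knn"],
    "categorical_imputation": ["mode", "constant"],
    "encoding": ["one-hot", "target"],
    "subsampling": ["none", "undersample", "oversample"],
    "feature_selection": ["none", "filter"],
    "algorithm": ["logistic", "random_forest", "gbm"],
}

# Full factorial: one model configuration per combination of factor levels.
grid = [dict(zip(factors, combo)) for combo in product(*factors.values())]
print(len(grid))  # 3*2*2*3*2*3 = 216 configurations
```

Evaluating each configuration on 5 independent test partitions, as the study does, would turn this toy grid into 216 × 5 = 1,080 model fits; interaction effects between early and late pipeline decisions are then estimated from the resulting table of accuracies.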


Subject(s)
Algorithms , Machine Learning , Reproducibility of Results , Databases, Factual