Results 1 - 5 of 5
1.
Interface Focus ; 11(6): 20210018, 2021 Dec 06.
Article in English | MEDLINE | ID: mdl-34956592

ABSTRACT

The race to meet the challenges of the global pandemic has served as a reminder that the existing drug discovery process is expensive, inefficient and slow. A major bottleneck lies in screening the vast number of potential small molecules to shortlist lead compounds for antiviral drug development. New opportunities to accelerate drug discovery lie at the interface between machine learning methods, in this case developed for linear accelerators, and physics-based methods. The two in silico methods each have their own advantages and limitations which, interestingly, complement each other. Here, we present an innovative infrastructural development that combines both approaches to accelerate drug discovery. The scale of the potential resulting workflow is such that it is dependent on supercomputing to achieve extremely high throughput. We have demonstrated the viability of this workflow for the study of inhibitors of four COVID-19 target proteins, and our ability to perform the required large-scale calculations to identify lead antiviral compounds through repurposing on a variety of supercomputers.
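A minimal sketch of the hybrid workflow idea this abstract describes: a cheap machine-learning surrogate ranks a large compound library, and only the top-ranked candidates proceed to expensive physics-based calculations. All function names and scoring details below are illustrative placeholders, not components of the authors' actual infrastructure.

```python
# Two-stage screening sketch: ML surrogate shortlists, physics refines.
# Everything here is a hypothetical stand-in for the real workflow stages.
from dataclasses import dataclass
import random

@dataclass
class Compound:
    smiles: str

def ml_surrogate_score(compound: Compound) -> float:
    """Placeholder for a fast learned predictor (e.g. of docking score)."""
    return random.random()  # stand-in for a trained model's prediction

def physics_binding_free_energy(compound: Compound) -> float:
    """Placeholder for an expensive physics-based calculation."""
    return random.gauss(-8.0, 2.0)  # stand-in value, kcal/mol

def screen(library: list[Compound], shortlist_size: int) -> list[tuple[Compound, float]]:
    # Stage 1: cheap ML scoring of the whole library.
    ranked = sorted(library, key=ml_surrogate_score, reverse=True)
    # Stage 2: costly physics-based refinement of the shortlist only.
    shortlist = ranked[:shortlist_size]
    return sorted(((c, physics_binding_free_energy(c)) for c in shortlist),
                  key=lambda pair: pair[1])

library = [Compound(f"C{i}") for i in range(10_000)]
leads = screen(library, shortlist_size=100)
print(leads[:5])  # most favourable predicted binders
```

The design point is the one the abstract makes: the surrogate makes the full library tractable, while the physics stage supplies accuracy the surrogate lacks, so the two methods complement each other.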

2.
Philos Trans A Math Phys Eng Sci ; 379(2197): 20200067, 2021 May 17.
Article in English | MEDLINE | ID: mdl-33775149

ABSTRACT

With the relentless rise of computer power, there is a widespread expectation that computers can solve the most pressing problems of science, and even more besides. We explore the limits of computational modelling and conclude that, in the domains of science and engineering which are relatively simple and firmly grounded in theory, these methods are indeed powerful. Even so, the availability of code, data and documentation, along with a range of techniques for validation, verification and uncertainty quantification, is essential for building trust in computer-generated findings. When it comes to complex systems in domains of science that are less firmly grounded in theory, notably biology and medicine, to say nothing of the social sciences and humanities, computers can create the illusion of objectivity, not least because the rise of big data and machine learning poses new challenges to reproducibility, while lacking true explanatory power. We also discuss important aspects of the natural world which cannot be addressed by digital means. In the long term, renewed emphasis on analogue methods will be necessary to temper the excessive faith currently placed in digital computation. This article is part of the theme issue 'Reliability and reproducibility in computational science: implementing verification, validation and uncertainty quantification in silico'.
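A minimal sketch of the uncertainty quantification this abstract mentions, assuming a simple Monte Carlo approach: uncertainty in a model input is propagated through to the output, so the prediction is reported with error bars rather than as a single number. The toy model and all figures are illustrative, not from the article.

```python
# Monte Carlo uncertainty propagation sketch with an invented toy model.
import math
import random
import statistics

def model(rate: float, t: float) -> float:
    """Toy deterministic model: exponential growth y = 100 * exp(rate * t)."""
    return 100.0 * math.exp(rate * t)

# Assumed input uncertainty: the growth rate is only known to ~10%.
samples = [model(random.gauss(0.10, 0.01), t=30.0) for _ in range(10_000)]
mean = statistics.mean(samples)
sd = statistics.stdev(samples)
print(f"prediction: {mean:.1f} +/- {sd:.1f}")
```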

3.
J Comput Sci ; 46: 101093, 2020 Oct.
Article in English | MEDLINE | ID: mdl-33312270

ABSTRACT

Many believe that the future of innovation lies in simulation. However, as computers become ever more powerful, so too does the hyperbole used to discuss their potential in modelling across a vast range of domains, from subatomic physics to chemistry, climate science, epidemiology, economics and cosmology. As we are about to enter the era of quantum and exascale computing, machine learning and artificial intelligence have entered the field in a significant way. In this article we give a brief history of simulation, discuss how machine learning can be more powerful if underpinned by deeper mechanistic understanding, outline the potential of exascale and quantum computing, highlight the limits of digital computing - classical and quantum - and distinguish rhetoric from reality in assessing the future of modelling and simulation, in which we believe analogue computing will play an increasingly important role.

4.
Philos Trans A Math Phys Eng Sci ; 374(2080), 2016 Nov 13.
Article in English | MEDLINE | ID: mdl-27698035

ABSTRACT

The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their 'depth' and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote 'blind' big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'.
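A minimal sketch of the curve-fitting point made above: a flexible data-driven fit can be accurate inside its training range yet fail badly outside it, because it encodes no structural knowledge of the underlying system. The "true" process and all settings below are illustrative choices, not from the article.

```python
# Demonstration: interpolation succeeds, extrapolation fails.
import numpy as np

rng = np.random.default_rng(0)

def truth(x):
    """The 'true' process generating the data (unknown to the fitter)."""
    return np.sin(x)

# Training data cover only [0, 2*pi].
x_train = np.linspace(0, 2 * np.pi, 50)
y_train = truth(x_train) + rng.normal(0, 0.05, x_train.size)

# Black-box fit: a high-degree polynomial, standing in for any flexible
# learner that merely fits a curve to the observed data.
coeffs = np.polyfit(x_train, y_train, deg=9)

for x in (np.pi, 3 * np.pi):  # inside vs outside the training range
    pred = np.polyval(coeffs, x)
    print(f"x={x:5.2f}  truth={truth(x):+.2f}  fit={pred:+.2f}")
# Inside the range the fit tracks the truth; at 3*pi it is wildly wrong,
# because nothing in the fit encodes the periodic structure of the system.
```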


Subject(s)
Datasets as Topic/trends , Information Storage and Retrieval/methods , Models, Theoretical , User-Computer Interface , Computer Simulation , Database Management Systems
5.
Neuron ; 76(6): 1225-37, 2012 Dec 20.
Article in English | MEDLINE | ID: mdl-23259956

ABSTRACT

What makes one person more intellectually able than another? Can the entire distribution of human intelligence be accounted for by just one general factor? Is intelligence supported by a single neural system? Here, we provide a perspective on human intelligence that takes into account how general abilities or "factors" reflect the functional organization of the brain. By comparing factor models of individual differences in performance with factor models of brain functional organization, we demonstrate that different components of intelligence have their analogs in distinct brain networks. Using simulations based on neuroimaging data, we show that the higher-order factor "g" is accounted for by cognitive tasks corecruiting multiple networks. Finally, we confirm the independence of these components of intelligence by dissociating them using questionnaire variables. We propose that intelligence is an emergent property of anatomically distinct cognitive systems, each of which has its own capacity.
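A minimal sketch of the simulation logic this abstract describes, under invented assumptions: when cognitive tasks co-recruit several statistically independent "networks", task scores correlate across people and a dominant higher-order factor emerges from the correlation matrix, even though no single general ability was built into the model. The loadings, noise levels and sample sizes are illustrative only, not the authors' neuroimaging-derived values.

```python
# Spurious "g" from task scores that mix independent network capacities.
import numpy as np

rng = np.random.default_rng(1)
n_people, n_networks = 1000, 3

# Independent network capacities: no common factor by construction.
networks = rng.normal(size=(n_people, n_networks))

# Each task recruits (loads on) more than one network, plus task noise.
loadings = np.array([
    [0.8, 0.3, 0.0],  # task 1: mostly network A, some B
    [0.6, 0.5, 0.2],
    [0.2, 0.7, 0.4],
    [0.0, 0.4, 0.8],
    [0.3, 0.2, 0.7],  # task 5: mostly network C
])
scores = networks @ loadings.T + rng.normal(0, 0.5, (n_people, loadings.shape[0]))

# First eigenvalue of the task correlation matrix: the would-be "g" factor.
eigvals = np.linalg.eigvalsh(np.corrcoef(scores, rowvar=False))[::-1]
print("variance explained by first factor:",
      round(eigvals[0] / eigvals.sum(), 2))
```

Because the tasks share networks, a single factor explains much of the variance, illustrating how a higher-order "g" can emerge from anatomically distinct systems with independent capacities.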


Subject(s)
Brain Mapping , Brain/physiology , Intelligence/physiology , Nerve Net/physiology , Adolescent , Adult , Aged , Cohort Studies , Cortical Synchronization/physiology , Female , Humans , Magnetic Resonance Imaging , Male , Middle Aged , Models, Neurological , Models, Psychological , Neural Pathways/physiology , Pilot Projects , Young Adult