1.
IEEE Trans Pattern Anal Mach Intell ; 41(11): 2756-2769, 2019 Nov.
Article in English | MEDLINE | ID: mdl-30130177

ABSTRACT

Ticker is a probabilistic stereophonic single-switch text entry method for visually-impaired users with motor disabilities who rely on single-switch scanning systems to communicate. Such scanning systems are sensitive to a variety of noise sources, which are inevitably introduced in practical use of single-switch systems. Ticker uses a novel interaction model based on stereophonic sound coupled with statistical models for robust inference of the user's intended text in the presence of noise. As a consequence of its design, Ticker is resilient to noise and therefore a practical solution for single-switch scanning systems. Ticker's performance is validated using a combination of simulations and empirical user studies.


Subject(s)
Communication Aids for Disabled , Software , Visually Impaired Persons , Acoustic Stimulation , Algorithms , Bayes Theorem , Computer Simulation , Humans , Motor Disorders , User-Computer Interface
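
The noise-robust inference described above can be illustrated with a small sketch. Everything below is invented for illustration (the slot timing, Gaussian jitter model, and false-click handling are assumptions, not Ticker's published model): letters are announced at known times during a scanning pass, and observed click times update a posterior over the alphabet.

```python
# A minimal sketch, assuming letters announced at fixed slots and clicks
# observed with Gaussian timing jitter plus a uniform false-click floor.
import math

ALPHABET = "abcdefghijklmnopqrstuvwxyz"
SLOT = 0.5          # assumed seconds between letter announcements
SIGMA = 0.15        # assumed std dev of the user's click-timing jitter
FALSE_CLICK = 0.05  # assumed probability mass for spurious clicks

def gaussian(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def posterior(click_times, prior=None):
    """Posterior over letters after observing one or more clicks."""
    probs = dict(prior) if prior else {c: 1.0 / len(ALPHABET) for c in ALPHABET}
    pass_length = len(ALPHABET) * SLOT
    for t in click_times:
        for i, c in enumerate(ALPHABET):
            expected = i * SLOT  # time this letter is announced within a pass
            like = (1 - FALSE_CLICK) * gaussian(t % pass_length, expected, SIGMA)
            probs[c] *= like + FALSE_CLICK / pass_length
        z = sum(probs.values())
        probs = {c: p / z for c, p in probs.items()}
    return probs

# Two noisy clicks near the announcement time of 'e' (index 4, t = 2.0 s):
post = posterior([2.08, 1.94])
print(max(post, key=post.get))  # expected: 'e'
```

Because every letter retains nonzero probability, a spurious click degrades the posterior gracefully rather than forcing a wrong selection, which is the sense in which such a model is resilient to noise.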
2.
Proc Biol Sci ; 284(1862)2017 Sep 13.
Article in English | MEDLINE | ID: mdl-28904138

ABSTRACT

Exposure to ionizing radiation is ubiquitous, and it is well established that moderate and high doses cause ill-health and can be lethal. The health effects of low doses or low dose-rates of ionizing radiation are not so clear. This paper describes a project which sets out to summarize, as a restatement, the natural science evidence base concerning the human health effects of exposure to low-level ionizing radiation. A novel feature, compared to other reviews, is that a series of statements are listed and categorized according to the nature and strength of the evidence that underpins them. The purpose of this restatement is to provide a concise entrée into this vibrant field, pointing the interested reader deeper into the literature when more detail is needed. It is not our purpose to reach conclusions on whether the legal limits on radiation exposures are too high, too low or just right. Our aim is to provide an introduction so that non-specialist individuals in this area (be they policy-makers, disputers of policy, health professionals or students) have a straightforward place to start. The summary restatement of the evidence and an extensively annotated bibliography are provided as appendices in the electronic supplementary material.


Subject(s)
Radiation Exposure/adverse effects , Radiation, Ionizing , Humans
4.
Philos Trans A Math Phys Eng Sci ; 371(1996): 20110431, 2013 Aug 13.
Article in English | MEDLINE | ID: mdl-23816908

ABSTRACT

Taking the UK as a case study, this paper describes current energy use and a range of sustainable energy options for the future, including solar power and other renewables. I focus on the area involved in collecting, converting and delivering sustainable energy, looking in particular detail at the potential role of solar power. Britain consumes energy at a rate of about 5000 watts per person, and its population density is about 250 people per square kilometre. If we multiply the per capita energy consumption by the population density, then we obtain the average primary energy consumption per unit area, which for the UK is 1.25 watts per square metre. This areal power density is uncomfortably similar to the average power density that could be supplied by many renewables: the gravitational potential energy of rainfall in the Scottish highlands has a raw power per unit area of roughly 0.24 watts per square metre; energy crops in Europe deliver about 0.5 watts per square metre; wind farms deliver roughly 2.5 watts per square metre; solar photovoltaic farms in Bavaria, Germany, and Vermont, USA, deliver 4 watts per square metre; in sunnier locations, solar photovoltaic farms can deliver 10 watts per square metre; concentrating solar power stations in deserts might deliver 20 watts per square metre. In a decarbonized world that is renewable-powered, the land area required to maintain today's British energy consumption would have to be similar to the area of Britain. Several other high-density, high-consuming countries are in the same boat as Britain, and many other countries are rushing to join us. Decarbonizing such countries will only be possible through some combination of the following options: the embracing of country-sized renewable power-generation facilities; large-scale energy imports from country-sized renewable facilities in other countries; population reduction; radical efficiency improvements and lifestyle changes; and the growth of non-renewable low-carbon sources, namely 'clean' coal, 'clean' gas and nuclear power. If solar is to play a large role in the future energy system, then we need new methods for energy storage; very-large-scale solar either would need to be combined with electricity stores or it would need to serve a large flexible demand for energy that effectively stores useful energy in the form of chemicals, heat, or cold.
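
The abstract's central arithmetic is easy to verify. A short script using only the figures stated above:

```python
# Reproducing the areal power-density arithmetic from the stated figures.
per_capita_power = 5000        # W per person (UK primary energy, as stated)
population_density = 250       # people per km^2
m2_per_km2 = 1_000_000

demand_density = per_capita_power * population_density / m2_per_km2
print(f"UK demand: {demand_density:.2f} W/m^2")   # 1.25 W/m^2

# Fraction of national land area needed if demand were met entirely by
# one source, at the per-unit-area supply figures quoted in the abstract:
for source, supply in [("energy crops", 0.5), ("wind", 2.5),
                       ("PV (Bavaria/Vermont)", 4.0), ("desert CSP", 20.0)]:
    print(f"{source}: {demand_density / supply:.0%} of land area")
```

At wind's 2.5 watts per square metre, for example, matching demand would occupy half the country, which is the sense in which the required facilities are "country-sized".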

5.
Philos Trans A Math Phys Eng Sci ; 371(1986): 20110560, 2013 Mar 13.
Article in English | MEDLINE | ID: mdl-23359732

ABSTRACT

While the main thrust of the Discussion Meeting Issue on 'Material efficiency: providing material services with less material production' was to explore ways in which society's net demand for materials could be reduced, this review examines the possibility of converting industrial energy demand to electricity, and switching to clean electricity sources. This review quantifies the scale of infrastructure required in the UK, focusing on wind and nuclear power as the clean electricity sources, and sets these requirements in the context of the decarbonization of the whole energy system using wind, biomass, solar power in deserts and nuclear options. The transition of industry to a clean low-carbon electricity supply, although technically possible with several different technologies, would have very significant infrastructure requirements.

6.
PLoS One ; 4(10): e7481, 2009 Oct 22.
Article in English | MEDLINE | ID: mdl-19847300

ABSTRACT

Selection methods that require only a single-switch input, such as a button click or blink, are potentially useful for individuals with motor impairments, mobile technology users, and individuals wishing to transmit information securely. We present a single-switch selection method, "Nomon," that is general and efficient. Existing single-switch selection methods require selectable options to be arranged in ways that limit potential applications. By contrast, traditional operating systems, web browsers, and free-form applications (such as drawing) place options at arbitrary points on the screen. Nomon, however, has the flexibility to select any point on a screen. Nomon adapts automatically to an individual's clicking ability; it allows a person who clicks precisely to make a selection quickly and allows a person who clicks imprecisely more time to make a selection without error. Nomon reaps gains in information rate by allowing the specification of beliefs (priors) about option selection probabilities and by avoiding tree-based selection schemes in favor of direct (posterior) inference. We have developed both a Nomon-based writing application and a drawing application. To evaluate Nomon's performance, we compared the writing application with a popular existing method for single-switch writing (row-column scanning). Novice users wrote 35% faster with the Nomon interface than with the scanning interface. An experienced user (author TB, with 10 hours practice) wrote at speeds of 9.3 words per minute with Nomon, using 1.2 clicks per character and making no errors in the final text.


Subject(s)
Artificial Intelligence , Computer Peripherals , User-Computer Interface , Adult , Algorithms , Communication Aids for Disabled , Computers , Equipment Design , Female , Humans , Male , Models, Statistical , Neural Networks, Computer , Reproducibility of Results , Software
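
A minimal sketch of the timing-based posterior inference the abstract describes, with invented details (the phase layout, Gaussian jitter model, and uniform prior are assumptions, not Nomon's published model):

```python
# Each option's clock passes "noon" at a distinct phase; the user clicks at
# their option's noon, and clicks land with learned Gaussian jitter around it.
import numpy as np

def nomon_posterior(click_phases, noons, sigma, prior):
    """Posterior over options after a sequence of clicks (phases in [0, 1))."""
    log_post = np.log(prior)
    for phase in click_phases:
        # circular distance between the click and each option's noon
        d = np.abs(phase - noons)
        d = np.minimum(d, 1.0 - d)
        log_post += -0.5 * (d / sigma) ** 2
    post = np.exp(log_post - log_post.max())
    return post / post.sum()

noons = np.linspace(0.0, 1.0, 8, endpoint=False)   # 8 selectable options
prior = np.full(8, 1.0 / 8)                        # could encode a language model
post = nomon_posterior([0.26, 0.24], noons, sigma=0.03, prior=prior)
print(post.argmax())  # -> 2 (the option whose noon is at phase 0.25)
```

A precise clicker (small learned sigma) concentrates the posterior after a single click; an imprecise clicker simply needs more clicks before any option dominates, which is how the adaptivity described above plays out. Replacing the uniform prior with language-model probabilities is where the information-rate gains come from.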
7.
Bioinformatics ; 25(12): i374-82, 2009 Jun 15.
Article in English | MEDLINE | ID: mdl-19478012

ABSTRACT

MOTIVATION: G-quadruplexes are stable four-stranded guanine-rich structures that can form in DNA and RNA. They are an important component of human telomeres and play a role in the regulation of transcription and translation. The biological significance of a G-quadruplex is crucially linked with its thermodynamic stability. Hence the prediction of G-quadruplex stability is of vital interest.

RESULTS: In this article, we present a novel Bayesian prediction framework based on Gaussian process regression to determine the thermodynamic stability of previously unmeasured G-quadruplexes from the sequence information alone. We benchmark our approach on a large G-quadruplex dataset and compare our method to alternative approaches. Furthermore, we propose an active learning procedure which can be used to iteratively acquire data in an optimal fashion. Lastly, we demonstrate the usefulness of our procedure on a genome-wide study of quadruplexes in the human genome.

AVAILABILITY: A data table with the training sequences is available as supplementary material. Source code is available online at http://www.inference.phy.cam.ac.uk/os252/projects/quadruplexes.

SUPPLEMENTARY INFORMATION: Supplementary data are available at Bioinformatics online.


Subject(s)
Computational Biology/methods , G-Quadruplexes , Bayes Theorem , DNA/chemistry , Databases, Genetic , Genome, Human , Humans , RNA/chemistry , Telomere/chemistry
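
A hedged sketch of the general recipe, not the paper's model: the kernel, one-hot features, sequences, and stability values below are all invented for illustration, with scikit-learn standing in for the authors' implementation.

```python
# Gaussian process regression from one-hot sequence features to a
# hypothetical stability score; training data here are toy values.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def one_hot(seq, length=12):
    idx = {"A": 0, "C": 1, "G": 2, "T": 3}
    x = np.zeros((length, 4))
    for i, base in enumerate(seq[:length]):
        x[i, idx[base]] = 1.0
    return x.ravel()

# Toy training set: (sequence, invented stability score)
train = [("GGGTTAGGGTTA", 55.0), ("GGGATCGGGATC", 52.0),
         ("GGTTTAGGTTTA", 41.0), ("GATTTAGATTTA", 30.0)]
X = np.array([one_hot(s) for s, _ in train])
y = np.array([t for _, t in train])

gp = GaussianProcessRegressor(kernel=RBF(length_scale=2.0) + WhiteKernel(1e-2),
                              normalize_y=True)
gp.fit(X, y)
mean, std = gp.predict(np.array([one_hot("GGGTTTGGGTTT")]), return_std=True)
print(f"predicted stability: {mean[0]:.1f} +/- {std[0]:.1f}")
# Active learning in this setting: measure next the unmeasured candidate
# with the largest predictive std, i.e. where the model is least certain.
```

The predictive standard deviation is what makes the active-learning loop mentioned in the abstract possible: the GP knows where it is uncertain.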
8.
IEEE Trans Biomed Eng ; 55(9): 2143-51, 2008 Sep.
Article in English | MEDLINE | ID: mdl-18713683

ABSTRACT

Heart rate data collected during nonlaboratory conditions present several data-modeling challenges. First, the noise in such data is often poorly described by a simple Gaussian; it has outliers and errors come in bursts. Second, in large-scale studies the ECG waveform is usually not recorded in full, so one has to deal with missing information. In this paper, we propose a robust postprocessing model for such applications. Our model to infer the latent heart rate time series consists of two main components: unsupervised clustering followed by Bayesian regression. The clustering component uses auxiliary data to learn the structure of outliers and noise bursts. The subsequent Gaussian process regression model uses the cluster assignments as prior information and incorporates expert knowledge about the physiology of the heart. We apply the method to a wide range of heart rate data and obtain convincing predictions along with uncertainty estimates. In a quantitative comparison with existing postprocessing methodology, our model achieves a significant increase in performance.


Subject(s)
Arrhythmias, Cardiac/diagnosis , Arrhythmias, Cardiac/physiopathology , Artifacts , Electrocardiography/methods , Heart Rate , Data Interpretation, Statistical , Humans , Normal Distribution , Regression Analysis , Reproducibility of Results , Sensitivity and Specificity
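
The two-stage structure (clustering, then GP regression) can be sketched on synthetic data. The mixture model, kernel, and simulated artefact burst below are assumptions for illustration, not the paper's model, which additionally exploits auxiliary data and expert knowledge of cardiac physiology.

```python
# Stage 1: flag outlier bursts by unsupervised clustering.
# Stage 2: fit a GP to the clean points for a smoothed estimate + uncertainty.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 200)                       # minutes
hr = 70 + 8 * np.sin(0.8 * t) + rng.normal(0, 1.0, t.size)
hr[60:70] += rng.normal(40, 10, 10)               # a burst of artefacts

# Clustering separates clean beats from the artefact burst.
gmm = GaussianMixture(n_components=2, random_state=0).fit(hr.reshape(-1, 1))
labels = gmm.predict(hr.reshape(-1, 1))
clean = labels == np.argmin(gmm.means_.ravel())   # lower-mean cluster = clean

# GP regression over the clean points recovers the latent heart rate.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0) + WhiteKernel(1.0),
                              normalize_y=True)
gp.fit(t[clean].reshape(-1, 1), hr[clean])
mean, std = gp.predict(t.reshape(-1, 1), return_std=True)
print(f"estimate at burst centre: {mean[65]:.1f} bpm (+/- {std[65]:.1f})")
```

Inside the excised burst the GP interpolates from the surrounding clean data and reports wider uncertainty, which is the behaviour a robust postprocessor wants: missing or corrupted stretches yield honest error bars rather than spikes.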
9.
IEEE Trans Neural Syst Rehabil Eng ; 14(2): 244-6, 2006 Jun.
Article in English | MEDLINE | ID: mdl-16792304

ABSTRACT

DASHER is a human-computer interface for entering text using continuous or discrete gestures. Through its use of an internal language model, DASHER efficiently converts bits received from the user into text, and has been shown to be a competitive alternative to existing text-entry methods in situations where an ordinary keyboard cannot be used. We propose that DASHER would be well-matched to the low bit-rate, noisy output obtained from brain-computer interfaces (BCIs), and discuss the issues surrounding the use of DASHER with BCI systems.


Subject(s)
Brain/physiology , Communication Aids for Disabled , Computer Peripherals , Electroencephalography/methods , Software , Therapy, Computer-Assisted/methods , User-Computer Interface , Writing , Humans , Man-Machine Systems
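
The claim that a language model lets DASHER convert bits into text efficiently is, at its core, arithmetic coding run in reverse. A toy sketch (the four-letter alphabet and fixed unigram model are invented; real DASHER uses an adaptive context model and continuous gestures):

```python
# Text entry as arithmetic decoding: input bits pick a point in [0, 1),
# and that point falls through nested intervals sized by a language model.
from itertools import accumulate

ALPHABET = "abc_"  # tiny alphabet; '_' is space

def lm(context):
    """Stand-in language model: fixed unigram probabilities."""
    return [0.4, 0.3, 0.2, 0.1]

def decode(bits, n_chars):
    lo, hi = 0.0, 1.0
    point = sum(b * 2.0 ** -(i + 1) for i, b in enumerate(bits))
    text = ""
    for _ in range(n_chars):
        cum = [0.0] + list(accumulate(lm(text)))
        width = hi - lo
        for c, a, b in zip(ALPHABET, cum, cum[1:]):
            if lo + a * width <= point < lo + b * width:
                lo, hi = lo + a * width, lo + b * width
                text += c
                break
    return text

print(decode([0, 1, 1, 0, 1, 0, 1, 1], 3))  # -> 'baa'
```

Likely characters own wide intervals and therefore cost few bits to select; the better the language model, the more text each noisy, low-rate BCI bit buys, which is the matching argued for above.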
10.
Oncogene ; 23(39): 6677-83, 2004 Aug 26.
Article in English | MEDLINE | ID: mdl-15247901

ABSTRACT

Gene microarray technology is highly effective in screening for differential gene expression and has hence become a popular tool in the molecular investigation of cancer. When applied to tumours, molecular characteristics may be correlated with clinical features such as response to chemotherapy. Exploitation of the huge amount of data generated by microarrays is difficult, however, and constitutes a major challenge in the advancement of this methodology. Independent component analysis (ICA), a modern statistical method, allows us to better understand data in such complex and noisy measurement environments. The technique has the potential to significantly increase the quality of the resulting data and improve the biological validity of subsequent analysis. We performed microarray experiments on 31 postmenopausal endometrial biopsies, comprising 11 benign and 20 malignant samples. We compared ICA to the established methods of principal component analysis (PCA), Cyber-T, and SAM. We show that ICA generated patterns that clearly characterized the malignant samples studied, in contrast to PCA. Moreover, ICA improved the biological validity of the genes identified as differentially expressed in endometrial carcinoma, compared to those found by Cyber-T and SAM. In particular, several genes involved in lipid metabolism that are differentially expressed in endometrial carcinoma were only found using this method. This report highlights the potential of ICA in the analysis of microarray data.


Subject(s)
Endometrial Neoplasms/genetics , Oligonucleotide Array Sequence Analysis , Female , Gene Expression Regulation, Neoplastic , Humans
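
On synthetic data, the kind of analysis described above looks roughly like this. FastICA is a convenient stand-in (the paper does not specify this implementation), and the data, component count, and "malignancy" pattern are invented:

```python
# Decompose a samples x genes expression matrix into independent components
# and find the component that best characterizes the malignant group.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
n_benign, n_malignant, n_genes = 11, 20, 500      # group sizes as in the study
X = rng.normal(0, 1, (n_benign + n_malignant, n_genes))
signature = rng.laplace(0, 1, n_genes)            # a sparse expression pattern
X[n_benign:] += 1.5 * signature                   # present only in tumours

ica = FastICA(n_components=5, random_state=0)
A = ica.fit_transform(X)                          # per-sample activations
# The component whose activation best separates benign from malignant:
sep = [abs(A[:n_benign, k].mean() - A[n_benign:, k].mean()) for k in range(5)]
k = int(np.argmax(sep))
print("most discriminative component:", k)
print("gene loadings for that component:", ica.components_[k].shape)  # (500,)
```

The gene loadings of the discriminative component are the candidate differentially expressed genes; the non-Gaussianity assumption behind ICA is what lets it isolate sparse, group-specific patterns that variance-driven PCA can smear across components.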
11.
Comp Funct Genomics ; 4(3): 300-17, 2003.
Article in English | MEDLINE | ID: mdl-18629283

ABSTRACT

DNA microarrays allow the measurement of transcript abundances for thousands of genes in parallel. Most commonly, a particular sample of interest is studied next to a neutral control, examining relative changes (ratios). Independent component analysis (ICA) is a promising modern method for the analysis of such experiments. The condition of ICA algorithms can, however, depend on the characteristics of the data examined, making algorithm properties such as robustness specific to the given application domain. To address the lack of studies examining the robustness of ICA applied to microarray measurements, we report on the stability of variational Bayesian ICA in this domain. Microarray data are usually preprocessed and transformed. Hence we first examined alternative transforms and data selections for the smallest modelling reconstruction errors. Log-ratio data are reconstructed better than non-transformed ratio data by our linear model with a Gaussian error term. To compare ICA results we must allow for ICA invariance under rescaling and permutation of the extracted signatures, which hold the loadings of the original variables (gene transcript ratios) on particular latent variables. We introduced a method to optimally match corresponding signatures between sets of results. The stability of signatures was then examined after (1) repetition of the same analysis run with different random number generator seeds, and (2) repetition of the analysis with partial data sets. The effects of both dropping a proportion of the gene transcript ratios and dropping measurements for several samples have been studied. In summary, signatures with a high relative data power were very likely to be retained, resulting in an overall stability of the analyses. Our analysis of 63 yeast wild-type vs. wild-type experiments, moreover, yielded 10 reliably identified signatures, demonstrating that the variance observed is not just noise.
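
The matching step mentioned above, pairing signatures across runs despite permutation and rescaling, can be sketched as an assignment problem. Treating absolute correlation as the similarity and solving with the Hungarian algorithm are assumptions about the criterion, not necessarily the paper's exact method:

```python
# Match signatures between two ICA runs up to permutation, sign, and scale.
import numpy as np
from scipy.optimize import linear_sum_assignment

def match_signatures(S1, S2):
    """S1, S2: (n_signatures, n_genes) from two runs. Returns pairing."""
    # |correlation| is invariant to sign flips and rescaling of signatures
    n = S1.shape[0]
    C = np.abs(np.corrcoef(S1, S2)[:n, n:])
    rows, cols = linear_sum_assignment(-C)          # maximize total |corr|
    return [(int(r), int(c)) for r, c in zip(rows, cols)], C[rows, cols]

rng = np.random.default_rng(2)
S1 = rng.normal(size=(4, 100))
perm, scale = [2, 0, 3, 1], np.array([-1.5, 0.7, 2.0, -0.3])
S2 = scale[:, None] * S1[perm] + rng.normal(0, 0.05, (4, 100))  # shuffled run
pairs, corrs = match_signatures(S1, S2)
print(pairs)             # recovers the pairing: [(0, 1), (1, 3), (2, 0), (3, 2)]
print(corrs.round(2))    # near 1.0 for all matched pairs
```

With signatures aligned this way, stability can be read off directly: a signature is retained across seeds or data subsets when its matched counterpart keeps a high correlation.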

12.
Bioinformatics ; 18(12): 1617-24, 2002 Dec.
Article in English | MEDLINE | ID: mdl-12490446

ABSTRACT

MOTIVATION: A number of algorithms and analytical models have been employed to reduce the multidimensional complexity of DNA array data and attempt to extract some meaningful interpretation of the results. These include clustering, principal components analysis, self-organizing maps, and support vector machine analysis. Each method assumes an implicit model for the data, many of which separate genes into distinct clusters defined by similar expression profiles in the samples tested. A point of concern is that many genes may be involved in a number of distinct behaviours, and should therefore be modelled to fit into as many separate clusters as detected in the multidimensional gene expression space. The analysis of gene expression data using a decomposition model that is independent of the observer involved would be highly beneficial to improve standard and reproducible classification of clinical and research samples.

RESULTS: We present a variational independent component analysis (ICA) method for reducing high dimensional DNA array data to a smaller set of latent variables, each associated with a gene signature. We present the results of applying the method to data from an ovarian cancer study, revealing a number of tissue type-specific and tissue type-independent gene signatures present in varying amounts among the samples surveyed. The observer independent results of such molecular analysis of biological samples could help identify patients who would benefit from different treatment strategies. We further explore the application of the model to similar high-throughput studies.


Subject(s)
Algorithms , Gene Expression Profiling/methods , Gene Expression/genetics , Models, Genetic , Oligonucleotide Array Sequence Analysis/methods , Cluster Analysis , Cystadenoma/classification , Cystadenoma/genetics , Female , Gene Expression Regulation/genetics , Humans , Models, Statistical , Observer Variation , Ovarian Neoplasms/classification , Ovarian Neoplasms/genetics , Quality Control , Reference Values , Reproducibility of Results , Sensitivity and Specificity , Transcription, Genetic/genetics
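
A toy illustration of the tissue-type-specific versus tissue-type-independent distinction drawn above, with synthetic data and FastICA standing in for the paper's variational method; the group test and threshold are assumptions:

```python
# Label each extracted signature as tissue-type-specific or -independent
# by testing its per-sample activation across two hypothetical tissue groups.
import numpy as np
from scipy.stats import mannwhitneyu
from sklearn.decomposition import FastICA

rng = np.random.default_rng(3)
groups = np.array([0] * 12 + [1] * 12)            # two invented tissue types
X = rng.normal(0, 1, (24, 300))
X[groups == 1, :50] += 2.0                        # one tissue-specific pattern

A = FastICA(n_components=4, random_state=0).fit_transform(X)
for k in range(A.shape[1]):
    p = mannwhitneyu(A[groups == 0, k], A[groups == 1, k]).pvalue
    kind = "tissue-type-specific" if p < 0.01 else "tissue-type-independent"
    print(f"signature {k}: p={p:.3g} -> {kind}")
```

Because the decomposition and the test require no manual cluster definitions, the labelling is observer-independent in the sense the abstract argues for.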