1.
J Synchrotron Radiat ; 31(Pt 2): 394-398, 2024 Mar 01.
Article in English | MEDLINE | ID: mdl-38306298

ABSTRACT

xrdPlanner is a software package designed to aid in the planning and preparation of powder X-ray diffraction and total scattering beam times at synchrotron facilities. Many modern beamlines provide a flexible experimental setup and may have several different detectors available; combined with a range of available X-ray energies, this often makes it difficult for users to explore the parameter space relevant to a given experiment before the scheduled beam time. xrdPlanner was developed as a fast and straightforward tool that lets users visualize the part of reciprocal space accessible to their experiment for a given combination of photon energy and detector geometry. Planning and communicating the necessary geometry not only saves time but also helps the beamline staff prepare for and accommodate an experiment. The program is tailored toward powder X-ray diffraction and total scattering experiments but may also be useful for other experiments that rely on an area detector and for which detector placement and achievable momentum-transfer range are important experimental parameters.
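
A rough sketch of the geometry question such a tool answers (not xrdPlanner's actual code): the maximum momentum transfer reachable with a flat, on-axis area detector follows from the photon energy and the detector placement. The beam energy, detector distance, and half-width below are hypothetical values.

```python
# Sketch: maximum accessible momentum transfer for a flat on-axis area
# detector. All numeric inputs are hypothetical, for illustration only.
import math

def q_max(energy_kev: float, distance_mm: float, r_max_mm: float) -> float:
    """Maximum momentum transfer q = 4*pi*sin(theta)/lambda in 1/Angstrom."""
    wavelength = 12.398 / energy_kev                # keV -> Angstrom
    two_theta = math.atan(r_max_mm / distance_mm)   # scattering angle at detector edge
    return 4.0 * math.pi * math.sin(two_theta / 2.0) / wavelength

# Example: 35 keV beam, detector 200 mm downstream, 150 mm center-to-edge
print(f"q_max = {q_max(35.0, 200.0, 150.0):.2f} 1/A")
```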

2.
Chimia (Aarau) ; 77(1-2): 7-16, 2023 Feb 22.
Article in English | MEDLINE | ID: mdl-38047848

ABSTRACT

Accelerating R&D is essential to address some of the challenges humanity is currently facing, such as achieving the global sustainability goals. The Edisonian trial-and-error approach still prevalent in today's R&D labs means it takes up to two decades of fundamental and applied research for new materials to reach the market. Turning this situation around calls for strategies to upgrade R&D and expedite innovation. By conducting smart experiment planning that is data-driven and guided by AI/ML, researchers can search more efficiently through the complex, often constrained, space of possible experiments and reach the global optima much faster than with current approaches. Moreover, with digitized data management, researchers will be able to maximize the utility of their data in the short and long term with the aid of statistics, ML and visualization tools. In what follows, we describe a framework and lay out the key technologies to accelerate R&D and optimize experiment planning.
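
A toy sketch of such data-driven sequential experiment planning (a minimal illustration, not the authors' framework): fit a simple surrogate model to past results and propose the next experiment where the surrogate predicts the best outcome. Real systems use richer surrogates and acquisition functions that balance exploration against exploitation; this greedy loop only shows the basic shape.

```python
# Toy sketch of surrogate-guided experiment selection. The objective
# function and noise level are invented placeholders.
import numpy as np

rng = np.random.default_rng(0)

def run_experiment(x):
    # hidden "true" response plus measurement noise (illustrative)
    return -(x - 0.7) ** 2 + 0.05 * rng.standard_normal()

X = list(rng.uniform(0, 1, 3))            # three initial experiments
y = [run_experiment(x) for x in X]

for _ in range(5):                        # sequential planning loop
    coeffs = np.polyfit(X, y, 2)          # quadratic surrogate model
    grid = np.linspace(0, 1, 201)         # candidate experiment settings
    x_next = grid[np.argmax(np.polyval(coeffs, grid))]
    X.append(float(x_next))
    y.append(run_experiment(x_next))

print(f"best observed setting: x = {X[int(np.argmax(y))]:.2f}")
```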

3.
Materials (Basel) ; 16(11)2023 May 25.
Article in English | MEDLINE | ID: mdl-37297105

ABSTRACT

This article presents the results of a study of the physical and mechanical properties of fine-grained fly ash concrete with combined reinforcement by steel and basalt fibers. The main studies were conducted using mathematical experiment planning, which made it possible to algorithmize the experiments with respect to both the amount of experimental work and the statistical requirements. Quantitative dependences were obtained that characterize the effect of the content of cement, fly ash binder, steel fiber, and basalt fiber on the compressive strength and tensile splitting strength of the fiber-reinforced concrete. It is shown that the use of fiber can increase the efficiency factor of dispersed reinforcement (the ratio of tensile splitting strength to compressive strength). To increase the resistance of the basalt fiber, it is proposed to use fly ash in the cement system, which reduces the amount of free lime in the hydrating cement environment.
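
A minimal sketch of the kind of second-order experimental-statistical model such planning produces: a full quadratic polynomial in two coded factors fitted by least squares. The factor names and response values below are illustrative, not data from the study.

```python
# Sketch: fit y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
# by least squares. All data are invented for illustration.
import numpy as np

x1 = np.array([-1, -1, 1, 1, 0, 0, -1, 1, 0], dtype=float)  # e.g., steel fiber (coded)
x2 = np.array([-1, 1, -1, 1, -1, 1, 0, 0, 0], dtype=float)  # e.g., fly ash (coded)
y = np.array([31.2, 33.5, 35.1, 38.4, 33.0, 36.2, 32.1, 36.8, 34.9])  # strength, MPa

# design matrix of the full second-order polynomial
A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
b, *_ = np.linalg.lstsq(A, y, rcond=None)
print("b0 b1 b2 b12 b11 b22 =", np.round(b, 3))
```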

4.
Polymers (Basel) ; 14(24)2022 Dec 13.
Article in English | MEDLINE | ID: mdl-36559834

ABSTRACT

This article demonstrates the effectiveness of polymer additives in the production of fine-grained, coal fly ash-based concrete mixtures and concretes that can serve as working mixtures for a 3D printer. Using mathematical experiment planning, a set of experimental-statistical models was obtained that describes the influence of mixture-composition factors, including the copolymer additive, on the most important properties of ash-containing concrete mixtures and concretes for 3D concrete printing in the presence of a hardening-accelerator additive. It is shown that when the dry mixture is mixed with water, the redispersible polymer powders are converted into an adhesive polymer dispersion which, as the mortar cures, forms "rubber bridges" in its pores and at the interface with the substrate. These bridges have high tensile strength and elastically reinforce the cement stone; in addition, they not only significantly increase the adhesion between the layers of the extruded mixture but also significantly mitigate such shortcomings of the cement stone as high brittleness, low ultimate elongation, and a tendency to cracking.

5.
Nanomaterials (Basel) ; 12(14)2022 Jul 14.
Article in English | MEDLINE | ID: mdl-35889643

ABSTRACT

The prospects of using nanofilled biopolymer films for wound healing were substantiated. The main components of the biopolymer composites are gelatin, polyvinyl alcohol, glycerin, lactic acid, distilled water, and zinc oxide (ZnO) nanoparticles (NPs). The biopolymer composites were produced under various technological parameters using a chrome-coated mould. The therapeutic properties of the biopolymer films were evaluated by measuring the diameter of the protective effect, and their physico-mechanical properties were studied: elasticity, vapour permeability, degradation time, and swelling. Experiment planning was used to study the influence of the technological parameters of the film-formation process on the therapeutic and physico-mechanical properties of the nanofilled films. Based on the experimental results, second-order mathematical models were built, and the optimal values of the technological parameters were determined that give the biopolymer nanofilled films maximum healing ability (diameter of protective action) together with sufficiently high elasticity, vapour permeability, degradation time, and swelling. The results showed that the healing properties of the biopolymer films depend mainly on the ZnO NP content. Degradation of these biopolymer films provides dosed drug delivery to the affected area, and the degradation products are carbon dioxide, water, and a small amount of ZnO in the bound state, which indicates the environmental safety of the developed biopolymer film.
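
The optimization step that second-order models enable can be sketched as follows: scan the coded factor space of a fitted model for the settings that maximize the predicted response. The model form matches the abstract, but the coefficients and factor names below are placeholders, not the paper's fitted models.

```python
# Sketch: grid search for the optimum of a fitted second-order model.
# Coefficients are hypothetical placeholders.
import numpy as np

def diameter(t, c):
    # hypothetical fitted model; t, c = two coded technological parameters
    return 24.0 + 1.8*t + 3.1*c - 1.2*t*c - 2.0*t**2 - 2.6*c**2

t, c = np.meshgrid(np.linspace(-1, 1, 201), np.linspace(-1, 1, 201))
d = diameter(t, c)
i = np.unravel_index(np.argmax(d), d.shape)
print(f"optimum near t = {t[i]:.2f}, c = {c[i]:.2f}, predicted diameter {d[i]:.1f}")
```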

6.
Materials (Basel) ; 14(10)2021 May 18.
Article in English | MEDLINE | ID: mdl-34070197

ABSTRACT

This paper uses mathematical methods as the basic tool at the experiment-planning stage. The value of programming research was shown using the theory of experiments and the STATISTICA software. Planning experiments to study the properties of a mixture as a function of its composition is a task of considerable complexity. The aim of the statistical analysis was to determine the influence of the variable chemical composition of waste materials on selected properties of glass-ceramic materials. A statistical approach to multicomponent systems such as ceramic batches enables the selection of appropriate amounts of raw materials through the application of a 'plan for mixtures'. To utilize raw waste materials, e.g., slags from a solid-waste incinerator and fly or bottom ashes, in the modeling of new materials, a mathematical relationship was developed that makes it possible to estimate, from the waste's chemical composition, selected technological and practical properties of the glass, so as to obtain a material with the required technological-practical parameters. For the obtained glasses, a comparative analysis of the experimentally and computationally determined properties was carried out: transformation temperature, liquidus temperature, density, and thermal expansion coefficient. The high theoretical approximation obtained (coefficient of determination R2 > 0.8) confirms the suitability of the polynomial mixture model for the design of new glass-ceramic products.
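
A 'plan for mixtures' can be illustrated with a simplex-lattice design: the sketch below generates the {3, 2} lattice, i.e., every combination of three component proportions in steps of 1/2 that sums to one. The component names are assumptions for illustration, not the study's raw materials.

```python
# Sketch: generate a {3, 2} simplex-lattice mixture design.
from itertools import product

m = 2                                        # lattice degree
levels = [i / m for i in range(m + 1)]       # proportions 0, 1/2, 1
points = [p for p in product(levels, repeat=3) if abs(sum(p) - 1.0) < 1e-9]
for slag, fly_ash, bottom_ash in points:
    print(f"slag={slag:.2f}  fly_ash={fly_ash:.2f}  bottom_ash={bottom_ash:.2f}")
```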

7.
J Proteome Res ; 20(4): 1936-1942, 2021 04 02.
Article in English | MEDLINE | ID: mdl-33661641

ABSTRACT

Bottom-up proteomics is currently the dominant strategy for proteome analysis. It relies critically upon the use of a protease to digest proteins into peptides, which are then identified by liquid chromatography-mass spectrometry (LC-MS). The choice of protease(s) has a substantial impact upon the utility of the bottom-up results obtained. Protease selection determines the nature of the peptides produced, which in turn affects the ability to infer the presence and quantities of the parent proteins and post-translational modifications in the sample. We present here the software tool ProteaseGuru, which provides in silico digestions by candidate proteases, allowing evaluation of their utility for bottom-up proteomic experiments. This information is useful for both studies focused on a single or small number of proteins, and for analysis of entire complex proteomes. ProteaseGuru provides a convenient user interface, valuable peptide information, and data visualizations enabling the comparison of digestion results of different proteases. The information provided includes data tables of theoretical peptide sequences and their biophysical properties, results summaries outlining the numbers of shared and unique peptides per protease, histograms facilitating the comparison of proteome-wide proteolytic data, protein-specific summaries, and sequence coverage maps. Examples are provided of its use to inform analysis of variant-containing proteins in the human proteome, as well as for studies requiring the use of multiple proteomic databases such as a human:mouse xenograft model, and microbiome metaproteomics.
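
The core idea of in silico digestion can be sketched with the classic simplified trypsin rule (cleave after K or R, but not before P). This is an illustrative toy with a made-up sequence, not ProteaseGuru's implementation, which supports many proteases and reports biophysical properties alongside the peptides.

```python
# Sketch: in silico tryptic digestion with optional missed cleavages.
import re

def tryptic_digest(protein: str, missed_cleavages: int = 0) -> list[str]:
    # cleave after K or R unless followed by P (simplified trypsin rule)
    fragments = [f for f in re.split(r'(?<=[KR])(?!P)', protein) if f]
    peptides = list(fragments)
    for n in range(1, missed_cleavages + 1):   # concatenate neighbors for missed cleavages
        peptides += [''.join(fragments[i:i + n + 1])
                     for i in range(len(fragments) - n)]
    return peptides

print(tryptic_digest("MKWVTFISLLLLFSSAYSRGVFRR", missed_cleavages=1))
```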


Subject(s)
Peptide Hydrolases , Proteomics , Amino Acid Sequence , Animals , Chromatography, Liquid , Mice , Proteome/genetics
8.
J Appl Crystallogr ; 53(Pt 6): 1613-1619, 2020 Dec 01.
Article in English | MEDLINE | ID: mdl-33304227

ABSTRACT

The Extreme Environment Diffractometer was a neutron time-of-flight instrument equipped with a constant-field hybrid magnet providing magnetic fields up to 26 T. The magnet infrastructure and sample environment imposed limitations on the geometry of the experiment, making careful experiment planning necessary. EXEQ is the software tool developed to allow users of the instrument to find the optimal sample orientation for their diffraction experiments; InEXEQ fulfills the same role for inelastic neutron scattering experiments. The source code of the software is licensed under the GNU General Public Licence 3, allowing it to be used by other facilities and adapted for use on other instruments.
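
The kind of feasibility check such a tool automates can be sketched with Bragg's law: given a d-spacing and neutron wavelength, does the reflection fall inside the angular window left open by the magnet? This is a simplified illustration, not EXEQ's code, and the window limits are hypothetical.

```python
# Sketch: is a Bragg reflection inside an accessible scattering-angle window?
import math

def reflection_visible(d_spacing_A: float, wavelength_A: float,
                       window_deg=(5.0, 30.0)) -> bool:
    s = wavelength_A / (2.0 * d_spacing_A)   # Bragg's law: lambda = 2 d sin(theta)
    if s > 1.0:
        return False                         # reflection unreachable at this wavelength
    two_theta = 2.0 * math.degrees(math.asin(s))
    return window_deg[0] <= two_theta <= window_deg[1]

print(reflection_visible(d_spacing_A=3.5, wavelength_A=1.8))
```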

9.
Ciênc. rural (Online) ; 50(1): e20190195, 2020. tab
Article in English | LILACS-Express | LILACS | ID: biblio-1055841

ABSTRACT

The objective of this research was to determine the optimal plot size and the number of replications for evaluating the fresh matter of broadcast-sown ryegrass. Twenty uniformity trials were conducted, each with 16 basic experimental units (BEU) of 0.5 m2. At 117, 118, and 119 days after sowing, the fresh matter of ryegrass was determined in the BEUs of 5, 10, and 5 uniformity trials, respectively. The optimal plot size was determined by the maximum-curvature method applied to the coefficient-of-variation model. The number of replications was then determined for scenarios formed by combinations of i treatments (i = 3, 4, ..., 50) and d minimum differences between treatment means to be detected as significant at 5% probability by the Tukey test, expressed as a percentage of the experimental mean (d = 10, 11, ..., 20%). The optimal plot size to determine the fresh matter of broadcast-sown ryegrass is 2.19 m2, with a coefficient of variation of 9.79%. To detect differences between treatment means of 20% as significant at 5% probability by the Tukey test, five, six, seven, and eight replications are required in ryegrass experiments with up to 5, 10, 20, and 50 treatments, respectively.
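
The maximum-curvature method on the coefficient-of-variation model can be sketched numerically: fit CV(x) = a * x^(-b) to (plot size, CV) pairs, then locate the plot size where the fitted curve's curvature is greatest. The data below are made up for illustration, and the curvature is found numerically rather than by the closed-form expression used in the literature.

```python
# Sketch: fit CV(x) = a * x**(-b) and find the point of maximum curvature.
import numpy as np

x = np.array([1, 2, 3, 4, 6, 8, 12, 16], dtype=float)        # plot size (basic units)
cv = np.array([18.0, 14.1, 12.3, 11.2, 9.9, 9.1, 8.2, 7.7])  # observed CV (%)

slope, log_a = np.polyfit(np.log(x), np.log(cv), 1)  # log-log linear fit
a, b = np.exp(log_a), -slope                          # CV(x) = a * x**(-b)

xs = np.linspace(1, 16, 2000)
f = a * xs ** (-b)
f1 = np.gradient(f, xs)                               # first derivative
f2 = np.gradient(f1, xs)                              # second derivative
kappa = np.abs(f2) / (1 + f1 ** 2) ** 1.5             # curvature
print(f"a={a:.2f}, b={b:.2f}, optimal plot size ~ {xs[np.argmax(kappa)]:.2f} basic units")
```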

10.
Ciênc. rural (Online) ; 50(11): e20200222, 2020. tab, graf
Article in English | LILACS-Express | LILACS | ID: biblio-1133218

ABSTRACT

The hybridization of wheat and rye produced triticale, a crop noted for its hardiness, its versatility in animal and human food, and its potential use as a cover crop. The objective of this research was to determine the optimal plot size and the number of replications for evaluating the fresh weight of triticale at two evaluation times. An experiment was carried out with the triticale cultivar IPR111. The experimental area was divided into 48 uniformity trials, each containing 36 basic experimental units of 0.51 m2. Fresh weight was evaluated in 24 uniformity trials at 99 days after sowing (DAS) and in the other 24 trials at 127 DAS. The optimal plot size was determined by the maximum-curvature method applied to the coefficient-of-variation model, and the number of replications was determined for scenarios combining the number of treatments and the differences between means to be detected as significant by the Tukey test. To determine the fresh weight of triticale, the optimal plot size is 3.12 m2, with a coefficient of variation of 13.69%. Six replications are sufficient to detect as significant differences between treatment means of 25% in experiments with up to seven treatments, and of 30% in experiments with up to 28 treatments, regardless of the experimental design.

11.
Methods Mol Biol ; 1766: 175-194, 2018.
Article in English | MEDLINE | ID: mdl-29605853

ABSTRACT

DNA methylation is a crucial regulatory mechanism of gene expression and is affected in many human pathologies. It is therefore not surprising that, in today's era of high-throughput methods, many data sets representing DNA methylation under various conditions are available, and the amount of such data keeps growing. In this chapter, we discuss the aspects of experiment planning and data analysis that we consider most important for the reliability and reproducibility of DNA methylation studies: the use of replicates, data quality control at various stages, the selection of a statistical model, and the incorporation of DNA methylation into multi-omics analyses.


Subject(s)
DNA Methylation , Data Analysis , Research Design , CpG Islands , Data Accuracy , Gene Expression Profiling , High-Throughput Nucleotide Sequencing , Humans , Models, Statistical , Reproducibility of Results , Sequence Analysis, DNA , Software
12.
J Comput Sci ; 29: 59-69, 2018 Nov.
Article in English | MEDLINE | ID: mdl-30931048

ABSTRACT

Several computational models of Vein Graft Bypass (VGB) adaptation have been developed to improve surgical outcomes, and they all share a common property: their accuracy relies on a good choice of driving coefficients, which are best retrieved from experimental data. Since experiments are time-consuming and resource-demanding, the gold standard is to know in advance which measurements need to be taken at the bench and from how many samples. Accordingly, our goal is to build a computational framework able to pre-design an effective experimental structure to optimize the setup of the computational models. Our hypothesis is that an Agent-Based Model (ABM) developed by our group is comparable enough to a true set of experiments to generate reliable virtual experimental data. Through a twofold use of our ABM, we created a filter to be applied before the real experiment in order to drive its optimal design. This work is the natural continuation of a previous study from our group [1], where the attention was focused on simple models of single-cell events. With this new version we focus on more complex models, with the purpose of verifying that the complexity of the experimental setup grows in proportion to the accuracy of the model itself.
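
The sample-size question such a framework addresses can be sketched with a toy model: generate noisy "virtual experiments" from a known coefficient and watch the fitting error shrink as specimens are added. The growth model, noise level, and numbers below are illustrative, not the authors' ABM.

```python
# Sketch: how many virtual specimens are needed to recover a driving coefficient?
import numpy as np

rng = np.random.default_rng(1)
k_true = 0.35                                  # hidden driving coefficient

def virtual_experiment(t, n_specimens):
    """Mean response over n noisy virtual specimens (toy linear model)."""
    samples = 1.0 + k_true * t + 0.15 * rng.standard_normal((n_specimens, t.size))
    return samples.mean(axis=0)

t = np.linspace(0, 10, 11)                     # measurement times (days)
for n in (3, 10, 30, 100):
    y = virtual_experiment(t, n)
    k_fit = np.polyfit(t, y, 1)[0]             # slope recovers the coefficient
    print(f"n={n:4d}  fitted k={k_fit:.3f}  error={abs(k_fit - k_true):.3f}")
```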

13.
Comput Sci ICCS ; 10860: 352-362, 2018 Jun.
Article in English | MEDLINE | ID: mdl-31032487

ABSTRACT

Several computational models have been developed to improve the outcome of Vein Graft Bypasses in response to arterial occlusions, and they all share a common property: their accuracy relies on a good choice of values for the coefficients of the biological functions that drive them. Our goal is to optimize the retrieval of these unknown coefficients from experimental data. Because biological experiments are statistically noisy and the models are typically stochastic and complex, this work aims first to elucidate which experimental measurements are sufficient to retrieve the targeted coefficients, and second to establish how many specimens constitute a dataset large enough to guarantee a sufficient level of accuracy. Since experiments are often costly and time-consuming, the planning stage is critical to success; on this basis, the present work shows how an ad hoc use of a computational model of vascular adaptation makes it possible to estimate in advance the type and quantity of resources needed to efficiently reproduce the experimental reality.

14.
Front Neuroinform ; 11: 12, 2017.
Article in English | MEDLINE | ID: mdl-28243197

ABSTRACT

Computers help neuroscientists to analyze experimental results by automating the application of statistics; however, computer-aided experiment planning is far less common, due to a lack of similar quantitative formalisms for systematically assessing evidence and uncertainty. While ontologies and other Semantic Web resources help neuroscientists to assimilate required domain knowledge, experiment planning requires not only ontological but also epistemological (e.g., methodological) information regarding how knowledge was obtained. Here, we outline how epistemological principles and graphical representations of causality can be used to formalize experiment planning toward causal discovery. We outline two complementary approaches to experiment planning: one that quantifies evidence per the principles of convergence and consistency, and another that quantifies uncertainty using logical representations of constraints on causal structure. These approaches operationalize experiment planning as the search for an experiment that either maximizes evidence or minimizes uncertainty. Despite work in laboratory automation, humans must still plan experiments and will likely continue to do so for some time. There is thus a great need for experiment-planning frameworks that are not only amenable to machine computation but also useful as aids in human reasoning.
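
The "minimize uncertainty" strategy can be sketched as a search over candidate causal structures: score each possible intervention by how many hypotheses would survive in the worst case, and pick the intervention with the smallest worst case. The variables and hypothesis set below are illustrative toys, not the authors' formalism.

```python
# Sketch: choose the intervention that best discriminates candidate causal graphs.
nodes = ("A", "B", "C")
hypotheses = [
    {("A", "B"), ("B", "C")},   # chain A -> B -> C
    {("A", "B"), ("A", "C")},   # common cause A
    {("B", "A"), ("B", "C")},   # common cause B
    {("A", "C"), ("B", "C")},   # common effect C
]

def descendants(edges, x):
    """All nodes reachable from x along directed edges."""
    seen, stack = set(), [x]
    while stack:
        cur = stack.pop()
        for a, b in edges:
            if a == cur and b not in seen:
                seen.add(b)
                stack.append(b)
    return frozenset(seen)

def worst_case_remaining(intervene_on):
    # intervening reveals which variables respond (the descendants);
    # hypotheses predicting the same responders stay indistinguishable
    outcomes = {}
    for h in hypotheses:
        outcomes.setdefault(descendants(h, intervene_on), []).append(h)
    return max(len(v) for v in outcomes.values())

scores = {n: worst_case_remaining(n) for n in nodes}
print(scores, "-> intervene on", min(scores, key=scores.get))
```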

15.
Biochem Pharmacol ; 97(3): 225-35, 2015 Oct 01.
Article in English | MEDLINE | ID: mdl-26208784

ABSTRACT

Recent reports have highlighted studies in biomedical research that cannot be reproduced, tending to undermine the credibility, relevance and sustainability of the research process. To address this issue, a number of factors can be monitored to improve the overall probability of reproducibility. These include: (i) shortcomings in experimental design and execution that involve hypothesis conceptualization, statistical analysis, and data reporting; (ii) investigator bias and error; (iii) validation of reagents, including cells and antibodies; and (iv) fraud. Historically, research data that have undergone peer review and been published are subject to independent replication via the process of self-correction. This often leads to refutation of the original findings and retraction of the paper, by which time considerable resources have been wasted on follow-on studies. New NIH guidelines focused on experimental conduct and manuscript submission are being widely adopted in the peer-reviewed literature. These, in their various iterations, are intended to improve the transparency and accuracy of data reporting via checklists that are often accompanied by "best practice" guidelines that aid in validating the methodologies and reagents used in data generation. The present Editorial provides background and context for a newly developed checklist for submissions to Biochemical Pharmacology that is intended to be clear, logical, useful and unambiguous in assisting authors in preparing manuscripts and in facilitating the peer review process. The checklist is currently optional, but after further development based on user feedback it will become mandatory within the next 12 months.


Subject(s)
Biomedical Research , Guidelines as Topic , Manuscripts, Medical as Topic , Peer Review, Research , Periodicals as Topic/standards , Pharmacology , Checklist , Guideline Adherence
16.
Ciênc. rural ; 41(11): 1890-1898, nov. 2011. ilus, tab
Article in Portuguese | LILACS | ID: lil-608058

ABSTRACT

The objective of this research was to compare two methods of estimating the optimum plot size (Xo) for assessing 12 characters of single-, triple- and double-cross corn hybrids, and to examine the variability of Xo among characters and among hybrids. Morphological characters (plant and ear height) and yield characters (ear weight, number of rows per ear, ear length and diameter, cob weight and diameter, hundred-grain weight, number of grains per ear, grain length, and grain yield) were measured in basic experimental units of one linear meter, in 48 uniformity trials. Xo was determined for each character by the modified maximum-curvature method and by the maximum-curvature method applied to the coefficient-of-variation model. The optimum plot sizes obtained by the two methods are concordant, but the modified maximum-curvature method gives larger values than the maximum curvature of the coefficient-of-variation model. The optimum plot size for yield characters is larger than for morphological characters. Among the corn hybrids, the optimum plot size is 5.04 basic experimental units for the single-cross hybrid (4.03 m²), 5.24 for the triple-cross hybrid (4.19 m²), and 5.53 for the double-cross hybrid (4.42 m²).

17.
Ciênc. rural ; 38(2): 315-320, mar.-abr. 2008. tab
Article in Portuguese | LILACS | ID: lil-474490

ABSTRACT

Experimental planning is of fundamental importance when precision and quality of results are desired. In this context, the correct choice of plot size and shape as a function of the crop and the environment deserves special attention. The objective of this work was to estimate the optimal plot size and the experimental precision in evaluating grain yield of sorghum at two sowing dates. Data were taken from an experiment with the grain sorghum cultivar 'BR 304', conducted at two sowing dates during the 2005/2006 growing season at the Universidade Federal de Santa Maria. Three areas of sorghum were sown on two dates (7 December 2005 and 5 January 2006) in a completely randomized design with three replications. Each area consisted of four rows spaced 0.5 m apart, with 36 plants per row. These 144 plants were taken as basic units for simulating different plot sizes. It is concluded that, for experiments evaluating grain yield of grain sorghum, the optimal plot size is eight plants, or approximately one linear meter of plants, and that greater precision can be obtained by increasing the number of replications, regardless of the number of treatments and the sowing date.
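
The simulation underlying such uniformity trials can be sketched as follows: group individual-plant yields (basic units) into ever-larger plots and observe how the coefficient of variation falls. The yield data below are random placeholders, not the study's measurements.

```python
# Sketch: plot-size simulation from 144 basic units (one unit = one plant).
import numpy as np

rng = np.random.default_rng(7)
yields = rng.gamma(shape=8.0, scale=5.0, size=144)   # placeholder per-plant yields

for plot_size in (1, 2, 4, 8, 16):
    plots = yields[: 144 - 144 % plot_size].reshape(-1, plot_size).sum(axis=1)
    cv = 100.0 * plots.std(ddof=1) / plots.mean()
    print(f"plot size {plot_size:2d} plants: CV = {cv:5.1f} %")
```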
