1.
Sci Justice ; 61(1): 47-60, 2021 Jan.
Article in English | MEDLINE | ID: mdl-33357827

ABSTRACT

Nowadays, forensic age estimation plays an important role in forensic and medico-legal institutes worldwide, which are solicited by judicial or administrative authorities to provide expert reports on the age of individuals. The authorities' ultimate issue of interest is often the probability that the person is younger or older than a given age threshold, usually the age of majority. Such information is fundamental for deciding whether the person being judged falls under the legal category of an adult, a decision that may have important consequences for the individual, depending on the legal framework in which it is made. The aim of this paper is to introduce a normative approach for assisting the authority in the decision-making process, given knowledge from the available findings reported by means of probabilities. The normative approach proposed here has been acknowledged in the forensic framework and represents a promising structure for reasoning that can support the decision-making process in forensic age estimation. The paper introduces the fundamental elements of decision theory applied to the specific case of age estimation, and provides some examples to illustrate its practical application.
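
As a minimal illustration of the decision-theoretic idea (not the paper's own implementation), the sketch below combines a reported probability of adulthood with a hypothetical loss function and selects the decision that minimizes expected loss. All numbers are invented for illustration.

    # Hypothetical losses: wrongly treating a minor as an adult is judged
    # far worse (loss 10) than wrongly treating an adult as a minor (loss 1).
    LOSS = {
        "declare adult": {"adult": 0.0, "minor": 10.0},
        "declare minor": {"adult": 1.0, "minor": 0.0},
    }

    def expected_losses(p_adult, loss_table):
        """Expected loss of each decision given P(person is an adult)."""
        p_minor = 1.0 - p_adult
        return {
            decision: p_adult * losses["adult"] + p_minor * losses["minor"]
            for decision, losses in loss_table.items()
        }

    p_adult = 0.95  # probability reported by the expert, given the findings
    losses = expected_losses(p_adult, LOSS)
    print(losses)                       # {'declare adult': 0.5, 'declare minor': 0.95}
    print(min(losses, key=losses.get))  # 'declare adult' minimizes expected loss here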


Subject(s)
Decision Support Techniques , Forensic Medicine , Humans
2.
Forensic Sci Int ; 309: 110213, 2020 Apr.
Article in English | MEDLINE | ID: mdl-32142993

ABSTRACT

Forensic science has been evolving towards a separation into more and more specialised tasks, with forensic practitioners increasingly identifying themselves with only one sub-discipline or task of forensic science. Such divisions are viewed as a threat to the advancement of science because they tend to polarise researchers and tear apart scientific communities. The objective of this article is to highlight that a piece of information is not exclusively either intelligence or evidence, and that a forensic scientist is not exclusively either an investigator or an evaluator: these notions must all be applied in conjunction to successfully understand a criminal problem or solve a case. To capture the scope, strength and contribution of forensic science, this paper proposes a progressive but non-linear continuous model that could serve as a guide for forensic reasoning and processes. In this approach, hypothetico-deductive reasoning, iterative thinking and the notion of entropy are used to frame the continuum and to situate forensic scientists' operating contexts and decision points. Situations and examples drawn from experience and practice are used to illustrate the approach. The authors argue that forensic science, as a discipline, should not be defined according to the context it serves (i.e. an investigation, a court decision or an intelligence process), but as a general, scientific and holistic trace-focused practice that contributes to a broad range of goals in various contexts. Since forensic science does not work in isolation, the approach also provides a useful basis for how forensic scientists should contribute to collective and collaborative problem-solving to improve justice and security.


Subject(s)
Decision Support Techniques , Forensic Sciences , Intelligence , Humans
3.
Forensic Sci Int ; 310: 110251, 2020 May.
Article in English | MEDLINE | ID: mdl-32203853

ABSTRACT

Stiffelman [1] gives a broad critique of the application of likelihood ratios (LRs) in forensic science, in particular their use in probabilistic genotyping (PG) software. These criticisms are discussed in this review. LRs do not infringe on the ultimate issue: the Bayesian paradigm clearly separates the role of the scientist from that of the decision makers and distances the scientist from comment on the ultimate and subsidiary issues. LRs do not affect the reasonable-doubt standard: fact finders must still make decisions considering all the evidence, not just the evidence presented probabilistically. LRs do not infringe on the presumption of innocence: the presumption of innocence does not equate to a prior probability of zero, but simply means that the person of interest (POI) is no more likely than anyone else to be the donor. Propositions need to be exhaustive within the context of the case; that is, propositions deemed relevant by either the defense or the prosecution, and which are not fanciful, must not be omitted from consideration.
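
To make the presumption-of-innocence point concrete, a toy calculation (all values hypothetical) can treat the POI as no more likely a donor than any of N alternative donors, giving non-zero prior odds of 1/N that Bayes' theorem then updates with the LR:

    N = 1_000_000        # assumed pool of alternative donors
    prior_odds = 1 / N   # POI no more likely than anyone else: not a zero prior
    LR = 1e9             # hypothetical likelihood ratio for the DNA findings

    posterior_odds = prior_odds * LR  # odds form of Bayes' theorem
    posterior_prob = posterior_odds / (1 + posterior_odds)
    print(posterior_prob)  # about 0.999 with these invented numbers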


Subject(s)
DNA Fingerprinting , DNA/chemistry , Forensic Medicine , Decision Making , Humans , Likelihood Functions
4.
Sci Justice ; 59(3): 362-365, 2019 May.
Article in English | MEDLINE | ID: mdl-31054826

ABSTRACT

This letter to the Editor comments on the paper 'Strategic choice in linear sequential unmasking' by Roger Koppl (Science & Justice, https://doi.org/10.1016/j.scijus.2018.10.010).


Subject(s)
Social Justice
5.
J Forensic Sci ; 64(2): 393-405, 2019 Mar.
Article in English | MEDLINE | ID: mdl-30132900

ABSTRACT

Forensic DNA interpretation is transitioning from manual interpretation, usually based on binary decision-making, toward computer-based systems that model the probability of the profile given different explanations for it, termed probabilistic genotyping (PG). A laboratory's decision to implement probability-based interpretation should be based on scientific principles of validity and on information that supports its utility, such as criteria supporting admissibility. The principles behind STRmix™ are outlined in this study and include standard mathematics and the modeling of peak heights and of the variability in those heights. All PG methods generate a likelihood ratio (LR) and require the formulation of propositions. Principles underpinning the formulation of propositions include the identification of reasonably assumed contributors. Substantial data have been produced that support the precision, error rate, and reliability of PG in general and of STRmix™ in particular. A current issue is access to the code and to the quality processes used while coding. There are substantial data describing the performance, strengths, and limitations of STRmix™, one of the available PG software packages.
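
The sketch below shows the general shape of the calculation that PG software performs, in a deliberately simplified form; it is not the STRmix™ model itself, and every probability in it is invented. P(profile | H) is a sum over candidate genotype sets, each weighted by how well it explains the observed peak heights.

    def prob_profile_given_h(peak_likelihood, genotype_weights):
        """P(E | H) = sum over genotype sets of P(E | set) * P(set | H)."""
        return sum(peak_likelihood[g] * genotype_weights[g] for g in peak_likelihood)

    # Hypothetical values for one locus with two candidate genotype sets.
    peak_likelihood = {"set1": 0.60, "set2": 0.05}  # fit to observed peak heights
    weights_hp = {"set1": 1.00, "set2": 0.00}       # Hp: POI plus one unknown
    weights_hd = {"set1": 0.01, "set2": 0.99}       # Hd: two unknown contributors

    lr = (prob_profile_given_h(peak_likelihood, weights_hp)
          / prob_profile_given_h(peak_likelihood, weights_hd))
    print(lr)  # about 10.8 with these made-up numbers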


Subject(s)
DNA Fingerprinting , Genotyping Techniques , Microsatellite Repeats , Software Design , Software , Bias , Forensic Genetics , Genotype , Humans , Likelihood Functions , Reproducibility of Results
6.
Forensic Sci Int ; 288: e15-e19, 2018 Jul.
Article in English | MEDLINE | ID: mdl-29857959

ABSTRACT

Recently, Lund and Iyer (L&I) raised an argument regarding the use of likelihood ratios in court. In our view, their argument is based on a lack of understanding of the paradigm. L&I argue that the decision maker should not accept the expert's likelihood ratio without further consideration. All parties agree on this point; in normal practice, there is often considerable and proper exploration in court of the basis for any probabilistic statement. We conclude that L&I argue against a practice that does not exist and that no one advocates. Further, we conclude that the most informative summary of evidential weight is the likelihood ratio, and that this is the summary that should be presented to a court in every scientific assessment of evidential weight, together with supporting information about how it was constructed and what it was based on.
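
One way a fact finder can give an expert's LR "further consideration" is a simple sensitivity check: vary the LR over a plausible range and watch how the posterior responds. The sketch below uses invented numbers purely for illustration.

    def posterior(prior_prob, lr):
        odds = prior_prob / (1 - prior_prob) * lr  # prior odds times LR
        return odds / (1 + odds)

    prior = 0.01  # fact finder's prior from the non-scientific evidence
    for lr in (1e2, 1e4, 1e6):
        print(lr, round(posterior(prior, lr), 6))
    # 100.0     -> 0.502513
    # 10000.0   -> 0.990196
    # 1000000.0 -> 0.999901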

8.
Forensic Sci Int Genet ; 28: 178-187, 2017 May.
Article in English | MEDLINE | ID: mdl-28273509

ABSTRACT

An update was performed of the classic experiments that led to the view that profile probability assignments are usually within a factor of 10 of each other. The data used in this study consist of 15 Identifiler loci collected from a wide range of forensic populations. Following Budowle et al. [1], the terms cognate and non-cognate are used; the cognate database is the database from which the profiles are simulated. The profile probability assignment was usually larger in the cognate database. In 44%-65% of the cases, the profile probability for 15 loci in the non-cognate database was within a factor of 10 of the profile probability in the cognate database. This proportion rose to between 60% and 80% when the FBI and NIST data were used as the non-cognate databases. A second experiment compared the match probability assignment obtained using a generalised database and recommendation 4.2 from NRC II (the 4.2 assignment) with a proxy for the matching proportion developed using subpopulation allele frequencies and the product rule. The findings support the conclusion that the 4.2 assignment has a large conservative bias. These results are in agreement with previous research.
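
For readers unfamiliar with the subpopulation correction, the sketch below contrasts a plain product-rule heterozygote probability with a theta-corrected (Balding-Nichols type) match probability of the kind NRC II discusses. The allele frequencies and theta are hypothetical, and this is not the exact computation used in the study.

    def product_rule_het(p, q):
        # Heterozygote A/B genotype probability assuming independence
        return 2 * p * q

    def theta_corrected_het(p, q, theta):
        # Balding-Nichols style match probability for a heterozygote A/B
        return (2 * (theta + (1 - theta) * p) * (theta + (1 - theta) * q)
                / ((1 + theta) * (1 + 2 * theta)))

    p, q, theta = 0.10, 0.05, 0.03
    print(product_rule_het(p, q))            # 0.010
    print(theta_corrected_het(p, q, theta))  # about 0.018: larger, i.e. conservative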


Subject(s)
DNA Fingerprinting , Databases, Nucleic Acid , Models, Statistical , Probability , Gene Frequency , Humans , Racial Groups/genetics
9.
Forensic Sci Int ; 272: e7-e9, 2017 Mar.
Article in English | MEDLINE | ID: mdl-27817943

ABSTRACT

This letter comments on the report "Forensic science in criminal courts: Ensuring scientific validity of feature-comparison methods" recently released by the President's Council of Advisors on Science and Technology (PCAST). The report advocates a two-stage procedure for the evaluation of forensic evidence, in which the first stage is a "match"/"non-match" decision and the second stage is an empirical assessment of sensitivity (correct-acceptance) and false-alarm (false-acceptance) rates. Almost always, quantitative data from feature-comparison methods are continuously valued and have within-source variability. We explain why a two-stage procedure is not appropriate for this type of data, and recommend the use of statistical procedures that are appropriate.
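
A minimal sketch of the recommended alternative, under assumed normal score models: instead of thresholding a comparison score into "match"/"non-match", evaluate the score directly as a ratio of within-source and between-source densities. The distributions and parameters below are assumptions for illustration only.

    from math import exp, pi, sqrt

    def normal_pdf(x, mu, sigma):
        return exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * sqrt(2 * pi))

    def score_lr(s):
        # LR = f(score | same source) / f(score | different sources)
        return normal_pdf(s, 0.9, 0.05) / normal_pdf(s, 0.5, 0.15)

    print(score_lr(0.85))  # about 28: supports the same-source proposition
    print(score_lr(0.55))  # about 7e-11: strongly supports different sources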

10.
Data Brief ; 8: 375-86, 2016 Sep.
Article in English | MEDLINE | ID: mdl-27331117

ABSTRACT

Low-template DNA analyses are affected by stochastic effects which can produce a configuration of peaks in the electropherogram (EPG) that is different from the genotype of the DNA's donor. A probabilistic and decision-theoretic model can quantify the expected net gain (ENG) of performing a DNA analysis as the difference between the expected value of information (EVOI) and the cost of performing the analysis. This article presents data on the ENG of performing low-template DNA analyses for a single amplification, for two replicate amplifications, and for a second replicate amplification given the result of a first analysis. The data were obtained using the amplification kits AmpFlSTR Identifiler Plus and Promega's PowerPlex 16 HS, an ABI 3130xl genetic sequencer, and Applied Biosystems' GeneMapper ID-X software. These data are supplementary to an original research article investigating, from a decision-theoretic point of view, whether a forensic DNA analyst should perform a single DNA analysis or two replicate analyses, entitled "Low-template DNA: a single DNA analysis or two replicates?" (Gittelson et al., 2016) [1].
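
The core bookkeeping behind these data is simple to state: the expected net gain of an analysis is the expected value of information minus the cost of the analysis. The sketch below shows that calculation with invented numbers on an arbitrary utility scale; it is not the study's actual utility assignment.

    # EVOI: expected utility of deciding after seeing the analysis result,
    # minus the expected utility of deciding without it (toy numbers).
    eu_with_analysis = 0.92
    eu_without_analysis = 0.75
    evoi = eu_with_analysis - eu_without_analysis

    cost = 0.05  # analysis cost expressed on the same utility scale
    eng = evoi - cost
    print(eng)  # 0.12 > 0: on these numbers the analysis is worth performing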

11.
Forensic Sci Int ; 264: 139-45, 2016 Jul.
Article in English | MEDLINE | ID: mdl-27131143

ABSTRACT

This study investigates the following two questions: (1) Should the DNA analyst concentrate the DNA extract into a single amplification, or split it to perform two replicate analyses? (2) Given the electropherogram obtained from a first analysis, is it worthwhile for the DNA analyst to invest in obtaining a second replicate? A decision-theoretic approach addresses these questions by quantitatively expressing the expected net gain (ENG) of each DNA analysis of interest. The results indicate that two replicates generally have a greater ENG than a single DNA analysis for DNA quantities capable of producing two replicates with an average allelic peak height as low as 43 rfu. This supports the position that two replicates increase the information content relative to a single analysis.
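
The decision rule this implies can be written in a few lines: compute the expected net gain of each strategy and choose the larger. The EVOI and cost inputs below are hypothetical placeholders, not the study's results.

    strategies = {
        "single amplification": {"evoi": 0.17, "cost": 0.05},
        "two replicates":       {"evoi": 0.26, "cost": 0.10},
    }

    eng = {name: s["evoi"] - s["cost"] for name, s in strategies.items()}
    print(eng)                    # {'single amplification': 0.12, 'two replicates': 0.16}
    print(max(eng, key=eng.get))  # 'two replicates' wins with these placeholder numbers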


Subject(s)
DNA Fingerprinting/methods , DNA/analysis , Decision Theory , Genotype , Humans
12.
J Forensic Sci ; 61(1): 186-95, 2016 Jan.
Article in English | MEDLINE | ID: mdl-26248867

ABSTRACT

The interpretation of complex DNA profiles is facilitated by a Bayesian approach. This approach requires the development of a pair of propositions: one aligned to the prosecution case and one to the defense case. This note explores the issue of proposition setting in an adversarial environment through a series of examples. A set of guidelines generalizes how to formulate propositions when there is a single person of interest and when there are multiple individuals of interest. Additional explanations cover how to handle multiple defense propositions, relatives, and the transition from subsource-level to activity-level propositions. The propositions depend on the case information and the allegations of each of the parties. The prosecution proposition is usually known. The authors suggest selecting a sensible defense proposition that is consistent with the defense's stance when it is known, and with a realistic defense otherwise.


Subject(s)
DNA Fingerprinting , Forensic Medicine/legislation & jurisprudence , Likelihood Functions , Humans
13.
J Forensic Sci ; 57(5): 1199-216, 2012 Sep.
Article in English | MEDLINE | ID: mdl-22458915

ABSTRACT

Forensic scientists face increasingly complex inference problems when evaluating likelihood ratios (LRs) for an appropriate pair of propositions. Up to now, scientists and statisticians have derived LR formulae using an algebraic approach. However, this approach reaches its limits when addressing cases with an increasing number of variables and dependence relationships between those variables. In this study, we suggest using a graphical approach based on the construction of Bayesian networks (BNs). We first construct a BN that captures the problem, and then deduce from this model the expression for calculating the LR, in order to compare it with existing LR formulae. We illustrate this idea by applying it to the evaluation of an activity-level LR in the context of the two-trace transfer problem. Our approach allows us to relax assumptions made in previous LR developments, produce a new LR formula for the two-trace transfer problem, and generalize this scenario to n traces.
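
As a toy illustration of the graphical approach (not the paper's two-trace transfer model), the sketch below encodes a miniature Bayesian network with one latent transfer variable and obtains the LR by enumeration, i.e. by summing the latent variable out of P(findings | H) under each proposition. Every probability is invented.

    # P(trace was transferred | H), for the prosecution and defense propositions.
    p_transfer = {"Hp": 0.8, "Hd": 0.8}

    # P(observed findings | transfer status, H): hypothetical values.
    p_findings = {
        ("transferred", "Hp"): 0.9,   ("not_transferred", "Hp"): 0.01,
        ("transferred", "Hd"): 0.001, ("not_transferred", "Hd"): 0.01,
    }

    def p_e_given_h(h):
        pt = p_transfer[h]
        return (pt * p_findings[("transferred", h)]
                + (1 - pt) * p_findings[("not_transferred", h)])

    print(p_e_given_h("Hp") / p_e_given_h("Hd"))  # about 258: the LR implied by the BN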
