2.
Int J Technol Assess Health Care ; 38(1): e29, 2022 Mar 08.
Article in English | MEDLINE | ID: mdl-35256029

ABSTRACT

OBJECTIVE: To undertake a technical review of the search interface of the ISPOR Presentations Database. By technical review, we mean an evaluation of the technical aspects of the search interface and functionality that a user must navigate to complete a search.

METHODS: A validated checklist (Bethel and Rogers, 2014, Health Info Libr J, 31, 43-53) was used to identify where the interface performed well, where it was adequate, where it performed poorly, and where functionality available in core biomedical bibliographic databases does not exist in the ISPOR database, and to compile a list of any issues arising during the review. Two researchers independently undertook the technical review in October 2021.

RESULTS: The ISPOR database scored 35 of a possible 165 (27/111 essential criteria and 8/54 desirable criteria). Two issues were identified, both of which will cause searchers to miss potentially eligible abstracts: (i) search terms that include * or ? as truncation or wildcard symbols should not be capitalized (e.g., cost* not Cost*; organi?ation not Organi?ation) and (ii) quotation marks in phrase searching should be straight sided, not curly (e.g., "cost analyses" not “cost analyses”).

CONCLUSIONS: The ISPOR database is a promising and free database for identifying abstracts/posters presented at ISPOR. We summarize the two key issues arising, and we set out proposed changes to the search interface, including: adding the ability to export abstracts to a bibliographic tool, exporting search strategies, adding a researcher account, and updating the help guide. All suggestions would further improve this helpful database.


Subject(s)
Databases, Factual
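The two interface issues identified above lend themselves to a pre-submission check on the query string. The sketch below is illustrative only (the function name and behaviour are our own, not part of the ISPOR interface): it lower-cases any term carrying a truncation or wildcard symbol and replaces curly quotation marks with straight ones.

```python
def sanitize_query(query: str) -> str:
    """Normalise a search string for interfaces that mishandle
    capitalised truncation symbols and curly quotation marks."""
    # Issue (ii): replace curly (smart) quotes with straight quotes
    # so that phrase searching behaves as expected.
    for curly in ("\u201c", "\u201d"):
        query = query.replace(curly, '"')
    # Issue (i): lower-case any term carrying a truncation (*) or
    # wildcard (?) symbol, e.g. Cost* -> cost*.
    terms = []
    for term in query.split():
        if "*" in term or "?" in term:
            term = term.lower()
        terms.append(term)
    return " ".join(terms)

print(sanitize_query("Cost* \u201ccost analyses\u201d Organi?ation"))
# cost* "cost analyses" organi?ation
```

A wrapper like this could sit in any script that builds ISPOR queries programmatically, so neither issue silently drops eligible abstracts.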
3.
Res Synth Methods ; 12(4): 557-570, 2021 Jul.
Article in English | MEDLINE | ID: mdl-33713573

ABSTRACT

There is limited guidance on how to web-search in systematic reviews, and concerns have been raised about the reproducibility of searches using search engines such as Google. The aim of this paper is to address one potential source of variation in Google searches: does the geographical location of a researcher affect Google search returns? Using a virtual private network, we ran the same web-search for the medical technology Dasatinib in 12 different countries. Two researchers independently extracted the search returns by country, organised by page rank. We compared: (C1) any difference in the items returned by Google searches between countries and (C2) any difference in the page rank of items returned between countries. Searches were undertaken on Monday 28 September 2020. Across the 12 countries, 43 items were identified. For C1: 19 items were common to all 12 countries, and 24 items were missed by searches in some countries; that is, search returns differed between countries. For C2: a randomised trial reported by Raddich et al. was the first search return in all countries; all other items common to all countries varied in their page rank. Based on the findings of this case study, geographic location would appear to influence Google search returns. The findings suggest that recording the location of the researcher undertaking web-searching may now be an important factor to report, alongside the steps taken to minimise personalisation of web-searches covered by recent guidance. This finding also has implications for stopping rules.


Subject(s)
Search Engine , Reproducibility of Results , Systematic Reviews as Topic
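The C1/C2 comparison described above can be sketched in a few lines. The per-country result lists below are hypothetical stand-ins, not the study's data; the logic simply intersects the per-country result sets (C1) and records each common item's 1-based page rank per country (C2).

```python
# Hypothetical data: country -> ordered list of search-return identifiers.
returns_by_country = {
    "UK": ["trial-report", "ema-page", "news-item"],
    "DE": ["trial-report", "news-item", "ema-page"],
    "JP": ["trial-report", "ema-page", "blog-post"],
}

# C1: items returned in every country vs. items missed somewhere.
all_items = set().union(*returns_by_country.values())
common = set.intersection(*(set(v) for v in returns_by_country.values()))
missed_somewhere = all_items - common

# C2: page rank (1-based position) of each common item, per country.
ranks = {
    item: {c: lst.index(item) + 1 for c, lst in returns_by_country.items()}
    for item in common
}

print(sorted(common))          # items common to all countries
print(sorted(missed_somewhere))  # items missed by at least one country
```

With real extracted returns in place of the stand-in lists, `ranks` makes rank variation between countries directly visible, as in the C2 comparison.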
4.
Res Synth Methods ; 12(3): 384-393, 2021 May.
Article in English | MEDLINE | ID: mdl-33555126

ABSTRACT

Clinical trials registers form an important part of the search for studies in systematic reviews of intervention effectiveness, but their search interfaces and functionality can be challenging to search systematically and resource intensive to search well. We report a technical review of the search interfaces of three leading trials register resources: ClinicalTrials.gov, the EU Clinical Trials Register, and the WHO International Clinical Trials Registry Platform (ICTRP). The technical review used a validated checklist to identify areas where the search interfaces of these resources performed well, where performance was adequate, where performance was poor, and to identify differences between the interfaces. The review found low overall scores for each interface (ClinicalTrials.gov 55/165, the EU Clinical Trials Register 25/165, the WHO ICTRP 32/165). This finding suggests a need for joined-up dialogue between the producers of the registers and the researchers who search them via these interfaces. We also set out four proposed changes that might improve the search interfaces. Trials registers are an invaluable resource in systematic reviews of intervention effectiveness. With the continued growth in systematic reviews, and initiatives such as 'AllTrials', there is an anticipated ongoing need for these resources. We conclude that small changes to the search interfaces, and improved dialogue with providers, might improve the future search functionality of these valuable resources.


Subject(s)
Clinical Trials as Topic , Registries
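As a minimal illustration of the scores reported above, the fragment below tabulates each interface's score against the checklist maximum of 165; the percentages are simple arithmetic on the reported figures, not additional data from the review.

```python
# Scores out of a possible 165 checklist points, as reported above.
scores = {
    "ClinicalTrials.gov": 55,
    "EU Clinical Trials Register": 25,
    "WHO ICTRP": 32,
}

# Print interfaces from highest- to lowest-scoring, with percentages.
for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score}/165 ({score / 165:.0%})")
# ClinicalTrials.gov: 55/165 (33%)
# WHO ICTRP: 32/165 (19%)
# EU Clinical Trials Register: 25/165 (15%)
```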