University of Derby

Literature Reviews: systematic searching at various levels

  • for assignments
  • for dissertations / theses
  • Search strategy and searching

Search strategy template

  • Screening & critiquing
  • Citation Searching
  • Google Scholar (with Lean Library)
  • Resources for literature reviews
  • Adding a referencing style to EndNote
  • Exporting from different databases
  • PRISMA Flow Diagram
  • Grey Literature

You can map out your search strategy in whatever way works for you.

Some people like lists and so plan their search strategy out in a grid-box or table format. Some people are more visual and like to draw their strategy out using a mind-map approach (either on paper or using mind-mapping software). Some people use sticky notes or Trello or a spreadsheet.

As long as the method you choose enables you to search systematically and thoroughly, there is no need to change the way you work.

If your search strategies are underdeveloped, or the method you use doesn't lead to a good search, consider trying one of the other methods to see if changing your approach helps.

  • Search Strategy Document
  • Last Updated: Nov 24, 2023 4:33 PM
  • URL: https://libguides.derby.ac.uk/literature-reviews

How to undertake a literature search: a step-by-step guide

Affiliation.

  • 1 Literature Search Specialist, Library and Archive Service, Royal College of Nursing, London.
  • PMID: 32279549
  • DOI: 10.12968/bjon.2020.29.7.431

Undertaking a literature search can be a daunting prospect. Breaking the exercise down into smaller steps will make the process more manageable. This article suggests 10 steps that will help readers complete this task, from identifying key concepts to choosing databases for the search and saving the results and search strategy. It discusses each of the steps in a little more detail, with examples and suggestions on where to get help. This structured approach will help readers obtain a more focused set of results and, ultimately, save time and effort.

Keywords: Databases; Literature review; Literature search; Reference management software; Research questions; Search strategy.

  • Databases, Bibliographic*
  • Information Storage and Retrieval / methods*
  • Nursing Research
  • Review Literature as Topic*

Searching for Systematic Reviews & Evidence Synthesis: Drawing up your search strategy

  • Define your search question
  • Searching Databases
  • Drawing up your search strategy
  • Advanced search techniques
  • Using Filters
  • Grey Literature
  • Recording your search strategy and results
  • Managing References & Software Tools
  • Further information
  • Library Workshops, Drop ins and 1-2-1s
  • AI tools in evidence synthesis

Subject Headings

  • Fixed list of terms arranged hierarchically with broader and narrower terms.
  • Indexers classify the article by tagging it with subject headings that relate to the content.
  • Some tags represent the main focus of the article and some refer to secondary aspects of the work.
  • Can allow you to search more effectively and avoid missing relevant articles.
  • Can retrieve relevant articles where the term does not occur in title or abstract.
  • Not every database offers subject headings; in those databases you need to rely on keyword searching alone, ensuring that you use as many synonyms and alternate terms as possible.
  • Subject headings also differ between databases, so you cannot reuse the same subject headings from one database in another: for each concept you will need to search each database's thesaurus to locate the relevant subject heading (if it exists) and add that to your search.
  • On other database platforms you may need to select the subject heading you wish to use. You may also be presented with a list of subject headings to select on databases on the OvidSP platform if there is no exact match. Remember that the subject heading you select should be the one for the concept you are searching for - sometimes those subject headings suggested may be relevant to your search question but not the concept you are searching for at that point.
  • If you are uncertain what explode and focus mean look at the explanation on this page.
  • Use the literature search template (available from the Define your search question tab ) to remind you about breaking your concept down and combining it accurately with AND and OR.

Types of research design

Cochrane systematic reviews, and systematic reviews of interventions more generally, often include a filter to restrict the results to studies reporting Randomised Controlled Trials (RCTs). Consider whether you wish to include a filter for a particular type of study design as part of your search strategy. Filters exist for RCTs but also for other study types, including observational studies and patient issues. For information on where to source standardised, pre-tested search filters optimised for different databases, which you can copy and paste into your own search strategy, see the Using Filters tab of this guide.

Free-text keyword searching

  • article title

Combining your search - OR and AND

AND and OR are Boolean operators.

OR is used to combine synonyms, abbreviations and all related terms on a similar concept. You can OR together subject headings for a particular concept with relevant keyword searches. 'OR is more'. Your result set will get larger as you OR together more terms.

AND is used (normally at the end of the search) to combine different concepts and to retrieve results where all the concepts are present. AND narrows down your results and makes your search more specific. Depending on the database, AND may be applied automatically between two or more search terms.
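As a sketch of this behaviour, each search line can be modelled as the set of record IDs it retrieves: OR is set union and AND is set intersection. The IDs below are invented for illustration and do not reflect any real database.

```python
# Illustrative model of Boolean search combination (hypothetical article IDs).
# Each "search line" is the set of article IDs a term retrieves.

heading_results = {101, 102, 103}   # subject-heading search for one concept
keyword_results = {103, 104}        # free-text keyword search, same concept

# OR unions synonyms and headings for ONE concept: "OR is more" - the set grows.
concept_a = heading_results | keyword_results

concept_b = {103, 105, 106}         # a second, different concept

# AND (usually the final step) intersects the concepts: the set shrinks,
# keeping only articles where every concept is present.
combined = concept_a & concept_b

print(sorted(concept_a))  # [101, 102, 103, 104]
print(sorted(combined))   # [103]
```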

Explode and Focus - Subject Heading searching

Explode: 

  • Indexers select the most specific subject heading possible when tagging an article. By choosing a high-level subject heading (e.g. antibiotic agent is used in Embase for the concept antibiotic) and exploding it, you automatically include the narrower, more specific terms, e.g. named antibiotics.
  • To view the more specific subject headings that would be included, click on the subject heading itself to view it in the thesaurus. The structure of the thesaurus may be displayed in different ways depending on the database: in Embase on the Ovid platform narrower terms are listed under Narrower Terms; in Medline they are indented beneath the broader heading.
  • In general it is good practice to explode the subject headings in your search. If too much irrelevant material is being retrieved, explore the thesaurus to see whether you should instead select a high-level subject heading (not exploded) together with only some of the more specific subject headings that fall beneath it.

Focus (in the Ovid databases):

  • If you select Focus you will restrict your results to only those articles where the indexer judged the subject heading you selected to be key to what the article is about.
  • In other databases Focus may be known as something else, e.g. Major Concept in CINAHL on the Ebsco platform.
  • On the Ovid platform if you view the Complete Reference for an article you will see all the subject headings which have been assigned to an article, and the ones which are key (which would be retrieved by a focused search) are marked by an asterisk before the subject heading. In CINAHL if you view the Detailed Record you will see them listed under Major Subjects.
  • Use Focus with care in a systematic review as it will dramatically reduce the number of results retrieved. Initially it would be a good idea to see what the results are without focusing after you have combined your terms. You may find it useful to use Focus for very peripheral aspects of your topic which you wish to include without being inundated with results.

It is possible to both Explode and Focus a subject heading search.

  • By applying both Explode and Focus you will retrieve articles tagged with your subject heading and with the narrower terms that fall beneath it in the thesaurus (Explode), but limit the results to those where either the top-level subject heading or any of its narrower terms has been identified by the indexers as the major concept the article is about (Focus).
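The interaction of Explode and Focus can be sketched with a toy thesaurus and a handful of invented article records. None of this reflects a real database API; `is_major` stands in for the asterisked major headings shown on the Ovid platform.

```python
# Toy model of Explode and Focus over a subject-heading thesaurus.
# The hierarchy and article tags below are invented for illustration.

thesaurus = {
    "drug hypersensitivity": ["penicillin allergy", "aspirin allergy"],
    "penicillin allergy": [],
    "aspirin allergy": [],
}

# articles: id -> list of (subject_heading, is_major) tags;
# is_major=True corresponds to the asterisked (*) headings in Ovid.
articles = {
    1: [("penicillin allergy", True)],
    2: [("drug hypersensitivity", False)],
    3: [("aspirin allergy", False)],
}

def explode(heading):
    """Return the heading plus all narrower terms beneath it."""
    terms = {heading}
    for narrower in thesaurus.get(heading, []):
        terms |= explode(narrower)
    return terms

def search(heading, exploded=False, focused=False):
    """Retrieve article IDs tagged with the heading (optionally exploded/focused)."""
    terms = explode(heading) if exploded else {heading}
    hits = set()
    for art_id, tags in articles.items():
        for tag, is_major in tags:
            if tag in terms and (is_major or not focused):
                hits.add(art_id)
    return hits

# exp drug hypersensitivity/ retrieves everything tagged with it or a narrower term;
# exp *drug hypersensitivity/ keeps only articles where such a tag is major.
print(sorted(search("drug hypersensitivity", exploded=True)))                # [1, 2, 3]
print(sorted(search("drug hypersensitivity", exploded=True, focused=True)))  # [1]
```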

A search on Embase on the Ovid platform showing the initial Subject Heading selection screen. Explode and Focus are available to select:

Image of Ovid Subject Headings

By clicking on a specific Subject Heading in the list, e.g. antibiotic agent, you enter the thesaurus and can see what the term is used for, the broader terms and narrower terms. By selecting Explode for antibiotic agent in Embase you will automatically retrieve articles which have also been tagged with all of the narrower more specific named antibiotics.

Image of Ovid Subject Headings Thesaurus

Below you can see the Full Record for a particular article (in Embase on the Ovid platform) with the subject headings used to tag it. You can see that 'penicillin allergy' is one of the subject headings assigned to this article, and the asterisk * before the subject heading indicates that it was deemed key to what the article was about when it was indexed. A normal subject heading search for 'penicillin allergy', or for its exploded broader term, would retrieve this article, but so would a focused search: *penicillin allergy/. An exploded and focused search for the broader term, e.g. drug hypersensitivity, would also retrieve this article: exp *drug hypersensitivity/.

Image of article with subject headings

Using search strategies from published systematic reviews

It is always worth checking to see whether any systematic reviews which have a concept in common with your search question have published their search strategy. If they have then this will act as a useful starting point for you to use for your search. Remember it is not necessary for all the concepts of the systematic review to be the same as you should be able to isolate the specific lines of the search strategy relevant to you. So for example if I am undertaking a systematic review on the 'effectiveness of phototherapy for neonatal jaundice' I may find that part of a search strategy for a published review on 'Phototherapy for treating pressure ulcers' is very relevant for me. I will then need to either create a search strategy element for my other concept of 'neonatal jaundice' or locate another systematic review which may have already created a search strategy including an element for this concept, e.g. one on 'Early intravenous nutrition for the prevention of neonatal jaundice'.

Below is an excerpt from a published Cochrane systematic review on 'Phototherapy for treating pressure ulcers' showing the search lines for the phototherapy concept (optimised for Ovid Medline), which could then be used in a different search and combined with other concepts.

Screenshot of a highlighted section of the Cochrane systematic review search strategy

Chen C, Hou WH, Chan ESY, Yeh ML, Lo HLD. Phototherapy for treating pressure ulcers. Cochrane Database of Systematic Reviews 2014, Issue 7. Art. No.: CD009224. DOI: 10.1002/14651858.CD009224.pub2.

Systematic reviews in the Cochrane Library should all publish their search strategies and in many cases will show the search strategy they used for each database they searched (optimised for each database) - this normally appears in the Appendix of the full text for a Cochrane review.

The PRISMA 2020 checklist states for '#7 Search strategy' that you should "Present the full search strategies for all databases, registers and websites, including any filters and limits used". The 2021 PRISMA searching extension increases that to "Include the search strategies for each database and information source, copied and pasted exactly as run". However, unfortunately you may find that older systematic reviews and even some current ones may not do this. Note too that in some cases the full search strategy may appear as a supporting document on a journal site as opposed to forming part of the pdf of the article.

Pay attention when looking to use a published search strategy to both the database and the platform it is hosted on and for which the search strategy has been optimised. If you are searching the same database on the same platform then you should simply be able to copy and paste the search strategy in to generate results. If you are uncertain about what the lines of the search strategy mean and what they are searching, e.g. exp, adj or .tw, then see the information on the Advanced search techniques tab.

Remember when using a search strategy from another systematic review that you should assess it for quality rather than simply copying it in. Are there any other search terms that should be included, have they located all appropriate subject headings, and so on? See the information about PRESS: Peer Review of Electronic Search Strategies on the Advanced search techniques tab. The age of the source systematic review is also important as subject headings change to reflect changes in medical science. As such if you are working in an area of rapid development and the strategy is a few years old you may wish to just use it as a starting point rather than directly copying.

Finally if you do use all or part of another's search strategy then do remember to acknowledge the source.

  • Last Updated: Feb 12, 2024 11:26 AM
  • URL: https://libguides.kcl.ac.uk/systematicreview


Charles Sturt University

Literature Review: Developing a search strategy

  • Traditional or narrative literature reviews
  • Scoping Reviews
  • Systematic literature reviews
  • Annotated bibliography
  • Keeping up to date with literature
  • Finding a thesis
  • Evaluating sources and critical appraisal of literature
  • Managing and analysing your literature
  • Further reading and resources

From research question to search strategy

Keeping a record of your search activity

Good search practice could involve keeping a search diary or document detailing your search activities (Phelps et al. 2007, pp. 128-149), so that you can keep track of effective search terms and help others reproduce your steps and get the same results.

This record could be a document, table or spreadsheet with:

  • the names of the sources you search and the provider you accessed them through, e.g. Medline (Ovid), Web of Science (Thomson Reuters), plus any other literature sources you used
  • how you searched (keyword and/or subject headings)
  • which search terms you used (which words and phrases)
  • any search techniques you employed (truncation, adjacency, etc.)
  • how you combined your search terms (AND/OR); check out the Database Help guide for more tips on Boolean searching
  • the number of search results from each source and each strategy used; this can be the evidence you need to prove a gap in the literature, and confirms the importance of your research question
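As one possible sketch, such a record can be kept programmatically as a CSV file. The column names and the sample entry below are invented examples, not a prescribed format.

```python
# A minimal search diary written to CSV (the entry below is an invented example).
import csv

FIELDS = ["date", "source (provider)", "search terms", "techniques", "combination", "results"]

rows = [
    {
        "date": "2024-02-01",
        "source (provider)": "Medline (Ovid)",
        "search terms": "phototherapy, light therapy",
        "techniques": "truncation: phototherap*",
        "combination": "1 OR 2",
        "results": 5423,
    },
]

with open("search_diary.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)
```

Appending one row per search run gives a reproducible log of what was searched, where, and with how many results.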

A search planner may help you to organise your thoughts prior to conducting your search. If you have any problems organising your thoughts before, during or after searching, please contact your Library Faculty Team for individual help.

  • Literature search - a librarian's handout to introduce tools, terms and techniques Created by Elsevier librarian, Katy Kavanagh Web, this document outlines tools, terms and techniques to think about when conducting a literature search.
  • Search planner

Literature search cycle


Diagram text description

This diagram illustrates the literature search cycle as a circle divided into quarters, each paired with a rectangle describing how to carry out that step:

  • Identify main concepts: identify controlled vocabulary terms, synonyms, keywords and spelling.
  • Select library resources to search: library catalogue, relevant journal articles and other resources.
  • Search resources: consider using Boolean searching, proximity searching and truncation techniques.
  • Review and refine results: evaluate results, rethink keywords and create alerts.

Have a search framework

Search frameworks are mnemonics which can help you focus your research question. They are also useful in helping you to identify the concepts and terms you will use in your literature search.

PICO is a search framework commonly used in the health sciences to focus clinical questions. As an example, suppose you work in an aged care facility and are interested in whether cranberry juice might help reduce the common occurrence of urinary tract infections. The PICO framework would look like this:

  • P (Population): people living in aged care facilities
  • I (Intervention): cranberry juice
  • C (Comparison): no cranberry juice
  • O (Outcome): occurrence of urinary tract infections

Now that the issue has been broken down into its elements, it is easier to turn it into an answerable research question: “Does cranberry juice help reduce urinary tract infections in people living in aged care facilities?”

Other frameworks may be helpful, depending on your question and your field of interest. PICO can be adapted to PICOT (which adds Time), PICOS (which adds Study design), or PICOC (which adds Context).

For qualitative questions you could use

  • SPIDER: Sample, Phenomenon of Interest, Design, Evaluation, Research type

For questions about causes or risk,

  • PEO: Population, Exposure, Outcomes

For evaluations of interventions or policies, 

  • SPICE: Setting, Population or Perspective, Intervention, Comparison, Evaluation, or
  • ECLIPSE: Expectation, Client group, Location, Impact, Professionals, SErvice

See the University of Notre Dame Australia’s examples of some of these frameworks. 

You can also try some PICO examples in the National Library of Medicine's PubMed training site: Using PICO to frame clinical questions.

Contact Your Faculty Team Librarian

Faculty librarians are here to provide assistance to students, researchers and academic staff by providing expert searching advice, research and curriculum support.

  • Faculty of Arts & Education team
  • Faculty of Business, Justice & Behavioural Science team
  • Faculty of Science team

Further reading

Cover Art

  • Last Updated: Jan 16, 2024 1:39 PM
  • URL: https://libguides.csu.edu.au/review


Duke University Libraries

Literature Reviews

  • 3. Search the literature
  • Getting started
  • Types of reviews
  • 1. Define your research question
  • 2. Plan your search

Creating a search strategy

Select your database(s), document your search, rinse and repeat, grey literature, grey literature sources.

  • 4. Organize your results
  • 5. Synthesize your findings
  • 6. Write the review
  • Thompson Writing Studio
  • Need to write a systematic review?


When conducting a literature review, it is imperative to brainstorm a list of keywords related to your topic. Examining the titles, abstracts, and author-provided keywords of pertinent literature is a great starting point.

Things to keep in mind:

  • Alternative spellings (e.g., behavior and behaviour)
  • Variants and truncation (e.g., environ* = environment, environments, environmental, environmentally)
  • Synonyms (e.g., alternative fuels >> electricity, ethanol, natural gas, hydrogen fuel cells)
  • Phrases and double quotes (e.g., "food security" versus food OR security) 

One way to visually organize your thoughts is to create a table where each column represents one concept in your research question. For example, if your research question is...

Does social media play a role in the number of eating disorder diagnoses in college-aged women?

...then your table might look something like this:
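As an illustrative sketch, a concept table like this can be turned into a combined Boolean search string by OR-ing the synonyms within each column and AND-ing the columns together. The synonym lists below are invented examples, not a vetted search strategy.

```python
# Building a Boolean search string from a concept table.
# Each key is one concept (column); each list holds illustrative synonyms.

concepts = {
    "social media": ['"social media"', "Instagram", "TikTok", "Facebook"],
    "eating disorders": ['"eating disorder*"', "anorexia", "bulimia"],
    "college-aged women": ['"college student*"', '"young women"', "undergraduate*"],
}

# OR together synonyms within each concept, then AND the concepts together.
query = " AND ".join(
    "(" + " OR ".join(terms) + ")" for terms in concepts.values()
)
print(query)
# ("social media" OR Instagram OR TikTok OR Facebook) AND ("eating disorder*" OR ...) AND (...)
```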

Generative AI tools, such as chatbots, are actually quite helpful at this stage when it comes to brainstorming synonyms and other related terms. You can also look at author-provided keywords from benchmark articles (key papers related to your topic), databases' controlled vocabularies, or do a preliminary search and look through abstracts from relevant papers.

Generative AI tools: ChatGPT, Google Gemini (formerly Bard), Claude, Microsoft Copilot

For more information on how to incorporate AI tools into your research, check out the section on  AI Tools .

Boolean searching yields more effective and precise search results. Boolean operators include AND, OR, and NOT. These are logic-based words that help search engines narrow down or broaden search results.

Using the Operators

The Boolean operator  AND  tells a search engine that you want to find information about two (or more) search terms. For example, sustainability AND plastics. This will narrow down your search results because the search engine will only bring back results that include both search terms.

The Boolean operator  OR  tells the search engine that you want to find information about either search term you've entered. For example, sustainability OR plastics. This will broaden your search results because the search engine will bring back any results that have either search term in them.

The Boolean operator  NOT  tells the search engine that you want to find information about the first search term, but nothing about the second. For example, sustainability NOT plastics. This will narrow down your research results because the search engine will bring back only resources about the first search term (sustainability), but exclude any resources that include the second search term (plastics).
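These three behaviours map directly onto set operations over result IDs; here is a minimal sketch using invented IDs for the sustainability/plastics example above.

```python
# AND, OR, and NOT as set operations (hypothetical result IDs).
sustainability = {1, 2, 3, 4}   # results matching "sustainability"
plastics = {3, 4, 5}            # results matching "plastics"

print(sorted(sustainability & plastics))  # AND -> [3, 4]: both terms, narrower
print(sorted(sustainability | plastics))  # OR  -> [1, 2, 3, 4, 5]: either term, broader
print(sorted(sustainability - plastics))  # NOT -> [1, 2]: excludes plastics results
```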

Boolean searching Venn diagram

Some databases offer a thesaurus, controlled vocabulary, or list of available subject headings assigned to each of their records, either by an indexer or by the original author. The use of controlled vocabularies is a highly effective, efficient, and deliberate way of comprehensively discovering the material within a field of study.

  • APA Thesaurus of Psychological Index Terms  (via PsycInfo database)
  • Medical Subject Headings (MeSH)  (via PubMed)
  • List of ProQuest database thesauri

Web of Science's Core Collection offers a list of subject categories that are searchable via the Web of Science Categories field.

Reach out to a Duke University Libraries librarian at [email protected] or use the chat function.


Not sure where to start when selecting a scholarly database to search? Here are some top databases:

While not essential for traditional literature reviews, documenting your search can help you:

  • Keep track of what you've done so that you don't repeat unproductive searches
  • Reuse successful search strategies for future papers
  • Help you describe your search process for manuscripts
  • Justify your search process

Documenting your search will help you stay organized and save time when tweaking your search strategy. This is a critical step for rigorous review papers, such as  systematic reviews .

One of the easiest ways to document your search strategy is to use a table like this:

literature search strategy table

If you find that you're receiving too many results, try the following tips:

  • Use more AND operators to connect keywords/concepts in order to narrow down your search.
  • Use more specific keywords rather than an umbrella term (e.g., "formaldehyde" instead of "chemical").
  • Use quotation marks (" ") to search an entire phrase.
  • Use filters such as date, language, document type, etc.
  • Examine your research question to see if it's still too broad.

On the other hand, if you're not receiving enough results:

  • Use more OR operators to connect related terms and bring in additional results.
  • Use more generic terms (e.g., "acetone" instead of "dimethyl ketone") or fewer keywords altogether.
  • Use wildcard operators (*) to expand your results (e.g., toxi* searches toxic, toxin, toxins).
  • Examine your research question to see if it's too narrow.
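As a rough sketch, the effect of a trailing wildcard can be approximated with a regular expression; real databases implement truncation internally, and the helper below is purely illustrative.

```python
# Approximating database-style truncation (e.g. toxi*) with a regex.
import re

def truncation_to_regex(term):
    """Convert a truncated term like 'toxi*' into a case-insensitive regex."""
    stem = re.escape(term.rstrip("*"))
    return re.compile(r"\b" + stem + r"\w*\b", re.IGNORECASE)

pattern = truncation_to_regex("toxi*")
text = "Toxic exposure produced a toxin; several toxins were toxicological markers."
print(pattern.findall(text))  # ['Toxic', 'toxin', 'toxins', 'toxicological']
```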

Grey (or gray) literature refers to research materials and publications that are not commercially published or widely distributed through traditional academic channels. If you are tasked with doing an intensive type of review or evidence synthesis, or you are involved in research related to policy-making, you will likely want to include searching for grey literature. This type of literature includes:

  • working papers
  • government documents
  • conference proceedings
  • theses and dissertations
  • white papers, etc.

For more information on grey literature, please see our Grey Literature guide .

  • Public policy
  • Health/medicine
  • Statistics/data
  • Thesis/dissertation
  • ProQuest Central Search for articles from thousands of scholarly journals.
  • OpenDOAR OpenDOAR is the quality-assured, global Directory of Open Access Repositories. We host repositories that provide free, open access to academic outputs and resources.
  • OAIster A catalog of millions of open-access resources harvested from WorldCat.
  • GreySource An index of repository hyperlinks across all disciplines.
  • Pew Research Center Pew Research Center is a nonpartisan fact tank that informs the public about the issues, attitudes and trends shaping the world. We conduct public opinion polling, demographic research, content analysis and other data-driven social science research.
  • The World Bank The World Bank is a vital source of financial and technical assistance to developing countries around the world.
  • World Health Organization (WHO): IRIS IRIS is the Institutional Repository for Information Sharing, a digital library of WHO's published material and technical information in full text produced since 1948.
  • PolicyArchive PolicyArchive is a comprehensive digital library of public policy research containing over 30,000 documents.
  • Kaiser Family Foundation KFF is the independent source for health policy research, polling, and journalism. Our mission is to serve as a nonpartisan source of information for policymakers, the media, the health policy community, and the public.
  • MedNar Mednar is a free, medically-focused deep web search engine that uses Explorit Everywhere!, an advanced search technology by Deep Web Technologies. As an alternative to Google, Mednar accelerates your research with a search of authoritative public and deep web resources, returning the most relevant results to one easily navigable page.
  • Global Index Medicus The Global Index Medicus (GIM) provides worldwide access to biomedical and public health literature produced by and within low-middle income countries. The main objective is to increase the visibility and usability of this important set of resources. The material is collated and aggregated by WHO Regional Office Libraries on a central search platform allowing retrieval of bibliographical and full text information.

For more in-depth information related to grey literature searching in medicine, please visit Duke Medical Center Library's guide .

  • Education Resources Information Center (ERIC) ERIC is a comprehensive, easy-to-use, searchable, Internet-based bibliographic and full-text database of education research and information. It is sponsored by the Institute of Education Sciences within the U.S. Department of Education.
  • National Center for Occupational Safety and Health (NIOSHTIC-2) NIOSHTIC-2 is a searchable bibliographic database of occupational safety and health publications, documents, grant reports, and other communication products supported in whole or in part by NIOSH (CDC).
  • National Technical Information Service (NTIS) The National Technical Information Service acquires, indexes, abstracts, and archives the largest collection of U.S. government-sponsored technical reports in existence. The NTRL offers online, free and open access to these authenticated government technical reports.
  • Science.gov Science.gov provides access to millions of authoritative scientific research results from U.S. federal agencies.
  • GovInfo GovInfo is a service of the United States Government Publishing Office (GPO), which is a Federal agency in the legislative branch. GovInfo provides free public access to official publications from all three branches of the Federal Government.
  • CQ Press Library Search for analysis of Congressional actions and US political issues. Includes CQ Weekly and CQ Researcher.
  • Congressional Research Service (CRS) This collection provides the public with access to research products produced by the Congressional Research Service (CRS) for the United States Congress.

Please see the Data Sets and Collections page from our Statistical Sciences guide.

  • arXiv arXiv is a free distribution service and an open-access archive for nearly 2.4 million scholarly articles in the fields of physics, mathematics, computer science, quantitative biology, quantitative finance, statistics, electrical engineering and systems science, and economics. Materials on this site are not peer-reviewed by arXiv.
  • OSF Preprints OSF Preprints is an open access option for discovering multidisciplinary preprints as well as postprints and working papers.
  • Last Updated: Feb 15, 2024 1:45 PM
  • URL: https://guides.library.duke.edu/lit-reviews


  • Open access
  • Published: 14 August 2018

Defining the process to literature searching in systematic reviews: a literature review of guidance and supporting studies

  • Chris Cooper (ORCID: orcid.org/0000-0003-0864-5607),
  • Andrew Booth,
  • Jo Varley-Campbell,
  • Nicky Britten &
  • Ruth Garside

BMC Medical Research Methodology volume 18, Article number: 85 (2018)


Systematic literature searching is recognised as a critical component of the systematic review process. It involves a systematic search for studies and aims for a transparent report of study identification, leaving readers clear about what was done to identify studies, and how the findings of the review are situated in the relevant evidence.

Information specialists and review teams appear to work from a shared and tacit model of the literature search process. How this tacit model has developed and evolved is unclear, and it has not been explicitly examined before.

The purpose of this review is to determine if a shared model of the literature searching process can be detected across systematic review guidance documents and, if so, how this process is reported in the guidance and supported by published studies.

Method

A literature review.

Two types of literature were reviewed: guidance and published studies. Nine guidance documents were identified, including the Cochrane and Campbell Handbooks. Published studies were identified through ‘pearl growing’, citation chasing, a search of PubMed using the systematic review methods filter, and the authors’ topic knowledge.

The relevant sections within each guidance document were then read and re-read, with the aim of determining key methodological stages. Methodological stages were identified and defined. These data were reviewed to identify agreements and areas of unique guidance between guidance documents. Consensus across multiple guidance documents was used to inform the selection of ‘key stages’ in the process of literature searching.

Eight key stages were determined relating specifically to literature searching in systematic reviews. They were: who should literature search, aims and purpose of literature searching, preparation, the search strategy, searching databases, supplementary searching, managing references and reporting the search process.

Conclusions

Eight key stages to the process of literature searching in systematic reviews were identified. These key stages are consistently reported in the nine guidance documents, suggesting consensus on the key stages of literature searching, and therefore the process of literature searching as a whole, in systematic reviews. Further research to determine the suitability of using the same process of literature searching for all types of systematic review is indicated.

Background

Systematic literature searching is recognised as a critical component of the systematic review process. It involves a systematic search for studies and aims for a transparent report of study identification, leaving review stakeholders clear about what was done to identify studies, and how the findings of the review are situated in the relevant evidence.

Information specialists and review teams appear to work from a shared and tacit model of the literature search process. How this tacit model has developed and evolved is unclear, and it has not been explicitly examined before. This is in contrast to the information science literature, which has developed information processing models as an explicit basis for dialogue and empirical testing. Without an explicit model, research in the process of systematic literature searching will remain immature and potentially uneven, and the development of shared information models will be assumed but never articulated.

One way of developing such a conceptual model is by formally examining the implicit “programme theory” as embodied in key methodological texts. The aim of this review is therefore to determine if a shared model of the literature searching process in systematic reviews can be detected across guidance documents and, if so, how this process is reported and supported.

Identifying guidance

Key texts (henceforth referred to as “guidance”) were identified based upon their accessibility to, and prominence within, United Kingdom systematic reviewing practice. The United Kingdom occupies a prominent position in the science of health information retrieval, as quantified by such objective measures as the authorship of papers, the number of Cochrane groups based in the UK, membership and leadership of groups such as the Cochrane Information Retrieval Methods Group, the HTA-I Information Specialists’ Group and historic association with such centres as the UK Cochrane Centre, the NHS Centre for Reviews and Dissemination, the Centre for Evidence Based Medicine and the National Institute for Clinical Excellence (NICE). Coupled with the linguistic dominance of English within medical and health science and the science of systematic reviews more generally, this offers a justification for a purposive sample that favours UK, European and Australian guidance documents.

Nine guidance documents were identified. These documents provide guidance for different types of reviews, namely: reviews of interventions, reviews of health technologies, reviews of qualitative research studies, reviews of social science topics, and reviews to inform guidance.

Whilst these guidance documents occasionally offer additional guidance on other types of systematic reviews, we have focused on the core and stated aims of these documents as they relate to literature searching. Table 1 sets out each guidance document, the version audited, its core stated focus, and a bibliographical pointer to the main guidance relating to literature searching.

Once a list of key guidance documents was determined, it was checked by six senior information professionals based in the UK for relevance to current literature searching in systematic reviews.

Identifying supporting studies

In addition to identifying guidance, the authors sought to populate an evidence base of supporting studies (henceforth referred to as “studies”) that contribute to existing search practice. Studies were first identified by the authors from their knowledge of this topic area and, subsequently, through systematically chasing the citations of key studies (‘pearls’ [ 1 ]) located within each key stage of the search process. These studies are identified in Additional file 1: Appendix Table 1. Citation chasing was conducted by analysing the bibliography of references for each study (backwards citation chasing) and through Google Scholar (forward citation chasing). A search of PubMed using the systematic review methods filter was undertaken in August 2017 (see Additional file 1). The search terms used were: (literature search*[Title/Abstract]) AND sysrev_methods[sb], which returned 586 results. These results were sifted for relevance to the key stages in Fig. 1 by CC.

Fig. 1 The key stages of literature search guidance as identified from nine key texts
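The PubMed search described above can be reproduced programmatically against NCBI's public E-utilities endpoint. The sketch below is illustrative only (the function name and `retmax` default are our own): it merely constructs the esearch request URL rather than executing it, since running the query requires network access and is subject to NCBI's usage policies.

```python
from urllib.parse import urlencode

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def build_esearch_url(query: str, retmax: int = 100) -> str:
    """Build an NCBI E-utilities esearch URL for a PubMed query."""
    params = {"db": "pubmed", "term": query, "retmax": retmax, "retmode": "json"}
    return f"{EUTILS}?{urlencode(params)}"

# The filtered search reported in the text:
query = "(literature search*[Title/Abstract]) AND sysrev_methods[sb]"
url = build_esearch_url(query)
print(url)
```

Note that `sysrev_methods[sb]` is a PubMed subset filter, so the query string is passed through unchanged; only the URL encoding is handled by the helper.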

Extracting the data

To reveal the implicit process of literature searching within each guidance document, the relevant sections (chapters) on literature searching were read and re-read, with the aim of determining key methodological stages. We defined a key methodological stage as a distinct step in the overall process for which specific guidance is reported, and action is taken, that collectively would result in a completed literature search.

The chapter or section sub-heading for each methodological stage was extracted into a table using the exact language as reported in each guidance document. The lead author (CC) then read and re-read these data, and the paragraphs of the document to which the headings referred, summarising section details. This table was then reviewed, using comparison and contrast to identify agreements and areas of unique guidance. Consensus across multiple guidelines was used to inform selection of ‘key stages’ in the process of literature searching.

Having determined the key stages to literature searching, we then read and re-read the sections relating to literature searching again, extracting specific detail relating to the methodological process of literature searching within each key stage. Again, the guidance was then read and re-read, first on a document-by-document basis and, secondly, across all of the documents, to identify both commonalities and areas of unique guidance.

Results and discussion

Our findings.

We were able to identify consensus across the guidance on literature searching for systematic reviews suggesting a shared implicit model within the information retrieval community. Whilst the structure of the guidance varies between documents, the same key stages are reported, even where the core focus of each document is different. We were able to identify specific areas of unique guidance, where a document reported guidance not summarised in other documents, together with areas of consensus across guidance.

Unique guidance

Only one document provided guidance on the topic of when to stop searching [ 2 ]. This guidance from 2005 anticipates a topic of increasing importance with the current interest in time-limited (i.e. “rapid”) reviews. Quality assurance (or peer review) of literature searches was only covered in two guidance documents [ 3 , 4 ]. This topic has emerged as increasingly important as indicated by the development of the PRESS instrument [ 5 ]. Text mining was discussed in four guidance documents [ 4 , 6 , 7 , 8 ] where the automation of some manual review work may offer efficiencies in literature searching [ 8 ].

Agreement between guidance: Defining the key stages of literature searching

Where there was agreement on the process, we determined that this constituted a key stage in the process of literature searching to inform systematic reviews.

From the guidance, we determined eight key stages that relate specifically to literature searching in systematic reviews. These are summarised in Fig. 1. The data extraction table informing Fig. 1 is reported in Table 2. Table 2 reports the areas of common agreement and demonstrates that the language used to describe key stages and processes varies significantly between guidance documents.

For each key stage, we set out the specific guidance, followed by discussion on how this guidance is situated within the wider literature.

Key stage one: Deciding who should undertake the literature search

The guidance.

Eight documents provided guidance on who should undertake literature searching in systematic reviews [ 2 , 4 , 6 , 7 , 8 , 9 , 10 , 11 ]. The guidance affirms that people with relevant expertise of literature searching should ‘ideally’ be included within the review team [ 6 ]. Information specialists (or information scientists), librarians or trial search co-ordinators (TSCs) are indicated as appropriate researchers in six guidance documents [ 2 , 7 , 8 , 9 , 10 , 11 ].

How the guidance corresponds to the published studies

The guidance is consistent with studies that call for the involvement of information specialists and librarians in systematic reviews [ 12 , 13 , 14 , 15 , 16 , 17 , 18 , 19 , 20 , 21 , 22 , 23 , 24 , 25 , 26 ] and which demonstrate how their training as ‘expert searchers’ and ‘analysers and organisers of data’ can be put to good use [ 13 ] in a variety of roles [ 12 , 16 , 20 , 21 , 24 , 25 , 26 ]. These arguments make sense in the context of the aims and purposes of literature searching in systematic reviews, explored below. The need for ‘thorough’ and ‘replicable’ literature searches was fundamental to the guidance and recurs in key stage two. Studies have found poor reporting, and a lack of replicable literature searches, to be a weakness in systematic reviews [ 17 , 18 , 27 , 28 ] and they argue that involvement of information specialists/ librarians would be associated with better reporting and better quality literature searching. Indeed, Meert et al. [ 29 ] demonstrated that involving a librarian as a co-author to a systematic review correlated with a higher score in the literature searching component of a systematic review [ 29 ]. As ‘new styles’ of rapid and scoping reviews emerge, where decisions on how to search are more iterative and creative, a clear role is made here too [ 30 ].

Knowing where to search for studies was noted as important in the guidance, with no agreement as to the appropriate number of databases to be searched [ 2 , 6 ]. Database (and resource selection more broadly) is acknowledged as a relevant key skill of information specialists and librarians [ 12 , 15 , 16 , 31 ].

Whilst arguments for including information specialists and librarians in the process of systematic review might be considered self-evident, Koffel and Rethlefsen [ 31 ] have questioned if the necessary involvement is actually happening [ 31 ].

Key stage two: Determining the aim and purpose of a literature search

The aim: Five of the nine guidance documents use adjectives such as ‘thorough’, ‘comprehensive’, ‘transparent’ and ‘reproducible’ to define the aim of literature searching [ 6 , 7 , 8 , 9 , 10 ]. Analogous phrases were present in a further three guidance documents, namely: ‘to identify the best available evidence’ [ 4 ] or ‘the aim of the literature search is not to retrieve everything. It is to retrieve everything of relevance’ [ 2 ] or ‘A systematic literature search aims to identify all publications relevant to the particular research question’ [ 3 ]. The Joanna Briggs Institute reviewers’ manual was the only guidance document where a clear statement on the aim of literature searching could not be identified. The purpose of literature searching was defined in three guidance documents, namely to minimise bias in the resultant review [ 6 , 8 , 10 ]. Accordingly, eight of nine documents clearly asserted that thorough and comprehensive literature searches are required as a potential mechanism for minimising bias.

The need for thorough and comprehensive literature searches appears as uniform within the eight guidance documents that describe approaches to literature searching in systematic reviews of effectiveness. Reviews of effectiveness (of intervention or cost), accuracy and prognosis, require thorough and comprehensive literature searches to transparently produce a reliable estimate of intervention effect. The belief that all relevant studies have been ‘comprehensively’ identified, and that this process has been ‘transparently’ reported, increases confidence in the estimate of effect and the conclusions that can be drawn [ 32 ]. The supporting literature exploring the need for comprehensive literature searches focuses almost exclusively on reviews of intervention effectiveness and meta-analysis. Different ‘styles’ of review may have different standards however; the alternative, offered by purposive sampling, has been suggested in the specific context of qualitative evidence syntheses [ 33 ].

What is a comprehensive literature search?

Whilst the guidance calls for thorough and comprehensive literature searches, it lacks clarity on what constitutes a thorough and comprehensive literature search, beyond the implication that all of the literature search methods in Table 2 should be used to identify studies. Egger et al. [ 34 ], in an empirical study evaluating the importance of comprehensive literature searches for trials in systematic reviews, defined a comprehensive search for trials as:

a search not restricted to English language;

where Cochrane CENTRAL or at least two other electronic databases had been searched (such as MEDLINE or EMBASE); and

at least one of the following search methods has been used to identify unpublished trials: searches for (i) conference abstracts, (ii) theses, (iii) trials registers; and (iv) contacts with experts in the field [ 34 ].

Tricco et al. (2008) used a similar threshold of bibliographic database searching AND a supplementary search method in a review when examining the risk of bias in systematic reviews. Their criteria were: one database (limited using the Cochrane Highly Sensitive Search Strategy (HSSS)) and handsearching [ 35 ].

Together with the guidance, this would suggest that comprehensive literature searching requires the use of BOTH bibliographic database searching AND supplementary search methods.
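To make that threshold concrete, Egger et al.'s three conditions can be encoded as a simple checklist. The sketch below is purely illustrative: the function name and field names (`english_only`, `databases`, `supplementary_methods`) are our own, not drawn from any established schema.

```python
def is_comprehensive(search: dict) -> bool:
    """Check a described search against Egger et al.'s three conditions."""
    # (1) the search is not restricted to English
    unrestricted_language = not search.get("english_only", False)
    # (2) Cochrane CENTRAL, or at least two other electronic databases, searched
    databases = set(search.get("databases", []))
    enough_databases = "CENTRAL" in databases or len(databases) >= 2
    # (3) at least one method used to identify unpublished trials
    unpublished_methods = {"conference_abstracts", "theses",
                           "trial_registers", "expert_contacts"}
    any_supplementary = bool(unpublished_methods &
                             set(search.get("supplementary_methods", [])))
    return unrestricted_language and enough_databases and any_supplementary
```

Under this checklist, a search restricted to English, or one relying on bibliographic databases alone, would not count as comprehensive.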

Comprehensiveness in literature searching, in the sense of how much searching should be undertaken, remains unclear. Egger et al. recommend that ‘investigators should consider the type of literature search and degree of comprehension that is appropriate for the review in question, taking into account budget and time constraints’ [ 34 ]. This view tallies with the Cochrane Handbook, which stipulates clearly that study identification should be undertaken ‘within resource limits’ [ 9 ]. This would suggest that limits to comprehensiveness are recognised, but it raises questions on how this is decided and reported [ 36 ].

What is the point of comprehensive literature searching?

The purpose of thorough and comprehensive literature searches is to avoid missing key studies and to minimise bias [ 6 , 8 , 10 , 34 , 37 , 38 , 39 ], since a systematic review based only on published (or easily accessible) studies may have an exaggerated effect size [ 35 ]. Felson (1992) sets out potential biases that could affect the estimate of effect in a meta-analysis [ 40 ], and Tricco et al. summarise the evidence concerning bias and confounding in systematic reviews [ 35 ]. Egger et al. point to non-publication of studies, publication bias, language bias and MEDLINE bias as key biases [ 34 , 35 , 40 , 41 , 42 , 43 , 44 , 45 , 46 ]. Comprehensive searches are not the sole factor mitigating these biases, but their contribution is thought to be significant [ 2 , 32 , 34 ]. Fehrmann (2011) suggests that describing the search process in detail, and noting where standard comprehensive search techniques have been applied, increases confidence in the search results [ 32 ].

Does comprehensive literature searching work?

Egger et al., and other study authors, have demonstrated a change in the estimate of intervention effectiveness where relevant studies were excluded from meta-analysis [ 34 , 47 ]. This would suggest that missing studies in literature searching alters the reliability of effectiveness estimates. This is an argument for comprehensive literature searching. Conversely, Egger et al. found that ‘comprehensive’ searches still missed studies and that comprehensive searches could, in fact, introduce bias into a review rather than preventing it, through the identification of low quality studies then being included in the meta-analysis [ 34 ]. Studies query if identifying and including low quality or grey literature studies changes the estimate of effect [ 43 , 48 ] and question if time is better invested updating systematic reviews rather than searching for unpublished studies [ 49 ], or mapping studies for review as opposed to aiming for high sensitivity in literature searching [ 50 ].

Aim and purpose beyond reviews of effectiveness

The need for comprehensive literature searches is less certain in reviews of qualitative studies, and in reviews where a comprehensive identification of studies is difficult to achieve (for example, in public health) [ 33 , 51 , 52 , 53 , 54 , 55 ]. Literature searching for qualitative studies, and in public health topics, typically generates a greater number of studies to sift than in reviews of effectiveness [ 39 ], and demonstrating the ‘value’ of studies identified or missed is harder [ 56 ], since the study data do not typically support meta-analysis. Nussbaumer-Streit et al. (2016) have registered a review protocol to assess whether abbreviated literature searches (as opposed to comprehensive literature searches) have an impact on conclusions across multiple bodies of evidence, not only on effect estimates [ 57 ], which may develop this understanding. It may be that decision makers and users of systematic reviews are willing to trade the certainty from a comprehensive literature search and systematic review in exchange for different approaches to evidence synthesis [ 58 ], and that comprehensive literature searches are not necessarily a marker of literature search quality, as previously thought [ 36 ]. Different approaches to literature searching [ 37 , 38 , 59 , 60 , 61 , 62 ] and developing the concept of when to stop searching are important areas for further study [ 36 , 59 ].

The study by Nussbaumer-Streit et al. has been published since the submission of this literature review [ 63 ]. Nussbaumer-Streit et al. (2018) conclude that abbreviated literature searches are viable options for rapid evidence syntheses, if decision-makers are willing to trade the certainty from a comprehensive literature search and systematic review, but that decision-making which demands detailed scrutiny should still be based on comprehensive literature searches [ 63 ].

Key stage three: Preparing for the literature search

Six documents provided guidance on preparing for a literature search [ 2 , 3 , 6 , 7 , 9 , 10 ]. The Cochrane Handbook clearly stated that Cochrane authors (i.e. researchers) should seek advice from a trial search co-ordinator (i.e. a person with specific skills in literature searching) ‘before’ starting a literature search [ 9 ].

Two key tasks were perceptible in preparing for a literature search [ 2 , 6 , 7 , 10 , 11 ]: first, to determine if there are any existing or on-going reviews, or if a new review is justified [ 6 , 11 ]; and, second, to develop an initial literature search strategy to estimate the volume of relevant literature (and the quality of a small sample of relevant studies [ 10 ]) and to indicate the resources required for literature searching and the review of the studies that follows [ 7 , 10 ].

Three documents summarised guidance on where to search to determine if a new review was justified [ 2 , 6 , 11 ]. These focused on searching databases of systematic reviews (the Cochrane Database of Systematic Reviews (CDSR) and the Database of Abstracts of Reviews of Effects (DARE)), institutional registries (including PROSPERO), and MEDLINE [ 6 , 11 ]. It is worth noting, however, that as of 2015, DARE (and NHS EED) are no longer being updated, and so the relevance of these resources will diminish over time [ 64 ]. One guidance document, ‘Systematic Reviews in the Social Sciences’, noted, however, that databases are not the only source of information and that unpublished reports, conference proceedings and grey literature may also be required, depending on the nature of the review question [ 2 ].

Two documents reported clearly that this preparation (or ‘scoping’) exercise should be undertaken before the actual search strategy is developed [ 7 , 10 ].

The guidance offers the best available source on preparing the literature search, since the published studies do not typically report how scoping informed the development of their search strategies, nor how their search approaches were developed. Text mining has been proposed as a technique to develop search strategies in the scoping stages of a review, although this work is still exploratory [ 65 ]. ‘Clustering documents’ and word frequency analysis have also been tested to identify search terms and studies for review [ 66 , 67 ]. Preparing for literature searches and scoping constitutes an area for future research.

Key stage four: Designing the search strategy

The Population, Intervention, Comparator, Outcome (PICO) structure was the most commonly reported structure promoted to design a literature search strategy. Five documents suggested that the eligibility criteria or review question will determine which concepts of PICO will be populated to develop the search strategy [ 1 , 4 , 7 , 8 , 9 ]. The NICE handbook promoted multiple structures, namely PICO, SPICE (Setting, Perspective, Intervention, Comparison, Evaluation) and multi-stranded approaches [ 4 ].

With the exclusion of The Joanna Briggs Institute reviewers’ manual, the guidance offered detail on selecting key search terms, synonyms, Boolean language, selecting database indexing terms and combining search terms. The CEE handbook suggested that ‘search terms may be compiled with the help of the commissioning organisation and stakeholders’ [ 10 ].

The use of limits, such as language or date limits, was discussed in all documents [ 2 , 3 , 4 , 6 , 7 , 8 , 9 , 10 , 11 ].

Search strategy structure

The guidance typically relates to reviews of intervention effectiveness so PICO – with its focus on intervention and comparator - is the dominant model used to structure literature search strategies [ 68 ]. PICOs – where the S denotes study design - is also commonly used in effectiveness reviews [ 6 , 68 ]. As the NICE handbook notes, alternative models to structure literature search strategies have been developed and tested. Booth provides an overview on formulating questions for evidence based practice [ 69 ] and has developed a number of alternatives to the PICO structure, namely: BeHEMoTh (Behaviour of interest; Health context; Exclusions; Models or Theories) for use when systematically identifying theory [ 55 ]; SPICE (Setting, Perspective, Intervention, Comparison, Evaluation) for identification of social science and evaluation studies [ 69 ] and, working with Cooke and colleagues, SPIDER (Sample, Phenomenon of Interest, Design, Evaluation, Research type) [ 70 ]. SPIDER has been compared to PICO and PICOs in a study by Methley et al. [ 68 ].
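The conventional construction described in the guidance — synonyms combined with OR within each concept, and the concept groups combined with AND — can be sketched as follows. The helper name and the example terms are invented for illustration; real strategies would also add database indexing terms and field tags.

```python
def build_boolean_query(concepts: dict) -> str:
    """OR together the synonyms within each concept, then AND the
    resulting concept groups, quoting multi-word phrases."""
    groups = []
    for terms in concepts.values():
        quoted = [f'"{t}"' if " " in t else t for t in terms]
        groups.append("(" + " OR ".join(quoted) + ")")
    return " AND ".join(groups)

pico = {
    "Population": ["adolescent*", "teenager*"],
    "Intervention": ["exercise", "physical activity"],
    "Outcome": ["depression", "depressive symptoms"],
}
print(build_boolean_query(pico))
# → (adolescent* OR teenager*) AND (exercise OR "physical activity") AND (depression OR "depressive symptoms")
```

The same helper works for SPICE, SPIDER or multi-stranded structures, since each is ultimately a set of concept groups combined in the same way.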

The NICE handbook also suggests the use of multi-stranded approaches to developing literature search strategies [ 4 ]. Glanville developed this idea in a study by Whiting et al. [ 71 ], and a worked example of this approach is included in the development of a search filter by Cooper et al. [ 72 ].

Writing search strategies: Conceptual and objective approaches

Hausner et al. [ 73 ] provide guidance on writing literature search strategies, delineating between conceptually and objectively derived approaches. The conceptual approach, advocated by and explained in the guidance documents, relies on the expertise of the literature searcher to identify key search terms and then develop key terms to include synonyms and controlled syntax. Hausner and colleagues set out the objective approach [ 73 ] and describe what may be done to validate it [ 74 ].

The use of limits

The guidance documents offer direction on the use of limits within a literature search. Limits can be used to focus literature searching to specific study designs or by other markers (such as by date) which limits the number of studies returned by a literature search. The use of limits should be described and the implications explored [ 34 ] since limiting literature searching can introduce bias (explored above). Craven et al. have suggested the use of a supporting narrative to explain decisions made in the process of developing literature searches and this advice would usefully capture decisions on the use of search limits [ 75 ].

Key stage five: Determining the process of literature searching and deciding where to search (bibliographic database searching)

Table 2 summarises the process of literature searching as reported in each guidance document. Searching bibliographic databases was consistently reported as the ‘first step’ to literature searching in all nine guidance documents.

Three documents reported specific guidance on where to search, in each case specific to the type of review their guidance informed, and as a minimum requirement [ 4 , 9 , 11 ]. Seven of the key guidance documents suggest that the selection of bibliographic databases depends on the topic of review [ 2 , 3 , 4 , 6 , 7 , 8 , 10 ], with two documents noting the absence of an agreed standard on what constitutes an acceptable number of databases searched [ 2 , 6 ].

The guidance documents summarise ‘how to’ search bibliographic databases in detail, and this guidance is further contextualised above in terms of developing the search strategy. The documents provide guidance on selecting bibliographic databases, in some cases stating acceptable minima (i.e. the Cochrane Handbook states Cochrane CENTRAL, MEDLINE and EMBASE), and in other cases simply listing the bibliographic databases available to search. Studies have explored the value of searching specific bibliographic databases, with Wright et al. (2015) noting the contribution of CINAHL in identifying qualitative studies [ 76 ], Beckles et al. (2013) questioning the contribution of CINAHL to identifying clinical studies for guideline development [ 77 ], and Cooper et al. (2015) exploring the role of UK-focused bibliographic databases in identifying UK-relevant studies [ 78 ]. The host of the database (e.g. OVID or ProQuest) has been shown to alter the search returns offered: Younger and Boddy [ 79 ] report differing search returns from the same database (AMED) where the ‘host’ was different.

The average number of bibliographic databases searched in systematic reviews rose from one to four in the period 1994–2014 [ 80 ], but there remains (as attested to by the guidance) no consensus on what constitutes an acceptable number of databases searched [ 48 ]. This is perhaps because the number of databases searched is the wrong question; researchers should instead focus on which databases were searched, and why, and which databases were not searched, and why. The discussion should re-orientate to the differential value of sources, but researchers need to think about how to report this in studies to allow findings to be generalised. Bethel (2017) has proposed ‘search summaries’, completed by the literature searcher, to record where included studies were identified, whether from databases (and which databases specifically) or from supplementary search methods [ 81 ]. Search summaries document both the yield and the accuracy of searches, which could prospectively inform resource use and decisions to search, or not to search, specific databases in topic areas. The prospective use of such data presupposes, however, that past searches are a potential predictor of future search performance (i.e. that each topic is representative and not unique). In offering a body of practice, these data would be of greater practicable use than current studies, which are considered little more than individual case studies [ 82 , 83 , 84 , 85 , 86 , 87 , 88 , 89 , 90 ].
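The bookkeeping behind such a search summary can be approximated with a small amount of code: record which records each database returned and which studies were finally included, then derive yield and precision per source. The sketch below is our own illustration of the idea; the dictionary layout is assumed, not taken from Bethel (2017).

```python
def search_summary(results_by_source: dict, included_ids: set) -> dict:
    """Per source: yield (records retrieved), the number of those
    records included in the review, and the resulting precision."""
    summary = {}
    for source, record_ids in results_by_source.items():
        record_ids = set(record_ids)
        hits = record_ids & included_ids
        summary[source] = {
            "yield": len(record_ids),
            "included": len(hits),
            "precision": round(len(hits) / len(record_ids), 3) if record_ids else 0.0,
        }
    return summary

summary = search_summary(
    {"MEDLINE": ["s1", "s2", "s3", "s4"], "CINAHL": ["s3", "s5"]},
    included_ids={"s3", "s5"},
)
```

Accumulated across reviews in a topic area, figures of this kind are what would let a team decide prospectively whether a given database is worth searching.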

When to search databases is another question posed in the literature. Beyer et al. [ 91 ] report that databases can be prioritised for literature searching which, whilst not addressing the question of which databases to search, may at least bring clarity as to which databases to search first. Paradoxically, this links to studies that suggest PubMed should be searched in addition to MEDLINE (via the OVID interface), since this improves the currency of systematic reviews [ 92 , 93 ]. Cooper et al. (2017) have tested the idea of database searching not as a primary search method (as suggested in the guidance) but as a supplementary search method, in order to manage the volume of studies identified for an environmental effectiveness systematic review. Their case study compared the effectiveness of database searching against a protocol using supplementary search methods and found that the latter identified more relevant studies for review than searching bibliographic databases [ 94 ].

Key stage six: Determining the process of literature searching and deciding where to search (supplementary search methods)

Table 2 also summarises the process of literature searching that follows bibliographic database searching. As Table 2 sets out, guidance that supplementary literature search methods should be used in systematic reviews recurs across documents, but the order in which these methods are used, and the extent to which they are used, varies. We noted inconsistency in the labelling of supplementary search methods between guidance documents.

Rather than focus on the guidance on how to use the methods (which has been summarised in a recent review [ 95 ]), we focus on the aim or purpose of supplementary search methods.

The Cochrane Handbook reported that ‘efforts’ to identify unpublished studies should be made [ 9 ]. Four guidance documents [ 2 , 3 , 6 , 9 ] acknowledged that searching beyond bibliographic databases was necessary since ‘databases are not the only source of literature’ [ 2 ]. Only one document reported any guidance on determining when to use supplementary methods. The IQWiG handbook reported that the use of handsearching (in their example) could be determined on a ‘case-by-case basis’ which implies that the use of these methods is optional rather than mandatory. This is in contrast to the guidance (above) on bibliographic database searching.

The issue for supplementary search methods is similar in many ways to the issue of searching bibliographic databases: demonstrating value. The purpose and contribution of supplementary search methods in systematic reviews is increasingly acknowledged [ 37 , 61 , 62 , 96 , 97 , 98 , 99 , 100 , 101 ] but understanding the value of the search methods to identify studies and data is unclear. In a recently published review, Cooper et al. (2017) reviewed the literature on supplementary search methods looking to determine the advantages, disadvantages and resource implications of using supplementary search methods [ 95 ]. This review also summarises the key guidance and empirical studies and seeks to address the question on when to use these search methods and when not to [ 95 ]. The guidance is limited in this regard and, as Table 2 demonstrates, offers conflicting advice on the order of searching, and the extent to which these search methods should be used in systematic reviews.

Key stage seven: Managing the references

Five of the documents provided guidance on managing references, for example downloading, de-duplicating and managing the output of literature searches [ 2 , 4 , 6 , 8 , 10 ]. This guidance typically itemised available bibliographic management tools rather than offering guidance on how to use them specifically [ 2 , 4 , 6 , 8 ]. The CEE handbook provided guidance on importing data where no direct export option is available (e.g. web-searching) [ 10 ].

The published literature on using bibliographic management tools is small relative to the number of ‘how to’ videos on platforms such as YouTube (see for example [ 102 ]). These videos underline the overall lack of ‘how to’ guidance identified in this study and offer useful instruction on managing references. Bramer et al. set out methods for de-duplicating data and reviewing references in EndNote [ 103 , 104 ], and Gall tests the direct search function within EndNote to access databases such as PubMed, finding a number of limitations [ 105 ]. Coar et al. and Ahmed et al. consider the role of the open-source tool Zotero [ 106 , 107 ]. Managing references is a key administrative function in the review process, particularly for documenting searches as required by PRISMA guidance.
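To illustrate the kind of de-duplication these tools perform, a minimal sketch follows. It is not the EndNote method of Bramer et al.; the field names (`title`, `doi`) are illustrative, and real exports (RIS, EndNote XML) carry many more fields and need more robust matching:

```python
# Minimal sketch: de-duplicating records downloaded from two databases.
# Records are matched on DOI where present, otherwise on a normalised title.

def normalise(title: str) -> str:
    """Lower-case and strip punctuation/whitespace so near-identical titles match."""
    return "".join(ch for ch in title.lower() if ch.isalnum())

def deduplicate(records: list[dict]) -> list[dict]:
    """Keep the first record seen for each DOI (preferred) or normalised title."""
    seen, unique = set(), []
    for rec in records:
        key = rec.get("doi") or normalise(rec.get("title", ""))
        if key and key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

medline = [{"title": "Systematic reviews need systematic searchers.", "doi": "10.x/1"}]
embase = [
    {"title": "Systematic Reviews Need Systematic Searchers", "doi": "10.x/1"},
    {"title": "A second, unrelated record", "doi": "10.x/2"},
]
merged = deduplicate(medline + embase)
print(len(merged))  # 2 unique records from the 3 downloaded
```

Recording the counts before and after this step supplies the ‘duplicates removed’ figure that PRISMA reporting asks for.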

Key stage eight: Documenting the search

The Cochrane Handbook was the only guidance document to recommend a specific reporting guideline: Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) [ 9 ]. Six documents provided guidance on reporting the process of literature searching, with specific criteria to report [ 3 , 4 , 6 , 8 , 9 , 10 ]. There was consensus on reporting: the databases searched (and the host platform through which each was searched), the search strategies used, and any use of limits (e.g. date, language or search filters; the CRD handbook called for these limits to be justified [ 6 ]). Three guidance documents reported that the number of studies identified should be recorded [ 3 , 6 , 10 ]. The number of duplicates identified [ 10 ], the screening decisions [ 3 ], a comprehensive list of grey literature sources searched (with full detail for other supplementary search methods) [ 8 ], and an annotation of search terms tested but not used [ 4 ] were identified as unique items in four documents.
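The reporting items on which the guidance agrees can be captured in a simple structured record. The sketch below is hypothetical; the field names are ours, not prescribed by PRISMA or any of the handbooks:

```python
# Hypothetical structured log of a single database search, covering the items
# on which the guidance documents agree: database and host platform, the full
# strategy, any limits used (with justification), date run, records retrieved.
from dataclasses import dataclass

@dataclass
class SearchRecord:
    database: str          # e.g. MEDLINE
    host: str              # e.g. Ovid
    strategy: list[str]    # the full line-by-line search strategy
    limits: dict           # limits applied (date, language, filters) and why
    date_run: str
    records_retrieved: int

searches = [
    SearchRecord(
        database="MEDLINE",
        host="Ovid",
        strategy=["1. exp Stroke/", "2. rehabilitat*.ti,ab.", "3. 1 and 2"],
        limits={"language": "English", "justification": "no translation resources"},
        date_run="2017-06-01",
        records_retrieved=1248,
    ),
]

# Per-database totals feed the 'records identified' box of a PRISMA flow diagram.
total_identified = sum(s.records_retrieved for s in searches)
print(total_identified)  # 1248
```

Keeping such a log per database makes the search reproducible and makes the PRISMA flow diagram a matter of summing fields rather than reconstruction after the fact.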

The Cochrane Handbook was the only guidance document to note that the full search strategies for each database should be included in an appendix or additional file of the review [ 9 ].

All guidance documents should ultimately deliver completed systematic reviews that fulfil the requirements of the PRISMA reporting guidelines [ 108 ]. The guidance broadly requires the reporting of data that corresponds with the requirements of the PRISMA statement although documents typically ask for diverse and additional items [ 108 ]. In 2008, Sampson et al. observed a lack of consensus on reporting search methods in systematic reviews [ 109 ] and this remains the case as of 2017, as evidenced in the guidance documents, and in spite of the publication of the PRISMA guidelines in 2009 [ 110 ]. It is unclear why the collective guidance does not more explicitly endorse adherence to the PRISMA guidance.

Reporting of literature searching is a key area in systematic reviews since it sets out clearly what was done and underpins the extent to which the conclusions of the review can be believed [ 52 , 109 ]. Despite strong endorsement in the guidance documents, specifically supported in PRISMA guidance and in other related reporting standards (such as ENTREQ for qualitative evidence synthesis and STROBE for reviews of observational studies), authors still highlight the prevalence of poor standards of literature search reporting [ 31 , 110 , 111 , 112 , 113 , 114 , 115 , 116 , 117 , 118 , 119 ]. To explore the issues authors experience in reporting literature searches, and to look at the uptake of PRISMA, Rader et al. [ 120 ] surveyed over 260 review authors to determine common problems; their work summarises the practical aspects of reporting literature searching [ 120 ]. Atkinson et al. [ 121 ] have also analysed reporting standards for literature searching, summarising recommendations and gaps in the reporting of search strategies [ 121 ].

One area that is less well covered by the guidance, but which nevertheless appears in this literature, is the quality appraisal, or peer review, of literature search strategies. The PRESS checklist is the most prominent example; it provides evidence-based guidelines for the peer review of electronic search strategies [ 5 , 122 , 123 ]. A corresponding guideline for the documentation of supplementary search methods does not yet exist, although this idea is currently being explored.
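Part of what PRESS-style peer review catches, such as spelling and line syntax errors, can even be checked mechanically. A toy illustration follows; it is our own sketch and not part of the PRESS checklist, which also covers question translation, Boolean logic, subject headings and limits:

```python
# Toy syntax check on a single search strategy line: flags unbalanced
# parentheses and unmatched phrase quotes, two common transcription errors.

def syntax_warnings(line: str) -> list[str]:
    warnings = []
    if line.count("(") != line.count(")"):
        warnings.append("unbalanced parentheses")
    if line.count('"') % 2 != 0:
        warnings.append("unmatched quotation mark")
    return warnings

print(syntax_warnings('("stroke rehabilitation" OR (stroke AND rehab*)'))
# → ['unbalanced parentheses']
```

A check like this is no substitute for peer review by a second information specialist, which is what PRESS actually recommends.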

How the reporting of the literature searching process corresponds to critical appraisal tools is an area for further research. In the survey undertaken by Rader et al. (2014), 86% of respondents (153/178) identified a need for further guidance on which aspects of the literature search process to report [ 120 ]. The PRISMA statement offers a brief summary of what to report but little practical guidance on how to report it [ 108 ]. Critical appraisal tools for systematic reviews, such as AMSTAR 2 (Shea et al. [ 124 ]) and ROBIS (Whiting et al. [ 125 ]), can usefully be read alongside PRISMA guidance, since they offer greater detail on how the reporting of the literature search will be appraised and, therefore, a proxy on what to report [ 124 , 125 ]. A study comparing PRISMA with quality appraisal checklists for systematic reviews would begin to address the call, identified by Rader et al., for further guidance on what to report [ 120 ].

Limitations

Other handbooks exist.

A potential limitation of this literature review is the focus on guidance produced in Europe (the UK specifically) and Australia. We justify the decision for our selection of the nine guidance documents reviewed in this literature review in section “ Identifying guidance ”. In brief, these nine guidance documents were selected as the most relevant health care guidance that inform UK systematic reviewing practice, given that the UK occupies a prominent position in the science of health information retrieval. We acknowledge the existence of other guidance documents, such as those from North America (e.g. the Agency for Healthcare Research and Quality (AHRQ) [ 126 ], The Institute of Medicine [ 127 ] and the guidance and resources produced by the Canadian Agency for Drugs and Technologies in Health (CADTH) [ 128 ]). We comment further on this directly below.

The handbooks are potentially linked to one another

What is not clear is the extent to which the guidance documents inter-relate or provide guidance independently of one another. The Cochrane Handbook, first published in 1994, is notably a key source of reference in guidance and systematic reviews beyond Cochrane reviews. It is not clear to what extent broadening the sample to include North American handbooks, and guidance handbooks from other relevant countries, would alter the findings of this literature review or develop further support for the process model. Since we cannot be certain, we raise this as a potential limitation of this literature review. However, based on our initial review of a sample of North American, and other, guidance documents (undertaken before selecting the guidance documents considered in this review), we do not consider that the inclusion of these further handbooks would significantly alter the findings of this literature review.

This is a literature review

A further limitation of this review was that the review of published studies is not a systematic review of the evidence for each key stage. It is possible that other relevant studies could help contribute to the exploration and development of the key stages identified in this review.

Conclusions

This literature review would appear to demonstrate the existence of a shared model of the literature searching process in systematic reviews. We call this model ‘the conventional approach’, since it appears to be common convention in nine different guidance documents.

The findings reported above reveal eight key stages in the process of literature searching for systematic reviews. These key stages are consistently reported in the nine guidance documents which suggests consensus on the key stages of literature searching, and therefore the process of literature searching as a whole, in systematic reviews.

In Table 2 , we demonstrate consensus regarding the application of literature search methods. All guidance documents distinguish between primary and supplementary search methods. Bibliographic database searching is consistently the first method of literature searching referenced in each guidance document. Whilst the guidance uniformly supports the use of supplementary search methods, there is little evidence of a consistent process, and guidance diverges across documents. This may reflect differences in the core focus of each document, linked, for instance, to differences between identifying effectiveness studies and identifying qualitative studies.

Eight of the nine guidance documents reported on the aims of literature searching. The shared understanding was that literature searching should be thorough and comprehensive in its aim, and that the process should be reported transparently so that it can be reproduced. Whilst only three documents explicitly link this understanding to minimising bias, it is clear that comprehensive literature searching is implicitly linked to ‘not missing relevant studies’, which amounts to the same point.

Defining the key stages in this review helps categorise the available scholarship and prioritise areas for development or further study. The supporting studies on preparing for literature searching (key stage three, ‘preparation’) were, for example, comparatively few, and yet this key stage represents a decisive moment in literature searching for systematic reviews: it is where the structure of the search strategy is determined, search terms are chosen or discarded, and the resources to be searched are selected. Information specialists, librarians and researchers are well placed to develop these and other areas within the key stages we identify.

This review calls for further research to determine the suitability of using the conventional approach. The publication dates of the guidance documents which underpin the conventional approach may raise questions as to whether the process which they each report remains valid for current systematic literature searching. In addition, it may be useful to test whether it is desirable to use the same process model of literature searching for qualitative evidence synthesis as that for reviews of intervention effectiveness, which this literature review demonstrates is presently recommended best practice.

Abbreviations

BeHEMoTh: Behaviour of interest; Health context; Exclusions; Models or Theories

CDSR: Cochrane Database of Systematic Reviews

CENTRAL: The Cochrane Central Register of Controlled Trials

DARE: Database of Abstracts of Reviews of Effects

ENTREQ: Enhancing transparency in reporting the synthesis of qualitative research

IQWiG: Institute for Quality and Efficiency in Healthcare

NICE: National Institute for Health and Care Excellence

PICO: Population, Intervention, Comparator, Outcome

PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses

SPICE: Setting, Perspective, Intervention, Comparison, Evaluation

SPIDER: Sample, Phenomenon of Interest, Design, Evaluation, Research type

STROBE: STrengthening the Reporting of OBservational studies in Epidemiology

TSC: Trial Search Co-ordinators

Booth A. Unpacking your literature search toolbox: on search styles and tactics. Health Information & Libraries Journal. 2008;25(4):313–7.

Petticrew M, Roberts H. Systematic reviews in the social sciences: a practical guide. Oxford: Blackwell Publishing Ltd; 2006.

Institute for Quality and Efficiency in Health Care (IQWiG). IQWiG methods resources: 7. Information retrieval. 2014. Available from: https://www.ncbi.nlm.nih.gov/books/NBK385787/ .

NICE: National Institute for Health and Care Excellence. Developing NICE guidelines: the manual. 2014. Available from: https://www.nice.org.uk/media/default/about/what-we-do/our-programmes/developing-nice-guidelines-the-manual.pdf .

Sampson M, McGowan J, Lefebvre C, Moher D, Grimshaw J. Peer Review of Electronic Search Strategies: PRESS; 2008.

Centre for Reviews & Dissemination. Systematic reviews – CRD’s guidance for undertaking reviews in healthcare. York: Centre for Reviews and Dissemination, University of York; 2009.

EUnetHTA: European Network for Health Technology Assessment. Process of information retrieval for systematic reviews and health technology assessments on clinical effectiveness. 2016. Available from: http://www.eunethta.eu/sites/default/files/Guideline_Information_Retrieval_V1-1.pdf .

Kugley S, Wade A, Thomas J, Mahood Q, Jørgensen AMK, Hammerstrøm K, Sathe N. Searching for studies: a guide to information retrieval for Campbell systematic reviews. Oslo: Campbell Collaboration; 2017. Available from: https://www.campbellcollaboration.org/library/searching-for-studies-information-retrieval-guide-campbell-reviews.html

Lefebvre C, Manheimer E, Glanville J. Chapter 6: searching for studies. In: Higgins JPT, Green S, editors. Cochrane Handbook for Systematic Reviews of Interventions; 2011.

Collaboration for Environmental Evidence. Guidelines for systematic review and evidence synthesis in environmental management. Environmental Evidence; 2013. Available from: http://www.environmentalevidence.org/wp-content/uploads/2017/01/Review-guidelines-version-4.2-final-update.pdf .

The Joanna Briggs Institute. Joanna Briggs Institute reviewers’ manual: 2014 edition. The Joanna Briggs Institute; 2014. Available from: https://joannabriggs.org/assets/docs/sumari/ReviewersManual-2014.pdf

Beverley CA, Booth A, Bath PA. The role of the information specialist in the systematic review process: a health information case study. Health Inf Libr J. 2003;20(2):65–74.

Harris MR. The librarian's roles in the systematic review process: a case study. Journal of the Medical Library Association. 2005;93(1):81–7.

Egger JB. Use of recommended search strategies in systematic reviews and the impact of librarian involvement: a cross-sectional survey of recent authors. PLoS One. 2015;10(5):e0125931.

Li L, Tian J, Tian H, Moher D, Liang F, Jiang T, et al. Network meta-analyses could be improved by searching more sources and by involving a librarian. J Clin Epidemiol. 2014;67(9):1001–7.

McGowan J, Sampson M. Systematic reviews need systematic searchers. J Med Libr Assoc. 2005;93(1):74–80.

Rethlefsen ML, Farrell AM, Osterhaus Trzasko LC, Brigham TJ. Librarian co-authors correlated with higher quality reported search strategies in general internal medicine systematic reviews. J Clin Epidemiol. 2015;68(6):617–26.

Weller AC. Mounting evidence that librarians are essential for comprehensive literature searches for meta-analyses and Cochrane reports. J Med Libr Assoc. 2004;92(2):163–4.

Swinkels A, Briddon J, Hall J. Two physiotherapists, one librarian and a systematic literature review: collaboration in action. Health Info Libr J. 2006;23(4):248–56.

Foster M. An overview of the role of librarians in systematic reviews: from expert search to project manager. EAHIL. 2015;11(3):3–7.

Lawson L. Operating outside library walls. 2004.

Vassar M, Yerokhin V, Sinnett PM, Weiher M, Muckelrath H, Carr B, et al. Database selection in systematic reviews: an insight through clinical neurology. Health Inf Libr J. 2017;34(2):156–64.

Townsend WA, Anderson PF, Ginier EC, MacEachern MP, Saylor KM, Shipman BL, et al. A competency framework for librarians involved in systematic reviews. Journal of the Medical Library Association : JMLA. 2017;105(3):268–75.

Cooper ID, Crum JA. New activities and changing roles of health sciences librarians: a systematic review, 1990-2012. Journal of the Medical Library Association : JMLA. 2013;101(4):268–77.

Crum JA, Cooper ID. Emerging roles for biomedical librarians: a survey of current practice, challenges, and changes. Journal of the Medical Library Association : JMLA. 2013;101(4):278–86.

Dudden RF, Protzko SL. The systematic review team: contributions of the health sciences librarian. Med Ref Serv Q. 2011;30(3):301–15.

Golder S, Loke Y, McIntosh HM. Poor reporting and inadequate searches were apparent in systematic reviews of adverse effects. J Clin Epidemiol. 2008;61(5):440–8.

Maggio LA, Tannery NH, Kanter SL. Reproducibility of literature search reporting in medical education reviews. Academic medicine : journal of the Association of American Medical Colleges. 2011;86(8):1049–54.

Meert D, Torabi N, Costella J. Impact of librarians on reporting of the literature searching component of pediatric systematic reviews. Journal of the Medical Library Association : JMLA. 2016;104(4):267–77.

Morris M, Boruff JT, Gore GC. Scoping reviews: establishing the role of the librarian. Journal of the Medical Library Association : JMLA. 2016;104(4):346–54.

Koffel JB, Rethlefsen ML. Reproducibility of search strategies is poor in systematic reviews published in high-impact pediatrics, cardiology and surgery journals: a cross-sectional study. PLoS One. 2016;11(9):e0163309.

Fehrmann P, Thomas J. Comprehensive computer searches and reporting in systematic reviews. Research Synthesis Methods. 2011;2(1):15–32.

Booth A. Searching for qualitative research for inclusion in systematic reviews: a structured methodological review. Systematic Reviews. 2016;5(1):74.

Egger M, Juni P, Bartlett C, Holenstein F, Sterne J. How important are comprehensive literature searches and the assessment of trial quality in systematic reviews? Empirical study. Health technology assessment (Winchester, England). 2003;7(1):1–76.

Tricco AC, Tetzlaff J, Sampson M, Fergusson D, Cogo E, Horsley T, et al. Few systematic reviews exist documenting the extent of bias: a systematic review. J Clin Epidemiol. 2008;61(5):422–34.

Booth A. How much searching is enough? Comprehensive versus optimal retrieval for technology assessments. Int J Technol Assess Health Care. 2010;26(4):431–5.

Papaioannou D, Sutton A, Carroll C, Booth A, Wong R. Literature searching for social science systematic reviews: consideration of a range of search techniques. Health Inf Libr J. 2010;27(2):114–22.

Petticrew M. Time to rethink the systematic review catechism? Moving from ‘what works’ to ‘what happens’. Systematic Reviews. 2015;4(1):36.

Betrán AP, Say L, Gülmezoglu AM, Allen T, Hampson L. Effectiveness of different databases in identifying studies for systematic reviews: experience from the WHO systematic review of maternal morbidity and mortality. BMC Med Res Methodol. 2005;5

Felson DT. Bias in meta-analytic research. J Clin Epidemiol. 1992;45(8):885–92.

Franco A, Malhotra N, Simonovits G. Publication bias in the social sciences: unlocking the file drawer. Science. 2014;345(6203):1502–5.

Hartling L, Featherstone R, Nuspl M, Shave K, Dryden DM, Vandermeer B. Grey literature in systematic reviews: a cross-sectional study of the contribution of non-English reports, unpublished studies and dissertations to the results of meta-analyses in child-relevant reviews. BMC Med Res Methodol. 2017;17(1):64.

Schmucker CM, Blümle A, Schell LK, Schwarzer G, Oeller P, Cabrera L, et al. Systematic review finds that study data not published in full text articles have unclear impact on meta-analyses results in medical research. PLoS One. 2017;12(4):e0176210.

Egger M, Zellweger-Zahner T, Schneider M, Junker C, Lengeler C, Antes G. Language bias in randomised controlled trials published in English and German. Lancet (London, England). 1997;350(9074):326–9.

Moher D, Pham B, Lawson ML, Klassen TP. The inclusion of reports of randomised trials published in languages other than English in systematic reviews. Health technology assessment (Winchester, England). 2003;7(41):1–90.

Pham B, Klassen TP, Lawson ML, Moher D. Language of publication restrictions in systematic reviews gave different results depending on whether the intervention was conventional or complementary. J Clin Epidemiol. 2005;58(8):769–76.

Mills EJ, Kanters S, Thorlund K, Chaimani A, Veroniki A-A, Ioannidis JPA. The effects of excluding treatments from network meta-analyses: survey. BMJ : British Medical Journal. 2013;347

Hartling L, Featherstone R, Nuspl M, Shave K, Dryden DM, Vandermeer B. The contribution of databases to the results of systematic reviews: a cross-sectional study. BMC Med Res Methodol. 2016;16(1):127.

van Driel ML, De Sutter A, De Maeseneer J, Christiaens T. Searching for unpublished trials in Cochrane reviews may not be worth the effort. J Clin Epidemiol. 2009;62(8):838–44.e3.

Buchberger B, Krabbe L, Lux B, Mattivi JT. Evidence mapping for decision making: feasibility versus accuracy - when to abandon high sensitivity in electronic searches. German medical science : GMS e-journal. 2016;14:Doc09.

Lorenc T, Pearson M, Jamal F, Cooper C, Garside R. The role of systematic reviews of qualitative evidence in evaluating interventions: a case study. Research Synthesis Methods. 2012;3(1):1–10.

Gough D. Weight of evidence: a framework for the appraisal of the quality and relevance of evidence. Res Pap Educ. 2007;22(2):213–28.

Barroso J, Gollop CJ, Sandelowski M, Meynell J, Pearce PF, Collins LJ. The challenges of searching for and retrieving qualitative studies. West J Nurs Res. 2003;25(2):153–78.

Britten N, Garside R, Pope C, Frost J, Cooper C. Asking more of qualitative synthesis: a response to Sally Thorne. Qual Health Res. 2017;27(9):1370–6.

Booth A, Carroll C. Systematic searching for theory to inform systematic reviews: is it feasible? Is it desirable? Health Info Libr J. 2015;32(3):220–35.

Kwon Y, Powelson SE, Wong H, Ghali WA, Conly JM. An assessment of the efficacy of searching in biomedical databases beyond MEDLINE in identifying studies for a systematic review on ward closures as an infection control intervention to control outbreaks. Syst Rev. 2014;3:135.

Nussbaumer-Streit B, Klerings I, Wagner G, Titscher V, Gartlehner G. Assessing the validity of abbreviated literature searches for rapid reviews: protocol of a non-inferiority and meta-epidemiologic study. Systematic Reviews. 2016;5:197.

Wagner G, Nussbaumer-Streit B, Greimel J, Ciapponi A, Gartlehner G. Trading certainty for speed - how much uncertainty are decisionmakers and guideline developers willing to accept when using rapid reviews: an international survey. BMC Med Res Methodol. 2017;17(1):121.

Ogilvie D, Hamilton V, Egan M, Petticrew M. Systematic reviews of health effects of social interventions: 1. Finding the evidence: how far should you go? J Epidemiol Community Health. 2005;59(9):804–8.

Royle P, Milne R. Literature searching for randomized controlled trials used in Cochrane reviews: rapid versus exhaustive searches. Int J Technol Assess Health Care. 2003;19(4):591–603.

Pearson M, Moxham T, Ashton K. Effectiveness of search strategies for qualitative research about barriers and facilitators of program delivery. Eval Health Prof. 2011;34(3):297–308.

Levay P, Raynor M, Tuvey D. The contributions of MEDLINE, other bibliographic databases and various search techniques to NICE public health guidance. Evid Based Libr Inf Pract. 2015;10(1):19.

Nussbaumer-Streit B, Klerings I, Wagner G, Heise TL, Dobrescu AI, Armijo-Olivo S, et al. Abbreviated literature searches were viable alternatives to comprehensive searches: a meta-epidemiological study. J Clin Epidemiol. 2018;102:1–11.

Briscoe S, Cooper C, Glanville J, Lefebvre C. The loss of the NHS EED and DARE databases and the effect on evidence synthesis and evaluation. Res Synth Methods. 2017;8(3):256–7.

Stansfield C, O'Mara-Eves A, Thomas J. Text mining for search term development in systematic reviewing: a discussion of some methods and challenges. Research Synthesis Methods (early view).

Petrova M, Sutcliffe P, Fulford KW, Dale J. Search terms and a validated brief search filter to retrieve publications on health-related values in Medline: a word frequency analysis study. Journal of the American Medical Informatics Association : JAMIA. 2012;19(3):479–88.

Stansfield C, Thomas J, Kavanagh J. 'Clustering' documents automatically to support scoping reviews of research: a case study. Res Synth Methods. 2013;4(3):230–41.

Methley AM, Campbell S, Chew-Graham C, McNally R, Cheraghi-Sohi S. PICO, PICOS and SPIDER: a comparison study of specificity and sensitivity in three search tools for qualitative systematic reviews. BMC Health Serv Res. 2014;14:579.

Booth A. Clear and present questions: formulating questions for evidence based practice. Library Hi Tech. 2006;24(3):355–68.

Cooke A, Smith D, Booth A. Beyond PICO: the SPIDER tool for qualitative evidence synthesis. Qual Health Res. 2012;22(10):1435–43.

Whiting P, Westwood M, Bojke L, Palmer S, Richardson G, Cooper J, et al. Clinical effectiveness and cost-effectiveness of tests for the diagnosis and investigation of urinary tract infection in children: a systematic review and economic model. Health technology assessment (Winchester, England). 2006;10(36):iii-iv, xi-xiii, 1–154.

Cooper C, Levay P, Lorenc T, Craig GM. A population search filter for hard-to-reach populations increased search efficiency for a systematic review. J Clin Epidemiol. 2014;67(5):554–9.

Hausner E, Waffenschmidt S, Kaiser T, Simon M. Routine development of objectively derived search strategies. Systematic Reviews. 2012;1(1):19.

Hausner E, Guddat C, Hermanns T, Lampert U, Waffenschmidt S. Prospective comparison of search strategies for systematic reviews: an objective approach yielded higher sensitivity than a conceptual one. J Clin Epidemiol. 2016;77:118–24.

Craven J, Levay P. Recording database searches for systematic reviews - what is the value of adding a narrative to peer-review checklists? A case study of nice interventional procedures guidance. Evid Based Libr Inf Pract. 2011;6(4):72–87.

Wright K, Golder S, Lewis-Light K. What value is the CINAHL database when searching for systematic reviews of qualitative studies? Syst Rev. 2015;4:104.

Beckles Z, Glover S, Ashe J, Stockton S, Boynton J, Lai R, et al. Searching CINAHL did not add value to clinical questions posed in NICE guidelines. J Clin Epidemiol. 2013;66(9):1051–7.

Cooper C, Rogers M, Bethel A, Briscoe S, Lowe J. A mapping review of the literature on UK-focused health and social care databases. Health Inf Libr J. 2015;32(1):5–22.

Younger P, Boddy K. When is a search not a search? A comparison of searching the AMED complementary health database via EBSCOhost, OVID and DIALOG. Health Inf Libr J. 2009;26(2):126–35.

Lam MT, McDiarmid M. Increasing number of databases searched in systematic reviews and meta-analyses between 1994 and 2014. Journal of the Medical Library Association : JMLA. 2016;104(4):284–9.

Bethel A. Search summary tables for systematic reviews: results and findings. Paper presented at the HLC Conference; 2017.

Aagaard T, Lund H, Juhl C. Optimizing literature search in systematic reviews - are MEDLINE, EMBASE and CENTRAL enough for identifying effect studies within the area of musculoskeletal disorders? BMC Med Res Methodol. 2016;16(1):161.

Adams CE, Frederick K. An investigation of the adequacy of MEDLINE searches for randomized controlled trials (RCTs) of the effects of mental health care. Psychol Med. 1994;24(3):741–8.

Kelly L, St Pierre-Hansen N. So many databases, such little clarity: searching the literature for the topic aboriginal. Canadian family physician Medecin de famille canadien. 2008;54(11):1572–3.

Lawrence DW. What is lost when searching only one literature database for articles relevant to injury prevention and safety promotion? Injury Prevention. 2008;14(6):401–4.

Lemeshow AR, Blum RE, Berlin JA, Stoto MA, Colditz GA. Searching one or two databases was insufficient for meta-analysis of observational studies. J Clin Epidemiol. 2005;58(9):867–73.

Sampson M, Barrowman NJ, Moher D, Klassen TP, Pham B, Platt R, et al. Should meta-analysts search Embase in addition to Medline? J Clin Epidemiol. 2003;56(10):943–55.

Stevinson C, Lawlor DA. Searching multiple databases for systematic reviews: added value or diminishing returns? Complementary Therapies in Medicine. 2004;12(4):228–32.

Suarez-Almazor ME, Belseck E, Homik J, Dorgan M, Ramos-Remus C. Identifying clinical trials in the medical literature with electronic databases: MEDLINE alone is not enough. Control Clin Trials. 2000;21(5):476–87.

Taylor B, Wylie E, Dempster M, Donnelly M. Systematically retrieving research: a case study evaluating seven databases. Res Soc Work Pract. 2007;17(6):697–706.

Beyer FR, Wright K. Can we prioritise which databases to search? A case study using a systematic review of frozen shoulder management. Health Info Libr J. 2013;30(1):49–58.

Duffy S, de Kock S, Misso K, Noake C, Ross J, Stirk L. Supplementary searches of PubMed to improve currency of MEDLINE and MEDLINE in-process searches via Ovid. Journal of the Medical Library Association : JMLA. 2016;104(4):309–12.

Katchamart W, Faulkner A, Feldman B, Tomlinson G, Bombardier C. PubMed had a higher sensitivity than Ovid-MEDLINE in the search for systematic reviews. J Clin Epidemiol. 2011;64(7):805–7.

Cooper C, Lovell R, Husk K, Booth A, Garside R. Supplementary search methods were more effective and offered better value than bibliographic database searching: a case study from public health and environmental enhancement. Research Synthesis Methods. 2017 (in press).

Cooper C, Booth A, Britten N, Garside R. A comparison of results of empirical studies of supplementary search techniques and recommendations in review methodology handbooks: a methodological review. Systematic Reviews. 2017 (in press).

Greenhalgh T, Peacock R. Effectiveness and efficiency of search methods in systematic reviews of complex evidence: audit of primary sources. BMJ (Clinical research ed). 2005;331(7524):1064–5.

Hinde S, Spackman E. Bidirectional citation searching to completion: an exploration of literature searching methods. PharmacoEconomics. 2015;33(1):5–11.

Levay P, Ainsworth N, Kettle R, Morgan A. Identifying evidence for public health guidance: a comparison of citation searching with web of science and Google scholar. Res Synth Methods. 2016;7(1):34–45.

McManus RJ, Wilson S, Delaney BC, Fitzmaurice DA, Hyde CJ, Tobias RS, et al. Review of the usefulness of contacting other experts when conducting a literature search for systematic reviews. BMJ (Clinical research ed). 1998;317(7172):1562–3.

Westphal A, Kriston L, Holzel LP, Harter M, von Wolff A. Efficiency and contribution of strategies for finding randomized controlled trials: a case study from a systematic review on therapeutic interventions of chronic depression. Journal of public health research. 2014;3(2):177.

Matthews EJ, Edwards AG, Barker J, Bloor M, Covey J, Hood K, et al. Efficient literature searching in diffuse topics: lessons from a systematic review of research on communicating risk to patients in primary care. Health Libr Rev. 1999;16(2):112–20.

Bethel A. EndNote training (YouTube videos). 2017. Available from: http://medicine.exeter.ac.uk/esmi/workstreams/informationscience/is_resources,_guidance_&_advice/ .

Bramer WM, Giustini D, de Jonge GB, Holland L, Bekhuis T. De-duplication of database search results for systematic reviews in EndNote. Journal of the Medical Library Association : JMLA. 2016;104(3):240–3.

Bramer WM, Milic J, Mast F. Reviewing retrieved references for inclusion in systematic reviews using EndNote. Journal of the Medical Library Association : JMLA. 2017;105(1):84–7.

Gall C, Brahmi FA. Retrieval comparison of EndNote to search MEDLINE (Ovid and PubMed) versus searching them directly. Medical reference services quarterly. 2004;23(3):25–32.

Ahmed KK, Al Dhubaib BE. Zotero: a bibliographic assistant to researcher. J Pharmacol Pharmacother. 2011;2(4):303–5.

Coar JT, Sewell JP. Zotero: harnessing the power of a personal bibliographic manager. Nurse Educ. 2010;35(5):205–7.

Moher D, Liberati A, Tetzlaff J, Altman DG, The PG. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med. 2009;6(7):e1000097.

Sampson M, McGowan J, Tetzlaff J, Cogo E, Moher D. No consensus exists on search reporting methods for systematic reviews. J Clin Epidemiol. 2008;61(8):748–54.

Toews LC. Compliance of systematic reviews in veterinary journals with preferred reporting items for systematic reviews and meta-analysis (PRISMA) literature search reporting guidelines. Journal of the Medical Library Association : JMLA. 2017;105(3):233–9.

Booth A. "brimful of STARLITE": toward standards for reporting literature searches. Journal of the Medical Library Association : JMLA. 2006;94(4):421–9. e205

Faggion CM Jr, Wu YC, Tu YK, Wasiak J. Quality of search strategies reported in systematic reviews published in stereotactic radiosurgery. Br J Radiol. 2016;89(1062):20150878.

Mullins MM, DeLuca JB, Crepaz N, Lyles CM. Reporting quality of search methods in systematic reviews of HIV behavioral interventions (2000–2010): are the searches clearly explained, systematic and reproducible? Research Synthesis Methods. 2014;5(2):116–30.

Yoshii A, Plaut DA, McGraw KA, Anderson MJ, Wellik KE. Analysis of the reporting of search strategies in Cochrane systematic reviews. Journal of the Medical Library Association : JMLA. 2009;97(1):21–9.

Bigna JJ, Um LN, Nansseu JR. A comparison of quality of abstracts of systematic reviews including meta-analysis of randomized controlled trials in high-impact general medicine journals before and after the publication of PRISMA extension for abstracts: a systematic review and meta-analysis. Syst Rev. 2016;5(1):174.

Akhigbe T, Zolnourian A, Bulters D. Compliance of systematic reviews articles in brain arteriovenous malformation with PRISMA statement guidelines: review of literature. Journal of clinical neuroscience : official journal of the Neurosurgical Society of Australasia. 2017;39:45–8.

Tao KM, Li XQ, Zhou QH, Moher D, Ling CQ, Yu WF. From QUOROM to PRISMA: a survey of high-impact medical journals' instructions to authors and a review of systematic reviews in anesthesia literature. PLoS One. 2011;6(11):e27611.

Wasiak J, Tyack Z, Ware R. Goodwin N. Jr. Poor methodological quality and reporting standards of systematic reviews in burn care management. International wound journal: Faggion CM; 2016.

Tam WW, Lo KK, Khalechelvam P. Endorsement of PRISMA statement and quality of systematic reviews and meta-analyses published in nursing journals: a cross-sectional study. BMJ Open. 2017;7(2):e013905.

Rader T, Mann M, Stansfield C, Cooper C, Sampson M. Methods for documenting systematic review searches: a discussion of common issues. Res Synth Methods. 2014;5(2):98–115.

Atkinson KM, Koenka AC, Sanchez CE, Moshontz H, Cooper H. Reporting standards for literature searches and report inclusion criteria: making research syntheses more transparent and easy to replicate. Res Synth Methods. 2015;6(1):87–95.

McGowan J, Sampson M, Salzwedel DM, Cogo E, Foerster V, Lefebvre C. PRESS peer review of electronic search strategies: 2015 guideline statement. J Clin Epidemiol. 2016;75:40–6.

Sampson M, McGowan J, Cogo E, Grimshaw J, Moher D, Lefebvre C. An evidence-based practice guideline for the peer review of electronic search strategies. J Clin Epidemiol. 2009;62(9):944–52.

Shea BJ, Reeves BC, Wells G, Thuku M, Hamel C, Moran J, et al. AMSTAR 2: a critical appraisal tool for systematic reviews that include randomised or non-randomised studies of healthcare interventions, or both. BMJ (Clinical research ed). 2017;358.

Whiting P, Savović J, Higgins JPT, Caldwell DM, Reeves BC, Shea B, et al. ROBIS: a new tool to assess risk of bias in systematic reviews was developed. J Clin Epidemiol. 2016;69:225–34.

Relevo R, Balshem H. Finding evidence for comparing medical interventions: AHRQ and the effective health care program. J Clin Epidemiol. 2011;64(11):1168–77.

Medicine Io. Standards for Systematic Reviews 2011 [Available from: http://www.nationalacademies.org/hmd/Reports/2011/Finding-What-Works-in-Health-Care-Standards-for-Systematic-Reviews/Standards.aspx .

CADTH: Resources 2018.

Download references

Acknowledgements

CC acknowledges the supervision offered by Professor Chris Hyde.

This publication forms a part of CC’s PhD. CC’s PhD was funded through the National Institute for Health Research (NIHR) Health Technology Assessment (HTA) Programme (Project Number 16/54/11). The open access fee for this publication was paid for by Exeter Medical School.

RG and NB were partially supported by the National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care South West Peninsula.

The views expressed are those of the author(s) and not necessarily those of the NHS, the NIHR or the Department of Health.

Author information

Authors and affiliations.

Institute of Health Research, University of Exeter Medical School, Exeter, UK

Chris Cooper & Jo Varley-Campbell

HEDS, School of Health and Related Research (ScHARR), University of Sheffield, Sheffield, UK

Andrew Booth

Nicky Britten

European Centre for Environment and Human Health, University of Exeter Medical School, Truro, UK

Ruth Garside

Contributions

CC conceived the idea for this study and wrote the first draft of the manuscript. CC discussed this publication in PhD supervision with AB and separately with JVC. CC revised the publication with input and comments from AB, JVC, RG and NB. All authors revised the manuscript prior to submission. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Chris Cooper.

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Additional file

Additional file 1:

Appendix tables and PubMed search strategy. Key studies used for pearl growing per key stage, working data extraction tables and the PubMed search strategy. (DOCX 30 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License ( http://creativecommons.org/licenses/by/4.0/ ), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated.

About this article

Cite this article.

Cooper, C., Booth, A., Varley-Campbell, J. et al. Defining the process to literature searching in systematic reviews: a literature review of guidance and supporting studies. BMC Med Res Methodol 18 , 85 (2018). https://doi.org/10.1186/s12874-018-0545-3

Received : 20 September 2017

Accepted : 06 August 2018

Published : 14 August 2018

DOI : https://doi.org/10.1186/s12874-018-0545-3


  • Literature Search Process
  • Citation Chasing
  • Tacit Models
  • Unique Guidance
  • Information Specialists

BMC Medical Research Methodology

ISSN: 1471-2288

A Guide to Evidence Synthesis: 4. Write a Search Strategy

  • Meet Our Team
  • Our Published Reviews and Protocols
  • What is Evidence Synthesis?
  • Types of Evidence Synthesis
  • Evidence Synthesis Across Disciplines
  • Finding and Appraising Existing Systematic Reviews
  • 0. Develop a Protocol
  • 1. Draft your Research Question
  • 2. Select Databases
  • 3. Select Grey Literature Sources
  • 4. Write a Search Strategy
  • 5. Register a Protocol
  • 6. Translate Search Strategies
  • 7. Citation Management
  • 8. Article Screening
  • 9. Risk of Bias Assessment
  • 10. Data Extraction
  • 11. Synthesize, Map, or Describe the Results
  • Evidence Synthesis Institute for Librarians
  • Open Access Evidence Synthesis Resources

Video: Databases and search strategies (3:40 minutes)

Writing a Search Strategy

It is recommended that you work with a librarian to design comprehensive search strategies across a variety of databases. Writing a successful search strategy requires detailed knowledge of bibliographic databases.

Using Boolean logic is an important component of writing a search strategy: 

  • "AND" narrows the search, e.g.  children AND exercise
  • "OR" broadens the search, e.g.  (children OR adolescents) AND (exercise OR diet) 
  • "NOT" excludes terms, e.g.  exercise NOT diet 
  • "*" at the end of a word stem finds all forms of that word, e.g. (child* OR adolescen*) AND (exercise* OR diet*)
  • parentheses ensure all terms will be searched together as a set 
  • quotations around a phrase searches that exact phrase, e.g.  (child* OR adolescen* OR "young adult*") 

3 Venn diagrams displaying the differences between the Boolean operators AND, OR, and NOT. Using AND narrows a search by requiring that both terms (puppy and kitten) be included in the results. Using OR broadens a search by requiring either term (puppy or kitten) be included in the results. Using NOT excludes just one term (kitten) so that included results only mention puppy and any results that mention kitten are excluded.
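The set behaviour illustrated by the Venn diagrams can be sketched in a few lines of Python. The mini-corpus and the crude substring matching here are invented purely for illustration; real databases match indexed terms, not raw substrings.

```python
# Hypothetical mini-corpus: record ID -> title text (invented for illustration).
records = {
    1: "effects of exercise in children",
    2: "diet and exercise in adolescents",
    3: "diet quality in children",
    4: "exercise physiology in adults",
}

def matches(term):
    """IDs of records whose text contains the term (a crude keyword search)."""
    return {rid for rid, text in records.items() if term in text}

# AND narrows: both terms must appear (set intersection).
children_and_exercise = matches("children") & matches("exercise")
# OR broadens: either term may appear (set union).
children_or_adolescents = matches("children") | matches("adolescents")
# NOT excludes: drop records mentioning the unwanted term (set difference).
exercise_not_diet = matches("exercise") - matches("diet")

print(sorted(children_and_exercise))    # [1]
print(sorted(children_or_adolescents))  # [1, 2, 3]
print(sorted(exercise_not_diet))        # [1, 4]
```

Note how the NOT example silently discards record 2, which mentions exercise as well as diet; this is why guides warn that NOT can exclude relevant references.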

Evidence Synthesis Search Strategy Examples

Agriculture Example: 

  • Research question:  What are the strategies that farmer organizations use, and what impacts do those strategies have on small-scale producers in Sub-Saharan Africa and India?
  • Key concepts from the question combined with AND:  (farmer organizations) AND (Sub-Saharan Africa OR India) 
  • Protocol and search strategies for this question in CAB Abstracts, Scopus, EconLit, and grey literature
  • Published scoping review for this question

Nutrition Example: 

  • Research question:  What are the health benefits and safety of folic acid fortification of wheat and maize flour (i.e. alone or in combination with other micronutrients) on folate status and health outcomes in the overall population, compared to wheat or maize flour without folic acid (or no intervention)? 
  • Key concepts from the question combined with AND:  (folic acid) AND (fortification) 
  • Protocol on PROSPERO
  • Published systematic review for this question with search strategies used in 14 databases

Search Strategy Template and Filters

  • Human Studies Filter
  • Randomized Controlled Trial Filters
  • Other Methodology Search Filters

If you want to exclude animal studies from your search results, you may add a "human studies filter" to the end of your search strategy. This approach works best with databases that use Medical Subject Headings (MeSH) or other controlled vocabulary.

See Appendix 2 at the end of this published search strategy for an example of a human studies filter in a MEDLINE(Ovid) search strategy.

Line 13 retrieves all animal studies; line 14 then combines the full search results from line 12 with NOT line 13 (#12 NOT #13), removing the animal studies from the final set.

Add the following lines to the end of your search strategy to filter for randomized controlled trials. These are "validated search filters", meaning they have been tested for sensitivity and specificity and the results of those tests have been published as a scientific article. The ISSG Search Filters Resource provides validated search filters for many other study design types.

Highly Sensitive MEDLINE (via PubMed) Filter from Cochrane  

(randomized controlled trial [pt] OR controlled clinical trial [pt] OR randomized [tiab] OR placebo [tiab] OR drug therapy [sh] OR randomly [tiab] OR trial [tiab] OR groups [tiab])

Highly Sensitive MEDLINE (OVID) Filter from Cochrane 

((randomized controlled trial.pt. or controlled clinical trial.pt. or randomized.ab. or placebo.ab. or drug therapy.fs. or randomly.ab. or trial.ab. or groups.ab.) not (exp animals/ not humans.sh.)) ​

CINAHL Filter from Cochrane 

TX allocat* random* OR (MH "Quantitative Studies") OR (MH "Placebos") OR TX placebo* OR TX random* allocat* OR (MH "Random Assignment") OR TX randomi* control* trial* OR TX ( (singl* n1 blind*) OR (singl* n1 mask*) ) OR TX ( (doubl* n1 blind*) OR (doubl* n1 mask*) ) OR TX ( (tripl* n1 blind*) OR (tripl* n1 mask*) ) OR TX ( (trebl* n1 blind*) OR (trebl* n1 mask*) ) OR TX clinic* n1 trial* OR PT Clinical trial OR (MH "Clinical Trials+")

PsycINFO Filter from ProQuest:

SU.EXACT("Treatment Effectiveness Evaluation") OR SU.EXACT.EXPLODE("Treatment Outcomes") OR SU.EXACT("Placebo") OR SU.EXACT("Followup Studies") OR placebo* OR random* OR "comparative stud*" OR  clinical NEAR/3 trial* OR research NEAR/3 design OR evaluat* NEAR/3 stud* OR prospectiv* NEAR/3 stud* OR (singl* OR doubl* OR trebl* OR tripl*) NEAR/3 (blind* OR mask*)

Web Of Science (WoS) Filter from University of Alberta - Not Validated

TS= clinical trial* OR TS=research design OR TS=comparative stud* OR TS=evaluation stud* OR TS=controlled trial* OR TS=follow-up stud* OR TS=prospective stud* OR TS=random* OR TS=placebo* OR TS=(single blind*) OR TS=(double blind*)

Scopus Filter from Children's Mercy Kansas City

 Copy/paste into 'advanced search':

TITLE-ABS-KEY((clinic* w/1 trial*) OR (randomi* w/1 control*) OR (randomi* w/2 trial*) OR (random* w/1 assign*) OR (random* w/1 allocat*) OR (control* w/1 clinic*) OR (control* w/1 trial) OR placebo* OR (Quantitat* w/1 Stud*) OR (control* w/1 stud*) OR (randomi* w/1 stud*) OR (singl* w/1 blind*) or (singl* w/1 mask*) OR (doubl* w/1 blind*) OR (doubl* w/1 mask*) OR (tripl* w/1 blind*) OR (tripl* w/1 mask*) OR (trebl* w/1 blind*) OR (trebl* w/1 mask*)) AND NOT (SRCTYPE(b) OR SRCTYPE(k) OR SRCTYPE(p) OR SRCTYPE(r) OR SRCTYPE(d) OR DOCTYPE(ab) OR DOCTYPE(bk) OR DOCTYPE(ch) OR DOCTYPE(bz) OR DOCTYPE(cr) OR DOCTYPE(ed) OR DOCTYPE(er) OR DOCTYPE(le) OR DOCTYPE(no) OR DOCTYPE(pr) OR DOCTYPE(rp) OR DOCTYPE(re) OR DOCTYPE(sh))

Sources and more information:

  • Cochrane Handbook for Systematic Reviews of Interventions
  • Cochrane RCT Filters for Different Databases
  • Methodology Search Filters by Study Design Filters for RCTs, CCTs, Non-randomized/observational designs, and tests of diagnostic accuracy. Source: Countway Library of Medicine. (2019). Systematic Reviews and Meta Analysis: Methodology Filters.
  • American University of Beirut University Libraries Search Filters Filters for RCTs, GUIDELINEs, systematic reviews, qualitative studies, etc. Source: American University of Beirut University Libraries. (2019). Systematic Reviews: Search Filters / Hedges.

Pre-generated queries in Scopus for the UN Sustainable Development Goals

Pre-written SDG search strategies available in Scopus 

Scopus, a database of multidisciplinary research, provides pre-written search strategies to capture articles on topics about each of the 17 United Nations Sustainable Development Goals  (SDGs). To access these SDG search strategies in Scopus: 

  • Click on "Advanced Document Search" 
  • At the bottom of the right-hand column, click on "pre-generated queries." When you click on one of the 17 SDGs, a search strategy for that SDG will populate in the search field in Scopus syntax.  

More about the Sustainable Development Goals: 

" The 2030 Agenda for Sustainable Development,  adopted by all United Nations Member States in 2015, provides a shared blueprint for peace and prosperity for people and the planet, now and into the future. At its heart are the 17 Sustainable Development Goals (SDGs), which are an urgent call for action by all countries - developed and developing - in a global partnership. They recognize that ending poverty and other deprivations must go hand-in-hand with strategies that improve health and education, reduce inequality, and spur economic growth – all while tackling climate change and working to preserve our oceans and forests."

Source:  https://sdgs.un.org/goals 

  • << Previous: 3. Select Grey Literature Sources
  • Next: 5. Register a Protocol >>
  • Last Updated: Dec 19, 2023 9:46 AM
  • URL: https://guides.library.cornell.edu/evidence-synthesis

University of Leeds logo

  • Study and research support
  • Literature searching

Literature searching explained

Develop a search strategy.

A search strategy is an organised structure of key terms used to search a database. The search strategy combines the key concepts of your search question in order to retrieve accurate results.

Your search strategy will account for all:

  • possible search terms
  • keywords and phrases
  • truncated and wildcard variations of search terms
  • subject headings (where applicable)

Each database works differently so you need to adapt your search strategy for each database. You may wish to develop a number of separate search strategies if your research covers several different areas.

It is a good idea to test your strategies and refine them after you have reviewed the search results.

How a search strategy looks in practice

Take a look at this example literature search in PsycINFO (PDF) about self-esteem.

The example shows the subject heading and keyword searches that have been carried out for each concept within our research question and how they have been combined using Boolean operators. It also shows where keyword techniques like truncation, wildcards and adjacency searching have been used.

Search strategy techniques

The next sections show some techniques you can use to develop your search strategy.

Skip straight to:

  • Choosing search terms
  • Searching with keywords
  • Searching for exact phrases
  • Using truncated and wildcard searches
  • Searching with subject headings
  • Using Boolean logic
  • Citation searching

Choose search terms.

Concepts can be expressed in different ways eg “self-esteem” might be referred to as “self-worth”. Your aim is to consider each of your concepts and come up with a list of the different ways they could be expressed.

To find alternative keywords or phrases for your concepts try the following:

  • Use a thesaurus to identify synonyms.
  • Search for your concepts on a search engine like Google Scholar, scanning the results for alternative words and phrases.
  • Examine relevant abstracts or articles for alternative words, phrases and subject headings (if the database uses subject headings).

When you've done this, you should have lists of words and phrases for each concept as in this completed PICO model (PDF) or this example concept map (PDF).

As you search and scan articles and abstracts, you may discover different key terms to enhance your search strategy.

Using truncation and wildcards can save you time and effort by finding alternative keywords.

Search with keywords

Keywords are free text words and phrases. Database search strategies use a combination of free text and subject headings (where applicable).

A keyword search usually looks for your search terms in the title and abstract of a reference. You may wish to search in title fields only if you want a small number of specific results.

Some databases will find the exact word or phrase, so make sure your spelling is accurate or you will miss references.

Search for the exact phrase

If you want words to appear next to each other in an exact phrase, use quotation marks, eg “self-esteem”.

Phrase searching decreases the number of results you get and makes your results more relevant. Most databases allow you to search for phrases, but check the database guide if you are unsure.

Truncation and wildcard searches

You can use truncated and wildcard searches to find variations of your search term. Truncation is useful for finding singular and plural forms of words and variant endings.

Many databases use an asterisk (*) as their truncation symbol. Check the database help section if you are not sure which symbol to use. For example, “therap*” will find therapy, therapies, therapist or therapists. A wildcard finds variant spellings of words. Use it to search for a single character, or no character.

Check the database help section to see which symbol to use as a wildcard.

Wildcards are useful for finding British and American spellings, for example: “behavio?r” in Medline will find both behaviour and behavior.

There are sometimes different symbols to find a variable single character. For example, in the Medline database, “wom#n” will find woman and also women.
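As a rough illustration (not database code), the truncation and wildcard conventions above can be mimicked with regular expressions. The symbol meanings assumed here follow the Medline examples: * for truncation, ? for one optional character, # for exactly one character; always confirm the conventions in your own database's help pages.

```python
import re

def ovid_pattern_to_regex(pattern):
    """Translate a simple Medline-style pattern into a compiled regex.

    Assumed conventions (check your database's help section):
      *  truncation: the stem plus any further letters, or none
      ?  wildcard: one optional character (behavio?r -> behaviour / behavior)
      #  wildcard: exactly one character (wom#n -> woman / women)
    """
    out = []
    for ch in pattern:
        if ch == "*":
            out.append("[a-z]*")
        elif ch == "?":
            out.append("[a-z]?")
        elif ch == "#":
            out.append("[a-z]")
        else:
            out.append(re.escape(ch))
    return re.compile("^" + "".join(out) + "$", re.IGNORECASE)

print(bool(ovid_pattern_to_regex("behavio?r").match("behaviour")))   # True
print(bool(ovid_pattern_to_regex("behavio?r").match("behavior")))    # True
print(bool(ovid_pattern_to_regex("wom#n").match("women")))           # True
print(bool(ovid_pattern_to_regex("therap*").match("therapists")))    # True
```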

Use adjacency searching for more accurate results

You can specify how close two words appear together in your search strategy. This can make your results more relevant; generally the closer two words appear to each other, the closer the relationship is between them.

Commands for adjacency searching differ among databases, so make sure you consult database guides.

In OvidSP databases (like Medline), searching for “physician ADJ3 relationship” will find both physician and relationship within two major words of each other, in any order. This finds more papers than "physician relationship".

Using this adjacency retrieves papers with phrases like "physician patient relationship", "patient physician relationship", "relationship of the physician to the patient" and so on.
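A toy sketch of what an adjacency operator does under the hood: count the words separating two terms in either order. Real databases also handle stop words, stemming and indexing, so this is only an approximation of ADJ3-style matching.

```python
def within_n_words(text, word_a, word_b, n):
    """True if word_a and word_b occur within n word positions of each other,
    in any order (a crude approximation of Ovid's ADJ(n) operator)."""
    tokens = [t.strip(".,").lower() for t in text.split()]
    positions_a = [i for i, t in enumerate(tokens) if t == word_a]
    positions_b = [i for i, t in enumerate(tokens) if t == word_b]
    return any(abs(i - j) <= n for i in positions_a for j in positions_b)

# Both orderings match, as with physician ADJ3 relationship:
print(within_n_words("the physician patient relationship",
                     "physician", "relationship", 3))                  # True
print(within_n_words("relationship of the physician to the patient",
                     "physician", "relationship", 3))                  # True
```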

Search with subject headings

Database subject headings are controlled vocabulary terms that a database uses to describe what an article is about.

Watch our 3-minute introduction to subject headings video. You can also view the video using Microsoft Stream (available for University members only).

Using appropriate subject headings enhances your search and will help you to find more results on your topic. This is because subject headings find articles according to their subject, even if the article does not use your chosen key words.

You should combine both subject headings and keywords in your search strategy for each of the concepts you identify. This is particularly important if you are undertaking a systematic review or an in-depth piece of work.

Subject headings may vary between databases, so you need to investigate each database separately to find the subject headings they use. For example, for Medline you can use MeSH (Medical Subject Headings) and for Embase you can use the EMTREE thesaurus.

SEARCH TIP: In Ovid databases, search for a known key paper by title, select the "complete reference" button to see which subject headings the database indexers have given that article, and consider adding relevant ones to your own search strategy.

Use Boolean logic to combine search terms

Boolean operators (AND, OR and NOT) allow you to try different combinations of search terms or subject headings.

Databases often show Boolean operators as buttons or drop-down menus that you can click to combine your search terms or results.

The main Boolean operators are:

OR is used to find articles that mention either of the topics you search for.

AND is used to find articles that mention both of the searched topics.

NOT excludes a search term or concept. It should be used with caution as you may inadvertently exclude relevant references.

For example, searching for “self-esteem NOT eating disorders” finds articles that mention self-esteem but removes any articles that mention eating disorders.

Citation searching is a method of finding articles that have cited a known publication.

Use citation searching (or cited reference searching) to:

  • find out whether articles have been cited by other authors
  • find more recent papers on the same or similar subject
  • discover how a known idea or innovation has been confirmed, applied, improved, extended, or corrected
  • help make your literature review more comprehensive.

You can use cited reference searching in:

  • OvidSP databases
  • Google Scholar
  • Web of Science

Cited reference searching can complement your literature search. However be careful not to just look at papers that have been cited in isolation. A robust literature search is also needed to limit publication bias.

Best Practice for Literature Searching

  • Literature Search Best Practice
  • What is literature searching?
  • What are literature reviews?
  • Hierarchies of evidence
  • 1. Managing references
  • 2. Defining your research question
  • 3. Where to search
  • 4. Search strategy
  • 5. Screening results
  • 6. Paper acquisition
  • 7. Critical appraisal
  • Further resources
  • Training opportunities and videos
  • Join FSTA student advisory board This link opens in a new window
  • Chinese This link opens in a new window
  • Italian This link opens in a new window
  • Persian This link opens in a new window
  • Portuguese This link opens in a new window
  • Spanish This link opens in a new window

Creating a search strategy

Once you have determined what your research question is and where you think you should search, you need to translate your question into a useable search. Doing so will:

  • Make it much more likely that you will find the relevant research and minimise false hits (irrelevant results)
  • Save you time in the long run
  • Help you to stay objective throughout your searching and stick to your plan
  • Help you replicate and update your results (where needed)
  • Help future researchers build on your research.

If you need to explore a topic first, your search strategy can initially be quite loose. You can then revisit search terms and update your search strategy accordingly. Record your search strategy as you develop it and capture the final version for each place that you search.

Remember that information retrieval in the area of food is complex because of the breadth of the field and the way in which content is indexed. As a result, there is often a high level of 'noise' when searching food topics in a database not designed for food content. Creating successful search strategies involves knowledge of a database, its scope, indexing and structure.

  • Key concepts and meaningful terms
  • Keywords or subject headings
  • Alternative keywords
  • Care in linking concepts correctly
  • Regular evaluation of search results, to ensure that your search is focused
  • A detailed record of your final strategy. You will need to re-run your search at the end of the review process to catch any new literature published since you began.
  • Search matrix
  • Populated matrix
  • Revised matrix (after running searches)

  • DOWNLOAD THE SEARCH MATRIX

Using a search matrix helps you brainstorm and collect words to include in your search. To populate a search matrix:

  • Identify the main concepts in your search
  • Run initial searches with your terms, scanning abstract and subject terms (sometimes called descriptors, keywords, MeSH headings, or thesaurus terms, depending on which database you are using) of relevant results for words to add to the matrix.
  •  Explore a database thesaurus hierarchy for suitable broader and narrower terms.
Note : You don’t need to fill all of the boxes in a search matrix.

You will find that you need to run some trial searches as you experiment, and this will help you refine your search strategy. For the search on this example question:

  • Some of the broader terms turned out to be too broad, introducing a host of irrelevant results about pork and chicken
  • Some of the narrower terms were unnecessary, as any result containing “beef extract” is captured by just using the term beef.

See the revised matrix (after running searches) tab!

This revised matrix shows both the adjustments made to terms and how the terms are connected with Boolean operators. Different forms of the same concept (within a column) are connected with OR, and the different concepts (the columns) are connected with AND.

Search tools

  • Boolean operators
  • Phrases and proximity searching
  • Truncation and wildcards

Boolean operators tell a database or search engine how the terms you type are related to each other.  

Use OR to connect variations representing the same concept . In many search interfaces you will want to put your OR components inside parentheses like this: (safe OR “food safety” OR decontamination OR contamination OR disinfect*). These are now lumped together into a single food safety concept for your search.

Use AND to link different concepts. By typing (safe OR "food safety" OR decontamination OR contamination OR disinfect*) AND (beef OR "cattle carcasses"), you are directing the database to display results containing both concepts.

NOT  eliminates all results containing a specific word.  Use NOT with caution. The term excluded might be used in a way you have not anticipated, and you will not know because you will not see the missing results.

Learn more about using Boolean operators:  Research Basics: Using Boolean Operators to Build a Search (ifis.org)

The search in the matrix above would look like this in a database:

("food safety"  OR  safety  OR  decontamination  OR  contamination  OR  disinfection)  AND  (thaw*  OR  defrost*  OR  "thawing medium")  AND  ("sensory quality attributes"  OR  "sensory perception"  OR  quality  OR  aroma  OR  appearance  OR  "eating quality"  OR  juiciness  OR  mouthfeel  OR  texture  OR  "mechanical properties"  OR  "sensory analysis"  OR  "rheological properties")  AND  (beef  OR  "cattle carcasses")
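Assembling such a string from a matrix is mechanical, as this Python sketch shows. It uses a trimmed version of the example matrix (the sensory-quality column is omitted for brevity): variants within a column are ORed, and the columns are then ANDed.

```python
def build_query(matrix):
    """Turn a search matrix (list of columns, each a list of term variants)
    into a Boolean search string: OR within columns, AND across columns."""
    def quote(term):
        # Quote multi-word variants so databases treat them as phrases.
        return f'"{term}"' if " " in term else term
    groups = ["(" + " OR ".join(quote(t) for t in column) + ")"
              for column in matrix]
    return " AND ".join(groups)

matrix = [
    ["food safety", "safety", "decontamination", "contamination", "disinfection"],
    ["thaw*", "defrost*", "thawing medium"],
    ["beef", "cattle carcasses"],
]
print(build_query(matrix))
# → ("food safety" OR safety OR decontamination OR contamination OR disinfection)
#   AND (thaw* OR defrost* OR "thawing medium") AND (beef OR "cattle carcasses")
```

Scripting the assembly also makes it easy to re-run and record the exact final strategy for each database, which you will need when you update the search at the end of the review.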

Thesaurus terms will help you capture variations in words and spellings that researchers might use to refer to the same concept, but you can and should also use other mechanisms utilised by databases to do the same. This is especially important for searches in databases where the thesaurus is not specialised for food science.

  • Phrase searching: putting two or more words inside quotation marks, like "food safety", will ensure that those words appear together as a phrase in a single field (i.e. title, abstract or subject heading). Phrase searching can eliminate false hits where the words used separately do not represent the needed concept.
  • Some databases allow you to use proximity searching to specify that words need to be near each other. For instance, if you type ripening N5 cheese you will get results with a maximum of five words between ripening and cheese .  You would get results containing cheese ripening as well as results containing ripening of semi-hard goat cheese .

Learn how to test if a phrase search or a proximity search is the better choice for your search:  Proximity searching, phrase searching, and Boolean AND: 3 techniques to focus your literature search (ifis.org)

Note : Proximity symbols vary from database to database. Some use N plus a number, while others use NEAR, ADJ or W. Always check the database help section to be sure that you are using the right symbols for that database .

Truncating a word mean typing the start of a word, followed by a symbol, usually an asterisk (*).  This symbol tells the database to return the letters you have typed followed either by no letters (if appropriate) or letters.  It is an easy way to capture a concept that might be expressed with a variety of endings. 

Sometimes you need to adjust where you truncate to avoid irrelevant results. See the difference between results for nutri* and nutrit*.

Inserting  wildcard  symbols into words covers spelling variations.  In some databases, typing  organi?ation  would return results with  organisation  or  organization , and  flavo#r  would bring back results with  flavor  or  flavour .  

Note : While the truncation symbol is often *, it can also be $ or !.   Wildcard symbols also vary from database to database. $ or ? are sometimes used. Always check the database help section to be sure that you are using the right symbols for that database.
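One way to see what truncation and wildcard symbols actually match is to translate them into a regular expression over a single word. The sketch below assumes * means "any letters, including none" and ? means "exactly one letter"; as noted above, the symbols and their meanings vary between databases, so treat these as assumptions.

```python
import re

def db_pattern(term):
    """Translate a search term containing * (any ending, including
    none) and ? (exactly one letter) into a regex that matches a
    whole word. Symbol meanings are assumptions; check your
    database's help section for its actual syntax."""
    parts = []
    for ch in term:
        if ch == "*":
            parts.append(r"\w*")
        elif ch == "?":
            parts.append(r"\w")
        else:
            parts.append(re.escape(ch))
    return re.compile(r"\A" + "".join(parts) + r"\Z", re.IGNORECASE)

# nutri* also catches the rodent 'nutria'; nutrit* does not.
print(bool(db_pattern("nutri*").match("nutria")))              # True
print(bool(db_pattern("nutrit*").match("nutria")))             # False
print(bool(db_pattern("organi?ation").match("organisation")))  # True
print(bool(db_pattern("organi?ation").match("organization")))  # True
```

The nutri*/nutrit* pair shows concretely why moving the truncation point one letter can remove a whole family of irrelevant results.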

In building a search you can combine all the tools available to you. “Brewer* yeast”, which uses both phrase searching and truncation, will bring back results for brewer yeast, brewer’s yeast and brewers yeast, three variations which are all used in the literature.

Best Practice!

BEST PRACTICE RECOMMENDATION:   Always check a database's help section to be sure that you are using the correct  proximity, truncation or wildcard symbols for that database. 

Handsearching

It is good practice to supplement your database searches with handsearching. This is the process of manually looking through the tables of contents of journals and conference proceedings to find studies that your database searches missed. A related activity is looking through the reference lists of relevant articles found through database searches. There are three reasons why doing both these things is a good idea:

  • If, through handsearching, you identify additional articles which are in the database you used but weren’t included in the results from your searches, you can look at the article records to consider if you need to adjust your search strategy. You may have omitted a useful variation of a concept from your search string.
  • Even when your search string is excellent, some relevant records simply don’t contain terms in their titles, abstracts or indexing that allow them to be identified by a search.
  • References might point to research published before the indexing began for the databases you are using.

For handsearching, target journals or conference proceedings that are clearly in the area of your topic and look through tables of contents. Sometimes valuable information within supplements or letters is not indexed within databases.

Academic libraries might subscribe to tools which can speed up the process, such as Zetoc (which includes conference and journal contents) or Browzine (which only covers journals). You can also see past and current issues’ tables of contents on a journal’s webpage.

Handsearching is a valuable but labour-intensive activity, so think carefully about where to invest your time.

Best Practice!

BEST PRACTICE RECOMMENDATION:   Ask a colleague, lecturer, or librarian to review your search strategy. This can be very helpful, especially if you are new to a topic. It adds credibility to your literature search and will help ensure that you are running the best search possible.

BEST PRACTICE RECOMMENDATION:   Remember to save a detailed record of your searches so that you can run them shortly before you are ready to submit your project to see if any new relevant research has been published since you embarked on your project. A good way to do this is to document:

  • Where the search was run
  • The exact search
  • The date it was run
  • The number of results

Keeping all this information will make it easy to see if your search picks up new results when you run it again.
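A lightweight way to keep that record is a simple spreadsheet or CSV file with one row per search. The sketch below writes such a log with Python's standard library; the file name, field names and the example row (including the result count) are just one possible layout, not a prescribed format.

```python
import csv
from datetime import date

# One row per search run: where it was run, the exact search string,
# the date it was run, and the number of results.
searches = [
    {
        "source": "Web of Science Core Collection",
        "query": '(thaw* OR defrost*) AND (beef OR "cattle carcasses")',
        "date_run": date.today().isoformat(),
        "results": 128,  # hypothetical count for illustration
    },
]

with open("search_log.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(
        f, fieldnames=["source", "query", "date_run", "results"]
    )
    writer.writeheader()
    writer.writerows(searches)
```

Appending a new row each time you run or revise a search makes it trivial to rerun the exact strings later and compare result counts.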

BEST PRACTICE RECOMMENDATION: If you are publishing your research, take note of the journals that appear frequently in your search results: they are a good indication of where publishing on your topic might have the most impact.

  • Last Updated: Sep 15, 2023 2:17 PM
  • URL: https://ifis.libguides.com/literature_search_best_practice
  • Open access
  • Published: 26 January 2021

PRISMA-S: an extension to the PRISMA Statement for Reporting Literature Searches in Systematic Reviews

  • Melissa L. Rethlefsen   ORCID: orcid.org/0000-0001-5322-9368 1 ,
  • Shona Kirtley   ORCID: orcid.org/0000-0002-7801-5777 2 ,
  • Siw Waffenschmidt   ORCID: orcid.org/0000-0001-6860-6699 3 ,
  • Ana Patricia Ayala   ORCID: orcid.org/0000-0002-3613-2270 4 ,
  • David Moher   ORCID: orcid.org/0000-0003-2434-4206 5 ,
  • Matthew J. Page   ORCID: orcid.org/0000-0002-4242-7526 6 ,
  • Jonathan B. Koffel   ORCID: orcid.org/0000-0003-1723-5087 7 &

PRISMA-S Group

Systematic Reviews, volume 10, Article number: 39 (2021)

Literature searches underlie the foundations of systematic reviews and related review types. Yet, the literature searching component of systematic reviews and related review types is often poorly reported. Guidance for literature search reporting has been diverse, and, in many cases, does not offer enough detail to authors who need more specific information about reporting search methods and information sources in a clear, reproducible way. This document presents the PRISMA-S (Preferred Reporting Items for Systematic reviews and Meta-Analyses literature search extension) checklist, and explanation and elaboration.

The checklist was developed using a 3-stage Delphi survey process, followed by a consensus conference and public review process.

The final checklist includes 16 reporting items, each of which is detailed with exemplar reporting and rationale.

Conclusions

The intent of PRISMA-S is to complement the PRISMA Statement and its extensions by providing a checklist that could be used by interdisciplinary authors, editors, and peer reviewers to verify that each component of a search is completely reported and therefore reproducible.

Introduction

One crucial component of a systematic review is the literature search. The literature search, or information retrieval process, not only informs the results of a systematic review; it is the underlying process that establishes the data available for analysis. Additional components of the systematic review process such as screening, data extraction, and qualitative or quantitative synthesis procedures are dependent on the identification of eligible studies. As such, the literature search must be designed to be both robust and reproducible to ensure the minimization of bias.

Guidelines exist for both the conduct of literature searches (Table 2 ) for systematic reviews and their reporting [ 2 , 3 , 4 , 5 , 6 , 7 ]. Problematically, however, the many guidelines for reporting systematic review searches share few common reporting elements. In fact, Sampson et al. discovered that of the eleven instruments designed to help authors report literature searches well, only one item appeared in all eleven instruments [ 8 ]. Though Sampson et al.’s study was conducted in 2007, the problem has only been compounded as new checklists and tools have continued to be developed. The most commonly used reporting guidance for systematic reviews, which covers the literature search component, is the Preferred Reporting Items for Systematic reviews and Meta-Analyses Statement, or PRISMA Statement [ 9 ]. The 2009 PRISMA Statement checklist included three items related to literature search reporting, items 7, 8, and 17:

Item 7: Describe all information sources (e.g., databases with dates of coverage, contact with study authors to identify additional studies) in the search and date last searched.
Item 8: Present full electronic search strategy for at least one database, including any limits used, such that it could be repeated.
Item 17: Give numbers of studies screened, assessed for eligibility, and included in the review, with reasons for exclusions at each stage, ideally with a flow diagram.

Despite wide usage of the PRISMA Statement [ 10 ], compliance with its items regarding literature search reporting is low [ 11 , 12 , 13 , 14 ]. Even for those studies which explicitly reference PRISMA, there is only slight, statistically non-significant evidence of improved reporting, as found by Page et al. [ 15 ]. Part of the challenge may be the multifactorial nature of each of the PRISMA items relating to searches; authors may feel that if they completed one component of an item, they can check off that item altogether. Another part of the challenge may be that many systematic reviews do not include librarians or information specialists as members of the systematic review team or as authors on the final manuscript [ 11 , 16 , 17 , 18 ]. Preliminary research suggests that librarian or information specialist involvement is correlated with reproducibility of searches [ 16 , 17 , 18 ], likely due to their expertise surrounding search development and documentation. However, reviews where librarians are authors still include reproducible searches only 64% of the time [ 17 ].

A larger issue may be that, even amongst librarians and information specialists, debate exists as to what constitutes a reproducible search and how best to report the details of the search. Researchers assessing the reproducibility of the search have used varying methods to determine what constitutes a reproducible search [ 11 , 17 , 19 , 20 ]. Post-publication peer review of search methods, even amongst Cochrane reviews, which generally have superior reporting compared to non-Cochrane reviews [ 15 ], has shown that reporting that appears complete may still pose challenges for those wishing to reproduce searches [ 20 , 21 , 22 , 23 , 24 ]. Furthermore, little guidance on how to report searches using information sources or methods other than literature databases, such as searching web sites or study registries, exists [ 25 , 26 ].

Incomplete reporting of the literature search methods can introduce doubt and diminish trust in the final systematic review conclusions. If researchers are unable to understand or reproduce how information was gathered for a systematic review, they may suspect the authors of having introduced bias into their review by not conducting a thorough or pre-specified literature search. After observing the high number of systematic reviews with poorly reported literature searches, we sought to create an extension to the PRISMA statement. Our aims were four-fold:

To provide extensive guidance on reporting the literature search components of a systematic review.

To create a checklist that could be used by authors, editors, and peer reviewers to verify that each component of a search was completely reported and therefore reproducible.

To develop an interdisciplinary checklist applicable to all method-driven literature searches for evidence synthesis.

To complement the PRISMA Statement and its extensions.

Because we intend the checklist to be used in all fields and disciplines, we use “systematic reviews” throughout this document as a representative name for the entire family of evidence syntheses [ 27 ]. This includes, but is not limited to, scoping reviews, rapid reviews, realist reviews, metanarrative reviews, mixed methods reviews, umbrella reviews, and evidence maps [ 28 ]. We use the term “literature search” or “search” throughout to encompass the full range of possible search methods and information sources.

Part 1: Developing the Checklist

After consultation with members of the PRISMA Statement steering group (D.M. and D.G.A.), we formed an executive committee (M.L.R, J.K., S.K.) and developed a protocol [ 29 ] according to the steps outlined in the “Guidance for Developers of Health Research Reporting Guidelines [ 30 ].” The protocol was registered on the EQUATOR Network [ 29 ]. We identified 405 potential items relevant to reporting searches in systematic reviews from 61 sources (see Additional file 1 ) located through a search of MEDLINE via Ovid, Embase via Embase.com , and LISTA via EBSCOhost, in addition to reviewing all of the sources identified by the EQUATOR Network relating to systematic reviews. We also searched our personal files and examined references of included documents for additional sources. Details of the search are available in Additional file 1 . Sources included both explicit reporting guidelines and studies assessing reproducibility of search strategies. The 405 items were reviewed for overlap and consolidated into 123 remaining items for potential inclusion in a checklist.

To narrow the list into a usable checklist, we then used a three-step Delphi survey process [ 31 ]. The first survey included the initially identified 123 items and asked respondents to rate each item on a 4-point Likert-type scale. Items that 70% of experts rated as 3 or 4 (4 being “essential” and 1 “not important”) and that received a mean score of at least 3.25 were retained for rating in the second round of the Delphi process. Respondents to the first survey were invited to participate in the second and third rounds. The second round asked respondents to pick the 25 most essential items out of the remaining 53 potential items; the third round was identical, except respondents also selected the most appropriate location for reporting their selected items (e.g., in the main text, or a supplementary file). The items were ranked and categorized by general theme for discussion at an in-person consensus conference.

We created a list of one hundred and sixty-three international experts, including librarian and information specialists with expertise in systematic reviews, researchers who had written about systematic review reporting, journal editors, and systematic review methodologists, to whom we sent our initial Delphi survey. The list of experts was created using a combination of publications, mailing lists, conference proceedings, and knowledge of the authors to represent research groups and experts in 23 countries. We received 52 responses (32% response rate) to the first survey, and of these, 35 (67% response rate) completed both surveys two and three. This study was declared exempt by the University of Utah Institutional Review Board (IRB_00088425).

The results of the Delphi process were reported at a consensus conference meeting that took place in May 2016 concurrently with Mosaic ‘16, the joint meeting of the Medical Library Association, Canadian Health Libraries Association/Association des bibliothèques de la santé du Canada, and the International Clinical Librarian Conference (ICLC). 38 individuals attended the consensus conference, 14 (37%) of whom had participated in the Delphi surveys. At the consensus conference, the grouped and ranked remaining items were distributed to small groups who were asked to discuss, consolidate, remove, or add missing critical items under the guidance of a group leader. After two rounds of discussion, the group leaders presented the discussion and proposed list items from their small groups for consideration by the whole group of experts.

Upon completion of the consensus conference, 30 items remained from those identified during the Delphi process, with an additional three items that had been excluded during the Delphi process added back to the draft checklist because meeting attendees considered them critical to the guideline. The list was then consolidated and reviewed by executive committee members, including two new information specialist members (S.W. and A.P.A). The draft checklist and explanation and elaboration document was released to the public on March 20, 2019, along with all data and study materials [ 32 ]. All participants in the Delphi process and/or consensus conference were contacted via email with instructions on how to provide feedback on the draft checklist items and/or elaboration and explanation document by commenting directly on the explanation and elaboration draft using a private commenting system, Hypothesis [ 33 ], or if preferred, via email. Comments from other interested individuals were solicited via Twitter, conference presentations, and personal contacts. Comments were collected from the private Hypothesis group, the public Hypothesis comments, and via email. All comments were combined into a single document. Executive committee members reviewed each comment in duplicate to indicate what type of feedback was received (i.e., linguistic, major substantive, minor substantive, or unclear) and, for substantive comments, whether change was recommended or required further discussion.

During the draft and revision process (March 20–June 15, 2019), 358 separate comments were received from 22 individuals and organizations. Based upon the extensive feedback received, the executive team revised the checklist and developed the next iteration, which was released on December 6, 2019, to coincide with the 2019 Virtual Cochrane Colloquium Santiago. Additional feedback from this release was incorporated into the final checklist. Throughout the draft and revision process, several teleconferences were held with the lead of the PRISMA 2020 statement (M.J.P), as an update of the 2009 PRISMA statement was in development, to ensure that the content on search methods was consistent between the PRISMA 2020 and PRISMA-S guidelines [ 34 , 35 ].

Part 2: Checklist

PRISMA-S is a 16-item checklist that covers multiple aspects of the search process for systematic reviews. It is intended to guide reporting, not conduct, of the search. The checklist should be read in conjunction with the Explanation and Elaboration (Part 3), which provides more detail about each item. We also include two boxes, one a glossary of terms (see Table 2 ) and the other, guidance on depositing search data and method descriptions in online repositories (see Table 3 ).

The Explanation and Elaboration also includes examples of good reporting for each item. Each exemplar is drawn from published systematic reviews. For clarity, some exemplars are edited to match the style of this document, including any original citations, and abbreviations are spelled out to aid comprehension. Any other edits to the text are noted with square brackets. A description of the rationale behind the item is explained, followed by additional suggestions for clear reporting and a suggested location(s) for reporting the item.

Not every systematic review will make use of all of the items in the Information Sources and Methods section of the checklist, depending on the research question and the methods chosen by the authors. The checklist provides a framework for the current most common and recommended types of information sources and methods for systematic reviews, but authors should use and report those items relevant and appropriate to their review. The checklist may also be used for systematic review protocols to fully document the planned search, in conjunction with the PRISMA-P reporting guideline [ 36 ] (Table 1 ).

Part 3: Explanation and Elaboration

Item 1. Database name

Name each individual database searched, stating the platform for each.

“The following electronic databases were searched: MEDLINE (Ovid), CINAHL (EBSCOhost), PsycINFO (Ovid), Cochrane Central Register of Controlled Trials (Ovid), SPORTDiscus (EBSCOhost), EMBASE (Ovid) and ProQuest Dissertations and Theses Global (ProQuest).” [ 38 ]

Explanation

Databases are the most commonly used tool to locate studies to include in systematic reviews and meta-analyses [ 6 , 39 ]. There is no single database that is able to provide a complete and accurate list of all studies that meet systematic review criteria due to the differences in the articles included and the indexing methods used between databases (Table 2 ). These differences have led to recommendations that systematic review teams search multiple databases to maximize the likelihood of finding relevant studies [ 6 , 39 , 40 ]. This may include using broad disciplinary databases (e.g., MEDLINE [ 41 ], Embase [ 42 ], Scopus [ 43 ]), specialized databases (e.g., PsycINFO [ 44 ] or EconLit [ 45 ]), or regional databases (e.g., LILACS [ 46 ] or African Index Medicus [ 47 ]).

Many of these literature databases are available through multiple different search platforms (Table 2 ). For example, the MEDLINE database is available through at least 10 different platforms, including Ovid, EBSCOhost, Web of Science, and PubMed. Each platform offers different ways of searching the databases, such as platform-specific field codes (Table 2 ), phrase searching, truncation, or searching full-text versus abstract and keyword only [ 48 ]. Different platforms may contain additional data that are not available in the original database, such as times cited, social media impact, or additional keywords. These differences between the platforms can have a meaningful impact on the results provided [ 48 , 49 , 50 ].

Authors should identify which specific literature databases were searched to locate studies included in the systematic review. It is important that authors indicate not only the database, but the platform through which the database was searched. This helps readers to evaluate the quality and comprehensiveness of the search and supports reproducibility and updating (Table 2 ) in the future by allowing the strategy to be copied and pasted as recommended in Item 8, below.

The distinctions between database and platform may not always be clear to authors, especially when the database is the only one available through a platform (e.g., Scopus [ 43 ]). In these cases, authors may choose to include the web address of the database in the text or the bibliography to provide clarity for their readers.

Suggested location for reporting

Report each database name and platform in the methods section and any supplementary materials (Table 2 ). If space permits, report key database names in the abstract.

Item 2. Multi-database searching

If databases were searched simultaneously on a single platform, state the name of the platform, listing all of the databases searched.

“The MEDLINE and Embase strategies were run simultaneously as a multi-file search in Ovid and the results de-duplicated using the Ovid de-duplication tool.” [ 51 ]
“A systematic literature search was performed in Web of Knowledge™ (including KCI Korean Journal Database, MEDLINE, Russian Science Citation Index, and SciELO Citation Index)….” [ 52 ]

Authors may choose to search multiple databases at once through a single search platform to increase efficiency. Along with the name of the platform, it is necessary to list the names of each of the individual databases included as part of the search. Including information about using this approach in the text of the manuscript helps readers immediately understand how the search was constructed and executed. This helps readers determine how effective the search strategy (Table 2 ) will be for each database [ 1 ].

Report any multi-database search (Table 2 ) in the methods section and any supplementary materials. If space permits, report key individual database names in the abstract, even if run through a multi-database search.

Item 3. Study registries

List any study registries searched.

“[We] searched several clinical trial registries ( ClinicalTrials.gov , Current Controlled Trials ( www.controlled-trials.com ), Australian New Zealand Clinical Trials Registry ( www.actr.org.au ), and University Hospital Medical Information Network Clinical Trials Registry ( www.umin.ac.jp/ctr )) to identify ongoing trials.” [ 53 ]

Study registries are a key source of information for systematic reviews and meta-analyses in the health sciences and increasingly in other disciplines. In the health sciences, study registries (Table 2 ) allow researchers to locate ongoing clinical trials and studies that may have gone unpublished [ 54 , 55 , 56 ]. Some funders, including the National Institutes of Health, require principal investigators to share their data on study registries within a certain time frame after grant completion [ 57 ]. This data may not have been published in any other location, making study registries a critical component of an information strategy, though timely reporting remains a challenge [ 58 , 59 ]. Different countries have their own study registries, as do many pharmaceutical companies.

Outside the health sciences, study registries are becoming increasingly important as many disciplines adopt study pre-registration as a tactic for improving the rigor of research. Though not yet as established as in the health sciences, these study registries are continually expanding and will serve as key sources for finding unpublished studies in fields in the social sciences and beyond.

To fully describe the study registries searched, list the name of each study registry searched, and include a citation or link to the study registry.

Report any study registries searched in the methods section and any supplementary materials.

Item 4. Online resources and browsing

Describe any online or print source purposefully searched or browsed (e.g., tables of contents, print conference proceedings, web sites), and how this was done.

“ We also searched the grey literature using the search string: “public attitudes” AND “sharing” AND “health data” on Google (in June 2017). The first 20 results were selected and screened.” [ 60 ]
“The grey literature search was conducted in October 2015 and included targeted, iterative hand searching of 22 government and/or research organization websites that were suggested during the expert consultation and are listed in S1 Protocol. Twenty two additional citations were added to the review from the grey literature search.” [ 61 ]
“To locate unpublished studies, we searched Embase [via Embase.com ] for conference proceedings since 2000 and hand-searched meeting abstracts of the Canadian Conference on Physician Health and the International Conference on Physician Health (2012 to 2016).” [ 62 ]

Systematic reviews were developed to remove as much bias as possible from the literature review process. One of the most important ways they achieve this reduction in bias is by searching beyond literature databases, which are skewed towards English-language publications with positive results [ 63 , 64 ]. To achieve a fuller picture of what the research on a specific topic looks like, systematic reviewers could seek out research that may be in progress and research that was never published [ 6 ]. Using other methods of finding research also helps identify research that may have been indexed in literature databases, but went undiscovered when searching those sources [ 40 ]. Seeking out this research often involves a complex strategy, drawing on a wealth of online and print resources as well as personal contacts.

Web search engines and specific web sites

Searching general internet search engines and searching the contents of specific websites is a key component of many systematic reviews [ 26 , 65 ]. Government, non-profit organization, and pharmaceutical company websites, for example, contain a wealth of information not published elsewhere [ 6 , 66 ]. Though searching a general search engine like Google or using a general search engine to search a specific website may introduce some bias into the search methodology through the personalization algorithms inherent in many of these tools [ 67 , 68 ], it is still important to fully document how web searches were conducted [ 65 ].

Authors should list all websites searched, along with their corresponding web address. Readers should be able to clearly understand if researchers used a website’s native search interface or advanced search techniques within a general search engine. If authors used a general search engine, authors should declare whether steps were taken to reduce personalization bias (e.g., using “incognito” mode in a browser). Authors may choose whether to detail the websites searched within the text (i.e., Google ( http://www.google.com )), by citing the websites in the bibliography, or by listing the website with corresponding web address in supplementary material, as shown in the examples above.

Review teams may occasionally set an artificial limit to the number of items they will screen from a given search or source [ 65 ]. This is because searching web search engines and individual websites will often lead to an unmanageable number of results, the search engine itself may only display a restricted number of results (e.g., Google will only display 1000 results), or the team has a finite budget or timeline to complete the review. Thus, many systematic review teams utilizing web search engines will often pre-designate a limit to the number of results they review. If review teams choose to review a limited set of results, it should be noted in the text, along with the rationale.

Conference proceedings

Studies show that large percentages of research presented as papers and posters at conferences never make their way into the published literature, particularly if the study’s results were statistically negative [ 63 , 69 ]. Conference proceedings are often the only way to locate these studies. Including conference proceedings in a systematic review search helps minimize bias [ 70 ]. The introduction of online conference proceedings has been a boon to researchers and reduced the need to review printed abstract books. Additionally, some databases either include conference proceedings along with journal articles (i.e., Embase [ 42 ]) or contain only conference proceedings (i.e., ProceedingsFirst [ 71 ] or Directory of Published Papers [ 72 ]). Some conferences have made their abstracts available in a single database (i.e., International AIDS Society’s Abstract Archive [ 73 ]). When using these types of databases to search conference proceedings, authors can treat them as above in Item 1.

Individual conferences’ online proceedings may be password-protected for association members or conference attendees [ 74 ]. When reporting on conference proceedings searched or browsed (Table 2 ) via a conference or association’s online or print proceedings, authors must specify the conference names, the dates of conferences included, and the method used to search the proceedings (i.e., browsing print abstract books or using an online source). If the conference proceedings are searched online, authors should specify the web address(es) for the conference proceedings and the date(s) of the conferences. If the conference proceedings are published in a journal, the authors should cite the journal. If the proceedings are a standalone publication, authors may choose to cite them using the same methods used to cite a book or by providing the full information about the conference (name, location, dates, etc.) in a supplementary file.

General browsing

Authors also commonly browse print or online tables of contents, full contents of journals, or other sources that are the most likely to contain research on the topic sought. When purposefully browsing, describe any method used, the name of the journal or other source, and the time frame covered by the search, if applicable.

Report online information sources (Table 2 ) searched or browsed in the methods section and in any supplementary materials. Systematic reviews using several of these methods, or using multiple information sources for each method, may need to report their methods briefly in the methods section, but should fully report all necessary information to describe their approaches in supplementary materials.

Item 5. Citation searching

Indicate whether cited references or citing references were examined, and describe any methods used for locating cited/citing references (e.g., browsing reference lists, using a citation index, setting up email alerts for references citing included studies).

“Reference lists of included articles were manually screened to identify additional studies.” [ 75 ]
“[W]e used all shared decision making measurement instruments that were identified in Gärtner et al’s recent systematic review (Appendix A). We then performed a systematic citation search, collecting all articles that cited the original papers reporting on the development, validation, or translation of any the observational and/or self-reported shared decision making measurement instruments identified in that review. An experienced librarian (P.J.E.) searched Web of Science [Science Citation Index] and Scopus for articles published between January 2012 and February 2018.” [ 76 ]
“We [conducted] citation tracking of included studies in Web of Science Core Collection on an ongoing basis, using citation alerts in Web of Science Core Collection.” [ 77 ]

One of the most common search methods is reviewing the references or bibliographies of included studies [ 11 , 17 ]. This type of citation searching (looking for cited references) can be additive to other cited reference searching methods, such as examining bibliographies of relevant systematic reviews. In addition, researchers may choose to look for articles that cite specified studies [ 78 ]. This may include looking beyond one level forwards and backwards (e.g., examining the bibliographies of articles cited by specified articles) [ 78 ]. Looking at bibliographies of included articles or other specified articles is often conducted by examining full-text articles, but it can also be accomplished using online tools called citation indexes (Table 2 ).
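The bookkeeping behind backward citation searching can be illustrated with a short sketch: gather every reference cited by the included studies, drop records already screened, and keep each candidate only once. This is a hypothetical illustration; the field names (`cited_dois`) and data are made up, not drawn from any real database export.

```python
# Hypothetical sketch: collect unique candidate records from backward
# citation searching (reference lists of included studies). Field names
# and data are illustrative only.

def backward_citation_candidates(included_studies, already_screened):
    """Union of all cited DOIs across included studies, minus records
    already screened, so each candidate is examined only once."""
    candidates = set()
    for study in included_studies:
        candidates.update(study.get("cited_dois", []))
    return sorted(candidates - set(already_screened))

included = [
    {"doi": "10.1000/a", "cited_dois": ["10.1000/x", "10.1000/y"]},
    {"doi": "10.1000/b", "cited_dois": ["10.1000/y", "10.1000/z"]},
]
print(backward_citation_candidates(included, already_screened=["10.1000/y"]))
# → ['10.1000/x', '10.1000/z']
```

Because each candidate appears once regardless of how many included studies cite it, the counts reported for citation searching in the flow diagram stay deduplicated by construction.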

The use of these methods can be complicated to describe, but the explanation should clearly state the database used, if applicable (e.g., Scopus, Google Scholar, Science Citation Index), and describe any other methods used. Authors must also cite the “base” article(s) upon which citation searching was performed, whether examining cited or citing articles (Table 2 ). If the same database is used both for a topical search and for citation searching, describe each use separately. For manually checking the reference lists of included articles, a simple statement as in the first example is sufficient.

Report citation searching details in the methods section and in any supplementary materials.

Item 6. Contacts

Indicate whether additional studies or data were sought by contacting authors, experts, manufacturers, or others.

“We contacted representatives from the manufacturers of erythropoietin-receptor agonists (Amgen, Ortho-Biotech, Roche), corresponding or first authors of all included trials and subject-area experts for information about ongoing studies.” [ 79 ]
“We also sought data via expert requests. We requested data on the epidemiology of injecting drug use and blood-borne viruses in October, 2016, via an email distribution process and social media. This process consisted of initial emails sent to more than 2000 key experts and organisations, including contacts in the global, regional, and country offices of WHO, UNAIDS, Global Fund, and UNODC (appendix p 61). Staff in those agencies also forwarded the request to their colleagues and other relevant contacts. One member of the research team (SL) posted a request for data on Twitter, which was delivered to 5525 individual feeds (appendix p 62).” [ 80 ]

Contacting manufacturers (e.g., pharmaceutical companies), or reaching out to authors or experts directly or through organizations, is a key method for locating unpublished and ongoing studies [ 6 ]. Contacting authors or manufacturers may also be necessary when publications, conference proceedings, or clinical trials registry records do not provide the complete information needed [ 63 , 81 ]. Contacting manufacturers or regulating agencies might be required to acquire complete trial data from the clinical study reports [ 82 , 83 ]. Broader calls for evidence may also be issued when no specific groups or individuals are targeted.

Contact methods may vary widely, depending on the context, and may include personal contact, web forms, email mailing lists, mailed letters, social media contacts, or other methods. As these strategies are inherently difficult to reproduce, researchers should attempt to give as much detail as possible on what data or information was sought, who or what group(s) provided data or information, and how the individuals or groups were identified.

Report information about contacts to solicit additional information in the methods section and in any supplementary materials. Systematic reviews using elaborate calls for evidence or making extensive use of contacts as an information source may need to report their methods briefly in the methods section, but should fully report all necessary information to describe their approaches in supplementary materials.

Item 7. Other methods

Describe any additional information sources or search methods used.

“We also searched… our personal files.” [ 84 ]
“PubMed’s related articles search was performed on all included articles.” [ 85 ]

A thorough systematic review may use many additional methods of locating studies beyond database searching, many of which may not be reproducible. A key example is searching personal files. Another is using databases’ built-in tools, such as PubMed’s Related Articles feature [ 86 ] or Clarivate Analytics’ Web of Science’s Related Records feature [ 87 ], to locate relevant articles based on commonalities with a starting article. Because these tools are often proprietary and their algorithms opaque, researchers may not be able to replicate the exact results at a later date. To be as transparent as possible, researchers should both note the tool that was used and cite any articles these operations were run upon. For all “other” methods, it is still important to declare that the method was used, even if it may not be fully replicable.

Report information about any other additional information sources or search methods used in the methods section and in any supplementary materials.

Item 8. Full search strategies

Include the search strategies for each database and information source, copied and pasted exactly as run.

Database search. Methods section description . “The reproducible searches for all databases are available at DOI:10.7302/Z2VH5M1H.” [ 88 ]
Database search. One of the full search strategies from supplemental materials in online repository . “ Embase.com (692 on Jan 19, 2017) 1. 'social media'/exp OR (social NEAR/2 (media* OR medium* OR network*)):ti OR twitter:ti OR youtube:ti OR facebook:ti OR linkedin:ti OR pinterest:ti OR microblog*:ti OR blog:ti OR blogging:ti OR tweeting:ti OR 'web 2.0':ti 2. 'professionalism'/exp OR 'ethics'/exp OR 'professional standard'/de OR 'professional misconduct'/de OR ethic*:ab,ti OR unprofessional*:ab,ti OR professionalism:ab,ti OR (professional* NEAR/3 (standard* OR misconduct)):ab,ti OR ((professional OR responsib*) NEAR/3 (behavi* OR act OR conduct*)):ab,ti 3. #1 AND #2 AND [english]/lim NOT ('conference abstract':it OR 'conference paper':it) [ 88 ]
Online resources and browsing. Methods section description . “The approach to study identification from this systematic review is transparently reported in the Electronic Supplementary Material Appendix S1.” [ 89 ]
Online resources and browsing. One of the full online resource search strategies reported in supplement . “Date: 12/01/16. Portal/URL: Google. https://www.google.co.uk/webhp?hl=en . Search terms: ((Physical training) and (man or men or male or males) and (female or females or women or woman) and (military)). Notes: First 5 pages screened on title (n=50 records).” [ 89 ]

Systematic reviews and related review types rely on thorough and complex search strategies to identify literature on a given topic. The search strategies used to conduct this data gathering are essential to the transparency and reproducibility of any systematic review. Without being able to assess the quality of the search strategies used, readers are unable to assess the quality of the systematic review [ 9 , 11 , 17 ].

When space was at a premium in publications, complete reporting of search strategies was a challenge. Because it was necessary to balance the need for transparency with publication restrictions, previous PRISMA guidelines recommended including the complete search strategy from a minimum of one database searched [ 9 ]. Many systematic reviews therefore reported only the minimum necessary. However, reporting only selected search strategies can contribute to the observed irreproducibility of many systematic reviews [ 11 , 17 ].

The prior versions of PRISMA did not elaborate on methods for reporting search strategies outside of literature databases. Subsequent to its publication, many groups have begun identifying the challenges of fully documenting other types of search methods [ 90 , 91 ]. Now recommended is the explicit documentation of all of the details of all search strategies undertaken [ 91 , 92 ]. These should be reported to ensure transparency and maximum reproducibility, including searches and purposeful browsing activities undertaken in web search engines, websites, conference proceeding databases, electronic journals, and study registries.

Journal restrictions vary, but many journals now allow authors to publish supplementary materials with their manuscripts. At minimum, all search strategies, including search strategies for web search engines, websites, conference proceedings databases, electronic journals, and study registries, should be submitted as a supplement for publication. Though most supplements are appropriately accessible on journal publishers’ web sites as submitted, supplements may disappear [ 17 ]. In addition, many supplements are only available to journal subscribers [ 93 ]. Similarly, manuscripts available on public access systems like PubMed Central [ 94 ] may not have the corresponding supplemental materials properly linked. For optimal accessibility, authors should upload complete documentation to a data repository (Table 2 ), an institutional repository, or other secure and permanent online archive instead of relying on journal publication (see Table 3 for additional information).

It is important to document and report the search strategy exactly as run, typically by copying and pasting the search strategy directly as entered into the search platform. This is to ensure that information such as the fields searched, term truncation, and combinations of terms (i.e., Boolean logic or phrases) are accurately recorded. Many times, the copied and pasted version of a search strategy will also include key information such as limits (see Item 9; Table 2 ) used, databases searched within a multi-database search, and other database-specific detail that will enable more accurate reporting and greater reproducibility. This documentation must also repeat the database or resource name, database platform or web address, and other details necessary to clearly describe the resource.
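One way to keep these details together is to log each search as a structured record that pairs the verbatim strategy with its context. The sketch below is a minimal illustration; the field names are assumptions, not a standard schema, and the example values are invented.

```python
# Minimal sketch of a structured search log entry capturing the details
# Item 8 asks for. Field names are illustrative assumptions, not a
# standard schema; the example values are invented.
from dataclasses import dataclass, asdict

@dataclass
class SearchRecord:
    database: str   # e.g., "MEDLINE"
    platform: str   # e.g., "Ovid"
    date_run: str   # date the search was executed
    strategy: str   # copied and pasted exactly as run
    limits: str     # any limits applied, or "none"
    results: int    # number of records retrieved

record = SearchRecord(
    database="MEDLINE",
    platform="Ovid",
    date_run="2017-02-26",
    strategy='exp Heart Failure/ OR "heart failure".ti,ab.',
    limits="none",
    results=692,
)
print(asdict(record)["database"])  # → MEDLINE
```

A list of such records, exported as a table in the supplementary materials, would cover the resource name, platform, date, verbatim strategy, limits, and per-source counts in one place.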

Report the full search strategy in supplementary materials as described above. Describe and link to the location of the supplementary materials in the methods section.

Item 9: Limits and restrictions

Specify that no limits were used, or describe any limits or restrictions applied to a search (e.g., date or time period, language, study design) and provide justification for their use.

No limits . “We imposed no language or other restrictions on any of the searches.” [ 95 ]
Limits described without justification . “The search was limited to the English language and to human studies.” [ 96 ]
“The following search limits were then applied: randomized clinical trials (RCTs) of humans 18 years or older, systematic reviews, and meta-analyses.” [ 97 ]
Limits described with justification . “The search was limited to publications from 2000 to 2018 given that more contemporary studies included patient cohorts that are most reflective of current co-morbidities and patient characteristics as a result of the evolving obesity epidemic.” [ 98 ]
Limits described, one with justification . “Excluded publication types were comments, editorials, patient education handouts, newspaper articles, biographies, autobiographies, and case reports. All languages were included in the search result; non-English results were removed during the review process…. To improve specificity, the updated search was limited to human participants.” [ 99 ]

Many databases have features that allow searchers to quickly restrict a search using limits. The limits available are unique to both the database and the platform used to search it. Limits depend on the accuracy of the indexer, the timeliness of indexing, and the quality of any publisher-supplied data. For instance, using database limits to restrict searches to randomized controlled trials will only find records identified by the indexer as randomized controlled trials. Since indexing may take 6 months or more to complete for any given article, searchers risk missing new articles when using database limits.

Using database-provided limit features should not be confused with using filters (see Item 10; Table 2 ) or with the inclusion criteria for the systematic review. For example, a systematic review team may choose to include only English-language randomized controlled trials. This can be done using limits, a combination of a filter (see Item 10) and screening, or screening alone. It should be clear to the reader which approach was used. For instance, in the “ Limits described, one with justification ” example above, the authors used database limits to restrict their search by publication type, but they did not use database limits to restrict by language, even though that was a component of their eligibility criteria. They also used database limits to restrict to human participants in their search update.

It is important for transparency and reproducibility that any database limits applied when running the search are reported accurately, as their use has high potential for introducing bias into a search [ 1 , 64 , 100 , 101 ]. Database limits are not recommended for use in systematic reviews, due to their fallibility [ 39 , 100 ]. If used, review teams should include a statement of justification for each use of a database limit in the methods section, the limitations section, or both [ 102 , 103 ]. In the examples above, only the last two examples provide some justification in the methods section (“to improve specificity” [ 99 ] and “contemporary studies included patient cohorts that are most reflective of current co-morbidities and patient characteristics as a result of the evolving obesity epidemic” [ 98 ]).

Report any limits or restrictions used or that no limits were used in the abstract, methods section, and in any supplementary materials, including the full search strategies (Item 8). Report the justification for any limits used within the methods section and/or in the limitations section.

Item 10. Search filters

Indicate whether published search filters were used (as originally designed or modified), and if so, cite the filter(s) used.

“For our MEDLINE search we added a highly sensitive filter for identifying randomised trials developed by the Cochrane Collaboration [38]. For Embase we used the filter for randomised trials proposed by the Scottish Intercollegiate Guidelines Network [ 104 ].” [ 105 ]

Filters are a predefined combination of search terms developed to identify references with a specific content, such as a particular type of study design (e.g., randomized controlled trials) [ 106 ], populations (e.g., the elderly), or a topic (e.g., heart failure) [ 107 ]. They often consist of a combination of subject headings, free-text terms, and publication types [ 107 ]. For systematic reviews, filters are generally recommended for use instead of limits built into databases, as discussed in Item 9, because they provide the much higher sensitivity (Table 2 ) required for a comprehensive search [ 108 ].

Any filters used as part of the search strategy should be cited, whether published in a journal article or other source. This enables readers to assess the quality of the filter(s) used, as most published search filters are validated and/or peer reviewed [ 106 , 107 ]. Many commonly used filters are published on the InterTASC Information Specialists’ Sub-Group [ 109 ], in the Cochrane Handbook [ 4 , 39 ], and through the Health Information Research Unit of McMaster University [ 110 ].

Cite any search filter used in the methods section and describe adaptations made to any filter. Include the copied and pasted details of any search filter used or adapted for use as part of the full search strategy (Item 8).

Item 11. Prior work

Indicate when search strategies from other literature reviews were adapted or reused for a substantive part or all of the search, citing the previous review(s).

“We included [search strategies] used in other systematic reviews for research design [ 111 ], setting [ 112 , 113 ], physical activity and healthy eating [ 114 , 115 , 116 ], obesity [ 111 ], tobacco use prevention [ 117 ], and alcohol misuse [ 118 ]. We also used a search [strategy] for intervention (implementation strategies) that had been employed in previous Cochrane Reviews [ 119 , 120 ], and which was originally developed based on common terms in implementation and dissemination research.” [ 121 ]

Many authors may also examine previously published search strategies to develop the search strategies for their review. Sometimes, authors adapt or reuse these searches for different systematic reviews [ 122 ]. When basing a new search strategy on a published search strategy, it is appropriate to cite the original publication(s) consulted.

Search strategies differ from filters (Item 10) because search strategies are often developed for a specific project and are not necessarily designed for repeated use. Filters, on the other hand, are developed with the express purpose of reuse. Filters are often objectively derived, tested, and validated, whereas most search strategies published as part of a systematic review or other evidence synthesis are “best guess,” relying on the expertise of the searcher and review team [ 107 ].

As in the example above, researchers may rely on multiple prior published searches to construct a new search for a novel review. Many times, researchers will use the same searches from a published systematic review to update the existing systematic review. In either case, it is helpful to the readers to understand whether major portions of a search are being adapted or reused.

Report any prior work consulted, adapted, or reused in the methods section. Include the copied and pasted search strategies used, including portions or the entirety of any prior work used or adapted for use, in the full search strategy (Item 8).

Item 12. Updates

Report the methods used to update the search(es) (e.g., rerunning searches, email alerts).

“Ovid Auto Alerts were set up to provide weekly updates of new literature until July 09, 2012.” [ 123 ]
“ Two consecutive searches were conducted and limited by publication type and by date, first from January 1, 1990, to November 30, 2012, and again from December 1, 2012, to July 31, 2015, in an updated search…. The original search strategy was used to model the updated search from December 1, 2012, to July 31, 2015. The updated search strategy was consistent with the original search; however, changes were required in the ERIC database search because of a change in the ERIC search algorithm. Excluded publication types were identical to the initial search. To improve specificity, the updated search was limited to human participants.” [ 99 ]

The literature search is usually conducted at the initial stage of producing a systematic review. As a consequence, the results of a search may be outdated before the review is published [ 124 , 125 , 126 ]. The last search in a review should ideally be conducted less than 6 months before publication [ 90 , 92 , 125 ]. For this reason, authors often update searches by rerunning (Table 2 ) the same search(es) or otherwise updating searches before the planned publication date. Updating searches differs from updating a systematic review, i.e., when the same or different authors or groups decide to redo a published systematic review to bring its findings up to date. If authors are updating a published systematic review, whether authored by the same review team or another, Item 11 contains relevant guidance.

When reporting search updates, the extent of reporting depends on methods used and any changes that were made while updating the searches. If there are no changes in information sources and/or search syntax (Table 2 ), it is sufficient to indicate the date the last search was run in the methods section and in the supplementary materials. If there are any changes in information sources and/or search syntax, the changes should be indicated (e.g., different set of databases, changes in search syntax, date restrictions) in the methods section. Authors should explain why these changes were made. When there were changes in the search strategy syntax, the original and the updated searches should both be reported as described in Item 8.

If authors use email alerts or other methods to update searches, these methods can be briefly described by indicating the method used, the frequency of any updates, the name of the database(s) used, or other relevant information that will help readers understand how the authors conducted search updates. If deduplication methods are used as part of the search update process, these methods can be described using guidance in Item 16.

Report the methods used to update the searches in the methods section and the supplementary materials, as described above.

Item 13. Dates of searches

For each search strategy, provide the date when the last search occurred.

“A comprehensive literature search was initially run on 26 February 2017 and then rerun on 5 February 2018….” [ 127 ]

Most literature databases are regularly updated with new citations as articles are published. Citations already in the database may also be updated once new information (such as indexing terms or citing articles) is available. As an example, MEDLINE added over 900,000 indexed citations (Table 2 ) in fiscal year 2018 [ 41 ]. In addition, the information gathered by databases (such as author affiliations in MEDLINE) can change over time. Because new citations are regularly being added, systematic review guidelines recommend updating searches throughout the writing process to ensure that all relevant articles are retrieved [ 6 , 92 ].

It is necessary for authors to document the date when searches were executed, either the date the initial search was conducted, if only searched once, or the most recent date the search was rerun. This allows readers to evaluate the currency of each search and understand what literature the search could have potentially identified [ 125 ]. In addition, it supports reproducibility and updating by allowing other researchers to use date limits to view the same “slice” of the database that the original authors used or to update a systematic review by searching from the last time point searched.
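Mechanically, rerunning a search from the last time point often means combining the original strategy with an entry-date restriction so only records added since the last search are retrieved. The sketch below uses a PubMed-style date range as an assumed example of such syntax; the exact field tag and format vary by database, so check the database's own documentation before use.

```python
# Hypothetical sketch: restrict an update search to records added to
# the database since the last search date. The PubMed-style
# [Date - Entry] range shown is an assumption about syntax; other
# databases use different field tags and formats.

def update_search(original_strategy, last_search_date):
    """Append an entry-date restriction so the update retrieves only
    records added on or after the last search date."""
    date_filter = f'("{last_search_date}"[Date - Entry] : "3000"[Date - Entry])'
    return f"({original_strategy}) AND {date_filter}"

print(update_search('"heart failure"[tiab]', "2017/02/26"))
```

Documenting the update as the original strategy plus the added date restriction, as in Item 12, lets readers see exactly which “slice” of the database each run covered.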

Report the date of the last search of the primary information sources used in the abstract for optimal clarity for readers [ 128 ]. Report the time frame during which searches were conducted, the initial search date(s), and/or the last update search date(s) in the methods section. Report the initial and/or last update search date with each complete search strategy in the supplementary materials, as in the examples for Item 8.

Item 14. Peer review

Describe any search peer review process.

“The strategies were peer reviewed by another senior information specialist prior to execution using the PRESS Checklist [ 1 ].” [ 129 ]

Peer reviewing search strategies is an increasingly valued component of search strategy development for systematic reviews. Expert guidance recommends taking this step to help increase the robustness of the search strategy [ 6 , 74 ]. Peer reviewing (Table 2 ) searches helps guide and improve electronic search strategies. One of peer review’s main benefits is the reduction of errors [ 23 , 130 ]. Peer review may also increase the number of relevant records found for inclusion in reviews, thus improving the overall quality of the systematic review [ 131 ].

Authors should consider using the Peer Review of Electronic Search Strategies (PRESS) Guideline statement, a practice guideline for literature search peer review that outlines the major components to review and the benefits of peer reviewing searches [ 1 ]. Authors should strongly consider having the search strategy peer reviewed by an experienced searcher, information specialist, or librarian [ 1 , 131 ]. Though broader peer review may occur, for example, with publication of a protocol, this item is designed to document peer review specific to the search.

Describe the use of peer review in the methods section.

Item 15. Total records

Document the total number of records identified from each database and other information sources.

Methods section . “A total of 3251 citations were retrieved from the six databases and four grey literature websites.” [ 133 ]
Flow diagram . “Figure 1. PRISMA 2009 flow diagram” [ 132 ]

Recording the flow of citations through the systematic review process is a key component of the PRISMA Statement [ 9 , 35 ]. It is helpful to identify how many records (Table 2 ) were identified within each database and additional source. Readers can use this information to see, for example, whether databases or expert contacts constituted the majority of the records reviewed. Knowing the number of records from each source also helps with reproducibility. If a reader tries to duplicate a search from a systematic review, they would expect to retrieve nearly the same results when limiting to the time frame of the original review. If the searcher instead retrieves a drastically different number of results than reported in the original review, this can indicate errors in the published search [ 23 ] or major changes to a database, both of which might be reasons to update the systematic review or to view its results with skepticism.

Report the total number of references retrieved from all sources, including updates, in the results section. Report the total number of references from each database and information source in the supplementary materials. If space permits, report the total number of references from each database in the PRISMA flow diagram [ 35 ].
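Tallying per-source counts for the flow diagram is a simple aggregation, sketched below. The source names and counts are invented for illustration.

```python
# Illustrative sketch: tally records identified per information source
# and the overall total, as reported in a PRISMA flow diagram. The
# source names and counts are made up.

def record_counts(counts_by_source):
    """Return one line per source plus a total line."""
    total = sum(counts_by_source.values())
    lines = [f"{src}: {n}" for src, n in counts_by_source.items()]
    lines.append(f"Total records identified: {total}")
    return lines

counts = {"MEDLINE (Ovid)": 1200, "Embase (Elsevier)": 950,
          "ClinicalTrials.gov": 101, "Hand searching": 7}
for line in record_counts(counts):
    print(line)
```

Keeping these counts with each logged search (rather than only the grand total) is what allows a later reader to compare a rerun of one database against the original report.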

Item 16. Deduplication

Describe the processes and any software used to deduplicate records from multiple database searches and other information sources.

“Duplicates were removed by the librarians (LP, PJE), using EndNote's duplicate identification strategy and then manually.” [ 134 ]

Databases contain significant overlap in content. When searching in multiple databases and additional information sources, as is necessary for a systematic review, authors often employ a variety of techniques to reduce the number of duplicates within their results prior to screening [ 135 , 136 , 137 , 138 ]. Techniques vary in their efficacy, sensitivity, and specificity (Table 2 ) [ 136 , 138 ]. Knowing which method is used enables readers to evaluate the process and understand to what extent these techniques may have removed false positive duplicates [ 138 ]. Authors should describe and cite any software or technique used, when applicable. If duplicates were removed manually, authors should include a description.
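The core idea behind most deduplication techniques can be shown in a few lines: exact matching on a normalized identifier (here a DOI) first, then on a normalized title for records lacking one. This is a deliberately simplified sketch with invented records; reference-management software applies much fuzzier matching across more fields.

```python
# Minimal sketch of deduplicating records merged from multiple database
# exports: exact match on lowercased DOI first, then on a normalized
# title when no DOI is present. Records are invented; real tools use
# fuzzier, multi-field matching.

def normalize(text):
    """Lowercase and strip non-alphanumerics for crude title matching."""
    return "".join(ch for ch in (text or "").lower() if ch.isalnum())

def deduplicate(records):
    seen_dois, seen_titles, unique = set(), set(), []
    for rec in records:
        doi = (rec.get("doi") or "").lower().strip()
        title = normalize(rec.get("title"))
        if doi and doi in seen_dois:
            continue          # same DOI already kept
        if not doi and title in seen_titles:
            continue          # no DOI, but title already kept
        if doi:
            seen_dois.add(doi)
        seen_titles.add(title)
        unique.append(rec)
    return unique

records = [
    {"doi": "10.1000/x", "title": "Heart Failure Outcomes"},
    {"doi": "10.1000/X", "title": "Heart failure outcomes."},  # same DOI
    {"doi": "", "title": "Heart Failure Outcomes"},            # same title, no DOI
    {"doi": "10.1000/y", "title": "A Different Study"},
]
print(len(deduplicate(records)))  # → 2
```

Whatever method is used, the choice of matching fields and strictness determines how many false-positive duplicates are removed, which is why Item 16 asks authors to name the software or technique.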

Report any deduplication method used in the methods section. The total number of references after deduplication should be reported in the PRISMA flow diagram [ 35 ].

Part 5. Discussion and conclusions

The PRISMA-S extension is designed to be used in conjunction with PRISMA 2020 [ 35 ] and PRISMA extensions including PRISMA-P for protocols [ 36 ], PRISMA-ScR for scoping reviews [ 139 ], the PRISMA Network Meta-analyses statement [ 140 ], and PRISMA-IPD for systematic reviews using individual patient data [ 141 ]. It may also be used with other reporting guidelines that relate to systematic reviews and related review types, such as RepOrting standards for Systematic Evidence Syntheses (ROSES) [ 142 ]. It provides additional guidance for systematic review teams, information specialists, librarians, and other researchers whose work contains a literature search as a component of the research methods. Though its origins are in the biomedical fields, PRISMA-S is flexible enough to be applied in all disciplines that use method-driven literature searching. Ultimately, PRISMA-S attempts to give systematic review teams a framework that helps ensure transparency and maximum reproducibility of the search component of their review.

PRISMA-S is intended to capture and provide specific guidance for reporting the most common methods used in systematic reviews today. As new methods and information sources are adopted, authors may need to adjust their reporting methods to accommodate new processes. Currently, PRISMA-S does not address using text mining or text analysis methods to create the search, for example, though this is an increasingly common way for information specialists to develop robust and objective search strategies [ 143 , 144 , 145 ]. Likewise, PRISMA-S does not require that decisions about the rationale behind choices in search terms and search construction be recorded, though this provides readers a great deal of insight. In the future, methods and rationales used to create search strategies may become more important for reproducibility.

PRISMA-S offers extensive guidance for many different types of information source and methods, many of them not described in detail in other reporting guidelines relating to literature searching. This includes detailed information on reporting study registry searches, web searches, multi-database searches, and updates. PRISMA-S can help authors report all components of their search, hopefully making the reporting process easier. As a note, PRISMA-S provides guidance on transparent reporting to authors and is not intended as a tool to either guide conduct of a systematic review or to evaluate the quality of a search or a systematic review.

The PRISMA-S checklist is available for download in Word and PDF formats from the PRISMA Statement web site [ 37 ]. The checklist should be used together with its Explanation & Elaboration documentation to provide authors with guidance for the complexities of different types of information sources and methods.

We intend to work with systematic review and information specialist organizations to broadly disseminate PRISMA-S and encourage its adoption by journals. In addition, we plan to host a series of webinars discussing how to use PRISMA-S most effectively. These webinars will also be available for later viewing and will serve as a community resource.

We hope that journal editors will recommend authors of systematic reviews and other related reviews to use PRISMA-S and submit a PRISMA-S checklist with their manuscripts. We also hope that journal editors will encourage more stringent peer review of systematic review searches to ensure greater transparency and reproducibility within the review literature.

Availability of data and materials

All data is available via the PRISMA-S PRISMA Search Reporting Extension OSF site ( https://doi.org/10.17605/OSF.IO/YGN9W ) [ 32 ]. This includes all data relating to item development, survey instruments, data from the Delphi surveys, and consent documents.

Abbreviations

DOI: Digital object identifier

PRESS: Peer Review of Electronic Search Strategies

PRISMA: Preferred Reporting Items for Systematic reviews and Meta-Analyses

PRISMA-IPD: PRISMA for individual patient data

PRISMA-P: PRISMA for systematic review protocols

PRISMA-ScR: PRISMA for scoping reviews

ROSES: RepOrting standards for Systematic Evidence Syntheses

McGowan J, Sampson M, Salzwedel DM, Cogo E, Foerster V, Lefebvre C. PRESS Peer Review of Electronic Search Strategies: 2015 guideline statement. J Clin Epidemiol. 2016;75:40–6.


Lefebvre C, Glanville J, Briscoe S, et al. Searching for and selecting studies. In: Higgins J, Thomas J, Chandler J, et al, eds. Cochrane Handbook for Systematic Reviews of Interventions: version 6.0. 2019. https://training.cochrane.org/handbook/current/chapter-04 .

Centre for Reviews and Dissemination. Systematic reviews: CRD’s guidance for undertaking reviews in health care. 1.3 Undertaking the review. 2009; https://www.york.ac.uk/media/crd/Systematic_Reviews.pdf . Accessed 31 Jan, 2020.

Lefebvre C, Glanville J, Briscoe S, Littlewood A, Marshall C, Metzendorf MI. Technical supplement to chapter 4: searching for and selecting studies. In: Higgins JPT, Thomas J, Chandler J, Cumpston M, Li T, Page MJ, editors. Cochrane Handbook for Systematic Reviews of Interventions: version 6.0. 2019. https://training.cochrane.org/handbook .

Relevo R, Balshem H. Finding evidence for comparing medical interventions: AHRQ and the Effective Health Care Program. J Clin Epidemiol. 2011;64(11):1168–77.

Institute of Medicine. Finding What Works in Health Care : Standards for Systematic Reviews. Washington, D.C.: National Academies Press; 2011. https://doi.org/10.17226/13059 .

European Network for Health Technology Assessment. Process of information retrieval for systematic reviews and health technology assessments on clinical effectiveness: guideline; version 2.0. 2019; https://eunethta.eu/wp-content/uploads/2020/01/EUnetHTA_Guideline_Information_Retrieval_v2-0.pdf . Accessed 31 Jan, 2020.

Sampson M, McGowan J, Tetzlaff J, Cogo E, Moher D. No consensus exists on search reporting methods for systematic reviews. J Clin Epidemiol. 2008;61(8):748–54.

Liberati A, Altman DG, Tetzlaff J, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. PLoS Med. 2009;6(7):e1000100.

Page MJ, Moher D. Evaluations of the uptake and impact of the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) Statement and extensions: a scoping review. Syst Rev. 2017;6(1):263.

Koffel JB, Rethlefsen ML. Reproducibility of search strategies is poor in systematic reviews published in high-impact pediatrics, cardiology and surgery journals: a cross-sectional study. PLoS One. 2016;11(9):e0163309.

Faggion CM Jr, Wu YC, Tu YK, Wasiak J. Quality of search strategies reported in systematic reviews published in stereotactic radiosurgery. Br J Radiol. 2016;89(1062):20150878.

Layton D. A critical review of search strategies used in recent systematic reviews published in selected prosthodontic and implant-related journals: are systematic reviews actually systematic? Int J Prosthodont. 2017;30(1):13–21.

Yaylali IE, Alacam T. Critical assessment of search strategies in systematic reviews in endodontics. J Endod. 2016;42(6):854–60.

Page MJ, Shamseer L, Altman DG, et al. Epidemiology and reporting characteristics of systematic reviews of biomedical research: a cross-sectional study. PLoS Med. 2016;13(5):e1002028.

Koffel JB. Use of recommended search strategies in systematic reviews and the impact of librarian involvement: a cross-sectional survey of recent authors. PLoS One. 2015;10(5):e0125931.

Rethlefsen ML, Farrell AM, Osterhaus Trzasko LC, Brigham TJ. Librarian co-authors correlated with higher quality reported search strategies in general internal medicine systematic reviews. J Clin Epidemiol. 2015;68(6):617–26.

Meert D, Torabi N, Costella J. Impact of librarians on reporting of the literature searching component of pediatric systematic reviews. J Med Libr Assoc. 2016;104(4):267–77.

Maggio LA, Tannery NH, Kanter SL. Reproducibility of literature search reporting in medical education reviews. Acad Med. 2011;86(8):1049–54.

Yoshii A, Plaut DA, McGraw KA, Anderson MJ, Wellik KE. Analysis of the reporting of search strategies in Cochrane systematic reviews. J Med Libr Assoc. 2009;97(1):21–9.

Franco JVA, Garrote VL, Escobar Liquitay CM, Vietto V. Identification of problems in search strategies in Cochrane Reviews. Res Synth Methods. 2018;9(3):408–16.

Salvador-Oliván JA, Marco-Cuenca G, Arquero-Avilés R. Errors in search strategies used in systematic reviews and their effects on information retrieval. J Med Libr Assoc. 2019;107(2):210–21.

Sampson M, McGowan J. Errors in search strategies were identified by type and frequency. J Clin Epidemiol. 2006;59(10):1057–63.

Mullins MM, DeLuca JB, Crepaz N, Lyles CM. Reporting quality of search methods in systematic reviews of HIV behavioral interventions (2000-2010): are the searches clearly explained, systematic and reproducible? Res Synth Methods. 2014;5(2):116–30.

Rader T, Mann M, Stansfield C, Cooper C, Sampson M. Methods for documenting systematic review searches: a discussion of common issues. Res Synth Methods. 2014;5(2):98–115.

Briscoe S. Web searching for systematic reviews: a case study of reporting standards in the UK Health Technology Assessment programme. BMC Res Notes. 2015;8:153.

Moher D, Stewart L, Shekelle P. All in the family: systematic reviews, rapid reviews, scoping reviews, realist reviews, and more. Syst Rev. 2015;4:183.

Grant MJ, Booth A. A typology of reviews: an analysis of 14 review types and associated methodologies. Health Info Libr J. 2009;26(2):91–108.

Rethlefsen ML, Koffel JB, Kirtley S. PRISMA-Search (PRISMA-S) extension to PRISMA development protocol. 2016; https://www.equator-network.org/wp-content/uploads/2009/02/Protocol-PRISMA-S-Delphi.pdf . Accessed 16 Jan, 2020.

Moher D, Schulz KF, Simera I, Altman DG. Guidance for developers of health research reporting guidelines. PLoS Med. 2010;7(2):e1000217.

Hsu C, Sandford BA. The Delphi technique: making sense of consensus. Pract Assess Res Eval. 2007;12:10. https://doi.org/10.7275/pdz9-th90 .

Rethlefsen ML, Koffel JB, Kirtley S, Ayala AP, Waffenschmidt S. PRISMA-S: PRISMA Search Reporting Extension. 2019; https://doi.org/10.17605/OSF.IO/YGN9W . Accessed 5 Feb, 2020.

Hypothesis. 2020; https://web.hypothes.is/ . Accessed 3 Jan, 2020.

Page MJ, McKenzie JE, Bossuyt PM, et al. Updating the PRISMA reporting guideline for systematic reviews and meta-analyses: study protocol. 2018; http://osf.io/2v7mk . Accessed 13 Feb, 2020.

Page MJ, McKenzie JE, Bossuyt PM, et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. MetaArXiv Preprints 2020; https://doi.org/10.31222/osf.io/v7gm2 . Accessed 25 Oct 2020.

Shamseer L, Moher D, Clarke M, et al. Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015: elaboration and explanation. BMJ. 2015;350:g7647.

PRISMA: Transparent reporting of systematic reviews and meta-analyses. 2015; http://prisma-statement.org/ . Accessed 13 Feb, 2020.

Schneider KJ, Leddy JJ, Guskiewicz KM, et al. Rest and treatment/rehabilitation following sport-related concussion: a systematic review. Br J Sports Med. 2017;51(12):930–4.

Lefebvre C, Manheimer E, Glanville J. Searching for studies. 2011; http://handbook.cochrane.org/chapter_6/6_searching_for_studies.htm . Accessed 25 Nov, 2014.

Bramer WM, Rethlefsen ML, Kleijnen J, Franco OH. Optimal database combinations for literature searches in systematic reviews: a prospective exploratory study. Syst Rev. 2017;6(1):245.

National Library of Medicine. MEDLINE®: description of the database 2019; https://www.nlm.nih.gov/bsd/medline.html . Accessed 1 Feb, 2020.

Embase content. 2019; https://www.elsevier.com/solutions/embase-biomedical-research/embase-coverage-and-content . Accessed 28 February, 2019.

Scopus. 2020; http://www.scopus.com . Accessed 1 Feb, 2020.

PsycINFO. 2020; https://www.apa.org/pubs/databases/psycinfo . Accessed 1 Feb, 2020.

EconLit. 2020; https://www.aeaweb.org/econlit/ . Accessed 1 Feb, 2020.

BIREME - PAHO - WHO Latin American and Caribbean Center on Health Sciences Information. LILACS. 2020; http://lilacs.bvsalud.org/en/ . Accessed 1 Feb, 2020.

World Health Organization. African Index Medicus. 2020; http://indexmedicus.afro.who.int/ . Accessed 1 Feb, 2020.

Craven J, Jefferies J, Kendrick J, Nicholls D, Boynton J, Frankish R. A comparison of searching the Cochrane library databases via CRD, Ovid and Wiley: implications for systematic searching and information services. Health Info Libr J. 2014;31(1):54–63.

Bramer WM, Giustini D, Kleijnen J, Franco OH. Searching Embase and MEDLINE by using only major descriptors or title and abstract fields: a prospective exploratory study. Syst Rev. 2018;7(1):200.

Younger P, Boddy K. When is a search not a search? A comparison of searching the AMED complementary health database via EBSCOhost, OVID and DIALOG. Health Info Libr J. 2009;26(2):126–35.

Fraser C, Murray A, Burr J. Identifying observational studies of surgical interventions in MEDLINE and EMBASE. BMC Med Res Methodol. 2006;6:41.

De Cagna F, Fusar-Poli L, Damiani S, et al. The role of intranasal oxytocin in anxiety and depressive disorders: a systematic review of randomized controlled trials. Clin Psychopharmacol Neurosci. 2019;17(1):1–11.

Rutjes AW, Juni P, da Costa BR, Trelle S, Nuesch E, Reichenbach S. Viscosupplementation for osteoarthritis of the knee: a systematic review and meta-analysis. Ann Intern Med. 2012;157(3):180–91.

Potthast R, Vervolgyi V, McGauran N, Kerekes MF, Wieseler B, Kaiser T. Impact of inclusion of industry trial results registries as an information source for systematic reviews. PLoS One. 2014;9(4):e92067.

Turner EH, Matthews AM, Linardatos E, Tell RA, Rosenthal R. Selective publication of antidepressant trials and its influence on apparent efficacy. N Engl J Med. 2008;358(3):252–60.

Baudard M, Yavchitz A, Ravaud P, Perrodeau E, Boutron I. Impact of searching clinical trial registries in systematic reviews of pharmaceutical treatments: methodological systematic review and reanalysis of meta-analyses. BMJ. 2017;356:j448.

Zarin DA, Tse T, Williams RJ, Carr S. Trial reporting in ClinicalTrials.gov - the final rule. N Engl J Med. 2016;375(20):1998–2004.

Anderson ML, Chiswell K, Peterson ED, Tasneem A, Topping J, Califf RM. Compliance with results reporting at ClinicalTrials.gov. N Engl J Med. 2015;372(11):1031–9.

DeVito NJ, Bacon S, Goldacre B. Compliance with legal requirement to report clinical trial results on ClinicalTrials.gov: a cohort study. Lancet. 2020;395(10221):361–9.

Stockdale J, Cassell J, Ford E. “Giving something back”: a systematic review and ethical enquiry into public views on the use of patient data for research in the United Kingdom and the Republic of Ireland. Wellcome Open Res. 2018;3:6.

Mascarenhas M, Garasia S, Berthiaume P, et al. A scoping review of published literature on chikungunya virus. PLoS One. 2018;13(11):e0207554.

Gates M, Wingert A, Featherstone R, Samuels C, Simon C, Dyson MP. Impact of fatigue and insufficient sleep on physician and patient outcomes: a systematic review. BMJ Open. 2018;8(9):e021967.

Song F, Parekh-Bhurke S, Hooper L, et al. Extent of publication bias in different categories of research cohorts: a meta-analysis of empirical studies. BMC Med Res Methodol. 2009;9:79.

Egger M, Zellweger-Zahner T, Schneider M, Junker C, Lengeler C, Antes G. Language bias in randomised controlled trials published in English and German. Lancet. 1997;350(9074):326–9.

Stansfield C, Dickson K, Bangpan M. Exploring issues in the conduct of website searching and other online sources for systematic reviews: how can we be systematic? Syst Rev. 2016;5(1):191.

Farrah K, Mierzwinski-Urban M. Almost half of references in reports on new and emerging nondrug health technologies are grey literature. J Med Libr Assoc. 2019;107(1):43–8.

Piasecki J, Waligora M, Dranseika V. Google search as an additional source in systematic reviews. Sci Eng Ethics. 2018;24(2):809–10.

Dax the duck. Measuring the “filter bubble”: how Google is influencing what you click. DuckDuckGo Blog 2018; https://spreadprivacy.com/google-filter-bubble-study/ .

Scherer RW, Meerpohl JJ, Pfeifer N, Schmucker C, Schwarzer G, von Elm E. Full publication of results initially presented in abstracts. Cochrane Database Syst Rev. 2018;11:MR000005.

McAuley L, Pham B, Tugwell P, Moher D. Does the inclusion of grey literature influence estimates of intervention effectiveness reported in meta-analyses? Lancet. 2000;356(9237):1228–31.

ProceedingsFirst. 2018; https://help.oclc.org/Discovery_and_Reference/FirstSearch/FirstSearch_databases/ProceedingsFirst . Accessed 28 February, 2019.

Directory of Published Papers. InterDok Media Services LLC; 2019. http://www.interdok.com/search_paper.php .

Abstract Archive. International AIDS Society; 2019. http://www.abstract-archive.org/ .

Foster MJ, Jewell ST. Assembling the pieces of a systematic review: guide for librarians. Lanham, MD: Rowman & Littlefield; 2017.

Stephens RJ, Dettmer MR, Roberts BW, et al. Practice patterns and outcomes associated with early sedation depth in mechanically ventilated patients: a systematic review and meta-analysis. Crit Care Med. 2018;46(3):471–9.

Kunneman M, Gionfriddo MR, Toloza FJK, et al. Humanistic communication in the evaluation of shared decision making: a systematic review. Patient Educ Couns. 2018;102(3):452–66.

Spurling GK, Del Mar CB, Dooley L, Foxlee R, Farley R. Delayed antibiotic prescriptions for respiratory infections. Cochrane Database Syst Rev. 2017;9:CD004417.

Wright K, Golder S, Rodriguez-Lopez R. Citation searching: a systematic review case study of multiple risk behaviour interventions. BMC Med Res Methodol. 2014;14:73.

Zarychanski R, Turgeon AF, McIntyre L, Fergusson DA. Erythropoietin-receptor agonists in critically ill patients: a meta-analysis of randomized controlled trials. CMAJ. 2007;177(7):725–34.

Degenhardt L, Peacock A, Colledge S, et al. Global prevalence of injecting drug use and sociodemographic characteristics and prevalence of HIV, HBV, and HCV in people who inject drugs: a multistage systematic review. Lancet Glob Health. 2017;5(12):e1192–207.

Kirkham JJ, Dwan KM, Altman DG, et al. The impact of outcome reporting bias in randomised controlled trials on a cohort of systematic reviews. BMJ. 2010;340:c365.

Hodkinson A, Dietz KC, Lefebvre C, et al. The use of clinical study reports to enhance the quality of systematic reviews: a survey of systematic review authors. Syst Rev. 2018;7(1):117.

Musini VM, Lawrence KA, Fortin PM, Bassett K, Wright JM. Blood pressure lowering efficacy of renin inhibitors for primary hypertension. Cochrane Database Syst Rev. 2017;4:CD007066.

Samarasekera N, Smith C, Al-Shahi SR. The association between cerebral amyloid angiopathy and intracerebral haemorrhage: systematic review and meta-analysis. J Neurol Neurosurg Psychiatry. 2012;83(3):275–81.

Tham T, Machado R, Khaymovich J, Costantino P. Detection of HPV16, HPV18, p16, and E6/E7 mRNA in nasopharyngeal cancer: a systematic review and meta-analysis. bioRxiv 2018:401554. https://www.biorxiv.org/content/biorxiv/early/2018/08/27/401554.full.pdf .

Lin J, Wilbur WJ. PubMed related articles: a probabilistic topic-based model for content similarity. BMC Bioinformatics. 2007;8:423.

Clarivate Analytics. Related records. Web of Science Core Collection Help 2018; https://images.webofknowledge.com/images/help/WOS/hp_related_records.html . Accessed 1 Feb, 2020.

Bennett KG, Berlin NL, MacEachern MP, Buchman SR, Preminger BA, Vercler CJ. The ethical and professional use of social media in surgery: a systematic review of the literature. Plast Reconstr Surg. 2018;142(3):388e–98e.

Varley-Campbell J, Cooper C, Wilkerson D, Wardle S, Greeves J, Lorenc T. Sex-specific changes in physical performance following military training: a systematic review. Sports Med. 2018;48(11):2623–40.

Chandler J, Churchill R, Higgins J, Lasserson T, Tovey D. Methodological standards for the conduct of new Cochrane Intervention Reviews: version 2.1. 2011; https://sti.cochrane.org/sites/sti.cochrane.org/files/public/uploads/Methodological%20standards%20for%20the%20conduct%20of%20Cochrane%20Intervention%20Reviews.PDF . Accessed 1 Feb, 2020.

CADTH. Grey Matters: a practical tool for searching health-related grey literature. 2019; https://www.cadth.ca/resources/finding-evidence/grey-matters . Accessed 1 Feb, 2020.

Higgins J, Lasserson T, Chandler J, Tovey D, Churchill R. Methodological Expectations of Cochrane Intervention Reviews. 2019; https://community.cochrane.org/mecir-manual . Accessed 3 Jan 2020.

Kim Y. Fostering scientists’ data sharing behaviors via data repositories, journal supplements, and personal communication methods. Inform Process Manag. 2017;53(4):871–85.

National Center for Biotechnology Information. PubMed Central. 2020; https://www.ncbi.nlm.nih.gov/pmc/ . Accessed 9 Jan 2020.

Thomas LH, Coupe J, Cross LD, Tan AL, Watkins CL. Interventions for treating urinary incontinence after stroke in adults. Cochrane Database Syst Rev. 2019;2:CD004462.

Speer K, Upton D, Semple S, McKune A. Systemic low-grade inflammation in post-traumatic stress disorder: a systematic review. J Inflamm Res. 2018;11:111–21.

Rudmik L, Soler ZM. Medical therapies for adult chronic sinusitis: a systematic review. JAMA. 2015;314(9):926–39.

Joseph MS, Tincopa MA, Walden P, Jackson E, Conte ML, Rubenfire M. The impact of structured exercise programs on metabolic syndrome and its components: a systematic review. Diabetes Metab Syndr Obes. 2019;12:2395–404.

Lumba-Brown A, Yeates KO, Sarmiento K, et al. Diagnosis and management of mild traumatic brain injury in children: a systematic review. JAMA Pediatr. 2018;172(11):e182847.

Kugley S, Wade A, Thomas J, et al. Searching for studies: a guide to information retrieval for Campbell systematic reviews. Campbell Syst Rev. 2017;13(1):1–73.

Iansavichene AE, Sampson M, McGowan J, Ajiferuke IS. Should systematic reviewers search for randomized, controlled trials published as letters? Ann Intern Med. 2008;148(9):714–5.

Cooper C, Booth A, Varley-Campbell J, Britten N, Garside R. Defining the process to literature searching in systematic reviews: a literature review of guidance and supporting studies. BMC Med Res Methodol. 2018;18(1):85.

Craven J, Levay P. Recording database searches for systematic reviews - what is the value of adding a narrative to peer-review checklists? A case study of NICE interventional procedures guidance. Evid Based Libr Inf Pract. 2011;6(4):72–87.

Scottish Intercollegiate Guidelines Network (SIGN). Search filters. 2011; https://www.sign.ac.uk/search-filters.html .

Karagiannis T, Paschos P, Paletas K, Matthews DR, Tsapas A. Dipeptidyl peptidase-4 inhibitors for treatment of type 2 diabetes mellitus in the clinical setting: systematic review and meta-analysis. BMJ. 2012;344:e1369.

Lefebvre C, Glanville J, Beale S, et al. Assessing the performance of methodological search filters to improve the efficiency of evidence information retrieval: five literature reviews and a qualitative study. Health Technol Assess. 2017;21(69):1–148.

Damarell RA, May N, Hammond S, Sladek RM, Tieman JJ. Topic search filters: a systematic scoping review. Health Info Libr J. 2019;36(1):4–40.

McKibbon KA, Wilczynski NL, Haynes RB, Hedges T. Retrieving randomized controlled trials from medline: a comparison of 38 published search filters. Health Info Libr J. 2009;26(3):187–202.

InterTASC Information Specialists’ Sub-Group. The InterTASC Information Specialists’ Sub-Group Search Filter Resource. 2020; https://sites.google.com/a/york.ac.uk/issg-search-filters-resource . Accessed 1 Feb, 2020.

Health Information Research Unit. Search filters for MEDLINE in Ovid syntax and the PubMed translation. 2016; http://hiru.mcmaster.ca/hiru/HIRU_Hedges_MEDLINE_Strategies.aspx . Accessed 1 Feb, 2020.

Waters E, de Silva-Sanigorski A, Hall BJ, et al. Interventions for preventing obesity in children. Cochrane Database Syst Rev. 2011;(12):CD00187.

Cahill K, Lancaster T. Workplace interventions for smoking cessation. Cochrane Database Syst Rev. 2014;(2):CD003440.

Freak-Poli RL, Cumpston M, Peeters A, Clemes SA. Workplace pedometer interventions for increasing physical activity. Cochrane Database Syst Rev. 2013;(4):CD009209.

Dobbins M, Husson H, DeCorby K, LaRocca RL. School-based physical activity programs for promoting physical activity and fitness in children and adolescents aged 6 to 18. Cochrane Database Syst Rev. 2013;(2):CD007651.

Guerra PH, Nobre MR, da Silveira JA, Taddei JA. School-based physical activity and nutritional education interventions on body mass index: a meta-analysis of randomised community trials - project PANE. Prev Med. 2014;61:81–9.

Jaime PC, Lock K. Do school based food and nutrition policies improve diet and reduce obesity? Prev Med. 2009;48(1):45–53.

Thomas RE, McLellan J, Perera R. School-based programmes for preventing smoking. Cochrane Database Syst Rev. 2013;(4):CD001293.

Foxcroft D, Ireland D, Lowe G, Breen R. Primary prevention for alcohol misuse in young people. Cochrane Database Syst Rev. 2011;(9):CD003024.

Wolfenden L, Jones J, Williams CM, et al. Strategies to improve the implementation of healthy eating, physical activity and obesity prevention policies, practices or programmes within childcare services. Cochrane Database Syst Rev. 2016;10:CD011779.

Wolfenden L, Nathan NK, Sutherland R, et al. Strategies for enhancing the implementation of school-based policies or practices targeting risk factors for chronic disease. Cochrane Database Syst Rev. 2017;11:CD011677.

Wolfenden L, Goldman S, Stacey FG, et al. Strategies to improve the implementation of workplace-based policies or practices targeting tobacco, alcohol, diet, physical activity and obesity. Cochrane Database Syst Rev. 2018;11:CD012439.

White CM, Ip S, McPheeters M, et al. Using existing systematic reviews to replace de novo processes in conducting comparative effectiveness reviews. In: Methods Guide for Effectiveness and Comparative Effectiveness Reviews. Rockville (MD); 2008.

Lopez-Olivo MA, Tayar JH, Martinez-Lopez JA, et al. Risk of malignancies in patients with rheumatoid arthritis treated with biologic therapy: a meta-analysis. JAMA. 2012;308(9):898–908.

Sampson M, Shojania KG, Garritty C, Horsley T, Ocampo M, Moher D. Systematic reviews can be produced and published faster. J Clin Epidemiol. 2008;61(6):531–6.

Shojania KG, Sampson M, Ansari MT, Ji J, Doucette S, Moher D. How quickly do systematic reviews go out of date? A survival analysis. Ann Intern Med. 2007;147(4):224–33.

Shojania KG, Sampson M, Ansari MT, et al. Updating systematic reviews: AHRQ publication no 07-0087. AHRQ Technical Reviews 2007; 16: http://www.ncbi.nlm.nih.gov/books/NBK44099/pdf/TOC.pdf . Accessed 1 Feb, 2020.

Bhaskar V, Chan HL, MacEachern M, Kripfgans OD. Updates on ultrasound research in implant dentistry: a systematic review of potential clinical indications. Dentomaxillofac Radiol. 2018;47(6):20180076.

Beller EM, Chen JK, Wang UL, Glasziou PP. Are systematic reviews up-to-date at the time of publication? Syst Rev. 2013;2:36.

Velez MP, Hamel C, Hutton B, et al. Care plans for women pregnant using assisted reproductive technologies: a systematic review. Reprod Health. 2019;16(1):9.

Relevo R, Paynter R. Peer review of search strategies. AHRQ Methods for Effective Health Care 2012; https://www.ncbi.nlm.nih.gov/books/NBK98353/ .

Spry C, Mierzwinski-Urban M. The impact of the peer review of literature search strategies in support of rapid review reports. Res Synth Methods. 2018;9(4):521–6.

Banno M, Harada Y, Taniguchi M, et al. Exercise can improve sleep quality: a systematic review and meta-analysis. PeerJ. 2018;6:e5172.

Tsamalaidze L, Stauffer JA, Brigham T, Asbun HJ. Postsplenectomy thrombosis of splenic, mesenteric, and portal vein (PST-SMPv): a single institutional series, comprehensive systematic review of a literature and suggested classification. Am J Surg. 2018;216(6):1192–204.

Barakat S, Boehmer K, Abdelrahim M, et al. Does health coaching grow capacity in cancer survivors? A systematic review. Popul Health Manag. 2018;21(1):63–81.

Qi X, Yang M, Ren W, et al. Find duplicates among the PubMed, EMBASE, and Cochrane Library Databases in systematic review. PLoS One. 2013;8(8):e71838.

Bramer WM, Giustini D, de Jonge GB, Holland L, Bekhuis T. De-duplication of database search results for systematic reviews in EndNote. J Med Libr Assoc. 2016;104(3):240–3.

Rathbone J, Carter M, Hoffmann T, Glasziou P. Better duplicate detection for systematic reviewers: evaluation of Systematic Review Assistant-Deduplication Module. Syst Rev. 2015;4:6.

Kwon Y, Lemieux M, McTavish J, Wathen N. Identifying and removing duplicate records from systematic review searches. J Med Libr Assoc. 2015;103(4):184–8.

Tricco AC, Lillie E, Zarin W, et al. PRISMA extension for Scoping Reviews (PRISMA-ScR): checklist and explanation. Ann Intern Med. 2018;169(7):467–73.

Hutton B, Salanti G, Caldwell DM, et al. The PRISMA extension statement for reporting of systematic reviews incorporating network meta-analyses of health care interventions: checklist and explanations. Ann Intern Med. 2015;162(11):777–84.

Stewart LA, Clarke M, Rovers M, et al. Preferred Reporting Items for Systematic Review and Meta-Analyses of individual participant data: the PRISMA-IPD Statement. JAMA. 2015;313(16):1657–65.

Haddaway NR, Macura B, Whaley P, Pullin AS. ROSES RepOrting standards for Systematic Evidence Syntheses: pro forma, flow-diagram and descriptive summary of the plan and conduct of environmental systematic reviews and systematic maps. Environ Evid. 2018;7(1):7.

Stansfield C, O'Mara-Eves A, Thomas J. Text mining for search term development in systematic reviewing: a discussion of some methods and challenges. Res Synth Methods. 2017;8(3):355–65.

Hausner E, Guddat C, Hermanns T, Lampert U, Waffenschmidt S. Prospective comparison of search strategies for systematic reviews: an objective approach yielded higher sensitivity than a conceptual one. J Clin Epidemiol. 2016;77:118–24.

Paynter R, Banez LL, Berliner E, et al. EPC Methods: an exploration of the use of text-mining software in systematic reviews. 2016; https://www.ncbi.nlm.nih.gov/books/NBK362044/ . Accessed 3 Feb,2020.

Acknowledgements

We would like to thank all of the members of the PRISMA-S Group, which comprises participants in the Delphi process, consensus conference, or both. PRISMA-S Group members include Heather Blunt (Dartmouth College), Tara Brigham (Mayo Clinic in Florida), Steven Chang (La Trobe University), Justin Clark (Bond University), Aislinn Conway (BORN Ontario and CHEO Research Institute), Rachel Couban (McMaster University), Shelley de Kock (Kleijnen Systematic Reviews Ltd), Kelly Farrah (Canadian Agency for Drugs and Technologies in Health (CADTH)), Paul Fehrmann (Kent State University), Margaret Foster (Texas A & M University), Susan A. Fowler (Washington University in St. Louis), Julie Glanville (University of York), Elizabeth Harris (La Trobe University), Lilian Hoffecker (University of Colorado Denver), Jaana Isojarvi (Tampere University), David Kaunelis (Canadian Agency for Drugs and Technologies in Health (CADTH)), Hans Ket (VU Amsterdam), Paul Levay (National Institute for Health and Care Excellence (NICE)), Jennifer Lyon, Jessie McGowan (uOttawa), M. Hassan Murad (Mayo Clinic), Joey Nicholson (NYU Langone Health), Virginia Pannabecker (Virginia Tech), Robin Paynter (VA Portland Health Care System), Rachel Pinotti (Icahn School of Medicine at Mount Sinai), Amanda Ross-White (Queens University), Margaret Sampson (CHEO), Tracy Shields (Naval Medical Center Portsmouth), Adrienne Stevens (Ottawa Hospital Research Institute), Anthea Sutton (University of Sheffield), Elizabeth Weinfurter (University of Minnesota), Kath Wright (University of York), and Sarah Young (Carnegie Mellon University). We would also like to thank Kate Nyhan (Yale University), Katharina Gronostay (IQWiG), the many others who contributed to the PRISMA-S project anonymously or as draft reviewers, and our peer reviewers. We would like to give special thanks to the late Douglas G. Altman (D.G.A.; University of Oxford) for his support and guidance, and the co-chairs of the Medical Library Association’s Systematic Reviews SIG in 2016, Margaret Foster (Texas A & M University) and Susan Fowler (Washington University in St. Louis), for allowing us to use one of their meeting times for the consensus conference.

Melissa Rethlefsen was funded in part by the University of Utah’s Center for Clinical and Translational Science under the National Center for Advancing Translational Sciences of the National Institutes of Health Award Number UL1TR002538 in 2017–2018. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.

Shona Kirtley was funded by Cancer Research UK (grant C49297/A27294). The funder had no role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript. The views expressed are those of the authors and not necessarily those of Cancer Research UK.

Matthew Page is supported by an Australian Research Council Discovery Early Career Researcher Award (DE200101618).

David Moher is supported by a University Research Chair, University of Ottawa, Ottawa, Canada.

The consensus conference was sponsored by the Systematic Reviews SIG of the Medical Library Association. There was no specific funding associated with this event.

Author information

Authors and Affiliations

Health Science Center Libraries, George A. Smathers Libraries, University of Florida, Gainesville, USA

Melissa L. Rethlefsen

UK EQUATOR Centre, Centre for Statistics in Medicine (CSM), Nuffield Department of Orthopaedics, Rheumatology and Musculoskeletal Sciences (NDORMS), Botnar Research Centre, University of Oxford, Windmill Road, Oxford, OX3 7LD, UK

Shona Kirtley

Institute for Quality and Efficiency in Health Care, Cologne, Germany

Siw Waffenschmidt

Gerstein Science Information Centre, University of Toronto, Toronto, Canada

Ana Patricia Ayala

Centre for Journalology, Clinical Epidemiology Program, Ottawa Hospital Research Institute, The Ottawa Hospital, General Campus, Centre for Practice Changing Research Building, 501 Smyth Road, PO BOX 201B, Ottawa, Ontario, K1H 8L6, Canada

David Moher

School of Public Health and Preventive Medicine, Monash University, Melbourne, Australia

Matthew J. Page

University of Minnesota, Minneapolis, USA

Jonathan B. Koffel

Contributions

M.L.R. conceived and designed the study, conducted the thematic and quantitative analyses, curated the data, drafted the manuscript, and reviewed and edited the manuscript. M.L.R. is the guarantor. J.B.K. and S.K. contributed to the design of the study, developed the literature search strategies, contributed to the thematic content analyses, drafted a portion of the Explanation and Elaboration, and reviewed and edited the manuscript. J.B.K. developed the survey instrument. M.L.R., J.B.K., and S.K. hosted and organized the consensus conference. S.W. and A.P.A. contributed to the thematic content analysis, drafted a portion of the Explanation and Elaboration, and reviewed and edited the manuscript. S.W. supervised the draft revision documentation. D.M. helped conceive and design the study. M.J.P. provided substantive review and editing of the checklist, Explanation and Elaboration, and final manuscript. The author(s) read and approved the final manuscript.

Corresponding author

Correspondence to Melissa L. Rethlefsen .

Ethics declarations

Ethics approval and consent to participate.

This study was declared exempt by the University of Utah Institutional Review Board (IRB_00088425). Consent was received from all survey participants.

Consent for publication

Not applicable

Competing interests

The authors declare no competing interests. MJP and DM are leading the PRISMA 2020 update.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article.

Rethlefsen, M.L., Kirtley, S., Waffenschmidt, S. et al. PRISMA-S: an extension to the PRISMA Statement for Reporting Literature Searches in Systematic Reviews. Syst Rev 10 , 39 (2021). https://doi.org/10.1186/s13643-020-01542-z

Download citation

Received : 28 February 2020

Accepted : 23 November 2020

Published : 26 January 2021

DOI : https://doi.org/10.1186/s13643-020-01542-z


  • Systematic reviews
  • Reporting guidelines
  • Search strategies
  • Literature search
  • Information retrieval
  • Reproducibility

Systematic Reviews

ISSN: 2046-4053




The University of Texas MD Anderson Cancer Center Home


Literature Search Basics

Develop a search strategy.

  • A search strategy includes  a combination of keywords, subject headings, and limiters (language, date, publication type, etc.)
  • A search strategy should be planned out and practiced before executing the final search in a database.
  • A search strategy and search results should be documented throughout the searching process.

What is a search strategy?

A search strategy is an organized combination of keywords, phrases, subject headings, and limiters used to search a database.

Your search strategy will include:

  • keywords 
  • boolean operators
  • variations of search terms (synonyms, suffixes)
  • subject headings 

Your search strategy  may  include:

  • truncation (where applicable)
  • phrases (where applicable)
  • limiters (date, language, age, publication type, etc.)

A search strategy usually requires several iterations. You will need to test the strategy along the way to ensure that you are finding relevant articles. It's also a good idea to review your search strategy with your co-authors. They may have ideas about terms or concepts you may have missed.

Additionally, each database you search is developed differently. You will need to adjust your strategy for each database you search. For instance, Embase is a European database, so many of its medical terms differ slightly from those used in MEDLINE and PubMed.

Choose search terms

Start by writing down as many terms as you can think of that relate to your question. You might try  cited reference searching  to find a few good articles that you can review for relevant terms.

Remember that most terms or concepts can be expressed in different ways. A few things to consider:

  • synonyms: "cancer" may be referred to as "neoplasms", "tumors", or "malignancy"
  • abbreviations: spell out the word instead of abbreviating
  • generic vs. trade names of drugs

Search for the exact phrase

If you want words to appear next to each other in an exact phrase, use quotation marks, eg “self-esteem”.

Phrase searching decreases the number of results you get. Most databases allow you to search for phrases, but check the database guide if you are unsure.

Truncation and wildcards

Many databases use an asterisk (*) as their truncation symbol  to find various word endings like singulars and plurals.  Check the database help section if you are not sure which symbol to use. 

"Therap*"

retrieves: therapy, therapies, therapist or therapists.

Use a wildcard (?) to find different spellings like British and American spellings.

"Behavio?r" retrieves behaviour and behavior.
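For illustration only, the truncation and wildcard behaviour described above can be mimicked with regular expressions in Python. Real databases expand these symbols internally; the `to_regex` helper below is our own invention, not database syntax:

```python
import re

# Sketch: * matches any word ending, ? matches an optional single
# character, mirroring the "therap*" and "behavio?r" examples above.
def to_regex(term):
    pattern = re.escape(term).replace(r"\*", r"\w*").replace(r"\?", r"\w?")
    return re.compile(rf"^{pattern}$", re.IGNORECASE)

assert to_regex("therap*").match("therapies")    # therapy, therapist, ...
assert to_regex("behavio?r").match("behaviour")  # British spelling
assert to_regex("behavio?r").match("behavior")   # American spelling
```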

Searching with subject headings

Database subject headings are controlled vocabulary terms that a database uses to describe what an article is about.

Using appropriate subject headings enhances your search and will help you to find more results on your topic. This is because subject headings find articles according to their subject, even if the article does not use your chosen key words.

You should combine both subject headings and keywords in your search strategy for each of the concepts you identify. This is particularly important if you are undertaking a systematic review or an in-depth piece of work.

Subject headings may vary between databases, so you need to investigate each database separately to find the subject headings they use. For example, for MEDLINE you can use MeSH (Medical Subject Headings) and for Embase you can use the EMTREE thesaurus.

SEARCH TIP:  In Ovid databases, search for a known key paper by title, select the "complete reference" button to see which subject headings the database indexers have given that article, and consider adding relevant ones to your own search strategy.

Use Boolean logic to combine search terms


Boolean operators (AND, OR and NOT) allow you to try different combinations of search terms or subject headings.

Databases often show Boolean operators as buttons or drop-down menus that you can click to combine your search terms or results.

The main Boolean operators are:

OR is used to find articles that mention  either  of the topics you search for.

AND is used to find articles that mention  both  of the searched topics.

NOT excludes a search term or concept. It should be used with caution as you may inadvertently exclude relevant references.

For example, searching for “self-esteem NOT eating disorders” finds articles that mention self-esteem but removes any articles that mention eating disorders.
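The OR/AND/NOT logic above maps directly onto set operations. A toy Python sketch (the article IDs and topic tags are invented for illustration):

```python
# Each article ID maps to the topics it mentions (invented data).
articles = {
    1: {"self-esteem"},
    2: {"self-esteem", "eating disorders"},
    3: {"eating disorders"},
}

def search(term):
    """Return the IDs of articles mentioning the term."""
    return {i for i, tags in articles.items() if term in tags}

assert search("self-esteem") | search("eating disorders") == {1, 2, 3}  # OR
assert search("self-esteem") & search("eating disorders") == {2}        # AND
assert search("self-esteem") - search("eating disorders") == {1}        # NOT
```

Note how NOT drops article 2 even though it mentions self-esteem, which is exactly why the operator should be used with caution.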

Adjacency searching 

Use adjacency operators to search by phrase or with two or more words in relation to one another. Adjacency searching commands differ among databases. Check the database help section if you are not sure which searching commands to use.

In Ovid Medline

"breast ADJ3 cancer" finds the word breast within three words of cancer, in any order.

This includes breast cancer or cancer of the breast.
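A rough simulation of what ADJ3 does, sketched in Python (this is not how Ovid implements it; the `adj` helper is hypothetical):

```python
# Check whether two terms occur within n words of each other, in any
# order -- a crude approximation of Ovid's ADJn operator.
def adj(text, a, b, n=3):
    words = text.lower().split()
    pos_a = [i for i, w in enumerate(words) if w.startswith(a)]
    pos_b = [i for i, w in enumerate(words) if w.startswith(b)]
    return any(abs(i - j) <= n for i in pos_a for j in pos_b)

assert adj("breast cancer screening", "breast", "cancer")
assert adj("cancer of the breast", "breast", "cancer")
assert not adj("breast disease and later screening for cancer", "breast", "cancer")
```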

Cited Reference Searching

Cited reference searching is a method of finding publications that cite, or are cited by, a known article.

Use cited reference searching to:

  • find keywords or terms you may need to include in your search strategy
  • find pivotal papers in the same or similar subject area
  • find pivotal authors in the same or similar subject area
  • track how a topic has developed over time

Cited reference searching is available through these tools:

  • Web of Science
  • Google Scholar
  • Last Updated: Nov 29, 2022 3:34 PM
  • URL: https://mdanderson.libguides.com/literaturesearchbasics

Presenting a search strategy

Have you done a structured search related to a literature review or other work? Do you need to present how you found the articles you selected? Are you thinking about how you can present articles that you have found alongside the search, for example via a reference list to another article? Here you can see what information should be included in a search strategy presentation, and some examples of what it might look like.

What should be included in a search strategy presentation?

How can you present your search strategy?

  • Strive to be as transparent as possible; it should preferably be possible for someone else to repeat your search and get the same result.
  • In the methods section, describe how you searched: whether you ran test searches, how you identified your search terms, and whether you searched with both free-text words and controlled subject headings.
  • In the methods section, you can also describe and report articles that you found in ways other than the database search, for example via the reference list of another article or by hand-searching.
  • The complete search strategy is usually also presented in table form. The table can be added as an appendix to the work.
  • To be able to present your search strategy, it is important that you save the searches you have run in some way. A tip is to cut and paste from the database's "Search history".
  • It is also possible to create an account in the databases to save searches.

What should a search strategy presentation contain?

  • The name of the database
  • The date you did the search
  • Which search terms you have used
  • How you searched (quotation marks, specific fields, truncation, etc.)
  • How you have combined your search terms
  • Which filters or restrictions you used (language, year, etc.)

Example of text in the methods section

The searches were conducted during June 2018 in the databases CINAHL, Web of Science and PubMed.

The MeSH terms identified for the PubMed search were adapted to corresponding terms in CINAHL. Every individual search term was supplemented with relevant free-text terms. When appropriate, the free-text terms were truncated in order to include alternative word endings.

The search result was limited to articles that were written in English as well as articles published during the last ten years. The full search strategy is included as an appendix.

The database searches were complemented with manual review of the reference lists of relevant articles, which resulted in a few additional articles included in the study.

Examples from different databases

In the tables below we present searches in three different databases. In all databases we have used the topic "Patients' experience of day surgery".

Example from CINAHL

MH = Exact Subject Heading

Example from PubMed

Example from Web of Science


Guide for students: Structured literature reviews

A step-by-step guide aimed at Master's students undertaking a structured literature review as part of their Master's thesis. In this guide we will go through the different steps of a structured literature review and provide tips on how to make your search strategy more structured and extensive.

  • How to conduct a systematic review
  • Chapter 4 about Literature searching in the book Assessment of methods in health care - a handbook from the Swedish Agency for Health Technology Assessment and Assessment of Social Services.




Researching for your literature review: Develop a search strategy


Identify key terms and concepts

Start developing a search strategy by identifying the key words and concepts within your research question. The aim is to identify the words likely to have been used in the published literature on this topic.

For example: What are the key infection control strategies for preventing the transmission of Methicillin-resistant Staphylococcus aureus (MRSA) in aged care homes?

Treat each component as a separate concept so that your topic is organised into separate blocks (concepts).

For each concept block, list the key words derived from your research question, as well as any other relevant terms or synonyms that you have found in your preliminary searches. Also consider singular and plural forms of words, variant spellings, acronyms and relevant index terms (subject headings).  

As part of the process of developing a search strategy, it is recommended that you keep a master list of search terms for each key concept. This will make it easier when it comes to translating your search strategy across multiple database platforms. 

Concept map template for documenting search terms

Combine search terms and concepts

Boolean operators are used to combine the different concepts in your topic to form a search strategy. The main operators used to connect your terms are AND and OR . See an explanation below:

  • Link keywords related to a single concept with OR
  • Linking with OR broadens a search (increases the number of results) by searching for any of the alternative keywords

Example: nursing home OR aged care home

  • Link different concepts with AND
  • Linking with AND narrows a search (reduces the number of results) by retrieving only those records that include all of your specified keywords

Example: nursing home AND infection control

  • using NOT narrows a search by excluding results that contain certain search terms
  • Most searches do not require the use of the NOT operator

Example: aged care homes NOT residential homes will retrieve all the results that include the words aged care homes but don't include the words residential homes . So if an article discussed both concepts this article would not be retrieved as it would be excluded on the basis of the words residential homes .

See the website for Venn diagrams demonstrating the function of AND/OR/NOT:

Combine the search terms using Boolean

Advanced search operators - truncation and wildcards

By using a truncation symbol you can capture all of the various endings possible for a particular word. This may increase the number of results and reduce the likelihood of missing something relevant. Some tips about truncation:

  • The truncation symbol is generally an asterisk symbol * and is added at the end of a word.
  • It may be added to the root of a word that is a word in itself. Example: prevent* will retrieve prevent, preventing, prevention, preventative etc. It may also be added to the root of a word that is not a word in itself. Example: strateg* will retrieve strategy, strategies, strategic, strategize etc.
  • If you don't want to retrieve all possible variations, an easy alternative is to utilise the OR operator instead e.g. strategy OR strategies. Always use OR instead of truncation where the root word is too small e.g. ill OR illness instead of ill*

There are also wildcard symbols that function like truncation but are often used in the middle of a word to replace zero, one or more characters.

  • Unlike the truncator which is usually an asterisk, wildcards vary across database platforms
  • Common wildcards symbols are the question mark ? and hash #.
  • Example: wom#n finds woman or women; p?ediatric finds pediatric or paediatric.

See the Database search tips for details of these operators, or check the Help link in any database.
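To make the two wildcard styles concrete, here is an illustrative Python sketch mapping `#` to exactly one character and `?` to zero or one (conventions vary by database, and the `expandable` helper name is our own):

```python
import re

# Sketch of the wildcard semantics described above:
#   #  -> exactly one character   (wom#n)
#   ?  -> zero or one character   (p?ediatric)
def expandable(term):
    pattern = re.escape(term).replace(r"\#", r"\w").replace(r"\?", r"\w?")
    return re.compile(rf"^{pattern}$", re.IGNORECASE)

assert expandable("wom#n").match("women")
assert expandable("wom#n").match("woman")
assert expandable("p?ediatric").match("paediatric")
assert expandable("p?ediatric").match("pediatric")
```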

Phrase searching

For words that you want to keep as a phrase, place two or more words in "inverted commas" or "quote marks". This will ensure word order is maintained and that you only retrieve results that have those words appearing together.

Example: “nursing homes”

There are a few databases that don't require the use of quote marks such as Ovid Medline and other databases in the Ovid suite. The Database search tips provides details on phrase searching in key databases, or you can check the Help link in any database.

Subject headings (index terms)

Identify appropriate subject headings (index terms).

Many databases use subject headings to index content. These are selected from a controlled list and describe what the article is about. 

A comprehensive search strategy is often best achieved by using a combination of keywords and subject headings where possible.

You do not need in-depth knowledge of subject headings to benefit from the improved search performance they bring to your searches.

Advantages of subject searching:

  • Helps locate articles that use synonyms, variant spellings, plurals
  • Search terms don’t have to appear in the title or abstract

Note: Subject headings are often unique to a particular database, so you will need to look for appropriate subject headings in each database you intend to use.

Subject headings are not available for every topic, and it is best to only select them if they relate closely to your area of interest.

MeSH (Medical Subject Headings)

The MeSH thesaurus provides standard terminology, imposing uniformity and consistency on the indexing of biomedical literature. In Pubmed/Medline each record is tagged with  MeSH  (Medical Subject Headings).

The MeSH vocabulary includes:

  • MeSH Headings: represent concepts found in the biomedical literature
  • Check tags: headings commonly considered for every article (eg. Species (including humans), Sex, Age groups (for humans), Historical time periods)
  • Subheadings (qualifiers): attached to MeSH headings to describe a specific aspect of a concept
  • Publication Types: describe the type of publication being indexed; i.e., what the item is, not what the article is about (eg. Letter, Review, Randomized Controlled Trial)
  • Supplementary Concept Records: terms in a separate thesaurus, primarily substance terms

Create a 'gold set'

It is useful to build a ‘sample set’ or ‘gold set’ of relevant references before you develop your search strategy.

Sources for a 'gold set' may include:

  • key papers recommended by subject experts or supervisors
  • citation searching - looking at a reference list to see who has been cited, or using a citation database (eg. Scopus, Web of Science) to see who has cited a known relevant article
  • results of preliminary scoping searches.

The papers in your 'gold set' can then be used to help you identify relevant search terms:

  • Look up your 'gold set' articles in a database that you will use for your literature review. For the articles indexed in the database, look at the records to see what keywords and/or subject headings are listed.

The 'gold set' will also provide a means of testing your search strategy:

  • When an article in the sample set that is also indexed in the database is not retrieved, your search strategy can be revised in order to include it (see what concepts or keywords can be incorporated into your search strategy so that the article is retrieved).
  • If your search strategy is retrieving a lot of irrelevant results, look at the irrelevant records to determine why they are being retrieved. What keywords or subject headings are causing them to appear? Can you change these without losing any relevant articles from your results?
  • Information on the process of testing your search strategy using a gold set can be found in the systematic review guide

Example search strategy

A search strategy is the planned and structured organisation of terms used to search a database.

An example of a search strategy incorporating all three concepts, that could be applied to different databases is shown below:

[Screenshot: search strategy entered into a database Advanced search screen]

You will use a combination of search operators to construct a search strategy, so it’s important to keep your concepts grouped together correctly. This can be done with parentheses (round brackets), or by searching for each concept separately or on a separate line.

The above search strategy in a nested format (combined into a single line using parentheses) would look like:

("infection control*" OR "infection prevention") AND ("methicillin resistant staphylococcus aureus" OR "meticillin resistant staphylococcus aureus" OR MRSA) AND ( "aged care home*" OR "nursing home*")
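If you keep a master list of terms for each concept, a nested string like the one above can be assembled mechanically. A minimal Python sketch (generic AND/OR syntax, not tied to any particular database):

```python
# Grouped concept terms taken from the example search strategy above.
concepts = [
    ['"infection control*"', '"infection prevention"'],
    ['"methicillin resistant staphylococcus aureus"',
     '"meticillin resistant staphylococcus aureus"', 'MRSA'],
    ['"aged care home*"', '"nursing home*"'],
]

# OR within each concept group, AND between groups.
query = " AND ".join("(" + " OR ".join(group) + ")" for group in concepts)
print(query)
```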



How to write a search strategy for your systematic review


Practical tips to write a search strategy for your systematic review

With a great review question and a clear set of eligibility criteria already mapped out, it’s now time to plan the search strategy. The medical literature is vast. Your team plans a thorough and methodical search, but you also know that resources and interest in the project are finite. At this stage it might feel like you have a mountain to climb.

The bottom line? You will have to sift through some irrelevant search results to find the studies that you need for your review. Capturing a proportion of irrelevant records in your search is necessary to ensure that it identifies as many relevant records as possible. This is the trade-off of precision versus sensitivity and, because systematic reviews aim to be as comprehensive as possible, it is best to favour sensitivity – more is more.

By now, the size of this task might be sounding alarm bells. The good news is that a range of techniques and web-based tools can help to make searching more efficient and save you time. We’ll look at some of them as we walk through the four main steps of searching for studies:

  • Decide where to search
  • Write and refine the search
  • Run and record the search
  • Manage the search results

Searching is a specialist discipline and the information given here is not intended to replace the advice of a skilled professional. Before we look at each of the steps in turn, the most important systematic reviewer pro-tip for searching is:

 Pro Tip – Talk to your librarian and do it early!

1. Decide where to search

It’s important to come up with a comprehensive list of sources to search so that you don’t miss anything potentially relevant. In clinical medicine, your first stop will likely be the databases MEDLINE , Embase , and CENTRAL . Depending on the subject of the review, it might also be appropriate to run the search in databases that cover specific geographical regions or specialist areas, such as traditional Chinese medicine.

In addition to these databases, you’ll also search for grey literature (essentially, research that was not published in journals). That’s because your search of bibliographic databases will not find relevant information if it is part of, for example:

  • a trials register
  • a study that is ongoing
  • a thesis or dissertation
  • a conference abstract.

Over-reliance on published data introduces bias in favour of positive results. Studies with positive results are more likely to be submitted to journals, published in journals, and therefore indexed in databases. This is publication bias and systematic reviews seek to minimise its effects by searching for grey literature.

2. Write and refine the search 

Search terms are derived from key concepts in the review question and from the inclusion and exclusion criteria that are specified in the protocol or research plan.

Keywords will be searched for in the title or abstract of the records in the database. They are often truncated (for example, a search for therap* to find therapy, therapies, therapist). They might also use wildcards to allow for spelling variants and plurals (for example, wom#n to find woman and women). The symbols used to perform truncation and wildcard searches vary by database.

Index terms  

Using index terms such as MeSH and Emtree in a search can improve its performance. Indexers with subject area expertise work through databases and tag each record with subject terms from a prespecified controlled vocabulary.

This indexing can save review teams a lot of time that would otherwise be spent sifting through irrelevant records. Using index terms in your search, for example, can help you find the records that are actually about the topic of interest (tagged with the index term) but ignore those that contain only a brief mention of it (not tagged with the index term).

Indexers assign terms based on a careful read of each study, rather than whether or not the study contains certain words. So the index terms enable the retrieval of relevant records that cannot be captured by a simple search for the keyword or phrase.

Use a combination

Relying solely on index terms is not advisable. Doing so could miss a relevant record that for some reason (indexer’s judgment, time lag between a record being listed in a database and being indexed) has not been tagged with an index term that would enable you to retrieve it. Good search strategies include both index terms and keywords.


Let’s see how this works in a real review! Figure 2 shows the search strategy for the review ‘Wheat flour fortification with iron and other micronutrients for reducing anaemia and improving iron status in populations’. This strategy combines index terms and keywords using the Boolean operators AND, OR, and NOT. OR is used first to reach as many records as possible before AND and NOT are used to narrow them down.

  • Lines 1 and 2: contain MeSH terms (denoted by the initial capitals and the slash at the end).
  • Line 3: contains truncated keywords (‘tw’ in this context is an instruction to search the title and abstract fields of the record).
  • Line 4: combines the three previous lines using Boolean OR to broaden the search.
  • Line 11: combines previous lines using Boolean AND to narrow the search.
  • Lines 12 and 13: further narrow the search using Boolean NOT to exclude records of studies with no human subjects.


Writing a search strategy is an iterative process. A good plan is  to try out a new strategy and check that it has picked up the key studies that you would expect it to find based on your existing knowledge of the topic area. If it hasn’t, you can explore the reasons for this, revise the strategy, check it for errors, and try it again!

3. Run and record the search

Because of the different ways that individual databases are structured and indexed, a separate search strategy is needed for each database. This adds complexity to the search process, and it is important to keep a careful record of each search strategy as you run it. Search strategies can often be saved in the databases themselves, but it is a good idea to keep an offline copy as a back-up; Covidence allows you to store your search strategies online in your review settings.

The reporting of the search will be included in the methods section of your review and should follow the PRISMA guidelines. You can download a flow diagram from PRISMA’s website to help you log the number of records retrieved from the search and the subsequent decisions about the inclusion or exclusion of studies. The PRISMA-S extension provides guidance on reporting literature searches.


It is very important that search strategies are reproduced in their entirety (preferably using copy and paste to avoid typos) as part of the published review so that they can be studied and replicated by other researchers. Search strategies are often made available as an appendix because they are long and might otherwise interrupt the flow of the text in the methods section.

4. Manage the search results 

Once the search is done and you have recorded the process in enough detail to write up a thorough description in the methods section, you will move on to screening the results. This is an exciting stage in any review because it’s the first glimpse of what the search strategies have found. A large volume of results may be daunting but your search is very likely to have captured some irrelevant studies because of its high sensitivity, as we have already seen. Fortunately, it will be possible to exclude many of these irrelevant studies at the screening stage on the basis of the title and abstract alone 😅.

Search results from multiple databases can be collated in a single spreadsheet for screening. To benefit from process efficiencies, time-saving and easy collaboration with your team, you can import search results into a specialist tool such as Covidence. A key benefit of Covidence is that you can track decisions made about the inclusion or exclusion of studies in a simple workflow and resolve conflicting decisions quickly and transparently. Covidence currently supports three formats for file imports of search results:

  • EndNote XML
  • PubMed text format
  • RIS text format

If you’d like to try this feature of Covidence but don’t have any data yet, you can download some ready-made sample data .
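To give a feel for what these export files contain, here is a minimal, illustrative Python reader for the RIS text format (tags such as `TY`, `TI`, `AU` and the `ER` record terminator are standard RIS; real reference managers handle many more tags and edge cases):

```python
# Parse RIS text into a list of records, each a dict of tag -> values.
def parse_ris(text):
    records, current = [], {}
    for line in text.splitlines():
        # RIS lines look like "TI  - Some title"; "ER" ends a record.
        if len(line) >= 6 and line[2:6] == "  - ":
            tag, value = line[:2], line[6:].strip()
            if tag == "ER":
                records.append(current)
                current = {}
            else:
                current.setdefault(tag, []).append(value)
    return records

sample = "TY  - JOUR\nTI  - Example title\nAU  - Doe, J.\nER  - \n"
recs = parse_ris(sample)
assert recs[0]["TI"] == ["Example title"]
```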

And you’re done!

There is a lot to think about when planning a search strategy. With practice, expert help, and the right tools your team can complete the search process with confidence.

This blog post is part of the Covidence series on how to write a systematic review.

Sign up for a free trial of Covidence today!



Laura Mellor. Portsmouth, UK


Systematic literature searching


Record your search

Example search strategies


Create a document listing all the keywords and subject headings you need for your search. 

Save your search strategy from each database: use the workspace in each database to save your final search, or simply print it out.

You’ll find it helpful to keep a simple spreadsheet recording the databases you have searched. Make a note of the date, the database searched, the interface you used (e.g. EBSCO), the number of hits, and the number of relevant results found.
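As a sketch, the same log could be kept programmatically; the column names here are one reasonable choice based on the guidance above, not a required standard:

```python
import csv
import io

# Columns suggested by the guidance above (illustrative, not a formal standard).
FIELDS = ["date", "database", "interface", "hits", "relevant"]

log = [
    {"date": "2024-01-26", "database": "MEDLINE", "interface": "EBSCO",
     "hits": 412, "relevant": 37},
    {"date": "2024-01-26", "database": "APA PsycINFO", "interface": "EBSCO",
     "hits": 198, "relevant": 21},
]

# Write the log as CSV (a StringIO stands in for a file on disk).
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=FIELDS)
writer.writeheader()
writer.writerows(log)
print(buffer.getvalue())
```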

A standard approach to reporting literature searches (the STARLITE mnemonic) is fully described in the following article by Booth, freely available from PubMed Central.

Booth, A. (2006). “Brimful of STARLITE”: Toward standards for reporting literature searches. Journal of the Medical Library Association, 94(4), 421–429, e205.

  • Example systematic search strategy: an example of a detailed search strategy presented in a Word document

Search terms table

An example of search terms presented in a table (readable version saved as PDF).

Full search strategies for one or more of the databases searched will usually be included as a supplementary file or appendix to the journal article. Search strategies will usually show:

  • the name of the database and provider, e.g. APA PsycINFO (EBSCO)
  • the database-specific syntax and field codes used
  • the terms used and the operators used to combine them
  • sometimes, the number of results for each line
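The underlying pattern in such strategies is consistent: synonyms are combined with OR within each concept, and the concepts are combined with AND. A generic sketch (simplified; real databases differ in field codes, truncation symbols, and subject-heading syntax):

```python
def build_search(concepts):
    """Combine lists of synonyms: OR within each concept, AND between concepts."""
    groups = ["(" + " OR ".join(terms) + ")" for terms in concepts]
    return " AND ".join(groups)

# Hypothetical two-concept strategy (terms are illustrative only).
strategy = build_search([
    ['"self-harm"', '"self-injury"', "self-poisoning"],
    ["adult*", "grown-up*"],
])
print(strategy)
# ("self-harm" OR "self-injury" OR self-poisoning) AND (adult* OR grown-up*)
```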
  • Last Updated: Jan 26, 2024 4:16 PM
  • URL: https://lancaster.libguides.com/health/systematic


NCBI Bookshelf. A service of the National Library of Medicine, National Institutes of Health.

Newhouse K, Weaver A, Lee J, et al. Toxicological Review of Ethyl Tertiary Butyl Ether (CASRN 637-92-3). Washington (DC): U.S. Environmental Protection Agency; 2021 Jul.

Toxicological Review of Ethyl Tertiary Butyl Ether (CASRN 637-92-3).

Literature Search Strategy | Study Selection and Evaluation

The literature search and screening strategy consisted of a broad search of online scientific databases and other sources to identify all potentially pertinent studies. In subsequent steps, references were screened to exclude papers not pertinent to an assessment of the health effects of ethyl tertiary butyl ether (ETBE), and the remaining references were sorted into categories for further evaluation. The original chemical-specific search was conducted in four online scientific databases (PubMed, Toxline, Web of Science, and Toxic Substances Control Act Test Submissions (TSCATS)) through December 2016, using the keywords and limits described in Table LS-1. The overall literature search approach is shown graphically in Figure LS-1. Another 114 citations were obtained using the additional search strategies described in Table LS-2. After electronically eliminating duplicates from the citations retrieved through these databases, 847 unique citations were identified. These 847 citations were screened for pertinence and separated into the categories presented in Figure LS-1, using the title and either the abstract or full text, or both, to examine the health effects of ETBE exposure. The inclusion and exclusion criteria used to screen the references and identify sources of health effects data are provided in Table LS-3.
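The de-duplication step mentioned here (the same citation is often retrieved from several databases) is typically automated on a normalised title plus year; a rough sketch, not EPA's actual procedure:

```python
import re

def dedupe(citations):
    """Drop duplicate citations using a normalised (title, year) key.
    Real reference managers also compare DOIs, authors, and page numbers."""
    seen, unique = set(), []
    for cit in citations:
        key = (re.sub(r"[^a-z0-9]", "", cit["title"].lower()), cit["year"])
        if key not in seen:
            seen.add(key)
            unique.append(cit)
    return unique

# Hypothetical hits, not the actual ETBE search results.
hits = [
    {"title": "Toxicology of ETBE", "year": 2015},    # from PubMed
    {"title": "Toxicology of ETBE.", "year": 2015},   # same paper, from Web of Science
    {"title": "ETBE metabolism in rats", "year": 2012},
]
print(len(dedupe(hits)))  # 2
```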

Table LS-1. Details of the search strategy employed for ETBE.

Figure LS-1

Summary of literature search and screening process for ETBE. ADME = absorption, distribution, metabolism, excretion; QSAR = quantitative structure-activity relationship.

Table LS-2. Summary of additional search strategies for ETBE.

Table LS-3. Inclusion-exclusion criteria.

  • 33 references were identified as potential “Sources of Health Effects Data” and were considered for data extraction to evidence tables and exposure-response arrays.
  • 70 references were identified as “Supporting Studies.” These included 31 studies describing physiologically based pharmacokinetic (PBPK) models and other toxicokinetic information; 25 studies providing genotoxicity and other mechanistic information; 9 acute, short-term, or preliminary toxicity studies; and 5 direct administration (e.g., dermal) studies of ETBE. Although still considered sources of health effects information, studies investigating the effects of acute and direct chemical exposures are generally less pertinent for characterizing health hazards associated with chronic oral and inhalation exposures. Therefore, information from these studies was not considered for extraction into evidence tables. Nevertheless, these studies were still evaluated as possible sources of supplementary health effects information.
  • 29 references were identified as “Secondary Literature and Sources of Contextual Information” (e.g., reviews and other agency assessments); these references were retained as additional resources for development of the Toxicological Review.
  • 715 references were identified as being not pertinent (not on topic) for evaluating health effects for ETBE and were excluded from further consideration (see Figure LS-1 for exclusion categories and Table LS-3 for exclusion criteria). For example, health-effect studies of gasoline and ETBE mixtures were not considered pertinent to the assessment because the separate effects of gasoline components could not be determined. Retrieving numerous references that are not on topic is a consequence of applying an initial search strategy designed to cast a wide net and to minimize the possibility of missing potentially relevant health effects data.

The complete list of references as sorted above can be found on the ETBE project page of the Health and Environmental Research Online (HERO) website at https://hero.epa.gov/hero/index.cfm/project/page/project_id/1376.

LS.1. POST-PEER-REVIEW LITERATURE SEARCH UPDATE

A post-peer-review literature search update was conducted in PubMed, Toxline, TSCATS, and the Defense Technical Information Center (DTIC) for the period December 2016 to July 2019, using a search strategy consistent with the previous literature searches (see Table LS-1). The documentation and results for the literature search and screen, including the specific references identified using each search strategy and the tags assigned to each reference based on the manual screen, can be found on the ETBE project page of the HERO website at https://hero.epa.gov/hero/index.cfm/project/page/project_id/1376.

Consistent with the Integrated Risk Information System (IRIS) Stopping Rules ( https://www.epa.gov/sites/production/files/2014-06/documents/iris_stoppingrules.pdf ), manual screening of the literature search update focused on identifying new studies that might change a major conclusion of the assessment. The last formal literature search was in 2019 while the draft was in external peer review, after which the literature was monitored in PubMed through January 2021. No animal bioassays or epidemiological studies were identified in the post-peer-review literature searches that would change any major conclusions in the assessment.

LS.2. SELECTION OF STUDIES FOR INCLUSION IN EVIDENCE TABLES

To summarize the important information systematically from the primary health effects studies in the ETBE evidence base, evidence tables were constructed in a standardized tabular format as recommended by NRC (2011). Studies were arranged in evidence tables by route of exposure and then alphabetized by author. Of the studies retained after the literature search and screen, 33 were identified as “Sources of Health Effects Data” and considered for extraction into evidence tables for the hazard identification in Section 1. An initial review of studies examining neurotoxic endpoints did not find effects consistent enough to warrant a comprehensive hazard evaluation; thus, the one subchronic study (Dorman et al., 1997) that examined only neurotoxic endpoints (functional observational battery, motor activity, and terminal neuropathology) was not included in evidence tables. Data from the remaining 32 studies were extracted into evidence tables.

Supplementary studies that contain pertinent information for the Toxicological Review and augment hazard identification conclusions, such as genotoxicity and mechanistic studies, studies describing the kinetics and disposition of ETBE absorption and metabolism, and pilot studies, were not included in the evidence tables. One controlled human exposure toxicokinetic study was identified; it is discussed in Appendix Section B.2 (Toxicokinetics). Results from short-term and acute studies did not differ qualitatively from those of the longer-term studies (i.e., ≥90-day exposure studies). These were grouped as supplementary studies, however, because the evidence base of chronic and subchronic rodent studies was considered sufficient for evaluating the chronic health effects of ETBE exposure. Additionally, studies of effects from chronic exposure are most pertinent to lifetime human exposure (i.e., the primary characterization provided by IRIS assessments) and are the focus of this assessment. Such supplementary studies are discussed in the narrative sections of Section 1 where they augment the discussion, for example in sections such as “Mode-of-Action Analysis,” or are presented in appendices if they provide additional information.

LS.3. EVIDENCE BASE EVALUATION

Each study in the evidence base was evaluated with attention to the following attributes:

  • Study design;
  • Nature of the assay and validity for its intended purpose;
  • Characterization of the nature and extent of impurities and contaminants of ETBE administered, if applicable;
  • Characterization of dose and dosing regimen (including age at exposure) and their adequacy to elicit adverse effects, including latent effects;
  • Sample sizes to detect dose-related differences or trends;
  • Ascertainment of survival, vital signs, disease or effects, and cause of death; and
  • Control of other variables that could influence the occurrence of effects.

Additionally, several general considerations, presented in Table LS-4 , were used in evaluating the animal studies (see Table LS-5 ). Much of the key information for conducting this evaluation can be determined based on study methods and how the study results were reported. Importantly, the evaluation at this stage does not consider the direction or magnitude of any reported effects.

Table LS-4. Considerations for evaluating experimental animal studies.

Table LS-5. Summary of experimental animal evidence base.

EPA considered statistical tests to evaluate whether the observations might be due to chance. The standard for determining statistical significance of a response is a trend test or comparison of outcomes in the exposed groups against those of concurrent controls. Studies that did not report statistical testing were identified, and when appropriate, statistical tests were conducted by EPA.
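To illustrate the kind of test described (a generic example, not EPA's actual analysis), Fisher's exact test compares incidence in an exposed group against concurrent controls for a 2×2 table; a standard-library-only sketch:

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]]:
    sums the probabilities of all tables no more likely than the observed one."""
    n = a + b + c + d
    row1, col1 = a + b, a + c

    def p_table(x):
        # Hypergeometric probability of x "affected" in the first group.
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)

    p_obs = p_table(a)
    lo, hi = max(0, row1 + col1 - n), min(row1, col1)
    return sum(p_table(x) for x in range(lo, hi + 1) if p_table(x) <= p_obs + 1e-12)

# Hypothetical incidence data: 2/50 affected in controls vs 11/50 in the exposed group.
p = fisher_exact_2x2(2, 48, 11, 39)
print(f"p = {p:.4f}")
```

A trend test across several dose groups (e.g., Cochran-Armitage) follows the same logic of asking whether the observed pattern of incidence could plausibly arise by chance.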

Information on study features related to this evaluation is reported in evidence tables and documented in the synthesis of evidence. Discussions of study strengths and limitations are included in the text where relevant. If EPA’s interpretation of a study differs from that of the study authors, the draft assessment discusses the basis for the difference.

LS.3.1. Experimental Animal Studies

The 33 experimental animal studies, all conducted in rats, mice, or rabbits, involved drinking water, gavage, or inhalation exposure to ETBE. Many of these studies were conducted according to Organisation for Economic Co-operation and Development (OECD) Good Laboratory Practice (GLP) guidelines, used well-established methods, were well reported, and evaluated an extensive range of endpoints and histopathological data. For the body of available studies, detailed discussion of any identified methodological concerns precedes each endpoint evaluated in the hazard identification section. Overall, the experimental animal studies of ETBE involving repeated oral or inhalation exposure were considered acceptable in quality and, whether yielding positive, negative, or null results, were considered in assessing the evidence for health effects associated with chronic exposure to ETBE.

