The University of Texas MD Anderson Cancer Center

Literature Search Basics

Develop a search strategy.

  • A search strategy includes a combination of keywords, subject headings, and limiters (language, date, publication type, etc.).
  • A search strategy should be planned out and practiced before executing the final search in a database.
  • A search strategy and search results should be documented throughout the searching process.

What is a search strategy?

A search strategy is an organized combination of keywords, phrases, subject headings, and limiters used to search a database.

Your search strategy will include:

  • keywords
  • Boolean operators
  • variations of search terms (synonyms, suffixes)
  • subject headings

Your search strategy may include:

  • truncation (where applicable)
  • phrases (where applicable)
  • limiters (date, language, age, publication type, etc.)

A search strategy usually requires several iterations. You will need to test the strategy along the way to ensure that you are finding relevant articles. It's also a good idea to review your search strategy with your co-authors; they may suggest terms or concepts you have missed.

Additionally, each database is developed differently, so you will need to adjust your strategy for each database you search. For instance, Embase is a European database, and many of its medical terms differ slightly from those used in MEDLINE and PubMed.

Choose search terms

Start by writing down as many terms as you can think of that relate to your question. You might try cited reference searching to find a few good articles that you can review for relevant terms.

Remember that most terms or concepts can be expressed in different ways. A few things to consider:

  • synonyms: "cancer" may be referred to as "neoplasms", "tumors", or "malignancy"
  • abbreviations: spell out the word instead of abbreviating
  • generic vs. trade names of drugs

Search for the exact phrase

If you want words to appear next to each other in an exact phrase, use quotation marks, eg “self-esteem”.

Phrase searching decreases the number of results you get. Most databases allow you to search for phrases, but check the database guide if you are unsure.

Truncation and wildcards

Many databases use an asterisk (*) as their truncation symbol to find various word endings, such as singulars and plurals. Check the database help section if you are not sure which symbol to use.

"Therap*" retrieves: therapy, therapies, therapist or therapists.

Use a wildcard (?) to find different spellings like British and American spellings.

"Behavio?r" retrieves behaviour and behavior.

Searching with subject headings

Database subject headings are controlled vocabulary terms that a database uses to describe what an article is about.

Using appropriate subject headings enhances your search and will help you to find more results on your topic. This is because subject headings find articles according to their subject, even if the article does not use your chosen key words.

You should combine both subject headings and keywords in your search strategy for each of the concepts you identify. This is particularly important if you are undertaking a systematic review or an in-depth piece of work.

Subject headings may vary between databases, so you need to investigate each database separately to find the subject headings they use. For example, for MEDLINE you can use MeSH (Medical Subject Headings) and for Embase you can use the EMTREE thesaurus.

SEARCH TIP: In Ovid databases, search for a known key paper by title, select the "complete reference" button to see which subject headings the database indexers have given that article, and consider adding relevant ones to your own search strategy.
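
As a rough illustration of how subject headings and keywords retrieve different sets of records, the Python sketch below queries PubMed through the NCBI E-utilities and compares the record counts for a MeSH subject-heading search, a title/abstract keyword search, and the two combined. The counts it prints will change as PubMed grows, and NCBI's usage guidelines (rate limits, optional API key) apply.

```python
import json
import urllib.parse
import urllib.request

# Sketch: compare PubMed record counts for a subject-heading (MeSH) search
# versus a plain title/abstract keyword search on the same concept.
def pubmed_count(term: str) -> int:
    url = ("https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?"
           + urllib.parse.urlencode({"db": "pubmed", "term": term,
                                     "retmode": "json", "retmax": 0}))
    with urllib.request.urlopen(url) as response:
        return int(json.load(response)["esearchresult"]["count"])

print(pubmed_count('"Neoplasms"[MeSH Terms]'))        # subject heading search
print(pubmed_count("cancer[Title/Abstract]"))         # keyword search
print(pubmed_count('"Neoplasms"[MeSH Terms] OR cancer[Title/Abstract]'))  # combined
```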

Use Boolean logic to combine search terms


Boolean operators (AND, OR and NOT) allow you to try different combinations of search terms or subject headings.

Databases often show Boolean operators as buttons or drop-down menus that you can click to combine your search terms or results.

The main Boolean operators are:

OR is used to find articles that mention either of the topics you search for.

AND is used to find articles that mention both of the searched topics.

NOT excludes a search term or concept. It should be used with caution as you may inadvertently exclude relevant references.

For example, searching for “self-esteem NOT eating disorders” finds articles that mention self-esteem but removes any articles that mention eating disorders.
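
The sketch below (Python, with made-up example terms) shows the usual pattern for assembling a Boolean search string: synonyms for the same concept are grouped with OR, and the concept groups are then joined with AND. The quoting and truncation syntax shown is generic; adapt it to the database you are actually searching.

```python
# Minimal sketch: OR together synonyms for each concept, then AND the
# concept groups. The terms are illustrative; adapt the syntax to the
# database you are actually searching.
concepts = {
    "population": ["adolescent*", "teenager*", "young people"],
    "exposure":   ["social media", "social networking"],
    "outcome":    ["self-esteem", "self-worth", "self-concept"],
}

def or_group(terms):
    # Quote multi-word phrases so they are searched as exact phrases.
    quoted = [f'"{t}"' if " " in t else t for t in terms]
    return "(" + " OR ".join(quoted) + ")"

query = " AND ".join(or_group(terms) for terms in concepts.values())
print(query)
# (adolescent* OR teenager* OR "young people") AND
# ("social media" OR "social networking") AND
# (self-esteem OR self-worth OR self-concept)
```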

Adjacency searching 

Use adjacency operators to search by phrase or with two or more words in relation to one another. Adjacency searching commands differ among databases. Check the database help section if you are not sure which searching commands to use.

In Ovid Medline

"breast ADJ3 cancer" finds the word breast within three words of cancer, in any order.

This includes breast cancer or cancer of the breast.

Cited Reference Searching

Cited reference searching is a method to find articles that have been cited by other publications. 

Use cited reference searching to:

  • find keywords or terms you may need to include in your search strategy
  • find pivotal papers in the same or similar subject area
  • find pivotal authors in the same or similar subject area
  • track how a topic has developed over time

Cited reference searching is available through these tools:

  • Web of Science
  • Google Scholar

University of Leeds

Literature searching explained

Develop a search strategy.

A search strategy is an organised structure of key terms used to search a database. The search strategy combines the key concepts of your search question in order to retrieve accurate results.

Your search strategy will account for all:

  • possible search terms
  • keywords and phrases
  • truncated and wildcard variations of search terms
  • subject headings (where applicable)

Each database works differently so you need to adapt your search strategy for each database. You may wish to develop a number of separate search strategies if your research covers several different areas.

It is a good idea to test your strategies and refine them after you have reviewed the search results.

How a search strategy looks in practice

Take a look at this example literature search in PsycINFO (PDF) about self-esteem.

The example shows the subject heading and keyword searches that have been carried out for each concept within our research question and how they have been combined using Boolean operators. It also shows where keyword techniques like truncation, wildcards and adjacency searching have been used.

Search strategy techniques

The next sections show some techniques you can use to develop your search strategy.


Choose search terms.

Concepts can be expressed in different ways eg “self-esteem” might be referred to as “self-worth”. Your aim is to consider each of your concepts and come up with a list of the different ways they could be expressed.

To find alternative keywords or phrases for your concepts try the following:

  • Use a thesaurus to identify synonyms.
  • Search for your concepts on a search engine like Google Scholar, scanning the results for alternative words and phrases.
  • Examine relevant abstracts or articles for alternative words, phrases and subject headings (if the database uses subject headings).

When you've done this, you should have lists of words and phrases for each concept as in this completed PICO model (PDF) or this example concept map (PDF).

As you search and scan articles and abstracts, you may discover different key terms to enhance your search strategy.

Using truncation and wildcards can save you time and effort by finding alternative keywords.

Search with keywords

Keywords are free text words and phrases. Database search strategies use a combination of free text and subject headings (where applicable).

A keyword search usually looks for your search terms in the title and abstract of a reference. You may wish to search in title fields only if you want a small number of specific results.

Some databases will find the exact word or phrase, so make sure your spelling is accurate or you will miss references.

Search for the exact phrase

If you want words to appear next to each other in an exact phrase, use quotation marks, eg “self-esteem”.

Phrase searching decreases the number of results you get and makes your results more relevant. Most databases allow you to search for phrases, but check the database guide if you are unsure.

Truncation and wildcard searches

You can use truncated and wildcard searches to find variations of your search term. Truncation is useful for finding singular and plural forms of words and variant endings.

Many databases use an asterisk (*) as their truncation symbol. Check the database help section if you are not sure which symbol to use. For example, “therap*” will find therapy, therapies, therapist or therapists. A wildcard finds variant spellings of words. Use it to search for a single character, or no character.

Check the database help section to see which symbol to use as a wildcard.

Wildcards are useful for finding British and American spellings, for example: “behavio?r” in Medline will find both behaviour and behavior.

There are sometimes different symbols to find a variable single character. For example, in the Medline database, “wom#n” will find woman and also women.

Use adjacency searching for more accurate results

You can specify how close two words appear together in your search strategy. This can make your results more relevant; generally the closer two words appear to each other, the closer the relationship is between them.

Commands for adjacency searching differ among databases, so make sure you consult database guides.

In OvidSP databases (like Medline), searching for “physician ADJ3 relationship” will find both physician and relationship within two major words of each other, in any order. This finds more papers than "physician relationship".

Using this adjacency retrieves papers with phrases like "physician patient relationship", "patient physician relationship", "relationship of the physician to the patient" and so on.
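
The Python sketch below approximates this kind of proximity matching with a regular expression, purely to illustrate the idea of "within n words, in either order". Real ADJn operators also handle stop words, punctuation and phrase indexes in database-specific ways, so always rely on the database's own syntax rather than on this approximation.

```python
import re

def within_n_words(a: str, b: str, n: int, text: str) -> bool:
    """Very rough approximation of Ovid-style ADJn: are the two terms
    separated by at most n-1 other words, in either order? Real databases
    also treat stop words and punctuation in their own ways."""
    gap = rf"(?:\W+\w+){{0,{n - 1}}}\W+"
    pattern = rf"\b{a}\b{gap}\b{b}\b|\b{b}\b{gap}\b{a}\b"
    return re.search(pattern, text, re.IGNORECASE) is not None

print(within_n_words("physician", "relationship", 3,
                     "the physician patient relationship"))              # True
print(within_n_words("physician", "relationship", 3,
                     "relationship of the physician"))                   # True
print(within_n_words("physician", "relationship", 3,
                     "relationship between the busy junior physician"))  # False
```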

Searching with subject headings

Database subject headings are controlled vocabulary terms that a database uses to describe what an article is about.


Using appropriate subject headings enhances your search and will help you to find more results on your topic. This is because subject headings find articles according to their subject, even if the article does not use your chosen key words.

You should combine both subject headings and keywords in your search strategy for each of the concepts you identify. This is particularly important if you are undertaking a systematic review or an in-depth piece of work.

Subject headings may vary between databases, so you need to investigate each database separately to find the subject headings they use. For example, for Medline you can use MeSH (Medical Subject Headings) and for Embase you can use the EMTREE thesaurus.

SEARCH TIP: In Ovid databases, search for a known key paper by title, select the "complete reference" button to see which subject headings the database indexers have given that article, and consider adding relevant ones to your own search strategy.

Use Boolean logic to combine search terms

Boolean operators (AND, OR and NOT) allow you to try different combinations of search terms or subject headings.

Databases often show Boolean operators as buttons or drop-down menus that you can click to combine your search terms or results.

The main Boolean operators are:

OR is used to find articles that mention either of the topics you search for.

AND is used to find articles that mention both of the searched topics.

NOT excludes a search term or concept. It should be used with caution as you may inadvertently exclude relevant references.

For example, searching for “self-esteem NOT eating disorders” finds articles that mention self-esteem but removes any articles that mention eating disorders.

Citation searching

Citation searching is a method to find articles that have been cited by other publications.

Use citation searching (or cited reference searching) to:

  • find out whether articles have been cited by other authors
  • find more recent papers on the same or similar subject
  • discover how a known idea or innovation has been confirmed, applied, improved, extended, or corrected
  • help make your literature review more comprehensive.

You can use cited reference searching in:

  • OvidSP databases
  • Google Scholar
  • Web of Science

Cited reference searching can complement your literature search. However, be careful not to rely on citation searching in isolation: a robust literature search is also needed to limit publication bias.

Charles Sturt University

Literature Review: Developing a search strategy


From research question to search strategy

Keeping a record of your search activity

Good search practice could involve keeping a search diary or document detailing your search activities (Phelps et al. 2007, pp. 128-149), so that you can keep track of effective search terms, or to help others to reproduce your steps and get the same results.

This record could be a document, table or spreadsheet (see the sketch after this list) with:

  • The names of the sources you search and which provider you accessed them through - eg Medline (Ovid), Web of Science (Thomson Reuters). You should also include any other literature sources you used.
  • how you searched (keyword and/or subject headings)
  • which search terms you used (which words and phrases)
  • any search techniques you employed (truncation, adjacency, etc)
  • how you combined your search terms (AND/OR). Check out the Database Help guide for more tips on Boolean Searching.
  • The number of search results from each source and each strategy used. This can be the evidence you need to prove a gap in the literature, and confirms the importance of your research question.
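
As one way of putting this into practice, the short Python sketch below writes each search as a row in a CSV file. The column names, the example Ovid MEDLINE search line and the result count are all illustrative; record whatever fields your review team agrees on.

```python
import csv
from datetime import date

# Minimal sketch of a search log. Field names, the search string and the
# result count are illustrative examples only.
FIELDS = ["date", "database", "platform", "search_type", "search_string",
          "limits", "results"]

entries = [
    {
        "date": date.today().isoformat(),
        "database": "MEDLINE",
        "platform": "Ovid",
        "search_type": "keyword + subject heading",
        "search_string": '(exp Neoplasms/ OR cancer.ti,ab.) AND "cranberry juice".ti,ab.',
        "limits": "English; 2010-current",
        "results": 42,  # number of records retrieved (example value)
    },
]

with open("search_log.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(entries)
```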

A search planner may help you to organise your thoughts prior to conducting your search. If you have any problems with organising your thoughts before, during or after searching, please contact your Library Faculty Team for individual help.

  • Literature search - a librarian's handout to introduce tools, terms and techniques Created by Elsevier librarian, Katy Kavanagh Web, this document outlines tools, terms and techniques to think about when conducting a literature search.
  • Search planner

Literature search cycle


Diagram text description

The diagram illustrates the literature search cycle as a circle divided into quarters. Top left: identify main concepts (controlled vocabulary terms, synonyms, keywords and spelling). Top right: select library resources to search (library catalogue, relevant journal articles and other resources). Bottom right: search resources (consider using Boolean searching, proximity searching and truncated searching techniques). Bottom left: review and refine results (evaluate results, rethink keywords and create alerts).

Have a search framework

Search frameworks are mnemonics which can help you focus your research question. They are also useful in helping you to identify the concepts and terms you will use in your literature search.

PICO is a search framework commonly used in the health sciences to focus clinical questions. As an example, you work in an aged care facility and are interested in whether cranberry juice might help reduce the common occurrence of urinary tract infections. The PICO framework would look like this:

  • P (Population): people living in aged care facilities
  • I (Intervention): cranberry juice
  • C (Comparison): no cranberry juice
  • O (Outcome): occurrence of urinary tract infections

Now that the issue has been broken up into its elements, it is easier to turn it into an answerable research question: “Does cranberry juice help reduce urinary tract infections in people living in aged care facilities?”

Other frameworks may be helpful, depending on your question and your field of interest. PICO can be adapted to PICOT (which adds Time), PICOS (which adds Study design), or PICOC (which adds Context).

For qualitative questions you could use

  • SPIDER: Sample, Phenomenon of Interest, Design, Evaluation, Research type

For questions about causes or risk,

  • PEO: Population, Exposure, Outcomes

For evaluations of interventions or policies, 

  • SPICE: Setting, Population or Perspective, Intervention, Comparison, Evaluation, or
  • ECLIPSE: Expectation, Client group, Location, Impact, Professionals, Service

See the University of Notre Dame Australia’s examples of some of these frameworks. 

You can also try some PICO examples in the National Library of Medicine's PubMed training site: Using PICO to frame clinical questions.




  • Open access
  • Published: 14 August 2018

Defining the process to literature searching in systematic reviews: a literature review of guidance and supporting studies

  • Chris Cooper (ORCID: orcid.org/0000-0003-0864-5607)
  • Andrew Booth
  • Jo Varley-Campbell
  • Nicky Britten
  • Ruth Garside

BMC Medical Research Methodology, volume 18, Article number: 85 (2018)


Abstract

Background

Systematic literature searching is recognised as a critical component of the systematic review process. It involves a systematic search for studies and aims for a transparent report of study identification, leaving readers clear about what was done to identify studies, and how the findings of the review are situated in the relevant evidence.

Information specialists and review teams appear to work from a shared and tacit model of the literature search process. How this tacit model has developed and evolved is unclear, and it has not been explicitly examined before.

The purpose of this review is to determine if a shared model of the literature searching process can be detected across systematic review guidance documents and, if so, how this process is reported in the guidance and supported by published studies.

Method

A literature review.

Two types of literature were reviewed: guidance and published studies. Nine guidance documents were identified, including: The Cochrane and Campbell Handbooks. Published studies were identified through ‘pearl growing’, citation chasing, a search of PubMed using the systematic review methods filter, and the authors’ topic knowledge.

The relevant sections within each guidance document were then read and re-read, with the aim of determining key methodological stages. Methodological stages were identified and defined. This data was reviewed to identify agreements and areas of unique guidance between guidance documents. Consensus across multiple guidance documents was used to inform selection of ‘key stages’ in the process of literature searching.

Results

Eight key stages were determined relating specifically to literature searching in systematic reviews. They were: who should literature search, aims and purpose of literature searching, preparation, the search strategy, searching databases, supplementary searching, managing references and reporting the search process.

Conclusions

Eight key stages to the process of literature searching in systematic reviews were identified. These key stages are consistently reported in the nine guidance documents, suggesting consensus on the key stages of literature searching, and therefore the process of literature searching as a whole, in systematic reviews. Further research to determine the suitability of using the same process of literature searching for all types of systematic review is indicated.


Background

Systematic literature searching is recognised as a critical component of the systematic review process. It involves a systematic search for studies and aims for a transparent report of study identification, leaving review stakeholders clear about what was done to identify studies, and how the findings of the review are situated in the relevant evidence.

Information specialists and review teams appear to work from a shared and tacit model of the literature search process. How this tacit model has developed and evolved is unclear, and it has not been explicitly examined before. This is in contrast to the information science literature, which has developed information processing models as an explicit basis for dialogue and empirical testing. Without an explicit model, research in the process of systematic literature searching will remain immature and potentially uneven, and the development of shared information models will be assumed but never articulated.

One way of developing such a conceptual model is by formally examining the implicit “programme theory” as embodied in key methodological texts. The aim of this review is therefore to determine if a shared model of the literature searching process in systematic reviews can be detected across guidance documents and, if so, how this process is reported and supported.

Method

Identifying guidance

Key texts (henceforth referred to as “guidance”) were identified based upon their accessibility to, and prominence within, United Kingdom systematic reviewing practice. The United Kingdom occupies a prominent position in the science of health information retrieval, as quantified by such objective measures as the authorship of papers, the number of Cochrane groups based in the UK, membership and leadership of groups such as the Cochrane Information Retrieval Methods Group, the HTA-I Information Specialists’ Group and historic association with such centres as the UK Cochrane Centre, the NHS Centre for Reviews and Dissemination, the Centre for Evidence Based Medicine and the National Institute for Clinical Excellence (NICE). Coupled with the linguistic dominance of English within medical and health science and the science of systematic reviews more generally, this offers a justification for a purposive sample that favours UK, European and Australian guidance documents.

Nine guidance documents were identified. These documents provide guidance for different types of reviews, namely: reviews of interventions, reviews of health technologies, reviews of qualitative research studies, reviews of social science topics, and reviews to inform guidance.

Whilst these guidance documents occasionally offer additional guidance on other types of systematic reviews, we have focused on the core and stated aims of these documents as they relate to literature searching. Table  1 sets out: the guidance document, the version audited, their core stated focus, and a bibliographical pointer to the main guidance relating to literature searching.

Once a list of key guidance documents was determined, it was checked by six senior information professionals based in the UK for relevance to current literature searching in systematic reviews.

Identifying supporting studies

In addition to identifying guidance, the authors sought to populate an evidence base of supporting studies (henceforth referred to as “studies”) that contribute to existing search practice. Studies were first identified by the authors from their knowledge on this topic area and, subsequently, through systematic citation chasing key studies (‘pearls’ [ 1 ]) located within each key stage of the search process. These studies are identified in Additional file  1 : Appendix Table 1. Citation chasing was conducted by analysing the bibliography of references for each study (backwards citation chasing) and through Google Scholar (forward citation chasing). A search of PubMed using the systematic review methods filter was undertaken in August 2017 (see Additional file 1 ). The search terms used were: (literature search*[Title/Abstract]) AND sysrev_methods[sb] and 586 results were returned. These results were sifted for relevance to the key stages in Fig.  1 by CC.
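
For readers who want to see how such a documented search can be re-run, the Python sketch below submits the same query string to PubMed through the NCBI E-utilities and prints the record count. It is added here purely for illustration and is not part of the original study's methods; the count returned today will differ from the 586 results reported above, because PubMed and the systematic review methods filter are updated over time, and NCBI's usage guidelines (rate limits, optional API key) apply.

```python
import json
import urllib.parse
import urllib.request

# Sketch only: re-run the documented PubMed search via the NCBI E-utilities
# and report how many records it currently retrieves.
term = "(literature search*[Title/Abstract]) AND sysrev_methods[sb]"
url = ("https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?"
       + urllib.parse.urlencode({"db": "pubmed", "term": term,
                                 "retmode": "json", "retmax": 0}))

with urllib.request.urlopen(url) as response:
    result = json.load(response)

print(result["esearchresult"]["count"])  # number of matching records today
```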

Figure 1: The key stages of literature search guidance as identified from nine key texts

Extracting the data

To reveal the implicit process of literature searching within each guidance document, the relevant sections (chapters) on literature searching were read and re-read, with the aim of determining key methodological stages. We defined a key methodological stage as a distinct step in the overall process for which specific guidance is reported, and action is taken, that collectively would result in a completed literature search.

The chapter or section sub-heading for each methodological stage was extracted into a table using the exact language as reported in each guidance document. The lead author (CC) then read and re-read these data, and the paragraphs of the document to which the headings referred, summarising section details. This table was then reviewed, using comparison and contrast to identify agreements and areas of unique guidance. Consensus across multiple guidelines was used to inform selection of ‘key stages’ in the process of literature searching.

Having determined the key stages to literature searching, we then read and re-read the sections relating to literature searching again, extracting specific detail relating to the methodological process of literature searching within each key stage. Again, the guidance was then read and re-read, first on a document-by-document basis and, secondly, across all the documents above, to identify both commonalities and areas of unique guidance.

Results and discussion

Our findings.

We were able to identify consensus across the guidance on literature searching for systematic reviews suggesting a shared implicit model within the information retrieval community. Whilst the structure of the guidance varies between documents, the same key stages are reported, even where the core focus of each document is different. We were able to identify specific areas of unique guidance, where a document reported guidance not summarised in other documents, together with areas of consensus across guidance.

Unique guidance

Only one document provided guidance on the topic of when to stop searching [ 2 ]. This guidance from 2005 anticipates a topic of increasing importance with the current interest in time-limited (i.e. “rapid”) reviews. Quality assurance (or peer review) of literature searches was only covered in two guidance documents [ 3 , 4 ]. This topic has emerged as increasingly important as indicated by the development of the PRESS instrument [ 5 ]. Text mining was discussed in four guidance documents [ 4 , 6 , 7 , 8 ] where the automation of some manual review work may offer efficiencies in literature searching [ 8 ].

Agreement between guidance: Defining the key stages of literature searching

Where there was agreement on the process, we determined that this constituted a key stage in the process of literature searching to inform systematic reviews.

From the guidance, we determined eight key stages that relate specifically to literature searching in systematic reviews. These are summarised in Fig. 1. The data extraction table to inform Fig. 1 is reported in Table 2. Table 2 reports the areas of common agreement and demonstrates that the language used to describe key stages and processes varies significantly between guidance documents.

For each key stage, we set out the specific guidance, followed by discussion on how this guidance is situated within the wider literature.

Key stage one: Deciding who should undertake the literature search

The guidance.

Eight documents provided guidance on who should undertake literature searching in systematic reviews [ 2 , 4 , 6 , 7 , 8 , 9 , 10 , 11 ]. The guidance affirms that people with relevant expertise of literature searching should ‘ideally’ be included within the review team [ 6 ]. Information specialists (or information scientists), librarians or trial search co-ordinators (TSCs) are indicated as appropriate researchers in six guidance documents [ 2 , 7 , 8 , 9 , 10 , 11 ].

How the guidance corresponds to the published studies

The guidance is consistent with studies that call for the involvement of information specialists and librarians in systematic reviews [ 12 , 13 , 14 , 15 , 16 , 17 , 18 , 19 , 20 , 21 , 22 , 23 , 24 , 25 , 26 ] and which demonstrate how their training as ‘expert searchers’ and ‘analysers and organisers of data’ can be put to good use [ 13 ] in a variety of roles [ 12 , 16 , 20 , 21 , 24 , 25 , 26 ]. These arguments make sense in the context of the aims and purposes of literature searching in systematic reviews, explored below. The need for ‘thorough’ and ‘replicable’ literature searches was fundamental to the guidance and recurs in key stage two. Studies have found poor reporting, and a lack of replicable literature searches, to be a weakness in systematic reviews [ 17 , 18 , 27 , 28 ] and they argue that involvement of information specialists/ librarians would be associated with better reporting and better quality literature searching. Indeed, Meert et al. [ 29 ] demonstrated that involving a librarian as a co-author to a systematic review correlated with a higher score in the literature searching component of a systematic review [ 29 ]. As ‘new styles’ of rapid and scoping reviews emerge, where decisions on how to search are more iterative and creative, a clear role is made here too [ 30 ].

Knowing where to search for studies was noted as important in the guidance, with no agreement as to the appropriate number of databases to be searched [ 2 , 6 ]. Database (and resource selection more broadly) is acknowledged as a relevant key skill of information specialists and librarians [ 12 , 15 , 16 , 31 ].

Whilst arguments for including information specialists and librarians in the process of systematic review might be considered self-evident, Koffel and Rethlefsen [ 31 ] have questioned if the necessary involvement is actually happening [ 31 ].

Key stage two: Determining the aim and purpose of a literature search

The aim: Five of the nine guidance documents use adjectives such as ‘thorough’, ‘comprehensive’, ‘transparent’ and ‘reproducible’ to define the aim of literature searching [ 6 , 7 , 8 , 9 , 10 ]. Analogous phrases were present in a further three guidance documents, namely: ‘to identify the best available evidence’ [ 4 ] or ‘the aim of the literature search is not to retrieve everything. It is to retrieve everything of relevance’ [ 2 ] or ‘A systematic literature search aims to identify all publications relevant to the particular research question’ [ 3 ]. The Joanna Briggs Institute reviewers’ manual was the only guidance document where a clear statement on the aim of literature searching could not be identified. The purpose of literature searching was defined in three guidance documents, namely to minimise bias in the resultant review [ 6 , 8 , 10 ]. Accordingly, eight of nine documents clearly asserted that thorough and comprehensive literature searches are required as a potential mechanism for minimising bias.

The need for thorough and comprehensive literature searches appears as uniform within the eight guidance documents that describe approaches to literature searching in systematic reviews of effectiveness. Reviews of effectiveness (of intervention or cost), accuracy and prognosis, require thorough and comprehensive literature searches to transparently produce a reliable estimate of intervention effect. The belief that all relevant studies have been ‘comprehensively’ identified, and that this process has been ‘transparently’ reported, increases confidence in the estimate of effect and the conclusions that can be drawn [ 32 ]. The supporting literature exploring the need for comprehensive literature searches focuses almost exclusively on reviews of intervention effectiveness and meta-analysis. Different ‘styles’ of review may have different standards however; the alternative, offered by purposive sampling, has been suggested in the specific context of qualitative evidence syntheses [ 33 ].

What is a comprehensive literature search?

Whilst the guidance calls for thorough and comprehensive literature searches, it lacks clarity on what constitutes a thorough and comprehensive literature search, beyond the implication that all of the literature search methods in Table 2 should be used to identify studies. Egger et al. [ 34 ], in an empirical study evaluating the importance of comprehensive literature searches for trials in systematic reviews, defined a comprehensive search for trials as:

  • a search not restricted to English language;
  • where Cochrane CENTRAL or at least two other electronic databases had been searched (such as MEDLINE or EMBASE); and
  • at least one of the following search methods has been used to identify unpublished trials: searches for (i) conference abstracts, (ii) theses, (iii) trials registers; and (iv) contacts with experts in the field [ 34 ].

Tricco et al. (2008) used a similar threshold of bibliographic database searching AND a supplementary search method in a review when examining the risk of bias in systematic reviews. Their criteria were: one database (limited using the Cochrane Highly Sensitive Search Strategy (HSSS)) and handsearching [ 35 ].

Together with the guidance, this would suggest that comprehensive literature searching requires the use of BOTH bibliographic database searching AND supplementary search methods.

Comprehensiveness in literature searching, in the sense of how much searching should be undertaken, remains unclear. Egger et al. recommend that ‘investigators should consider the type of literature search and degree of comprehension that is appropriate for the review in question, taking into account budget and time constraints’ [ 34 ]. This view tallies with the Cochrane Handbook, which stipulates clearly, that study identification should be undertaken ‘within resource limits’ [ 9 ]. This would suggest that the limitations to comprehension are recognised but it raises questions on how this is decided and reported [ 36 ].

What is the point of comprehensive literature searching?

The purpose of thorough and comprehensive literature searches is to avoid missing key studies and to minimize bias [ 6 , 8 , 10 , 34 , 37 , 38 , 39 ] since a systematic review based only on published (or easily accessible) studies may have an exaggerated effect size [ 35 ]. Felson (1992) sets out potential biases that could affect the estimate of effect in a meta-analysis [ 40 ] and Tricco et al. summarize the evidence concerning bias and confounding in systematic reviews [ 35 ]. Egger et al. point to non-publication of studies, publication bias, language bias and MEDLINE bias as key biases [ 34 , 35 , 40 , 41 , 42 , 43 , 44 , 45 , 46 ]. Comprehensive searches are not the sole factor to mitigate these biases but their contribution is thought to be significant [ 2 , 32 , 34 ]. Fehrmann (2011) suggests that ‘the search process being described in detail’, where standard comprehensive search techniques have been applied, increases confidence in the search results [ 32 ].

Does comprehensive literature searching work?

Egger et al., and other study authors, have demonstrated a change in the estimate of intervention effectiveness where relevant studies were excluded from meta-analysis [ 34 , 47 ]. This would suggest that missing studies in literature searching alters the reliability of effectiveness estimates. This is an argument for comprehensive literature searching. Conversely, Egger et al. found that ‘comprehensive’ searches still missed studies and that comprehensive searches could, in fact, introduce bias into a review rather than preventing it, through the identification of low quality studies then being included in the meta-analysis [ 34 ]. Studies query if identifying and including low quality or grey literature studies changes the estimate of effect [ 43 , 48 ] and question if time is better invested updating systematic reviews rather than searching for unpublished studies [ 49 ], or mapping studies for review as opposed to aiming for high sensitivity in literature searching [ 50 ].

Aim and purpose beyond reviews of effectiveness

The need for comprehensive literature searches is less certain in reviews of qualitative studies, and for reviews where a comprehensive identification of studies is difficult to achieve (for example, in public health) [ 33 , 51 , 52 , 53 , 54 , 55 ]. Literature searching for qualitative studies, and in public health topics, typically generates a greater number of studies to sift than in reviews of effectiveness [ 39 ] and demonstrating the ‘value’ of studies identified or missed is harder [ 56 ], since the study data do not typically support meta-analysis. Nussbaumer-Streit et al. (2016) have registered a review protocol to assess whether abbreviated literature searches (as opposed to comprehensive literature searches) have an impact on conclusions across multiple bodies of evidence, not only on effect estimates [ 57 ], which may develop this understanding. It may be that decision makers and users of systematic reviews are willing to trade the certainty from a comprehensive literature search and systematic review in exchange for different approaches to evidence synthesis [ 58 ], and that comprehensive literature searches are not necessarily a marker of literature search quality, as previously thought [ 36 ]. Different approaches to literature searching [ 37 , 38 , 59 , 60 , 61 , 62 ] and developing the concept of when to stop searching are important areas for further study [ 36 , 59 ].

The study by Nussbaumer-Streit et al. has been published since the submission of this literature review [ 63 ]. Nussbaumer-Streit et al. (2018) conclude that abbreviated literature searches are viable options for rapid evidence syntheses, if decision-makers are willing to trade the certainty from a comprehensive literature search and systematic review, but that decision-making which demands detailed scrutiny should still be based on comprehensive literature searches [ 63 ].

Key stage three: Preparing for the literature search

Six documents provided guidance on preparing for a literature search [ 2 , 3 , 6 , 7 , 9 , 10 ]. The Cochrane Handbook clearly stated that Cochrane authors (i.e. researchers) should seek advice from a trial search co-ordinator (i.e. a person with specific skills in literature searching) ‘before’ starting a literature search [ 9 ].

Two key tasks were perceptible in preparing for a literature search [ 2 , 6 , 7 , 10 , 11 ]. First, to determine if there are any existing or on-going reviews, or if a new review is justified [ 6 , 11 ]; and, secondly, to develop an initial literature search strategy to estimate the volume of relevant literature (and quality of a small sample of relevant studies [ 10 ]) and indicate the resources required for literature searching and the review of the studies that follows [ 7 , 10 ].

Three documents summarised guidance on where to search to determine if a new review was justified [ 2 , 6 , 11 ]. These focused on searching databases of systematic reviews (The Cochrane Database of Systematic Reviews (CDSR) and the Database of Abstracts of Reviews of Effects (DARE)), institutional registries (including PROSPERO), and MEDLINE [ 6 , 11 ]. It is worth noting, however, that as of 2015, DARE (and NHS EEDs) are no longer being updated and so the relevance of these resources will diminish over time [ 64 ]. One guidance document, ‘Systematic reviews in the Social Sciences’, noted, however, that databases are not the only source of information and unpublished reports, conference proceedings and grey literature may also be required, depending on the nature of the review question [ 2 ].

Two documents reported clearly that this preparation (or ‘scoping’) exercise should be undertaken before the actual search strategy is developed [ 7 , 10 ].

The guidance offers the best available source on preparing the literature search with the published studies not typically reporting how their scoping informed the development of their search strategies nor how their search approaches were developed. Text mining has been proposed as a technique to develop search strategies in the scoping stages of a review although this work is still exploratory [ 65 ]. ‘Clustering documents’ and word frequency analysis have also been tested to identify search terms and studies for review [ 66 , 67 ]. Preparing for literature searches and scoping constitutes an area for future research.
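
As a simple illustration of the word-frequency idea mentioned above, the Python sketch below counts the most frequent words across a handful of known relevant titles. The sample titles and the stop-word list are invented for illustration; in practice this would be run over the titles and abstracts of records identified during scoping.

```python
import re
from collections import Counter

# Minimal sketch of word-frequency analysis over a few known relevant
# records. The sample texts and the stop-word list are illustrative only.
sample_records = [
    "Cranberry juice for the prevention of urinary tract infections in older adults",
    "Urinary tract infection prophylaxis with cranberry products in aged care",
    "A trial of cranberry capsules to prevent urinary tract infections",
]

stop_words = {"the", "of", "in", "a", "for", "to", "with", "and"}

counts = Counter(
    word
    for record in sample_records
    for word in re.findall(r"[a-z]+", record.lower())
    if word not in stop_words
)

print(counts.most_common(8))
# Frequent words (cranberry, urinary, tract, infections, ...) become
# candidate search terms to test in the draft strategy.
```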

Key stage four: Designing the search strategy

The Population, Intervention, Comparator, Outcome (PICO) structure was the commonly reported structure promoted to design a literature search strategy. Five documents suggested that the eligibility criteria or review question will determine which concepts of PICO will be populated to develop the search strategy [ 1 , 4 , 7 , 8 , 9 ]. The NICE handbook promoted multiple structures, namely PICO, SPICE (Setting, Perspective, Intervention, Comparison, Evaluation) and multi-stranded approaches [ 4 ].

With the exclusion of The Joanna Briggs Institute reviewers’ manual, the guidance offered detail on selecting key search terms, synonyms, Boolean language, selecting database indexing terms and combining search terms. The CEE handbook suggested that ‘search terms may be compiled with the help of the commissioning organisation and stakeholders’ [ 10 ].

The use of limits, such as language or date limits, was discussed in all documents [ 2 , 3 , 4 , 6 , 7 , 8 , 9 , 10 , 11 ].

Search strategy structure

The guidance typically relates to reviews of intervention effectiveness so PICO – with its focus on intervention and comparator - is the dominant model used to structure literature search strategies [ 68 ]. PICOs – where the S denotes study design - is also commonly used in effectiveness reviews [ 6 , 68 ]. As the NICE handbook notes, alternative models to structure literature search strategies have been developed and tested. Booth provides an overview on formulating questions for evidence based practice [ 69 ] and has developed a number of alternatives to the PICO structure, namely: BeHEMoTh (Behaviour of interest; Health context; Exclusions; Models or Theories) for use when systematically identifying theory [ 55 ]; SPICE (Setting, Perspective, Intervention, Comparison, Evaluation) for identification of social science and evaluation studies [ 69 ] and, working with Cooke and colleagues, SPIDER (Sample, Phenomenon of Interest, Design, Evaluation, Research type) [ 70 ]. SPIDER has been compared to PICO and PICOs in a study by Methley et al. [ 68 ].

The NICE handbook also suggests the use of multi-stranded approaches to developing literature search strategies [ 4 ]. Glanville developed this idea in a study by Whitting et al. [ 71 ] and a worked example of this approach is included in the development of a search filter by Cooper et al. [ 72 ].

Writing search strategies: Conceptual and objective approaches

Hausner et al. [ 73 ] provide guidance on writing literature search strategies, delineating between conceptually and objectively derived approaches. The conceptual approach, advocated by and explained in the guidance documents, relies on the expertise of the literature searcher to identify key search terms and then develop key terms to include synonyms and controlled syntax. Hausner and colleagues set out the objective approach [ 73 ] and describe what may be done to validate it [ 74 ].

The use of limits

The guidance documents offer direction on the use of limits within a literature search. Limits can be used to focus literature searching to specific study designs or by other markers (such as by date) which limits the number of studies returned by a literature search. The use of limits should be described and the implications explored [ 34 ] since limiting literature searching can introduce bias (explored above). Craven et al. have suggested the use of a supporting narrative to explain decisions made in the process of developing literature searches and this advice would usefully capture decisions on the use of search limits [ 75 ].

Key stage five: Determining the process of literature searching and deciding where to search (bibliographic database searching)

Table 2 summarises the process of literature searching as reported in each guidance document. Searching bibliographic databases was consistently reported as the ‘first step’ to literature searching in all nine guidance documents.

Three documents reported specific guidance on where to search, in each case specific to the type of review their guidance informed, and as a minimum requirement [ 4 , 9 , 11 ]. Seven of the key guidance documents suggest that the selection of bibliographic databases depends on the topic of review [ 2 , 3 , 4 , 6 , 7 , 8 , 10 ], with two documents noting the absence of an agreed standard on what constitutes an acceptable number of databases searched [ 2 , 6 ].

The guidance documents summarise ‘how to’ search bibliographic databases in detail and this guidance is further contextualised above in terms of developing the search strategy. The documents provide guidance on selecting bibliographic databases, in some cases stating acceptable minima (i.e. The Cochrane Handbook states Cochrane CENTRAL, MEDLINE and EMBASE), and in other cases simply listing the bibliographic databases available to search. Studies have explored the value in searching specific bibliographic databases, with Wright et al. (2015) noting the contribution of CINAHL in identifying qualitative studies [ 76 ], Beckles et al. (2013) questioning the contribution of CINAHL to identifying clinical studies for guideline development [ 77 ], and Cooper et al. (2015) exploring the role of UK-focused bibliographic databases to identify UK-relevant studies [ 78 ]. The host of the database (e.g. OVID or ProQuest) has been shown to alter the search returns offered. Younger and Boddy [ 79 ] report differing search returns from the same database (AMED) but where the ‘host’ was different [ 79 ].

The average number of bibliographic databases searched in systematic reviews has risen in the period 1994–2014 (from 1 to 4) [ 80 ] but there remains (as attested to by the guidance) no consensus on what constitutes an acceptable number of databases searched [ 48 ]. This is perhaps because thinking about the number of databases searched is the wrong question; researchers should be focused on which databases were searched and why, and which databases were not searched and why. The discussion should re-orientate to the differential value of sources but researchers need to think about how to report this in studies to allow findings to be generalised. Bethel (2017) has proposed ‘search summaries’, completed by the literature searcher, to record where included studies were identified, whether from databases (and which databases specifically) or supplementary search methods [ 81 ]. Search summaries document both the yield and accuracy of searches, which could prospectively inform resource use and decisions to search or not to search specific databases in topic areas. The prospective use of such data presupposes, however, that past searches are a potential predictor of future search performance (i.e. that each topic is to be considered representative and not unique). In offering a body of practice, this data would be of greater practicable use than current studies, which are considered as little more than individual case studies [ 82 , 83 , 84 , 85 , 86 , 87 , 88 , 89 , 90 ].

When to database search is another question posed in the literature. Beyer et al. [ 91 ] report that databases can be prioritised for literature searching which, whilst not addressing the question of which databases to search, may at least bring clarity as to which databases to search first [ 91 ]. Paradoxically, this links to studies that suggest PubMed should be searched in addition to MEDLINE (OVID interface) since this improves the currency of systematic reviews [ 92 , 93 ]. Cooper et al. (2017) have tested the idea of database searching not as a primary search method (as suggested in the guidance) but as a supplementary search method in order to manage the volume of studies identified for an environmental effectiveness systematic review. Their case study compared the effectiveness of database searching versus a protocol using supplementary search methods and found that the latter identified more relevant studies for review than searching bibliographic databases [ 94 ].

Key stage six: Determining the process of literature searching and deciding where to search (supplementary search methods)

Table 2 also summarises the process of literature searching which follows bibliographic database searching. As Table 2 sets out, guidance that supplementary literature search methods should be used in systematic reviews recurs across documents, but the order in which these methods are used, and the extent to which they are used, varies. We noted inconsistency in the labelling of supplementary search methods between guidance documents.

Rather than focus on the guidance on how to use the methods (which has been summarised in a recent review [ 95 ]), we focus on the aim or purpose of supplementary search methods.

The Cochrane Handbook reported that ‘efforts’ to identify unpublished studies should be made [ 9 ]. Four guidance documents [ 2 , 3 , 6 , 9 ] acknowledged that searching beyond bibliographic databases was necessary since ‘databases are not the only source of literature’ [ 2 ]. Only one document reported any guidance on determining when to use supplementary methods. The IQWiG handbook reported that the use of handsearching (in their example) could be determined on a ‘case-by-case basis’ which implies that the use of these methods is optional rather than mandatory. This is in contrast to the guidance (above) on bibliographic database searching.

The issue for supplementary search methods is similar in many ways to the issue of searching bibliographic databases: demonstrating value. The purpose and contribution of supplementary search methods in systematic reviews is increasingly acknowledged [ 37 , 61 , 62 , 96 , 97 , 98 , 99 , 100 , 101 ] but understanding the value of the search methods to identify studies and data is unclear. In a recently published review, Cooper et al. (2017) reviewed the literature on supplementary search methods looking to determine the advantages, disadvantages and resource implications of using supplementary search methods [ 95 ]. This review also summarises the key guidance and empirical studies and seeks to address the question on when to use these search methods and when not to [ 95 ]. The guidance is limited in this regard and, as Table 2 demonstrates, offers conflicting advice on the order of searching, and the extent to which these search methods should be used in systematic reviews.

Key stage seven: Managing the references

Five of the documents provided guidance on managing references, for example downloading, de-duplicating and managing the output of literature searches [ 2 , 4 , 6 , 8 , 10 ]. This guidance typically itemised available bibliographic management tools rather than offering guidance on how to use them specifically [ 2 , 4 , 6 , 8 ]. The CEE handbook provided guidance on importing data where no direct export option is available (e.g. web-searching) [ 10 ].

The literature on using bibliographic management tools is not large relative to the number of ‘how to’ videos on platforms such as YouTube (see for example [ 102 ]). These YouTube videos confirm the overall lack of ‘how to’ guidance identified in this study and offer useful instruction on managing references. Bramer et al. set out methods for de-duplicating data and reviewing references in Endnote [ 103 , 104 ] and Gall tests the direct search function within Endnote to access databases such as PubMed, finding a number of limitations [ 105 ]. Coar et al. and Ahmed et al. consider the role of the free-source tool, Zotero [ 106 , 107 ]. Managing references is a key administrative function in the process of review particularly for documenting searches in PRISMA guidance.
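
The logic behind de-duplication is straightforward, even though it is normally handled inside a reference manager such as EndNote or Zotero. The Python sketch below illustrates one common approach, keying each record on its DOI where present and otherwise on a normalised title; the sample records are invented, and real de-duplication needs fuzzier matching than this.

```python
import re

# Minimal illustration of de-duplication logic (normally done in a reference
# manager): key each record on its DOI if present, otherwise on a normalised
# title. Sample records are invented for illustration.
records = [
    {"title": "Cranberry juice and urinary tract infections", "doi": "10.1000/xyz123"},
    {"title": "Cranberry Juice and Urinary Tract Infections.", "doi": "10.1000/XYZ123"},
    {"title": "A different study entirely", "doi": None},
]

def dedupe_key(record):
    if record.get("doi"):
        return ("doi", record["doi"].lower())
    normalised = re.sub(r"[^a-z0-9]+", " ", record["title"].lower()).strip()
    return ("title", normalised)

unique = {dedupe_key(r): r for r in records}.values()
print(len(records), "records,", len(unique), "after de-duplication")
# 3 records, 2 after de-duplication
```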

Key stage eight: Documenting the search

The Cochrane Handbook was the only guidance document to recommend a specific reporting guideline: Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) [ 9 ]. Six documents provided guidance on reporting the process of literature searching with specific criteria to report [ 3 , 4 , 6 , 8 , 9 , 10 ]. There was consensus on reporting: the databases searched (and the host searched by), the search strategies used, and any use of limits (e.g. date, language, search filters (The CRD handbook called for these limits to be justified [ 6 ])). Three guidance documents reported that the number of studies identified should be recorded [ 3 , 6 , 10 ]. The number of duplicates identified [ 10 ], the screening decisions [ 3 ], a comprehensive list of grey literature sources searched (and full detail for other supplementary search methods) [ 8 ], and an annotation of search terms tested but not used [ 4 ] were identified as unique items in four documents.

The Cochrane Handbook was the only guidance document to note that the full search strategies for each database should be included in the Additional file 1 of the review [ 9 ].

All guidance documents should ultimately deliver completed systematic reviews that fulfil the requirements of the PRISMA reporting guidelines [ 108 ]. The guidance broadly requires the reporting of data that corresponds with the requirements of the PRISMA statement although documents typically ask for diverse and additional items [ 108 ]. In 2008, Sampson et al. observed a lack of consensus on reporting search methods in systematic reviews [ 109 ] and this remains the case as of 2017, as evidenced in the guidance documents, and in spite of the publication of the PRISMA guidelines in 2009 [ 110 ]. It is unclear why the collective guidance does not more explicitly endorse adherence to the PRISMA guidance.

Reporting of literature searching is a key area in systematic reviews since it sets out clearly what was done and how the conclusions of the review can be believed [ 52 , 109 ]. Despite strong endorsement in the guidance documents, specifically supported in PRISMA guidance, and other related reporting standards too (such as ENTREQ for qualitative evidence synthesis, STROBE for reviews of observational studies), authors still highlight the prevalence of poor standards of literature search reporting [ 31 , 110 , 111 , 112 , 113 , 114 , 115 , 116 , 117 , 118 , 119 ]. To explore issues experienced by authors in reporting literature searches, and look at uptake of PRISMA, Radar et al. [ 120 ] surveyed over 260 review authors to determine common problems, and their work summarises the practical aspects of reporting literature searching [ 120 ]. Atkinson et al. [ 121 ] have also analysed reporting standards for literature searching, summarising recommendations and gaps for reporting search strategies [ 121 ].

One area that is less well covered by the guidance, but which nevertheless appears in this literature, is the quality appraisal, or peer review, of literature search strategies. The PRESS checklist is the most prominent example; it was developed to provide evidence-based guidance for the peer review of electronic search strategies [ 5 , 122 , 123 ]. A corresponding guideline for the documentation of supplementary search methods does not yet exist, although this idea is currently being explored.

How the reporting of the literature searching process corresponds to critical appraisal tools is an area for further research. In the survey undertaken by Rader et al. (2014), 86% of respondents (153/178) identified a need for further guidance on which aspects of the literature search process to report [ 120 ]. The PRISMA statement offers a brief summary of what to report but little practical guidance on how to report it [ 108 ]. Critical appraisal tools for systematic reviews, such as AMSTAR 2 (Shea et al. [ 124 ]) and ROBIS (Whiting et al. [ 125 ]), can usefully be read alongside PRISMA guidance, since they offer greater detail on how the reporting of the literature search will be appraised and therefore act as a proxy for what to report [ 124 , 125 ]. A study comparing PRISMA with quality appraisal checklists for systematic reviews would begin to address the call, identified by Rader et al., for further guidance on what to report [ 120 ].

Limitations

Other handbooks exist

A potential limitation of this literature review is its focus on guidance produced in Europe (specifically the UK) and Australia. We justify our selection of the nine guidance documents reviewed here in the section “Identifying guidance”. In brief, these nine documents were selected as the health care guidance most relevant to UK systematic reviewing practice, given that the UK occupies a prominent position in the science of health information retrieval. We acknowledge the existence of other guidance documents, such as those from North America (e.g. the Agency for Healthcare Research and Quality (AHRQ) [ 126 ], the Institute of Medicine [ 127 ], and the guidance and resources produced by the Canadian Agency for Drugs and Technologies in Health (CADTH) [ 128 ]). We comment further on this directly below.

The handbooks are potentially linked to one another

What is not clear is the extent to which the guidance documents inter-relate or provide guidance uniquely. The Cochrane Handbook, first published in 1994, is notably a key source of reference in guidance and systematic reviews beyond Cochrane reviews. It is not clear to what extent broadening the sample of guidance handbooks to include North American handbooks, and guidance handbooks from other relevant countries too, would alter the findings of this literature review or develop further support for the process model. Since we cannot be clear, we raise this as a potential limitation of this literature review. On our initial review of a sample of North American, and other, guidance documents (before selecting the guidance documents considered in this review), however, we do not consider that the inclusion of these further handbooks would alter significantly the findings of this literature review.

This is a literature review

A further limitation of this review was that the review of published studies is not a systematic review of the evidence for each key stage. It is possible that other relevant studies could help contribute to the exploration and development of the key stages identified in this review.

Conclusions

This literature review would appear to demonstrate the existence of a shared model of the literature searching process in systematic reviews. We call this model ‘the conventional approach’, since it appears to be common convention in nine different guidance documents.

The findings reported above reveal eight key stages in the process of literature searching for systematic reviews. These key stages are consistently reported in the nine guidance documents which suggests consensus on the key stages of literature searching, and therefore the process of literature searching as a whole, in systematic reviews.

In Table 2 , we demonstrate consensus regarding the application of literature search methods. All guidance documents distinguish between primary and supplementary search methods. Bibliographic database searching is consistently the first method of literature searching referenced in each guidance document. Whilst the guidance uniformly supports the use of supplementary search methods, there is little evidence for a consistent process with diverse guidance across documents. This may reflect differences in the core focus across each document, linked to differences in identifying effectiveness studies or qualitative studies, for instance.

Eight of the nine guidance documents reported on the aims of literature searching. The shared understanding was that literature searching should be thorough and comprehensive in its aim and that this process should be reported transparently so that it could be reproduced. Whilst only three documents explicitly link this understanding to minimising bias, comprehensive literature searching is implicitly linked to ‘not missing relevant studies’, which amounts to much the same point.

Defining the key stages in this review helps categorise the scholarship available, and it prioritises areas for development or further study. The supporting studies on preparing for literature searching (key stage three, ‘preparation’) were, for example, comparatively few, and yet this key stage represents a decisive moment in literature searching for systematic reviews. It is where the structure of the search strategy is determined, search terms are chosen or discarded, and the resources to be searched are selected. Information specialists, librarians and researchers are well placed to develop these and other areas within the key stages we identify.

This review calls for further research to determine the suitability of using the conventional approach. The publication dates of the guidance documents which underpin the conventional approach may raise questions as to whether the process which they each report remains valid for current systematic literature searching. In addition, it may be useful to test whether it is desirable to use the same process model of literature searching for qualitative evidence synthesis as that for reviews of intervention effectiveness, which this literature review demonstrates is presently recommended best practice.

Abbreviations

BeHEMoTh: Behaviour of interest; Health context; Exclusions; Models or Theories

CDSR: Cochrane Database of Systematic Reviews

CENTRAL: The Cochrane Central Register of Controlled Trials

DARE: Database of Abstracts of Reviews of Effects

ENTREQ: Enhancing transparency in reporting the synthesis of qualitative research

IQWiG: Institute for Quality and Efficiency in Healthcare

NICE: National Institute for Clinical Excellence

PICO: Population, Intervention, Comparator, Outcome

PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses

SPICE: Setting, Perspective, Intervention, Comparison, Evaluation

SPIDER: Sample, Phenomenon of Interest, Design, Evaluation, Research type

STROBE: STrengthening the Reporting of OBservational studies in Epidemiology

TSC: Trial Search Co-ordinators

Booth A. Unpacking your literature search toolbox: on search styles and tactics. Health Information & Libraries Journal. 2008;25(4):313–7.


Petticrew M, Roberts H. Systematic reviews in the social sciences: a practical guide. Oxford: Blackwell Publishing Ltd; 2006.


Institute for Quality and Efficiency in Health Care (IQWiG). IQWiG Methods Resources: 7. Information retrieval. 2014. Available from: https://www.ncbi.nlm.nih.gov/books/NBK385787/ .

NICE: National Institute for Health and Care Excellence. Developing NICE guidelines: the manual 2014. Available from: https://www.nice.org.uk/media/default/about/what-we-do/our-programmes/developing-nice-guidelines-the-manual.pdf .

Sampson M, McGowan J, Lefebvre C, Moher D, Grimshaw J. Peer Review of Electronic Search Strategies: PRESS; 2008.


Centre for Reviews & Dissemination. Systematic reviews – CRD’s guidance for undertaking reviews in healthcare. York: Centre for Reviews and Dissemination, University of York; 2009.

EUnetHTA: European Network for Health Technology Assessment. Process of information retrieval for systematic reviews and health technology assessments on clinical effectiveness. 2016. Available from: http://www.eunethta.eu/sites/default/files/Guideline_Information_Retrieval_V1-1.pdf .

Kugley S, Wade A, Thomas J, Mahood Q, Jørgensen AMK, Hammerstrøm K, Sathe N. Searching for studies: a guide to information retrieval for Campbell systematic reviews. Oslo: Campbell Collaboration; 2017. Available from: https://www.campbellcollaboration.org/library/searching-for-studies-information-retrieval-guide-campbell-reviews.html

Lefebvre C, Manheimer E, Glanville J. Chapter 6: Searching for studies. In: Higgins JPT, Green S, editors. Cochrane Handbook for Systematic Reviews of Interventions; 2011.

Collaboration for Environmental Evidence. Guidelines for systematic review and evidence synthesis in environmental management. Environmental Evidence; 2013. Available from: http://www.environmentalevidence.org/wp-content/uploads/2017/01/Review-guidelines-version-4.2-final-update.pdf .

The Joanna Briggs Institute. Joanna Briggs Institute reviewers’ manual: 2014 edition. The Joanna Briggs Institute; 2014. Available from: https://joannabriggs.org/assets/docs/sumari/ReviewersManual-2014.pdf

Beverley CA, Booth A, Bath PA. The role of the information specialist in the systematic review process: a health information case study. Health Inf Libr J. 2003;20(2):65–74.


Harris MR. The librarian's roles in the systematic review process: a case study. Journal of the Medical Library Association. 2005;93(1):81–7.


Egger JB. Use of recommended search strategies in systematic reviews and the impact of librarian involvement: a cross-sectional survey of recent authors. PLoS One. 2015;10(5):e0125931.

Li L, Tian J, Tian H, Moher D, Liang F, Jiang T, et al. Network meta-analyses could be improved by searching more sources and by involving a librarian. J Clin Epidemiol. 2014;67(9):1001–7.


McGowan J, Sampson M. Systematic reviews need systematic searchers. J Med Libr Assoc. 2005;93(1):74–80.

Rethlefsen ML, Farrell AM, Osterhaus Trzasko LC, Brigham TJ. Librarian co-authors correlated with higher quality reported search strategies in general internal medicine systematic reviews. J Clin Epidemiol. 2015;68(6):617–26.

Weller AC. Mounting evidence that librarians are essential for comprehensive literature searches for meta-analyses and Cochrane reports. J Med Libr Assoc. 2004;92(2):163–4.

Swinkels A, Briddon J, Hall J. Two physiotherapists, one librarian and a systematic literature review: collaboration in action. Health Info Libr J. 2006;23(4):248–56.

Foster M. An overview of the role of librarians in systematic reviews: from expert search to project manager. EAHIL. 2015;11(3):3–7.

Lawson L. Operating outside library walls. 2004.

Vassar M, Yerokhin V, Sinnett PM, Weiher M, Muckelrath H, Carr B, et al. Database selection in systematic reviews: an insight through clinical neurology. Health Inf Libr J. 2017;34(2):156–64.

Townsend WA, Anderson PF, Ginier EC, MacEachern MP, Saylor KM, Shipman BL, et al. A competency framework for librarians involved in systematic reviews. Journal of the Medical Library Association : JMLA. 2017;105(3):268–75.

Cooper ID, Crum JA. New activities and changing roles of health sciences librarians: a systematic review, 1990-2012. Journal of the Medical Library Association : JMLA. 2013;101(4):268–77.

Crum JA, Cooper ID. Emerging roles for biomedical librarians: a survey of current practice, challenges, and changes. Journal of the Medical Library Association : JMLA. 2013;101(4):278–86.

Dudden RF, Protzko SL. The systematic review team: contributions of the health sciences librarian. Med Ref Serv Q. 2011;30(3):301–15.

Golder S, Loke Y, McIntosh HM. Poor reporting and inadequate searches were apparent in systematic reviews of adverse effects. J Clin Epidemiol. 2008;61(5):440–8.

Maggio LA, Tannery NH, Kanter SL. Reproducibility of literature search reporting in medical education reviews. Academic medicine : journal of the Association of American Medical Colleges. 2011;86(8):1049–54.

Meert D, Torabi N, Costella J. Impact of librarians on reporting of the literature searching component of pediatric systematic reviews. Journal of the Medical Library Association : JMLA. 2016;104(4):267–77.

Morris M, Boruff JT, Gore GC. Scoping reviews: establishing the role of the librarian. Journal of the Medical Library Association : JMLA. 2016;104(4):346–54.

Koffel JB, Rethlefsen ML. Reproducibility of search strategies is poor in systematic reviews published in high-impact pediatrics, cardiology and surgery journals: a cross-sectional study. PLoS One. 2016;11(9):e0163309.


Fehrmann P, Thomas J. Comprehensive computer searches and reporting in systematic reviews. Research Synthesis Methods. 2011;2(1):15–32.

Booth A. Searching for qualitative research for inclusion in systematic reviews: a structured methodological review. Systematic Reviews. 2016;5(1):74.


Egger M, Juni P, Bartlett C, Holenstein F, Sterne J. How important are comprehensive literature searches and the assessment of trial quality in systematic reviews? Empirical study. Health technology assessment (Winchester, England). 2003;7(1):1–76.

Tricco AC, Tetzlaff J, Sampson M, Fergusson D, Cogo E, Horsley T, et al. Few systematic reviews exist documenting the extent of bias: a systematic review. J Clin Epidemiol. 2008;61(5):422–34.

Booth A. How much searching is enough? Comprehensive versus optimal retrieval for technology assessments. Int J Technol Assess Health Care. 2010;26(4):431–5.

Papaioannou D, Sutton A, Carroll C, Booth A, Wong R. Literature searching for social science systematic reviews: consideration of a range of search techniques. Health Inf Libr J. 2010;27(2):114–22.

Petticrew M. Time to rethink the systematic review catechism? Moving from ‘what works’ to ‘what happens’. Systematic Reviews. 2015;4(1):36.

Betrán AP, Say L, Gülmezoglu AM, Allen T, Hampson L. Effectiveness of different databases in identifying studies for systematic reviews: experience from the WHO systematic review of maternal morbidity and mortality. BMC Med Res Methodol. 2005;5

Felson DT. Bias in meta-analytic research. J Clin Epidemiol. 1992;45(8):885–92.


Franco A, Malhotra N, Simonovits G. Publication bias in the social sciences: unlocking the file drawer. Science. 2014;345(6203):1502–5.

Hartling L, Featherstone R, Nuspl M, Shave K, Dryden DM, Vandermeer B. Grey literature in systematic reviews: a cross-sectional study of the contribution of non-English reports, unpublished studies and dissertations to the results of meta-analyses in child-relevant reviews. BMC Med Res Methodol. 2017;17(1):64.

Schmucker CM, Blümle A, Schell LK, Schwarzer G, Oeller P, Cabrera L, et al. Systematic review finds that study data not published in full text articles have unclear impact on meta-analyses results in medical research. PLoS One. 2017;12(4):e0176210.

Egger M, Zellweger-Zahner T, Schneider M, Junker C, Lengeler C, Antes G. Language bias in randomised controlled trials published in English and German. Lancet (London, England). 1997;350(9074):326–9.

Moher D, Pham B, Lawson ML, Klassen TP. The inclusion of reports of randomised trials published in languages other than English in systematic reviews. Health technology assessment (Winchester, England). 2003;7(41):1–90.

Pham B, Klassen TP, Lawson ML, Moher D. Language of publication restrictions in systematic reviews gave different results depending on whether the intervention was conventional or complementary. J Clin Epidemiol. 2005;58(8):769–76.

Mills EJ, Kanters S, Thorlund K, Chaimani A, Veroniki A-A, Ioannidis JPA. The effects of excluding treatments from network meta-analyses: survey. BMJ : British Medical Journal. 2013;347

Hartling L, Featherstone R, Nuspl M, Shave K, Dryden DM, Vandermeer B. The contribution of databases to the results of systematic reviews: a cross-sectional study. BMC Med Res Methodol. 2016;16(1):127.

van Driel ML, De Sutter A, De Maeseneer J, Christiaens T. Searching for unpublished trials in Cochrane reviews may not be worth the effort. J Clin Epidemiol. 2009;62(8):838–44.e3.

Buchberger B, Krabbe L, Lux B, Mattivi JT. Evidence mapping for decision making: feasibility versus accuracy - when to abandon high sensitivity in electronic searches. German medical science : GMS e-journal. 2016;14:Doc09.

Lorenc T, Pearson M, Jamal F, Cooper C, Garside R. The role of systematic reviews of qualitative evidence in evaluating interventions: a case study. Research Synthesis Methods. 2012;3(1):1–10.

Gough D. Weight of evidence: a framework for the appraisal of the quality and relevance of evidence. Res Pap Educ. 2007;22(2):213–28.

Barroso J, Gollop CJ, Sandelowski M, Meynell J, Pearce PF, Collins LJ. The challenges of searching for and retrieving qualitative studies. West J Nurs Res. 2003;25(2):153–78.

Britten N, Garside R, Pope C, Frost J, Cooper C. Asking more of qualitative synthesis: a response to Sally Thorne. Qual Health Res. 2017;27(9):1370–6.

Booth A, Carroll C. Systematic searching for theory to inform systematic reviews: is it feasible? Is it desirable? Health Info Libr J. 2015;32(3):220–35.

Kwon Y, Powelson SE, Wong H, Ghali WA, Conly JM. An assessment of the efficacy of searching in biomedical databases beyond MEDLINE in identifying studies for a systematic review on ward closures as an infection control intervention to control outbreaks. Syst Rev. 2014;3:135.

Nussbaumer-Streit B, Klerings I, Wagner G, Titscher V, Gartlehner G. Assessing the validity of abbreviated literature searches for rapid reviews: protocol of a non-inferiority and meta-epidemiologic study. Systematic Reviews. 2016;5:197.

Wagner G, Nussbaumer-Streit B, Greimel J, Ciapponi A, Gartlehner G. Trading certainty for speed - how much uncertainty are decisionmakers and guideline developers willing to accept when using rapid reviews: an international survey. BMC Med Res Methodol. 2017;17(1):121.

Ogilvie D, Hamilton V, Egan M, Petticrew M. Systematic reviews of health effects of social interventions: 1. Finding the evidence: how far should you go? J Epidemiol Community Health. 2005;59(9):804–8.

Royle P, Milne R. Literature searching for randomized controlled trials used in Cochrane reviews: rapid versus exhaustive searches. Int J Technol Assess Health Care. 2003;19(4):591–603.

Pearson M, Moxham T, Ashton K. Effectiveness of search strategies for qualitative research about barriers and facilitators of program delivery. Eval Health Prof. 2011;34(3):297–308.

Levay P, Raynor M, Tuvey D. The contributions of MEDLINE, other bibliographic databases and various search techniques to NICE public health guidance. Evid Based Libr Inf Pract. 2015;10(1):19.

Nussbaumer-Streit B, Klerings I, Wagner G, Heise TL, Dobrescu AI, Armijo-Olivo S, et al. Abbreviated literature searches were viable alternatives to comprehensive searches: a meta-epidemiological study. J Clin Epidemiol. 2018;102:1–11.

Briscoe S, Cooper C, Glanville J, Lefebvre C. The loss of the NHS EED and DARE databases and the effect on evidence synthesis and evaluation. Res Synth Methods. 2017;8(3):256–7.

Stansfield C, O'Mara-Eves A, Thomas J. Text mining for search term development in systematic reviewing: a discussion of some methods and challenges. Research Synthesis Methods.

Petrova M, Sutcliffe P, Fulford KW, Dale J. Search terms and a validated brief search filter to retrieve publications on health-related values in Medline: a word frequency analysis study. Journal of the American Medical Informatics Association : JAMIA. 2012;19(3):479–88.

Stansfield C, Thomas J, Kavanagh J. 'Clustering' documents automatically to support scoping reviews of research: a case study. Res Synth Methods. 2013;4(3):230–41.


Methley AM, Campbell S, Chew-Graham C, McNally R, Cheraghi-Sohi S. PICO, PICOS and SPIDER: a comparison study of specificity and sensitivity in three search tools for qualitative systematic reviews. BMC Health Serv Res. 2014;14:579.

Andrew B. Clear and present questions: formulating questions for evidence based practice. Library Hi Tech. 2006;24(3):355–68.

Cooke A, Smith D, Booth A. Beyond PICO: the SPIDER tool for qualitative evidence synthesis. Qual Health Res. 2012;22(10):1435–43.

Whiting P, Westwood M, Bojke L, Palmer S, Richardson G, Cooper J, et al. Clinical effectiveness and cost-effectiveness of tests for the diagnosis and investigation of urinary tract infection in children: a systematic review and economic model. Health technology assessment (Winchester, England). 2006;10(36):iii-iv, xi-xiii, 1–154.

Cooper C, Levay P, Lorenc T, Craig GM. A population search filter for hard-to-reach populations increased search efficiency for a systematic review. J Clin Epidemiol. 2014;67(5):554–9.

Hausner E, Waffenschmidt S, Kaiser T, Simon M. Routine development of objectively derived search strategies. Systematic Reviews. 2012;1(1):19.

Hausner E, Guddat C, Hermanns T, Lampert U, Waffenschmidt S. Prospective comparison of search strategies for systematic reviews: an objective approach yielded higher sensitivity than a conceptual one. J Clin Epidemiol. 2016;77:118–24.

Craven J, Levay P. Recording database searches for systematic reviews - what is the value of adding a narrative to peer-review checklists? A case study of nice interventional procedures guidance. Evid Based Libr Inf Pract. 2011;6(4):72–87.

Wright K, Golder S, Lewis-Light K. What value is the CINAHL database when searching for systematic reviews of qualitative studies? Syst Rev. 2015;4:104.

Beckles Z, Glover S, Ashe J, Stockton S, Boynton J, Lai R, et al. Searching CINAHL did not add value to clinical questions posed in NICE guidelines. J Clin Epidemiol. 2013;66(9):1051–7.

Cooper C, Rogers M, Bethel A, Briscoe S, Lowe J. A mapping review of the literature on UK-focused health and social care databases. Health Inf Libr J. 2015;32(1):5–22.

Younger P, Boddy K. When is a search not a search? A comparison of searching the AMED complementary health database via EBSCOhost, OVID and DIALOG. Health Inf Libr J. 2009;26(2):126–35.

Lam MT, McDiarmid M. Increasing number of databases searched in systematic reviews and meta-analyses between 1994 and 2014. Journal of the Medical Library Association : JMLA. 2016;104(4):284–9.

Bethel A. Search summary tables for systematic reviews: results and findings. HLC Conference; 2017a.

Aagaard T, Lund H, Juhl C. Optimizing literature search in systematic reviews - are MEDLINE, EMBASE and CENTRAL enough for identifying effect studies within the area of musculoskeletal disorders? BMC Med Res Methodol. 2016;16(1):161.

Adams CE, Frederick K. An investigation of the adequacy of MEDLINE searches for randomized controlled trials (RCTs) of the effects of mental health care. Psychol Med. 1994;24(3):741–8.

Kelly L, St Pierre-Hansen N. So many databases, such little clarity: searching the literature for the topic aboriginal. Canadian family physician Medecin de famille canadien. 2008;54(11):1572–3.

Lawrence DW. What is lost when searching only one literature database for articles relevant to injury prevention and safety promotion? Injury Prevention. 2008;14(6):401–4.

Lemeshow AR, Blum RE, Berlin JA, Stoto MA, Colditz GA. Searching one or two databases was insufficient for meta-analysis of observational studies. J Clin Epidemiol. 2005;58(9):867–73.

Sampson M, Barrowman NJ, Moher D, Klassen TP, Pham B, Platt R, et al. Should meta-analysts search Embase in addition to Medline? J Clin Epidemiol. 2003;56(10):943–55.

Stevinson C, Lawlor DA. Searching multiple databases for systematic reviews: added value or diminishing returns? Complementary Therapies in Medicine. 2004;12(4):228–32.

Suarez-Almazor ME, Belseck E, Homik J, Dorgan M, Ramos-Remus C. Identifying clinical trials in the medical literature with electronic databases: MEDLINE alone is not enough. Control Clin Trials. 2000;21(5):476–87.

Taylor B, Wylie E, Dempster M, Donnelly M. Systematically retrieving research: a case study evaluating seven databases. Res Soc Work Pract. 2007;17(6):697–706.

Beyer FR, Wright K. Can we prioritise which databases to search? A case study using a systematic review of frozen shoulder management. Health Info Libr J. 2013;30(1):49–58.

Duffy S, de Kock S, Misso K, Noake C, Ross J, Stirk L. Supplementary searches of PubMed to improve currency of MEDLINE and MEDLINE in-process searches via Ovid. Journal of the Medical Library Association : JMLA. 2016;104(4):309–12.

Katchamart W, Faulkner A, Feldman B, Tomlinson G, Bombardier C. PubMed had a higher sensitivity than Ovid-MEDLINE in the search for systematic reviews. J Clin Epidemiol. 2011;64(7):805–7.

Cooper C, Lovell R, Husk K, Booth A, Garside R. Supplementary search methods were more effective and offered better value than bibliographic database searching: a case study from public health and environmental enhancement (in Press). Research Synthesis Methods. 2017;

Cooper C, Booth A, Britten N, Garside R. A comparison of results of empirical studies of supplementary search techniques and recommendations in review methodology handbooks: a methodological review (in press). Systematic Reviews. 2017.

Greenhalgh T, Peacock R. Effectiveness and efficiency of search methods in systematic reviews of complex evidence: audit of primary sources. BMJ (Clinical research ed). 2005;331(7524):1064–5.


Hinde S, Spackman E. Bidirectional citation searching to completion: an exploration of literature searching methods. PharmacoEconomics. 2015;33(1):5–11.

Levay P, Ainsworth N, Kettle R, Morgan A. Identifying evidence for public health guidance: a comparison of citation searching with web of science and Google scholar. Res Synth Methods. 2016;7(1):34–45.

McManus RJ, Wilson S, Delaney BC, Fitzmaurice DA, Hyde CJ, Tobias RS, et al. Review of the usefulness of contacting other experts when conducting a literature search for systematic reviews. BMJ (Clinical research ed). 1998;317(7172):1562–3.

Westphal A, Kriston L, Holzel LP, Harter M, von Wolff A. Efficiency and contribution of strategies for finding randomized controlled trials: a case study from a systematic review on therapeutic interventions of chronic depression. Journal of public health research. 2014;3(2):177.

Matthews EJ, Edwards AG, Barker J, Bloor M, Covey J, Hood K, et al. Efficient literature searching in diffuse topics: lessons from a systematic review of research on communicating risk to patients in primary care. Health Libr Rev. 1999;16(2):112–20.

Bethel A. EndNote training (YouTube videos). 2017b. Available from: http://medicine.exeter.ac.uk/esmi/workstreams/informationscience/is_resources,_guidance_&_advice/ .

Bramer WM, Giustini D, de Jonge GB, Holland L, Bekhuis T. De-duplication of database search results for systematic reviews in EndNote. Journal of the Medical Library Association : JMLA. 2016;104(3):240–3.

Bramer WM, Milic J, Mast F. Reviewing retrieved references for inclusion in systematic reviews using EndNote. Journal of the Medical Library Association : JMLA. 2017;105(1):84–7.

Gall C, Brahmi FA. Retrieval comparison of EndNote to search MEDLINE (Ovid and PubMed) versus searching them directly. Medical reference services quarterly. 2004;23(3):25–32.

Ahmed KK, Al Dhubaib BE. Zotero: a bibliographic assistant to researcher. J Pharmacol Pharmacother. 2011;2(4):303–5.

Coar JT, Sewell JP. Zotero: harnessing the power of a personal bibliographic manager. Nurse Educ. 2010;35(5):205–7.

Moher D, Liberati A, Tetzlaff J, Altman DG, The PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med. 2009;6(7):e1000097.

Sampson M, McGowan J, Tetzlaff J, Cogo E, Moher D. No consensus exists on search reporting methods for systematic reviews. J Clin Epidemiol. 2008;61(8):748–54.

Toews LC. Compliance of systematic reviews in veterinary journals with preferred reporting items for systematic reviews and meta-analysis (PRISMA) literature search reporting guidelines. Journal of the Medical Library Association : JMLA. 2017;105(3):233–9.

Booth A. "brimful of STARLITE": toward standards for reporting literature searches. Journal of the Medical Library Association : JMLA. 2006;94(4):421–9. e205

Faggion CM Jr, Wu YC, Tu YK, Wasiak J. Quality of search strategies reported in systematic reviews published in stereotactic radiosurgery. Br J Radiol. 2016;89(1062):20150878.

Mullins MM, DeLuca JB, Crepaz N, Lyles CM. Reporting quality of search methods in systematic reviews of HIV behavioral interventions (2000–2010): are the searches clearly explained, systematic and reproducible? Research Synthesis Methods. 2014;5(2):116–30.

Yoshii A, Plaut DA, McGraw KA, Anderson MJ, Wellik KE. Analysis of the reporting of search strategies in Cochrane systematic reviews. Journal of the Medical Library Association : JMLA. 2009;97(1):21–9.

Bigna JJ, Um LN, Nansseu JR. A comparison of quality of abstracts of systematic reviews including meta-analysis of randomized controlled trials in high-impact general medicine journals before and after the publication of PRISMA extension for abstracts: a systematic review and meta-analysis. Syst Rev. 2016;5(1):174.

Akhigbe T, Zolnourian A, Bulters D. Compliance of systematic reviews articles in brain arteriovenous malformation with PRISMA statement guidelines: review of literature. Journal of clinical neuroscience : official journal of the Neurosurgical Society of Australasia. 2017;39:45–8.

Tao KM, Li XQ, Zhou QH, Moher D, Ling CQ, Yu WF. From QUOROM to PRISMA: a survey of high-impact medical journals' instructions to authors and a review of systematic reviews in anesthesia literature. PLoS One. 2011;6(11):e27611.

Wasiak J, Tyack Z, Ware R, Goodwin N, Faggion CM Jr. Poor methodological quality and reporting standards of systematic reviews in burn care management. International Wound Journal; 2016.

Tam WW, Lo KK, Khalechelvam P. Endorsement of PRISMA statement and quality of systematic reviews and meta-analyses published in nursing journals: a cross-sectional study. BMJ Open. 2017;7(2):e013905.

Rader T, Mann M, Stansfield C, Cooper C, Sampson M. Methods for documenting systematic review searches: a discussion of common issues. Res Synth Methods. 2014;5(2):98–115.

Atkinson KM, Koenka AC, Sanchez CE, Moshontz H, Cooper H. Reporting standards for literature searches and report inclusion criteria: making research syntheses more transparent and easy to replicate. Res Synth Methods. 2015;6(1):87–95.

McGowan J, Sampson M, Salzwedel DM, Cogo E, Foerster V, Lefebvre C. PRESS peer review of electronic search strategies: 2015 guideline statement. J Clin Epidemiol. 2016;75:40–6.

Sampson M, McGowan J, Cogo E, Grimshaw J, Moher D, Lefebvre C. An evidence-based practice guideline for the peer review of electronic search strategies. J Clin Epidemiol. 2009;62(9):944–52.

Shea BJ, Reeves BC, Wells G, Thuku M, Hamel C, Moran J, et al. AMSTAR 2: a critical appraisal tool for systematic reviews that include randomised or non-randomised studies of healthcare interventions, or both. BMJ (Clinical research ed). 2017;358.

Whiting P, Savović J, Higgins JPT, Caldwell DM, Reeves BC, Shea B, et al. ROBIS: a new tool to assess risk of bias in systematic reviews was developed. J Clin Epidemiol. 2016;69:225–34.

Relevo R, Balshem H. Finding evidence for comparing medical interventions: AHRQ and the effective health care program. J Clin Epidemiol. 2011;64(11):1168–77.

Institute of Medicine. Standards for systematic reviews. 2011. Available from: http://www.nationalacademies.org/hmd/Reports/2011/Finding-What-Works-in-Health-Care-Standards-for-Systematic-Reviews/Standards.aspx .

CADTH: Resources 2018.


Acknowledgements

CC acknowledges the supervision offered by Professor Chris Hyde.

This publication forms a part of CC’s PhD. CC’s PhD was funded through the National Institute for Health Research (NIHR) Health Technology Assessment (HTA) Programme (Project Number 16/54/11). The open access fee for this publication was paid for by Exeter Medical School.

RG and NB were partially supported by the National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care South West Peninsula.

The views expressed are those of the author(s) and not necessarily those of the NHS, the NIHR or the Department of Health.

Author information

Authors and affiliations

Institute of Health Research, University of Exeter Medical School, Exeter, UK

Chris Cooper & Jo Varley-Campbell

HEDS, School of Health and Related Research (ScHARR), University of Sheffield, Sheffield, UK

Andrew Booth

Nicky Britten

European Centre for Environment and Human Health, University of Exeter Medical School, Truro, UK

Ruth Garside


Contributions

CC conceived the idea for this study and wrote the first draft of the manuscript. CC discussed this publication in PhD supervision with AB and separately with JVC. CC revised the publication with input and comments from AB, JVC, RG and NB. All authors revised the manuscript prior to submission. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Chris Cooper .

Ethics declarations

Ethics approval and consent to participate; consent for publication; competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Additional file

Additional file 1:

Appendix tables and PubMed search strategy. Key studies used for pearl growing per key stage, working data extraction tables and the PubMed search strategy. (DOCX 30 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License ( http://creativecommons.org/licenses/by/4.0/ ), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated.


About this article

Cite this article.

Cooper, C., Booth, A., Varley-Campbell, J. et al. Defining the process to literature searching in systematic reviews: a literature review of guidance and supporting studies. BMC Med Res Methodol 18 , 85 (2018). https://doi.org/10.1186/s12874-018-0545-3


Received : 20 September 2017

Accepted : 06 August 2018

Published : 14 August 2018

DOI : https://doi.org/10.1186/s12874-018-0545-3


  • Literature Search Process
  • Citation Chasing
  • Tacit Models
  • Unique Guidance
  • Information Specialists

BMC Medical Research Methodology

ISSN: 1471-2288


University of Maryland Libraries

Systematic Review

  • Library Help
  • What is a Systematic Review (SR)?
  • Steps of a Systematic Review
  • Framing a Research Question

Developing a Search Strategy

  • Searching the Literature
  • Managing the Process
  • Meta-analysis
  • Publishing your Systematic Review

Workshop materials

  • PICO Worksheet
  • Search Strategy Example
  • Search Strategy Presentation Slides
  • Search strings for demo

Find step-by-step instructions on how to develop a search strategy on p. 44


Errors in search strategies

Salvador-Oliván, J. A., Marco-Cuenca, G., & Arquero-Avilés, R. (2019).  Errors in search strategies used in systematic reviews and their effects on information retrieval .  Journal of the Medical Library Association : JMLA ,  107 (2), 210–221.  https://doi.org/10.5195/jmla.2019.567 . 

  • Search Term Harvesting
  • Text Mining Tools
  • Search Filters / Hedges
  • Documenting
  • Blogs & Discussion Lists

Translating search strategies across databases

  • ChatGPT Ask ChatGPT using this prompt, "Convert this search into terms appropriate for the [name] database." Further reading: Wang, S., Scells, H., Koopman, B., & Zuccon, G. (2023). Can ChatGPT write a good boolean query for systematic review literature search? arXiv preprint arXiv:2302.03495.
  • LitSonar Use the Help section for further guidance on how to use this tool (https://litsonar.com/help). Capable of searching eight different databases simultaneously
  • Polyglot Use the Polyglot tool to translate search strings from PubMed across multiple databases. Access the tool's tutorial for more information (https://sr-accelerator.com/#/help/polyglot). 
  • MEDLINE Transpose Use this tool to translate your MEDLINE (PubMed) search to MEDLINE (Ovid) format or vice versa. A minimal sketch of this kind of field-tag mapping appears after this list.
  • Database Syntax Guide Guide to translating syntax for multiple databases. From Cochrane.
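
The sketch below illustrates, in a very reduced form, the kind of field-tag mapping that tools such as Polyglot and MEDLINE Transpose automate, rewriting PubMed tags into Ovid MEDLINE syntax. It handles only a few free-text tags; subject headings, proximity operators and truncation differences all need additional rules, which is why the dedicated tools above are preferable in practice.

import re

PUBMED_TO_OVID = {"tiab": ".ti,ab.", "ti": ".ti.", "ab": ".ab."}

def pubmed_to_ovid(query):
    """Rewrite term[tag] (PubMed) as term.tag. (Ovid) for the tags mapped above."""
    def swap(match):
        tag = match.group(2).lower()
        if tag not in PUBMED_TO_OVID:
            return match.group(0)  # leave unmapped tags (e.g. [mh]) untouched
        return match.group(1) + PUBMED_TO_OVID[tag]
    return re.sub(r'("[^"]+"|\S+)\[(\w+)\]', swap, query)

print(pubmed_to_ovid('"nursing homes"[tiab] OR "aged care"[tiab]'))
# "nursing homes".ti,ab. OR "aged care".ti,ab.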

_____________________________________________________________

Take control of your search and turn off PubMed's Automatic Term Mapping (ATM)! ATM will not include all variant terminology and it automatically explodes MeSH terms. Bypassing ATM (for example, by using quotation marks and explicit field tags) allows for clearer documentation of the search method.
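
One way to see what Automatic Term Mapping does is to ask PubMed how it translated your query. The sketch below uses NCBI's public E-utilities esearch endpoint to compare the query translation for an untagged keyword with a quoted, field-tagged phrase; the search terms are illustrative.

import json
from urllib.parse import urlencode
from urllib.request import urlopen

ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pubmed_translation(term):
    """Return the hit count and PubMed's own translation of a search term."""
    url = ESEARCH + "?" + urlencode({"db": "pubmed", "term": term, "retmode": "json", "retmax": 0})
    with urlopen(url) as response:
        result = json.load(response)["esearchresult"]
    return int(result["count"]), result["querytranslation"]

for term in ["nursing homes", '"nursing homes"[tiab]']:
    count, translation = pubmed_translation(term)
    print(f"{term}\n  {count} records, translated as: {translation}\n")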

For more information on Automatic Term Mapping, see the PubMed help documentation.

Further readings

  • Burns, C. S., Nix, T., Shapiro, R. M., & Huber, J. T. (2021). MEDLINE search retrieval issues: A longitudinal query analysis of five vendor platforms. PLoS ONE , 16 (5), e0234221. https://doi.org/10.1371/journal.pone.0234221
  • PubMed Pub ReMiner  - Text mining for PubMed to look at commonalities between MeSH terms and keywords
  • Go PubMed  - Text mining tool for PubMed or MeSH terms. This article  explains the features of this text mining tool.
  • PubVenn - This tool enables you to explore PubMed using venn diagrams. Also, try Search Workbench .
  • Yale MeSH Analyzer  - Watch this tutorial (7 min.). This tool allows users to enter up to 20 PubMed ID numbers, which it uses to aggregate the metadata from the associated articles into a spreadsheet. For systematic reviews, it is useful in search strategy development to quickly aggregate the Medical Subject Heading (MeSH) terms associated with relevant articles. While it only works for PubMed, it can be useful for developing searches in medical-adjacent fields, such as psychology, nutrition, and animal health. 
  • NCBI MeSH on Demand  identifies MeSH® terms in your submitted text (abstract or manuscript). MeSH on Demand also lists PubMed similar articles relevant to your submitted text.
  • HelioBLAST - This tool finds text records that are similar to the submitted query. Your query is searched against the citations (abstract and titles) in Medline/PubMed and the top matching articles are returned in the results.
  • Coremine - It is ideal for those seeking an overview of a complex subject while allowing the possibility to "drill down" to specific details. Instructions
  • Carrot2 - This tool can automatically organize search results into topics. It can query PubMed and allows boolean searching.
  • SWIFT-Review - Desktop text mining tool specific to systematic reviews. To obtain your free license for SWIFT Review, simply browse to the  Sciome Software  web page to login and/or create your SWIFT-Review account.  
  • Voyant  - General text mining (this is the download). For the web version go to http://voyant-tools.org
  • TerMine  - General text mining
  • JSTOR Text Analyzer - Recommends journal articles in JSTOR relevant to text.
  • CREBP-SRA Word Frequency Analyser (WFA) - This tool helps determine which words you should use to construct and refine a search strategy; a minimal word-frequency sketch appears after this list.
  • Medline Ranker  requires a set of known relevant records with PubMed identifiers and a test set of records (e.g. search results from a highly sensitive search). Medline Ranker sorts the records in the test set and presents those that were most similar to the relevant records first. Medline Ranker also provides a list of discriminating terms which discriminate relevant records from non-relevant records.
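
The core idea behind word-frequency tools such as PubMed PubReMiner or the Word Frequency Analyser can be illustrated with a short sketch: count the words appearing in the titles and abstracts of known relevant records and use the most frequent non-trivial tokens as candidate search terms. The stopword list and example abstracts below are illustrative only.

import re
from collections import Counter

STOPWORDS = {"the", "and", "of", "in", "to", "a", "for", "with", "on", "was", "were", "is"}

def candidate_terms(documents, top=20):
    """Return the most frequent non-stopword tokens across the documents."""
    counts = Counter()
    for doc in documents:
        tokens = re.findall(r"[a-z][a-z-]+", doc.lower())
        counts.update(t for t in tokens if t not in STOPWORDS and len(t) > 2)
    return counts.most_common(top)

abstracts = [
    "Infection control strategies reduced MRSA transmission in nursing homes.",
    "Aged care facilities adopting hand-hygiene programmes reported fewer MRSA infections.",
]
for term, freq in candidate_terms(abstracts, top=8):
    print(f"{freq:>2}  {term}")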

_________________________________________________________________________

For more information on text mining tools - review and comparison, read the following article:

Paynter, R., Bañez, L. L., Berliner, E., Erinoff, E., Lege-Matsuura, J., Potter, S., & Uhl, S. (2016). EPC methods: an exploration of the use of text-mining software in systematic reviews .

You might limit to a particular publication type in PubMed. See the full list of PubMed publication types.

  • Cochrane Handbook Part 2, Section 6.4.11 provides search filters to limit to randomized controlled trials in Medline/PubMed, Medline/Ovid, and Embase
  • McMaster - Filters by the Hedges team

Open Access

  • PubMed Systematic Review Filter Search Strategy
  • Search Filters from Univ. of Texas School of Public Health

Hedges by Topic (in alphabetical order)

  • ​ Prady, S. L., Uphoff, E. P., Power, M., & Golder, S. (2018). Development and validation of a search filter to identify equity-focused studies: Reducing the number needed to screen. BMC Medical Research Methodology, 18 (1), 106. https://doi.org/10.1186/s12874-018-0567-x
  • Health Risk Assessment by Vicky Tessier at the INSPQ
  • Effectiveness of Interventions
  • van der Mierden, S., Hooijmans, C. R., Tillema, A. H., Rehn, S., Bleich, A., & Leenaars, C. H. (2022). Laboratory animals search filter for different literature databases: PubMed, Embase, Web of Science and PsycINFO.  Laboratory animals ,  56 (3), 279–286. https://doi.org/10.1177/00236772211045485
  • Updated PRESS Checklist (2015) - see pages 41-42
  • IOM Standards for Systematic Reviews
  • PRESS Checklist


McGowan J, Sampson M, Salzwedel DM, Cogo E, Foerster V, Lefebvre C. PRESS Peer Review of Electronic Search Strategies: 2015 guideline statement. Journal of Clinical Epidemiology, 75, 40-46.

Systematic Literature Review Worksheet

Use the Database Search Log to record your search terms, search strategy and databases searched.

Guidance on Reporting Systematic Reviews

Cochrane strongly encourages review authors to include a study flow diagram, as recommended by the PRISMA statement.

  • PRISMA Flow Diagram
  • PRISMA Flow Diagram Generator
  • PRISMA Checklist

Other checklists include:

  • ARRIVE and DSPC for animal studies
  • MOOSE - meta-analysis of observational studies in epidemiology
  • STARLITE - general health policy and clinical practice
  • TIDieR-PHP - population health and policy interventions

Examples of documented search methodologies:

  • Full search strategies for all database searches provided in the Appendices:

Bath, P. & Krishnan, K. (2014). Interventions for deliberately altering blood pressure in acute stroke .  Cochrane Database of Systematic Reviews, 10.

  • A summary of sources searched and keywords used in the Sources section:

McIntyre, S, Taitz, D, Keogh, J, Goldsmith, S, Badawi, N & Blair, E. (2013). A systematic review of risk factors for cerebral palsy in children born at term in developed countries . Developmental Medicine & Child Neurology, 55( 6), 499-508.

  • ACRL Systematic Reviews & Related Methods Interest Group [email protected]
  • Cindy Schmidt's Blog: PubMed Search Strategies This blog has been created to share PubMed search strategies. Search strategies posted here are not perfect. They are posted in the hope that others will benefit from the work already put into their creation and/or will offer suggestions for improvements.
  • MedTerm Search Assist from the University of Pittsburgh By librarians for librarians - A database to share biomedical terminology and strategies for comprehensive searches.
  • MLA expert searching discussion list [email protected] - This discussion list often discusses subject strategies and sometimes search filters.
  • PRESS Forum This closed wiki-based forum is a place for librarians to request reviews of systematic review search strategies, and to review the searches of others.

Researching for your literature review: Develop a search strategy

  • Literature reviews
  • Literature sources
  • Getting started
  • Keyword search activity
  • Subject search activity
  • Combined keyword and subject searching
  • Online tutorials
  • Apply search limits
  • Run a search in different databases
  • Supplementary searching
  • Save your searches
  • Manage results

Identify key terms and concepts

Start developing a search strategy by identifying the key words and concepts within your research question. The aim is to identify the words likely to have been used in the published literature on this topic.

For example: What are the key infection control strategies for preventing the transmission of Methicillin-resistant Staphylococcus aureus (MRSA) in aged care homes?

Treat each component as a separate concept so that your topic is organised into separate blocks (concepts).

For each concept block, list the key words derived from your research question, as well as any other relevant terms or synonyms that you have found in your preliminary searches. Also consider singular and plural forms of words, variant spellings, acronyms and relevant index terms (subject headings).  

As part of the process of developing a search strategy, it is recommended that you keep a master list of search terms for each key concept. This will make it easier when it comes to translating your search strategy across multiple database platforms. 

Concept map template for documenting search terms

Combine search terms and concepts

Boolean operators are used to combine the different concepts in your topic to form a search strategy. The main operators used to connect your terms are AND and OR . See an explanation below:

  • Link keywords related to a single concept with OR
  • Linking with OR broadens a search (increases the number of results) by searching for any of the alternative keywords

Example: nursing home OR aged care home

  • Link different concepts with AND
  • Linking with AND narrows a search (reduces the number of results) by retrieving only those records that include all of your specified keywords

Example: nursing home AND infection control

  • Using NOT narrows a search by excluding results that contain certain search terms
  • Most searches do not require the use of the NOT operator

Example: aged care homes NOT residential homes will retrieve all the results that include the words aged care homes but don't include the words residential homes . So if an article discussed both concepts this article would not be retrieved as it would be excluded on the basis of the words residential homes .

See the website page 'Combine the search terms using Boolean' for Venn diagrams demonstrating the function of AND/OR/NOT.
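
The same AND/OR/NOT behaviour can be expressed as set operations on the records each term retrieves, which is exactly what the Venn diagrams depict. The record IDs below are made up for illustration.

nursing_home = {101, 102, 103, 104}
aged_care_home = {103, 104, 105}
infection_control = {102, 104, 106}

concept_1 = nursing_home | aged_care_home      # OR  -> union: broadens (any of the synonyms)
both = concept_1 & infection_control           # AND -> intersection: narrows (all concepts present)
without = concept_1 - infection_control        # NOT -> difference: excludes a concept

print(sorted(concept_1))  # [101, 102, 103, 104, 105]
print(sorted(both))       # [102, 104]
print(sorted(without))    # [101, 103, 105]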

Advanced search operators - truncation and wildcards

By using a truncation symbol you can capture all of the various endings possible for a particular word. This may increase the number of results and reduce the likelihood of missing something relevant. Some tips about truncation:

  • The truncation symbol is generally an asterisk symbol * and is added at the end of a word.
  • It may be added to the root of a word that is a word in itself. Example: prevent* will retrieve prevent, preventing, prevention, preventative, etc. It may also be added to the root of a word that is not a word in itself. Example: strateg* will retrieve strategy, strategies, strategic, strategize, etc.
  • If you don't want to retrieve all possible variations, an easy alternative is to utilise the OR operator instead e.g. strategy OR strategies. Always use OR instead of truncation where the root word is too small e.g. ill OR illness instead of ill*

There are also wildcard symbols that function like truncation but are often used in the middle of a word to replace zero, one or more characters.

  • Unlike the truncator which is usually an asterisk, wildcards vary across database platforms
  • Common wildcards symbols are the question mark ? and hash #.
  • Example: wom#n finds woman or women; p?ediatric finds pediatric or paediatric.

See the Database search tips for details of these operators, or check the Help link in any database.
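
If you want to check which word variants a truncation or wildcard pattern will actually capture, the pattern can be converted into a regular expression and tested against a list of candidate words. The sketch below follows the conventions used in the examples above (* for any ending, # for exactly one character, ? for zero or one character); remember that individual databases interpret these symbols slightly differently.

import re

def wildcard_to_regex(pattern):
    """Translate *, # and ? (as described above) into a regular expression."""
    parts = []
    for ch in pattern:
        if ch == "*":
            parts.append(r"\w*")   # any number of further characters
        elif ch == "#":
            parts.append(r"\w")    # exactly one character
        elif ch == "?":
            parts.append(r"\w?")   # zero or one character
        else:
            parts.append(re.escape(ch))
    return re.compile("^" + "".join(parts) + "$", re.IGNORECASE)

for pattern, candidates in [
    ("prevent*", ["prevent", "preventing", "prevention", "preventative"]),
    ("wom#n", ["woman", "women", "womn"]),
    ("p?ediatric", ["pediatric", "paediatric"]),
]:
    rx = wildcard_to_regex(pattern)
    print(pattern, "->", [word for word in candidates if rx.match(word)])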

Phrase searching

For words that you want to keep as a phrase, place two or more words in "inverted commas" or "quote marks". This will ensure word order is maintained and that you only retrieve results that have those words appearing together.

Example: “nursing homes”

There are a few databases that don't require the use of quote marks such as Ovid Medline and other databases in the Ovid suite. The Database search tips provides details on phrase searching in key databases, or you can check the Help link in any database.

Subject headings (index terms)

Identify appropriate subject headings (index terms).

Many databases use subject headings to index content. These are selected from a controlled list and describe what the article is about. 

A comprehensive search strategy is often best achieved by using a combination of keywords and subject headings where possible.

In-depth knowledge of subject headings is not required to benefit from the improved search performance they can provide.

Advantages of subject searching:

  • Helps locate articles that use synonyms, variant spellings, plurals
  • Search terms don’t have to appear in the title or abstract

Note: Subject headings are often unique to a particular database, so you will need to look for appropriate subject headings in each database you intend to use.

Subject headings are not available for every topic, and it is best to only select them if they relate closely to your area of interest.

MeSH (Medical Subject Headings)

The MeSH thesaurus provides standard terminology, imposing uniformity and consistency on the indexing of biomedical literature. In Pubmed/Medline each record is tagged with  MeSH  (Medical Subject Headings).

The MeSH vocabulary includes:

  • Headings (descriptors), which represent concepts found in the biomedical literature
  • Check tags: headings that are commonly considered for every article (eg. Species (including humans), Sex, Age groups (for humans), Historical time periods)
  • Subheadings (qualifiers), attached to MeSH headings to describe a specific aspect of a concept
  • Publication types, which describe the type of publication being indexed; i.e., what the item is, not what the article is about (eg. Letter, Review, Randomized Controlled Trial)
  • Supplementary concept records: terms in a separate thesaurus, primarily substance terms

Create a 'gold set'

It is useful to build a ‘sample set’ or ‘gold set’ of relevant references before you develop your search strategy.

Sources for a 'gold set' may include:

  • key papers recommended by subject experts or supervisors
  • citation searching - looking at a reference list to see who has been cited, or using a citation database (eg. Scopus, Web of Science) to see who has cited a known relevant article
  • results of preliminary scoping searches.

The papers in your 'gold set' can then be used to help you identify relevant search terms

  • Look up your 'gold set' articles in a database that you will use for your literature review. For the articles indexed in the database, look at the records to see what keywords and/or subject headings are listed.
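
One way to do this at scale, similar in spirit to the Yale MeSH Analyzer, is to fetch the MEDLINE-format records for your gold-set articles from NCBI's E-utilities efetch endpoint and tally the MeSH headings they share. The PMIDs below are placeholders; replace them with the PubMed IDs of your own gold-set articles.

from collections import Counter
from urllib.parse import urlencode
from urllib.request import urlopen

EFETCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/efetch.fcgi"

def mesh_terms(pmids):
    """Tally the MeSH headings (MH lines) across MEDLINE-format records for the given PMIDs."""
    params = urlencode({"db": "pubmed", "id": ",".join(pmids), "rettype": "medline", "retmode": "text"})
    with urlopen(EFETCH + "?" + params) as response:
        text = response.read().decode("utf-8")
    headings = [line[6:].split("/")[0].strip("*").strip()
                for line in text.splitlines() if line.startswith("MH  - ")]
    return Counter(headings)

gold_set = ["12345678", "23456789"]  # placeholder PMIDs
for heading, freq in mesh_terms(gold_set).most_common(10):
    print(f"{freq:>2}  {heading}")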

The 'gold set' will also provide a means of testing your search strategy

  • When an article in the sample set that is also indexed in the database is not retrieved, your search strategy can be revised in order to include it (see what concepts or keywords can be incorporated into your search strategy so that the article is retrieved).
  • If your search strategy is retrieving a lot of irrelevant results, look at the irrelevant records to determine why they are being retrieved. What keywords or subject headings are causing them to appear? Can you change these without losing any relevant articles from your results?
  • Information on the process of testing your search strategy using a gold set can be found in the systematic review guide
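
A draft PubMed strategy can also be tested against the gold set automatically: ANDing the strategy with each gold-set PMID via the E-utilities esearch endpoint returns a count of 1 if that record is retrieved and 0 if it is missed. The strategy and PMIDs below are illustrative placeholders.

import json
import time
from urllib.parse import urlencode
from urllib.request import urlopen

ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def is_retrieved(strategy, pmid):
    """True if the draft strategy retrieves the given PubMed record."""
    term = f"({strategy}) AND {pmid}[uid]"
    url = ESEARCH + "?" + urlencode({"db": "pubmed", "term": term, "retmode": "json", "retmax": 0})
    with urlopen(url) as response:
        return int(json.load(response)["esearchresult"]["count"]) > 0

strategy = '("infection control"[tiab] OR "infection prevention"[tiab]) AND "nursing homes"[tiab]'
gold_set = ["12345678", "23456789"]  # placeholder PMIDs: replace with your gold-set articles
for pmid in gold_set:
    print(pmid, "retrieved" if is_retrieved(strategy, pmid) else "missed - revise the strategy")
    time.sleep(0.4)  # keep under NCBI's rate limit for requests without an API key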

Example search strategy

A search strategy is the planned and structured organisation of terms used to search a database.

An example of a search strategy incorporating all three concepts, that could be applied to different databases is shown below:

[Screenshot: search strategy entered into a database Advanced search screen]

You will use a combination of search operators to construct a search strategy, so it’s important to keep your concepts grouped together correctly. This can be done with parentheses (round brackets), or by searching for each concept separately or on a separate line.

The above search strategy in a nested format (combined into a single line using parentheses) would look like:

("infection control*" OR "infection prevention") AND ("methicillin resistant staphylococcus aureus" OR "meticillin resistant staphylococcus aureus" OR MRSA) AND ( "aged care home*" OR "nursing home*")


Developing NICE guidelines: the manual

NICE process and methods [PMG20] Published: 31 October 2014 Last updated: 17 January 2024

  • Tools and resources
  • 1 Introduction
  • 2 The scope
  • 3 Decision-making committees
  • 4 Developing review questions and planning the evidence review

5 Identifying the evidence: literature searching and evidence submission

  • 6 Reviewing evidence
  • 7 Incorporating economic evaluation
  • 8 Linking to other guidance
  • 9 Interpreting the evidence and writing the guideline
  • 10 The validation process for draft guidelines, and dealing with stakeholder comments
  • 11 Finalising and publishing the guideline recommendations
  • 12 Support for putting the guideline recommendations into practice
  • 13 Ensuring that published guidelines are current and accurate
  • 14 Updating guideline recommendations
  • 15 Appendices
  • Update information


  • 5.1 Introduction
  • 5.2 Searches during guideline recommendation scoping and surveillance
  • 5.3 Searches during guideline recommendation development
  • 5.4 Health inequalities and equality and diversity
  • 5.5 Quality assurance
  • 5.6 Documenting the search
  • 5.7 Re-running searches
  • 5.8 Calls for evidence from stakeholders
  • 5.9 References and further reading

The systematic identification of evidence is an essential step in developing NICE guideline recommendations.

This chapter sets out how evidence is identified at each stage of the guideline development cycle. It provides details of the systematic literature searching methods used to identify the best available evidence for NICE guidelines. It also provides details of associated information management processes including quality assurance (peer review), re‑running searches, and documenting the search process.

Our searching methods are informed by the chapter on searching & selecting studies in the Cochrane Handbook for Systematic Reviews of Interventions and the Campbell Collaboration's searching for studies guide . The Summarized Research in Information Retrieval for HTA (SuRe Info) resource also provides research-based advice on information retrieval for systematic reviews.

Our literature searches are designed to be systematic, transparent, and reproducible, and minimise dissemination bias. Dissemination bias may affect the results of reviews and includes publication bias and database bias.

We use search methods that balance recall and precision. When the need to reduce the number of studies requires pragmatic search approaches that may increase the risk of missing relevant studies, the context and trade-offs are discussed and agreed within the development team and made explicit in the reported search methods.

A flexible approach to identifying evidence is adopted, guided by the subject of the review question (see the chapter on developing review questions and planning the evidence review), the type of evidence sought, and the resource constraints of the evidence review. Often an evidence review will be an update of our earlier work, so the approach can be informed by previous searches and surveillance reviews (see the chapter on ensuring that published guidelines are current and accurate).

Scoping searches

Scoping searches are top-level searches to support scope development. The purpose of the searches is to investigate the current evidence around the topic, and to identify any areas where an evidence review may be beneficial and any research gaps. The results of the searches are used to draft the scope of the upcoming guideline or update and to inform the discussions at scoping workshops (if held). Scoping searches do not aim to be exhaustive.

In some cases, scoping searches are not required when it is more efficient to use the surveillance review (see the chapter on the scope ).

The sources searched at scoping stage will vary according to the topic, type of review questions the guideline or update will seek to address, and type of evidence sought. Each scoping search is tailored using combinations of the following types of information:

  • NICE guidance and guidance from other organisations
  • policy and legislation guides
  • key systematic reviews and epidemiological reviews
  • economic evaluations
  • current practice data, including costs and resource use and any safety concerns
  • views and experiences of people using services, their family members or carers, or the public
  • other real-world health and social care data (for example audits, surveys, registries, electronic health records, patient-generated health data), if appropriate
  • summaries of interventions that may be appropriate, including any national safety advice
  • statistics (for example on epidemiology, natural history of the condition, service configuration or national prevalence data).

All scoping searches are fully documented and if new issues are identified at a scoping workshop, the search is updated. A range of possible sources considered for scoping searches is provided in the appendix on suggested sources for scoping .

Health inequalities searches

The purpose of these searches is to identify evidence to help inform the scope, health inequalities briefing, or the equality and health inequalities assessment (EHIA). They help identify key issues relevant to health inequalities on the topic, for example covering protected characteristics, groups experiencing or at risk of inequalities, or wider determinants of health.

The searches involve finding key data sources, such as routinely available national databases, audits or published reports by charities, non-governmental bodies, or government organisations.

Surveillance searches

Surveillance determines whether published recommendations remain current. The searches are tailored to the evidence required. This may include searches for new or updated policies, legislation, guidance from other organisations, or ongoing studies in the area covered by the evidence review.

If required, published evidence is identified by searching a range of bibliographic databases relevant to the topic. Surveillance searches generally use the same core set of databases used during the development of the original evidence review. A list of sources is given in the appendix on sources for evidence reviews .

The search approach and sources will vary between topics and may include:

  • population and intervention searches
  • focused searches for specific question areas
  • forward and backward citation searching.

Searches usually focus on randomised controlled trials and systematic reviews, although other study types will be considered where appropriate, for example for diagnostic questions.

The search period starts at either the end of the search for the last update of a guideline evidence review, or at the last search date for any previous surveillance check. Where appropriate, living evidence surveillance could be set up to continuously monitor the publication of new evidence over a period of time until its impact reaches the threshold for action. For more information on NICE guideline recommendation surveillance, see the chapter on ensuring that guideline recommendations are current and accurate and the appendix on surveillance - interim principles for monitoring approaches of guideline recommendations.

Search protocols

Search protocols form part of the wider guideline review protocol (see the appendix on the review protocol template ). They pre‑define how the evidence is identified and provide a basis for developing the search strategies.

Once the final scope is agreed, the information specialist develops the search protocols and agrees them with the development team before the evidence search begins.

A search protocol includes the following elements:

  • approach to the search strategy, tailored to the review question and eligibility criteria
  • sources to be searched
  • plans to use any additional or alternative search techniques, when known at the protocol development stage, and the reasons for their use
  • details of any limits to be applied to the search
  • references to any key papers used to inform the search approach.

Searches are done on a mix of bibliographic databases, websites and other sources, depending on the subject of the review question and the type of evidence sought.

For most searches there are key sources that are prioritised, and other potentially relevant sources that can be considered. It is important to ensure adequate coverage of the relevant literature and to search a range of sources. However, there are practical limits to the number of sources that can be searched in the standard time available for an evidence review.

The selection of sources varies according to the requirements of the review question.

Clinical intervention sources

For reviews of the effectiveness of clinical interventions the following sources are prioritised for searching:

  • the Cochrane Central Register of Controlled Trials (CENTRAL)
  • the Cochrane Database of Systematic Reviews (CDSR)

Clinical safety sources

In addition to the sources searched for clinical interventions, the following should be prioritised for clinical safety review questions:

  • MHRA drug safety updates
  • National patient safety alerts.

Antimicrobial resistance sources

For reviews of antimicrobial resistance, the following sources should be prioritised:

  • UK Health Security Agency's English surveillance programme for antimicrobial utilisation and resistance (ESPAUR) report
  • UK Health Security Agency's antimicrobial resistance local indicators.

Cost-effectiveness sources

For reviews of cost effectiveness, economic databases are used in combination with general bibliographic databases, such as MEDLINE and Embase (see appendix G on sources for economic reviews).

Economic evaluations of social care interventions may be published in journals that are not identified through standard searches. Targeted searches based on references of key articles and contacting authors can be considered to identify relevant papers.

Topic-specific sources

Some topics we cover may require the use of topic-specific sources. Examples include:

  • PsycINFO (psychology and psychiatry)
  • CINAHL (nursing and allied health professions)
  • ASSIA (Applied Social Sciences Index and Abstracts)
  • HealthTalk, and other sources to identify the views and experiences of people using services, carers and the public
  • Social Policy and Practice
  • Social Care Online
  • Sociological Abstracts
  • Transport Database
  • GreenFILE (environmental literature)
  • HMIC (Health Management Information Consortium).

Searching for model inputs

Evidence searches may be needed to inform design-oriented conceptual models. Examples include precise searches to find representative NHS costs for an intervention or finding out the proportion of people offered an intervention who take up the offer.

Some model inputs, such as costs, use national sources such as national list prices or national audit data. In some cases, it may be more appropriate to identify costs from the academic literature. Further advice on methods to identify model inputs is provided by Paisley (2016) and Kaltenthaler et al. (2011). See also the chapter on incorporating economic evaluation.

Real-world data

Information specialists can identify sources of real-world data (such as electronic health records, registries, and audits) for data analysts to explore further. The Health Data Research Innovation Gateway can be used to identify datasets. The NICE real-world evidence framework (2022) has additional guidance on searching for and selecting real-world data sources.

Grey literature

For some review questions, for example, where significant evidence is likely to be published in non-journal sources and there is a paucity of evidence in published journal sources, it may be appropriate to search for grey literature. Useful sources of grey literature include:

  • HMIC (Health Management Information Consortium)
  • the TRIP database
  • the Canadian Agency for Drugs and Technologies in Health (CADTH) Grey Matters resource.

Committee members may also be able to suggest additional appropriate sources for grey literature.

A list containing potential relevant sources is provided in the appendix on sources for evidence reviews .

Developing search strategies

The approach to devising and structuring search strategies is informed by the review protocol. The PICO (population, intervention, comparator and outcome) or SPICE (setting, perspective, intervention, comparison, evaluation) frameworks may be used to structure a search strategy for intervention review questions. For other types of review questions, alternative frameworks may be more suitable.

It is sometimes more efficient to conduct a single search for multiple review questions, rather than conducting a separate search for each question.

Some topics may not easily lend themselves to PICO- or SPICE-type frameworks. In these cases, it may be better to combine multiple, shorter searches rather than attempting to capture the entire topic using a single search. This is often referred to as multi-stranded searching.

In some instances, for example where the terminology around a topic is diffuse or ill defined, it may be difficult to specify the most appropriate search terms in advance. In these cases, an iterative approach to searching can be used.

In an iterative approach, searching is done in several stages, with each search considering the evidence that has already been retrieved (for example, see Booth et al. 2020). Searching in stages allows the reviewers to review the most relevant, high-quality information first and then make decisions for identifying additional evidence if needed.

Decisions to use iterative approaches are agreed by the development team and staff with responsibility for quality assurance because it can affect timelines.

Updating previous work

Where high-quality review-level evidence is available on a topic, the review team may choose to update or expand this previous work rather than duplicating the existing findings. In these cases, the original review searches are re-run and expanded to account for any differences in scope and inclusion criteria between the original review and the update.

Cost-effectiveness searches

There are several methods that can be used to identify economic evaluations:

  • All relevant review questions can be covered by a single search using the population search terms, combined with a search filter, to identify economic evidence.
  • The search strategies for individual review questions can be combined with search filters to identify economic evidence. If using this approach, it may be necessary to adapt strategies for some databases to ensure adequate sensitivity.
  • Economic evidence can be manually sifted while screening evidence from a general literature search (so no separate searches are required).

The rationale for the selected approach is recorded in the search protocol.

Where searches are needed to populate an economic model, these are usually done separately.

Identifying search terms

Search terms usually consist of a combination of subject headings and free‑text terms from the titles and abstracts of relevant references.

When identifying subject headings, variations in thesaurus and indexing terms for each database should be considered, for example MeSH (Medical Subject Headings) in MEDLINE and Emtree in Embase. Not all databases have indexing terms and some contain records that have not yet been indexed.

Free‑text terms may include synonyms, acronyms and abbreviations, spelling variants, old and new terminology, brand and generic medicine names, and lay and medical terminology.

For updates, previous search terms, including those from surveillance searches, are reviewed and used to inform new search terms. New or changed terms are identified, as well as any changes to indexing terms. This also applies when an existing review, for example a Cochrane review, is being updated to answer a review question.

Key studies can be a useful source of search terms, as can reports, guidelines, topic-specific websites, committee members and topic experts.

Some websites and databases have limited search functionality. It may be necessary to use fewer search terms or do multiple searches of the same resource with different search term combinations.

It may be helpful to use frequency analysis or text mining to develop the search-term strategy. Tools such as PubReMiner and Medline Ranker can help, either by highlighting search terms that might not otherwise be apparent, or by flagging terms of high value when exhaustive synonym searching is unfeasible or inadvisable.
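As a rough illustration of what frequency analysis involves, the sketch below counts the words in the titles of a handful of known relevant records and lists the most frequent candidates. It is a simplified stand-in for tools such as PubReMiner; the sample titles and stop-word list are invented for the example.

```python
# Simplified term-frequency analysis over a few known relevant records
# (the titles and stop words below are invented for illustration).
import re
from collections import Counter

titles = [
    "Infection prevention and control in nursing homes: reducing MRSA transmission",
    "Methicillin-resistant Staphylococcus aureus carriage among aged care residents",
    "Decolonisation strategies for MRSA in long-term residential care facilities",
]

stop_words = {"and", "in", "the", "of", "for", "among"}

counts = Counter(
    word
    for title in titles
    for word in re.findall(r"[a-z]+", title.lower())
    if word not in stop_words and len(word) > 2
)

# The most common words are candidate free-text terms to consider in the strategy.
for term, freq in counts.most_common(10):
    print(term, freq)
```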

Search limits

The application of limits to search strategies will reflect the eligibility criteria in the review protocol. English language limits, date limits, and the exclusion of conference abstracts and animal studies are typically applied as a matter of routine.

Search filters

A search filter is a string of search terms with known (validated) performance. When a particular study design is required for a review question, relevant search filters are usually applied to literature search strategies.

Other search filters relating to age, setting, geography, and health inequalities are also applied as relevant. The most comprehensive list of available search filters is the search filter resource of the InterTASC Information Specialists' Subgroup. This resource also includes critical appraisal tools, which are used for filter selection.

Economics-related filters

A variety of search filters of relevance to cost effectiveness are available. These include filters for economic evaluations, quality-of-life data, and cost-utility data. It may be necessary to use more than 1 filter to identify relevant data. In addition, it may be appropriate to add geographic search filters, such as those for the UK or Organisation for Economic Co-operation and Development (OECD) countries, to retrieve economic studies relevant to the UK or OECD (Ayiku et al. 2017, 2019, 2021).

Use of machine learning-based classifiers

Machine learning-based classification software has been developed for some study types (for example the Cochrane RCT classifier, Thomas et al. 2020 ). These classifiers apply a probability weighting to each bibliographical reference within a set of search results. The weighting relates to the reference's likelihood to be a particular study type, based on a model created from analysis of known, relevant papers. The weightings can then be used to either order references for screening or be used with a fixed cut-off value to divide a list of references into those more likely to be included, and those that can be excluded without manual screening.

We support the use of machine classifiers if their performance characteristics are known, and if they improve efficiency in the search and screening process. However, caution is needed when using classifiers, because they may not be as effective if used on data that is different to the type of data for which they were originally developed. For example, the Cochrane RCT classifier is reported to have over 99% recall for health studies but showed "unacceptably low" recall for educational research ( Stansfield et al. 2022 ).
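The sketch below illustrates the general idea with invented scores rather than output from any particular tool: references are ordered by their probability of being an RCT, and a fixed cut-off splits them into records for manual screening and records set aside.

```python
# Illustrative only: rank references by a classifier's probability weighting
# and apply a fixed cut-off. The scores are invented; in practice they would
# come from a tool such as the Cochrane RCT classifier.
references = [
    ("record-001", 0.97),
    ("record-002", 0.64),
    ("record-003", 0.08),
    ("record-004", 0.41),
]

CUTOFF = 0.10  # chosen (and justified) in the review protocol to keep recall high

ranked = sorted(references, key=lambda rec: rec[1], reverse=True)
to_screen = [ref for ref, prob in ranked if prob >= CUTOFF]
set_aside = [ref for ref, prob in ranked if prob < CUTOFF]

print("Order for manual screening:", to_screen)
print("Excluded without manual screening:", set_aside)
```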

Priority screening, a type of machine classifier that orders references for manual sifting based on previous sifting decisions, is considered in the chapter on reviewing evidence .

Additional search techniques

Additional search techniques are used alongside database searching when it is known, or reasonably likely, that relevant evidence is not indexed in bibliographic databases, or when it will be difficult to retrieve relevant evidence from databases in a way that adequately balances recall and precision. Additional search techniques include forward and backward citation searching, journal hand-searches and contacting experts and stakeholders.

Existing reviews may provide an additional source of primary studies, with reference lists being used as an indirect method of identifying primary research.

Various tools, including Citationchaser and Web of Science, are available to speed up the process of citation searching. These may not be as comprehensive as manual reference list checking (due to limitations of the underlying data sources), but the trade-off in terms of speed is generally acceptable.

All search techniques should follow the same principles of transparency, rigour and reproducibility as other search methods.

If possible, additional search techniques should be considered at the outset and documented in the search protocol. They should also be documented in the supporting appendices for the final evidence review.

All searches aim to be inclusive. This may mean not specifying any population groups.

Searches should avoid inadvertently excluding relevant groups. For example, if the population group is older people, a search for older people should pick up subpopulations such as disabled older people.

Additional search strategies may be needed to target evidence about people with protected characteristics or people experiencing or at risk from other inequalities.

Searches may need to be developed iteratively to ensure coverage of the health inequalities issues or evidence on the impacts of an intervention on equality.

Appropriate terminology for the search should be used, considering how language has evolved.

Quality assuring the literature search is an important step in developing guideline recommendations. Studies have shown that errors in search strategies do occur.

For each search (including economic searches), the initial MEDLINE search strategy is quality assured by a second information specialist. A standardised checklist, based on the PRESS peer review of electronic search strategies: 2015 guideline statement, is used to ensure clarity and consistency when quality assuring search strategies.

The information specialist carrying out the quality assurance process also considers how appropriate the overall search approach is to the parameters of the evidence review (for example, the time available to carry out the review). The quality assurance comments are recorded and the information specialist who conducted the search should respond to the comments and revise the search strategy as needed.

Search strategy translations across the remaining databases are also checked by a second information specialist to ensure that the strategies have been adapted appropriately, in accordance with the interfaces and search functionality of the sources used.

Details of the evidence search are included as appendices to the individual evidence reviews. They are published for consultation alongside the draft evidence review and included in the final version.

Records are kept of the searches undertaken during guideline recommendation development for all review questions to ensure that the process for identifying the evidence is transparent and reproducible.

We use PRISMA-S, an extension to the PRISMA statement for reporting literature searches in systematic reviews, to inform search reporting. The search documentation is an audit trail that allows the reader to understand both the technical aspect of what was done (such as which sources were searched; what platform was used and on what date; any deviations from the original search protocol) and the underlying rationale for the search approach where this may not be immediately apparent.

Documenting the search begins with creating the search protocol (see the section on search protocols ). If using an iterative or emergent stepped approach, initial search strategies, key decision points and the reasons for subsequent search steps are clearly documented in the search protocol and final evidence review. When using a proprietary search engine such as Google, whose underlying algorithm adapts to different users, the search is reported in a way that should allow the reader to understand what was done.
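One simple way to keep this kind of audit trail is to record each search as a structured entry as soon as it is run. The sketch below appends a PRISMA-S-style log entry to a JSON-lines file; the field names are illustrative assumptions, not a NICE or PRISMA-S requirement.

```python
# Illustrative search log entry; the field names are an example, not a standard.
import json
from datetime import date

entry = {
    "review_question": "RQ1: infection prevention and control for MRSA in aged care homes",
    "source": "MEDLINE",
    "platform": "Ovid",
    "date_searched": str(date.today()),
    "strategy": '("infection control*" OR "infection prevention") AND ...',
    "limits": ["English language", "2010 to current"],
    "records_retrieved": 412,
    "deviations_from_protocol": "none",
}

# One JSON record per line gives a simple, append-only audit trail.
with open("search_log.jsonl", "a", encoding="utf-8") as log_file:
    log_file.write(json.dumps(entry) + "\n")
```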

Searches undertaken to identify evidence for each review question (including economics searches) may be re-run before consultation or before publication. For example, searches are re‑run if the evidence changes quickly, there is reason to believe that substantial new evidence exists, or the development time is longer than usual.

A decision to re‑run searches is taken by the development team and staff with responsibility for quality assurance.

If undertaken, searches are re‑run at least 6 to 8 weeks before the final committee meeting before consultation.

If evidence is identified after the last cut‑off date for searching but before publication, a judgement on its impact is made by the development team and staff with responsibility for quality assurance. In exceptional circumstances, this evidence can be considered if its impact is judged as potentially substantial.

In some topic areas or for some review questions, staff with responsibility for quality assurance, the development team or the committee may believe that there is relevant evidence in addition to that identified by the searches. In these situations, the development team may invite stakeholders, and possibly also other relevant organisations or individuals with a significant role or interest (see expert witnesses in the section on other attendees at committee meetings in the chapter on decision-making committees ), to submit evidence. A call for evidence is issued directly to registered stakeholders on the NICE website. Examples and details of process are included in the appendix on call for evidence and expert witnesses . Confidential information should be kept to an absolute minimum.

Ayiku L, Levay P, Hudson T et al. (2017) The medline UK filter: development and validation of a geographic search filter to retrieve research about the UK from OVID medline. Health Information and Libraries Journal 34(3): 200–216

Ayiku L, Levay P, Hudson T et al. (2019) The Embase UK filter: validation of a geographic search filter to retrieve research about the UK from OVID Embase. Health Information and Libraries Journal 36(2): 121–133

Ayiku L, Hudson T, Williams C et al. (2021) The NICE OECD countries' geographic search filters: Part 2: validation of the MEDLINE and Embase (Ovid) filters. Journal of the Medical Library Association 109(4): 583–9

Booth A, Briscoe S, Wright JM (2020) The "realist search": a systematic review of current practice and reporting. Research Synthesis Methods 11: 14–35

Canadian Agency for Drugs and Technologies in Health (2019) Grey Matters: a practical tool for searching health-related grey literature [online; accessed 24 July 2023]

Glanville J, Lefebvre C, Wright K (editors) (2008, updated 2017) The InterTASC Information Specialists' Subgroup Search Filters Resource [online; accessed 24 July 2023]

Kaltenthaler E, Tappenden P, Paisley S (2011) NICE DSU Technical support document 13: identifying and reviewing evidence to inform the conceptualisation and population of cost-effectiveness models [online; accessed 24 July 2023]

Kugley S, Wade A, Thomas J et al. (2017) Searching for studies: a guide to information retrieval for Campbell systematic reviews. Oslo: The Campbell Collaboration

Lefebvre C, Glanville J, Briscoe S et al. Chapter 4: Searching for and selecting studies. In: Higgins JPT, Thomas J, Cumpston M et al. (editors). Cochrane Handbook for Systematic Reviews of Interventions version 6.2 (updated February 2021). Cochrane, 2021

McGowan J, Sampson M, Salzwedel DM et al. (2016) PRESS Peer Review of Electronic Search Strategies: 2015 guideline statement. Journal of Clinical Epidemiology 75: 40–6

National Institute for Health and Care Excellence (2022) NICE real-world evidence framework [online; accessed 24 July 2023]

Paisley S (2016) Identification of key parameters in decision-analytic models of cost-effectiveness: a description of sources and a recommended minimum search requirement. Pharmacoeconomics 34: 597–8

Rethlefsen M, Kirtley S, Waffenschmidt S et al. (2021) PRISMA-S: an extension to the PRISMA statement for reporting literature searches in systematic reviews. Systematic Reviews 10: 39

Stansfield C, Stokes G, Thomas J (2022) Applying machine classifiers to update searches: analysis from two case studies. Research Synthesis Methods 13: 121–33

Summarized Research in Information Retrieval for HTA (SuRe Info) [online; accessed 24 July 2023]

How to undertake a literature search: a step-by-step guide

Affiliation.

  • 1 Literature Search Specialist, Library and Archive Service, Royal College of Nursing, London.
  • PMID: 32279549
  • DOI: 10.12968/bjon.2020.29.7.431

Undertaking a literature search can be a daunting prospect. Breaking the exercise down into smaller steps will make the process more manageable. This article suggests 10 steps that will help readers complete this task, from identifying key concepts to choosing databases for the search and saving the results and search strategy. It discusses each of the steps in a little more detail, with examples and suggestions on where to get help. This structured approach will help readers obtain a more focused set of results and, ultimately, save time and effort.

Keywords: Databases; Literature review; Literature search; Reference management software; Research questions; Search strategy.


Conducting a Literature Review


Developing a Search Strategy


A search strategy is an organized structure of key terms used to search a database. The search strategy combines the key concepts of your search question in order to retrieve accurate results.

Your search strategy will account for all of the following:

  • possible search terms
  • keywords and phrases
  • truncated and wildcard variations of search terms
  • subject headings (where applicable)

Each database works differently so you need to adapt your search strategy for each database. You may wish to develop a number of separate search strategies if your research covers several different areas. 

It is a good idea to test your strategies and refine them after you have reviewed the search results.

This is a sample planner to develop your search terms from a PICO format:

[Chart: a simple planner with the PICO elements as headings and descriptions below each]

Identifying Search Terms

Once you have developed your research question or chosen your topic you can begin to brainstorm terms to use in your database search.

  • Brainstorm terms authors or indexers might use to describe your topic
  • Make a list of terminology and relevant terms to use in your search
  • Include synonyms or similar terms to combine using the Boolean operator OR
  • Search for controlled vocabulary in the databases, e.g. search PubMed for MeSH terms

Combine the Elements of Your PICO Question with Boolean Operators

Boolean Operators (Using AND, OR, NOT):

Boolean logic is a building block of many computer applications and is an important concept in database searching.  Using the correct Boolean operator can make all the difference in a successful search.

AND, OR, NOT

There are three basic Boolean search commands: AND, OR, and NOT. A short set-based sketch illustrating all three follows the list below.

  • AND searches find all of the search terms. For example, searching on dengue AND malaria AND zika returns only results that contain all three search terms. This yields very limited results.
  • OR searches find one term or the other. Searching on dengue OR malaria OR zika returns all items that contain any of the three search terms. This returns a large number of results.
  • NOT eliminates items that contain the specified term. Searching on malaria NOT zika returns items that are about malaria, but will specifically not return items that contain the word zika. This is a way to fine-tune results. Note: sometimes AND NOT is used; it serves the same function as NOT.
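If it helps to see why AND narrows and OR broadens, the toy sketch below models each term as the set of records that mention it (the record numbers are invented) and applies the three operators as set operations.

```python
# Toy illustration: each set holds the IDs of records mentioning a term.
dengue = {1, 2, 3, 4}
malaria = {3, 4, 5, 6}
zika = {4, 6, 7}

print(dengue & malaria & zika)  # AND: records containing all three terms -> {4}
print(dengue | malaria | zika)  # OR: records containing any of the terms -> {1, 2, 3, 4, 5, 6, 7}
print(malaria - zika)           # NOT: malaria records that do not mention zika -> {3, 5}
```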

Using Boolean Search with Exact Phrases:

If you're searching for a phrase rather than just a single word, you can group the words together with quotation marks.  Searching on "dengue fever" will return only items with that exact phrase.  

When to use Parentheses?

Think of your search in concepts, then put those concepts inside parentheses. Different databases have different rules about combining searches. To make sure you get the search you want, use parentheses; they are interpreted consistently across databases.
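As one concrete way to run a grouped search, the sketch below sends a parenthesised query to PubMed through the NCBI E-utilities esearch endpoint and prints the number of matching records. It assumes the requests package is available; for anything beyond light testing, NCBI asks users to supply contact details and an API key, which are omitted here.

```python
# Minimal sketch: run a grouped Boolean query against PubMed via NCBI E-utilities.
import requests

query = '("dengue fever" OR dengue) AND (prevention OR control)'

response = requests.get(
    "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi",
    params={"db": "pubmed", "term": query, "retmode": "json", "retmax": 20},
    timeout=30,
)
response.raise_for_status()
result = response.json()["esearchresult"]

print("Total records found:", result["count"])
print("First PMIDs:", result["idlist"])
```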

Run a Preliminary Search

Look at titles and publication dates to decide which articles you want to look at in depth.

  • Select an article and begin the skimming and scanning process.
  • If the list has too many irrelevant results, consider selecting different keywords and revising your search.
  • If the list has too many results, consider setting date limiters or narrowing your results by searching phrases instead of keywords.
  • If the list has too few results, consider selecting different keywords.


Duke University Libraries

Literature Reviews


Creating a search strategy



When conducting a literature review, it is imperative to brainstorm a list of keywords related to your topic. Examining the titles, abstracts, and author-provided keywords of pertinent literature is a great starting point.

Things to keep in mind (a short pattern-matching sketch follows this list):

  • Alternative spellings (e.g., behavior and behaviour)
  • Variants and truncation (e.g., environ* = environment, environments, environmental, environmentally)
  • Synonyms (e.g., alternative fuels >> electricity, ethanol, natural gas, hydrogen fuel cells)
  • Phrases and double quotes (e.g., "food security" versus food OR security) 
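When screening a downloaded list of titles on your own machine, the same truncation and wildcard ideas can be approximated with regular expressions. The sketch below is purely illustrative: the titles are invented, and this is not how the databases themselves implement * or ?.

```python
# Illustrative mapping of truncation and wildcard patterns to regular expressions
# for screening a local list of titles (titles invented for the example).
import re

titles = [
    "Environmental drivers of alternative fuel adoption",
    "Consumer behaviour and food security in urban areas",
    "Consumer behavior toward ethanol-blended fuels",
]

patterns = {
    "environ*": r"\benviron\w*",              # truncation: environment, environmental, ...
    "behavio?r": r"\bbehaviou?r\b",           # spelling variants: behavior / behaviour
    '"food security"': r"\bfood security\b",  # exact phrase
}

for label, pattern in patterns.items():
    matches = [t for t in titles if re.search(pattern, t, re.IGNORECASE)]
    print(f"{label}: {len(matches)} matching title(s)")
```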

One way to visually organize your thoughts is to create a table where each column represents one concept in your research question. For example, if your research question is...

Does social media play a role in the number of eating disorder diagnoses in college-aged women?

...then your table might look something like this:

[Table: one column per concept (social media, eating disorders, college-aged women), with related terms listed beneath each]

Generative AI tools, such as chatbots, are actually quite helpful at this stage when it comes to brainstorming synonyms and other related terms. You can also look at author-provided keywords from benchmark articles (key papers related to your topic), databases' controlled vocabularies, or do a preliminary search and look through abstracts from relevant papers.

Generative AI tools: ChatGPT, Google Gemini (formerly Bard), Claude, Microsoft Copilot

For more information on how to incorporate AI tools into your research, check out the section on  AI Tools .

Boolean searching yields more effective and precise search results. Boolean operators include  AND , OR , and NOT . These are logic-based words that help search engines narrow down or broaden search results.

Using the Operators

The Boolean operator  AND  tells a search engine that you want to find information about two (or more) search terms. For example, sustainability AND plastics. This will narrow down your search results because the search engine will only bring back results that include both search terms.

The Boolean operator  OR  tells the search engine that you want to find information about either search term you've entered. For example, sustainability OR plastics. This will broaden your search results because the search engine will bring back any results that have either search term in them.

The Boolean operator  NOT  tells the search engine that you want to find information about the first search term, but nothing about the second. For example, sustainability NOT plastics. This will narrow down your research results because the search engine will bring back only resources about the first search term (sustainability), but exclude any resources that include the second search term (plastics).

[Venn diagram illustrating Boolean operators]

Some databases offer a thesaurus , controlled vocabulary , or list of available subject headings that are assigned to each of its records, either by an indexer or by the original author. The use of controlled vocabularies is a highly effective, efficient, and deliberate way of comprehensively discovering the material within a field of study.

  • APA Thesaurus of Psychological Index Terms  (via PsycInfo database)
  • Medical Subject Headings (MeSH)  (via PubMed)
  • List of ProQuest database thesauri

Web of Science's Core Collection offers a list of subject categories that are searchable by the  Web of Science  Categories field .

Reach out to a Duke University Libraries librarian at [email protected] or use the chat function.


Not sure where to start when selecting a scholarly database to search? Here are some top databases:

While not essential for traditional literature reviews, documenting your search can help you:

  • Keep track of what you've done so that you don't repeat unproductive searches
  • Reuse successful search strategies for future papers
  • Describe your search process for manuscripts
  • Justify your search process

Documenting your search will help you stay organized and save time when tweaking your search strategy. This is a critical step for rigorous review papers, such as  systematic reviews .

One of the easiest ways to document your search strategy is to use a table like this:

[Image: example search documentation table]

If you find that you're receiving too many results , try the following tips:

  • Use more AND operators to connect keywords/concepts in order to narrow down your search.
  • Use more specific keywords rather than an umbrella term (e.g., "formaldehyde" instead of "chemical").
  • Use quotation marks (" ") to search an entire phrase.
  • Use filters such as date, language, document type, etc.
  • Examine your research question to see if it's still too broad.

On the other hand, if you're not receiving enough results :

  • Use more OR operators to connect related terms and bring in additional results.
  • Use more generic terms (e.g., "acetone" instead of "dimethyl ketone") or fewer keywords altogether.
  • Use wildcard operators (*) to expand your results (e.g., toxi* searches toxic, toxin, toxins).
  • Examine your research question to see if it's too narrow.

Grey (or gray) literature refers to research materials and publications that are not commercially published or widely distributed through traditional academic channels. If you are tasked with doing an intensive type of review or evidence synthesis, or you are involved in research related to policy-making, you will likely want to include searching for grey literature.   This type of literature includes:

  • working papers
  • government documents
  • conference proceedings
  • theses and dissertations
  • white papers, etc.

For more information on grey literature, please see our Grey Literature guide .

  • Public policy
  • Health/medicine
  • Statistics/data
  • Thesis/dissertation
  • ProQuest Central: Search for articles from thousands of scholarly journals
  • OpenDOAR OpenDOAR is the quality-assured, global Directory of Open Access Repositories. We host repositories that provide free, open access to academic outputs and resources.
  • OAIster A catalog of millions of open-access resources harvested from WorldCat.
  • GreySource An index of repository hyperlinks across all disciplines.
  • Pew Research Center Pew Research Center is a nonpartisan fact tank that informs the public about the issues, attitudes and trends shaping the world. We conduct public opinion polling, demographic research, content analysis and other data-driven social science research.
  • The World Bank The World Bank is a vital source of financial and technical assistance to developing countries around the world.
  • World Health Organization (WHO): IRIS IRIS is the Institutional Repository for Information Sharing, a digital library of WHO's published material and technical information in full text produced since 1948.
  • PolicyArchive PolicyArchive is a comprehensive digital library of public policy research containing over 30,000 documents.
  • Kaiser Family Foundation KFF is the independent source for health policy research, polling, and journalism. Our mission is to serve as a nonpartisan source of information for policymakers, the media, the health policy community, and the public.
  • MedNar Mednar is a free, medically-focused deep web search engine that uses Explorit Everywhere!, an advanced search technology by Deep Web Technologies. As an alternative to Google, Mednar accelerates your research with a search of authoritative public and deep web resources, returning the most relevant results to one easily navigable page.
  • Global Index Medicus The Global Index Medicus (GIM) provides worldwide access to biomedical and public health literature produced by and within low-middle income countries. The main objective is to increase the visibility and usability of this important set of resources. The material is collated and aggregated by WHO Regional Office Libraries on a central search platform allowing retrieval of bibliographical and full text information.

For more in-depth information related to grey literature searching in medicine, please visit Duke Medical Center Library's guide .

  • Education Resources Information Center (ERIC) ERIC is a comprehensive, easy-to-use, searchable, Internet-based bibliographic and full-text database of education research and information. It is sponsored by the Institute of Education Sciences within the U.S. Department of Education.
  • National Center for Occupational Safety and Health (NIOSHTIC-2) NIOSHTIC-2 is a searchable bibliographic database of occupational safety and health publications, documents, grant reports, and other communication products supported in whole or in part by NIOSH (CDC).
  • National Technical Information Service (NTIS) The National Technical Information Service acquires, indexes, abstracts, and archives the largest collection of U.S. government-sponsored technical reports in existence. The NTRL offers online, free and open access to these authenticated government technical reports.
  • Science.gov Science.gov provides access to millions of authoritative scientific research results from U.S. federal agencies.
  • GovInfo GovInfo is a service of the United States Government Publishing Office (GPO), which is a Federal agency in the legislative branch. GovInfo provides free public access to official publications from all three branches of the Federal Government.
  • CQ Press Library: Search for analysis of Congressional actions and US political issues. Includes CQ Weekly and CQ Researcher.
  • Congressional Research Service (CRS) This collection provides the public with access to research products produced by the Congressional Research Service (CRS) for the United States Congress.

Please see the Data Sets and Collections page from our Statistical Sciences guide.

  • arXiv arXiv is a free distribution service and an open-access archive for nearly 2.4 million scholarly articles in the fields of physics, mathematics, computer science, quantitative biology, quantitative finance, statistics, electrical engineering and systems science, and economics. Materials on this site are not peer-reviewed by arXiv.
  • OSF Preprints OSF Preprints is an open access option for discovering multidisciplinary preprints as well as postprints and working papers.


How to Develop a Literature Search Strategy. Published by Dr. Courtney Watson on May 9, 2022.


Diving into your literature search is one of the most exciting parts of the dissertation process , and also one of the most vital. While it takes some thought and preparation, developing a sound literature search strategy is a must. In order to maximize your efforts and avoid getting lost in the databases, it’s a good idea to do some careful planning before you start the research process. 

These are some frequently asked questions about literature search strategies: 

  • What is a literature search strategy? 
  • Why do I need a literature search strategy?
  • How should I structure my literature search? 
  • How do I perfect my literature search strategy? 

What is a Literature Search Strategy?

A literature search strategy is an action plan for marshaling all of the scholarly sources that you will use in your dissertation. Your search will give you a clear picture of the history of your topic and the current critical conversation, as well as expose gaps in the literature ripe for investigation. Eventually, one of these gaps will help you form the research question that will be the beating heart of your dissertation.


Your literature search strategy should reflect your desires and goals for developing your topic and completing your dissertation. It will inform the depth and scope of your search, as well as the variety of sources you seek out. Your search strategy will also influence your research timeline and when you will complete each milestone of your dissertation.  

Why Do I Need a Literature Search Strategy? 

In my experience as a scholar, professor, and dissertation consultant, I have reviewed hundreds of reference pages. Each one tells a story, a saga of works cited or dismissed, consulted or overlooked. When a literature search includes dozens, or even hundreds, of sources, it’s easy to get lost. Having a clear literature search strategy will help you stay out of the weeds. 

In addition to getting overwhelmed by the number of sources you'll be examining during your literature search, it is also easy to get distracted. When I was working on my dissertation, my advisor cautioned me to be mindful of all the “bright, shiny objects” that would lead me down rabbit holes that were only tangentially related to my topic. While it's hard to resist the lure of new horizons, a set literature search strategy will help you stay focused on the task at hand.


How Should I Structure My Literature Search?

A good place to start is a thorough accounting of all of the resources that are available to you. While I had years' worth of graduate scholarship experience before I started my dissertation, it wasn't until I started my research that I fully understood how vast the body of knowledge about my topic really was, or how much work was ahead of me. For me, it was the beginning of a transformative learning experience (which is kind of the point, right?), and it made me see how research is a science as well as an art.

My search began with the databases my university library subscribed to, and I spent my days combing through JSTOR and EBSCO refining my search terms, conducting advanced searches, skimming abstracts, and wading through theory. I became well-acquainted with the inter-library loan system (check it out, it’s great!) as well as the seemingly endless ink-and-paper journal archives. While I started with an informal literature search strategy, it became more targeted the further I progressed. 

Perfecting Your Literature Search Strategy

While no plan is foolproof, there are steps you can take to safeguard your literature search strategy. My best advice is not to wing it; having a plan will save you a lot of time and frustration. Also, get to know your academic library and all of its resources. Research librarians are specialists in their field, and the unsung heroes of academic research. They’re a great friend to have by your side when you’re in the trenches of the literature search. 


Don’t be afraid to seek professional help beyond the library. There are a multitude of resources available to you. When mapping out your literature search strategy, discuss your plan with your advisor or a dissertation consultant in order to gain perspective and insight about possible pitfalls. If you hit a snag, reach out and ask for help early; waiting just leads to frustration and lost time. Momentum is an important key to finishing your dissertation , and an expert literature search strategy is the perfect way to get the ball rolling. 


Dr. Courtney Watson

Dr. Courtney Watson has research, professional, and dissertation committee experience in the humanities and social sciences, health sciences, education, and liberal arts. With a background in peer-reviewed qualitative research and scholarship, she is skilled at coaching clients through the developmental phases of dissertation research, writing, revision, feedback analysis, and citation. She also offers thoughtful and thorough academic job market preparation, guidance through the dissertation process, and higher education career advice.



Research-Methodology

Literature search strategy

Sometimes you are required to explain the literature search strategy used in your research. Even when you are not officially required to do so, including an explanation of your literature search strategy in the literature review chapter can boost your marks considerably.

Keeping a literature search diary to record your search activities is a good way of keeping track of your literature review progress. The diary can be kept on paper, in a Microsoft Word file, or in an Excel spreadsheet, and should include the following (a minimal scripted example follows the list):

  • Names of sources
  • Search terms used
  • The number of search results generated from each source.
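Assuming you want a spreadsheet-friendly diary, a minimal scripted version might look like the sketch below; the column names and the example row are illustrative only.

```python
# Append one row per search to a CSV literature search diary (illustrative columns).
import csv
import os
from datetime import date

FIELDS = ["date", "source", "search_terms", "results"]

row = {
    "date": str(date.today()),
    "source": "Google Scholar",
    "search_terms": '"employee motivation" AND "management style"',
    "results": 1240,
}

new_file = not os.path.exists("search_diary.csv")
with open("search_diary.csv", "a", newline="", encoding="utf-8") as diary:
    writer = csv.DictWriter(diary, fieldnames=FIELDS)
    if new_file:
        writer.writeheader()  # write the header only once, for a new file
    writer.writerow(row)
```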

Generally, you can conduct your literature search in the following stages:

1. Identification of search terms. For example, for a study entitled “An investigation into the impacts of management practices on the levels of employee motivation at Coca-Cola USA”, search terms can be specified as management, management style, motivation, employee morale, leadership, satisfaction, work-life balance, and others.

Your search strategy for the relevant literature should also consider synonyms of key words. In the example above, the search term employee motivation might be referred to elsewhere as employee morale or employee willingness.

2. Finding an initial pool of online and offline resources according to the search terms. Equipped with search terms, a vast pool of relevant literature can be generated from a wide range of sources. The most effective secondary data sources include the following:

  • Bibliographic databases such as Emerald and Google Scholar
  • Online libraries such as Questia
  • Conference proceedings
  • Key industry journals and magazines

3. Filtering the literature according to the credentials of authors. Due to the word limits imposed for the literature review chapter, as well as other chapters of the dissertation, it is neither possible nor desirable to discuss all of the sources you have found in this chapter. Only the works of the most noteworthy scholars and authors need to be included in the literature review.

Scholars with the highest credentials usually publish their articles in peer-reviewed journals and reputable magazines, rather than newspapers and online blogs. You should take this into account when devising and applying your literature search strategy.

4. Further filtering the remaining literature according to the contribution of the text to the development of the research area. Regardless of the selected research area, the literature review will identify many works that have been completed by respected authorities in the area. Due to word limits, only the most important contributions to the research area need to be mentioned in the literature review.

For example, within the research area of organizational culture, such contributions include Harrison's Model of Culture (1972), the Competing Values Framework by Cameron and Quinn (1999), Geert Hofstede's Cultural Dimensions, Trompenaars' Cultural Dimensions, and others.

5. Filtering the remaining literature according to date of publication. Your search strategy needs to give more preference to recent publications. Apart from the inclusion of major models and theoretical frameworks, you have to focus on the latest developments in the research area. Therefore, it is important to critically analyze up-to-date sources, and the majority of the literature discussed in this chapter should have been published during the last five years.

The literature review chapter of your dissertation should include the literature that remains after applying all five literature search stages discussed above.


John Dudovskiy


MFT 204: Individual and Family Life Cycle Development (Bosley, 2024)


Search strategies


Research is not always a linear process and figuring out the best keywords often involves starting out with a few words then adapting, shifting, and exploring. Sometimes one specific word will be enough. Other times, you'll need several different words to describe a concept AND you'll need to connect that concept to a second (and/or third) concept.

Boolean operators  (AND, NOT, OR) connect words and concepts. 

[Diagram explaining Boolean operators (Wikimedia Commons)]

Other search tricks:

Truncation: place an asterisk (*) at the end of a word's root, allowing you to retrieve many more documents containing variations of the search term. Example: replicat* will find replicate, replicates, replication, replicating, etc.

Phrase searching: put quotation marks around two or more words so that the database looks for those words in that exact order. Examples: "public health" and "prenatal care".

Controlled vocabulary: use the terms the database uses to describe what each article is about as search terms. Searching with controlled vocabularies is a great way to get at everything on a topic in a database (a short PubMed example follows).
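As a small illustration of mixing controlled vocabulary with free-text keywords, the sketch below assembles a PubMed-style query; [MeSH Terms] and [Title/Abstract] are PubMed field tags, and the specific terms are just an example.

```python
# Illustrative PubMed-style query combining a MeSH heading with free-text phrases.
subject_heading = '"Prenatal Care"[MeSH Terms]'
free_text_terms = [
    '"prenatal care"[Title/Abstract]',
    '"antenatal care"[Title/Abstract]',
]

query = "(" + " OR ".join([subject_heading] + free_text_terms) + ")"
print(query)
# ("Prenatal Care"[MeSH Terms] OR "prenatal care"[Title/Abstract] OR "antenatal care"[Title/Abstract])
```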


Keyword vs. Subject Searching

Search terms are extracted from your research question (such as the terms that make up your PICO) and can be entered into whichever database(s) you decide to use. Databases give you the option of using keywords or subject headings.

Each database has its own set of subject headings, designed specifically for the literature from the field(s) of study the database contains. Knowing the difference between keywords and subject headings, as well as the advantages and disadvantages for both of them, can help you perform better searches.

Keyword searching  is how you typically search web search engines.  Think of important words or phrases and type them in to get results. 

  • Natural language words describing your topic - good to start with 
  • Flexible and able to be combined in any number of ways
  • Searches for matching words or phrases anywhere in the records the database contains (such as title, abstract, journal title)
  • Sometimes either too broad or too narrow, resulting in either too many or too few results
  • Reflective of recent phenomena in advance of when the subject headings are added

Subject headings  describe the content of each item in a database. Use these headings to find relevant items on the same topic.  

  • Pre-defined, "controlled" vocabulary used to describe the content of a text found in a database (such as PubMed MeSH or CINAHL Subject Headings).
  •  Less flexible and  must  be chosen from the thesaurus used by the database; if the incorrect subject heading is selected, none of the results will be relevant.
  • Database looks for subjects only in the subject heading or descriptor field, where the most relevant words appear 
  • Helpful for retrieving a set of articles with fewer irrelevant results
  • Slow to change--this means that the most recent changes in knowledge--on diseases, drugs, devices, procedures, concepts--may not be reflected in the controlled vocabulary.

Keyword or Subject Heading Search?

Some basic guidelines are:

  • If the term or topic is very recent, keywords may be the best option
  • If no Subject Heading exists for your term, or seems inadequate, use a keyword
  • e.g. neuroses would be a very broad keyword search
  • e.g. Heart attack OR Myocardial Infarction (see the example search string below)
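To make the last example concrete, here is one way it might look in PubMed-style syntax (a minimal, hypothetical illustration; field tags differ in other databases), pairing the free-text keyword with the corresponding MeSH heading:

    # Hypothetical PubMed-style query combining a keyword with its subject heading.
    # [tiab] searches words in the title/abstract; [MeSH Terms] searches the controlled vocabulary field.
    query = '"heart attack"[tiab] OR "myocardial infarction"[MeSH Terms]'
    print(query)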

Use the filters in each database to narrow your search and eliminate irrelevant results. This helps to make your search process more efficient. 

Limiting to a specific year range and to peer-reviewed articles are the most commonly used filters.

Other helpful filters include full-text availability, study type, and population group, as well as more specific filters such as "Randomized Controlled Trials" and "Evidence Based Practice." Filters vary according to database. Typically, you can find filters either on the Advanced Search page, underneath the search boxes, or on the left-hand side of search results. These two guide pages go over in greater detail how to use certain filters in individual databases: Evidence-Based Practice and Quantitative and Qualitative Research. 

  • Last Updated: Feb 14, 2024 9:50 AM
  • URL: https://libguides.hofstra.edu/MFT204

Hofstra University


  • Research Guides

Grey Literature

  • What is Grey Literature? Activities
  • Why Use Grey Literature?
  • Types of Grey Literature
  • Sources of Grey Literature
  • Searching for Grey Literature
  • Evaluating Grey Literature Activity
  • How to Incorporate & Cite Grey Literature

This guide includes content adapted with permission from the University of Illinois Library .


Incorporate & Cite

There are many ways to incorporate grey literature into your work, and the process is not much different from incorporating scholarly works. However, there are some key differences to remember when using grey literature. 

Evaluate and Verify

It is worth repeating that grey literature is not peer-reviewed like scholarly works, so evaluating and verifying grey literature before using it is extremely important. Use this guide's "Evaluating Grey Literature" page to see how this process works! 

Firstly, make sure that you are citing your sources properly! See the APA Citation Style guide link below to help with any challenges.

As with scholarly works, it is essential to clearly and correctly cite grey literature. However, with grey literature, it would behoove both you as a scholar and those reading your work to give more context for the grey literature to make it clear that the works you are citing could potentially be biased or inaccurate in ways scholarly works are not. 

For example, do you see a difference between the following two in-text citations?

"A 2019 report from Amnesty International states that the Chinese government is committing cultural genocide against Ughegrs. Human Rights Watch's 2020 report corroborates this and the reporting of the New York Times."  

"China is committing genocide against Muslims (Amnesty International 2019)."

The first citation is very clear in giving context, specificity, and verification for the grey literature, which will be extremely helpful to future researchers and make the paper's statements more accurate. 

  • APA for Peabody Students Provides support for using APA style.
  • Last Updated: Feb 14, 2024 11:35 AM
  • URL: https://researchguides.library.vanderbilt.edu/greyliterature

Creative Commons License

  • GradPost Blog

Upcoming Library workshops for graduate students

Career & Tools

The Library is here to support graduate student development with the addition of 4 new workshops throughout the winter quarter.


Check out these upcoming offerings and be sure to sign up today!

Literature Reviews in the Humanities & Social Sciences: Practical Tips and Strategies

Date & Time: Wednesday, 1/31/2024 | 12-1:00pm Location: Library 1312

In this workshop, you will learn how to get started creating a Literature Review, including:

  • What Literature Reviews are, and where to find them
  • How to use concept mapping to identify which secondary sources you need
  • How to find and organize your sources to facilitate your writing
  •  Save your searches
  • Create Alerts for new citations in your topic
  • How to best cite and export your sources

Snacks will be provided.  Register Here!

Questions? Contact Marisol Ramos or Jane Faulkner

Zotero Workshop for Graduate Students

Date & Time: Wednesday, 2/14/2024 | 12-1:00pm Location: Library 1312

What is Zotero? Zotero is a free bibliography management tool that helps you collect citations for your research and generate entries for notes and bibliographies. While geared to graduate students, UCSB faculty and other affiliated researchers are welcome to attend. The “Zotero Basics” workshop lasts one hour and will cover all aspects of using Zotero to manage citations for your personal research. This workshop is designed for beginners; however, we will also share tips to help experienced users make better use of Zotero. Registration in advance is required. Refreshments will be provided.

Questions? Contact Kristen LaBonte

Copyright: What Do Users Need to Know? What Do Creators Need to Know?

Date & Time: Wednesday, 2/28/2024 | 12-1:00pm Location: Library 1312

In this presentation you will learn the basics of copyright under US law, including:

  • What is copyright? What does it cover?
  • How does a creator get their work copyrighted? What is "work for hire"?
  • How long does a copyright last?
  • What is public domain? What is Fair Use?

Refreshments will be provided. Advance registration is required. Register Here!

Questions? Contact Chuck Huber

ORCID Workshop for Graduate Students

Date & Time: Wednesday, 3/06/2024 | 12-1:00pm Location: Library 1312

UCSB Library invites graduate students to learn how to use ORCID in your research. ORCID (Open Researcher and Contributor ID) is a unique, persistent digital identifier that you own and control, and that distinguishes you from every other researcher.

In this workshop we will cover:

  • Creating a free ORCID ID and managing your ORCID account
  • Placing your publications in ORCID
  • Funding, publisher, and research institution requirements
  • Tips and tricks for the best use of ORCID, especially through name and institution changes

Snacks will be provided. Advance registration required. Register Here!

This will ensure that you receive the pre-workshop email with instructions for what you need to do on your computer ahead of time.  

Questions? Contact Kristen LaBonte or Chuck Huber


NCBI Bookshelf. A service of the National Library of Medicine, National Institutes of Health.

Dudley RA, Frolich A, Robinowitz DL, et al. Strategies To Support Quality-based Purchasing: A Review of the Evidence. Rockville (MD): Agency for Healthcare Research and Quality (US); 2004 Jul. (Technical Reviews, No. 10.)


Strategies To Support Quality-based Purchasing: A Review of the Evidence.

2. Methods for Literature Search

  • Technical Expert Advisory Panel

For advice on the scope of the project, refinement of the key questions, and preparation of this technical review, we consulted technical experts in the following fields: employer purchasing strategies, provider performance assessment, consumer use of report cards and consumer preferences for health care information, risk adjustment, and economics. (See Appendix A , available at www.ahrq.gov/clinic/epcindex.htm .)

  • Target Audiences and Population

The decisionmakers addressed in this technical review are purchasers (both private purchasers such as employers and public purchasers such as the Centers for Medicare & Medicaid Services and State Medicaid programs), executives in health plans that must negotiate incentive arrangements with provider organizations or individual providers, executives in provider organizations that must negotiate incentive arrangements with providers, public health officials and other organizations interested in creating health care performance reports for public release, and policymakers. For the purpose of this report, provider organizations include all clinical health providers such as physicians, nurses, and hospitals. Public health officials and policymakers include those at the local, State, Federal, and international levels.

The ultimate target population of this report is the U.S. population at risk for morbidity or mortality resulting from quality problems in the provision of health care. We are interested in QBP strategies that affect the entire U.S. population—all members of which are at risk for receiving poor quality care—including those of all racial and ethnic backgrounds, all ages, and both genders.

  • Key Questions

We developed the key questions in collaboration with AHRQ, the Alliance (the nominating partner), and our Technical Expert Panel. The goal of these discussions was to identify the issues purchasers interested in QBP faced so that, if the available research offered conclusions about these aspects of QBP, the various stakeholders would be in a better position to select optimal approaches to QBP.

The key questions for which literature, ongoing research, or results from analyses were sought in preparation of this report were:

  • What is the evidence on the extent to which health plans and employers use incentives to improve quality and efficiency?
  • Does the use of financial incentives for quality and efficiency actually increase the probability that patients receive high quality, efficient care?
      • The basis of the incentive (structure, process, outcome)?
      • The nature of the incentive (bonus, penalties or holdback, tiering or patient steerage/referral)?
      • To whom the incentive is targeted (plan vs. provider group vs. individual provider)?
      • The payer of the incentive (purchaser vs. plan vs. medical group)?
      • The magnitude of the incentive?
  • Does the use of nonfinancial incentives for quality and efficiency actually increase the probability that patients receive high quality, efficient care?
      • The nature of the incentive (public release of performance report vs. confidential performance report)?
  • Does greater spending result in higher quality?
  • What are the cost savings for the health care provider and purchaser as a result of the quality improvement?
  • What are the cost savings associated with different approaches to preventing medical errors or otherwise improving quality?
  • What specific processes and structures result in quantifiable cost savings? Who realizes the savings? How should they be shared?
  • What contextual variables (e.g., provider supply, employer number and market share, health plan competition, organizational system/infrastructure, employee demographics) positively or negatively influence the effectiveness of financial and nonfinancial incentives for providers?
  • Literature Review Methods

Based on input from our expert advisors, our conceptual model, and practical considerations, we developed literature review methods that included: inclusion and exclusion criteria to identify potentially relevant articles, search strategies to retrieve articles, abstract review protocols, and a system of scoring published studies for completeness.

Inclusion and Exclusion Criteria

To be considered an article that provided evidence regarding one of the key questions above, the article had to address one of the predictor variables and either quality (as measured by processes or outcomes) or cost. In addition, the intervention in the trial had to be a strategy that could plausibly be introduced by a purchaser. Our focus was on articles that provided definitive primary data from randomized, controlled trials, but we also included systematic reviews to determine whether these contained any additional information not covered by the primary randomized, controlled trial reports.

We excluded articles that did not meet specific criteria in terms of the quality of the research and reporting (an illustrative screening sketch follows the list below). These were:

  • Intervention randomized
  • Inclusion/exclusion criteria clear and appropriate
  • Greater than 75% follow-up
  • Note: two criteria usually used to judge the quality of a randomized, controlled trial—provision of placebo to the control group and blinding of the subjects—are not applicable in this situation
  • Information source appropriate
  • Information source adequately searched
  • Data abstraction performed by at least 2 independent reviewers
  • Principal measures of effect and the methods of combining results appropriate
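As a minimal sketch only, assuming an invented record format (the review itself did not publish any code), the randomized-trial criteria above could be applied as a simple screening checklist:

    # Illustrative screening of the trial-quality criteria listed above; record fields are hypothetical.
    def meets_trial_quality_criteria(record):
        checks = [
            record.get("intervention_randomized", False),   # intervention randomized
            record.get("criteria_clear", False),            # inclusion/exclusion criteria clear and appropriate
            record.get("followup_fraction", 0.0) > 0.75,    # greater than 75% follow-up
        ]
        return all(checks)

    example_study = {"intervention_randomized": True, "criteria_clear": True, "followup_fraction": 0.82}
    print(meets_trial_quality_criteria(example_study))  # True for this invented record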

Search Strategy

The objective of our search strategy was to identify all published QBP randomized trials and all ongoing research into QBP strategies. For the literature review, we used standard search strategies involving the querying of two online databases (MEDLINE ® and Cochrane) using key words, followed by evaluation of the bibliographies of relevant articles, Web sites of relevant organizations (especially of funding agencies providing project summaries and of employer organizations pursuing QBP), and reference lists provided by our Technical Expert Panel ( Table 1 ).

Table 1. Information sources for literature review and catalog of ongoing research.

Database Searches

To identify potentially relevant articles in the medical literature, we searched MEDLINE ® and Cochrane databases and references provided by our Expert Advisors.

MEDLINE ® search strategies. We searched MEDLINE ® (January 1980 to December 15, 2003) for English language articles using the search terms described in Table 2 . Some citations were reviewed and articles were retrieved in more than one of the searches listed below.

Table 2. MEDLINE® searches to identify potentially relevant primary data.

Cochrane search strategies. We searched the Cochrane databases from January 1, 1990 through December 15, 2003 (OVID, Evidence Based Medicine Reviews Multifile) using the search terms described in Table 3 .

Table 3. Search terms and citations for Cochrane databases.

Abstract Review

To identify potentially relevant articles for focused searching, at least two investigators (to ensure consistent application of the inclusion and exclusion criteria) reviewed each citation and, whenever an abstract was available, the abstract. Discrepancies in inclusion were resolved by discussion and re-review.

Evaluating Published Articles for Completeness of Reporting

We assessed each of the published articles for their completeness in reporting the factors we identified in our conceptual model that could influence a provider's response to incentives. Specifically, we scored them for the inclusion (or not) of descriptions of the elements in Table 4 . We also recorded the type of care (preventive care, acute care, or chronic care) to which the quality measured pertained.

Table 4. Evaluating randomized controlled trials for completeness of reporting.

  • Identifying Ongoing Research

Based on input from our expert advisors, our conceptual model, and practical considerations, we developed methods to catalog ongoing research into QBP that involved specifying: inclusion and exclusion criteria to identify potentially relevant research projects, search strategies to retrieve project abstracts, abstract review protocols, and a system of describing the study design of ongoing research projects.

Since the search for ongoing research focused on projects not yet reported in the literature, the criteria for identifying relevant projects focused on the planned intervention. Two types of research potentially met our inclusion criteria: projects designed as randomized controlled trials, or projects with interventions using QBP methods as described above (i.e., payment or performance reporting strategies) and applied at the community level (or in a broader geographic region, such as a State) that included historical or contemporaneous non-randomized control groups.

We searched online health services research databases (HSRProj and AHRQ's Grants-On-Line Database or GOLD). We also searched the Web sites of other funders or coordinators of projects (e.g., the Leapfrog Group at www.leapfroggroup.org/RewardingResults/ ). Finally, we inquired of staff at AHRQ, the Robert Wood Johnson Foundation, the California HealthCare Foundation, and the Commonwealth Fund whether there was ongoing research that met our inclusion criteria being funded by those organizations. Table 5 lists our information sources for this aspect of the report.

Table 5. Information sources for the catalog of ongoing research.

We searched the two available databases for ongoing health services research, using a similar search strategy for each ( Tables 6 and 7 ). We accessed HSRProj through the National Library of Medicine's Gateway database at gateway.nlm.nih.gov/gw/Cmd and GOLD at www.gold.ahrq.gov .

Table 6. Search terms and citations for GOLD.

Table 7. Search terms and citations for HSRProj database.

GOLD search strategies . We searched GOLD through February 15, 2004 for grants funded by AHRQ using the categories described in Table 6 . Through our combination of searches, we eventually evaluated all projects in GOLD.

HSRProj search strategies . We searched the HSRProj database through February 15, 2004 using the categories described in Table 7 .

Grant Abstract Review

Two investigators reviewed the abstracts of projects identified from the database searches to assess relevance to the technical review. Discrepancies in inclusion were resolved by discussion and re-review and by discussion with project officers at funding agencies or with the principal investigator of the project under consideration.

Describing the Study Design of Ongoing Research

For each research project, we interviewed either project staff (usually the principal investigator) or the project officer to determine the study design. We obtained information about the intervention—performance measures and incentives used—and the control group. The information sought is described in Table 8 .

Table 8. Design information sought about ongoing research.

  • Cite this Page Dudley RA, Frolich A, Robinowitz DL, et al. Strategies To Support Quality-based Purchasing: A Review of the Evidence. Rockville (MD): Agency for Healthcare Research and Quality (US); 2004 Jul. (Technical Reviews, No. 10.) 2, Methods for Literature Search.

  • Open access
  • Published: 06 February 2024

What are the learning objectives in surgical training – a systematic literature review of the surgical competence framework

  • Niklas Pakkasjärvi 1 , 2 ,
  • Henrika Anttila 3 &
  • Kirsi Pyhältö 3 , 4  

BMC Medical Education, volume 24, Article number: 119 (2024)


To map the landscape of contemporary surgical education through a competence framework by conducting a systematic literature review on learning outcomes of surgical education and the instructional methods applied to attain the outcomes.

Surgical education has seen a paradigm shift towards competence-based training. However, a gap remains in the literature regarding the specific components of competency taught and the instructional methods employed to achieve these outcomes. This paper aims to bridge this gap by conducting a systematic review on the learning outcomes of surgical education within a competence framework and the instructional methods applied. The primary outcome measure was to elucidate the components of competency emphasized by modern surgical curricula. The secondary outcome measure was to discern the instructional methods proven effective in achieving these competencies.

A search was conducted across PubMed, Medline, ProQuest Eric, and Cochrane databases, adhering to PRISMA guidelines, limited to 2017–2021. Keywords included terms related to surgical education and training. Inclusion criteria mandated original empirical studies that described learning outcomes and methods, and targeted both medical students and surgical residents.

Out of 42 studies involving 2097 participants, most concentrated on technical skills within competency-based training, with a lesser emphasis on non-technical competencies. The effect on clinical outcomes was infrequently explored.

The shift towards competency in surgical training is evident. However, further studies on its ramifications on clinical outcomes are needed. The transition from technical to clinical competence and the creation of validated assessments are crucial for establishing a foundation for lifelong surgical learning.


Introduction

Surgery requires a highly specialized set of surgical knowledge, skills, and attitudes that will allow a surgeon to perform the requisite procedures in collaboration with the patient and the multi-professional team. These competencies are fundamental to a surgeon’s ability to function effectively, necessitating flexibility, adaptability, and continuous professional development. In the field of learning sciences, the term competence is used to refer to the combination of knowledge, skills, and attitudes that allows an individual to solve the job-related task or a problem at hand and act professionally [ 1 , 2 , 3 , 4 ]. Accordingly, it can be claimed that cultivating a set of surgical competencies organically integrating knowledge, skills, and attitudes needed in surgeons’ work is imperative for high-quality surgical education. This calls for the understanding of both the range of competencies acquired in surgery training and the kinds of instructional methods that are effective in adopting them. Interestingly, many studies in surgical education, including systematic literature reviews, appear to often focus on a single learning outcome. This typically involves exploring either a specific technical skill or content knowledge in a surgical area, along with assessing the effectiveness of a particular instructional method [ 5 , 6 , 7 , 8 , 9 ].

The traditional Halstedian methods, with their focus on incremental responsibility and volume-based exposure, have been foundational in surgical training. Over the past few decades, the approach has been complemented with more tailored instructional methods [ 10 , 11 ]. For example, technical skills are often contemplated with models and simulators [ 12 , 13 ], thus increasing patient safety during surgery, and allowing the training surgeon to focus on the operation without feeling pressured to execute technical tasks [ 11 ]. Simulation training has demonstrated positive effects, especially in technical skills [ 14 , 15 , 16 ], but also in the longitudinal transfer of skills [ 17 ]. Much of the research on simulation has focused on training assessment with validated programs becoming more widely available [ 18 , 19 , 20 , 21 , 22 ]. Procedure-specific assessment has become common in evaluating surgical learning outcomes and has resulted in a set of validated task-specific assessment tools, such as OSATS (Objective Structured Assessment of Technical Skills) [ 23 ]. However, reducing surgery to separated technical tasks infers risks related to developing surgical competence, mainly a lack of integration in the learning of surgical skills, knowledge, and attitudes, further compromising continuous professional development, and thus potentially occupational wellbeing. There is also contradictory evidence on the effectiveness of the surgical training method in achieving the desired learning outcomes, but this may be more related to the unrealized potential of evidence-based training methods [ 24 ]. Further, the implementation of modern surgical training is lagging [ 25 ]. To sum up, while research on surgical education has significantly advanced our understanding of more tailored methods for cultivating surgical learning, it has also typically adapted a single ingredient approach [ 10 , 11 ]. A problem with this approach is that it neglects the complexity of surgical competence development and, without coherence building, bears the inherent risk of reducing surgery into mastering a series of technical tasks rather than providing tools for cultivating surgical competencies. Moreover, only a few prior systematic reviews on surgical education have studied surgical learning across the fields of surgery or among both medical students and surgical residents. Our study aims to comprehensively analyze the competencies targeted in contemporary surgical education, as revealed through a systematic literature review. We seek to elucidate the nature of these competencies—including skills, knowledge, and attitudes—and the instructional methods employed to develop them in medical students and surgical residents. This approach will highlight how competencies are defined, integrated, and cultivated in surgical education according to existing literature. Specifically, our primary outcome is to identify and detail the competencies (skills, knowledge, and attitudes) emphasized in the existing research on surgical education. We aim to understand how these competencies are conceptualized, taught, and developed, providing insights into the current focus of surgical training programs. As a secondary objective, we will examine the instructional methods discussed in the literature for teaching these competencies. This involves analyzing the effectiveness and application of different teaching strategies in nurturing a comprehensive set of surgical competencies, focusing on integrating technical and non-technical skills. 
To our knowledge, this is the first published effort within surgery to review the literature comprehensively on surgical competencies development and instructional methods across the fields of surgery, with studies conducted with both medical students and surgical residents.

We conducted a systematic literature review by using the guidelines of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement [ 26 ].

Research strategy and data sources

We searched four electronic databases: PubMed, Medline, ProQuest Eric, and Cochrane databases on 18 February 2021. Only articles in English were considered, and the search was limited to the years 2017–2021. This restriction was based on a pilot search, which identified a high volume of review articles before 2017 and a significant increase in the quantity and relevance of primary research studies on the surgical competence framework beginning in 2017. The search string consisted of the following keywords: “Surgical Education”, “Surgical Training”, “Surgical Intern*”, “surgical resident” OR “surgical apprentice” AND “learning”. The detailed syntax of the search was: (“surgical intern” AND learning) OR (“surgical training” AND learning) OR (“surgical intern*” AND learning) OR (“surgical resident” AND learning) OR (“surgical apprentice” AND learning). The database search resulted in 1305 articles (1297 from PubMed/Medline, 6 from Cochrane databases, and 2 from ProQuest Eric).
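As an illustration only (the review reports no code of its own), the PubMed portion of this search could be reproduced with Biopython's Entrez module, using the syntax quoted above plus a 2017–2021 publication-date limit; the email address is a placeholder, and the English-language restriction would still need to be applied separately.

    # Illustrative reproduction of the PubMed search; not the authors' actual workflow.
    from Bio import Entrez

    Entrez.email = "your.name@example.org"  # placeholder; NCBI asks for a contact address

    search_string = (
        '("surgical intern" AND learning) OR ("surgical training" AND learning) OR '
        '("surgical intern*" AND learning) OR ("surgical resident" AND learning) OR '
        '("surgical apprentice" AND learning)'
    )

    # Limit to publication dates 2017-2021; language filtering is omitted here.
    handle = Entrez.esearch(db="pubmed", term=search_string,
                            mindate="2017", maxdate="2021", datetype="pdat", retmax=0)
    result = Entrez.read(handle)
    handle.close()
    print(result["Count"])  # number of matching PubMed records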

Inclusion criteria and study selection

We applied five inclusion criteria for the data. To be included in the review, the articles had to fulfil the following criteria:

be original empirical studies.

be published in a peer-reviewed journal between 2017 and 2021.

be written in English, although the study could have been conducted in any country.

include surgical residents and/or medical students as participants.

include descriptions of learning outcomes and methods of learning in the results of the study.

Data were extracted manually in several increments. Two of the authors (NP) and (HA) independently reviewed the titles and abstracts of all articles identified by the search and marked potentially relevant articles for full-text retrieval (see Fig.  1 for the PRISMA diagram for the review flow). After reading the titles and abstracts, and removing the duplicates, 1236 articles were excluded as they did not meet the inclusion criteria. This also included 13 literature reviews that were excluded from the study as they were not empirical. However, the references of the reviews were reviewed by using a snowball method to detect additional references. This resulted in 16 studies being added to the full-text analysis. After this, the two authors independently examined the full texts of the remaining 85 articles with the inclusion criteria and selected the studies eligible for inclusion in the review. At this point, 43 articles were excluded as they did not explain learning outcomes or learning activities. Disagreements between the two authors were minimal and were resolved through a joint review of the full-text articles and discussion with the third co-author (KP). All articles that matched the inclusion criteria were included in the review, resulting in 42 articles being included in the review.

Figure 1. The PRISMA diagram depicts the flow of the systematic review, from the initial identification of 1305 database hits to the ultimate inclusion of 42 articles.

Data extraction

Two of the authors (NP) and (HA) extracted and documented information about 11 factors of each study into an Excel file to create a data sheet for the analysis. The following characteristics of the studies were recorded: country, participants, field of surgery, study design, use of a control group, tool, outcome measure, core finding, results on surgical learning outcomes, instructional design applied, and clinical setting. Learning outcomes were categorized according to the three components of surgical competence: (a) knowledge, (b) skills (including both technical and non-technical skills), and (c) attitudes [ 22 ]. Surgical knowledge included results concerning training surgeons’ theoretical and practical knowledge about surgery, a procedure, or medicine more generally. Surgical skills entailed results on their technical and non-technical skills, strategies, reflection, and self-regulation. Surgical attitudes involved results on training surgeons’ attitudes to their work and views about themselves as surgeons. The instructional design reported in the studies was coded into seven categories according to the mode of instruction applied in the study for training surgeons: (a) learning by doing, (b) learning through reflection, including instruction where the training surgeons reflected on their own learning, (c) learning by modelling, (d) learning by direct instruction, (e) learning by self-directed study, (f) learning by mentoring or teaching, and (g) learning by gaming.

The “ Learning by doing ” category included instructional situations in which medical students and surgeons learned while working as surgeons, for example, by completing surgical tasks and procedures. “ Learning through reflection ” included situations in which they learned by reflecting on their prior experiences, thoughts, own development, and performance in specific tasks.

In the “ Learning by modeling ” category, learning occurred by observing or copying the behaviors of their peers or more experienced surgeons. “ Learning by direct instruction ” included situations in which they learned while attending formal education, lectures, or seminars and by receiving tips or practical guidance from others.

The “ Learning by self-directed study ” category encompassed situations where training surgeons learned through self-directed study, such as reading, seeking information, and independently watching procedure videos, without any external intervention.

In the “ Learning by mentoring or teaching ” category, training surgeons learned while they taught or mentored their peers. “ Learning by gaming ” included situations where training surgeons played games to improve their competence.

Regarding categorization, each of the studies included in the review could belong to one or more of these categories. However, to be included in a category, the article needed to clearly explain that the instructional method in question was used in the study. For example, even though performing surgical procedures might also involve self-reflection, the article was categorized under “ learning by doing ” and not additionally under “ learning by self-reflection ” unless the reflection was explicitly mentioned in the article.
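Purely as a hypothetical illustration of what one row of the extraction sheet described above might look like (the authors' actual Excel file is not public, and the values below are invented), the 11 recorded factors could be laid out with pandas:

    # Hypothetical extraction sheet mirroring the 11 recorded factors; values are invented.
    import pandas as pd

    columns = ["country", "participants", "field_of_surgery", "study_design",
               "control_group", "tool", "outcome_measure", "core_finding",
               "learning_outcomes", "instructional_design", "clinical_setting"]

    rows = [
        {"country": "USA", "participants": 28, "field_of_surgery": "general surgery",
         "study_design": "RCT", "control_group": True, "tool": "OSATS",
         "outcome_measure": "performance assessment", "core_finding": "scores improved",
         "learning_outcomes": ["skills"],                # knowledge / skills / attitudes
         "instructional_design": ["learning by doing"],  # one or more of the seven categories
         "clinical_setting": False},
    ]

    sheet = pd.DataFrame(rows, columns=columns)
    print(sheet[["country", "learning_outcomes", "instructional_design"]])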

We included 42 empirical studies involving 2097 medical students and surgeons in training in this systematic review. The studies on surgical learning were geographically distributed across ten countries. Most of the studies were conducted in the USA ( n  = 22), and Canada ( n  = 12), however studies from the UK, the Netherlands, Austria, Chile, Germany, Finland, and Switzerland were also present. Surgical learning was typically explored with small-scale studies with a median of 28 participants, interquartile range 46 (see Table  1 ). Most of the studies focused on surgical residents’ learning ( n  = 29), whereas medical students’ surgical learning was explored in 11 studies. One study had both residents and medical students as participants. Twenty-seven studies investigated surgical learning in general surgery, with the remaining 16 in various other surgical specialties (including gynecology, cardiology, urology, pediatrics, neurosurgery, microsurgery, orthopedics, vascular surgery, gastro surgery and otolaryngology). The study design of the empirical studies varied from simulation (including bench models, animals, human cadavers, and virtual reality (VR)), operating room (OR) procedures, interviews, surveys, writing tasks, to knowledge tests and the resident report card. Most of the studies employed multimodal designs. Eighteen of the studies were controlled; 13 studies were randomized controlled trials (RCT), and five were controlled trials (CT). The core finding was discussed in all studies and where applicable, statistical tests were applied to highlight the significance. Almost half of the studies ( n  = 18) were conducted in clinical settings.

Primary outcome measures: learning objectives of surgeons in training and competency components

Most of the included studies on surgical learning focused on surgical skills and their attainment ( n  = 36) (See Table  1 ). Training surgeons commonly learned technical skills such as knot tying, distinct surgical procedures, and robotic skills ( n  = 25). In contrast, learning of non-technical skills ( n  = 11), such as communication, patient management, reflection, self-regulation, and decision-making skills, were less often reported. Twenty-two studies focused on the acquisition of surgical knowledge, such as general medical or surgical knowledge or more specific knowledge of certain procedures. Some of the studies ( n  = 10) reported attitudinal learning outcomes including confidence, resilience, and self-efficacy. Most of the studies ( n  = 26) had a single focus on surgical competence, i.e., they focused on learning of skills, knowledge, or attitudes. However, in 19 studies, the training surgeons’ learning was a combination of several skills, knowledge, and attitudes, most typically technical skills, and surgical knowledge. Empirical studies relied on performance assessment ( n  = 15), including studies in which the performance assessment was utilized by other reports, such as senior surgeons assessing the performance of the training surgeons, and self-reporting of the learning outcomes ( n  = 11). Sixteen studies combined both performance assessment and self-report of learning.

Learning was measured with validated objective tools in half of the studies. Most studies utilized either the OSATS global evaluation tool or a derivative optimized for the given conditions. These derivatives included ABSITE (The American Board of Surgery In-Service Training Exam) [ 69 ]; OSA-LS (OSATS salpingectomy-specific form) [ 70 ]; ASSET (Arthroscopic Surgical Skill Evaluation Tool) [ 71 ]; SP-CAT (Simulation Participants-Communication Assessment Tool) [ 72 ]; UWOMSA (University of Western Ontario Microsurgical Acquisition/assessment instrument) [ 73 ], and NRS (Numeric Rating Scale). Cognitive task analysis (CTA) was utilized in only two studies. In both studies, CTA improved scores in outcome testing [ 62 , 64 ]. CTA-based training was considered suitable for expediting learning but based on our study cohort, it is scarcely applied.

Secondary outcome measures: what kind of instructional designs do surgeons in training learn through?

The included studies in the present review employed various instructional methods ranging from learning by doing to mentoring and teaching fellow residents. Learning by doing, including technical training (of specific procedures, knot tying, etc.) both in OR settings and in simulation (e.g., VR, robotic, bench model, human cadaver, and animal), was most typically applied as the primary instructional method ( n  = 26), especially in teaching technical skills and non-technical surgical skills both for surgical residents and medical students. Results were partly mixed in terms of the effectiveness of the method for novice and more advanced surgical students. For example, while Feins et al. showed that residents’ performance in component tasks and complete cardiac surgical procedures improved by simulation, Korte et al. reported that novice surgeons in particular benefitted from simulation more than those with more experience [ 29 , 37 ]. Most skill curricula improved assessment scores, but surgical outcomes may remain unaffected by similar interventions as shown by Jokinen et al. [ 43 ]. Also, learning through reflection, in which training surgeons reflect on their own learning experiences and development, such as by participating in debriefing after operations or via video-based guided reflection ( n  = 13), was a commonly emphasized instructional method. Engaging in reflection was shown to be effective in promoting the learning of non-technical skills and attitudes. Trickey et al. showed that reflecting on positive learning experiences increased residents’ confidence and improved their communication skills, while Soucisse et al. and Naik et al. reported that self-reflecting on surgical tasks performed improved technical skills as well [ 55 , 57 , 65 ]. Ranney et al. furthermore showed that residents who can reflect on their learning and thought processes are more in control and proceed to autonomy more quickly [ 56 ].

Commonly used instructional methods for enhancing surgical learning include modeling ( n  = 5), particularly observing more experienced surgeons performing surgical procedures, self-directed study ( n  = 6), such as preparing for surgery, reading, and self-studying, and direct instruction ( n  = 7). The latter included participating in contact teaching and lectures, watching videos, and getting practical advice from senior surgeons, and these were frequently used in teaching future surgeons. Raiche et al. showed that observing and modelling have their limitations, as residents have challenges in identifying where to focus their attention and in understanding what it is teaching them [ 52 ]. To be effective, such a form of instruction seems to call for explanation and support from senior surgeons. Naik et al. showed that receiving feedback during technical skill learning had a significant impact on residents’ performance in technical skills [ 57 ]. The results also emphasized the importance of pre-preparation for the OR for learning gains. For example, Logishetty et al. showed that residents preparing for arthroplasty with a CTA tool improved operative times, reduced mistakes, and learned both decision-making and technical skills [ 64 ].

On the other hand, learning through gaming (including playing escape rooms, jeopardy, and other quiz games) ( n  = 4) and mentoring or teaching fellow training surgeons ( n  = 1) were seldom applied in the teaching of future surgeons. The empirical evidence still implies that such instructional methods can enhance surgical learning. Hancock et al., Chon et al., Kinio et al., and Amer et al. all showed that gaming improved surgical knowledge [ 40 , 42 , 54 , 61 ]. Zundel et al. found that peers are an extremely important source of instruction for training surgeons and that they both acquire knowledge and learn technical skills every day from each other [ 44 ]. Unfortunately, they receive little educational training in peer mentoring and thus the resource of peers as learning support is not exploited to its full potential [ 44 ].

To sum up, the results indicate that multimodal instructional designs are more commonly applied in studies exploring surgical learning and means to enhance it. In just over half of the studies ( n  = 23) participants were engaged in a combination of two to three different instructional activities.

Our results show that studies on surgical residents and medical students’ surgical learning focus heavily on learning surgical skills, particularly technical skills, and acquiring knowledge on how to perform specific procedures or surgical tasks. This indicates that, at least implicitly, quite a few studies on surgical learning are drawing on a competence framework by combining the learning of surgical skills and knowledge acquisition. However, the scope of such studies typically remains very specific.

The learning of surgical soft skills such as communication and teamwork, learning skills, and adaptability was rarely investigated. Interestingly, none of the studies addressed learning skills such as self- or co-regulated learning as part of surgical learning. However, they are fundamental for flexible and adaptive professional behaviors and engagement in continuous professional development [ 74 , 75 ]. In addition, the studies included in the review rarely addressed learning of attitudes such as self- or co-efficacy or resilience as part of surgical learning, though self-efficacy has been shown to be one of the main predictors of learning outcomes and good performance [ 76 , 77 ]. This may imply that such skills and attitudes are not considered to be at the core of surgical learning or that they are expected to result as a by-product of other surgical learning activities. This can be considered to be a gap in the literature on surgical learning. The lack of knowledge on developing soft skills and attitudes among future surgeons also has practical implications since they play a central role in patient safety and a surgeon’s recovery from adverse events [ 78 , 79 ]. The importance of these non-technical skills is further supported by research from Galayia et al. and Gleason et al. [ 80 , 81 ]. Their studies highlight how factors like workload, emotional intelligence, and resilience are crucial in managing burnout, with a clear correlation shown between these skills, job resources, and burnout rates among surgical trainees.

Surgeons’ lack of familiarity with non-technical skills and insufficient training for handling adverse events [ 82 , 83 ] exacerbate this issue. In our review, systematic approaches to address adverse events were notably absent. The fact that soft skills and attitudes are often overlooked in surgical competencies poses a challenge for both research on surgical learning and the development of informed surgical education.

Recently, high incidences of burnout among surgery residents have been reported [ 84 ]. This concerning trend underscores the need for a holistic approach to surgical education. Addressing stressors in surgical education is not solely an individual concern but a systemic issue, necessitating substantial transformations in healthcare delivery and success measurement [ 85 ]. Fortunately, there has been a noticeable increase in publications emphasizing the acquisition of non-technical skills, reflecting a growing awareness of their importance in surgical training [ 86 ]. However, it is essential to note that most literature on simulation-based surgical training still predominantly focuses on technical skills [ 86 ]. This ongoing emphasis suggests that while strides are being made towards a more comprehensive educational approach, there remains a significant skew towards technical proficiency in current training paradigms.

The studies we reviewed applied various validated assessment tools. In this systematic review, learning was most focused on technical skills and evaluated by OSATS or a derivative. OSATS is a validated evaluation tool used for technical skill assessment [ 87 ]. While it is the gold standard in evaluation, it has limitations. The use of OSATS is limited in clinical operating room settings. Hence many studies have attempted to optimize and modify it according to their specific needs [ 32 , 88 , 89 ]. An assessment tool must meet the following requirements: (1) the inter-rater reliability must exceed 0.90, and (2) this reliability should be based on the amount of agreement between the observers [ 90 ]. Based on Groenier et al.’s systematic review and meta-analysis, considerable caution is required with the use of assessment tools, especially when high-stake decision-making is required [ 91 ]. Advancing proficiency in technical skills with progression toward clinical application poses many issues. Surgeons gaining false self-confidence through inadequate testing may increase the risks of adverse events in clinical applications. Thus, competence testing protocols must be validated, and must be evidence based. In addition to technical proficiency, a surgical intervention requires vast competence and robust, validated assessment tools for surgical soft skills, including learning and interpersonal skills and attitudes.
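Purely as a hedged illustration of an agreement-based reliability check between two observers, the following sketch uses Cohen's kappa; this is only one of several possible statistics, nothing here is prescribed by the sources cited above, the ratings are invented, and the 0.90 cut-off simply echoes the figure mentioned in the text.

    # Illustrative agreement check between two raters; data are invented.
    from sklearn.metrics import cohen_kappa_score

    rater_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]  # e.g., pass/fail judgements on ten performances
    rater_b = [1, 1, 0, 1, 1, 1, 1, 0, 1, 1]

    kappa = cohen_kappa_score(rater_a, rater_b)
    print(f"kappa = {kappa:.2f}, meets 0.90 threshold: {kappa >= 0.90}")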

The results showed that learning by doing, typically simulation, and learning through guided reflection were the most used instructional methods to promote surgical residents’ and medical students’ surgical learning. Both methods effectively promote acquiring knowledge about performing surgical tasks and surgical skills. For instance, simulation training has been shown to enhance fluency in technical performance of specific surgical procedures and patient safety and in increasing a surgeon’s confidence [ 17 , 51 , 91 ]. While building confidence is essential for progression, self-reflection to maintain competence awareness is needed. Hence, self-assessment is fundamental to surgical learning and can be used in many forms [ 92 ]. Also, modeling, particularly observing more experienced surgeons performing surgical procedures, self-directed study, and direct instruction were commonly applied to enhance surgical learning. In turn, learning by gaming and mentoring or teaching fellow training surgeons was rarely applied in the studies as forms of instruction in cultivating surgical learning. The result indicates that gaming and peer learning are still both under-studied and under-utilized resources for systematically promoting the learning of future surgeons. The quality and quantity of social interactions with peers, senior surgeons, and patients are fundamental for surgical learning. Learning of all higher-order competencies proceeds from an inter-individual to an intra-individual sphere [ 93 , 94 , 95 ]. Moreover, since no surgeon works alone, the surgeon must be trained to work with and within the team. Accordingly, systematic use of peer learning would be essential not only for enhancing specific surgical knowledge and skills, but also for cultivating much-needed surgical soft skills. Nevertheless, emerging qualitative evidence suggests that peer learning is being increasingly implemented in medical education [ 96 ]. This trend underscores the growing recognition of the value of collaborative learning environments, where peers can share knowledge, challenge each other, and collectively develop the comprehensive skill set required in modern surgical practice.

Half of the studies we reviewed applied multimodal instruction to enhance surgical learning. This reflects a more modern understanding of learning in which varied instructional methods should be used depending on the object of learning, participants, and context. It also implies that traditional surgical teaching methods of incremental responsibility, with increasing volume-based exposure during residency, will gradually be complemented by more varied, research-informed instructional practices. However, it is essential to recall that learning always depends on our actions. This means that if we want to educate reflective practitioners who are good at solving complex problems [ 36 ], are able to work in teams, and are engaged in continuous professional development, the instructional designs must systematically engage the future surgeons in such activities [ 97 ].

However, based on our review, many questions remain unanswered. The most fundamental of these is related to the transfer of surgical learning from a learning setting to other settings and across the competence ingredients. Firstly, further studies are needed on the extent to which, and how, surgical competencies, particularly those beyond the technical skills attained in simulation (for instance), transfer into clinical work. This is also connected with the optimal length of the interval between preparation and execution, which was not analyzed thoroughly in most articles, nor was the time for initiation of skill waning explicitly stated. Feins et al. observed a transient decline from the end of one session to the beginning of the next, which was subsequently recovered and improved [ 37 ]. Green et al. showed that technical skills attained during preparatory courses are maintained into residency without additional interventions, with similar results from Maertens et al. and Lee-Riddle et al., who recorded proficiency levels to be maintained for at least three months [ 41 , 51 , 60 ]. Secondly, based on our review, studies addressing the learning and training of surgical competencies were highly task specific. Accordingly, further studies on the interrelation between competence ingredients, including surgical knowledge, technical and soft skills, and attitudes, are needed to promote the development of comprehensive surgical competencies among future surgeons. Thirdly, while simulation has proven essential for technical training, many operative interventions contain elements that cannot be simulated with current systems. The preparation for such interventions demands a multimodal approach, including preparatory discussions and visualization, until further methods become available.

Surgical residency is demanding in many aspects, not the least timewise. Among surgeons, mini-fellowships are uncommon as a learning method as opposed to traditional learning-by-doing approaches. While more effective methods are acknowledged, they are not applied due to time concerns [ 98 ]. As shown by Bohl et al., dedicated synthetic model training may alleviate time demands, allowing residents to recover better and thus improving preparedness for subsequent tasks [ 45 ]. Cognitive task analysis-based training is a valuable adjunct to the modern surgical curriculum, especially considering the global reduction in operating times and volumes during training [ 99 , 100 ]. CTA-based training improves procedural knowledge and technical performance [ 99 ]. However, it was applied in only a few of the studies analyzed here. Interestingly, CTA seems more effective in the later stages of surgical education, with less impact on medical students [ 101 ]. In addition, CTA-based training is suitable for electronic delivery, utilization through web-based tools, and gaming applications, all of which are accessible and provide opportunities for frequent revisits without personnel or resource investments [ 102 , 103 ]. Learning through gaming was also rarely applied in teaching situations in the studies analyzed here. While serious gaming in medical education is beneficial, validating each application for a specific purpose is mandatory [ 104 ].

Postgraduate medical education has recently moved towards competency-based education in many countries. Entrustable professional activities (EPAs) are utilized as milestones in many competency frameworks [ 105 ]. Although EPAs have been applied to and gained rapid acceptance in postgraduate medical education, their potential within undergraduate education remains unverified [ 106 ]. In addition, while EPAs are becoming more prominent in surgical education, their widespread adoption and dissemination remain challenging [ 107 ]. We advocate for using all tools that collectively embrace a holistic approach to all competency components within surgical learning.

Our study is not without limitations. While we attempted to acquire a comprehensive picture of the pedagogical surgical landscape, we may have failed to detect some reports. Although geographical coverage was acceptable, all the studies we identified were from Western countries. Thus, the actual coverage of multimodal surgical learning warrants further studies. One potential limitation of our study is the decision to restrict our literature search to studies published from 2017 onwards. While this approach allowed us to focus on the most recent and relevant developments in surgical training and competence, it may have excluded earlier studies that could provide additional historical context or foundational insights into the evolution of surgical education practices. Finally, although we limited our study population to students and residents, learning continues through a surgeon’s career and evolves depending on the learner’s situation. Competence-based learning applies equally to all stages of surgical learning and should be incorporated, irrespective of career stage.

Advancing proficiency through adequate competency assessment is crucial for effective surgical learning. As we observe, contemporary surgical education is high quality and continuously evolves. Most studies focused on objective assessments, yet the measurement and assurance of the transition from technical to clinical proficiency remain areas for further exploration. Defining competency and creating validated assessments are fundamental to lifelong surgical learning.

While acquiring operational skills, decision-making knowledge, and confidence in performing technical tasks are teachable, the ultimate success in learning also hinges on the learner’s attitude and willingness to learn. Therefore, it is vital to incorporate non-technical skills alongside technical aptitude testing and academic achievements in designing modern surgical curricula.

To optimize learning outcomes, learners must adopt an approach encompassing the full spectrum of surgical education. This means integrating technical and non-technical skills to create a learning environment that nurtures a broad range of competencies essential for comprehensive surgical expertise.

Availability of data and materials

The dataset supporting the conclusions of the current study is available from the corresponding author on reasonable request.

Lizzio A, Wilson K. Action learning in higher education: an investigation of its potential to develop professional capability. Stud High Educ. 2004;29(4):469–88.

Parry S. Just what is a competency? (and why should you care?). Training. 1996;35(6):58–64.

Eraut M. Developing professional knowledge and competence. London: Taylor & Francis Group; 1994.

Baartman L, Bastiaens T, Kirschner P, Van der Vleuten C. Evaluating assessment quality in competence-based education: a qualitative comparison of two frameworks. Educational Res Rev. 2007;2(2):114–29.

Aim F, Lonjon G, Hannouche D, Nizard R. Effectiveness of virtual reality training on orthopaedic surgery. Arthroscopy. 2016;32(1):224–32.

Alaker M, Wynn GR, Arulampalam T. Virtual reality training in laparoscopic surgery: a systematic review & meta-analysis. Int J Surg. 2016;29:85–94.

Zendejas B, Brydges R, Hamstra S, Cook D. State of the evidence on simulation-based training for laparoscopic surgery: a systematic review. Ann Surg. 2013;257(4):586–93.

Yokoyama S, Mizunuma K, Kurashima Y, Watanabe Y, Mizota T, Poudel S, et al. Evaluation methods and impact of simulation-based training in pediatric surgery: a systematic review. Pediatr Surg Int. 2019;35(10):1085–94.

Herrera-Aliaga E, Estrada LD. Trends and innovations of simulation for twenty first century medical education. Front Public Health. 2022;10: 619769.

Haluck RS, Krummel TM. Computers and virtual reality for surgical education in the 21st century. Arch Surg. 2000;135(7):786–92.

Reznick RK, MacRae H. Teaching surgical skills–changes in the wind. N Engl J Med. 2006;355(25):2664–9.

Scallon SE, Fairholm DJ, Cochrane DD, Taylor DC. Evaluation of the operating room as a surgical teaching venue. Can J Surg. 1992;35(2):173–6.

Reznick RK. Teaching and testing technical skills. Am J Surg. 1993;165(3):358–61.

Sutherland LM, Middleton PF, Anthony A, Hamdorf J, Cregan P, Scott D, et al. Surgical simulation: a systematic review. Ann Surg. 2006;243(3):291–300.

Tavakol M, Mohagheghi MA, Dennick R. Assessing the skills of surgical residents using simulation. J Surg Educ. 2008;65(2):77–83.

Young M, Lewis C, Kailavasan M, Satterthwaite L, Safir O, Tomlinson J, et al. A systematic review of methodological principles and delivery of surgical simulation bootcamps. Am J Surg. 2022;223(6):1079–87.

Dawe SR, Pena GN, Windsor JA, Broeders JA, Cregan PC, Maddern GJ. Systematic review of skills transfer after surgical simulation-based training. Br J Surg. 2014;101(9):1063–76.

Grober ED, Hamstra SJ, Wanzel KR, Reznick RK, Matsumoto ED, Sidhu RS, et al. The educational impact of bench model fidelity on the acquisition of technical skill: the use of clinically relevant outcome measures. Ann Surg. 2004;240(2):374–81.

Grantcharov TP, Kristiansen VB, Bendix J, Bardram L, Rosenberg J, Funch-Jensen P. Randomized clinical trial of virtual reality simulation for laparoscopic skills training. Br J Surg. 2004;91(2):146–50.

Seymour NE, Gallagher AG, Roman SA, O’Brien MK, Bansal VK, Andersen DK, et al. Virtual reality training improves operating room performance: results of a randomized, double-blinded study. Ann Surg. 2002;236(4):458–63; discussion 463–4.

Grantcharov TP, Reznick RK. Teaching procedural skills. BMJ. 2008;336(7653):1129–31.

Seil R, Hoeltgen C, Thomazeau H, Anetzberger H, Becker R. Surgical simulation training should become a mandatory part of orthopaedic education. J Exp Orthop. 2022;9(1):22.

Reznick R, Regehr G, MacRae H, Martin J, McCulloch W. Testing technical skill via an innovative “bench station” examination. Am J Surg. 1997;173(3):226–30.

Bjerrum F, Thomsen ASS, Nayahangan LJ, Konge L. Surgical simulation: current practices and future perspectives for technical skills training. Med Teach. 2018;40(7):668–75.

Kurashima Y, Hirano S. Systematic review of the implementation of simulation training in surgical residency curriculum. Surg Today. 2017;47(7):777–82.

Moher D, Liberati A, Tetzlaff J, Altman DG, PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. BMJ. 2009;339:b2535.

Babchenko O, Scott K, Jung S, Frank S, Elmaraghi S, Thiagarajasubramaniam S, et al. Resident perspectives on effective surgical training: incivility, confidence, and mindset. J Surg Educ. 2020;77(5):1088–96.

Geoffrion R, Koenig NA, Sanaee MS, Lee T, Todd NJ. Optimizing resident operative self-confidence through competency-based surgical education modules: are we there yet? Int Urogynecol J. 2019;30(3):423–8.

Korte W, Merz C, Kirchhoff F, Heimeshoff J, Goecke T, Beckmann E, et al. Train early and with deliberate practice: simple coronary surgery simulation platform results in fast increase in technical surgical skills in residents and students. Interact Cardiovasc Thorac Surg. 2020;30(6):871–8.

Pandian TK, Buckarma EH, Mohan M, Gas BL, Naik ND, Abbott EF, et al. At home preresidency preparation for general surgery internship: a pilot study. J Surg Educ. 2017;74(6):952–7.

Charak G, Prigoff JG, Heneghan S, Cooper S, Weil H, Nowygrod R. Surgical education and the longitudinal model at the Columbia-Bassett program. J Surg Educ. 2020;77(4):854–8.

Harriman D, Singla R, Nguan C. The resident report card: A tool for operative feedback and evaluation of technical skills. J Surg Res. 2019;239:261–8.

Kumins NH, Qin VL, Driscoll EC, Morrow KL, Kashyap VS, Ning AY, et al. Computer-based video training is effective in teaching basic surgical skills to novices without faculty involvement using a self-directed, sequential and incremental program. Am J Surg. 2021;221(4):780–7.

Peshkepija AN, Basson MD, Davis AT, Ali M, Haan PS, Gupta RN, et al. Perioperative self-reflection among surgical residents. Am J Surg. 2017;214(3):564–70.

Cadieux DC, Mishra A, Goldszmidt MA. Before the scalpel: exploring surgical residents’ preoperative preparatory strategies. Med Educ. 2021;55(6):733–40.

Dressler JA, Ryder BA, Connolly M, Blais MD, Miner TJ, Harrington DT. Tweet-format writing is an effective tool for medical student reflection. J Surg Educ. 2018;75(5):1206–10.

Feins RH, Burkhart HM, Conte JV, Coore DN, Fann JI, Hicks GL Jr, et al. Simulation-based training in cardiac surgery. Ann Thorac Surg. 2017;103(1):312–21.

Patel P, Martimianakis MA, Zilbert NR, Mui C, Hammond Mobilio M, Kitto S, et al. Fake it ‘Til you make it: pressures to measure up in surgical training. Acad Med. 2018;93(5):769–74.

Acosta D, Castillo-Angeles M, Garces-Descovich A, Watkins AA, Gupta A, Critchlow JF, et al. Surgical practical skills learning curriculum: implementation and interns’ confidence perceptions. J Surg Educ. 2018;75(2):263–70.

Chon SH, Timmermann F, Dratsch T, Schuelper N, Plum P, Berlth F, et al. Serious games in surgical medical education: a virtual emergency department as a tool for teaching clinical reasoning to medical students. JMIR Serious Games. 2019;7(1):e13028.

Green CA, Huang E, Zhao NW, O’Sullivan PS, Kim E, Chern H. Technical skill improvement with surgical preparatory courses: what advantages are reflected in residency? Am J Surg. 2018;216(1):155–9.

Hancock KJ, Klimberg VS, Williams TP, Tyler DS, Radhakrishnan R, Tran S. Surgical Jeopardy: play to learn. J Surg Res. 2021;257:9–14.

Jokinen E, Mikkola TS, Harkki P. Effect of structural training on surgical outcomes of residents’ first operative laparoscopy: a randomized controlled trial. Surg Endosc. 2019;33(11):3688–95.

Zundel S, Stocker M, Szavay P. Resident as teacher in pediatric surgery: Innovation is overdue in Central Europe. J Pediatr Surg. 2017;52(11):1859–65.

Bohl MA, McBryan S, Spear C, Pais D, Preul MC, Wilhelmi B, et al. Evaluation of a novel surgical skills training course: are cadavers still the gold standard for surgical skills training? World Neurosurg. 2019;127:63–71.

Lees MC, Zheng B, Daniels LM, White JS. Factors affecting the development of confidence among surgical trainees. J Surg Educ. 2019;76(3):674–83.

Harris DJ, Vine SJ, Wilson MR, McGrath JS, LeBel ME, Buckingham G. A randomised trial of observational learning from 2D and 3D models in robotically assisted surgery. Surg Endosc. 2018;32(11):4527–32.

Gabrysz-Forget F, Young M, Zahabi S, Nepomnayshy D, Nguyen LHP. Surgical errors happen, but are learners trained to recover from them? A survey of North American surgical residents and fellows. J Surg Educ. 2020;77(6):1552–61.

Klitsie PJ, Ten Brinke B, Timman R, Busschbach JJV, Theeuwes HP, Lange JF, et al. Training for endoscopic surgical procedures should be performed in the dissection room: a randomized study. Surg Endosc. 2017;31(4):1754–9.

Siroen KL, Ward CDW, Escoto A, Naish MD, Bureau Y, Patel RV, et al. Mastery learning - does the method of learning make a difference in skills acquisition for robotic surgery? Int J Med Robot. 2017;13(4):e1828.

Maertens H, Aggarwal R, Moreels N, Vermassen F, Van Herzeele I. A proficiency based stepwise endovascular curricular training (PROSPECT) program enhances operative performance in real life: a randomised controlled trial. Eur J Vasc Endovasc Surg. 2017;54(3):387–96.

Raiche I, Hamstra S, Gofton W, Balaa F, Dionne E. Cognitive challenges of junior residents attempting to learn surgical skills by observing procedures. Am J Surg. 2019;218(2):430–5.

LeCompte M, Stewart M, Harris T, Rives G, Guth C, Ehrenfeld J, et al. See one, do one, teach one: a randomized controlled study evaluating the benefit of autonomy in surgical education. Am J Surg. 2019;217(2):281–7.

Kinio AE, Dufresne L, Brandys T, Jetty P. Break out of the classroom: the use of escape rooms as an alternative teaching strategy in surgical education. J Surg Educ. 2019;76(1):134–9.

Soucisse ML, Boulva K, Sideris L, Drolet P, Morin M, Dube P. Video coaching as an efficient teaching method for surgical residents-a randomized controlled trial. J Surg Educ. 2017;74(2):365–71.

Ranney SE, Bedrin NG, Roberts NK, Hebert JC, Forgione PM, Nicholas CF. Maximizing learning in the operating room: residents’ perspectives. J Surg Res. 2021;263:5–13.

Naik ND, Abbott EF, Gas BL, Murphy BL, Farley DR, Cook DA. Personalized video feedback improves suturing skills of incoming general surgery trainees. Surgery. 2018;163(4):921–6.

Lesch H, Johnson E, Peters J, Cendan JC. VR simulation leads to enhanced procedural confidence for surgical trainees. J Surg Educ. 2020;77(1):213–8.

Fletcher BP, Gusic ME, Robinson WP. Simulation training incorporating a pulsatile carotid endarterectomy model results in increased procedure-specific knowledge, confidence, and comfort in post-graduate trainees. J Surg Educ. 2020;77(5):1289–99.

Lee-Riddle GS, Sigmon DF, Newton AD, Kelz RR, Dumon KR, Morris JB. Surgical Boot camps increases confidence for residents transitioning to senior responsibilities. J Surg Educ. 2021;78(3):987–90.

Amer K, Mur T, Amer K, Ilyas A. A mobile-based surgical simulation application: a comparative analysis of efficacy using a carpal tunnel release module. J Hand Surg. 2017;42(5):389.e1–e9.

Bhattacharyya R, Davidson DJ, Sugand K, Bartlett MJ, Bhattacharya R, Gupte CM. Knee arthroscopy simulation: a randomized controlled trial evaluating the effectiveness of the Imperial Knee Arthroscopy Cognitive Task Analysis (IKACTA) tool. J Bone Joint Surg Am. 2017;99(19):e103.

Levin A, Haq I. Pre-course cognitive training using a smartphone application in orthopaedic intern surgical skills “boot camps.” J Orthop. 2018;15:506–8.

Logishetty K, Gofton WT, Rudran B, Beaule PE, Gupte CM, Cobb JP. A multicenter randomized controlled trial evaluating the effectiveness of cognitive training for anterior approach total hip arthroplasty. J Bone Joint Surg Am. 2020;102(2):e7.

Trickey AW, Newcomb AB, Porrey M, Piscitani F, Wright J, Graling P, et al. Two-year experience implementing a curriculum to improve residents’ patient-centered communication skills. J Surg Educ. 2017;74(6):e124–32.

Grant AL, Temple-Oberle C. Utility of a validated rating scale for self-assessment in microsurgical training. J Surg Educ. 2017;74(2):360–4.

Quick JA, Kudav V, Doty J, Crane M, Bukoski AD, Bennett BJ, et al. Surgical resident technical skill self-evaluation: increased precision with training progression. J Surg Res. 2017;218:144–9.

Jethwa AR, Perdoni CJ, Kelly EA, Yueh B, Levine SC, Adams ME. Randomized controlled pilot study of video self-assessment for resident mastoidectomy training. OTO Open. 2018;2(2):2473974X18770417.

Miller AT, Swain GW, Widmar M, Divino CM. How important are American board of surgery in-training examination scores when applying for fellowships? J Surg Educ. 2010;67(3):149–51.

Larsen CR, Grantcharov T, Schouenborg L, Ottosen C, Soerensen JL, Ottesen B. Objective assessment of surgical competence in gynaecological laparoscopy: development and validation of a procedure-specific rating scale. BJOG. 2008;115(7):908–16.

Koehler RJ, Amsdell S, Arendt EA, Bisson LJ, Braman JP, Butler A, et al. The Arthroscopic Surgical Skill Evaluation Tool (ASSET). Am J Sports Med. 2013;41(6):1229–37.

Makoul G, Krupat E, Chang CH. Measuring patient views of physician communication skills: development and testing of the communication assessment tool. Patient Educ Couns. 2007;67(3):333–42.

Dumestre D, Yeung JK, Temple-Oberle C. Evidence-based microsurgical skills acquisition series part 2: validated assessment instruments–a systematic review. J Surg Educ. 2015;72(1):80–9.

Schunk D, Greene J. Handbook of self-regulation of learning and performance. London: Routledge / Taylor & Francis Group; 2018.

Hadwin A, Järvelä D, Miller M. Self-regulated, coregulated and socially shared regulation of learning. In: Zimmerman B, Schunk D, editors. Handbook of self-regulation of learning and performance. New York, NY: Routledge; 2011. p. 65–84.

Zimmerman BJ. Self-efficacy: an essential motive to learn. Contemp Educ Psychol. 2000;25(1):82–91.

Jackson JW. Enhancing self-efficacy and learning performance. J Experimental Educ. 2002;70(3):243–54.

Dedy NJ, Bonrath EM, Zevin B, Grantcharov TP. Teaching nontechnical skills in surgical residency: a systematic review of current approaches and outcomes. Surgery. 2013;154(5):1000–8.

Srinivasa S, Gurney J, Koea J. Potential consequences of patient complications for surgeon well-being: a systematic review. JAMA Surg. 2019;154(5):451–7.

Galayia R, Kinross J, Arulampalam T. Factors associated with burnout syndrome in surgeons: a systematic review. Ann R Coll Surg Engl. 2020;102:401–7.

Gleason F, Baker SJ, Wood T, Wood L, Hollis RH, Chu DI, Lindeman B. Emotional intelligence and burnout in surgical residents: a 5-year study. J Surg Educ. 2020;77(6):e63–70.

Ounounou E, Aydin A, Brunckhorst O, Khan MS, Dasgupta P, Ahmed K. Nontechnical skills in surgery: a systematic review of current training modalities. J Surg Educ. 2019;76(1):14–24.

Turner K, Bolderston H, Thomas K, Greville-Harris M, Withers C, McDougall S. Impact of adverse events on surgeons. Br J Surg. 2022;109(4):308–10.

Hu Y-Y, Ellis RJ, Hewitt DB, Yang AD, Cheung EO, Moskowitz JT, et al. Discrimination, abuse, harassment, and burnout in surgical residency training. N Engl J Med. 2019;381(18):1741–52.

Hartzband P, Groopman J. Physician burnout, interrupted. N Engl J Med. 2020;382(26):2485–7.

Rosendal AA, Sloth SB, Rölfing JD, Bie M, Jensen RD. Technical, non-technical, or both? A scoping review of skills in simulation-based surgical training. J Surg Educ. 2023;80(5):731–49.

Martin JA, Regehr G, Reznick R, MacRae H, Murnaghan J, Hutchison C, et al. Objective structured assessment of technical skill (OSATS) for surgical residents. Br J Surg. 1997;84(2):273–8.

Ahmed K, Miskovic D, Darzi A, Athanasiou T, Hanna GB. Observational tools for assessment of procedural skills: a systematic review. Am J Surg. 2011;202(4):469-80 e6.

van Hove PD, Tuijthof GJ, Verdaasdonk EG, Stassen LP, Dankelman J. Objective assessment of technical surgical skills. Br J Surg. 2010;97(7):972–87.

Groenier M, Brummer L, Bunting BP, Gallagher AG. Reliability of observational assessment methods for outcome-based assessment of surgical skill: systematic review and meta-analyses. J Surg Educ. 2020;77(1):189–201.

Vanderbilt AA, Grover AC, Pastis NJ, Feldman M, Granados DD, Murithi LK, et al. Randomized controlled trials: a systematic review of laparoscopic surgery and simulation-based training. Glob J Health Sci. 2014;7(2):310–27.

Nayar SK, Musto L, Baruah G, Fernandes R, Bharathan R. Self-assessment of surgical skills: a systematic review. J Surg Educ. 2020;77(2):348–61.

Lave J, Wenger E. Situated learning: legitimate peripheral participation. Cambridge University Press; 1991.

Bruner JS. The process of education. Cambridge, MA: Harvard University Press; 1960.

Vygotsky LS. Mind in society: the development of higher psychological processes. Cole M, John-Steiner V, Scribner S, Souberman E, editors. Cambridge, MA: Harvard University Press; 1978.

Burgess A, van Diggele C, Roberts C, Mellis C. Introduction to the peer teacher training in health professional education supplement series. BMC Med Educ. 2020;20(Suppl 2):454.

Achenbach J, Schafer T. Modelling the effect of age, semester of study and its interaction on self-reflection of competencies in medical students. Int J Environ Res Public Health. 2022;19(15):9579.

Jaffe TA, Hasday SJ, Knol M, Pradarelli J, Pavuluri Quamme SR, Greenberg CC, et al. Strategies for new skill acquisition by practicing surgeons. J Surg Educ. 2018;75(4):928–34.

Schwartz SI, Galante J, Kaji A, Dolich M, Easter D, Melcher ML, et al. Effect of the 16-hour work limit on general surgery intern operative case volume: a multi-institutional study. JAMA Surg. 2013;148(9):829–33.

Tofel-Grehl C, Feldon D. Cognitive task analysis-based training: a meta-analysis of studies. J Cogn Eng Decis Making. 2013;7:293–304.

Edwards TC, Coombs AW, Szyszka B, Logishetty K, Cobb JP. Cognitive task analysis-based training in surgery: a meta-analysis. BJS Open. 2021;5(6):zrab122.

Maertens H, Madani A, Landry T, Vermassen F, Van Herzeele I, Aggarwal R. Systematic review of e-learning for surgical training. Br J Surg. 2016;103(11):1428–37.

Gentry SV, Gauthier A, L’Estrade Ehrstrom B, Wortley D, Lilienthal A, Tudor Car L, et al. Serious gaming and gamification education in health professions: systematic review. J Med Internet Res. 2019;21(3):e12994.

Graafland M, Schraagen JM, Schijven MP. Systematic review of serious games for medical education and surgical skills training. Br J Surg. 2012;99(10):1322–30.

LoGiudice AB, Sibbald M, Monteiro S, Sherbino J, Keuhl A, Norman GR, et al. Intrinsic or invisible? An audit of CanMEDS roles in entrustable professional activities. Acad Med. 2022;97:1213–8.

Bramley AL, McKenna L. Entrustable professional activities in entry-level health professional education: a scoping review. Med Educ. 2021;55:1011–32.

Liu L, Jiang Z, Qi X, Xie A, Wu H, Cheng H, et al. An update on current EPAs in graduate medical education: a scoping review. Med Educ Online. 2021;26:1981198.

Acknowledgements

Not applicable.

Disclosures

This research did not receive any specific grants from funding agencies in the public, commercial, or not-for-profit sectors.

Consent to publish

Not applicable due to the nature of the study.

Funding

Open access funding provided by Uppsala University.

Author information

Authors and affiliations

Department of Pediatric Surgery, New Children’s Hospital, Helsinki University Hospital, Helsinki, Finland

Niklas Pakkasjärvi

Department of Pediatric Surgery, Section of Urology, University Children’s Hospital, Uppsala, Sweden

Faculty of Educational Sciences, University of Helsinki, Helsinki, Finland

Henrika Anttila & Kirsi Pyhältö

Centre for Higher and Adult Education, Faculty of Education, Stellenbosch University, Stellenbosch, South Africa

Kirsi Pyhältö

Contributions

Conceptualization, N.P. & K.P.; methodology, N.P.; software, H.A.; validation, N.P., H.A. and K.P.; formal analysis, N.P., H.A.; investigation, N.P., H.A.; resources, H.A.; data curation, H.A.; writing—original draft preparation, N.P.; writing—review and editing, N.P., H.A., K.P.; visualization, N.P.; supervision, K.P.; project administration, K.P. All authors have read and agreed to the published version of the manuscript.

Corresponding author

Correspondence to Niklas Pakkasjärvi .

Ethics declarations

Ethics approval and consent to participate

This systematic review did not involve any human participants or experimental interventions; therefore, ethical approval was not required. We adhered to PRISMA guidelines for methodology.

Consent for publication

Consent to participate was not applicable due to the nature of the study, which did not involve human participants or experimental interventions.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Pakkasjärvi, N., Anttila, H. & Pyhältö, K. What are the learning objectives in surgical training – a systematic literature review of the surgical competence framework. BMC Med Educ 24, 119 (2024). https://doi.org/10.1186/s12909-024-05068-z

Received : 25 September 2023

Accepted : 17 January 2024

Published : 06 February 2024

DOI : https://doi.org/10.1186/s12909-024-05068-z


  • Surgical competence
  • Surgical education
  • Systematic literature review

BMC Medical Education

ISSN: 1472-6920

Computer Science > Cryptography and Security

Title: The current state of security -- Insights from the German software industry

Abstract: These days, software development and security go hand in hand. Numerous techniques and strategies are discussed in the literature that can be applied to guarantee the incorporation of security into the software development process. In this paper the main ideas of secure software development that have been discussed in the literature are outlined. Next, a dataset on implementation in practice is gathered through a qualitative interview research involving 20 companies. Trends and correlations in this dataset are found and contrasted with theoretical ideas from the literature. The results show that the organizations that were polled are placing an increasing focus on security. Although the techniques covered in the literature are being used in the real world, they are frequently not fully integrated into formal, standardized processes. The insights gained from our research lay the groundwork for future research, which can delve deeper into specific elements of these methods to enhance our understanding of their application in real-world scenarios.

IMAGES

  1. Literature search strategy.

  2. Develop Search Strategy

  3. Flowchart of search strategy for literature review.

  4. Search strategy

  5. Literature search strategy

  6. Literature search strategy (PRISMA flowchart).

VIDEO

  1. Approaches to searching the literature

  2. Effective Review of Literature

  3. Literature Search Strategy by Vasumathi Sriganesh

  4. Literature Review 101

  5. What is Literature??

  6. Broaden your literature review with EBSCO search strategy- by Pradeep(June2023)

COMMENTS

  1. Develop a search strategy

    A search strategy includes a combination of keywords, subject headings, and limiters (language, date, publication type, etc.). A search strategy should be planned out and practiced before executing the final search in a database.

  2. How to carry out a literature search for a systematic review: a

    Literature reviews are conducted for the purpose of (a) locating information on a topic or identifying gaps in the literature for areas of future study, (b) synthesising conclusions in an area of ambiguity and (c) helping clinicians and researchers inform decision-making and practice guidelines.

  3. How to Construct an Effective Search Strategy

    This guide provides detailed information about conducting a literature review. In this section we will review the steps you take in order to perform an effective search using databases and other resources. Every search begins with a research question or topic.

  4. A systematic approach to searching: an efficient and complete method to

    The described method can be used to create complex and comprehensive search strategies for different databases and interfaces, such as those that are needed when searching for relevant references for systematic reviews, and will assist both information specialists and practitioners when they are searching the biomedical literature.

  5. Develop a search strategy

    A search strategy is an organised structure of key terms used to search a database. The search strategy combines the key concepts of your search question in order to retrieve accurate results. Your search strategy will account for all possible search terms, keywords and phrases, and truncated and wildcard variations of search terms.

  6. Literature search strategies

    Literature search strategies - Evidence review for targets - NCBI Bookshelf The literature searches for this review are detailed below and complied with the methodology outlined in Developing NICE guidelines: the manual 2014, updated 2017.

  7. Researching for your literature review: Develop a search strategy

    Start developing a search strategy by identifying the key words and concepts within your research question. For example: How do students view inclusive educational practices in schools? Treat each component as a separate concept (there are usually between 2-4 concepts).

  8. Literature Review: Developing a search strategy

    Literature search - a librarian's handout to introduce tools, terms and techniques: created by Elsevier librarian Katy Kavanagh Web, this document outlines tools, terms and techniques to think about when conducting a literature search.

  9. Defining the process to literature searching in systematic reviews: a

    Background Systematic literature searching is recognised as a critical component of the systematic review process. It involves a systematic search for studies and aims for a transparent report of study identification, leaving readers clear about what was done to identify studies, and how the findings of the review are situated in the relevant evidence. Information specialists and review teams ...

  10. Research Guides: Literature Reviews: Develop Search Strategies

    Developing a search strategy is a balance between needing a very precise search that yields fewer highly relevant results or a comprehensive search (high retrieval) with lower precision. The focus of a narrative literature review for a dissertation or thesis is thoroughness, so you should aim for high retrieval.

  11. Defining the process to literature searching in systematic reviews: a

    The NICE handbook also suggests the use of multi-stranded approaches to developing literature search strategies. Glanville developed this idea in a study by Whitting et al., and a worked example of this approach is included in the development of a search filter by Cooper et al. Writing search strategies: conceptual and objective approaches ...

  12. Developing a Search Strategy

    Techniques for search term harvesting: begin brainstorming search terms by exploring gold standard articles supplied by the principal investigator or found through preliminary searches; looking at search strategies from published systematic reviews; and scanning records, articles, and searches for usable controlled vocabulary and natural language.

  13. Researching for your literature review: Develop a search strategy

    Start developing a search strategy by identifying the key words and concepts within your research question. The aim is to identify the words likely to have been used in the published literature on this topic.

  14. 5 Identifying the evidence: literature searching and evidence ...

    Our literature searches are designed to be systematic, transparent, and reproducible, and minimise dissemination bias. Dissemination bias may affect the results of reviews and includes publication bias and database bias. We use search methods that balance recall and precision.

  15. How to undertake a literature search: a step-by-step guide

    Undertaking a literature search can be a daunting prospect. Breaking the exercise down into smaller steps will make the process more manageable. This article suggests 10 steps that will help readers complete this task, from identifying key concepts to choosing databases for the search and saving the …

  16. Developing a Search Strategy

    A search strategy is an organized structure of key terms used to search a database. The search strategy combines the key concepts of your search question in order to retrieve accurate results. Your search strategy will account for all possible search terms, keywords and phrases, and truncated and wildcard variations of search terms.

  17. 3. Search the literature

    Documenting your search will help you stay organized and save time when tweaking your search strategy, and it lets you reuse successful search strategies for future papers, describe your search process for manuscripts, and justify your search process. This is a critical step for rigorous review papers, such as systematic reviews.

  18. How to Develop a Literature Search Strategy

    A literature search strategy is an action plan for martialing all of the scholarly sources that you will use in your dissertation. Your search will give you a clear picture of the history of your topic and the current critical conversation, as well as expose gaps in the literature ripe for investigation.

  19. Literature search strategy

    Sometimes you are required to explain the literature search strategy used in your research. Even when you are not officially required to do so, including an explanation of your literature search strategy in the literature review chapter can boost your marks considerably.

  20. 7 Proven literature search strategies for scientific literature review

    Identify main keywords: One of the top literature search strategies is to segregate your area of research into broad topics, and use these to define relevant keywords, which can be used to fine-tune and focus your search. Also write down synonyms and/or alternative phrases for each keyword, in case you need to broaden your search. A minimal query-building sketch based on this approach appears at the end of this list.

  21. Literature search for research planning and identification of research

    Literature search is done to identify appropriate methodology, design of the study; population sampled and sampling methods, methods of measuring concepts and techniques of analysis.

  22. Search strategies

    Search terms are extracted from your research question (such as the terms that make up your PICO) and can be entered into whichever database(s) you decide to use. Databases give you the option of using keywords or subject headings. Each database has its own set of subject headings, designed specifically for the literature from the field(s) of study the database contains.

  23. How to Incorporate & Cite Grey Literature

    However, there are some key differences to remember when using grey literature. Evaluate and Verify. It is worth repeating that grey literature is not peer-reviewed like scholarly works, so evaluating and verifying grey literature before using it is extremely important. Use this guide's "Evaluating Grey Literature" page to see how this process ...

  24. Upcoming Library workshops for graduate students

    What Literature Reviews are, and where to find them; How to use concept mapping to identify which secondary sources you need; How to find and organize your sources to facilitate your writing; Advanced tips and strategies for searching library databases so you can: Save your searches; Create Alerts for new citations in your topic

  25. Cancer Screening Services: What Do Indigenous Communities Want? A

    The search terms used were "Indigenous community or Indigenous communities," "cancer screening," and "facilitators, enablers, desires, or needs." Qualitative studies published up to August 30, 2022, investigating the perspectives of Indigenous communities on factors encouraging screening participation were included in the study.

  26. Methods for Literature Search

    Literature Review Methods. Based on input from our expert advisors, our conceptual model, and practical considerations, we developed literature review methods that included: inclusion and exclusion criteria to identify potentially relevant articles, search strategies to retrieve articles, abstract review protocols, and a system of scoring ...

  27. What are the learning objectives in surgical training

    To map the landscape of contemporary surgical education through a competence framework by conducting a systematic literature review on learning outcomes of surgical education and the instructional methods applied to attain the outcomes. Surgical education has seen a paradigm shift towards competence-based training. However, a gap remains in the literature regarding the specific components of ...

  28. [2402.08436] The current state of security -- Insights from the German

    These days, software development and security go hand in hand. Numerous techniques and strategies are discussed in the literature that can be applied to guarantee the incorporation of security into the software development process. In this paper the main ideas of secure software development that have been discussed in the literature are outlined. Next, a dataset on implementation in practice ...
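
Several of the entries above describe the same practical workflow: group synonyms for each concept, combine them with OR, join the concept groups with AND, apply truncation, and then document the database, date, query string, and hit count. The short Python sketch below illustrates that workflow under stated assumptions: the concept groups and synonyms are hypothetical placeholders, and the only external call is to NCBI's public E-utilities `esearch` endpoint, used here simply to retrieve a PubMed hit count. It is an illustrative aid, not a tool recommended by any of the sources listed.

```python
import datetime
import json
import urllib.parse
import urllib.request

# Hypothetical concept groups: synonyms are OR'd within each group,
# and the groups are AND'd together. Asterisks mark truncation.
CONCEPTS = {
    "surgical training": ['"surgical education"', '"surgical training"', "residen*"],
    "simulation": ["simulat*", '"virtual reality"', '"serious game*"'],
    "competence": ["competenc*", '"learning outcome*"', "assessment*"],
}


def build_query(concepts):
    """Combine synonym groups into a single Boolean search string."""
    groups = ["(" + " OR ".join(terms) + ")" for terms in concepts.values()]
    return " AND ".join(groups)


def pubmed_hit_count(query):
    """Ask NCBI E-utilities (esearch) how many PubMed records match the query."""
    url = (
        "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?"
        + urllib.parse.urlencode({"db": "pubmed", "term": query, "retmode": "json"})
    )
    with urllib.request.urlopen(url, timeout=30) as response:
        payload = json.load(response)
    return int(payload["esearchresult"]["count"])


if __name__ == "__main__":
    query = build_query(CONCEPTS)
    # Document the search: database, date, full query string, and hit count,
    # as recommended for reproducible (e.g., PRISMA-style) reporting.
    log_entry = {
        "database": "PubMed",
        "date": datetime.date.today().isoformat(),
        "query": query,
        "hits": pubmed_hit_count(query),
    }
    print(json.dumps(log_entry, indent=2))
```

A strategy logged this way (database, date, full query, hit count) can be pasted into a search log or a methods appendix and re-run later to see how the result set has changed; the same query string can also be adapted to each database's own syntax, as several entries above advise.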