Indian Journal of Urology
SYMPOSIUM
Year : 2009  |  Volume : 25  |  Issue : 2  |  Page : 246-250
 

Optimal strategies for literature search


Vattikuti Urology Institute, Henry Ford Hospital, Detroit, Michigan, USA

Date of Web Publication: 24-Jun-2009

Correspondence Address:
Siddharth Siva
Vattikuti Urology Institute, Henry Ford Hospital, Detroit, Michigan
USA

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/0970-1591.52936


 
   Abstract 

With the large number of urological journals now indexed in online search engines, reading just a few journals will not keep urologists up to date on the latest developments. This paper proposes search strategies to speed the search and retrieval of the required literature, so that the best evidence may be used to guide practice. This survey of optimal strategies begins with framing the inquiry so that the search engine returns results within an accurate scope. The researcher must also isolate the type of evidence appropriate for the scenario and determine its validity. Finally, regardless of the extent of their institution's subscriptions, researchers should be able to obtain the complete document. Besides search strategies, this article extensively reviews sources of information valuable to urologists, including databases and web links.


Keywords: Databases, evidence-based medicine, level of evidence, literature search


How to cite this article:
Siva S. Optimal strategies for literature search. Indian J Urol 2009;25:246-50

How to cite this URL:
Siva S. Optimal strategies for literature search. Indian J Urol [serial online] 2009 [cited 2019 Aug 24];25:246-50. Available from: http://www.indianjurol.com/text.asp?2009/25/2/246/52936



   Introduction


Urologists and other healthcare professionals are expected to use evidence-based medicine (EBM) when accumulating and applying knowledge in their practice or research. [1] The most common definition of EBM, developed by Sackett et al. in 1996, is "the conscientious, explicit, and judicious use of current best evidence in making decisions about the care of the individual patient"; it means integrating individual clinical expertise with the best available external clinical evidence from systematic research. [2] EBM can be described in six basic steps, each starting with the letter 'A', as described in [Table 1]. [3] This paper will focus on strategies to 'acquire' the best evidence, while touching on techniques to 'ask' and 'appraise'.

Keeping up with the plethora of new research and search technologies without substantial loss of time requires intelligent strategies that make the best of available resources. Ideally, every clinical and research institute would have access to a trained librarian to facilitate quick searches. In most cases, however, this is either too expensive a proposition for clinics, or librarians are too busy to help healthcare practitioners who need a quick response. An astute understanding of the available resources and of the best techniques for extracting information from them makes every clinician substantially more self-sufficient in an evidence-based practice.


   Framing the Question


Healthcare professionals may explore the literature for a variety of reasons: an extensive meta-analysis project, personal fulfillment, or a pressing clinical question. Since the last has the greatest time constraint, this paper will focus on such queries, though the strategies are applicable and effective in all three scenarios. The intelligent use of technology and validated evidence enables healthcare professionals to translate the developments of clinical research into patient care solutions. Indeed, research has shown that the use of online searches early in hospitalization reduces the length of hospital stay and its cost for patients. [4]

Clinical queries vary in the scope of the question being asked. Suppose the question is relatively broad, for example: "What are the contemporary methods for treating struvite stones?" A doctor with this inquiry would be interested in methodologically developed guidelines based on critically identified and analyzed randomized controlled trials (RCTs) and systematic reviews. On the other hand, the clinician may have a more specific inquiry, for example, the relationship between Oxalobacter formigenes and urolithiasis. The first objective should be to find a recent systematic review, ideally one developed without bias in the identification of research and the presentation of results, or to explore RCTs directly. Of course, questions need not concern only therapies; they may be about diagnosis, prognosis, or cost-effectiveness, and there are techniques to optimize searches for each of these study types.

Such a categorization of studies is only one part of a search strategy. The first challenge of a literature search is framing a focused question on the specific clinical problem. Research has shown that incomplete literature searches result from inappropriate terms and failure to cover query concepts. [5] A comprehensive question gives the searcher a guide for deriving relevant search terms. One frequently used mnemonic, referred to as 'PICOT', breaks the question into five parts: the patient problem or population (P), intervention (I), comparison (C), outcome (O), and type of study (T). [6],[7] The 'patient problem' should include the chief complaint, 'comparison' refers to the alternative to the intervention, and 'outcome' may depend on the patient's wishes or the physician's optimal scenario.

Formalizing this question-creation process helps the researcher focus on the chief complaint, brainstorm the key terms for conducting the search, and determine the ideal outcome and evidence. Krupski et al. provide a sample question: "In men with acute urinary retention (P), does treatment with an alpha-blocker (I) improve the chance of passing a voiding trial (O) compared to placebo?" [7] It is not necessary to use all the concepts in the search strategy; in the given example, 'comparison' and 'type of study' may instead be used to place further limits on the search results. Such flexibility allows the searcher to avoid queries so specific that they miss the key evidence, or so broad that they return superfluous results that must be examined manually.
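As a concrete illustration, the PICOT decomposition of the sample question above can be sketched as a simple data structure. This is only a sketch: the field and function names below are illustrative, not part of any standard tool.

```python
# Sketch: decompose the sample question from Krupski et al. into PICOT
# parts, then pull out the concepts used for the initial search.
# 'Comparison' and 'type of study' are held back as optional limits.

picot = {
    "population":   "men with acute urinary retention",
    "intervention": "alpha-blocker",
    "comparison":   "placebo",                          # optional limit
    "outcome":      "passing a voiding trial",
    "type":         "randomized controlled trial",      # optional limit
}

def core_concepts(question):
    """Concepts for the initial search: P, I, and O only."""
    return [question["population"], question["intervention"],
            question["outcome"]]

print(core_concepts(picot))
```

Keeping the 'comparison' and 'type of study' fields separate, as here, mirrors the advice above: they stay available for narrowing an overly broad result set without being baked into the initial query.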


   Finding the Evidence


Distinguishing the quality of various types of evidence is fundamental to the effective practice of EBM. Several reviews have proposed a hierarchy of evidence, in which a study's level reflects its validity and lack of bias. [8],[9],[10] A commonly used hierarchy developed by Sackett et al. is displayed in [Table 2] and further analyzed at the website of the Centre for Evidence-Based Medicine ( http://www.cebm.net ). [9],[10] Also among the most popular and influential are the '5S' levels suggested by Haynes et al., described in [Table 3]. [11] This categorization proposes five layers of analysis, each successive layer being of greater reliability. The hierarchy builds from the bottom: 'studies' such as clinical trials in PubMed, then 'syntheses' such as systematic reviews in the Cochrane Library, 'synopses' such as descriptions of articles in evidence-based journals, topic 'summaries' such as preappraised reviews of intervention studies, and finally 'systems', such as computer networks that can link patient characteristics to evidence. Haynes recommends starting at the highest level, systems, and working downwards until the researcher finds the evidence of interest. 'Systems', however, require a level of administration and investment that is not available in most clinics. The following sections list potential sources in the other categories.


   Summaries


The Physicians' Information and Education Resource (PIER - http://pier.acponline.org), created by the American College of Physicians, is segmented into modules that contain guidance and practice recommendations backed by recently collected evidence. The modules themselves are split into sections such as Diseases, Screening and Prevention, Complementary and Alternative Medicine, Ethical and Legal Issues, and Procedures. There is also a database for drug searches. The BMJ publishing group has created Clinical Evidence ( http://www.clinicalevidence.com/ceweb/conditions/index.jsp ), a service that critically appraises and describes evidence from systematic reviews, RCTs, and, in the absence of such studies, observational studies. The content is organized by sections and topics; urologists will find a variety of topics of interest under Men's Health, Women's Health, Sexual Health, and Kidney Disorders. Both databases require subscriptions, and PIER requires membership in the American College of Physicians.


   Synopses


These sources provide concise summaries of original studies and systematic review articles deemed relevant for immediate clinical impact. The descriptions are usually 1-2 pages and are accompanied by expert analysis of the most relevant information in the paper. ACP Journal Club ( http://www.acpjc.org ) scans over 100 clinical journals and rates methodologically chosen papers for relevance on a scale of 1-7. The summaries are presented as 'structured abstracts' that include the objective, main outcomes, validity criteria, and an analysis of the results. Evidence-Based Medicine ( http://www.evidence-basedmedicine.com ) also applies strict criteria to the selection of research and provides expert commentary, focusing primarily on coverage of primary care medicine.


   Syntheses


The Cochrane Library ( http://www.thecochranelibrary.com ) is a set of databases, most importantly the Cochrane Database of Systematic Reviews, categorized by both topics and specialties. Another resource within the library is the Database of Abstracts of Reviews of Effects (DARE), which contains over 3000 critical appraisals of systematic reviews developed outside of Cochrane and affiliated databases. The Cochrane Central Register of Controlled Trials (CENTRAL) includes summaries of articles on controlled trials retrieved from MEDLINE, EMBASE, and other sources. Other unique features include The NHS Economic Evaluation Database (NHSEED), which contains appraisals of nearly 5000 cost-effectiveness studies. Countries such as the UK, India, and Australia provide the service to residents, though a subscription is required in the United States.

BMJ Updates ( http://www.bmjupdates.com ) is free of charge to online registrants; over 110 clinical journals are prerated for quality by BMJ research staff and then rated for clinical relevance by a panel of practicing physicians worldwide. The database allows search by type of study, specialty, and clinical question. The National Guideline Clearinghouse ( http://www.guidelines.gov ) is an initiative of the US Department of Health and Human Services and contains evidence-based clinical practice guidelines, abstract summaries, and a unique guideline comparison that highlights similarities and differences. Researchers can also try the Turning Research into Practice database (TRIP - http://www.tripdatabase.com), which searches a vast variety of databases, such as Cochrane, Clinical Evidence, and several high-impact journals, for systematic reviews and guidelines. Both the American Urological Association ( AUA - http://auanet.org/guidelines/ ) and the European Association of Urology ( EAU - http://www.uroweb.org ) maintain large repositories of practice guidelines.


   Sources of Primary Evidence


There are several excellent primary-source search engines that most healthcare professionals will use during their career. The Science Citation Index ( SCI - http://www.isiwebofknowledge.com ) is a product of Thomson Scientific and one of the databases featured in the company's 'Web of Knowledge' resource; SCI covers 6400 of the world's journals on science and technology. OvidSP ( http://ovidsp.ovid.com/ ), released in early 2008 as an update of the well-respected Ovid database, is a product of the information services company Wolters Kluwer. New features include better options for managing results, a search history, and 'Search Aid', a detailed footprint of the engine's search strategy, which can then be further modified for more accurate results.

Other notables include Elsevier's Scirus ( http://www.scirus.com ) and CiteSeer. The titans of the software industry, Microsoft and Google, have made their own scholarly literature search engines: Windows Live Academic and Google Scholar (GS), respectively. Although Live Academic was closed in 2008, GS ( http://scholar.google.com ) has thrived, both because of the parent company's success and because of the search engine's simpler search interface. Most of the criticism leveled against GS revolves around the secrecy of the list of scholarly journals and databases the engine is allowed to crawl. GS also lacks many of the search manipulation features available in PubMed, discussed further below. However, GS, along with Elsevier's Scopus ( http://www.scopus.com ), is popularly used for extensive author and article citation analysis.

Though GS is also free, PubMed's ( http://www.PubMed.gov ) strengths are its search features and its access to 17,000,000 articles from 5000 journals. Though PubMed results can take a sizeable investment of time to navigate, and many full texts require individual or institutional access, PubMed is the most widely used search engine. The rest of this paper develops search strategies applicable to a variety of databases, but focuses on PubMed because of its popularity.


   Communicating with Search Engines


Search engines vary in the complexity of searches allowed, the strictness of the format and syntax required, the ability to build and save searches, and the facilities for receiving automatic updates. Upon completion of the first step, development of the question, researchers traditionally take one of two routes to identify information. The first is searching for a string of characters, such as a word or a sequence of words. The search engines for PubMed and the Cochrane Library can search for words that share the same root if the symbol '*' is entered at the end of the string of characters, though other databases may use the symbol '$'. Researchers should be wary of databases that automatically truncate their searches, as this will greatly increase the number of irrelevant results. International searchers should also be cautious of alternate spellings of the same word, though most databases will now find other spellings or provide options for doing so.

In most databases, terms can be combined using the Boolean operators 'AND', 'OR', and 'NOT'. Searching with AND limits results to papers that contain both words, while OR identifies papers that contain either word. The use of 'NOT' is a drastic way of reducing the number of results, since it removes all results containing the given terms. Many databases offer adjacency operators, a useful feature not yet available in PubMed: 'NEXT', 'NEAR', or 'ADJ'. ADJ or NEXT can find papers where two terms are placed adjacent to each other, or within a certain number of words or sentences. This is useful when the two terms, searched separately, would retrieve papers unrelated to their combined meaning. The NEAR operator behaves differently in different databases: in the Cochrane Library, a NEAR search for two terms returns papers where they occur within six words of each other, while in other databases they may only need to be in the same sentence.
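The Boolean combination described above can be sketched in a few lines. The helper functions below are hypothetical, not part of any database's interface, and the query syntax they emit is only illustrative of the general pattern: synonyms within a concept joined by OR, concepts joined by AND.

```python
# Sketch: synonyms within one concept are joined with OR (and
# parenthesized); the concepts themselves are joined with AND.
# 'urolith*' uses the truncation symbol to match urolithiasis,
# urolithic, and other words sharing the root.

def or_group(synonyms):
    return "(" + " OR ".join(synonyms) + ")"

def and_query(*concepts):
    return " AND ".join(concepts)

stones  = or_group(["struvite stones", "urolith*"])
therapy = or_group(["lithotripsy", "percutaneous nephrolithotomy"])

print(and_query(stones, therapy))
# (struvite stones OR urolith*) AND (lithotripsy OR percutaneous nephrolithotomy)
```

The parentheses around each OR group matter: without them, most engines would bind AND more tightly than OR and silently change the meaning of the query.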

Searching by words has several limitations. First, words must match exactly for results to be retrieved (i.e., the spelling must be exact). Second, engines such as PubMed search only the abstract and title, and if these were composed incompletely, the paper is missed by the search. Finally, search intention and results will inevitably diverge, because search words cannot always capture the concept the researcher is looking for. For example, a combined search for 'biopsies', 'detection', and 'prostate cancer' will return results both for biopsies for prostate cancer detection before treatment and for detection of prostate cancer recurrence after treatment.

The alternative to searching by words is searching by index terms; in MEDLINE, such terms are referred to as Medical Subject Headings (MeSH). A helpful tool when the original search fails to provide satisfactory results, a MeSH search will suggest additional terms related to the search, which may lead to similar information in related topics. Prepared by the compilers of MEDLINE, these terms are organized hierarchically, with 16 broad subject categories dividing in a tree-like structure into nearly 25,000 topics. Each article will usually carry 10-15 subject headings; besides the topic of the article, indexers also assign terms to describe the age group, gender, and publication type. MEDLINE's compilers update terms frequently, as the literature is revised with new terminologies. CINAHL and EMTREE, the index of EMBASE, use their own distinct terms.

To search exclusively by MeSH terms, users can click on 'MeSH Database' in the left column of the PubMed homepage and enter a topic. PubMed will either respond with a results page of citations for that term or suggest MeSH terms used by the compilers for it. Clicking on 'Links' to the right of a MeSH term provides two more options: 'search for citations that include the concept' or 'search for citations where this concept is a major topic of the article'. The first choice will usually return far more results, though these may subsequently be narrowed using various limits. PubMed offers several helpful tutorials on using MeSH terms in online searches ( http://www.ncbi.nlm.nih.gov/sites/entrez?db=mesh ).
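The distinction between these options also surfaces in PubMed's query syntax as field tags appended to a term. The snippet below simply assembles such tagged strings; the topic chosen is illustrative, and the tag semantics in the comments reflect PubMed's documented behavior as the author understands it.

```python
# Sketch: the same topic searched three ways using PubMed field tags.
# [mh]   - citations indexed with this MeSH heading (includes narrower terms)
# [majr] - the heading must be a major topic of the article
# [tiab] - plain word search in title and abstract only

topic = "urinary calculi"

as_mesh_term   = f"{topic}[mh]"
as_major_topic = f"{topic}[majr]"
as_text_words  = f"{topic}[tiab]"

print(as_mesh_term)     # urinary calculi[mh]
print(as_major_topic)   # urinary calculi[majr]
```

The [mh] form corresponds to 'citations that include the concept' and [majr] to 'citations where this concept is a major topic'; the [tiab] form falls back to the word-based searching whose limitations were discussed above.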

If searchers are faced with an overabundance of results, they can use the LIMIT feature to restrict results by any category the database assigns to its abstracts. For example, MEDLINE searches can be limited by title of the article, abstract, authors' names, author affiliation, medical subject headings (MeSH), source title, type of publication, and publication language. The types of publications can be further separated into clinical trials, editorials, letters, meta-analyses, practice guidelines, RCTs, and reviews. Though listed as an additional feature in the left column of PubMed, Clinical Queries essentially functions as another LIMIT feature. The function filters retrieval to studies that are immediately useful for evidence-based clinical medicine across four categories: treatment, diagnosis, etiology, and prognosis. Based on the work of Haynes et al., the same author who contributed the 5S system, the retrieval can be tuned to be specific or broad, depending on the searcher's wishes. [12] PubMed describes the operating strategies of the Clinical Queries search engine on its website: http://www.ncbi.nlm.nih.gov/entrez/query/static/clinicaltable.html. With features like Clinical Queries and the numerous other tools into which search engines have evolved, healthcare practitioners and researchers can now optimize search efficiency and accuracy.
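In query-syntax terms, limits of this kind amount to additional field-tagged clauses ANDed onto the query. The helper and base query below are hypothetical; [pt] and [la] are PubMed's publication-type and language tags.

```python
# Sketch: narrow an existing query with limit clauses.
# [pt] restricts by publication type, [la] by language.

def with_limits(query, *limits):
    return " AND ".join(["(" + query + ")"] + list(limits))

base = "urinary calculi[mh] AND lithotripsy[tiab]"
limited = with_limits(base,
                      "randomized controlled trial[pt]",
                      "english[la]")

print(limited)
# (urinary calculi[mh] AND lithotripsy[tiab]) AND randomized controlled trial[pt] AND english[la]
```

This mirrors the workflow recommended earlier: run the broad P-I-O query first, then apply 'type of study' and other PICOT leftovers as limits only if the result set is unmanageable.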


   Determining the Validity of the Evidence


It is the researcher's burden to determine whether their search results are truly reliable sources of information. The complete guidelines for determining the validity of resources are beyond the scope of this paper, though they can be fully understood with reading and experience. [13],[14],[15] Sackett et al. propose three types of questions that should be asked of every study, concerning validity, clinical importance, and applicability [Table 4]. [9] There may be thousands of 'studies' on a medical condition, yet none may apply to the type of patient a clinician is enquiring about. 'Syntheses' developed from different groups of studies may disagree with one another. 'Synopses' take months to years to compose, and the reviews and reports they cover may already be outdated by their publication. The plethora of evidence-based 'summary' publications covers a relatively small number of highly researched medical diseases. Computer decision support 'systems' are rare, expensive, and require extensive maintenance and administration to remain reliable. Further, as discussed above, though useful and influential, the hierarchy of evidence suggested by Haynes is one among several valid rankings.


   Extracting the Paper


Hospitals and academic centers traditionally buy a set of subscriptions that allow faculty, residents, and students to access papers' full text online. The extent of the subscriptions varies by institution, both in the number of subscriptions in each specialty and in the spectrum of specialties and medical interest topics covered. If an institution has subscription access, the majority of literature published within the last two decades should be available for download as a PDF document. On some occasions, documents published online in unsubscribed journals may be acquired by hospitals and academic centers at a discounted rate by special request via the library.

Clinicians without the luxury of institutional access can use other strategies to retrieve papers. The first option is to use another institution's resources as a guest. If online articles are not available, the journal may still be shelved and capable of being photocopied. If nothing else, one may request a friendly favor from a local or international colleague who is able to access the literature. The final option, of course, is purchasing the article or the journal issue.


   Conclusion


Despite the abundance of resources on the internet, understanding the right search engines and employing optimal strategies are essential to acquiring the best of the available evidence. The strategies and steps in this paper can be individualized for the searcher's specialty and environment. As literature searching is mastered, a healthcare professional becomes substantially more proficient at developing and practicing EBM. The result of developing literature search skills is less frustration and uncertainty, and better efficiency and patient care.

 
   References

1.Rosenberg W, Donald A. Evidence based medicine: An approach to clinical problem-solving. BMJ 1995;310:1122-6.
2.Sackett DL, Rosenberg WM, Gray JA, Haynes RB, Richardson WS. Evidence based medicine: What it is and what it isn't. BMJ 1996;312:71-2.
3.Petrisor BA, Bhandari M. Principles of teaching evidence-based medicine. Injury 2006;37:335-9.
4.Klein MS, Ross FV, Adams DL, Gilbert CM. Effect of on-line literature searching on length of stay and patient care costs. Acad Med 1994;69:489-95.
5.Doig GS, Simpson. Efficient literature searching: A core skill for the practice of evidence-based medicine. Intensive Care Med 2003;29:2119-27.
6.Richardson WS, Wilson MC, Nishikawa J, Hayward RS. The well-built clinical question: A key to evidence-based decisions. ACP J Club 1995;123:A12-3.
7.Krupski TL, Dahm P, Fesperman SF, Schardt CM. How to perform a literature search. J Urol 2008;179:1264-70.
8.Agency for Healthcare Research and Quality. Systems to rate the strength of scientific evidence. Evid Rep Technol Assess 2002;47:1-11.
9.Sackett DL. Evidence-Based Medicine: How to Practice and Teach EBM. 2nd ed. Churchill Livingstone; 2000.
10.Centre for Evidence-Based Medicine: Levels of evidence. Available from: http://cebm.net/index.aspx?o=1025. [Last accessed on 2008 Sep 10].
11.Haynes RB. Of studies, syntheses, synopses, summaries, and systems: The "5S" evolution of information services for evidence-based healthcare decisions. Evid Based Med 2006;11:162-4.
12.Haynes RB, Wilczynski N, McKibbon KA. Developing optimal search strategies for detecting clinically sound studies in MEDLINE. J Am Med Inform Assoc 1994;1:447-58.
13.Tseng TY, Dahm P, Poolman RW, Preminger GM, Canales BJ, Montori VM. How to use a systematic literature review and meta-analysis. J Urol 2008;180:1249-56.
14.Scales CD, Preminger GM, Keitz SA, Dahm P. Evidence-based clinical practice: A primer for urologists. J Urol 2007;178:775-82.
15.Davis JW, Chang SS, Schellhammer PF. Clinical trials methodology. AUA Update Series 2004;23:2004.



 
 
    Tables

  [Table 1], [Table 2], [Table 3], [Table 4]
