
AI Tools for Academic Literature Research

This guide surveys AI-powered tools that can assist with scholarly literature searches and reference management, along with ethical considerations and warnings.

Why Use an AI Tool Built for Research Instead of Using ChatGPT for Everything?

  • Source Data - AI tools for academic research are built on and pull only from scholarly research databases (not X, Reddit, YouTube, etc.)
  • Built-in Functions Useful to Literature Review Processes - Track commonly cited works, identify authors who cite each other, extract key findings, etc.

AI-Powered Academic Research Tools

Semantic Scholar

Overview

  • Semantic Scholar's database of over 200 million scholarly articles underpins many other AI-powered research assistant tools. Semantic Scholar also offers API access for developers (a minimal API sketch follows this list).
  • Researchers can search and organize scholarly papers into folders.
  • Researchers can set up and access a personal research dashboard. Active researchers can expect recommended papers to appear on their dashboards and feeds based on their search history.
  • Save to Library, Create Alert, Cite, Access to References, and Related Papers features are available (the Related Papers link is encircled in bright pink in the screenshot below).
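
Because Semantic Scholar exposes a public Graph API, the same records that power the web interface can be retrieved programmatically. A minimal sketch in Python (the query string and requested fields are illustrative choices, not prescribed by Semantic Scholar):

```python
import requests

# Minimal sketch: search the public Semantic Scholar Graph API.
# Heavier use may require a free API key sent in an "x-api-key" header.
resp = requests.get(
    "https://api.semanticscholar.org/graph/v1/paper/search",
    params={
        "query": "generative AI special education",  # illustrative query
        "fields": "title,year,citationCount",
        "limit": 5,
    },
    timeout=30,
)
resp.raise_for_status()

for paper in resp.json().get("data", []):
    print(paper["year"], paper["title"], f"({paper['citationCount']} citations)")
```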

Ask This Paper

  • The AI-powered "Ask This Paper" feature is available for selected papers.

Shows article view in Semantic Scholar with the Ask This Paper AI Feature marked with an orange square and the Related papers link encircled in a bright pink ellipse.


  • The image shows the paper view for an article found using the Semantic Scholar search function. This article has the "Ask This Paper" feature enabled.
  • It offers sample question prompts, but users can input their own queries. 
  • Researchers who do not want their input data to be shared can check the box to opt out of that function. Checking the box may mean the input data is not used for model training, but that is not clear.

[PDF] Semantic Reader

  • Some (not all) papers with available PDF files provide the [PDF] Semantic Reader function. 
  • Semantic Reader allows the researcher to enable Skimming Highlights for Result (default), Goal, and Method. This feature extracts those elements into a sidebar and highlights the relevant passages in the article's abstract and full text.

Shows generative-AI powered Skimming Highlights sidebar from Semantic Scholar.


  • Semantic Reader also offers a Citation Cards feature which allows researchers to monitor connections among various works they have discovered.
  • Citation cards use color coding of in-text citations to show relationships among papers the researcher has saved in their personal Semantic Scholar Library.
    • Orange: Direct citations of works saved in the Library
    • Pink: Papers cited by a work saved to the Library

For more information, see About Semantic Scholar.

Elicit

Overview

  • Elicit uses Semantic Scholar and OpenAlex as its underlying databases, so it offers access to over 200 million articles (a minimal OpenAlex API sketch follows this list).
  • The researcher can choose among different paths from the opening screen: 
    • Research report
    • Systematic review (must be a paying user)
    • Find papers 
    • Upload papers and extract information 
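
Elicit itself does not expose these databases directly, but OpenAlex offers a free public API, so the underlying corpus can also be queried programmatically. A minimal sketch, assuming only the documented /works search endpoint (the query and email address are placeholders):

```python
import requests

# Minimal sketch: search OpenAlex, one of Elicit's underlying data sources.
# The optional "mailto" parameter routes requests into OpenAlex's polite pool.
resp = requests.get(
    "https://api.openalex.org/works",
    params={
        "search": "generative AI special education PreK-12",  # illustrative query
        "per-page": 5,
        "mailto": "you@example.edu",  # placeholder; use your own address
    },
    timeout=30,
)
resp.raise_for_status()

for work in resp.json()["results"]:
    print(work["publication_year"], work["display_name"], work.get("doi"))
```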

Search screen for Elicit showing Find Papers function and search question: "What are ways generative AI is being used to support special education students in PreK-12 grade levels?"


Find Papers (Searches)

  • Searches can target papers or clinical trials. 
  • Searches create a table/matrix covering 10 papers. The table includes 2 columns: Paper (citation/reference information) and Summary. A limited number of additional columns can be added to the table/matrix in the free version.
  • In addition to the table/matrix, a search results in a summary of the 10 papers and the option to "Chat with papers."

Elicit Search for Papers Matrix on AI innovations in Special Education. Shows 4 columns: Paper, Summary, Methodology, and Major Findings for up to 10 papers (Free version)


Research Report 

  • The Research Report option creates a "structured report." First it searches for up to 50 sources; then it produces a Fast report in around 5-10 minutes based on information pulled from 10 papers. Paid users can generate reports based on larger numbers of sources. 
  • In addition to the structured report, the Research Report option creates a matrix of the 10 included papers with 6 columns, at least 2 of which seem specific to the topic.
    Matrix of ten studies included in research on AI innovations in special education and the characteristics of studies that were included


  • Researchers can chat with generated reports to ask follow-up questions. 
  • Research reports can be downloaded as PDF files. An example report generated with the free version of Elicit for the prompt "AI innovations special education" is attached.

ResearchRabbit

Overview

  • ResearchRabbit is a free, powerful, AI-enabled search tool that recommends related academic sources and visualizes research landscapes for topics. It discovers and maps connections among papers and authors; it does not generate summaries of articles.
  • Its data sources are Semantic Scholar and OpenAlex; it uses search algorithms from the National Institutes of Health (NIH) and Semantic Scholar.

Getting Started

Researchers can...

  • Set up a free login.
  • Create and name a collection. 
  • Add papers to a collection by entering a keyword search, article title, DOI, or PMID. Researchers can also upload documents through Zotero or Mendeley integrations.
  • Choose Biomedical and Life Sciences or All Collections when searching, depending on the topic. 
  • Select one or more source documents to build connection maps.

Finding Related Papers

Select source documents to explore and then choose any or all of the following:

  • Similar Works: Usually retrieves the largest results set, using citations, references, and algorithmic analysis
  • All References: Papers cited by the selected sources
  • All Citations: Papers that cite the selected sources (both the References and Citations expansions can be reproduced via the Semantic Scholar API; see the sketch below)
  • These Authors: Other works by the same authors as those in the source collection
  • Suggested Authors: Other authors recommended by algorithms 
  • Linked Work: Other online content from blogs, Wikipedia, etc. 

Results display with abstracts and full-text links when available.
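
The All References and All Citations options map closely onto two endpoints in the Semantic Scholar Graph API, one of ResearchRabbit's data sources. A minimal sketch (the paper identifier is a placeholder to be replaced with a real DOI or Semantic Scholar ID):

```python
import requests

BASE = "https://api.semanticscholar.org/graph/v1/paper"
paper_id = "DOI:10.1234/example"  # placeholder identifier; substitute a real DOI

def fetch(endpoint: str, key: str) -> list:
    """Return paper titles from the /references or /citations endpoint."""
    resp = requests.get(
        f"{BASE}/{paper_id}/{endpoint}",
        params={"fields": "title,year", "limit": 20},
        timeout=30,
    )
    resp.raise_for_status()
    # /references wraps each item in "citedPaper"; /citations in "citingPaper".
    return [item[key]["title"] for item in resp.json().get("data", [])]

all_references = fetch("references", "citedPaper")  # papers cited by the source
all_citations = fetch("citations", "citingPaper")   # papers that cite the source
print(len(all_references), "references;", len(all_citations), "citations")
```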

Visualizations 

Graph type options include 

  • Network: Composed of dots and connecting lines (a small graph-building sketch follows this section)
    • Green dots are source papers already in your collection. Blue dots are source papers not yet added to the collection.
    • Darker dots represent more recent papers; lighter dots are older works.
    • Larger blue dots have more or stronger connections to source papers.
    • Lines show citation relationships; arrows indicate the direction of citation. "The dot receiving the arrow is being cited by the dot sending the arrow."

ResearchRabbit Collection and Similar Works for Generative AI and Special Education. The source collection includes 65 articles, which identify 204 related works; a network visualization is generated from 40 of the similar works.


  • Timeline: Composed of dots and connecting lines arranged according to date of publication

Manipulate visualizations by dragging nodes.
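
The network view is essentially a directed citation graph. For readers who want to build a comparable structure from their own reference data, here is a minimal sketch using the networkx library; the papers and citation links are made-up placeholders:

```python
import networkx as nx

# Minimal sketch of a directed citation graph like ResearchRabbit's network view.
# Edge direction follows the guide's convention: the arrow runs from the citing
# paper to the paper being cited. All paper labels here are invented.
G = nx.DiGraph()
G.add_edges_from([
    ("Paper A (2024)", "Paper B (2021)"),  # A cites B
    ("Paper A (2024)", "Paper C (2019)"),
    ("Paper B (2021)", "Paper C (2019)"),
])

# In-degree counts how often a paper is cited within this collection,
# roughly what the larger dots convey in the visualization.
for paper, times_cited in G.in_degree():
    print(paper, "is cited", times_cited, "time(s) in this collection")
```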

Saving and Exporting

  • Export graphs as PNG files.
  • Export source collections as BibTeX, RIS, or CSV.  

Additional Resources

Connected Papers

  • Connected Papers draws on the Semantic Scholar paper database, so it has access to around 200 million papers.
  • No log-in is required to try the features. Without a log-in, users can generate 2 graphs per month.
  • Users can search by keywords, paper title, DOI, or other identifiers.
  • Upon selecting a paper as a starting point, Connected Papers builds a graph based on that seed paper. The graph shows papers that are similar based on overlapping citations and references, not just the papers cited in the seed paper (a toy similarity sketch follows this list).
  • Papers that are most similar are clustered together and have stronger connecting lines.
  • Circle (Node) size indicates the number of times a paper has been cited.
  • Different viewing options include Prior Works, Derivative Works, and List View tables. Additional filters include keyword, PDF availability, and publication date.
  • Free users with a login can generate up to 5 graphs per month. Paid plans that allow unlimited graphs start at $6/month.
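
Connected Papers does not publish its exact similarity formula, but "similar based on overlapping citations and references" is the idea behind bibliographic coupling. A toy sketch of one such measure, Jaccard overlap of reference lists, with invented data:

```python
# Toy sketch: similarity from overlapping reference lists (bibliographic
# coupling), the general idea behind Connected Papers' graph. The measure
# Connected Papers actually uses is not published; reference IDs are invented.
def jaccard(a: set, b: set) -> float:
    """Share of references two papers have in common."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

references = {
    "seed paper": {"ref1", "ref2", "ref3", "ref4"},
    "candidate X": {"ref2", "ref3", "ref9"},
    "candidate Y": {"ref7", "ref8"},
}

seed = references["seed paper"]
for title, refs in references.items():
    if title != "seed paper":
        print(title, "similarity:", round(jaccard(seed, refs), 2))
```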

Shows a literature map in the Connected Papers tool built from a paper titled "Risky Teaching: Developing a Trauma-informed Pedagogy for Higher Education" and the resulting visualization graph of related papers.


  • The gold arrows point out the original paper used to create the graph.
  • The purple rectangle encloses the menu of options for changing the graph view and for filtering papers to be included in the selection.

For more information, see the Connected Papers About page.

Litmaps

  • Litmaps provides access to over 270 million research articles. 
  • The search function of Litmaps is built on open access metadata from Crossref, Semantic Scholar, and OpenAlex.
  • Litmaps is available as a website and a mobile app.
  • Users can search by keyword, author, DOI, PubMed ID, or arXiv ID.
  • Litmaps produces visualizations showing relationships among articles. These relationships are determined using the following (a toy scoring sketch follows this list):
    • Shared citations and references
    • Common authors
    • Similar text (this signal uses AI-powered semantic analysis)
  • A free use level with no log-in required provides basic search of up to 20 inputs and 2 Litmaps per month with 100 articles per map.
  • Creating a log-in adds options for setting up collections and iterating on prior collections and maps.
  • Users can start with a search and create a basic, auto-generated map.
  • More advanced uses include adding keyword tags (represented by colors) and building maps more intentionally by adding selected articles to a map.
  • A Pro Educational license is available for $10/month and has advanced search capabilities, plus unlimited inputs, articles, and Litmaps. Team and Enterprise-level accounts are also available. There is even a Teach with Litmaps program.
  • Litmaps allows for importing multiple articles at once and also syncs with Zotero.
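
Litmaps does not document how it weighs these signals, so the sketch below is purely illustrative: it combines reference overlap, author overlap, and text similarity into a single toy relatedness score. The weights, helper functions, and sample data are all invented.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy sketch combining the three relationship signals Litmaps lists:
# shared citations/references, common authors, and similar text.
# Weights and data are invented; Litmaps' real scoring is not public.
def overlap(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if (a | b) else 0.0

def text_similarity(text_a: str, text_b: str) -> float:
    tfidf = TfidfVectorizer().fit_transform([text_a, text_b])
    return float(cosine_similarity(tfidf[0], tfidf[1])[0, 0])

def relatedness(paper_a: dict, paper_b: dict) -> float:
    return (
        0.4 * overlap(paper_a["references"], paper_b["references"])
        + 0.2 * overlap(paper_a["authors"], paper_b["authors"])
        + 0.4 * text_similarity(paper_a["abstract"], paper_b["abstract"])
    )

a = {"references": {"r1", "r2"}, "authors": {"Author A"},
     "abstract": "generative AI assessment in higher education"}
b = {"references": {"r2", "r3"}, "authors": {"Author A", "Author B"},
     "abstract": "AI and learning outcomes in higher education"}
print(round(relatedness(a, b), 2))
```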

Shows a Litmaps article visualization based on Weng (2024) "Assessment and learning outcomes for generative AI in higher education: A scoping review on current research status and trends"


  • The image shows the seed article by Weng (2024) marked with a gold rectangle above the Explore Related Articles column and with a gold oval at the bottom right corner of the visualization map.
  • The map shows twenty related articles arranged according to publication date, number of citations, and relatedness to the seed article.
  • Visualizations can be downloaded and additional views are available.
  • Paid users can view more than 20 articles at a time and can add articles to their maps and lists.

For more information, see the Litmaps Features page. You can also sign up for their Substack.

Consensus

  • Consensus is built on the Semantic Scholar content database, with access to approximately 200 million papers. It uses keyword search and approximate nearest neighbor (ANN) algorithm-powered vector search across titles and abstracts to retrieve results (a toy vector-search sketch follows this list).
  • Consensus offers multiple functions at the free level. A free account provides up to 10 Pro Analyses per month.
  • A user can enter a keyword search, question, or other prompt. 
  • In response, Consensus provides a response or answer that draws from the "top 10 most relevant papers."
  • Citations to the papers are included in the response. Although the Pro Analysis summarizes 10 papers to create a response, it may not include in-text citations for all 10 papers.
  • In addition to the initial response to the user's prompt, recommended follow-up or related questions are offered. Then the 10 papers that were used to draft the response are listed.
  • Highly cited papers and paper types like preprint and systematic reviews are marked.
  • If the full-text of a paper in the list is available, the user will see an option to "Ask this Paper." This feature allows the user to query the individual paper for a summary, key takeaways, etc.
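
Vector search works by embedding titles and abstracts as numeric vectors, embedding the query the same way, and returning the nearest vectors; ANN algorithms approximate this search so it scales to hundreds of millions of papers. A brute-force toy version of the idea, using random stand-in embeddings rather than real ones:

```python
import numpy as np

# Toy sketch of the vector-search idea behind Consensus: embed papers and the
# query, then rank papers by cosine similarity. Real systems use ANN indexes
# instead of this brute-force scan, and learned embeddings instead of the
# random stand-ins generated here.
rng = np.random.default_rng(0)
paper_vectors = rng.normal(size=(1000, 384))   # 1,000 fake paper embeddings
query_vector = rng.normal(size=384)            # fake query embedding

def normalize(x):
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

scores = normalize(paper_vectors) @ normalize(query_vector)
top10 = np.argsort(scores)[::-1][:10]          # the "top 10 most relevant papers"
print(top10, scores[top10])
```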

Shows a prompt in Consensus asking for a comparison between the potential learning gains and potential negative learning outcomes of AI use in higher education. The result lists 4 bullet points addressing each side of the issue plus a conclusion; a note indicates that 10 sources were analyzed in generating the response.


  • Scrolling further, three related questions are suggested.
  • Next, the 10 articles used to create the generated response are listed with labels noting publication date, article type, number of citations, etc.

Rest of Consensus results screen showing three related questions the user could ask plus a listing of the ten articles used to generate the body of the response.


  • Finally, the "Ask this paper" feature is indicated with a purple arrow.

For more information, see the Consensus Help Center.

Scite_ (Scite.ai)

  • Without a login, users can try 2 free prompts.
  • Setting up a login begins a 7-day free trial. Personal accounts are available for $12/month. Discount prices for students and academics are available "if you recommend Scite to your institution." Institutions can subscribe to Enterprise accounts.
  • Scite has access to 187 million publications.
  • A user can enter a question or other content into a prompt box. Users can also enter a specific article title or DOI.
  • A multi-paragraph response is generated that includes linked in-text citations.
  • In addition to the AI-generated response, a side panel opens that offers two views: References and Search Strategy.
    • References lists the sources used to create the response with citations and links.
    • Search Strategy presents a list of searches used including links to the searches. The user can manually edit the searches to change the outputs.
  • Another option is the Table View, which allows the user to create an exportable, multi-column, literature review matrix with sources and extracted content.
  • The user can adjust settings before entering a query using the gear icon or by clicking on Settings.
    • "Always use references" can be selected to ensure that citations are incorporated into the generated response paragraphs.
    • The user can also select the number of references that should be consulted before generating the response.
  • With a paid account, the user can enter follow-up questions, revisit the menu, etc.
  • Additional paid account features include a user dashboard, browser extension, and a Zotero plug-in.

Scite.ai results screen for prompt: "Does social media impact mental health?" This view shows the search strategy panel with the three searches indicated along with the option to edit searches. The dashboard menu is also marked with a purple rectangle.


For more information, see the Scite_ landing page, especially the Product dropdown menu. Additionally, the Scite development team published a journal article in 2021 explaining how they built the tool.

 

Undermind Research Assistant 

  • Undermind pulls papers from arXiv, which is largely a preprint server, meaning that the sources may not have been peer reviewed. Many papers posted on arXiv go on to be published in peer-reviewed journals, but not all do (a minimal arXiv API sketch follows this list).
  • Undermind.ai works best with specific, detailed queries about topics that have been empirically researched.
  • The free tier allows the user to search (across abstracts) and revise queries based on AI-generated feedback, with a limit of 5 searches per month; only a portion of the results are displayed.
  • When satisfied with the instructions or query entered, the user can click Generate Report.
  • The Undermind developers describe the process this way: "Intelligent search adapts, follows citation trails to uncover everything." It "learns and updates its knowledge as it explores the literature."
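
Because arXiv also exposes a free public API (an Atom feed), the same preprint corpus can be queried directly. A minimal sketch using the feedparser library; the search phrase is illustrative:

```python
import feedparser

# Minimal sketch: query the public arXiv API, which covers the preprint
# corpus Undermind draws from. The search phrase is illustrative.
url = (
    "http://export.arxiv.org/api/query"
    "?search_query=all:%22generative+AI%22+AND+all:%22special+education%22"
    "&start=0&max_results=5"
)
feed = feedparser.parse(url)

for entry in feed.entries:
    # Each Atom entry includes the title, submission date, and abstract ("summary").
    print(entry.published[:10], entry.title)
```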

Shows partial Undermind.ai report for the topic AI and student learning achievement. The percentage of search completion, Discuss the results with an Expert, and the first part of the Report summary are indicated with callouts.


  • The detailed report, with embedded citations, includes the following sections: Summary, Categories, Timeline, Foundational work, Adjacent work, and References. In the free version, the report can be exported but the reference list cannot be exported easily into an external reference manager (e.g. Zotero, Mendeley).
  • After generating the report, users can "Discuss results with an expert." The expert is an AI (not a person), and users can enter questions about the references or other aspects of the report. Prompt suggestions are offered, but a user can enter other questions.
  • Paid Pro Academic subscriptions are available for $16; they remove search and chat limitations, search and analyze full text when available, and provide access to the best AI models.

For more information, see the Undermind.ai site for a video overview tutorial. You can also read the developers' whitepaper comparing search results and other features between Google Scholar and Undermind.ai.