
Artificial Intelligence | ChatGPT for Students

Fact Check

Fact-Checking is Always Needed


AI Hallucination

ChatGPT sometimes makes things up; the AI term for this is hallucination. ChatGPT hallucinates in part because these systems are probabilistic (they incorporate randomness when generating text), not deterministic (always producing the same, fixed output for a given input). The model predicts plausible next words rather than looking up facts it knows to be true, so a fluent-sounding answer can still be wrong.
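
To see the difference, here is a toy Python sketch with made-up next-word probabilities (the words and numbers are invented for illustration and do not come from any real model). A deterministic system would always pick the most likely word; a probabilistic one sometimes samples a less likely, and possibly wrong, word.

    import random

    # Toy next-word probabilities a model might assign after the prompt
    # "The capital of Australia is". The words and numbers are invented
    # purely for illustration.
    next_word_probs = {"Canberra": 0.6, "Sydney": 0.3, "Melbourne": 0.1}

    # Deterministic: always pick the single most likely next word.
    deterministic_pick = max(next_word_probs, key=next_word_probs.get)

    # Probabilistic: sample a next word according to its probability, so a
    # less likely (and here, incorrect) word is sometimes chosen.
    words = list(next_word_probs)
    weights = [next_word_probs[w] for w in words]
    probabilistic_pick = random.choices(words, weights=weights, k=1)[0]

    print("Deterministic pick:", deterministic_pick)
    print("Probabilistic pick:", probabilistic_pick)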

 

Web Search Results as Grounding

When an AI model is combined with a search engine, it hallucinates less. That's because the system can search the web, read the pages it finds, and have the model summarize those pages, with links back to the source pages.

It may sometimes make a mistake in the summary, so it's always good to follow the links to the web results it found.

All of the major models now include the ability to search the web.
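
As a rough illustration of that workflow, the Python sketch below strings the steps together. The helper functions search_web, fetch_page, and ask_model are stand-in stubs invented for this example, not any chatbot's real API; a real grounded system would call an actual search engine and language model in their place.

    # A minimal sketch of web-search grounding, using stand-in stubs.
    def search_web(query):
        # Stub: a real system would query a search engine here.
        return [{"url": "https://example.org/page1", "title": "Example result"}]

    def fetch_page(url):
        # Stub: a real system would download and extract the page text.
        return "Text of the page at " + url

    def ask_model(prompt):
        # Stub: a real system would send the prompt to a language model.
        return "Summary of the sources, with links."

    def answer_with_grounding(question):
        results = search_web(question)                        # 1. search the web
        pages = [fetch_page(r["url"]) for r in results[:3]]   # 2. read the pages found
        # 3. ask the model to answer only from those pages and cite their URLs
        prompt = (
            f"Question: {question}\n\n"
            "Answer using only the sources below, and cite each source's URL.\n\n"
            + "\n\n".join(f"Source ({r['url']}):\n{p}" for r, p in zip(results, pages))
        )
        return ask_model(prompt)

    print(answer_with_grounding("When was the Hoover Dam completed?"))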

 

Scholarly Sources as Grounding

There are also systems that combine language models with scholarly sources. For example:

  • Elicit
    A research assistant using language models like GPT-3 to automate parts of researchers’ workflows. Currently, the main workflow in Elicit is Literature Review. If you ask a question, Elicit will show relevant papers and summaries of key information about those papers in an easy-to-use table. 
  • Consensus
    A search engine that uses AI to search for and surface claims made in peer-reviewed research papers. Ask a plain-English research question, and get word-for-word quotes from research papers related to your question. The source material used in Consensus comes from the Semantic Scholar database, which includes over 200M papers across all domains of science (see the sketch after this list for a simple example of searching that database).
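
As a rough illustration of scholarly grounding, the Python sketch below queries the public Semantic Scholar Graph API, the database Consensus draws on, for papers related to a plain-English question. The endpoint and field names follow Semantic Scholar's public API documentation and may change; this is only a sketch, not how Elicit or Consensus actually work internally.

    import requests

    # Search the Semantic Scholar Graph API for papers related to a question.
    question = "Does exercise improve sleep quality?"
    resp = requests.get(
        "https://api.semanticscholar.org/graph/v1/paper/search",
        params={"query": question, "fields": "title,year,abstract,url", "limit": 5},
        timeout=30,
    )
    resp.raise_for_status()

    for paper in resp.json().get("data", []):
        print(paper.get("year"), "-", paper.get("title"))
        print("   ", paper.get("url"))
        # A grounded system would pass these titles and abstracts to a language
        # model and ask it to summarize or quote them, keeping the links as citations.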