What are AI‑generated or “hallucinated” sources?
AI‑generated or hallucinated sources are citations that look legitimate—they may include a realistic article title, author names, journal titles, volume/issue numbers, page ranges, and even a DOI—but do not actually exist. They are created by artificial intelligence systems when generating text, particularly for tasks such as literature reviews, bibliographies, or annotated references.
These citations are not intentionally deceptive. Rather, they are a byproduct of how large language models generate language.
Why do AI tools generate non‑existent citations?
AI systems that generate text (such as large language models) do not search databases in real time unless explicitly designed to do so. Instead, they generate language by predicting which words are most likely to come next, based on patterns in the text they were trained on.
When asked to provide citations, the AI may assemble realistic-looking details (plausible author names, journal titles, volume and issue numbers, page ranges, and even DOIs) into a reference it has never actually seen.
The result is a citation that appears correct but lacks a corresponding publication.
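As a concrete illustration of why plausible form is not evidence of existence, here is a toy Python sketch (all names, journals, and numbers are invented for demonstration; this is not output from any real AI model) that assembles a realistic-looking reference at random. Every piece looks like something a citation should contain, yet the article it describes does not exist.

```python
# Toy illustration: assembling a plausible-looking but fabricated citation.
# All names, journals, and numbers below are made up for demonstration.
import random

AUTHORS = ["Smith, J.", "Nguyen, T.", "Garcia, M.", "Okafor, C."]
JOURNALS = ["Journal of Applied Learning", "Review of Educational Psychology",
            "Studies in Higher Education Research"]
TOPICS = ["note-taking strategies", "online discussion forums", "peer feedback"]

def fabricate_citation():
    """Stitch plausible fragments into a reference that was never published."""
    author = random.choice(AUTHORS)
    journal = random.choice(JOURNALS)
    topic = random.choice(TOPICS)
    year = random.randint(2005, 2023)
    volume = random.randint(5, 40)
    pages = f"{random.randint(1, 300)}-{random.randint(301, 350)}"
    doi = f"10.{random.randint(1000, 9999)}/{random.randint(100000, 999999)}"
    return (f"{author} ({year}). The effects of {topic} on student outcomes. "
            f"{journal}, {volume}, {pages}. https://doi.org/{doi}")

print(fabricate_citation())  # looks legitimate, but no such article exists
```

A real language model is far more sophisticated than this toy, but the underlying issue is the same: the output is built to look right, not checked against a database of actual publications.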
Are the authors or journals ever real?
Yes—this is one of the most confusing aspects of AI‑generated citations.
Sometimes the author is a real researcher in the field, the journal genuinely exists, or the topic closely matches work the author has actually published, yet the specific article was never written.
Because parts of the citation may be accurate, users may assume the entire citation is legitimate when it is not.
Why is this a problem for students and researchers?
Using a hallucinated source can undermine the credibility of your work, mislead readers who try to locate the reference, and raise serious academic integrity concerns.
For these reasons, all AI‑generated citations must be verified before being used in academic work.
How can I check whether a citation is real using Google Scholar?
Google Scholar is one of the easiest tools for verifying citations. Copy the exact article title, paste it into the search box at scholar.google.com (placing the title in quotation marks helps), and run the search.
If the citation is real, you will typically see the article listed with a matching title, author names, journal, and publication year.
If no results appear, try searching without quotation marks, searching for a few distinctive keywords from the title, or searching by the author's name.
If the article still does not appear, it is likely a hallucination.
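Google Scholar has no public API, but if you need to check a list of references programmatically, an open bibliographic database can serve the same purpose. The short Python sketch below is one possible approach: it queries Crossref's public REST API (api.crossref.org) for a citation's title and prints the closest matches. The example title and the use of the requests package are illustrative assumptions, not part of the steps above.

```python
# Sketch: look up a citation title in Crossref's public REST API.
# Assumes the "requests" package is installed; the example title is hypothetical.
import requests

def search_crossref(title, rows=3):
    """Print and return the top Crossref matches for an article title."""
    resp = requests.get(
        "https://api.crossref.org/works",
        params={"query.bibliographic": title, "rows": rows},
        timeout=30,
    )
    resp.raise_for_status()
    items = resp.json()["message"]["items"]
    for item in items:
        found_title = item.get("title", ["(no title)"])[0]
        year = item.get("issued", {}).get("date-parts", [[None]])[0][0]
        print(f"{found_title} ({year}) DOI: {item.get('DOI')}")
    return items

# Example with a hypothetical title: if nothing similar comes back,
# treat the citation as suspect and verify it by hand.
search_crossref("Effects of sleep deprivation on working memory in college students")
```

Keep in mind that Crossref indexes most journal articles with DOIs but not every publication, so a missing result is a reason to investigate further, not proof by itself.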
What if Google Scholar shows the author or journal but not the article title?
This is a common sign of an AI‑generated citation.
For example, the author may have published real articles in that journal or on a similar topic, but nothing with the exact title in your citation.
In such cases, the citation is almost certainly incorrect or fabricated.
How can I use a DOI to verify a citation?
If a citation includes a DOI (Digital Object Identifier), you should always verify it.
Steps: copy the DOI exactly as given, go to https://doi.org, and paste it into the resolver search box (or type https://doi.org/ followed by the DOI into your browser's address bar).
A valid DOI should resolve to the publisher's page for that exact article, with a title, author list, and journal that match the citation.
If the DOI returns an error such as "DOI not found," or resolves to a completely different article than the one cited, then the citation is incorrect or hallucinated.
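When you have several DOIs to check, the same resolution step can be scripted. The Python sketch below is an illustration only (the DOI shown is a placeholder and the requests package is assumed): it asks https://doi.org whether a DOI is registered by checking for the redirect a valid DOI produces.

```python
# Sketch: check whether a DOI resolves at https://doi.org.
# The DOI below is a placeholder; replace it with the DOI you want to verify.
import requests

def doi_exists(doi):
    """Return True if https://doi.org/<doi> redirects to a publisher page."""
    resp = requests.get(
        f"https://doi.org/{doi}",
        allow_redirects=False,   # a registered DOI answers with a redirect
        timeout=30,
    )
    # 3xx = the DOI is registered; 404 = the resolver does not know it.
    return 300 <= resp.status_code < 400

doi = "10.1234/example.doi"  # placeholder
if doi_exists(doi):
    print("DOI resolves; still open the landing page and confirm the details match.")
else:
    print("DOI does not resolve; treat the citation as suspect.")
```

Resolving is only half the check: a DOI can be real but attached to a different article than the one cited, so always compare the landing page against the citation itself.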
Can a citation be partially correct but still unusable?
Yes. A citation may contain a real author name, a real journal title, and a plausible-sounding article title, yet refer to a work that was never published, or it may point to a real article with the wrong year, volume, or page numbers.
Even partial inaccuracies make a citation unreliable. Academic work requires fully accurate and verifiable sources.
What should I do if I discover a hallucinated source?
If you identify a citation that does not exist, remove it from your paper and replace it with a legitimate source that you have located and read in full.
Academic integrity and responsibility for your sources
Submitting a paper that includes in-text citations or footnotes to hallucinated (non-existent) sources makes it clear that the text was not written independently by the student. You should only cite sources that you have actually obtained and read. If a source does not exist, it could not possibly have been read, and its inclusion is clear evidence that the work does not reflect genuine research or authorship. In addition, responsible scholarship requires reading the entire source you are citing—not just the abstract. Having access only to an abstract, summary, or citation record is not sufficient grounds to cite a work in an academic paper. Citations signal direct engagement with the full text, arguments, evidence, and conclusions of a source.
Key takeaway
AI tools cannot be trusted to generate accurate citations without verification. Don’t cite sources you haven’t seen. If there is any doubt, confirm sources using Google Scholar, DOI resolvers, and library databases before relying on them in academic work.
For further writing assistance, you can contact one of Liberty's Writing Centers.