AI tools can be a great option for research in many cases. One thing to keep in mind is that you need to evaluate the information generative AI gives you. This includes scholarly articles, books, book chapters, and other resources recommended by a generative AI tool like Microsoft Copilot or ChatGPT.
Microsoft Copilot and other generative AI tools like ChatGPT currently do a terrible job of finding scholarly sources. At Lavery Library, this is what we recommend to Fisher students:
Use generative AI to find ideas. Use the library to find sources.
Gen AI tools like Microsoft Copilot and ChatGPT will "hallucinate" sources. In other words, they can make up fake information, which can include:
At Lavery Library, we are continually testing Copilot, ChatGPT, and other gen AI tools. Over and over again, we see the same things in the results: hallucinated articles, and undergraduate student papers and other sources mislabeled as scholarly, peer-reviewed articles.
One of the best ways to protect yourself against AI hallucinations is to fact-check any sources generative AI provides. If generative AI recommends an article and calls it scholarly or peer-reviewed, you should make sure the article actually exists and make sure it really is scholarly.
This video is a quick how-to on making sure an article actually exists by searching for its title in the Big Red Box.
This video shows you four things you can check to make sure an article is scholarly, and one trap to avoid.
You can explore the rest of this research guide, AI Tools and Resources, to learn how to use AI in research, whether you're a student or a faculty member. The library also offers these resources:
These articles show how ChatGPT and similar tools can hallucinate:
A note from the Librarians:
You can always check with Fisher experts about the use of AI in your courses or research. These experts include Fisher librarians, academic advisors, Writing & Tutoring, DePeters Center staff, and your faculty.
If you need help citing your use of AI, consult the Library's Citation Guide: