You can set up your own account to use these tools.
These tools do not link to Lavery Library's collections. You may need to take extra steps to track down the full text (PDF) of an article you discover while using these tools. You can follow the steps in Lavery Library's tutorial Finding Full Text: Article Title.
For more information you can visit St. John Fisher University's AI Toolkit: How To Access AI Tools.
St. John Fisher University has a policy on the use of generative artificial intelligence (AI):
You can use AI tools like a personal research assistant who is helping you make a research plan. (Just don't bother asking it to help you find scholarly sources.) It’s not the same thing as meeting with a librarian or your course instructor, but it can be helpful.
Lavery Library can help. Contact the Fisher librarians, who can help you explore research topics, fact-check information from generative AI tools, find scholarly sources, and use generative AI tools when your instructors allow it.
You can try these prompts in Microsoft Copilot, ChatGPT, and other text-based AI platforms. Copy and paste the following text, customizing it for your topic:
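One prompt you might adapt (this wording is only an illustration, not an official Fisher or Lavery Library prompt; replace the bracketed topic with your own):

"I am a college student planning a research paper on [the effects of social media on sleep]. Help me brainstorm a research plan: ways to narrow this topic, key concepts and related search terms I could use in library databases, and questions I should consider. Do not suggest specific articles or books."

That last instruction matters: asking the tool for ideas and search terms plays to its strengths, while asking it for sources invites made-up results.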
Did you find articles?
Wait! ChatGPT and other generative AI tools can make articles and links seem like scholarly, academic sources when they really are not. Sometimes they even make up fake articles and books.
Check and see if the sources are real. (A librarian can help.) If they are, find and read the full text (PDF or hard copy). We show you how to do this on Evaluating & Citing:
Microsoft Copilot and other generative AI tools like ChatGPT currently do a terrible job of finding scholarly sources. They may improve in the future, but for now:
Use generative AI to find ideas. Use the library to find sources.
Generative AI tools like ChatGPT will "hallucinate" sources. In other words, AI will create fake information. This can include recommending sources that don't exist and misrepresenting a real source by inaccurately reporting what it says.
This is one of the biggest reasons to use the library to find sources, and to evaluate and fact-check the sources you find using an AI tool.
We are continually testing Copilot, ChatGPT, and other generative AI tools. Over and over again we see the same thing: AI will say something is a scholarly, peer-reviewed article when it's not. Sometimes it's an undergraduate student paper. Other times it's something else, such as an ebook chapter or an online report.
Even if you ask for scholarly, peer-reviewed sources, it is your responsibility to find, read, evaluate, and cite any sources these tools suggest. Librarians can help you do this. We can also help you use generative AI when it is allowed in your coursework.
Chatbots aren't neutral. They can produce results that demonstrate political bias, racism, sexism, and other biases.
These tools are bad at creating citations. They don’t properly format citations in APA, MLA, or other styles.
Hallucination is one of the biggest reasons to evaluate and fact-check the sources you find using an AI tool. Visit Evaluating & Citing to learn more:
You should be cautious about entering any copyrighted material into the prompt of any generative AI tool.
Here are a few scenarios to consider: