Artificial Intelligence: Generative AI Limitations

Limitations of ChatGPT

ChatGPT has many useful applications, but it cannot do everything. Where it excels, it performs very well; where it fails, the results can be seriously misleading. For this reason, among others, we recommend against using ChatGPT as a source of information. Think of it like Wikipedia: it can help lead you to quality sources, but it shouldn't be something you cite in your paper.

Reasons:

1. ChatGPT doesn't "know" anything; it simply generates the most probable response to the prompt it is given. It is better at verifying information than it used to be, and some generative AI tools will provide sources, but its output is still prediction rather than understanding.

2. ChatGPT can cite where it is getting its information if you ask it to in a prompt, but you still need to verify those sources yourself; cited references are sometimes inaccurate or do not exist.

3. ChatGPT is still wrong at times. It can only respond to the prompts it is given, so you will need to learn how to write clear, specific prompts to get useful results.

ChatGPT is an effective and versatile tool, but that does not mean it can do everything. It is important to learn the limitations of the software, and of any language model, before applying it in the workplace or the classroom.

Is it safe to use ChatGPT for your task?

Acknowledgement of adaptation

This guide was adapted from the Black Hawk College Library ChatGPT LibGuide. Use the following link to view the original guide: https://bhc.libguides.com/ChatGPT