ChatGPT seems to have taken the world by storm. It can answer deep philosophical questions. It can answer knowledge questions, even some research questions in computer science and other areas. It can write stories from a plot, even poems. It can solve maths equations (though not always correctly). Most of all, it can read, write, and debug code, in multiple languages too! In short, it is the entire knowledge of humanity distilled into a neural network model.
The things ChatGPT can do, and has been demonstrated to do, have created a sensation, even though it is by no means perfect and often makes mistakes in logical tasks. In this article I will share some of the fun experiments I have done with ChatGPT.
Questions about ChatGPT itself and related models
First, let us start by asking some questions about the algorithm and training used by ChatGPT and its related models like GPT-3.
What are some papers to learn about the GPT model?
GPT (Generative Pre-trained Transformer) is a popular language model developed by OpenAI. Some papers that are useful for learning about the GPT model are:
- “Improving Language Understanding by Generative Pre-Training” by Alec Radford, Karthik Narasimhan, Tim Salimans, and Ilya Sutskever. This is the original paper that introduced the GPT model.
- “Language Models are Unsupervised Multitask Learners” by Alec Radford, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei, and Ilya Sutskever. This is the paper that introduced GPT-2; it shows that language models pre-trained with unsupervised learning can perform many tasks without task-specific training.
- “Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer” by Colin Raffel, Noam Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, and Peter J. Liu. This paper presents the T5 model (not a GPT variant, but useful related background), which recasts a wide range of natural language tasks in a unified text-to-text format.
- “Language Models are Few-Shot Learners” by Tom B. Brown, Benjamin Mann, Nick Ryder, Melanie Subbiah, Jared Kaplan, and many others. This paper presents the GPT-3 model, an even larger and more powerful version of GPT, and demonstrates its ability to perform a wide range of natural language tasks given just a few examples.
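The common thread in all of these papers is the next-token-prediction ("generative pre-training") objective: learn the probability of the next token given the context, from raw text. As a toy stand-in for the transformer, the sketch below uses simple bigram counts (my own illustration, not code from any of the papers):

```python
from collections import Counter, defaultdict

def train_bigram(tokens):
    """Count, for each token, which tokens follow it in the corpus."""
    counts = defaultdict(Counter)
    for prev, nxt in zip(tokens, tokens[1:]):
        counts[prev][nxt] += 1
    return counts

def predict_next(counts, token):
    """Greedy decoding: return the most frequent continuation seen in training."""
    if token not in counts:
        return None
    return counts[token].most_common(1)[0][0]

tokens = "a b a b a c".split()
model = train_bigram(tokens)
print(predict_next(model, "a"))  # prints "b" ("a b" seen twice, "a c" once)
```

A real GPT replaces the count table with a transformer conditioned on the whole preceding context, but the training signal, predicting the next token, is the same.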
Writing poems and stories
While I don’t feel ChatGPT has much of a sense of humor or imagination, it can write poems and stories, presumably modeled on what it has seen in training.
Below are screenshots of some poems it wrote, along with the prompts used to generate them:
Write a poem about bhagavad gita chapter 2 in style of byron shelley
In the Bhagavad Gita, Chapter 2,
Krishna speaks of duty and righteousness,
Of the path of action, and the quest for truth,
Of the eternal self, and the ultimate goal of life.
With words of wisdom, and a heart full of love,
Krishna guides Arjuna on the path he must take,
To fulfill his duty, and to rise above,
The doubts and the fears that his heart does quake.
So listen, dear Arjuna, to the words of the Lord,
And let them guide you on your journey ahead,
To the path of action, and the quest for the reward,
Of enlightenment, and the liberation from death.
For in the Bhagavad Gita, Chapter 2,
Krishna teaches us the way to be true,
To our duty, and to the path we must pursue,
To find peace and happiness, and the life we are due.
Below are some stories:
Answering questions on philosophical schools
Below are some responses to philosophical questions
Answers to some coding problems
I imagine ChatGPT has been trained on code from sources like Stack Overflow, so questions whose answers are easily found on Stack Overflow should be easy for it. Conversely, it may have trouble with more complex coding problems whose solutions are not easily found there.
Below are some coding problems that ChatGPT answered. Overall, I felt it does a decent job with simple coding-related questions.
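To give a flavor of the kind of simple, Stack Overflow style task it handles well, here is a typical question and a solution of my own (an illustration, not a transcript of ChatGPT's actual output): check whether a string is a palindrome, ignoring case and punctuation.

```python
def is_palindrome(s: str) -> bool:
    """Return True if s reads the same forwards and backwards,
    ignoring case and non-alphanumeric characters."""
    cleaned = [c.lower() for c in s if c.isalnum()]
    return cleaned == cleaned[::-1]

print(is_palindrome("A man, a plan, a canal: Panama"))  # True
print(is_palindrome("ChatGPT"))                         # False
```

In my experience ChatGPT produces answers of roughly this shape for such questions, usually with a short explanation attached.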
Maths problems and puzzles
Unfortunately ChatGPT was a disappointment when it came to puzzles and many maths problems, although it did solve a few. That is not surprising, for neural networks cannot easily do logic. Perhaps explicit mechanisms for handling logic have to be built into the model's architecture before it can do better.
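For contrast, the kind of arithmetic puzzle that often trips up language models is trivial for a few lines of deterministic code. Take the classic bat-and-ball puzzle (my own example, not one from my ChatGPT sessions): a bat and a ball cost $1.10 together, and the bat costs $1.00 more than the ball.

```python
# Solving the two linear equations
#   bat + ball = 1.10
#   bat - ball = 1.00
# gives ball = (1.10 - 1.00) / 2.
ball = (1.10 - 1.00) / 2
bat = ball + 1.00
print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")  # ball = $0.05, bat = $1.05
```

The intuitive (wrong) answer of $0.10 for the ball is exactly the kind of shortcut a pattern-matching model is prone to take, whereas the algebra leaves no room for it.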
Research-ish questions or learning about a new topic
Sometimes ChatGPT can help save time when researching a topic, although here too the lack of citations (sources for its answers) is a concern.
However, its knowledge of facts is generally good, and it is sometimes better than Google at presenting information in a systematic, easy-to-read form.
Witty answers and sense of humor
This proved to be a disappointment: it does not generate witty answers. Maybe a sense of humor needs to be explicitly trained into it in future releases.
Some other interesting articles about ChatGPT
The articles below (and many others) also explore the capabilities and limitations of ChatGPT in some depth:
Conclusion
In this article we have briefly looked at ChatGPT's answers to a wide variety of questions. While it is not perfect, and is sometimes simply wrong, its possibilities are very promising. We hope this is the start of a future with general-purpose AI: an AI that carries the whole of human knowledge within it and can answer just about any question at least as well as an intelligent human can.