What is ChatGPT?

ChatGPT is an Artificial Intelligence (AI) chatbot. There are two things here: AI, which stands for Artificial Intelligence, and chatbot. By Artificial Intelligence, we mean any machine or system that shows qualities similar to human intelligence, such as understanding sentences and answering them, interpreting images and acting on what it sees, or, more generally, understanding natural language and acting on it.

AI is used everywhere, from YouTube and Netflix to Amazon and Google Search. But keep in mind that this intelligence is not the same as human intelligence, where a person can make decisions with very little data or none at all. A machine with that kind of general, self-directed intelligence would be called Artificial General Intelligence (AGI), and no machine today is capable of thinking on its own in that way.

These AI chatbots are built on a language model, which uses Natural Language Processing to produce answers to the questions it is given. The more data a system is trained on, and the more training it undergoes, the more natural its responses will be.
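
To make the idea of a language model concrete, here is a toy sketch in Python. Real models such as GPT use neural networks trained on enormous corpora; this little bigram counter only illustrates the core idea of predicting the next word from the words seen so far, and the tiny corpus in it is made up.

```python
# A toy illustration of what a language model does: predict a likely next word
# from the words seen so far. The tiny corpus below is made up; real systems
# like GPT learn from billions of words with neural networks instead of counts.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat . the cat ran".split()

# Count which word tends to follow which word (a simple "bigram" model).
next_word_counts = defaultdict(Counter)
for current, following in zip(corpus, corpus[1:]):
    next_word_counts[current][following] += 1

def predict_next(word):
    """Return the word most often seen after `word` in the training corpus."""
    candidates = next_word_counts.get(word)
    return candidates.most_common(1)[0][0] if candidates else None

print(predict_next("the"))  # -> "cat": "the cat" occurs more often than "the mat"
```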

Why do people consider ChatGPT a master of every art?

Note: This piece will contain very little spice and a lot more technical detail.

These days, ChatGPT is making waves by seemingly solving every problem and being presented as the pinnacle of artificial intelligence. In November 2022, OpenAI released ChatGPT, powered by its GPT-3.5 model; it reached one million users within just five days and passed 100 million within about two months.

Microsoft, which has been an investor in the company since 2019, deepened that partnership in 2023 with a further multibillion-dollar investment, and OpenAI has been working on version 4 of GPT, which is expected to have very impressive capabilities and features. Let’s see what new things can be done with it and how true its claims are. Before that, I would like to explain to the average person what it actually is.

What is a language model? 

Whenever a chatbot or artificial intelligence system is created, it has to be taught everything, like a young child, except that the amount of material it must learn from is vastly larger. For example, in 2018 the BookCorpus dataset was used to train the first version of GPT; it contained text equivalent to roughly 7,000 books.

This dataset held roughly a billion words and was assembled by researchers at the University of Toronto. The other key ingredient in a language model is the neural network.
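
To get a feel for what "a billion words of training data" means in practice, here is a small sketch that counts the words in a plain-text corpus file. The file name is hypothetical; BookCorpus itself was distributed as thousands of plain-text books, and figures like the one above come from counts of this kind.

```python
# A rough illustration of measuring the size of a text corpus: count its words.
# The file name below is hypothetical and only shows how such a count is taken.
def count_words(path: str) -> int:
    total = 0
    with open(path, encoding="utf-8") as f:
        for line in f:
            total += len(line.split())
    return total

# Example (hypothetical file):
# print(count_words("bookcorpus_sample.txt"))
```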

What is a Neural Network?

All words have some kind of relationship or connection with each other; “mother” and “family”, for example, are closely related. An Artificial Neural Network (ANN) is the structure that captures and processes these relationships, and it is the essence of the language model that powers any artificial intelligence system.

Just as the human brain has a biological neural network, in which signals passed between neurons create billions of connections that give rise to human intelligence, an ANN links simple processing nodes into an artificial network that relates words to one another. These networks are initially built from training data, and new words are added to them over time.
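
One common way such word relationships are represented is with embeddings: each word gets a vector of numbers, and related words end up with similar vectors. The following minimal sketch shows the idea with tiny hand-made vectors, which are purely illustrative; real models learn vectors with hundreds of dimensions from billions of examples.

```python
# A minimal sketch of word embeddings: each word is a vector, and related words
# have similar vectors. These three-dimensional vectors are invented purely for
# illustration; real models learn much larger vectors from data.
import math

embeddings = {
    "mother": [0.9, 0.8, 0.1],
    "family": [0.8, 0.9, 0.2],
    "engine": [0.1, 0.2, 0.9],
}

def cosine_similarity(a, b):
    """Similarity between two vectors: near 1.0 means closely related."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

print(cosine_similarity(embeddings["mother"], embeddings["family"]))  # high (related)
print(cosine_similarity(embeddings["mother"], embeddings["engine"]))  # low (unrelated)
```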

There are different ways to train an AI system, including Supervised Learning, Unsupervised Learning, and Reinforcement Learning, but the details of these methods are beyond the scope of this discussion.

What is GPT?

GPT is a language model, the details of which you have already read above, and it stands for Generative Pre-trained Transformer. This language model is trained on a large amount of data and can respond in a human-like way. Now, what does the term “Transformer” mean? It is a deep-learning architecture, and deep learning itself is a branch of machine learning.

It assigns weights to the connections between the nodes of the neural network, and these weights are what turn an input into the final answer at the output. The Transformer is the backbone of applications such as Natural Language Processing (which is what this chatbot does) and is used in Computer Vision as well. I am expressing many things in extremely abbreviated terms so that the length of the text does not bore you.
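
The key operation inside a Transformer is attention: every position in the input is scored against every other position, and those scores, after a softmax, become the weights used to mix information together. Here is a minimal, purely illustrative sketch with random stand-in matrices; a real model learns these matrices from data.

```python
# A minimal sketch of the attention step inside a Transformer: each position in
# the input is scored against every other position, the scores become weights
# via a softmax, and the output is a weighted mix. The random matrices here are
# stand-ins for values a real model would learn during training.
import numpy as np

rng = np.random.default_rng(0)
seq_len, dim = 4, 8                         # 4 tokens, each an 8-dimensional vector
queries = rng.normal(size=(seq_len, dim))
keys = rng.normal(size=(seq_len, dim))
values = rng.normal(size=(seq_len, dim))

scores = queries @ keys.T / np.sqrt(dim)    # how strongly each token relates to the others
weights = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)  # softmax per row
output = weights @ values                   # each token becomes a weighted mix of all tokens

print(weights.round(2))                     # each row sums to 1.0
```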

The first version had about 117 million parameters (the weights the model learns during training). When GPT-2 was about to arrive, the same things were being said as are now being said before the arrival of version 4: that it would unleash a flood of unverified information and news, that political parties would use it, and that misinformation would spread among people.

In November 2019, OpenAI released the full version 2, which was trained on about 40 GB of text drawn from roughly 8 million documents, collected from web pages that had been linked and upvoted on Reddit.

Then came GPT-3, which was a big leap: it used a training set of roughly 570 GB of text and has 175 billion parameters (the learned weights of the neural network)…
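
To put 175 billion parameters in perspective, here is a back-of-the-envelope calculation. The figure of 2 bytes per parameter (16-bit floating point) is an assumption made here for illustration, not an official number from OpenAI.

```python
# A back-of-the-envelope calculation of what 175 billion parameters implies for
# storage alone, assuming 2 bytes per parameter (16-bit floats).
parameters = 175e9
bytes_per_parameter = 2                      # 4 if stored as 32-bit floats
total_gb = parameters * bytes_per_parameter / 1e9
print(f"~{total_gb:.0f} GB just to store the weights")   # ~350 GB
```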

Where does ChatGPT get so much information from?

Now, ChatGPT actually uses the GPT language model; the version behind the chatbot is currently 3.5, with version 4 just arriving (more on that below). The details of how language models are trained and where their data comes from are explained above.
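
For readers who want to try the same model programmatically, here is a minimal sketch using OpenAI's Python client as it worked in the GPT-3.5 era; newer versions of the library expose a different interface, and the API key shown is a placeholder.

```python
# A minimal sketch of calling the GPT-3.5 model behind ChatGPT through OpenAI's
# API, using the Python client of the GPT-3.5 era; newer client versions differ.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder: set your own key

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Explain what a language model is in one sentence."}],
)
print(response.choices[0].message.content)
```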

What’s new in GPT-4?

Let’s talk about GPT-4. In the run-up to its release, the same thing happened as before the releases of versions 2 and 3: speculation and rumors.

OpenAI’s co-founder and CEO, Sam Altman, also pushed back on the rumors about version 4 on social media, saying that people had attached too many expectations to it and would be disappointed upon its release. Version 4 was then released in March 2023, and it defies most of the rumors that circulated beforehand. For now there is little difference for the average person: the image-to-text (image input) option is available only to software developers through the API, and video generation is not available at all.

The maximum amount of text the model can take in at once (its context window) has been increased several-fold, from roughly 4,000 tokens in GPT-3.5 to as many as 32,000 tokens in GPT-4, which gives the chatbot more context, helps it follow our conversations better, and results in more accurate answers. Microsoft has integrated this version into its Bing search. Nowhere in this version is there any mention of 100 trillion parameters or video generation. It is certainly a big leap, but it also has limitations; the biggest issues are biased answers and unverified information…
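
Context windows are measured in tokens rather than words. The sketch below uses the tiktoken library to count tokens; "cl100k_base" is the encoding used by the GPT-3.5 and GPT-4 family, and the sample sentence is arbitrary.

```python
# Counting tokens with tiktoken; context-window limits are expressed in these
# token units, not in words or characters.
import tiktoken

encoding = tiktoken.get_encoding("cl100k_base")
text = "How many tokens does this sentence use?"
tokens = encoding.encode(text)
print(len(tokens), tokens)   # a handful of integer token IDs
```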

Comparing GPT-4 with Google and Meta:

If we compare OpenAI’s ChatGPT (backed by Microsoft), and especially version 4, with Google’s efforts: Google is working on Imagen Video, a technology that converts text into videos with good results. You can see the sample videos on Google’s Imagen Video page, and AI-generated video from the GPT side, whenever it arrives, is unlikely to look very different. In addition, Google recently released an API for its large language model PaLM, which businesses can use to build their own chatbots as an alternative to GPT-4.

Similarly, Facebook’s parent company Meta is working on multimodal systems of the same kind, which are expected to include creating videos from text and images. You can check their website for more information.

As for GPT-4, the company’s official release states that for now only text-to-text and image-to-text features are available, and the image-recognition capability, which identifies an image and describes it, is available only to software developers through the API. Several other artificial intelligence systems already handle image-related tasks such as generating images from text, for example DALL-E 2 and Midjourney. There is no mention of video in this release.

The company has also acknowledged many shortcomings in version 4, including buggy programming code, incorrect medical advice, and factual errors. Version 4 is still undergoing testing to correct thousands of errors, while taking care that it does not surpass human intelligence. So far, no artificial intelligence system has been developed that can surpass human intelligence.

As for where ChatGPT can be used, you may have read multiple articles about it, but I will write one specifically about where it can be best used, especially for students, researchers, bloggers, and programmers.

Precautions for using it:

Its answers will not be 100% accurate because it does not have its own intelligence; it works on existing data such as Wikipedia, online books, articles, or discussion forums.

It cannot be trusted for medical or technical matters, and you should contact experts in those fields. For researchers or anyone using it for professional work, it is crucial to verify the information obtained from ChatGPT.

Final Words

ChatGPT is an artificial intelligence chatbot powered by the GPT language model. The GPT, or Generative Pre-trained Transformer, is a language model trained on large datasets using artificial neural networks. These networks are created from training data and form a backbone for applications such as natural language processing.

ChatGPT currently uses version 3.5 of the GPT language model, and version 4 has now been released. GPT-4 generated much speculation and anticipation, but OpenAI’s co-founder and CEO debunked the rumors and advised against overestimating its capabilities. New features have arrived, such as image-to-text input for software developers, but for now the average person will notice little difference.
