The correct answer is (c) Generative Pre-Trained Transformer.
Explanation:
● Generative: It can generate new content, such as text or code.
● Pre-Trained: It was trained on a massive dataset of text before being fine-tuned for specific tasks.
● Transformer: It uses the "transformer" neural network architecture, which is well suited to processing sequences of data such as language.
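The heart of the transformer architecture is self-attention, where every position in a sequence looks at every other position. Below is a minimal, illustrative sketch in Python with NumPy; it is a toy single-head version, not OpenAI's actual implementation, and the function name and random toy data are chosen here for illustration.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Toy single-head attention: each position produces a weighted
    mix of all value vectors, with weights from query-key similarity."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # pairwise similarity, scaled
    # softmax over the last axis (numerically stabilized)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # weighted combination of values

# Toy "sequence" of 3 tokens, each represented by a 4-dimensional vector
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(x, x, x)  # self-attention: Q = K = V
print(out.shape)  # (3, 4): one output vector per input token
```

A real GPT model stacks many such attention layers (with multiple heads, learned projections, and feed-forward layers), but the core idea of mixing information across the whole sequence is the same.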
Information Booster:
● Developed by OpenAI.
● Transformer Architecture – Introduced by Google researchers in the paper "Attention Is All You Need" (2017).
● Tokenization – The process of breaking down text into smaller units (tokens) for the model to process.
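To make the tokenization idea concrete, here is a deliberately simplified tokenizer sketch; real GPT models use byte-pair encoding (BPE) rather than this word-and-punctuation split, and the function name is made up for illustration.

```python
import re

def simple_tokenize(text):
    """Toy tokenizer: splits text into words and punctuation marks.
    Real GPT models use byte-pair encoding (BPE), whose tokens are
    often sub-word fragments rather than whole words."""
    return re.findall(r"\w+|[^\w\s]", text)

tokens = simple_tokenize("GPT generates text, token by token.")
print(tokens)
# ['GPT', 'generates', 'text', ',', 'token', 'by', 'token', '.']
```

The model never sees raw characters; it sees a sequence of token IDs, predicts the next token, and the predicted tokens are decoded back into text.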
Additional Knowledge:
● Claude – AI chatbot by Anthropic.
● Gemini – AI model by Google (formerly Bard).