Published Online: January 21, 2025
Language models are increasingly becoming tools that enable machines to understand, interpret, and predict human language as part of contextually appropriate communication. This study examines the progress of language models commonly described as Generative Pre-trained Transformers (GPTs). The R 4.2.2 console was used to identify keywords in the topics of 72 blog posts that appeared on the OpenAI website. The main terms identified included "learning," "openAI," "models," and "model." Correlation analysis further revealed associations among terms such as "appreciated," "creativity," "flex," "combine," and "connections." Scholars, researchers, and professionals working on business and information technology applications can expand their knowledge base by reading this article.
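The keyword analysis described above can be illustrated with a minimal sketch. The study itself used the R 4.2.2 console; the Python version below only shows the general idea of tokenizing a corpus and counting term frequencies. The sample texts are hypothetical stand-ins, not the actual OpenAI blog posts.

```python
# Minimal sketch of term-frequency extraction over a small corpus.
# NOTE: the corpus below is hypothetical; the study analyzed 72 OpenAI blog posts in R.
import re
from collections import Counter

def term_frequencies(texts):
    """Tokenize each text into lowercase words and count occurrences across all texts."""
    counts = Counter()
    for text in texts:
        counts.update(re.findall(r"[a-z]+", text.lower()))
    return counts

# Hypothetical example texts (assumptions, not data from the study).
corpus = [
    "Learning with models: how the model is trained",
    "OpenAI models combine learning and creativity",
]
freqs = term_frequencies(corpus)
top_terms = freqs.most_common(3)  # most frequent terms, analogous to the study's keywords
```

A correlation analysis like the one reported in the abstract would then compare how often pairs of such terms co-occur across documents.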
Keywords
Context; Text; Transformer; Trained; Predictive