A Free Service of IGI Global Publishing House
Below is a definition of the selected term, drawn from scholarly research resources.

What Are Pre-Trained Language Models (PLMs)?

Generative AI in Teaching and Learning
Language models trained on large, often curated, datasets of natural language text.
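To make the "pre-trained" idea concrete, here is a minimal sketch (not from the chapter): a toy bigram model whose word-pair counts are fitted once on a training corpus and then reused for prediction. Real PLMs such as BERT or GPT are neural networks trained on vastly larger corpora, but the train-once, reuse-later pattern is the same.

```python
# Toy illustration of the pre-train-then-reuse pattern behind PLMs.
# The corpus, function names, and model here are illustrative only.
from collections import defaultdict

def pretrain(corpus):
    """Count bigram frequencies over a training corpus (the 'pre-training' step)."""
    counts = defaultdict(lambda: defaultdict(int))
    for sentence in corpus:
        tokens = sentence.lower().split()
        for prev, nxt in zip(tokens, tokens[1:]):
            counts[prev][nxt] += 1
    return counts

def predict_next(model, word):
    """Reuse the pre-trained counts to predict the most likely next word."""
    followers = model.get(word.lower())
    if not followers:
        return None
    return max(followers, key=followers.get)

corpus = [
    "language models learn from data",
    "large language models generate text",
]
model = pretrain(corpus)
print(predict_next(model, "language"))  # -> models
```

The point of the sketch is the separation of phases: an expensive fitting pass over data produces a reusable artifact (here, counts; in PLMs, network weights) that downstream users can query or fine-tune without repeating the training.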
Published in Chapter:
Stepping Stones for Self-Learning: Exploring the Use of Multimodal Text- and Image-Making Generative AI Tools
Shalin Hai-Jew (Kansas State University, USA)
Copyright: © 2024 | Pages: 58
DOI: 10.4018/979-8-3693-0074-9.ch005
Abstract
One theme in the emergence of multimodal (text- and image-making) generative AIs is their value in the learning space, a potential that the broader public is only beginning to explore. This chapter examines the potential and early use of large language models (LLMs), harnessed for their broad learned knowledge and human-friendly conversational interfaces, to support self-learning by individuals and groups, based on a review of the literature, an analysis of system constraints and affordances, and abductive reasoning. Insights are shared about longitudinal and lifelong learning, with a focus on co-evolving processes between human learners and computing machines, including large language models.