Conceptual modeling and large language models: impressions from first experiments with ChatGPT

HG Fill, P Fettke, J Köpke - Enterprise Modelling and Information Systems Architectures (EMISAJ), 2023 - folia.unifr.ch
Since OpenAI publicly released ChatGPT in November 2022, many ideas have emerged as to which applications this type of technology could support. At its core, ChatGPT is a conversational artificial intelligence, meaning that it can engage in a dialogue to respond to user input given in natural language (Campbell 2020). Although such systems have been well known since Weizenbaum's Eliza program (Weizenbaum 1966) and are today widely deployed in practice under the popular term chatbots, ChatGPT has a particular set of properties that contributed to its wide reception and the recent hype surrounding it.

In contrast to previous chatbots, ChatGPT does not retrieve responses from a knowledge base pre-defined by a human user. Rather, it is based on a pre-trained generative language model, which creates responses based on the patterns that the user supplies as input. A language model assigns a probability to every word in a vocabulary that could follow a given input sequence. The underlying word embeddings are trained using artificial neural networks to learn a probability distribution from given texts in an unsupervised fashion, i.e., such that no additional human input or labeling is required. The generation of the output sequence considers the tokens of the input sequence and their positions as well as the previously generated output.
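To make this mechanism concrete, the following is a minimal sketch of such an autoregressive generation loop. It is a toy illustration, not ChatGPT's actual model: the vocabulary, the seeding scheme, and the function names are invented placeholders; only the overall pattern (assign a probability to every vocabulary token given the context so far, pick one, append it, and repeat) reflects the mechanism described above.

```python
import numpy as np

VOCAB = ["the", "model", "predicts", "next", "token", "<eos>"]

def next_token_distribution(tokens: list[str]) -> np.ndarray:
    """Toy stand-in for a trained network: returns one probability per
    vocabulary word that could follow `tokens`. A real model derives these
    scores from learned embeddings of each token and its position; here
    they depend on the context only in a trivial, made-up way."""
    seed = sum(map(len, tokens)) + len(tokens)       # context-dependent seed
    logits = np.random.default_rng(seed).normal(size=len(VOCAB))
    exp = np.exp(logits - logits.max())              # softmax -> probabilities
    return exp / exp.sum()

def generate(prompt: list[str], max_new_tokens: int = 5) -> list[str]:
    """Autoregressive loop: sample a token from the current distribution,
    append it, and recompute the distribution over the grown sequence."""
    rng = np.random.default_rng(42)
    tokens = list(prompt)
    for _ in range(max_new_tokens):
        probs = next_token_distribution(tokens)      # considers all prior tokens
        tokens.append(VOCAB[int(rng.choice(len(VOCAB), p=probs))])
        if tokens[-1] == "<eos>":                    # stop at end-of-sequence
            break
    return tokens

print(generate(["the", "model"]))
```

Note that each call to next_token_distribution sees the entire sequence generated so far, which is the property the abstract highlights: the output is conditioned both on the user's input tokens and on the model's own previously generated output.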