Senter for digitale fortellinger

Lecture

Hallucinations are (almost) all you Need

Jhave Johnston explains how fundamental research in science is being transformed by a practice predominantly associated with the arts: namely hallucinations.

AI-generated image of organic-looking bubbles in soft colours
Photo/ill.:
AI-generated by Jhave Johnston


This rapid overview of key scientific AI examples (that covers a year loosely defined as starting with the release of GPT-4 on March 14th, 2023) is framed by the hypothesis that fundamental research in science is being transformed by a practice predominantly associated with the arts: namely hallucinations. Hallucinations in people are conventionally associated with mental illness, drugs, and/or genius. Hallucinations in AI (mostly in large language models) have been critiqued as net-negatives: contributing to disinformation, bias, post-truth, deep-fakes, collapse of democracy, copyright theft, etc… Yet at the same time, AI hallucinations (of proteins/crystals/algorithms/circuits etc) pruned down to the feasible, are contributing to a revolutionary acceleration of scientific discoveries in numeric-algorithmic optimizations, AI hardware accelerators, reward mechanism design, non-invasive brain sensors, drug discovery, sustainable deep-tech materials, autonomous lab robotics, neuromorphic organoid computing, and mathematical reasoning. In both art and science, hallucinations are almost enough: without the pruning down to the plausible, there is just a sprawl of potentiality.