Will transformer-derived architectures accelerate progress in deep learning?

Platform: Metaculus
Stars: ★★★☆☆
Community forecast: 35% chance of Yes (Unlikely)

Question description

OpenAI's transformer-based GPT-3 has generated a lot of hype around the capabilities of current deep learning methods. GPT-3 appears capable of creative writing, as demonstrated by Gwern. This creative potential, if applied to scientific writing or code generation, may accelerate research progress. If successfully applied to deep learning research itself, that acceleration could be self-reinforcing, with potential implications for the development of an AGI system. Indeed, the community prediction on the Metaculus question "When will the first Artificial General Intelligence system be devised, tested, and publicly known of?" moved roughly 10 years earlier in the months following the announcement of GPT-3.

Indicators

Indicator            Value
Stars                ★★★☆☆
Platform             Metaculus
Number of forecasts  216

Capture

Last updated: 2024-07-27

Embed

<iframe src="https://metaforecast.org/questions/embed/metaculus-5173" height="600" width="600" frameborder="0"></iframe>
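
As a usage sketch, the iframe can be dropped into any host page. Only the iframe src above comes from Metaforecast; the surrounding HTML and the title attribute are illustrative additions:

<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="utf-8">
    <title>Embedded Metaforecast question</title>
  </head>
  <body>
    <!-- src is the Metaforecast embed endpoint for Metaculus question 5173;
         the title attribute (added here for accessibility) is not required. -->
    <iframe
      src="https://metaforecast.org/questions/embed/metaculus-5173"
      height="600"
      width="600"
      frameborder="0"
      title="Metaculus question via Metaforecast">
    </iframe>
  </body>
</html>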