Will mind uploading happen before AGI?

Metaculus | ★★★☆☆
Yes: 1% (Exceptionally unlikely)

Question description

At present, the leading answer, by a significant margin, to Eliezer Yudkowsky's Manifold Markets question, "If Artificial General Intelligence has an okay outcome, what will be the reason?", is "Humanity successfully coordinates worldwide to prevent the creation of powerful AGIs for long enough to develop human intelligence augmentation, uploading, or some other pathway into transcending humanity's window of fragility."

This question asks about the likelihood of human mind uploading happening before AGI. I believe this is an important question: if it turns out that mind uploading coming first is intractably unlikely, that should substantially reduce our confidence in an AI safety plan that relies on it happening. On the other hand, if mind uploading coming first is likely, this should presumably be an important consideration for people working in AI safety and AI governance. (I find it surprising how infrequently mind uploading is mentioned by such people, given the current predictions on Eliezer's question.) And if mind uploading coming first is around even odds, then marginal effort put towards speeding up mind uploading timelines could be very impactful.
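To make the last point concrete, here is a toy sketch of my own (not part of the question, and every number in it is a made-up assumption): if AGI and upload arrival years are modelled as independent normal distributions, a one-year speedup of upload timelines moves P(upload first) far more when the two distributions overlap heavily, i.e. when the race is near even odds, than when it is lopsided.

import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000  # Monte Carlo samples

def gain_from_speedup(upload_mean, agi_mean=2045.0, scale=10.0):
    """Return (P(upload first), gain in that probability from a 1-year speedup).

    Arrival years are modelled as independent normals; all parameters
    are hypothetical, chosen purely for illustration.
    """
    agi = rng.normal(agi_mean, scale, N)
    upload = rng.normal(upload_mean, scale, N)
    p = (upload < agi).mean()
    p_faster = (upload - 1 < agi).mean()  # pull uploads forward one year
    return p, p_faster - p

for upload_mean in (2047.0, 2080.0):  # near even odds vs. heavily lopsided
    p, gain = gain_from_speedup(upload_mean)
    print(f"upload mean {upload_mean:.0f}: P(upload first) = {p:.3f}, "
          f"gain per year of speedup = {gain:.4f}")

Under these invented parameters, a one-year speedup buys roughly twenty times more probability in the near-even case (~0.028 per year at P ≈ 0.44) than in the lopsided one (~0.0013 per year at P ≈ 0.007), which is the sense in which marginal acceleration is most valuable when the race is close.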

As far as I am aware, there have been few efforts to forecast mind uploading. There are technical reports on mind uploading, notably Sandberg and Bostrom (2008) and Eth, Foust and Whale (2013), but these do not include quantitative forecasts. I did manage to find four Manifold Markets questions on mind uploading, but the first is about C. elegans (a very simple organism that would be much easier to upload than a human), the second and third have few forecasters/traders, and the second, third and fourth are confounded with AGI timelines: if AGI goes well, mind uploading presumably comes very soon after, and it is unclear how many of the traders are betting on that sequence of events. What matters for AI safety is whether mind uploading happens before AGI, not the absolute timeline.

Indicators

Indicator            Value
Stars                ★★★☆☆
Platform             Metaculus
Number of forecasts  127

Capture

Last updated: 2024-04-29

Embed

<iframe src="https://metaforecast.org/questions/embed/metaculus-18839" height="600" width="600" frameborder="0"></iframe>