For anybody who does futures studies and strategic foresight work, it should seriously bug us that our “futures cone” for “AGI” evolution is primarily modeled in most TED talks, podcasts, and posts as a straight line.
This reflects a single future, highly desired by investors, builders, e/accs, and seekers wishing to fill spiritual voids, but with no imagination to support other possible outcomes.
As such, it is driven by willful intent without recognition of emergent novelty along the way. (Hello, Sama.) Just ask anyone hiring radiologists over the past decade.
Couldn’t agree more @Swag Valance
Well put. We're sorely missing a set of perspectives and possible futures that should be part of a larger, more nuanced conversation.
Worth a read (or at least a skim -- this one is quite long) on that subject:
https://knightcolumbia.org/content/ai-as-normal-technology