Using Technology to Deepen Democracy, Using Democracy to Ensure Technology Benefits Us All

Sunday, July 26, 2009

Futurological Brickbats

To care most about things that are merely not impossible is simply not sensible.

2 comments:

Unknown said...

Alright, but what if those very improbable things have the (remote) possibility of having disastrous impacts?

It seems to me that it makes sense to look at this from something of a cost/benefit perspective. Yes, time would be invested in something that would probably not happen anyway, but disaster might also end up averted. Your "it's too unlikely to happen" thing is getting quite old.

I'm not advocating against attempts to halt climate change or other such nonsense. I'm just saying that it seems a reasonable thing to do to put forth an appropriate amount of resources to prevent our extinction, however unlikely (and let the populace determine exactly what "likely" is -- but first they need to be aware that there is actually a small threat of, say, AI showing up this century). I would argue that the public is disproportionately terrified of something that is extremely unlikely: an asteroid or comet impact. Resources and researchers' time are spent on that issue, yet you seem to completely ignore it.

Dale Carrico said...

it makes sense to look at this from something of a cost/benefit perspective

Specify the "this" -- and you'll discover soon enough that if the "this" really is a matter usefully susceptible of stakeholder deliberation then the majority of people engaging in that discussion have no need of futurologists hyperbolizing the stakes, and that if the "this" really is a matter of blue-skying then it should be treated as an aesthetic matter not to be mistaken for actual science or actual policy in the first place. In a nutshell, nobody needs to join a Robot Cult to engage in actually sensible scientific research or technodevelopmental policy -- but anything beyond actually sensible scientific research and technodevelopmental policy marks the Robot Cult as a fandom sub(cult)ure, attractive or not according to a person's taste, but something that no more earns pretensions to representing science or policy than any other fundamentalism. When it comes to superlative futurology this either/or amounts to something close to an iron law.

Think about it. The "arrival" of an entitative post-biological superintelligent Robot God of the kind fetishized by superlative futurologists would be preceded by innumerable problems demanding decisions of technique and policy and regulation and education and on and on and on, not one of which is illuminated by looking at it here and now through the lens of would-be prophets claiming to speak for "the future." It is that process of invention, collaboration, contestation, and deliberation whose substance constitutes the actually-existing rationality out of which any such "arrival" would truly consist, should "it" be possible, whatever "it" actually shapes up to be. Those who claim to skip all the steps are always just con-artists trying to sell you something.

In my view the dead-enders of the GOFAI program who cling to one another among the Robot Cultists regularly deny or fail to grasp fundamental realities about the social exhibitions and biological incarnations of the "intelligence" about which they speak so glibly, which means you are jumping the gun when you demand we leap forward into calculating the likely arrival of an entity premised on these incomprehensions.

The Robot Cultists aren't actually ready for prime time (which helps account for their enduring marginality from the consensus of scientists in the fields indispensable to their own preferred outcomes).

Certainly one doesn't overcome the basic problem of this initial incoherence by ratcheting up the dire stakes presumably involved in the prediction -- that's exactly like being unable to explain what you actually mean by saying "God exists" but trying to distract our attention from this basic incomprehension by saying when God returns he will thrust all non-believers into eternal hellfire so you better pray to him.

When you say it is "reasonable" to devote public monies to prevent human extinction at the hands of the Robot God (even if, oh so wheedlingly reasonably, you admit the chances may only be negligible that the Robot God apocalypse will come to fruition), you really mean that a handful of pseudo-scientific nutjobs in a Robot Cult who worship at the feet of embarrassing wannabe gurus like Eliezer Yudkowsky and Ray Kurzweil should be given tax money to subsidize their flabbergasting crackpottery. Thanks, but no thanks.