Re: Interpreting the Singularity (O.T.)



Toby G wrote:
> You guys are only scraping the surface of the current knowledge.

Don't know how you concluded that.
That 'singularity' thing isn't knowledge
until you actually present the giant carrot.


> Some books that would fill in what you're talking about include
> "On Intelligence" by Jeff Hawkins and "The Singularity is Near" by
> Kurzweil. Also Ben Goertzel is an AI guru--his blog is chock full of
> singularity stuff. It's probably good to mention that not everybody
> thinks like Ray.

Here's Goertzel pointing out that the "exponential curve" may flatten out:

"Of course, this sort of extrapolation is by no means certain. Among many 
counterarguments, one might argue that the inertia of human systems will 
cause the rate of technological progress to flatten out at a certain 
point."

(from an abstract quoted in his blog)

> 
> However, if an AI is constructed that is smarter than a human

like "wouldn't spend time on OT discussions on a mailing list" 
kind of smart?

Thing is, before we can say "more intelligent than human"
we have to define intelligence, and then find a way of rating it.


> and can self-improve, all bets are off. Nobody knows what the result
> would be. I don't feel very flaky in mentioning this. Maybe 8 years ago
> I would have felt weird believing that strong AI is coming, but doing
> some (a lot) of deep reading about guys who are getting someplace
> (Hawkins, for instance) has changed my mind. His figuring out that the
> brain process that is important is prediction, coupled with "invariant"
> object representations, has convinced me.

Which sounds a lot more interesting than the singularity stuff.
You wanna elucidate?

andy