An aviation & planes forum. AviationBanter



Planes that are afraid of crashing?



  #1  
Old May 24th 05, 06:52 PM
Jim Fisher
external usenet poster
 
Posts: n/a
Default Planes that are afraid of crashing?

An interesting excerpt from
http://www.cnn.com/2005/TECH/05/23/b...oad/index.html :

" Pearson (some brainiac computer expert guy) predicted that it would be
possible to build a fully conscious computer with superhuman levels of
intelligence as early as 2020.
IBM's BlueGene computer can already perform 70.72 trillion calculations a
second and Pearson said the next computing goal was to replicate
consciousness.

"We're already looking at how you might structure a computer that could
become conscious. Consciousness is just another sense, effectively, and
that's what we're trying to design in a computer."

Pearson said that computer consciousness would make feasible a whole new
sphere of emotional machines, such as airplanes that are afraid of
crashing."



I'm not sure if this is scary as hell or not.

--
Jim Fisher


  #2  
Old May 24th 05, 07:48 PM
Peter Duniho

"Jim Fisher" wrote in message
[...]
" Pearson (some brainiac computer expert guy) predicted that it would be
possible to build a fully conscious computer with superhuman levels of
intelligence as early as 2020.


People have been saying AI is 10 or 20 years away since the late 70's (at
least).

Furthermore, we can create an airplane today that acts exactly like an
airplane that was actually afraid of crashing. In the case of machine
intelligence, emotions may be one way of encoding goals and motivations, but
I hardly think it's clearly the best way.

[...]
Pearson said that computer consciousness would make feasible a whole new
sphere of emotional machines, such as airplanes that are afraid of
crashing."

I'm not sure if this is scary as hell or not.


Possibly for machines with extremely complex design goals (consider a fully
autonomous replacement for human soldiers, for example), using emotions
might be an effective way to allow various competing interests to be
efficiently processed. But when you're just trying to get the machine from
point A to point B without running into anything, I doubt adding emotions
would improve things.
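Pete's distinction can be made concrete with a toy sketch (entirely hypothetical; the function, field names, and weights are invented for illustration, not drawn from any real avionics system). A point-A-to-point-B planner only needs a cost function; an "emotional" machine could arbitrate the same competing interests by scaling the collision term with a fear level, which is functionally identical, just a different framing:

```python
# Toy sketch: route selection as a plain cost function.
# "Fear of crashing" is nothing more than a weight on the risk term.

def path_cost(path, fear_of_crashing=1.0):
    """Score a candidate path: shorter is better, collision risk is worse.

    `fear_of_crashing` is just a tunable coefficient; calling it an
    emotion changes nothing about the arithmetic.
    """
    length = sum(leg["distance"] for leg in path)
    risk = sum(leg["collision_risk"] for leg in path)
    return length + fear_of_crashing * 1000.0 * risk

# Two hypothetical routes: a longer safe one and a shorter risky one.
route_a = [{"distance": 10.0, "collision_risk": 0.0}]
route_b = [{"distance": 8.0, "collision_risk": 0.01}]

# With any sufficient positive weight on risk, the safe route wins.
best = min([route_a, route_b], key=path_cost)
```

The point of the sketch: whether you label the risk weight "fear" or just "a coefficient," the machine's behavior is the same, which is why emotions add nothing to a simple navigation goal.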

I think talk like that isn't so much scary as it is just plain dumb.

Pete


  #3  
Old May 24th 05, 09:07 PM
Jim Fisher

"Peter Duniho" wrote in message
"Jim Fisher" wrote in message
[...]
" Pearson (some brainiac computer expert guy) predicted that it would be
possible to build a fully conscious computer with superhuman levels of
intelligence as early as 2020.


People have been saying AI is 10 or 20 years away since the late 70's (at
least).


And those predictions were correct. We do have "AI" and it's been here for
at least 10 years or so in one form or another.

A "conscious" computer, though, is light years beyond "AI."
Consciousness entails self-preservation ("fear") and all the other emotional
baggage we humans deal with every day. That's much different from the
fuzzy-logic AI that causes a plane to respond with a synthesized "pull up!"
when the plane "knows" it's not landing.
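The kind of callout Jim describes is, in real systems, closer to plain threshold logic than to fear. Here is a grossly simplified, hypothetical sketch (the function name, thresholds, and alert envelopes are all made up for illustration and bear no resemblance to actual GPWS logic):

```python
# Hypothetical rule-based callout logic: threshold checks, not emotions.
# All numbers are invented for illustration.

def terrain_warning(radio_altitude_ft, descent_rate_fpm, gear_down):
    """Return an alert callout string, or None if no alert applies."""
    # If the gear is down and the plane is low, assume a deliberate landing.
    landing_configured = gear_down and radio_altitude_ft < 500
    if landing_configured:
        return None  # the plane "knows" it's landing

    # Most urgent check first: low and descending fast.
    if radio_altitude_ft < 1000 and descent_rate_fpm > 2000:
        return "PULL UP!"

    # Milder advisory: excessive sink rate at moderate altitude.
    if radio_altitude_ft < 2500 and descent_rate_fpm > 4000:
        return "SINK RATE"

    return None
```

Nothing in the function is afraid of anything; it simply compares numbers against fixed envelopes, which is Jim's point about how far that sits from consciousness.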

Machine consciousness is neither "dumb" nor "scary." We humans will take it
for granted in the not-too-distant future. That's just mind-blowing.

--
Jim Fisher


  #4  
Old May 24th 05, 09:29 PM
Jose

Machine consciousness is not "dumb" nor "scary."

Machine consciousness =is= scary, because it means that we won't know
what the machines will do, or why. It's already happening with
software, although part of the problem there is that the publishers
refuse to tell us what the software is =actually= doing, (and most
people don't care to know).

Jose
--
The price of freedom is... well... freedom.
for Email, make the obvious change in the address.
  #5  
Old May 25th 05, 11:20 AM
Dylan Smith

In article , Jose wrote:
Machine consciousness is not "dumb" nor "scary."


Machine consciousness =is= scary, because it means that we won't know
what the machines will do, or why.


No scarier than other human beings, whose behavior we also can't predict
or explain.

Imagine the benefits - by the time we can make a truly conscious machine
with human intelligence, we will probably have the technology to do a
brain-dump. You could brain dump into a machine, discard your feeble
meat body, and go off into space and explore the planets - requiring no
pesky, complex, difficult life support to keep meat alive.
  #6  
Old May 25th 05, 02:35 PM
Jose

[Machine consciousness is] no scarier than other human beings, whose behavior
we also can't predict or explain.


Human beings are not mere tools. When I deal with a human or a dog or
even a frog, I do not expect it to merely do what it was designed to do.
When I put a light on a certain spot on stage, I expect that it will
stay there, even if it disagrees with me as to whether or not I am doing
a good job of lighting. If the lights start to design themselves, I
will lose control over whatever it is I am trying to accomplish for the
audience. Likewise I don't want my hammer to start reviewing the
architectural plans of the house I'm building and then refuse to hammer
the seventh and eighth beams into place.

I expect certain behavior from tools, and act accordingly. I expect
different behavior from people, and treat them accordingly.

Imagine the benefits - by the time we can make a truly conscious machine
with human intelligence, we will probably have the technology to do a
brain-dump. You could brain dump into a machine, discard your feeble
meat body, and go off into space and explore the planets - requiring no
pesky, complex, difficult life support to keep meat alive.


I don't think I'd ever want to do that.

Jose
--
"Never trust anything that can think for itself, if you can't see where
it keeps its brain." (chapter 10 of book 3 - Harry Potter).
for Email, make the obvious change in the address.
  #7  
Old May 25th 05, 09:14 AM
Peter Duniho

"Jim Fisher" wrote in message
People have been saying AI is 10 or 20 years away since the late 70's (at
least).


And those predictions were correct. We do have "AI" and it's been here
for at least 10 years or so in one form or another.


It should have been obvious from the context, but by "AI" I mean TRUE
artificial intelligence. That is, consciousness. A goldfish has a more
complex capacity to reason, learn, and adapt than the best computer does,
and I still wouldn't call *it* all that intelligent.

There's not a single computer out there that really qualifies as
"intelligent". Computers still just basically do exactly what we tell them
to do. And we are no closer to having them go beyond that than we were
three decades ago.

It's true that there's a field of computer science called "artificial
intelligence". But even the more innovative aspects of that field,
including neural nets and expert systems, aren't actually examples of
intelligent computers.

Pete


  #8  
Old May 24th 05, 07:59 PM
Paul kgyy

I wonder what the afraid-of-crashing plane would do if a crankshaft
broke. But maybe by then they'll have chutes on everything so it can
drop safely into a playground.

  #9  
Old May 24th 05, 09:26 PM
Jim Fisher

"Paul kgyy" wrote in message
I wonder what the afraid-of-crashing plane would do if a crankshaft
broke.


It would say, "Oh ****!" for you.

--
Jim Fisher


  #10  
Old May 26th 05, 08:53 PM



Jim Fisher wrote:
"Paul kgyy" wrote in message
I wonder what the afraid-of-crashing plane would do if a crankshaft
broke.


It would say, "Oh ****!" for you.


Just so long as it doesn't say "Watch this!"

-cwk.

 






Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2024, Jelsoft Enterprises Ltd.
Copyright ©2004-2024 AviationBanter.
The comments are property of their posters.