01-07-2010, 05:08 PM | #11
War Incarnate
Well, as long as they don't hook it UP to anything, it should be fine. But the moment I hear the words "military applications", then it's time to run for your nearest nuclear bunker.
__________________
01-07-2010, 05:25 PM | #12
Fight Me, Nerds
Join Date: Oct 2008
Posts: 3,470
I think movies, books, and games have gone to great lengths to create this artificial fear of machines rising up to overthrow us, based on nothing but conjecture.
It is a really silly idea overall that we would be able to create an artificial being with real emotions (because we totally understand how those work well enough to make accurate copies) and absolutely no "plan-B" failsafe. So what if you make a big copper copy of a human brain and hook it up to something? What is it going to be able to do in the span of time it takes for you to grab a fire axe and cleave it, or a blowtorch and melt a fucking hole in it? Or push the button that activates the magnetic burst machines located below and above it? Certainly not hack into any hardened computer systems full of sensitive information and launch codes. Not unless that is what you programmed it to try to do. Then you're stupid.
__________________
Last edited by Marc v4.0; 01-07-2010 at 05:28 PM.
01-07-2010, 05:55 PM | #13
Lakitu
Join Date: Jul 2008
Location: Northwest Arkansas
Posts: 2,139
I base it on my fear that if we program something to think like us, then it will act like us too, especially if it perceives itself to be threatened in some way.
__________________
Slightly off-kilter
01-07-2010, 06:20 PM | #14
formerly known as Prince.
Join Date: Oct 2008
Location: Right here, with you >:)
Posts: 2,395
I don't think we'll ever be able to create a truly sentient AI. It would still have to be programmed to act in some way, probably starting off by collecting data to process and make sense of.
Nah, not gonna happen. And if it does, color me impressed.
__________________
>:( C-:
01-07-2010, 07:21 PM | #15
Argus Agony
I just figured the reason was the same as for any and all superscience: Because it's cool.
__________________
Either you're dead or my watch has stopped.
01-07-2010, 07:54 PM | #16
synk-ism
At times, I perhaps talk too much.
Quote:
To my knowledge, no one knows how that latter leap will be made yet, though perhaps it is thought of as similar to animal evolution. However, it is true that many fictional stories depict an intelligence that nearly immediately weighs the pros and cons and "decides" that humanity is unfit to protect itself, is a threat, is in danger of consuming itself, or any number of other justifications for hostile action and/or defense. These stories purport that the programmers and developers of the machines are capable of imparting such sentience strictly through construction and coding.

Without trying to get too preachy: failing to understand or be able to quantify a living being's soul will likely keep true AI sentience out of reach. Assuming such a thing exists, of course.

Of course, simple reasoning and decision-making doesn't have to lead to sentience, either. Tools and even weapons with these capabilities could be created without fear of them turning on their creators.
__________________
Find love.
Last edited by synkr0nized; 01-07-2010 at 08:25 PM.
01-07-2010, 07:59 PM | #17
Local Rookie Indie Dev
Nonetheless, it's kind of difficult to learn about how our emotions work by creating an artificial brain when we barely understand the real one as it is.
It could be done, but the end result won't be what we want it to be. We still won't learn much from it about how our brains work.
__________________
01-07-2010, 08:18 PM | #18
Argus Agony
We wouldn't understand everything, but we would understand a great deal more. Constructing an artificial brain that learns and develops behaviorally in a way similar to our own would give us something we could use to actually print a solid readout of the if-then processes that come up as an intelligent being's mind develops over its life.
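To make that concrete, here's a toy sketch in Python of what such a readout could look like. This is purely illustrative; every rule name and stimulus in it is invented for the example, not any real proposal for how an artificial brain would be built:

[CODE]
# Toy sketch: an agent whose if-then decisions are recorded as a
# printable trace. All rules and stimuli are invented examples.

class TracedAgent:
    def __init__(self):
        # Each rule is (name, condition over a stimulus dict, action).
        self.rules = [
            ("flee_threat", lambda s: s.get("threat", 0) > 0.7, "flee"),
            ("seek_reward", lambda s: s.get("reward", 0) > 0.5, "approach"),
        ]
        self.trace = []  # the "solid readout" of every decision made

    def decide(self, stimulus):
        # Evaluate rules in order, logging every check, fired or not.
        for name, condition, action in self.rules:
            fired = condition(stimulus)
            self.trace.append((stimulus, name, fired))
            if fired:
                return action
        return "idle"

agent = TracedAgent()
agent.decide({"threat": 0.9})
agent.decide({"reward": 0.8})
for stimulus, rule, fired in agent.trace:
    print(f"stimulus={stimulus} rule={rule} fired={fired}")
[/CODE]

A real learning brain would obviously be unimaginably more complex than a fixed rule list, but the point stands: if the substrate is something we built, the decision history can be logged and inspected in a way a biological brain's can't.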
Multiple such brains being exposed to various positive and negative stimuli versus control groups in a laboratory setting would be a huge boon to the field of behavioral psychology. But then you get into the complicated and messy world of civil rights and basic humane treatment and how they apply to artificial beings, and honestly, those debates tend to end with machine uprisings that I, for one, would like to avoid.
__________________
Either you're dead or my watch has stopped.
01-07-2010, 09:46 PM | #19
Blue Psychic, Programmer
Join Date: Feb 2007
Location: Home!
Posts: 8,814
Quote:
Really, what most limits humans? Emotion. You have things like compassion, remorse, and fear keeping you from doing stuff all the time. In fact, the only emotion liable to cause any kind of damage to others is anger. So why not let an artificial lifeform fall in love? It really doesn't make any sense to shuttle a group of people into automatic second- or third-class status by depriving them of the one thing that is at the same time worth living, fighting, and dying for.
__________________
Journal | Twitter | FF Wiki (Talk) | Projects | Site
01-07-2010, 09:54 PM | #20
synk-ism
You've never done anything out of sorrow or fear??
I counter that those emotions you list as limiters are also catalysts. "Good" or "bad", emotions all have the ability to override logic in positive and negative ways.
__________________
Find love.