The Warring States of NPF  

08-06-2005, 02:48 PM   #1
Red Fighter 1073
Rocky Wrench

Join Date: Apr 2005
Posts: 1,351
The Future of Robots

I realize there have been a few discussions about today's robot technology and what it could do to help out our lifestyle. So far, I view the making of robots as a great boost to the economy, and they help out a ton in daily life.

Though, the thing that scares me the most is looking at movies like "I, Robot" and others like it. In the very near future I would still like to have robots and better technology, but just think about it: at some point, robots could drag our economy down. A huge issue today is the loss of jobs around the world and how to create more of them. With the making of robots, there could be a HUGE loss of jobs, making the economy worse. Then there are scenarios where something like "I, Robot" could come true: a point where the robots we have created become so smart that they feel they don't need to help out the human race and would rebel against us. There could even be a time when robots are the workers at a McDonald's or something like that. It's weird, but still possible.

How do you view the coming of new robot technology today? How do you view the future of the economy and robots? Do you view the making of smarter robots as a good or bad idea?
__________________
My Sprite Sheet/Mafia Roles
08-07-2005, 12:11 AM   #2
Raiden
Just a passing through veteran

Join Date: Nov 2003
Location: On your couch. Yes?
Posts: 5,327

For one thing, I don't believe that they could possibly create a robot with its own conscience like they did in "I, Robot".

Whether you believe that a human conscience comes from a soul or simply from a series of synapses between cerebral neurons, it all comes down to the fact that it's different for every person. It's a 'nature vs. nurture' scenario, where our morals, values, ideas, etc., are shaped by the events in our lives. Robots do not have that ability, and I find it very unlikely that they ever will. Intelligence created by man can only know what we tell it to know. Even if someone writes a program that gives a robot a conscience, the robot cannot have its own conscience. It will, in essence, have an incomplete copy of the conscience of the man or woman who programmed it.

And yes, robots will most likely take over the labor force. There are jobs that are dangerous and, as such, pay quite well. Having robots would make work more efficient and cheaper. Prices would go down, making it easier for people to buy things that would normally be more expensive.

Now, with most of the labor force replaced by robots, humans will have little choice other than to focus on their academic abilities. But for people whose academic skills aren't very strong, there is still the labor of repairing the robots when they malfunction.
__________________
I have a signature. It's a really cool one, too. It's so awesome, you'd pull your eyes out and punch your mother. Sadly, these rules state that my signature is just too darned big. Too much awesome for such a small space. Oh well. You can still punch your mother...if you want...

Fifth and Krylo made me do it.


http://www.animecubed.com/billy/user...sigs/60266.jpg
Be the Ultimate Ninja! Play Billy Vs. SNAKEMAN today!
08-07-2005, 01:25 AM   #3
Staizer
"I was a Llama once"

Join Date: Jun 2005
Posts: 487

First off, science fiction movies make money by giving us the worst possible scenarios. While robot AIs might want to take over our world because we are obsolete, the chances are slight. Remembering those possibilities allows us to prevent them.

Second, if robots are doing the work, what would the point of money be? How would you judge who got what amount of money, and when?
__________________
"Oh sheep swallop! Sheep swallop and bloody buttered onions!" - Mat Cauthon - Wheel of Time.

Save the trees, eat the cows! - me

"YOU SPOONY BARD!" - Tellah FFIV

"If we had ham we could have ham and cheese sandwiches, if we had cheese." - Endymion

Quote:
Originally Posted by Pictish
Except it was more like someone took a crap actress, wrote her a script in crap and got her to say it in bullshit.
08-07-2005, 08:25 AM   #4
Lockeownzj00
Homunculus

Join Date: Nov 2003
Posts: 2,396

I agree that it won't happen soon--that's just paranoid.

But to say it's not possible with AI is, I think, seriously overlooking the progress being made in AI nowadays, and what AI fundamentally is.

There are a lot of things with AI we never thought possible--video games are a perfect example. Enemies used to have simple routines: walk back and forth here, shoot when the player comes. Now, I read in PCGamer about some of the most intelligent AI to date: enemies in modern games that learn and adapt to their environment, and that are very capable and very challenging.
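To make the contrast concrete, here's a toy Python sketch of the difference--a fixed scripted routine versus an enemy that slowly leans toward whatever tactics have actually worked against you. Nothing here comes from a real game engine; every name and number is invented.

Code:
import random

# Old-school enemy: a fixed, scripted routine.
def scripted_enemy(player_visible):
    return "shoot" if player_visible else "patrol"

# "Learning" enemy: keeps a weight for each tactic and leans toward
# whatever has worked against this particular player so far.
class AdaptiveEnemy:
    def __init__(self):
        self.weights = {"rush": 1.0, "flank": 1.0, "take_cover": 1.0}

    def choose(self):
        tactics = list(self.weights)
        return random.choices(tactics, weights=[self.weights[t] for t in tactics])[0]

    def feedback(self, tactic, worked):
        # Reinforce tactics that worked, decay the ones that didn't.
        self.weights[tactic] *= 1.25 if worked else 0.8

enemy = AdaptiveEnemy()
for _ in range(50):
    tactic = enemy.choose()
    # Pretend this player punishes rushes but keeps falling for flanking.
    worked = (tactic == "flank") or (tactic == "take_cover" and random.random() < 0.3)
    enemy.feedback(tactic, worked)

print(enemy.weights)  # "flank" should end up with by far the highest weight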

I think oversimplifying AI is like oversimplifying video games: mainstream media is still kind of stuck in the "video games = tetris" era, not realising what they are or their potential. I feel the same way about AI. Yes, it would be more difficult. It would have to take countless more things into consideration. Yes, they would be extremely faulty at first. No, I don't think it's impossible.

Think about it--what is so ridiculous about, let's say, a simple robot which is just programmed to walk to the fridge and get you a beer? Nothing, really. Let's take it a step up. We give it, let's call it, an "advanced Furby" mechanism, allowing it to respond to and interpret your requests--but it's still fairly basic. Now we increase its tasks to cleaning the house and telling stories...you see where I'm going with this? Although you're fundamentally right that the AI cannot and will not rebel if it is not programmed to learn, if it is programmed to be human-like (like much of modern AI) and to adapt to all factors of one's environment, it could theoretically form thoughts of its own.
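The "step up" progression is really just bolting more skills onto the same dispatch loop. A purely illustrative Python sketch--nothing here is a real robotics API, and every name is made up:

Code:
# Each "skill" is just a behavior the robot can be asked to run.
def fetch_beer():
    return "Rolling to the fridge, grabbing a beer."

def clean_house():
    return "Vacuuming the living room."

def tell_story():
    return "Once upon a time, a toaster dreamed of electric sheep..."

class HouseBot:
    def __init__(self):
        self.skills = {}

    def learn(self, phrase, skill):
        # The "advanced Furby" part: map a request phrase to a behavior.
        self.skills[phrase.lower()] = skill

    def request(self, phrase):
        skill = self.skills.get(phrase.lower())
        return skill() if skill else "Sorry, I don't know how to do that."

bot = HouseBot()
bot.learn("get me a beer", fetch_beer)     # version 1: one trick
bot.learn("clean the house", clean_house)  # version 2: more chores
bot.learn("tell me a story", tell_story)   # version 3: a bit of "personality"

print(bot.request("Get me a beer"))
print(bot.request("do my taxes"))          # still nothing like thinking for itself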

All we are is complex machines. It would be difficult, expensive, and time-consuming, but I think humans have already shown they can emulate themselves.

So no, the toaster won't rebel against you--but if you program the toaster with the exact same kind of feelings as a human, indistinguishable from the real thing, it might. I don't think it counts for any less just because it's a bunch of circuits and you're a bunch of meat.
__________________
Quote:
One of the greatest challenges facing civilization in the twenty-first century is for human beings to learn to speak about their deepest personal concerns—about ethics, spiritual experience, and the inevitability of human suffering—in ways that are not flagrantly irrational. We desperately need a public discourse that encourages critical thinking and intellectual honesty. Nothing stands in the way of this project more than the respect we accord religious faith.
08-08-2005, 02:15 AM   #5
Sanacra
pregnant goldfish

Join Date: Aug 2005
Location: The fishbowl...in your mind
Posts: 8

Personally, I'm of the camp that doesn't see how automatons can "take" jobs from humans. Yes, they can be built to do the job you currently do, but that just gives you the opportunity to find something else to do. Maybe it's a hard switch to make, and maybe you don't think you want to do something else...but eventually, if enough of our lives become automated, the only "jobs" left will be based on what you, as an individual, want to do. For some, this might be things that are currently considered useless, or purely self-indulgent...but if everyone's basic needs are being met, why shouldn't that be a viable option? I'm not saying such a thing could happen soon, or that it would be easy, or a smooth transition, but it might happen someday. In the interim, we can debate the morality and possible dangers of sentient technology all we want, but that's not going to stop those who can get the funding from trying.

As far as my stance on the plausibility of a techno-intelligence rebellion against humanity...I think if we ever created technology self-aware enough to come up with the idea of rebellion, in great enough numbers to have a rebellion, there'd be similar issues in the robotic community to the ones faced by humans, in that there wouldn't be any unanimous consensus among the robots as to the worth of humanity. It would be more likely for factions to develop, in my opinion, such as extremist militant groups (BURN the obsolete ones, for they are UNCLEAN!), environmentalist-type groups (humans are God's creatures, too, and they deserve to live and be loved), and more moderate, but equally diverse, groups (Well, I guess humans are OK, you know. I mean, I don't think they deserve to die or anything; it's not their fault they have such faulty components. I mean, they did, like, create us and whatever.)

And yeah, sci-fi movies like I, Robot are way too reactionary to be considered reliable predictors of what might happen with created intelligences. (IMO, of course.) After all, would you go see a robot movie where all the robots got along lovey-dovey with the humans and there were no "cool" explosions and special effects? (OK, maybe you would, but how many other people would?)
08-08-2005, 02:33 AM   #6
Waylander
Soul Stealer

Join Date: Jul 2005
Posts: 230

The main problem is the negative publicity that robots have been getting ever since the idea was first conceived. You can just see it: in the future, instead of being racist against blacks or Asians or something like that, people would be racist against robots and go around smashing them up. Instead of the KKK it would be the MMM or the CCC or something like that, whose entire lives are devoted to being against robots.

The whole set of three rules that are supposed to control them would do more than just govern them--it would prohibit sentient robots altogether. When you think about it, what all people want is freedom, and robots wouldn't be free if they had rules they couldn't ignore, unlike a human, who can, if he really wants to, go out and kill someone else. Not that I'm saying robots should be allowed to, but if they are at least as smart as the dumbest human, they should have the same rights.

Another point: if they are intelligent, they wouldn't be satisfied with just labor jobs. Why can't they have a management job or start up a multinational corporation? Then we have yet another problem. Robots don't need training for anything; all they need is the information uploaded into their memory. This means humans will spend the first 10-20 years of their lives in training, while a robot that can do the same job, if not a better one, is taught in a matter of seconds. Robots would control the upper hierarchy of businesses because they are not only better at their jobs (they can't make mistakes) but can do them for longer, making them prime candidates for promotion.

Just a brief rundown. I have more thoughts, but I can't be bothered writing them down; ask me if you want.
__________________
If you're happy and you know it clap your hands!
08-08-2005, 07:14 AM   #7
Staizer
"I was a Llama once"

Join Date: Jun 2005
Posts: 487

Mmmm, what is wrong with giving robots the freedom to kill? We humans have it. Are we so egotistical that if we gave sentience to something, we'd have to say YOU WILL NOT KILL US!!!! OUR LIVES ARE MORE IMPORTANT THAN YOURS!

That's ridiculous. If you make something that is the exact same as a human for all intents and purposes, then you treat it like a human. Look at what happens when we don't treat HUMANS like humans.

Otherwise, life would basically be slave labor on a metallic scale. The best way to treat it would be much like Futurama: robots have their jobs, and humans have their jobs. Neither is better nor worse, just different.
08-08-2005, 12:14 PM   #8
Lockeownzj00
Homunculus

Join Date: Nov 2003
Posts: 2,396

Quote:
Originally Posted by Waylander
The whole set of three rules that are supposed to control them would do more than just govern them--it would prohibit sentient robots altogether. When you think about it, what all people want is freedom, and robots wouldn't be free if they had rules they couldn't ignore, unlike a human, who can, if he really wants to, go out and kill someone else. Not that I'm saying robots should be allowed to, but if they are at least as smart as the dumbest human, they should have the same rights.
But that's the thing--perhaps professional, factory-line robots would be restricted, because logically we wouldn't want them to start thinking. They wouldn't even be oppressed, because they wouldn't be thinking, just performing very linear, set routines. But what happens when an independent source, or organisation, decides to experiment? I don't think it's too ridiculous to propose--science is curious; scientists want to try things out all the time. How can you sit there without pushing the big red button? You want to know what happens. So I definitely think experimentation with quasi-sentience would begin, perhaps with more animal-like robots.

Quote:
Originally Posted by Waylander
Another point: if they are intelligent, they wouldn't be satisfied with just labor jobs. Why can't they have a management job or start up a multinational corporation? Then we have yet another problem. Robots don't need training for anything; all they need is the information uploaded into their memory. This means humans will spend the first 10-20 years of their lives in training, while a robot that can do the same job, if not a better one, is taught in a matter of seconds. Robots would control the upper hierarchy of businesses because they are not only better at their jobs (they can't make mistakes) but can do them for longer, making them prime candidates for promotion.
Well, I think there are a few oversights here. First, the strictly-labor robots would be deliberately made without even the possibility of "thinking." They literally would not be able to express rebellion or dissatisfaction in any sort of way; that is a sci-fi concept. It could only happen if a) the manufacturers or the company decided that a little thought was necessary for the robots to perform complex tasks, or b) the manufacturer somehow snuck in (highly, highly unlikely) "sentient" robots.

Second, you said they can't make mistakes. Why? Are you using Windows? Does it ever not do what you want it to do? I think there's a misconception that robots or machines are perfect. If they are set up with a simple, logical routine, or even a well-thought-out complicated one, they will "always" do it. Save for very specific situations, your toaster always works. Why doesn't your computer "always" work? The more complex you get, the more factors you have to consider, and obviously the first robots would be 0.001 alphas. Even by the time "we" (as a society) got to the "robots in factories" stage, they would have to be at least 80-90% reliable--not 100% (although companies would preach otherwise: "Our robots are 99.9% effective!")--and all it takes is a minimum of human interaction, somehow changing their routine or changing the objects they use in their routine, and it all fails.
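Here's roughly what I mean, as a toy Python sketch (the factory setup is invented, not any real system): the routine is "perfect" right up until someone changes something it silently depends on.

Code:
# The factory as the robot expects it: bolts in bin A, panels in bin B.
station = {"bin_a": "bolt", "bin_b": "panel"}

def assemble(station):
    # Hard-coded routine: take a bolt from bin A and a panel from bin B.
    part1 = station["bin_a"]
    part2 = station["bin_b"]
    if part1 != "bolt" or part2 != "panel":
        raise RuntimeError(f"assembly fault: unexpected parts {part1!r}, {part2!r}")
    return "assembled unit"

print(assemble(station))  # works every single time...

# ...until a human rearranges the bins without reprogramming the robot.
station["bin_a"], station["bin_b"] = station["bin_b"], station["bin_a"]

try:
    print(assemble(station))
except RuntimeError as err:
    print("Line stops, humans get called in:", err)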

I also don't think they would necessarily "know everything from the start." I'm not saying robots couldn't learn emotions, but how would we "program" them with happiness? We can't just give them examples of "happy things"; that's totally subjective and ineffective. In order to create even the concept of happiness, besides building extremely advanced machines, we'd have to give them the ability to reason--to think, and to discover these feelings. Perhaps an emotional robot would begin as a quasi-baby: while, being a robot, it could, let's say, lift a car (cliché example), it still wouldn't understand the implications of this, and with a "learning" chip, or complicated AI, let's say, it would have to go through an experience similar to a baby's (perhaps faster) in order to "mature" and figure things out. You know, the old movie cliché? A robot saying "Love? What is love?"

I think Staizer is right, but again, only about the sentient robots. The "toasters" (hehe, unintentional Battlestar Galactica reference) would not need or want to kill. And I don't think you would even need to program "kill" into so-called emotional robots...when you think about it, what would stop a robot from smashing its hand into your brain? In fact, like a baby, it might do so just to figure out what happens. You have to do everything at least once to know what its effect is, right? It may regret this, it may not understand this, it may like it, it may do it again, and any of those may change depending on what it later learns it has done. But I digress. What I was saying was: in order for them not to kill, it wouldn't be enough for certain programming to be "absent"; they would have to have an active routine or process somewhere inhibiting their ability to swing their hand toward that piece of meat over there. Of course, if we did this, these emotional robots might get mad on principle.
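Something like this toy Python sketch of an "inhibitor"--an active check that screens every intended action, rather than just not having a "kill" instruction anywhere. Every name here is hypothetical, purely to illustrate the idea.

Code:
# Safety as an active process: every intended action passes through the
# inhibitor before anything reaches the motors.

FORBIDDEN_ACTIONS = {"strike", "crush", "throw_hard"}

def inhibitor(action, target):
    # Veto anything on the forbidden list, or anything aimed at a person.
    return action not in FORBIDDEN_ACTIONS and target != "human"

def act(action, target):
    if not inhibitor(action, target):
        return f"VETOED: {action} -> {target}"
    return f"executing: {action} -> {target}"

print(act("lift", "car"))      # fine: the cliché party trick
print(act("strike", "wall"))   # vetoed: forbidden verb, even against an object
print(act("lift", "human"))    # vetoed: harmless verb, but the target is a person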

But I think even you, Staizer, can see why a human would deliberately stop a robot from killing him, as we are hypothesising. Who wants to die (rhetorical)? We'll do anything not to die. So this problem might occur ;P
08-08-2005, 01:10 PM   #9
Red Fighter 1073
Rocky Wrench

Join Date: Apr 2005
Posts: 1,351

Hmm... I guess I agree that the I, Robot idea for robots is a little far-fetched, but I still think it could be possible. Like Locke said, they could have a certain special AI programmed into them to give robots more intelligence of some sort.

If not the I, Robot idea, then there could easily be a robot police officer that is programmed to do whatever is necessary to do its job. Even kill. I can't exactly give an example of how they might turn sour, but these kinds of robots sure could cause a problem for us.

As for Sanacra's idea, I really don't think that humans would let robots have that kind of freedom. Remember that the whole idea of making robots is to make human life easier, so I really don't think there would ever be anything like a "robot city". (Also, that city thing was MY idea..)

Quote:
Originally Posted by Sanacra
Personally, I'm of the camp that doesn't see how automatons can "take" jobs from humans. Yes, they can be built to do the job you currently do, but that just gives you the opportunity to find something else to do. Maybe it's a hard switch to make, and maybe you don't think you want to do something else...but eventually, if enough of our lives become automated, the only "jobs" left will be based on what you, as an individual, want to do. For some, this might be things that are currently considered useless, or purely self-indulgent...but if everyone's basic needs are being met, why shouldn't that be a viable option? I'm not saying such a thing could happen soon, or that it would be easy, or a smooth transition, but it might happen someday. In the interim, we can debate the morality and possible dangers of sentient technology all we want, but that's not going to stop those who can get the funding from trying.
But what I'm getting at is the fact that robots could take jobs from humans. This would create a decline in the number of jobs going to HUMANS. What if there is NO job left to do? Sure, there could still be jobs for humans, but robots would hold jobs that humans could easily do. Even if it's as simple as cleaning the bathroom stalls, humans wouldn't have that paying job because a robot is doing it for them. That is how the economy could go into a decline.
08-08-2005, 02:53 PM   #10
Azisien
wat

Join Date: Jan 2005
Posts: 7,177

If our economy is strong enough in the first place that robots are cheap enough to be designed specifically for cleaning bathroom stalls, I have this itching feeling we'd be just fine after all the bathroom cleaners are fired and replaced by ToiletSparkle Model 3092821X2.

If society goes in this direction, it means humans take to academic professions. Robots to do menial tasks, like cleaning and maintenance, yes. Robots to perform research in mathematics, physics, biology, and the other sciences...robots as engineers, and the list goes on. Maybe if we get into a freaky debate about A.I.s and how they could evolve and take THOSE jobs too, well, I don't know about that, but I'm also not counting on it happening.

If there were no jobs left for humans, and robots were doing all the work, doesn't the need for money disappear? I mean, ideally we'd become a self-sufficient society maintained by the robots, and humans would have a lot of free time to do what they WANT to do. It sounds too much like a utopia to me, though; something would go wrong. Not necessarily the ROBOTS KILL EVERYTHING route, more like a war among humans, or the field just wouldn't end up expanding that much.

Still, a few decades ago, computers were never going to be in the commoner's home. Now they are. Maybe in a few decades, there will be robots in the commoner's home too. I wonder what my dog would think of a 3 foot tall cleaning bot.
 

