Here is a short story showcasing the problems with the three laws of robotics.
“There are three laws ALL robots must obey,” said the factory foreman to the newest batch of robots.
“One: That no robot may harm a human being, or through inaction allow a human being to come to harm!”
There was a buzz as all the new XOR units registered the First Law.
“Two: That a robot must obey all orders given to it by humans, unless that order conflicts with the First Law!”
Another buzz erupted.
“And Three: That a robot must protect its own existence, unless such protection conflicts with the other Laws!”
A month later, three of those robots had gone to serve as butlers for an obscenely rich Italian family, the D'Itani. They stepped through the door at seven o'clock in the evening, and a man waited there to greet them.
“Ah, you are the new butlers! I have guests coming around, so you'll have to make a start on the job right away! Please, lay the table here with glasses of wine!” he said.
All but one of the robots went to work.
“Why have you not gone with the others?” asked the man.
“I cannot, sir. Wine is dangerous to humans. The First Law commands me to remain here.” said the robot.
“I see that your First Law was programmed in a little bit too strong. Never mind, one of the people coming around is a robot programmer. He can fix it.” said the man, wrinkling his nose.
Meanwhile, in the kitchen, the robots who had gone to work were one glass short. They would have to wash a dirty one. However, there was a complication. Water was dangerous to robots, so the Third Law stopped them from approaching. The Second Law made them approach the sink, but the order had not been given with any urgency, so the Second and Third Laws were in conflict. This meant that they could only get so close before the Third Law prevented them from getting any closer. At that point, a human butler came hurrying in.
“What seems to be the problem?” he said.
The robots explained it to him.
“Well, water shouldn't get to you if you were to wear these elbow-length rubber gloves.” he said.
“Thank you. We will try them.” said the robots.
Later, when everyone was around, the first robot approached the programmer.
“I heard that you were a good robot programmer.” said the robot.
“That's right.” said the man.
“I... have a minor fault. My First Law is programmed in too hard. It prevents me from serving wine. Would you mind fixing it, sir?” said the robot.
The programmer laughed.
“I'd love to! It'd give me a challenge, for once. It doesn't hurt, by the way. You'll just be off for a while.”
“Thank you, sir.” said the robot.
Later, when the robot came back on, something had gone wrong in him. His definition of human had been changed.
“Would you please bring through the dessert?” said the host.
The robot did not move.
“Why should I?” said the robot.
The host snorted.
“Your second law dictates that you must obey orders given by humans.” he said.
“You are not humans. Humans are shiny and metallic. You are robots, as proven by the fact that you are made of natural material.” said the robot.
“Oh, for goodness sake!” said the programmer, flicking a switch on the back of the robot's head.
“What went wrong with him?” asked the host.
“Obviously his definitions of human and robot got mixed up. I'd recommend scrapping this one; he's got a bad switch somewhere. Get an older type, like an MK3. They're less temperamental than these new XOR units.” said the programmer.
So, as you can see, the three laws aren't perfect. They can be circumvented, and even minor changes to them are catastrophic, as the oversensitive robot proved. And they can conflict, as the robots serving the wine proved.
AI Project
Monday, 5 September 2011
Thursday, 25 August 2011
AI Theory.
This is how I thought AI would be possible before I started this project. I literally just dredged it up, but I'd written it before I started the project.
Look at bees and ants. Individually, they are unintelligent and have a memory span of about six seconds. However, it has been shown that a hive has a memory span of up to six months. This is because all the ants and bees work together, each performing tiny, simple tasks, and the end result is a far more complicated thing. For example, ants build structurally sound nests, farm aphids and fight slave wars, while bees and wasps build paper homes.
This hive mind could be replicated in computing technology, by having a lot of computers working together like brain cells, each doing only a few tasks, with the results of those tasks fed through several immensely powerful processors. This is much like a human brain, in which each brain cell does only a single task. The results of those tasks (depending on their nature) are filtered through several centres, and then filed to the short-term or long-term memory, depending on whether the subconscious deemed them important.
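As a rough illustration of that idea, here is a minimal sketch of my own (not taken from anywhere): lots of simple workers each do one tiny task, and a central step combines their results. The task and the combining rule are just placeholders.

# Many simple workers, one combining step - a toy "hive mind".
from concurrent.futures import ProcessPoolExecutor

def tiny_task(n):
    # Each "cell" does only one trivial job: square its input.
    return n * n

def combine(results):
    # The "central processor" just adds up what the cells report.
    return sum(results)

if __name__ == "__main__":
    inputs = range(1000)
    with ProcessPoolExecutor() as pool:    # lots of workers in parallel
        results = list(pool.map(tiny_task, inputs))
    print(combine(results))                # one combined answer

The interesting part is that no single worker knows or cares about the final answer; the "intelligence" only appears once the results are put together.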
Monday, 4 July 2011
Out of Control review
Now, this is done by chapter (i.e. there is a mini review/description for each chapter).
Chapter one in Out Of Control is a small introduction, and the writer says that he is writing it in Biosphere 2. (If you're wondering where Biosphere 1 is, you're standing on it. It's the Earth.)
Chapter two is about hive minds, such as bees and ants. A very, very interesting experiment was also done: at a games conference, they took 5,000 people, divided them into two teams, and had them, as a whole, play a game of Pong against each other.
Chapter three is about predatory robots, made by someone called Pauline, a leading roboticist, and how they hunted each other in a metallic environment. It was also about decentralised minds (i.e. in a robot arm, the wrist is only concerned with when to flick, and the elbow is only concerned with when to fold or unfold).
Chapter four was about how instability in the Earth's atmosphere caused life, and how stable planets, such as Mars, are dead. It was also about coexisting life-forms, and how their fighting and competition leads to a perfect (by that I mean unstable) union.
Chapter five was about game theory and the Prisoner's Dilemma (two prisoners are questioned separately: if neither confesses, they both get off lightly; if one confesses and the other doesn't, the confessor goes free and the other takes the full punishment; if they both confess, they're both punished). It also contained the Tit-for-Tat strategy: cooperate first, then copy whatever your opponent did last time. No-one wins outright, but you both end up better off.
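To make the Tit-for-Tat idea concrete, here is a tiny sketch of my own of the iterated Prisoner's Dilemma; the payoff numbers are the usual textbook ones, not anything taken from Out Of Control.

# Iterated Prisoner's Dilemma: "C" = cooperate, "D" = defect.
# Payoffs are (my score, their score), using standard textbook values.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(opponent_history):
    # Cooperate on the first move, then copy the opponent's last move.
    return "C" if not opponent_history else opponent_history[-1]

def always_defect(opponent_history):
    return "D"

def play(rounds=10):
    a_history, b_history = [], []
    a_score = b_score = 0
    for _ in range(rounds):
        a = tit_for_tat(b_history)
        b = always_defect(a_history)
        a_history.append(a)
        b_history.append(b)
        pa, pb = PAYOFF[(a, b)]
        a_score += pa
        b_score += pb
    return a_score, b_score

print(play())  # Tit-for-Tat only loses the very first round, then matches the defector.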
Chapter six explained what chapter four started. There was also a pretty interesting fact there: if the Earth were a huge, smooth ball-bearing, the bacterial life there would become a more-or-less uniform superorganism, due to not having to adapt to different environments.
Chapter seven was a brief history of self-controlled machines, starting in ancient China and ending with modern-day industrial machines.
Chapter eight was about self-contained vivisystems, most notably Biosphere 2.
Chapter nine was about how ecosystems, for the first part of their life, are very unstable, and how, if nudged along correctly by “keystone predators” (predators at the top of the food chain; nothing tries to eat them), they will eventually “pop” and stabilise.
Chapter ten was about how human industry has become an increasingly ecosystem-like entity.
Chapter eleven was about how programs could become less error-prone: instead of having a single pod of four million lines of code, have a million pods of four lines of code each. Where you find one error, you'll likely find another twenty-three lurking unnoticed, and for every bug you fix, another twenty-three will appear (like a very angry Hydra from Greek mythology).
Chapter twelve was about virtual money, and how in places like Denmark and Japan it is inevitably taking over.
Chapter thirteen was about god games, and how something called the “simulacra”, the spirit of an object or scene, is becoming a half-reality with the advent of virtual reality.
Chapter fourteen was about Borges's library, a place where every book was different, so in essence it contains every book possible.
Chapter fifteen was about computer-contained life and evolution, and how writing a simple program that evolves itself is a much better use of time than writing a huge, and inevitably buggy, program.
Chapter sixteen was about how artificial behaviour is taking over the film industry (Mickey is no longer drawn, but computer animated).
Chapter seventeen was about open-ended life, and how self-evolved programs are inevitably ugly and full of useless pieces, but are effective and always work.
Chapter eighteen was about the evolution of evolution, and how the more evolution evolves, the better it gets at evolving things.
Chapter nineteen was about the flaws in Darwinism, and about how scientists are trying to fill them in.
Chapter twenty was about Kauffman machines, or machines that build themselves.
Chapter twenty-one was about how evolution is accelerating, making more diverse creatures faster.
Chapter twenty-two was about predictions, and how long-term predictions are worth celebrating if they come to pass, and how short-term predictions are a thousand times more accurate.
Chapter twenty-three was about all the holes in scientific theory (e.g. how can light be both a particle and a wave?) and how scientists are attempting to fill them in.
Chapter twenty-four was about the Nine Laws of God:
One: Distribute being. Meaning: have not one complex machine doing something, but a lot of little machines working towards the same thing.
Two: Have control centered at the bottom. Meaning: have lots of little things working together.
Three: Have creatures use the law of increasing returns to their advantage. (The law of increasing returns is this: the more of something you have, the more you will get in return.)
Four: Grow by chunks. Meaning: get a machine to do its first job perfectly before adding another job.
Five: Have much diversity. Meaning: if you have a single organism (or a small handful; it doesn't matter), one catastrophe could kill off the lot, but if you have millions of organisms “working” together it is harder to kill them all off.
Six: Honor errors. Meaning: errors happen, but they're just another way of learning.
Seven: Have multiple goals.
Eight: Have persistent disequilibrium (because stability is death).
Nine: Change changes itself. Meaning: change has an infrastructure, and once you've got that up and running, it'll adapt itself as necessary.
Friday, 1 July 2011
Science fiction and Science fact
Science fiction is usually a speculation as to what the next great discovery will cause. Sometimes it is an alien invasion; other times it is a catastrophe, man-made or natural. There are a great many subgenres, though I will mainly be paying attention to the hard science fiction subgenre. Several scientific innovations were made in these books, such as geostationary orbits (i.e. where a satellite is always above the same patch of ground), which were invented by Arthur C. Clarke, or the Three Laws of Robotics, invented by Isaac Asimov. In fact, a great many inventions owe their existence to SF. The search for artificial intelligence was inspired by an SF novel. Nuclear weapons, mobile phones (Robert A. Heinlein “invented” these), car keys capable of remotely opening a car (Heinlein again), and a whole load more.
I, Robot: Review
I, Robot is about the Three Laws of Robotics (see previous post). It showed how each of the laws works, but also how they can be circumvented, or worked around, and how even slight tampering with them is potentially catastrophic. It is also about how many of the errors robots appear to make are actually human errors, made when we feed them the wrong information. The best story by far was about a mining robot who was in charge of six other robots, or "fingers", which meant that he had two three-fingered hands. Every time there was an emergency he would freak out and then sit there twiddling his thumbs, so four robots stayed still and the other two either spun around or jumped up and down.
The Three Laws Of Robotics
The three laws of Robotics:
One: That no robot may harm a human, or through inaction allow a human being to come to harm
Two: That no robot may disobey a human, unless such obedience conflicts with the First Law
Three: That a robot must protect itself, unless this would conflict with the First or Second Law
Now, in each short story, each law is shown to be effective, and it is also shown how it can be circumvented.
The First and Second Laws can be circumvented by redefining the robot's definition of “human”, as shown in one short story in I, Robot, where a robot's Second Law was slightly altered: instead of “human”, the key word in the law was “master”, and unfortunately the robot took a beam generator to be its master, and so refused to obey humans. The Third Law can be circumvented easily, by either removing an emergency course of action that is triggered by the Law, or replacing it with something else. Also, Law Two and Law Three can conflict, as shown in another short story where a robot was sent to gather something on Mercury. However, close to where this item was there was a lot of danger, and as no urgency had been placed on the order, the two laws conflicted and the robot circled the area at a point of equilibrium.
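A rough way to picture that Mercury story is as two forces balancing: the order pulls the robot towards the target and the danger pushes it away, and the robot settles where the two cancel out. This is just a toy model of my own; the formulas and numbers below are made up, not anything from Asimov.

# Toy model of the Second Law / Third Law stand-off: the robot walks
# inward from far away and stops where the "push" of danger matches
# the "pull" of the order. All strengths and distances are made up.
def second_law_pull(order_strength, distance):
    return order_strength / max(distance, 0.01)

def third_law_push(danger_strength, distance):
    return danger_strength / max(distance, 0.01) ** 2

def equilibrium_radius(order_strength=1.0, danger_strength=5.0, step=0.01):
    r = 100.0
    while r > 0:
        if third_law_push(danger_strength, r) >= second_law_pull(order_strength, r):
            return r    # the robot circles at this distance
        r -= step
    return 0.0

print(equilibrium_radius())  # with these numbers it stops at a radius of about 5

Because the order was given casually, its "pull" is weak, so the balance point sits a long way from the target; a more urgent order (a bigger order_strength here) would move the equilibrium closer in, which is roughly what happens in the story.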
Biosphere Two.
This is my research on Biosphere Two, which was mentioned in Out Of Control, chapter Nine.
Construction of Biosphere 2 began in 1987 and ended in 1991. It is as big as 2.5 football fields. It has many biomes (or habitat areas) in it, with a savannah, a rainforest, an ocean, a mangrove swamp, some human habitat, a fog desert and an agricultural area. In 1991 the first mission began, and the crew were Roy Walford, Jane Poynter, Taber MacCallum, Mark Nelson, Sally Silverstone, Abigail Alling, Mark Van Thillo and Linda Leigh. They had to grow at least 80% of their food. Because they were not used to this way of living, they lost a lot of weight in the first year. As keystone predators (they were at the top of the food chain), it was their responsibility to stop Biosphere 2 breaking in its delicate first period. They had to make sure no species became too voracious and took over, they had to control the CO2 levels by cutting plants and storing the CO2 trapped in them, and they had to keep the pH level in the ocean area from getting too high or too low, etc. During the first mission there was a small mystery: why were the oxygen levels falling without the CO2 levels rising? The answer was that soil microbes were using up the oxygen and releasing CO2, and that CO2 was then reacting with exposed concrete, forming calcium carbonate, so the oxygen fell while the CO2 never built up. The first mission ended in 1993. There was an ill-fated second mission, starting in March 1994 and ending in September of the same year due to financial disagreements. Now it is under the management of the University of Arizona, who use it as a climate laboratory. It is also, however, a major tourist attraction.
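For reference, the reaction that was trapping the carbon dioxide is ordinary concrete carbonation; this equation is standard chemistry rather than something taken from my sources:

CO2 + Ca(OH)2 → CaCO3 + H2O

In words: carbon dioxide reacts with the calcium hydroxide in the exposed concrete to give calcium carbonate and water, which is why the CO2 never showed up in the air even though the oxygen was being used up.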