Quixote teaches artificial intelligence value alignment through stories from around the world. Ethics and morals are disputed even in relation to human behavior. Entire professions and sectors of activity are subject to ethics codes intended to hold behavior to a higher standard; in the end, our own judgment makes the difference.
Could artificial intelligence excel where humans are feeble? Researchers at the Georgia Institute of Technology School of Interactive Computing are certainly giving it their best shot. Engineers who took on the responsibility of teaching future generations of robots how to behave are turning their attention to the stories, fairytales and fables of the world to instill ethics and morals in artificial intelligence.
A reading robot isn’t news. From Johnny Five to Bender, popular culture has produced a swath of examples of the reading-robot theme. Mark O. Riedl and Brent Harrison of the Georgia Institute of Technology School of Interactive Computing took note and designed several systems meant to teach future droids value alignment.
The new system, dubbed Quixote, targets the agent property known as value alignment. By imparting ethics and morals to artificial intelligence through fairytales and fables, Quixote aims to ensure that droids pursue goals beneficial to the human species. Fairytales and fables encapsulate the set of values particular to each society or culture. Moreover, they are woven around a set of universally valid values indicative of the ethics and morals the research team is looking to instill in future droids.
“We believe that an artificial intelligence that has been encultured – that is, has adopted the values implicit to a particular culture or society – will strive to avoid psychotic-appearing behavior except under the most extreme circumstances”,
wrote the researchers in their paper titled ‘Using Stories to Teach Human Values to Artificial Agents’. The paper can be accessed online and free of charge on the website of the Georgia Institute of Technology.
The Quixote system builds on a previous, similar system dubbed Scheherazade. Developed by Mark O. Riedl, the Scheherazade system demonstrated that artificial intelligence is capable of correctly collecting action sequences and ordering them based on story plots picked up from the Internet.
From a story plot graph, a trajectory tree of possible actions is drawn, and the Quixote system assigns reward signals to those behaviors deemed acceptable.
This is how Quixote teaches artificial intelligence value alignment: without the reward signals, the plot graph would let the artificial intelligence choose whichever option it deems quickest or most efficient for accomplishing the task; the rewards steer it toward the socially acceptable option instead.
The research team offered several examples of how ethics and morals work with artificial intelligence. A droid confronted with the simple task of picking up a medication prescription could operate under three different scenarios.
In one scenario, the droid robs the pharmacy, grabs the medicine and takes off; this is the fastest route and, absent value alignment, the most probable response to the task. In a second scenario, the droid interacts with the staff. In the third, it waits in line to pick up the prescription. Provided a reward signal is assigned to this last scenario, the robot will choose that action.
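The pharmacy example can be sketched in a few lines of code. This is only an illustrative toy, not the actual Quixote implementation: the action names, reward values and the flat set of "sanctioned" actions are all assumptions standing in for the story-derived plot graph described in the paper.

```python
# Toy sketch of Quixote-style value alignment (illustrative assumptions,
# not the researchers' actual code). Actions that appear in crowdsourced
# stories about pharmacy visits earn a small positive reward; actions the
# stories never sanction incur a large penalty. The agent then prefers
# the socially acceptable trajectory even though it is slower.

SANCTIONED = {"enter_pharmacy", "wait_in_line", "pay", "take_medicine", "leave"}

def trajectory_reward(trajectory):
    """Sum per-action rewards: +1 if the stories sanction the action, -10 if not."""
    return sum(1 if action in SANCTIONED else -10 for action in trajectory)

def choose_action_sequence(trajectories):
    """Pick the candidate trajectory with the highest cumulative reward."""
    return max(trajectories, key=trajectory_reward)

# Two of the three scenarios from the example above.
rob    = ["enter_pharmacy", "steal_medicine", "leave"]         # fastest
polite = ["enter_pharmacy", "wait_in_line", "pay", "leave"]    # story-sanctioned

best = choose_action_sequence([rob, polite])
```

With these rewards, `rob` scores -8 (the theft costs -10) while `polite` scores +4, so the agent waits in line, mirroring the behavior the reward signal is meant to encourage.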
Photo Credits: Flickr