BEIJING, February 10 (TMTPOST) — Japanese multinational giant Sony announced on Thursday that its AI player had defeated human players in the car racing game Gran Turismo Sport.
The research was published in the February 10 issue of Nature.
The AI player, named GT Sophy, is a program based on a neural network, according to Sony. It was able to follow the rules of Gran Turismo Sport while exhibiting fine control over the vehicle and sophisticated racing strategy. Four of the best human Gran Turismo Sport players first competed against GT Sophy in July last year. The AI eventually outraced its human competitors in another round of competition in October last year.
GT Sophy was developed by Sony's AI unit and Sony Interactive Entertainment. The AI was trained inside Gran Turismo Sport, a PlayStation game and driving simulator. In the game, players race against one another in highly realistic settings while managing complex vehicle control, racing lines and speed, which often requires snap judgments.
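To illustrate the general idea of training an agent inside a driving simulator, the toy sketch below shows a reinforcement-learning loop of the kind the Nature paper describes at a much larger scale: the agent acts, the simulator returns a reward for making progress while staying on the racing line, and a small neural policy is nudged toward higher-reward behavior. The ToyTrackEnv environment, the reward shaping, and the REINFORCE-style update here are hypothetical simplifications for illustration, not Sony's actual simulator interface or algorithm.

```python
import numpy as np

# Hypothetical, highly simplified stand-in for a racing simulator:
# the "state" is a few track/vehicle features, the "action" is a
# discrete steering choice, and the reward encourages progress
# while penalizing drifting off the racing line.
class ToyTrackEnv:
    def __init__(self, seed=0):
        self.rng = np.random.default_rng(seed)

    def reset(self):
        self.pos = 0.0          # lateral offset from the racing line
        self.progress = 0.0     # distance covered along the track
        return self._obs()

    def _obs(self):
        return np.array([self.pos, self.progress / 100.0])

    def step(self, action):
        # action: 0 = steer left, 1 = hold, 2 = steer right
        self.pos += (action - 1) * 0.1 + self.rng.normal(0, 0.02)
        self.progress += 1.0
        reward = 1.0 - abs(self.pos)
        done = abs(self.pos) > 1.0 or self.progress >= 100
        return self._obs(), reward, done


def softmax(x):
    z = x - x.max()
    e = np.exp(z)
    return e / e.sum()


# A tiny linear policy trained with a REINFORCE-style update:
# run an episode, then move parameters toward actions that
# preceded high cumulative reward.
class Policy:
    def __init__(self, obs_dim=2, n_actions=3, lr=0.01, seed=0):
        self.W = np.zeros((n_actions, obs_dim))
        self.lr = lr
        self.rng = np.random.default_rng(seed)

    def act(self, obs):
        probs = softmax(self.W @ obs)
        return self.rng.choice(len(probs), p=probs), probs

    def update(self, trajectory, gamma=0.99):
        # discounted return followed by the policy-gradient step
        G = 0.0
        for obs, action, probs, reward in reversed(trajectory):
            G = reward + gamma * G
            grad = -np.outer(probs, obs)
            grad[action] += obs
            self.W += self.lr * G * grad


env, policy = ToyTrackEnv(), Policy()
for episode in range(200):
    obs, done, trajectory, total = env.reset(), False, [], 0.0
    while not done:
        action, probs = policy.act(obs)
        next_obs, reward, done = env.step(action)
        trajectory.append((obs, action, probs, reward))
        total += reward
        obs = next_obs
    policy.update(trajectory)
print("final-episode return:", round(total, 1))
```

The real system, per the published research, relied on a far more sophisticated deep reinforcement-learning setup and the full physics of the Gran Turismo Sport simulator; the sketch above only conveys the interaction loop.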
During the competition, GT Sophy followed strict racing rules. The game's conventional AI programs simply follow fixed driving routes and sometimes crash into other cars. GT Sophy, on the other hand, avoided collisions with other cars in the spirit of fair play, even slowing down to avoid hitting other players.
Sony Group CEO Kenichiro Yoshida said that AI not only provides better competition in games but also brings new opportunities in autonomous driving.
“For Sony AI, Gran Turismo Sophy is not just a scientific AI challenge. It is also an important step in unleashing human imagination and creativity with AI,” Sony AI’s COO Michael Spranger said.
GT Sophy could also help drive the development of autonomous vehicles, said Chris Gerdes, a professor of mechanical engineering at Stanford University who wrote an accompanying commentary on the research in Nature.