Results of the 2015 Man vs Machine Challenge

We set up six computers where people could play. Humans played first, from 10am to 5.50pm. 

Every participant had to solve 4 Angry Birds levels in 10 minutes. Human players were surprisingly good this year. Many players solved all four levels, and it was obvious from the way they played that they were very experienced and playing to win. It seems one thing our competition has achieved over the years is that conference participants are becoming expert Angry Birds players :) 

At 6.00pm the four best AI agents from the 2015 Angry Birds AI Competition were started. Each of them had 10 minutes to solve the same four levels. We first ran the third- and fourth-ranked agents, but they didn't perform well: IHSEV solved only two levels (121360), and Tori didn't solve any. Finally we ran the two best agents, DataLab Birds (225190) and AngryHex (180910), which performed better but still solved only three of the four levels. AngryHex almost solved the most difficult level and started with an amazing shot, but then couldn't see the last pig sitting unprotected in the grass. The problem was that it used its own computer vision, which couldn't distinguish the green pig from the green grass. Interestingly, some humans had the same problem earlier. Therefore:

The winner of the 2015 Man vs Machine Challenge is A HUMAN!!!!

Josef Bajada from King's College London had the highest score and won the 2015 Man vs Machine Challenge with a score of 341460. Nathan Sturtevant came second (337890), Jingwei Xu third (335820), Fabio Rafael Gallo fourth (335580) and Santiago Ontanon fifth (334520). Each of them was in the lead right after they played and had hopes of winning. Josef only took the lead a few minutes before the end of the challenge, so it was exciting right up to the end. Well done humans!

It is good to see that we are still better at playing Angry Birds than AI and that Angry Birds remains a very challenging problem for Artificial Intelligence. 


Update (August 31, 2015): Man vs Machine Challenge at ANU Open Day 2015

Attendees of the ANU Open Day had the opportunity to challenge 12 of the best AI agents on the same 4 levels as in the official Man vs Machine Challenge on July 31. Again, a human player won, and a total of 4 humans performed better than TORI, the best-performing agent of the day. But what is most remarkable is that TORI, which didn't score a single point in July and was obviously "jammed", solved all 4 levels and obtained a score of 319990 points! That is only about 21000 points below the best human score so far. Overall, TORI is now among the best 1/8th of all human players on these 4 levels. In 2014, the best agent was only among the top 1/3rd of human players, so this is an amazing improvement. 


A big THANK YOU to all the participants. We hope to see you again in 2016 for an even more exciting Man vs Machine Challenge.

Humans: keep practising!

AI researchers, developers, enthusiasts, hobbyists, students, and anyone else who would like to contribute: you can download our basic game-playing software and start adding your own Angry Birds strategies to it, or put some AI techniques to the test. Let's see how good we can be in 2016!


Jochen Renz, XiaoYu (Gary) Ge, Peng Zhang and Stephen Gould

Australian National University