Results of the 2019 Man vs Machine Challenge

Every year, the Angry Birds AI Competition is followed by a Man vs Machine Challenge, where humans can challenge the four best AI agents. The purpose of this challenge is to see whether we have reached the goal of the AIBIRDS competition: building AI agents that outperform the best human players on new Angry Birds levels. A recent survey of AI researchers found that AIBIRDS was considered the next AI milestone where AI would beat humans, so the pressure was on for the agents to perform well.

This year, there was a noticeable jump in the performance of the participating AI agents. Eagle's Wings, the winner of the past two years, was already eliminated in the quarter-finals. In particular, this year's finalists BamBirds and SimbaDD showed a very convincing performance with some remarkable shots. So we had high hopes that AI would finally be able to beat human players, or at least come close to human performance. This is a tough challenge: the IJCAI participants who normally make up the majority of challenge entrants are by now so experienced at playing Angry Birds that they are probably among the best Angry Birds players in the world.

We had four laptops set up in a prominent spot in Hall A and attracted lots of interest. For the Challenge, we designed a set of four new and very hard levels, and both humans and AI agents had only 10 minutes to solve them. Humans played first. Crowds gathered the whole time, with many people queueing to participate, and play continued for around five hours. The levels we designed required a mix of precision and strategy to be solved, and only around 20 players managed to solve all four, which is a very good achievement. Quite a few top players from past competitions were there again, and the leaderboard changed frequently. Since every player is allowed only one attempt at the levels, to ensure fairness and comparability, we also had to remove some overly keen participants who played twice.

After all humans had played, Nathan Sturtevant from the University of Alberta was in the lead with 228,270 points, Da Sun Handason Tam from the Chinese University of Hong Kong was second with 226,370 points, and Quan Guo from Tulane University was third with 220,330 points.

Now it was up to the AI agents to show how good they are and whether they can already beat the best humans. The four semi-finalists from the day before qualified for the Challenge: the winner BamBirds, the second-placed SimbaDD, the third-placed Angry-HEX, and Orpheus. Even though the agents played very well in this year's AI competition, something surprising happened during the Man vs Machine Challenge. It was as if the agents were nervous about playing in front of the large crowd of spectators. BamBirds' performance was symptomatic. BamBirds started with the most difficult of the four levels and played the best shot of the whole competition. It was incredible and made it possible to solve the level with only three birds; everyone else needed four. Only two more simple, direct shots were required to finish the level, shots that even a beginner would have been able to make. But BamBirds failed: it fired its remaining birds at imaginary pigs and didn't solve the level. We couldn't believe it.

Overall, the agent performance was a disaster. Only Orpheus managed to solve a level; the other three agents solved none of the four. The bar was too high for AI: the levels were too difficult and 10 minutes were not enough. What a disappointment after an amazing competition.

Therefore, the winner of our 2019 AIBIRDS Man vs Machine Challenge is Nathan Sturtevant from the University of Alberta. Congratulations, Nathan!

Humans still beat AI at Angry Birds, and it seems AI is not getting any closer. AI still has a long way to go to master this very hard problem, which is much closer to real-world problems than seemingly difficult games like Chess or Go. Let's see how well we can do in 2020.