HAL9000

July 23, 2015

Feels so good to see Android compiles of top engines updated to recent versions, like Cheng in today's case. Thanks to efficient SMP, Cheng is in my Top-20. I hope the new version will add a few more Elo points.

Let me remind you that Cheng was also one of the engines selected to play in TCEC-6 and TCEC-7!

Although I haven't tested it yet, I don't expect any issues, as the previous version used to run with zero errors. Therefore, the new Cheng will probably be included in the August release of Rapidroid. My humble guess: around 2900 Elo.

I had always experienced issues with previous compiles of ExChess. In fact, it's a great engine developed by Dan Homan, and it took part in TCEC events, but it was unlucky under Android.

I hope that with this brand new version compiled by Jim Ablett, a new engine can be introduced and show up in Rapidroid.

According to the CCRL ranking, the PC version of ExChess running 64-bit on 4 cores reaches 2880 Elo. Bearing in mind that the gap between PC and Android widens when XB is used, my humble guess for ExChess can't be higher than 2600. Tests will tell the truth in any case, unless the compile causes compatibility problems.

Please remember that JA wants all three files (engine, book, search.par) installed in the same folder on your device to get it running.
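For readers setting this up by hand, here is a minimal sketch of that layout check. The folder path and the engine/book file names are my assumptions for illustration (only search.par is named in JA's note); substitute the actual file names from his archive.

```shell
#!/bin/sh
# Sketch: put the three ExChess files side by side in one folder and
# verify they are all there before pointing the GUI at the engine.
ENGINE_DIR="./exchess"   # hypothetical install folder on the device

mkdir -p "$ENGINE_DIR"

# Placeholders standing in for the real files copied from JA's archive;
# "exchess" and "book.bin" are assumed names, "search.par" is from the post.
touch "$ENGINE_DIR/exchess" "$ENGINE_DIR/book.bin" "$ENGINE_DIR/search.par"

# Sanity check: the engine will misbehave if any of the three is missing.
missing=0
for f in exchess book.bin search.par; do
  [ -e "$ENGINE_DIR/$f" ] || { echo "missing: $f"; missing=1; }
done
[ "$missing" -eq 0 ] && echo "ok: all three files in $ENGINE_DIR"
```

The same check works for any engine that expects companion files next to its binary; the key point is that book and parameter files are located relative to the engine's folder.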

July 16, 2015

Sedat Canbaz is recognized by the community as an everlasting computer chess addict. For longer than a decade, he has contributed countless tests, rating lists and, especially, opening books specifically tailored for engine-to-engine games. It's great work to appreciate.

As per the permission given in his credits, I wanted to share his latest book, PERFECT 2015, via my repository at box.com to provide a download alternative, given that upload servers do not always keep files forever and downloading from them is slow, with popups, captchas, etc.

The 7z compressed file is only 2.42 MB and contains various versions of the book compatible with the Chessbase, Arena and Aquarium GUIs. A tutorial with comments is also included.

Perfect-2015 is currently used as the default tournament book by various sites dedicated to engine testing, such as Chess Engines Diary.

July 11, 2015

Now that I'm done with the multiversion lizard-o-fishy battle, it wouldn't be a waste of time to come back to the previous dilemma about the different Stockfish 6 compiles.

Just a couple of weeks after TCEC-7, Stockfish 6 was released step by step: first the release candidates, with the definitive version following soon after.

Although the official Android compile didn't take long to arrive, alternatives rained in from different sources to complete the chaotic picture:

* RC6 arm5 by Jim Ablett

* 6.0 arm5 by Jim Ablett

* 6.0 arm5 bundled with Droidfish 1.57

* 6.0 arm7 bundled with Droidfish 1.57

* 6.0 arm8 bundled with Droidfish 1.57 for 64-bit Android Lollipop

* 6.0 original arm7 by stockfishchess.org

Which is the strongest of these? Good question. It's almost certain that arm7 is better than arm5 under the same conditions, unless there is a big difference between compilers.

Basically, Jim Ablett's compiles performed very well in the past, especially in the Senpai 1.0 case, but there is little chance his compiles will overcome Peter Österlund's.

What I really wonder about is the difference between the Droidfish compile and the original one. As I can't live my life without knowing, I'm starting a new parallel experiment between all the compiles above, with the exception of the arm v8 version. I don't have Lollipop, and I will probably never go for it, because almost all existing compiles will refuse to run under Chess for Android. As mentioned before, no tourneys with Lollipop yet!

Let me remind you that the Stockfish compile running in Rapidroid is the one released by stockfishchess.org, made with MinGW under Linux.

After 20 rounds of 110 games, I've finally concluded that it's been enough to clarify the verdict. Stockomodos, the temporary parallel experiment, showed that Stockfish remains superior to Komodo in the Android environment when various versions of the two top engines face each other. This didn't change from beginning to end: Stockfish 6 led the ranking in every round.

Good, but in any case, head-to-head competitions are just limited experiments. They are not representative enough to simulate the real world. I could easily claim the fish proved its dominance, but that would be too speculative, given that Komodo is ranked extremely close to Stockfish in Rapidroid after 150+ games played against various other engines.

Anyway, engines are meant to play against any opponent, not just their closest rival.

July 5, 2015

The ICGA (International Computer Games Association) is not the FIDE of computer chess. At least, it hasn't been recognized as such for a while now by a large community of chess addicts.

Let's say the open source era, with hundreds of free engines updated zillions of times a year, has become uncontrollable by such a hierarchical organisation. It's clear and obvious that programmers no longer like being directed by, or depending on, an authority. If you ask me about human players, I wouldn't reply differently, though. After all, this isn't a post about FIDE.

For many years, the WCCC, held by the ICGA in various cities around the world, was the only reputable event where top chess programs competed. Folks of a certain age will surely remember those Mephisto, Shredder, Junior and Fritz years.

Today, information flows much quicker through countless channels, thanks to unlimited communication tools, live tournament broadcasting, automated games leading to thousands of tests, and so on. Indeed, it's an environment where people don't meet each other in person: less warm, less emotional, maybe too mechanical.

I can never deny the precision brought by TCEC, which has earned its unofficial world championship status, but the human tournament ambiance, with real chess pieces moved by human hands, still remains endearing to me. I still enjoy following the WCCC.

After Yokohama, Japan in 2013, this year the WCCC moved to Leiden, Netherlands, one of the major chess cities. Well, it was only a single round robin, with eight games played by each of the nine participants, all running on different hardware. Consequently, there can be no scientific basis for claiming that the winner of such a tournament is the strongest of all available programs.

But who cares? It's not a testing lab, just a tourney. The games were fun enough, and Jonny won the tournament with 2400 cores running in parallel. That's all there is to tell.

These days, evaluation and search algorithms are so sophisticated that they often count for more than processing power. That's why it's surprising to me that extreme hardware can still overcome programming. Jonny deserves congratulations here, especially for defeating the mighty Komodo. That game looked like a true turning point, the one that kept Komodo from winning the tournament. Once again, we have witnessed that surprises increase popularity. Frankly, who would enjoy watching games that end with the expected results?