Friday, September 9, 2016

I put together a script, my-tweets.py, that fetches my latest tweets and outputs them as markdown. I intend to make it a cron job that appends the day’s tweets to the end of each day’s notes.
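The markdown step of such a script can be sketched roughly like this. This is a minimal illustration, not the actual my-tweets.py: the fetching itself would go through a Twitter client library (e.g. tweepy), and the function name and tweet fields below are illustrative assumptions.

```python
# Sketch of the tweets-to-markdown step of a script like my-tweets.py.
# The tweet dicts stand in for whatever the Twitter client library
# returns; the field names here are illustrative assumptions.

def tweets_to_markdown(tweets):
    """Render a list of tweets as a markdown bullet list."""
    lines = ["## Tweets", ""]
    for t in tweets:
        lines.append("* %s: %s" % (t["created_at"], t["text"]))
    return "\n".join(lines)

if __name__ == "__main__":
    # Print to stdout so a cron job can redirect into the notes file.
    sample = [{"created_at": "2016-09-09 10:12", "text": "Testing my-tweets.py"}]
    print(tweets_to_markdown(sample))
```

A cron entry could then append the output to the day’s notes, something like `55 23 * * * python my-tweets.py >> ~/notes/$(date +\%F).md` (path and schedule are placeholders; note that `%` must be escaped in crontab).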

* * *

I was thinking: AI works in a way similar to the human brain, which is less “deterministic” and more “probabilistic”. So, how are we going to determine whether an AI is able to drive a car, fly a plane, or give legal advice? I expect this will be done in a way similar to how humans get certified in such cases: diplomas, certifications, university degrees. Why not? If we are to trust our lives to an AI “driver”, it should pass a test, right? In more demanding cases, this “certification” may well be tightly integrated with the learning process: an AI would hold a “college” degree because it went through a “certified” learning process that included many tests and exams. Of course, an AI could probably complete such a degree in a much shorter period.

* * *

“Mujahideen forces caused serious casualties to the Soviet forces, and made the war very costly for the Soviet Union. In 1989 the Soviet Union withdrew its forces from Afghanistan. Many districts and cities then fell to the mujahideen; in 1992 the DRA’s last president, Mohammad Najibullah, was overthrown.

However, the mujahideen did not establish a united government, and many of the larger mujahideen groups began to fight each other over power in Kabul. After several years of devastating fighting, a village mullah named Mohammed Omar organized a new armed movement with the backing of Pakistan. This movement became known as the Taliban (“students” in Pashto), referring to how most Taliban had grown up in refugee camps in Pakistan during the 1980s and were taught in the Saudi-backed Wahhabi madrassas, religious schools known for teaching an orthodox interpretation of Islam. Veteran mujahideen confronted this radical splinter group in 1996.”