A lot of the news featuring barbaric acts against women and children seems to come from Islamic countries. Does this mean that there is something in the Islamic faith that promotes barbaric acts?