Isaac

Matt, I had to stop at #2. “Compassion” from a robot? If accidental anthropomorphism, it’s forgivable. If deliberate attribution of sentiment, it’s impossible. Maybe, ‘would I expect to be “dealt with” more or less severely’ – or something. AI is a huge issue for me… If, alternately, milbots’ behavior could regularly be discussed in terms of CI, it would emphasize the fact that they (however amazingly capable) only ever have syntax and never semantics. This supports the Boyd assertion regarding war, machines, and minds posted at the DNI site, which I believe is a far broader statement than its succinct phrasing suggests. Further, such technofetishism is a departure from the need for a sea change in approach, doctrine, training, and tactics in our armed forces – something you regularly do point out. I apologize for the Dennis Miller action, but, as a student of Dreyfus, it’s one of those things that pushes a button (lol) in me somehow…

Dan, no – for a larger project unrelated to school.

Isaac,
The points you raise were actually instrumental in my designing the survey, and they figured into the report I’m presenting next week. I was tempted to put “compassion” in quotes, but I wanted the respondent to come up with his/her own notion of compassion. I’ll say more at the end of next week, but suffice it to say for now that the word choice was intentional.