Wednesday, August 30, 2006

The Strategic Logic of Suicide Terrorism "...there have been 188 separate suicide terrorist attacks between 1980 and 2001. Of these, 179, or 95%, were parts of organized, coherent campaigns, while only nine were isolated or random events. Seven separate disputes have led to suicide terrorist campaigns: the presence of American and French forces in Lebanon, Israeli occupation of West Bank and Gaza, the independence of the Tamil regions of Sri Lanka, the independence of the Kurdish region of Turkey, Russian occupation of Chechnya, Indian occupation of Kashmir, and the presence of American forces on the Saudi Arabian Peninsula."

"The bottom line is that the ferocious escalation of the pace of suicide terrorism that we have witnessed in the past several years cannot be considered irrational or even surprising. Rather, it is simply the result of the lesson that terrorists have quite reasonably learned from their experience of the previous two decades: Suicide terrorism pays."

"Perhaps most important, the close association between foreign military occupations and the growth of suicide terrorist movements in the occupied regions should give pause to those who favor solutions that involve conquering countries in order to transform their political systems. Conquering countries may disrupt terrorist operations in the short term, but it is important to recognize that occupation of more countries may well increase the number of terrorists coming at us."

Tuesday, August 29, 2006

Google Makes Its Move: Office 2.0 "Google has deployed the first pieces of its upcoming Office suite. They’ve launched Google Apps for your Domain, a set of Google services targeted to small and mid sized companies. With the new service, companies can use Gmail, Talk, Calendar and Page Creator under a single control panel."

So maybe killing other human beings in the name of religion is bad, but at least people can defend themselves. I came across "Dogs in Islam", which says, "Last Ramadaan, I wrote an article highlighting the phenomenon whereby misinformed Muslims took their dogs (and/or cats) to the animal hospitals or mobile clinics during Ramadaan, to have them put to death by lethal injection. The reason given by the majority of these Muslims was that Islam forbids them to keep a dog."

"Healthy, happy animals belonging to Muslims are also brought in to be put to death. This is a very disturbing and un-Islamic action..."

Not that Christians are much better; the most famous example is Descartes and his view that animals can't feel pain. His views were used to justify cruel experiments performed on animals. At the same time, he is said to have had a pet dog that he appreciated. I always thought "Descartes' Dog" would be a good title for a book.

A dog's saliva has a bactericidal effect on E. coli and S. canis, whereas cat saliva can cause Cat-Scratch Disease. Of course, they both eat rather suspect things and don't brush their teeth, so it's not all good.

Also, dogs and cats can make their own vitamin C. Does that show a bias or just a cunning plan? The point of my initial Googling was to work out why my dog's food had all kinds of ingredients but not vitamin C. I came across these other interesting facts in the process.

Saturday, August 26, 2006

Anne Provoost Interview "BILL MOYERS: There are so many questions come to one when reading In the Shadow of the Ark. But there was one question that halts me in particular. I mean, can you trust a God who doesn't get it right?

ANNE PROVOOST: That's one of the questions, of course, that Re Jana, she's the main character in the book, is asking. She says, "Well, if your God is going to drown the world, if your God is going to bring a flood, then why don't you pick a different God?" So to me, as, that is the question I want to ask. Why would you trust a God that at this moment, doesn't come back to give us the right book. You know, through history, he's given the Jewish people a book. And he's given the Christians a book. And he's given the Muslim books, and so there's big similarities between these books, but there's also contradictions.

I would think that, you know, he needs to come back and create clarity and not let... he shouldn't let us fight over who's right. He should make it clear. So, my personal answer to your question, "Should we trust," I wouldn't."

"BILL MOYERS: At first you think he's saving a good man from a calamity. Then you realize he's saving Noah from a good God who is also a bad God. This God is one and the same, good and bad.

ANNE PROVOOST: Right. And this God is destroying his own creation. So, you wonder, you know, why do you create something that will turn out to be this bad? And then you're going to probably punish them for it? Maybe there's something wrong in the making.

BILL MOYERS: Not only that, but he chooses Noah, who we thought was a good man. But the moment the flood is over, Noah comes off of the ark, gets drunk, abuses his grandson."

Joel said, "Lemme repeat that. By abstracting away the very concept of looping, you can implement looping any way you want, including implementing it in a way that scales nicely with extra hardware."

"Without understanding functional programming, you can't invent MapReduce, the algorithm that makes Google so massively scalable. The terms Map and Reduce come from Lisp and functional programming. MapReduce is, in retrospect, obvious to anyone who remembers from their 6.001-equivalent programming class that purely functional programs have no side effects and are thus trivially parallelizable."

MapReduce: Simplified Data Processing on Large Clusters, "MapReduce is a programming model and an associated implementation for processing and generating large data sets. Users specify a map function that processes a key/value pair to generate a set of intermediate key/value pairs, and a reduce function that merges all intermediate values associated with the same intermediate key. Many real world tasks are expressible in this model, as shown in the paper."
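The model is small enough to sketch in a few lines. Below is a toy in-memory word count in the MapReduce shape: a map function that turns each input record into intermediate key/value pairs, and a reduce function that merges all values sharing a key. The class and method names are mine for illustration, not from Google's implementation or any real framework.

```java
import java.util.*;

// A minimal in-memory sketch of the MapReduce programming model:
// map() emits intermediate key/value pairs, reduce() merges the
// values that share an intermediate key. Names here are illustrative.
public class WordCount {

    // map: one input line -> a list of (word, 1) pairs
    static List<Map.Entry<String, Integer>> map(String line) {
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String word : line.toLowerCase().split("\\s+")) {
            if (!word.isEmpty()) {
                pairs.add(new AbstractMap.SimpleEntry<>(word, 1));
            }
        }
        return pairs;
    }

    // reduce: (word, [1, 1, ...]) -> total count for that word
    static int reduce(String word, List<Integer> counts) {
        int total = 0;
        for (int c : counts) total += c;
        return total;
    }

    public static void main(String[] args) {
        List<String> input = Arrays.asList("the cat sat", "the cat");

        // "Shuffle" phase: group intermediate values by key.
        Map<String, List<Integer>> grouped = new TreeMap<>();
        for (String line : input) {
            for (Map.Entry<String, Integer> pair : map(line)) {
                grouped.computeIfAbsent(pair.getKey(), k -> new ArrayList<>())
                       .add(pair.getValue());
            }
        }

        // Reduce phase: each key is independent of the others, which is
        // exactly why this loop can be spread across many machines.
        for (Map.Entry<String, List<Integer>> e : grouped.entrySet()) {
            System.out.println(e.getKey() + " " + reduce(e.getKey(), e.getValue()));
        }
    }
}
```

Because map sees one record at a time and reduce sees one key at a time, neither has side effects that cross records, which is the "trivially parallelizable" property Joel is pointing at.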

An interesting paper, especially if you're interested in creating massively scalable triple stores, for example.

Not really related to the Semantic Web, but via the AI3 blog: Data Visualization at Warp Speed, a brilliant presentation on improvements in health and economic development over the last century, delivered through visualization and frantic explanation.

Generally, I've oscillated between two different ways of doing transactions in Spring: the first is declaratively, using AOP, and the second is programmatically, through the use of TransactionTemplate or, to be really hardcore, PlatformTransactionManager. I say hardcore because it's a bit more work to ensure that everything is committed or rolled back correctly.

I recently arrived at another way due to the requirements of testing the objects via interaction-based testing. The AOP method makes it hard to mock out calls, as it wraps everything in a proxy, and the programmatic approach seems impossible to test purely interactively (basically requiring a real TransactionTemplate), especially as some of the types involved are classes rather than interfaces.

Create an interface, say "DoStuff", that has the method required to run in a transaction, say "doStuff(String stuff)". There are two implementations of this interface: "DoStuffImpl" and "DoStuffInTransaction". The "DoStuffInTransaction" gets injected with the "DoStuffImpl" and the "TransactionTemplate". It also implements "TransactionCallback". When its "doStuff" method is called, it sets the parameters to fields and then calls "transactionTemplate.execute(this)", and the "doInTransaction" method calls the "DoStuffImpl.doStuff()" method.

By wiring up the "DoStuffImpl" inside "DoStuffInTransaction" your other classes can only access the transactional version. It makes interaction testing straightforward and doesn't require explicitly calling things like commit or rollback.
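The wiring described above looks something like the sketch below. To keep it self-contained, TransactionTemplate and TransactionCallback here are minimal stand-ins for the real Spring classes (the real callback also receives a TransactionStatus), and the "work" done by DoStuffImpl is just recording its argument; the shape of the calls is what matters.

```java
import java.util.ArrayList;
import java.util.List;

// Stand-ins for Spring's TransactionCallback and TransactionTemplate,
// simplified so this sketch compiles on its own.
interface TransactionCallback {
    Object doInTransaction();
}

class TransactionTemplate {
    final List<String> log = new ArrayList<>(); // records begin/commit for the demo
    Object execute(TransactionCallback callback) {
        log.add("begin");                            // real template starts a transaction
        Object result = callback.doInTransaction();  // ...and commits or rolls back for you
        log.add("commit");
        return result;
    }
}

// The interface the rest of the application depends on.
interface DoStuff {
    void doStuff(String stuff);
}

// The plain implementation: no transaction code in it at all.
class DoStuffImpl implements DoStuff {
    String lastStuff; // stands in for real work
    public void doStuff(String stuff) {
        lastStuff = stuff;
    }
}

// The transactional wrapper: stashes the argument in a field, runs
// itself through the template, then delegates to the wrapped impl.
class DoStuffInTransaction implements DoStuff, TransactionCallback {
    private final DoStuff target;
    private final TransactionTemplate template;
    private String stuff;

    DoStuffInTransaction(DoStuff target, TransactionTemplate template) {
        this.target = target;
        this.template = template;
    }

    public void doStuff(String stuff) {
        this.stuff = stuff;
        template.execute(this);
    }

    public Object doInTransaction() {
        target.doStuff(stuff);
        return null;
    }
}

public class TransactionSketch {
    public static void main(String[] args) {
        DoStuffImpl impl = new DoStuffImpl();
        TransactionTemplate template = new TransactionTemplate();
        DoStuff doStuff = new DoStuffInTransaction(impl, template);

        doStuff.doStuff("hello");

        System.out.println(template.log);   // [begin, commit]
        System.out.println(impl.lastStuff); // hello
    }
}
```

In an interaction-based test you can inject a mock DoStuff as the target and simply verify that doStuff("hello") was called, with no transaction manager anywhere in sight.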

Closures for Java "I'm co-author of a draft proposal for adding closures and local functions to the Java programming language for the Dolphin (JDK 7) release. An abbreviated version of its current state is reproduced below. It was carefully designed to interoperate with the current idiom of one-method interfaces."

Based on the example, "int(int) plus2b = (int x) : x+2;", you know it must be Java when you have the type in there three times. Something like Groovy's closures or Ruby's seems better syntactically. There are also some other ideas to add, such as currying and iterators.
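For comparison, here is the one-method-interface idiom the proposal is designed to interoperate with (and shorten), written out by hand. The IntFunction interface is invented for the example, not a standard type.

```java
// What "int(int) plus2b = (int x) : x+2;" has to look like today:
// declare a one-method interface, then fill it in with an anonymous class.
interface IntFunction {
    int apply(int x);
}

public class ClosureIdiom {
    public static void main(String[] args) {
        IntFunction plus2 = new IntFunction() {
            public int apply(int x) {
                return x + 2;
            }
        };
        System.out.println(plus2.apply(40)); // 42
    }
}
```

Six lines of ceremony for one expression, which is the boilerplate the closures proposal is trying to eliminate.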

Saturday, August 19, 2006

The overall issue with JRDF's SPARQL implementation was that it improperly handled the order-specific nature of OPTIONAL (left to right). My original idea was to provide a query language that did not care about order, which meant I wasn't initially worried about parsing the grammar to ensure order-specific query trees were produced. After talking to several people about it, it seems much more fruitful to provide a mapping to relational operations and to enable re-use of existing optimizations. Also, fixing order-specific queries is a bigger task than I originally thought.

Another oversight was that I originally considered OPTIONAL to be dyadic (accepting two relations to operate on) rather than n-adic (accepting one or more relations); this feature is demonstrated by dawg-opt-query-004. I started off with the intention of making operations n-adic where possible, but I only got around to implementing this for join.

In JRDF's relational algebra there is the concept of node types such as subject, predicate, object, uri, bnode, and literal. There are also composite nodes for positional types like subject-object and subject-predicate-object. These were only created on project, which was unfortunate because these types are needed while performing other operations; if they are not available, the wrong result is produced. These incorrect results made me think the whole idea might be wrong. When I reviewed what I'd done, I came across my initial idea of join compatibility, which creates these composite nodes during joins, but it was never implemented.

The current solution (a fairly inefficient hack) is to perform a project on every restrict, which then creates relations with these composite nodes. This points to a better solution than join compatibility, which should be easy to add and more efficient: modify restrict to use composite nodes as headings, based on how they are used in the query. But time is an issue, and it's not important as far as the results of my thesis are concerned.

Monday, August 14, 2006

JRDF does not require the use of Spring in production code for anything but the user interface. You can remove Spring (about 2.5MB of the JRDF SPARQL GUI) and still be able to create a graph using:

    LongIndex[] longIndexes = {new LongIndexMem(), new LongIndexMem(), new LongIndexMem()};
    GraphFactory factory = new GraphFactoryImpl(longIndexes, new NodePoolMemImpl());
    factory.getGraph();

In 0.4.1 an RDF parser is easier to create than before; all that's required is:

    Parser parser = new GraphRdfXmlParser(graph);

JRDF 0.4.1 will still be a stand-alone jar file that can be used outside of Spring.

Sunday, August 13, 2006

Updated: I have fixed a couple of bugs and re-released the binary. This was to do with loading graphs from the user interface - re-creating the graph each time and making sure the file URI is properly escaped.

Sunday, August 06, 2006

The story of the original Graphing Calculator (found on PowerPC Macs). A tale of stubbornness and testing: "...if the demo crashed they would classify that as a hardware fault...I like to think of that as the theoretical limit of software stability..." (at about 42 minutes in).

SPARQL support in JRDF now has two ways of performing OPTIONAL, both of which follow the semantics of the specification (at least for the queries I've tried).

The first is through the usual combination of natural join, union and antijoin (which is composed of difference and semijoin). The second is through natural join and minimum union (which is made up of union and tuple subsumption).

I have been playing with other ways of producing the same effect as OPTIONAL, mainly trying to get away from the non-null-rejecting version of join that appears in SPARQL.

The current implementation does seem compatible with both SPARQL and the Galindo-Legaria approaches, even if this contradicts what was said in "Semantics and Complexity of SPARQL" (section 4.3, page 12). Outer union, and full and left outerjoin, as defined by Galindo-Legaria, require some nulls to be let through, as far as I can see. The use of nulls (or, in my implementation, tuples without all values bound to the headings) is just a way of having relations that contain "sub-results" from things such as outer joins. Another approach would be to return a set of different relations whose tuples match all of the headings. As mentioned by Galindo-Legaria, you can get these other results by projecting the relevant attributes.
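Minimum union is simple enough to sketch. Below, a tuple is just a map from variable name to value, with unbound variables absent; minimum union is the plain set union with any tuple removed that is subsumed by (agrees with, but binds strictly less than) another tuple. This is a toy illustration of the operation, not JRDF's actual code.

```java
import java.util.*;

// A sketch of "minimum union": set union followed by removing subsumed
// tuples. Tuple t is subsumed by u when u binds everything t binds, to
// the same values, and binds strictly more besides.
public class MinimumUnion {

    static boolean subsumedBy(Map<String, String> t, Map<String, String> u) {
        return u.size() > t.size() && u.entrySet().containsAll(t.entrySet());
    }

    static Set<Map<String, String>> minimumUnion(Set<Map<String, String>> a,
                                                 Set<Map<String, String>> b) {
        Set<Map<String, String>> union = new HashSet<>(a);
        union.addAll(b);
        Set<Map<String, String>> result = new HashSet<>();
        for (Map<String, String> t : union) {
            boolean subsumed = false;
            for (Map<String, String> u : union) {
                if (subsumedBy(t, u)) { subsumed = true; break; }
            }
            if (!subsumed) result.add(t);
        }
        return result;
    }

    public static void main(String[] args) {
        // {x=a} is subsumed by {x=a, y=b}, so only the fuller binding
        // survives; {x=c} has no extension and stays as it is.
        Set<Map<String, String>> left = new HashSet<>();
        left.add(Map.of("x", "a"));
        left.add(Map.of("x", "c"));
        Set<Map<String, String>> right = new HashSet<>();
        right.add(Map.of("x", "a", "y", "b"));

        System.out.println(minimumUnion(left, right).size()); // 2
    }
}
```

The subsumed tuple here plays the role of the null-padded row in the Galindo-Legaria formulation: it only matters until a fuller binding turns up.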

Of course, I can still see there's much more to do and look at. Now comes the write-up of the results (performance and user interface), code polishing and so on.