(tested on IE 5.5, 6, 8 & 9 using IETester, and on Seamonkey 2.14 - so it should work in Firefox too, but verify Chrome and Opera yourself. Drag/drop is broken in Safari 5.1.7 for windows/wine, so use 5.0.6 - I think apple intentionally broke things for the windows client)

Here are the functions:

Code:

function drag(ev){ // we will just store the inner html of our drag tag - it could be an id, class or something else
    ev = ev || window.event; // old IE does not pass the event - use the global instead
    var el = ev.target || ev.srcElement; // old IE uses srcElement instead of target
    ev.dataTransfer.setData("Text", el.innerHTML); // IE needs "Text" and others are fine with it
}
function drop(ev){
    ev = ev || window.event;
    if(ev.preventDefault){
        ev.preventDefault(); // or mozilla will navigate to the url.com of the text
        ev.target.innerHTML = ev.dataTransfer.getData("Text"); // we are replacing the inner html of our drop with our drag
    }else{ // ie workaround - old IE has neither preventDefault nor target
        ev.returnValue = false;
        ev.srcElement.innerHTML = ev.dataTransfer.getData("Text");
    }
}

Notes on the html:
older versions of _some_ browsers only support dragging "a" (link) tags, so use one with a blank href
we need to set draggable to true on the items we want to be able to drag
we also need to hook up the drag itself, so we set ondragstart to our drag function and pass it the event
we also need to override the default behaviour of ondragover - returning false will do this nicely (you _could_ use a function that calls preventDefault, but that is not browser independent)
and finally we need to do something with our drop event, so we set ondrop to pass the drop event to our drop function
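Putting those notes together, the markup might look something like this (the element choices and text are just illustrative, not from a specific page):

```html
<!-- a link with a blank href so even older browsers will allow the drag -->
<a href="#" draggable="true" ondragstart="drag(event)">drag me</a>
<!-- returning false from ondragover overrides the default so the drop is allowed -->
<div ondragover="return false" ondrop="drop(event)">drop here</div>
```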

I've been thinking about writing my own javascript library designed for "static" compiles. Although I was pretty certain it was needed, I figured I would first scope out the top libraries and see how they compared under fully optimized compiles.

My method:
include one and only one javascript library plus a simple hello-world alert function that doesn't use any code from the library, then build with full optimization
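A minimal sketch of that test file - the compile invocation in the comment is an assumption (any optimizing compiler with its own flags would do; the file names are made up):

```javascript
// hello.js - concatenate with exactly one library and compile, e.g. (assumed):
//   java -jar closure-compiler.jar --compilation_level ADVANCED_OPTIMIZATIONS \
//        --js somelib.js --js hello.js --js_output_file out.js
// nothing here touches the library, so an ideal dead code eliminator
// would reduce the library's contribution to out.js to zero bytes
function hello(){ return "hello world"; }
(typeof alert !== "undefined" ? alert : console.log)(hello());
```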

It turns out they really are horrible at dead code elimination, except google's base.js (which is horrible for other reasons).

The library that I am working on (a.js... a reference to static archives) compiles down to zero bytes if none of its code is used. The idea is to only add functions that aren't so complex that they prevent optimization (for example, no functions that take a function as an argument).
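As a concrete illustration of that rule - these are hypothetical examples of the style, not actual a.js functions:

```javascript
// a simple, statically analyzable helper like this compiles away
// completely when unused (hypothetical example, not the real a.js api):
function trim(s){ return s.replace(/^\s+|\s+$/g, ""); }

// by contrast, a higher-order helper like the one below takes a function
// as an argument, which keeps the optimizer from proving what is dead,
// so this style would be deliberately excluded:
//   function each(arr, fn){ for(var i = 0; i < arr.length; i++) fn(arr[i]); }
```

e.g. trim("  hi  ") gives "hi", and if trim is never called the whole function can be dropped by the compiler.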

# the commented line below would fetch the wsdl, but use
# http://soapclient.com/soaptest.html instead -
# you can use it to get the proper requests (hint: view source)
#echo -e "GET /hosps?wsdl HTTP/1.1\r\n\r\n" | nc 192.168.0.102 8888
newline="
"

What's the end game? Are we anywhere in the realm of some sort of better JS stack that can be compiled into other things? ..elinks?

Or is this all for producing web frontends? Does this all need a webserver?

What could you do with sqlite patched to produce JSON? A simple db backend for websites via json+ajax? Could it simply replace the standard sqlite and function as normal?

The liba.js is for traditional web programming - just a clean-room implementation of the most commonly used extras. Maybe I'm a bit conceited, but I felt I could do better than all of the contributors to jquery, google-closure and the other popular js libs out there when it comes to compiled code size and ease of use. The existing kits either require you to remember to add dependency info into your code for it to work and compile down (google's), or just don't reduce much at all over the minified version (jquery, yui, ...). My library is designed for people who want their site to load as fast as possible, or to be obfuscated as much as possible, without restrictive licensing concerns.

Currently it would need a server for saved state and interaction, but I've actually made a json patch for sqlite. However, it was an uninformed hack... I need to redo it to simply query the data type (it's only ~10 lines, 8 of which are cut-n-paste). I would like to figure out a way to do it with html5's local storage.
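A rough sketch of the local-storage direction - all the names here are hypothetical, nothing from the actual patch. Each saved state goes under a key as a JSON string, with a plain-object fallback so the same code runs where localStorage is missing:

```javascript
// hypothetical helpers (not part of liba.js): persist small JSON states
// in html5 localStorage, falling back to an in-memory object elsewhere
var store = (typeof localStorage !== "undefined") ? localStorage : {
    _d: {},
    setItem: function(k, v){ this._d[k] = String(v); },
    getItem: function(k){ return (k in this._d) ? this._d[k] : null; }
};

function saveState(key, obj){ // serialize the state to a JSON string
    store.setItem(key, JSON.stringify(obj));
}
function loadState(key){ // returns null if nothing was saved under the key
    var raw = store.getItem(key);
    return raw === null ? null : JSON.parse(raw);
}
```

Usage would be e.g. saveState("cart", {items: 2}) on change, then loadState("cart") on the next visit.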

My if-I-had-all-the-time-in-the-world project would be a complete webpage optimizer that parsed html, css, svg, js and extensions, and optimized them all into a single zopfli-compressed page - generating optimized sprite sheets from img tags along the way, and separating out any browser-specific workarounds to be loaded automatically (while providing related warning info along with possible workarounds). On the plus side, the same parser/lexer could be used for a browser or user interface: Netsurf's MIT-licensed libs would be a good starting point to parse it all into a DOM tree, then use that to render a page/interface... and rather than caching pages, cache an optimized DOM tree.
