A law blog addressing the foci of 3 intrepid law geeks, specializing in their respective fields of knowledge management, internet marketing and library sciences, melding together to form the Dynamic Trio.


11/23/09

It seems that Toby and I have a few extra dollars in our pockets, and we wanted to test some more Crowdsourcing Projects. Back in May, we did a 5-Part Series on Crowdsourcing and had some fun testing out different projects using Amazon's MTurk Crowdsourcing service. We thought we'd test the value of Google Scholar Legal Opinions & Journals (SLOJ) using a Crowdsourcing Project. This time we're not only asking our MTurk workers to do most of the heavy lifting, but we'd also like to outsource the topic of the project to our blog readers. We are the ultimate Delegators!

If you have a project that you think would make a great legal research Crowdsourcing project, put it in the comments below. Remember that the best Crowdsourcing projects are those that ask the workers to perform specific tasks that produce specific results. For example, let's say I wanted to pull a list of URLs from the new Google Scholar Legal Opinions & Journals site that match the cases in a volume of a law reporter. The project would first identify all of the cases within that reporter, then submit that list to the workers with instructions to search Google SLOJ, identify the specific case, and cut and paste the URL into the appropriate answer box. This type of one-question, one-result project works very well with crowdsourcing.
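The one-question, one-result pattern described above can be sketched in a few lines of Python. This is only an illustration of the task design, not how MTurk actually receives work; the case names, volume label, and field name are invented examples:

```python
# Hypothetical sketch of the "one question, one result" design:
# given a list of cases from one reporter volume, generate one
# narrowly scoped worker task per case.

def build_tasks(cases, reporter_volume):
    """Turn each identified case into a single, specific worker task."""
    tasks = []
    for case in cases:
        tasks.append({
            "question": (
                f"Search Google Scholar Legal Opinions & Journals for "
                f"'{case}' ({reporter_volume}) and paste the URL of the "
                f"matching opinion into the answer box."
            ),
            "answer_field": "case_url",  # exactly one answer per task
        })
    return tasks

# Invented example cases standing in for a real reporter volume's contents.
volume_cases = ["Smith v. Jones", "Doe v. Roe"]
tasks = build_tasks(volume_cases, "123 N.E.2d")
print(len(tasks))  # one task per case
```

Each dictionary here would correspond to one HIT: the worker answers one question, and the requester gets one URL back per case.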

The types of projects that don't work with crowdsourcing are one-question, many-answers projects. For example, if you asked the workers to search Google SLOJ to find every 2009 case in New York that deals with Eminent Domain issues, that doesn't work very well because you'll either have to assign one person to do all the work, or assign multiple people to do the same work over and over again. To make this type of project work, you'd first need to identify each of the cases dealing with Eminent Domain, then ask the worker to do something specific for each one of those cases. For instance, you could have them read a case and then summarize the court's decision. Or, you could have them identify specific information within the decisions, such as who the attorneys are, who they represent, and for which party the court ruled.
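That decomposition step can also be sketched: once the cases are identified, each one becomes its own extraction task asking for a few specific fields. Again, the case names and field names below are invented for illustration, not part of any real MTurk project:

```python
# Hypothetical sketch: turn one broad research question into per-case
# extraction tasks, each asking the worker for a few specific fields.

FIELDS = ["attorneys", "represented_parties", "winning_party"]

def decompose(cases):
    """Create one extraction task per already-identified case."""
    return [
        {
            "case": case,
            "instructions": (
                f"Read '{case}' on Google SLOJ and record: "
                + ", ".join(FIELDS)
            ),
            # Empty slots for the worker to fill in.
            "answers": {field: "" for field in FIELDS},
        }
        for case in cases
    ]

# Invented placeholders for the eminent-domain cases identified up front.
tasks = decompose(["Matter of A (2009)", "Matter of B (2009)"])
print(len(tasks))  # one extraction task per case
```

The point is that the open-ended search ("find every case") happens once, up front, while the repeatable per-case work is what gets farmed out to the crowd.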

Now that you see the guidelines for submitting a Crowdsourcing project, let us know what you think would make a great Crowdsourcing project for Google SLOJ. Toby and I are ponying up the money for this project, so it won't cost you a thing. We'll compile and post the results right here on 3 Geeks.

5 comments:

I like your idea of creating an exhaustive list of opinions in a specific regional reporter (for me, that would be a specific N.E.2d reporter). But I would be even more excited to see a properly formatted citation for each decision. For example, the following is the proper citation format prescribed by the Indiana courts for the following decision in Google Scholar:

I think that would be a great project for Google Scholar itself, rather than a crowdsourcing project. It seems that the Bluebook, or local rules, could be solved (mostly, anyway) through programming code rather than brute force. At least it could get closer than what it is now.

There's an email that you can use to contact Google Scholar - scholar-library@google.com - to make suggestions like these. I found that the folks at Google are pretty open to suggestion.

I use Zotero with Google Scholar generally to extract and save bibliographic information. This is quite useful when writing a brief because Zotero has a plug-in for OpenOffice Writer.

Unfortunately, there does not appear to be any bibliographic information about the court opinions at Google Scholar to import into Zotero. That information would include the volume number, reporter, page number, date, etc.

Perhaps, as you suggest concerning collecting a list of decisions in a particular volume, collecting this type of bibliographic information might be the basis for a good crowdsourcing project.

Heh. I wrote citeable.org. Parsing citations... that was a fun project. The problem I faced when I was working on all this was how to get the official reporter citation for a slip opinion published online (for example, one downloaded from the Ninth Circuit website, www.ca9.uscourts.gov).