Law Journals: Submissions and Ranking

The purpose of the Law Journals webpage is to allow authors to find law journals by subject, country, or journal rank (where available), to display journal editorial information, and to facilitate an author's article submission to those journals.

Most bar journals, magazines, and newsletters are excluded from this list. Also excluded are law journals that have few English language articles (except for a few U.S. Spanish language and Canadian French language journals). The default listing is alphabetical by journal name. If a rank-order is chosen, by clicking on one or more checkboxes in the center of the menu area, then the list will sort in rank order. Clicking on the column headers for each numeric rank list will re-sort to that rank. Clicking on the "Journal" header above the journal titles column will re-sort alphabetically.

It's not necessary, when using the author submission process, to select any of the rank choices; if none is chosen, the list defaults to ordering by the most recent combined-score of impact-factor and total cites. If the user selects more than one rank checkbox, the list will rank by the checkbox that is closest to the top, and then closest to the left, of the checkbox choice area.

General/Specialized, Country, and Student-edited/Peer-edited/Refereed
By selecting the appropriate checkboxes the journal list may be limited to combinations of:

General, specialized

"General" is a little confusing, as it's used as a checkbox, and also in the pull-down Subject menu. The checkbox meaning of "General" is that the journal is the main "flagship" journal of an institution (usually a law school) and has no subject speciality. In the pull-down Subject menu, a journal may have a "General" classification but also one or more other subject classifications because the journal has a tendency to publish in certain subject areas

Country (U.S., non-U.S., or another specific country)

Generally this is the country of the publisher; however, if there is sufficient contact with another country this may be modified. This typically occurs when a journal is issued under the auspices of an institution based in a country that differs from that of the publisher.

Student Edited, Peer Edited, and Refereed journals

"Student edited" means a student run journal that does not send articles out for peer review.
"Peer edited" means a journal that is edited by professionals in the field
"Refereed" means a journal that routinely sends article submissions on for peer review by members of a diverse professional group.
Student edited or Peer edited journals may also be refereed, in which case the journal will be listed as "refereed".

Journal Subjects
The pull-down subject list allows the journal lists to be limited to journals that fall within broad subject areas.

Journal Name Search
Using the "Jnl-name words" box, the journal list can be limited to those journals whose names contain the entered words. Words entered are ANDed together. Alternate journal names, including name changes over the past decade or so, are also searched.

Limiting to journals within a Range of Rankings
Using the "Rank" box the number of displayed journals can be reduced to the range number selected. A range of rank numbers can be entered using "," or "-", e.g. "31,33,35,37-43,46-50". It's best to look at a full ranking list before limiting it because rankings are bunched so that there may e.g. be four 3rd ranked journals and thus the next ranked journal would be 7th. So if you entered a range limit of 4-6 it would match with nothing in the ranking and your rank request would be ignored.

Impact-Factor Weighting
The impact-factor weighting determines the proportionality between impact-factor and total cites which comprise the combined-score calculation for each ranked journal. For more information see "Combined-Score Ranking" below.

Checkboxes next to Journal Names
The journal checkboxes are used to limit the list of journals. If none are checked, the checkboxes are ignored and the output list is governed by the boxes and buttons chosen in the menu. Any journal checkboxes that are checked will further limit the menu choice output. Note that if, for example, "yale" is entered in the menu to limit journals to those with "yale" in their name, and then before clicking "Submit" some journal checkboxes are checked whose journal names do not include "yale", these choices will be inconsistent and will result in zero journals being listed.

Author Submissions
Check the box in the Author Submissions Process area to activate this feature. Choose either "Multi-email" or "Separate"; when the "Submit" button is clicked (and based on the menu options chosen), editorial addresses, upload links, and email address links will be listed. Other editorial information, such as submission policy, will be shown if available.
If "Multi-email" is chosen, the returned page will be divided into three parts:
(1) A list of email addresses will be displayed, to allow users to copy and paste all the addresses into their email program.
(2) Any electronic submissions that need to be done one-by-one. This is usually because
- articles must be uploaded at the journal's website, or
- the journal, while accepting email submissions, requires them to be exclusive submissions (in other words, prohibits multiple simultaneous submissions to other journals), or
- more than one journal shares the same e-mail submission address, and multiple submissions in one e-mail would not inform the recipient as to which journal the author intends to submit the article to.
(3) Any other chosen journals (usually those that require paper copy submission, or don't accept unsolicited submissions).

Exclude Nonranked
Many journals in the list have not been ranked, i.e. no search has been run against Westlaw's JLR and ALLCASES databases. Some journals have recently begun publication and will eventually be ranked; others (mostly non-U.S. journals) are listed in order to include their editorial information. Nonranked entries can be excluded from listings by checking the "Nonranked" box.

Information About the Journal
Clicking on the journal name retrieves information about the journal such as web address, and article submission information. The record retrieved by clicking on the journal name will display links to "OpenURL" resolvers that can attempt to find full-text sources for the journal (whether or not the source can actually be accessed by the user will depend on licensing restrictions).

Counted citations are those which cite journal volumes published in the preceding eight years. This limit prevents a bias in favor of long-published journals; the study is concerned only with citations to current scholarship. The search results give only the number of citing documents, and do not count a citing article or case more than once where it cites two or more articles in a cited legal periodical. Sources for the citation counts are limited to documents in Westlaw's JLR database (primarily U.S. articles), and in Westlaw's ALLCASES database (U.S. federal/state cases). The searches conducted in those databases generally use the Bluebook format in use in the U.S. (volume journal [page] year); any citations utilizing a non-U.S. legal citation format (year volume journal) would generally not have been counted. Thus it is important to realize that this survey is primarily intended to be a ranking from the perspective of U.S. legal scholarship.

The list includes periodicals that began publication after the survey period began. Rank results based on total citation counts are unfair to those periodicals, and whenever a journal recently began publication a warning has been supplied next to the periodical name in the form of a parenthetical date such as "(2001- )". Both impact-factor and combined-score rankings do make an allowance for how recently the journal began publishing. Legal periodicals which appear to have ceased publication (even though they were published during a part of the survey period) are not included.

The "Journals" column(s) shows the number of articles that cite to each journal (within our date period) that were found in the full-text Westlaw journals database "Journals and Law Reviews (JLR)". To see what sources are included in the JLR Database see the Westlaw description at directory.westlaw.com/scope/default.asp?db=JLR&RS=WDIR1.0&VR=1.0. The scope note in that Westlaw description describes the JLR content as, "The JLR database contains documents from law reviews, CLE course materials, and bar journals. A document is an article, a note, a symposium contribution, or other materials published in one of the available periodicals".

The "Cases" column(s) shows the number of cases that cite to each journal (within our date period) that were found in the full-text Westlaw state and federal case database "Federal & State Case Law (ALLCASES)". To see what sources are included in the ALLCASES Database see the Westlaw description at directory.westlaw.com/scope/default.asp?db=ALLCASES&RS=WDIR1.0&VR=1.0. The scope note in that Westlaw description describes the ALLCASES content as, decisions from the "U.S. Supreme Court, courts of appeals, former circuit courts, district courts, bankruptcy courts, former Court of Claims, Court of Federal Claims, Tax Court, related federal and territorial courts, military courts, the state courts of all 50 states and the local courts of the District of Columbia."

Comparisons between the 1999-2006, 1998-2005, 1997-2004, 1996-2003, and 1995-2002 citation counts cannot be made precisely. Although the number of years covered by each ranking column is an identical 7 years and 10 months (the arbitrary cut-off date is October 31), the size of the JLR and ALLCASES databases for each of the rotated periods increases by approximately 2%-3% in each later period. Thus, a small percentage increase in the number of documents citing to a journal might be accounted for by an increase in the total number of documents in the database.

Searches for citing documents (taking the 1999-2006 period as an example) usually look for citations within the full-text articles/cases that have one of the volume numbers published for the journal from 1999 onwards (i.e. where the journal has labeled the issue as 1999..2006), followed immediately by the journal abbreviation/name, followed within 6 words by a year designation of 1999-2006. A further condition is that any document (case or article) in which such a citation occurs must be dated (in Westlaw's 'date' field) as 1999 onwards, and must have been added to the Westlaw JLR or ALLCASES database subsequent to 1998 and before November 2006. For the other rotated 8-year periods, these dates should be adjusted accordingly.

As the searches are full-text searches, they are naturally prone to some error due to variant citation forms in the citing cases/articles. Searches for citation patterns roughly follow the Bluebook format: usually, VOLUME JOURNAL ... YEAR. Citations that are not in the usual format for legal citations may not have been found. Effort was made to allow for different forms of journal name citations (e.g. allowing for the ALWD citation format) but not all can be retrieved. The citation counts for citations to non-U.S. periodicals are likely to be less accurate than those for U.S. periodicals because non-U.S. legal citation formats are often severely abbreviated (e.g. the International Journal of Evidence and Proof says that it should be cited as "(2004) 8 E & P"), and may have the date reversed (from the U.S. citation perspective) to first position in the citation string. Articles utilizing non-U.S. citation formats are most often found in non-U.S. journals, and as there are relatively few non-U.S. journals in the JLR database this is not a severe problem, particularly as the intent of this survey is to rank journals based on U.S. citations. Nevertheless, it should be recognized that if the searches for non-U.S. citations were instead run with the year rotated to first position (as in "(2003) 119 L.Q.R.") then more citations would often be found.

Westlaw's treatment of periods and spaces within character strings is difficult to understand and is part of the reason why so many alternate cite forms were used.
Quotes are not used around embedded periods, for example, in looking for "HARV. L. REV.", the search HARV +2 L +1 REV would be used.
After a "word" ending with a period, the word is followed by a +2 as in HARV. +2 L.
After letters with embedded periods or single-letters there is no problem with using a +1 as in L +1 REV. However, a problem occurs if the text incorrectly uses a period after a complete word, such as "Akron. L. Rev.", in that case AKRON +1 L +1 REV would not retrieve the cite. Therefore it is a better practice to use a +2 after words longer than one character.
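To illustrate these connector rules, here is a hypothetical helper (not anything Westlaw or this site provides) that converts a Bluebook-style journal abbreviation into the search form described above, using +1 after single letters and the safer +2 after longer words:

```python
def to_westlaw_query(abbrev):
    """Convert an abbreviation like "Harv. L. Rev." into Westlaw
    terms-and-connectors form, e.g. "HARV +2 L +1 REV"."""
    # Drop the periods; Westlaw searching works on the bare tokens.
    tokens = [t.strip(".") for t in abbrev.split() if t.strip(".")]
    parts = []
    for i, tok in enumerate(tokens):
        parts.append(tok.upper())
        if i < len(tokens) - 1:
            # +2 after words longer than one character (the safer practice
            # described above), +1 after single letters.
            parts.append("+2" if len(tok) > 1 else "+1")
    return " ".join(parts)
```

Under this rule "Harv. L. Rev." becomes "HARV +2 L +1 REV", and the +2 after full words also tolerates a stray period in the cited text such as "Akron. L. Rev.".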

The combined-score is a composite of each journal's impact-factor and total cites count. The combined-score is, by default, weighted with a little more emphasis given to impact-factor than to total cites. The resulting score is normalized.
The formula for obtaining the combined-score is the addition of the weighted and normalized scores for each of impact-factor (IF) and total cites (TC):
((IF x weight x 100)/highest-IF) + ((TC x (1-weight) x 100)/highest-TC)
The resulting scores of the retrieved set of journals are then normalized as:
(combined-score/highest-combined-score) x 100
Thus the top-ranking journals in a retrieved set of journals will always have a value of 100 and other journals will have lower numbers in proportion to their ranking calculation score.
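As a concrete sketch of the two steps above (each journal is represented here by a hypothetical (name, impact-factor, total-cites) tuple; this is not the site's actual code):

```python
def combined_scores(journals, weight=0.57):
    """journals: list of (name, impact_factor, total_cites) tuples.
    Returns normalized combined-scores, top journal = 100."""
    max_if = max(j[1] for j in journals)
    max_tc = max(j[2] for j in journals)
    # Weighted, normalized sum of impact-factor and total cites.
    raw = {
        name: (if_ * weight * 100) / max_if + (tc * (1 - weight) * 100) / max_tc
        for name, if_, tc in journals
    }
    # Normalize so the top journal in the retrieved set scores exactly 100.
    top = max(raw.values())
    return {name: round(score / top * 100, 1) for name, score in raw.items()}
```

Because both steps normalize against the maxima within the retrieved set, the scores (and thus the ranking) are relative to whichever journals the user's menu choices return.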

Users may alter the default weight (0.57) by entering any decimal number between 0 and 1. If "0" is entered then the combined-score ranking will ignore impact-factor and produce a normalized ranking by total cites, and if "1" is entered then the combined-score ranking will ignore total cites and produce a normalized ranking by impact-factor. It's recommended that users not change the weighting value unless there's an interest in seeing a normalized ranking for either impact-factor or total cites.

Combined-score ranking is based on the idea proposed by Ronen Perry that neither ranking by total cites nor ranking by impact-factor is in itself sufficient, and that the two need to be combined. See "The Relative Value of American Law Reviews: Refinement and Implementation", available at SSRN: http://ssrn.com/abstract=897063 (to be published in the Connecticut Law Review). The problem in any combined ranking is what weight to give to the underlying factors. Perry calculated a weight of 0.577 for impact-factor (and thus 0.423 for total cites) based on the idea that Harvard Law Review and Yale Law Journal have equal prestige; 0.577 is the weight that makes the combined impact-factor and total-cites counts equal for these journals (over the survey period of 1998-2005). However, the default weighting used on the website is a slightly different value of 0.57. It was decided to use 0.57 because that weighting gives Harvard Law Review a normalized rank of 100 over each of nine rankings (1991-1998...1999-2006) while maximizing the average of Yale Law Journal's rank during those same years. While Harvard Law Review and Yale Law Journal are generally considered comparable, Harvard is still widely regarded as the gold standard and deserves an edge over this period. It is expected that the 0.57 weighting will continue to be used in future annual surveys, and thus Harvard Law Review may at some time drop below a normalized combined-score ranking of 100.
Note that combined-score rankings prior to 1996-2003 that were used in calculating the 0.57 weighting are not part of this website, the figures (1991-1998 .. 1995-2002) were calculated only for Harvard Law Review and Yale Law Journal and solely for the purpose of determining this weighting. The missing data is as follows:

In order to compare new journals more fairly with established journals, an adjustment to the combined-score is made for journals which, at the survey date, have been in existence for more than the current year but for less than 8 years. For example, a journal that began in 2004 will, for the 2006 ranking, have its total cites multiplied by 7.3 (that total cites extrapolation does not display in the total cites column - it is used only in the combined-score formula). The aim is to estimate, from the cites to a journal over its few years of life, how many cites the journal would likely have had if it had been in existence for 8 years. The multipliers are as follows (where the digit before the parenthetical is the difference between the survey year and the year the journal began): 1(29) 2(7.3) 3(3.4) 4(2.3) 5(1.6) 6(1.3).
These multipliers are based on a sample of 3 journals (American Law and Economics Review, Journal of Appellate Practice and Procedure, and Journal of Law and Family Studies), all of which began publication in 1999. The sample looked at how many cumulative cites occurred 2 years, 3 years, etc. after publication, and what multiplier each year would have predicted the 2006 total. To stay well on the conservative side, the lowest multiplier from the three journals was used. The lowest multiplier in this sample for a journal that was in its second year of publication was actually 43, but it was felt that this was too high a value for the volatile task of predicting from its 2-year total how many cites a journal would have after 8 years, and this value was arbitrarily reduced by 1/3 to a multiplier of 29.
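The adjustment can be sketched as follows (the function and table names are hypothetical; the multipliers are those listed above):

```python
# Multiplier keyed on (survey year - year journal began); journals in
# existence for the full 8-year window are left unadjusted.
MULTIPLIERS = {1: 29, 2: 7.3, 3: 3.4, 4: 2.3, 5: 1.6, 6: 1.3}

def adjusted_total_cites(total_cites, survey_year, year_began):
    """Extrapolate a young journal's total cites to an 8-year equivalent.
    Used only inside the combined-score formula, never displayed."""
    diff = survey_year - year_began
    return total_cites * MULTIPLIERS.get(diff, 1)
```

So the text's example of a journal begun in 2004, surveyed in 2006, gets its raw total cites multiplied by 7.3 before the combined-score is computed.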

A webpage is available that shows for each law journal how many total cites to an article would have to occur over the following 8 years in order to make such an article worth publishing (from a combined-score ranking point of view). For the Yale Law Journal, for example, the number is '9'; in other words, if it's "known" that an article/note/review/introduction/obit... will receive fewer than a total of 9 cites during the 8 years after it's published, then publishing the item will reduce Yale Law Journal's combined-score (see http://lawlib.wlu.edu/LJ/citesneeded.aspx). Note, however, that because ranking is based on rounding to one decimal place, small changes in combined-score may not change a journal's actual ranking.

Impact-factor shows the average number of citations to articles in each journal (rounded to one decimal place). Impact-factor rankings should be used cautiously as they are biased against journals that publish a larger number of shorter articles, such as book reviews. Nevertheless, if two legal journals have a similar composition of articles, notes, and book reviews, then from an author's viewpoint it's reasonable to compare the impact-factor of each to see which is a better journal with which to publish. The implication of a similar ranking by total citations, but a dissimilar ranking by impact-factor is that the journal ranked lower by impact-factor is publishing some articles of lesser quality, or of less general interest. It's suggested that in preference to using impact-factor, the combined-score ranking (a weighting of both impact-factor and total cites) offers a more balanced view of journal ranking.

Note that the methodology for calculating impact-factor produces a volatile measure for journals that have large changes in the number of articles that they publish from year to year. This is because there are usually fewer cites to recent volumes than to older volumes, but the numbers of articles published in those recent volumes are given equal weight in the formula with articles published 7 or 8 years earlier. To illustrate, take a hypothetical journal publishing 8 volumes during 1998-2005, where the oldest volume had 2100 cites and each subsequent volume had 300 fewer cites (we assume the current volume receives zero cites because it's too recent to have been cited in the published literature) - the total cites to all the volumes would be 8400. If each volume published 100 articles, then the journal's impact-factor would be 10.5 (8400/800). In the following year's cycle (1999-2006), assume again that the oldest volume receives 2100 cites, dropping again by 300 for each volume (down to zero cites to the most recent volume), but suppose that the journal increased its output to 150 articles in its most recent volume instead of the usual 100; its impact-factor would then decline from 10.5 to 9.9 (8400/(700+150)). If the journal instead had reduced its output to 50 articles in its most recent volume, its impact-factor would have increased from 10.5 to 11.2 (8400/(700+50)). Such an impact-factor fluctuation from 2005 to 2006 has no basis in any quality change in the journal; it is caused by an increase or decrease in the number of articles that are too recently published to be significantly cited.
As delayed production can have a significant impact on impact-factor - in the above example a zero output for the current year would increase the impact-factor from 10.5 to 12 (8400/700) - an amelioration is introduced into the article count by estimating the number of expected articles for the current expected year of publication. See below for the actual method used to extrapolate article counts from a 7-year (not 8-year) base period.
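The hypothetical above can be worked through in a few lines (illustration only; it mirrors the example's simplified arithmetic, not the site's actual formula, which also extrapolates article counts):

```python
# Cites per volume, oldest -> newest: 2100 declining by 300 down to 0.
cites = [2100, 1800, 1500, 1200, 900, 600, 300, 0]
total = sum(cites)  # 8400, and it stays fixed in every scenario below

# Only the newest volume's article count changes between scenarios.
baseline = round(total / (100 * 8), 1)        # 100 articles/volume -> 10.5
expanded = round(total / (100 * 7 + 150), 1)  # newest grows to 150 -> 9.9
reduced  = round(total / (100 * 7 + 50), 1)   # newest shrinks to 50 -> 11.2
```

The cite total never moves; only the denominator does, which is exactly the volatility the paragraph describes.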

The formula for determining impact-factor is complicated by the fact that the citation data extracted from Westlaw's JLR database covers a seven year and ten month period - October 31st of each year being the somewhat arbitrary cut-off date for the study. However, the count of the number of articles published by each journal is for volumes dated during the previous seven years. Thus for the November 2006 survey the number of citations occur from the period 1999 through the end of October 2006, whereas the count of articles published in each journal is for 1999-2005.
So the formula to determine impact-factor extrapolates the article count by adding 10/12ths of each journal's average year's article count. To obtain an average yearly article count it's necessary to adjust for the number of years that a journal has been in existence if it began publication less than seven years prior to 2006.
Thus the formula process is:

YearDifference = 2006 - YearJournalBegan (if greater than 7 then = 7)
ExtrapolatedArticleCount = articlecount + ((articlecount/YearDifference) x (10/12))
impact-factor = totalcites/ExtrapolatedArticleCount (rounded to one decimal place)
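A direct transcription of the formula process above into code, as a sketch (the function name and the 2006 survey-year default are assumptions; a journal must have at least one prior year of publication for the year difference to be nonzero):

```python
def impact_factor(total_cites, article_count, year_journal_began, survey_year=2006):
    """Impact-factor with the 10/12 article-count extrapolation described above."""
    # Cap the year difference at the 7-year article-count window.
    year_difference = min(survey_year - year_journal_began, 7)
    # Add 10/12 of an average year's articles to cover the extra Jan-Oct months.
    extrapolated = article_count + (article_count / year_difference) * (10 / 12)
    return round(total_cites / extrapolated, 1)
```

For instance, an established journal with 8400 total cites and 700 articles over the seven years gets a denominator of 783.33 rather than 700, lowering its impact-factor accordingly.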

The basic methodological difficulty is determining the number of articles published by each journal for the date period, there being no completely satisfactory and automated method for doing this. Most of the article quantity data (at least for the higher ranked journals) was obtained from the WilsonWeb Index to Legal Periodicals. "Articles" here means any entry that ILP indexes, such as forewords, letters, notes, and book reviews, as well as more traditional articles. Note that ILP has had different inclusion policies for its indexing over the years: up to the year 1999, items with fewer than 5 pages were not included; through the year 2002, items with fewer than 2 pages were not included; and in subsequent years items with less than half a page were not included. After the first preference of ILP, if it was necessary to check journals or volumes in other databases, then the next preference was Westlaw (if Westlaw comprehensively added articles for the years needed), followed by Lexis, then Legal Resource Index, then Legal Journals Index (UK), then any other index in which the journal was indexed. Sometimes a manual count was made by physically examining the tables of contents for the journal years needed. In cases where indexing was not available and a manual count was not feasible, an extrapolation was made from what was known. As these variant sources undoubtedly have differing definitions as to what is a countable entity, this introduces variability into the counts.

The Cites per Cost ranking is the average yearly number of cites to the journal divided by the annual US$ cost to U.S. academic libraries. So e.g., a journal with 600 cites per annum and costing $60 would show '10' in the Cites per Cost column.

Journals that are free are ignored for the purpose of this ranking. Strictly speaking, they should have an infinite score (cites/0) and be at the top of the ranking; however, the purpose of this ranking is to present a cost-effectiveness analysis for purchasing decisions, so ranking free journals would counter that purpose. "Average yearly number of cites" is the yearly average of cites as determined by the law journal rankings in the law journal submission information data, and its inclusion methodology (described further above) should be kept in mind. Because of the quirky 7 year and 10 month period used for that data, the cites numerator is calculated by multiplying average cites per month by 12. The year the journal began publication is also considered for recently started journals. So the actual calculation is:
YearDifference = 2005 - YearJournalBegan (if > 7 Then = 7)
Months = (YearDifference X 12) + 10
AdjustedCites = (RawCiteCount X 12) / Months
Cites per Cost = AdjustedCites / AnnualCost (rounded to two decimal places)
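A minimal sketch following the four steps above (variable names taken from the formula; the function name and the 2005 survey-year default are assumptions, not the site's actual code):

```python
def cites_per_cost(raw_cite_count, annual_cost, year_journal_began, survey_year=2005):
    """Annualized cites per US$ of annual subscription cost."""
    # Cap the year difference at the 7-year (plus 10 months) survey window.
    year_difference = min(survey_year - year_journal_began, 7)
    months = year_difference * 12 + 10          # up to 94 months of cite data
    adjusted_cites = (raw_cite_count * 12) / months  # cites per month x 12
    return round(adjusted_cites / annual_cost, 2)
```

For an established journal whose raw count works out to 600 cites per annum and which costs $60, this yields the '10' shown in the Cites per Cost column.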

To see the cost for any of the U.S. journals you can go to http://lawlib.wlu.edu/selecting.aspx which is a webpage that fits journals (ranked by cites/cost) into a user-supplied budget amount.

One idiosyncrasy to note is that (at Georgetown Law Journal's request) cites to Geo. L.J. and the separately published Geo. L.J. Annual Review of Criminal Procedure are merged in the ranking under "Georgetown Law Journal" - consequently the costs for both publications are added into the denominator of the cites/cost calculation.