IP Law Journals: How are they ranked & who ranks them?

By: D. Deitch 

 

The concept of ranking things is one that is met with great debate.  The first known use of the word "ranking" came in 1836.[1]  Ever since, people have been ranking everything from military designations to college football teams, and everything in between.  The key to any standardized competition ranking is a set of criteria against which all of the ranked items can be evaluated.  No matter how simple or complex, the criteria behind the rankings are what provoke the discussion.  Law journals have been ranked for years by multiple sources using a range of different criteria.

 

How are intellectual property law journals ranked? Who are the organizations that rank them? And how do these organizations design the criteria by which the rankings are determined?

 

The search for these answers prompted a broad look into the landscape of law journal rankings and opened still more questions that have yet to be answered.

 

Who Ranks Journals?

Before looking into how IP journals are ranked, it is important to understand who is behind journal rankings.  What makes understanding IP journal rankings difficult is the lack of reference points available.  Law journals as a whole are not evaluated by a multitude of sources, which can lead to reliance on information crafted by a small subset of the community.  The lack of information also feeds the fallacy that there is a direct connection between where a law school ranks nationally and how esteemed its law journals are.  While prestige matters, it is naïve to simply rely on law school rankings as a proxy for where a school’s particular journal should rank among its peers.

 

Washington and Lee School of Law has separated itself as one of, if not the, premier law journal ranking organizations.[2]  In addition to Washington and Lee, Google Scholar has cemented itself as one of the central platforms for scholarly literature.[3]  Washington and Lee was one of the first organizations to let users filter its rankings down to subject-specific journals (such as intellectual property).[4]  It took until November of 2012 for Google Scholar to announce that it too would add a function allowing users to search by “research area.”[5]  One unfortunate difference between the Washington and Lee rankings and the Google Scholar metrics is that Google Scholar does not have a specific intellectual property category among its “research areas”; instead it has a “technology law” category that serves as a catch-all for journals across verticals such as “International Data Security” or “Law and the Arts.”[6]  Anyone who does even a minimal amount of research into journal rankings, and especially into subject-specific rankings such as IP, will quickly realize that you end up asking more questions than you arrived with.

 

Ranking Methodology

Washington and Lee’s ranking methodology collects data from Westlaw using Boolean searches.[7]  For every law journal, the data captures citing documents: court cases from all U.S. jurisdictions as well as law review and journal articles.[8]  A “citing document” is categorized as a document that cites journal volumes published in the previous five years.[9]  Washington and Lee recently reduced this window from eight years to five in order to continue promoting the citation of current scholarship and to prevent any bias the rankings might show toward long-published journals.[10]

 

Washington and Lee use two crucial metrics to create their final rankings.  The first is “Impact Factor,” defined as the average number of annual citations per article in a given journal, rounded to two decimal places.[11]  The eventual weighting in the Combined Score metric (described below) helps offset any bias against journals that publish a larger number of shorter articles.[12]  The second is “Total Cites,” which is simply the combined number of journal and case cites in the given time period.[13]
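To make the arithmetic concrete, here is a minimal Python sketch of the two metrics using hypothetical numbers; the function names and data are illustrative assumptions, not Washington and Lee's actual methodology code, and the sketch averages over the whole period rather than year by year.

```python
# Minimal sketch of the two W&L metrics using hypothetical numbers.
# Function names and data are illustrative, not W&L's actual code.

def impact_factor(total_citations: int, articles: int) -> float:
    """Average citations per article over the period, rounded to two decimals."""
    return round(total_citations / articles, 2)

def total_cites(journal_cites: int, case_cites: int) -> int:
    """Combined journal and case cites in the given time period."""
    return journal_cites + case_cites

# Hypothetical journal: 1,200 journal cites and 90 case cites over 150 articles.
tc = total_cites(journal_cites=1200, case_cites=90)
print(tc)                                               # 1290
print(impact_factor(total_citations=tc, articles=150))  # 8.6
```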

 

Together, these feed the “Combined Score” that Washington and Lee use to determine their ultimate rankings.  Combined Score is a composite of each journal’s Impact Factor and Total Cites.[14]  It is calculated by weighting Impact Factor at approximately one-third and Total Cites at two-thirds.[15]  The weighted total is then normalized on a 100-point scale to create a relative ranking of all journals.[16]  The Combined Score rankings are based on Ronen Perry’s idea that neither an Impact Factor metric nor a Total Cites metric is independently sufficient to create a holistic ranking.[17]  Perry’s premise was that the real problem lies in determining what weight to give the underlying factors within a combined ranking methodology.[18]  Washington and Lee ultimately settled on a weight of 0.33 because that weight gives Harvard the highest combined rank for the better part of the period dating from 1988 onward.[19]
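The weighting lends itself to a short worked example. The sketch below assumes, for illustration, that each metric is scaled against the top journal's value before the 0.33/0.67 weights are applied; Washington and Lee describes the weights and the 100-point normalization, but the exact scaling here (and the journal data) is an assumption.

```python
# Sketch of a Combined Score calculation. The 1/3 vs. 2/3 weighting and the
# 100-point normalization come from W&L's methodology page; scaling each
# metric against the top journal's value is an assumption for illustration.

journals = {
    # name: (impact_factor, total_cites) -- hypothetical numbers
    "Journal A": (8.60, 1290),
    "Journal B": (4.10, 2500),
    "Journal C": (12.30, 800),
}

max_if = max(impact_f for impact_f, _ in journals.values())
max_tc = max(tc for _, tc in journals.values())

combined = {
    name: round(100 * (0.33 * impact_f / max_if + 0.67 * tc / max_tc), 1)
    for name, (impact_f, tc) in journals.items()
}

for name, score in sorted(combined.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score}")  # Journal B: 78.0, Journal A: 57.6, Journal C: 54.4
```

Note how the heavier Total Cites weight lets a high-volume journal (Journal B) outrank one with a stronger per-article Impact Factor (Journal C).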

 

As far as Google Scholar is concerned, its rankings also rely on two key metrics: “h5-index” and “h5-median.”[20]  The h-index Google uses is an alternative to Impact Factor for measuring a given journal’s importance.

 

Google defines them both as follows:

“h5-median for a publication is the median number of citations for the articles that make up its h5-index.

h5-index is the h-index for articles published in the last 5 complete years. It is the largest number h such that h articles published in 2014-2018 have at least h citations each.”[21]
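The definition translates directly into code. Below is a minimal sketch, using hypothetical per-article citation counts, of how an h5-index and h5-median could be computed; the helper names and data are illustrative, not Google's implementation.

```python
# Sketch of Google's h5-index and h5-median, computed from a journal's
# per-article citation counts over five complete years (hypothetical data).
from statistics import median

def h_index(citations: list[int]) -> int:
    """Largest number h such that h articles have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(ranked, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

def h5_median(citations: list[int]) -> float:
    """Median citations among the articles that make up the h5-index."""
    ranked = sorted(citations, reverse=True)
    return median(ranked[: h_index(ranked)])

cites = [25, 19, 12, 9, 8, 7, 7, 2, 1]  # hypothetical per-article citations
print(h_index(cites))    # 7 -- seven articles have at least 7 citations each
print(h5_median(cites))  # 9 -- median of the seven counts making up the index
```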

 

While there is much debate as to which of the two metrics (Impact Factor or h-index) is “better,” there may not be one right answer, because they serve two different purposes.  Impact Factor is geared toward recognizing a journal’s so-called ‘prestige,’ while the h-index is designed as more of an author/researcher-based metric.[22]  In other words, Impact Factor looks to capture the reputation of a given journal but does not necessarily capture the impact of individual articles or their authors.  The h-index, on the other hand, measures a publication record and its impact by looking at both the number of papers an author has published and the citations those papers have received.

 

Meta-Ranking

There are those who question whether either Washington and Lee or Google Scholar offers a foolproof system for ranking law journals.  As a result, some sources have decided to create their own aggregate rankings.

 

Bryce Clayton Newell, Assistant Professor of Media Law and Policy at the University of Oregon, has gained recognition with his “Meta-Rankings,” first posted in 2016.[23]  He builds a weighted combination of resources that much of the community finds reliable.  He aggregates Washington and Lee’s law journal rankings and their individual Impact Factor rankings, Google Scholar’s metrics, U.S. News peer reputation scores, and U.S. News overall school rankings.[24]  His “MetaRank” gives 25% weight each to Washington and Lee’s journal rankings, Google Scholar’s rankings, the U.S. News peer reputation score, and the U.S. News overall school rankings, leaving Washington and Lee’s Impact Factor out of the final rankings.[25]  This exercise, or something similar, has gained popularity in recent years as many people search for the best process to rank law journals.[26]  There is certainly disagreement about what to measure and how to weight certain categories, but the premise remains constant: trying to find the best way to rank law journals on an even playing field.
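As a rough illustration of the idea, the sketch below applies equal 25% weights to four component rankings. Newell's exact normalization is not reproduced here; converting each component rank to a 0-100 score first is an assumption made for the example, as are the journal's component ranks.

```python
# Equal-weight meta-rank in the spirit of Newell's MetaRank. The 25% weights
# come from his description; mapping each component rank onto a 0-100 score
# is an assumption for illustration, as are the component ranks themselves.

def rank_to_score(rank: int, total: int) -> float:
    """Map rank 1..total onto a 100..0 scale (rank 1 -> 100)."""
    return 100 * (total - rank) / (total - 1)

# Hypothetical component ranks for one journal among 200 ranked journals:
components = {
    "W&L journal rank": 12,
    "Google Scholar rank": 20,
    "US News peer reputation rank": 15,
    "US News overall school rank": 18,
}

meta_score = sum(0.25 * rank_to_score(r, total=200) for r in components.values())
print(round(meta_score, 1))  # 92.3
```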

 

Conclusion

As interest in ranking law journals has increased, many believe the journal landscape will receive more exposure as a result.  More and more organizations and resources are being dedicated to studying journals and how citations are tracked within legal documents.  The hope is that with fresh faces exploring the subject, old methods will be challenged and new methods will be implemented in the ranking of journals.  While it won’t happen overnight, it will be interesting to see where it goes.

[1] Ranking, Merriam-Webster Dictionary (11th ed. 2009).

[2] Washington and Lee School of Law, W&L Law Journal Rankings, Washington and Lee University (Sept. 3, 2019), https://managementtools4.wlu.edu/LawJournals/Default.aspx.

[3] Google Scholar Top Publications, Google (Nov. 2019), https://scholar.google.com/citations?view_op=top_venues&hl=en&vq=soc_law.

[4] W&L Law Journal Rankings, supra note 2.

[5] Helder Suzuki, Updated Scholar Metrics: Now Grouped by Research Area, Google Scholar Blog (Nov. 15, 2012), https://scholar.googleblog.com/2012/11/updated-scholar-metrics-now-grouped-by.html.

[6] Google Scholar Top Publications: Technology Law, Google (Nov. 2019), https://scholar.google.com/citations?view_op=top_venues&hl=en&vq=soc_technologylaw.

[7] Washington and Lee School of Law, W&L Law Journal Rankings: Ranking Methodology, Washington and Lee University (Sept. 2019), https://managementtools4.wlu.edu/LawJournals/Default3.aspx.

[8] Id.

[9] Id.

[10] Washington and Lee School of Law, W&L Law Journal Rankings: Combined-Score Ranking, Washington and Lee University (Sept. 2019), https://managementtools4.wlu.edu/LawJournals/Default4.aspx.

[11] Washington and Lee School of Law, W&L Law Journal Rankings: Impact Factor, Washington and Lee University (Sept. 2019), https://managementtools4.wlu.edu/LawJournals/Default5.aspx.

[12] W&L Law Journal Rankings: Combined-Score Ranking, supra note 10.

[13] Id.

[14] Id.

[15] Id.

[16] Id.

[17] Ronen Perry, The Relative Value of American Law Reviews: Refinement and Implementation, 39 Conn. L. Rev. 1 (2007), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=897063.

[18] Id.

[19] W&L Law Journal Rankings: Combined-Score Ranking, supra note 10.

[20] Google Scholar Metrics, Google (July 2019), https://scholar.google.com/intl/en/scholar/metrics.html#metrics.

[21] Id.

[22] Lutz Bornmann & Hans-Dieter Daniel, The State of h Index Research, 10 EMBO Reports 2 (Dec. 12, 2009), https://www.embopress.org/doi/abs/10.1038/embor.2008.233.

[23] Bryce Clayton Newell, Law Journal Meta-Ranking, 2019 Edition, University of Oregon Blog (July 23, 2019), https://blogs.uoregon.edu/bcnewell/meta-ranking/.

[24] Id.

[25] Id.

[26] Paul Caron, 2019 Meta-Ranking of Flagship U.S. Law Reviews, TaxProf Blog (July 25, 2019), https://taxprof.typepad.com/taxprof_blog/2019/07/2019-meta-ranking-of-flagship-us-law-reviews.html.