

Robert Borofsky

Center for a Public Anthropology

1. What Is the Source for the Data Presented Here?

These data are drawn from the Altmetric database and are limited to anthropology departments in the United States. The percentage associated with each department is based on (a) the number of faculty having publications with five or more mentions in the international media (as affirmed by the Altmetric database) divided by (b) the number of full-time members in that department. This format emphasizes the public attention garnered by the overall department rather than highlighting one or two individuals.

To avoid biasing the results toward smaller departments – because the publications of one faculty member in a small department count for a higher percentage than the publications of several faculty in a larger department – anthropology departments must have at least ten faculty to be included in the database. For the same reason, departments are divided into two categories – under 20 faculty and 20 or over.
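The ranking arithmetic described above can be sketched in a few lines of Python. This is an illustration only: the function names and the example counts are hypothetical, not drawn from the actual database.

```python
# Sketch of the department-percentage calculation described above.
# All names and numbers here are illustrative, not real data.

def department_score(faculty_with_mentions, fulltime_faculty):
    """Percentage of full-time faculty with publications having
    five or more media mentions in the Altmetric database."""
    if fulltime_faculty < 10:
        raise ValueError("Departments with fewer than 10 faculty are excluded")
    return 100.0 * faculty_with_mentions / fulltime_faculty

def size_category(fulltime_faculty):
    """Departments are compared within two size categories."""
    return "under 20" if fulltime_faculty < 20 else "20 or over"

print(department_score(4, 16))  # 25.0
print(size_category(16))        # under 20
```

A hypothetical department of 16 full-time faculty, four of whom have a qualifying publication, would thus score 25 percent and be ranked among departments with under 20 faculty.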

The list of full-time departmental faculty is based on each department’s website. Adjunct and affiliate faculty are excluded. Faculty names used in word searches are based on their names as presented on their departmental websites.

Altmetric monitors a range of online sources on a continuing basis. It searches for mentions of scholarly outputs (items with DOIs, arXiv or SSRN IDs, or unique URI identifiers) in order to present this information in a structured, coherent way. While the Metrics Project focuses on mainstream news outlets, policy documents, and blogs, it should be noted that Altmetric also tracks other data sources such as Wikipedia, Twitter, Reddit, Sina Weibo, and Mendeley.

In respect to mainstream news outlets, Altmetric receives a real-time feed of news stories via Moreover (part of LexisNexis). Each news story is searched for links to scholarly outputs and text-mined (English language only) for mentions of an author’s name and journal title. These pieces of information are then cross-referenced with publications from CrossRef to determine whether a publication fitting these criteria has been published within 15 days on either side of the news item. If a positive match is made, the reference is included in Altmetric’s details page for that publication.
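The 15-day matching window can be illustrated with a short sketch. This is a simplification of Altmetric’s actual pipeline (it ignores the name- and journal-matching steps), and the dates are invented for the example.

```python
# Sketch of the cross-referencing window described above: a news-story
# mention matches a publication only if the publication date falls
# within 15 days on either side of the story date.
from datetime import date, timedelta

def within_window(story_date, publication_date, days=15):
    """True if the publication appeared within `days` of the story."""
    return abs(story_date - publication_date) <= timedelta(days=days)

print(within_window(date(2020, 5, 15), date(2020, 5, 1)))   # True  (14 days apart)
print(within_window(date(2020, 5, 15), date(2020, 4, 20)))  # False (25 days apart)
```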

In respect to public policy documents, Altmetric monitors a curated list of policy sources and searches new documents daily (as they appear online) for references. These references are then added to a publication’s details page, usually within a day. For further information on public policy document tracking, please refer to the Altmetric website.

In respect to blogs, Altmetric scans RSS feeds from a manually curated list of blogs once a day, looking for links to research outputs. When a mention is found, it is matched to the research output and then added to the appropriate details page within 24 hours.

Blogs vary in the degree to which they garner public attention. Some have large readerships, larger than certain newspapers. But many do not. Trying to account for the differences between media outlets and blogs by giving them different scoring weights creates complications. Exactly how many points a news outlet should have vs. a blog is unclear. Should the ratio between them be 1 to 2 or 1 to 3? Given the Project’s focus on news outlets and policy documents, it seems best to score mentions of a publication by a news outlet or policy document separately from blogs. The former are listed in a red rectangle; the latter in an orange rectangle.

For further information on these topics, please refer to the Altmetric website.

2. What Do the Displayed Results Refer To?

The red and orange rectangles to the left of a publication’s title, publisher, and author represent the number of times various news outlets, policy documents, and/or blogs take note of a particular publication. Readers can click on the red and orange rectangles to see the specific sources that reference the publication.

If you believe that Altmetric has missed a mention of a publication or has presented it incorrectly, please let Altmetric know via its feedback form.

For more information about Altmetric and how it operates, readers might wish to visit the Altmetric website.

3. How Does Altmetric Compare with Google Scholar in Assessing a Publication’s Public Impact?

One should be cautious in assuming that because scholar A cites scholar B, scholar B has had a significant influence on scholar A’s work. As noted by Borofsky in An Anthropology of Anthropology (2019:113):

Often a publication’s intellectual value is assessed by the degree to which others cite it. This is a flawed standard. Citing an author does not guarantee actual engagement with the author’s ideas. Rhode writes: “There is no guarantee that authors have actually read the sources cited. Indeed, with technological advances, they need not even trouble to type them; entire string citations can be electronically lifted from other publications. Nor does it follow that the sources listed establish the proposition for which they are cited. Even when someone checks the notes, it is generally to determine only whether particular authorities support the text, not whether they are reliable or respected among experts” (Rhode 2006:38).

An Anthropology of Anthropology suggests, “most of the citations to [prominent anthropologists’] key works [are] of the ‘bump and go’ variety, to use an American football metaphor. Authors mostly refer to them to convey they are aware of the relevant literature related to the topic they are writing about. But few seek to systematically engage with the ideas in these figures’ key works for more than two sentences” (2019:79).

Grafton, in The Footnote: A Curious History observes: “Only the relatively few readers who have trawled their nets through the same archival waters can identify the catch in any given set of [foot]notes. . . . For most readers, footnotes play a different role. In a modern, impersonal society, in which individuals must rely for vital services on others whom they do not know, credentials perform what used to be the function of guild membership or personal recommendations: they give legitimacy” (1997:7–8).

Google Scholar focuses on the number of times an author is cited in various academic publications. It does not examine the context of a citation, the number of sentences devoted to it, or its general significance in the citing publication. It assumes that more citations mean more intellectual significance. Yet within anthropology, as just noted, a famous author’s key book is rarely discussed for more than two sentences in the publications that cite it. See Borofsky (2019:66-106) for specific examples and details.

Altmetric, in contrast to Google Scholar, offers a less introverted perspective than that of academics citing academics. Public impact is assessed in the Metrics Project by what those beyond the academy write in the world’s media about a particular publication. Readers can click on a citation and read the relevant passage (or passages) related to that citation to understand how and why it is cited.

4. Can the Center for a Public Anthropology’s Metrics Project Benefit Anthropologists and Anthropology Departments?

Today, two outside forces are reshaping the discipline. First, there is an increasing demand for accountability: Are anthropologists producing work of public value that justifies their salaries and the costs of their research?

The call for increased academic accountability has been building for decades. Starting with the National Science Foundation Act in 1950, the National Defense Education Act in 1958, and the Higher Education Act in 1965, considerable funding has poured into universities. Today, this funding totals more than $117 billion. Not unreasonably, those providing this funding are increasingly concerned about how their money is being spent.

With the Covid-19 pandemic, the financial pressure has increased geometrically. Some anthropology departments will likely be downsized; some research funding will be reduced. There simply will not be enough money to go around. Inside Higher Ed reports: American “governors or budget leaders in legislatures have begun announcing that they will have to make billions in cuts, and in many states, they are telling colleges and universities to start preparing plans for surviving with less state money” (Murakami, May 15, 2020). The Chronicle of Higher Education reports, “As the coronavirus outbreak erodes financial health and administrative confidence at colleges across the country, many have started to lay off or furlough employees en masse to thwart colossal budget shortfalls” (Chronicle Staff, May 13, 2020).

Second, the call for increased accountability dovetails with another academic trend. In times past, reviewers of a promotion portfolio might informally assess a faculty member’s publications. With advances in the internet, metrics now exist for measuring the status of the journals an author publishes in as well as the number of colleagues who cite her or his work. This has allowed for a shift in control over accountability assessments. Administrators no longer need to depend on faculty versed in a discipline’s intricacies to assess an individual’s productivity. They can rely on a host of numbers instead.

It should be noted that many faculty find metrics, such as those produced by Google Scholar and Academic Analytics, oppressive. Times Higher Education’s Paul Jump (2015), quoting other sources, writes: “Research managers can become ‘over-reliant on indicators that are widely felt to be problematic or not properly understood . . . or on indicators that may be used insensitively or inappropriately,’ and do not ‘fully recognize the diverse contributions of individual researchers to the overall institutional mission or the wider public good.’” A report by the American Association of University Professors (AAUP) regarding Academic Analytics cautions that “measuring faculty ‘productivity’ with an exclusive or excessive emphasis on quantitative measures of research output must inevitably fail to take adequate account of the variety and totality of scholarly accomplishments.” Deborah Rhode cites a Carnegie Foundation survey that indicates more than a third of university faculty believe their publications are assessed mostly in terms of quantity rather than quality. At schools with doctoral programs, the figure is more than 50 percent.

What the Metrics Project offers is a way to address the demand for public accountability without succumbing to the flaws of citation-based statistical assessments – numbers without context – associated with Google Scholar and Academic Analytics. The Altmetric data allow readers to understand and assess the context and significance of a citation in a more open, transparent manner that provides a better sense of its public impact.

If anthropology departments are to survive the current financial crunch, if they are to flourish in these difficult economic times, public relevance matters. It is what university administrators need to exhibit to the broader public to justify their existence – both politically and financially. The Metrics Project helps departments demonstrate to those who determine their budgets and professional advancement that they are doing more than claiming to produce internationally relevant publications. They are actually doing so and have confirmable, transparent proof of it in the Altmetric data.

The Metrics Project offers a way forward that not only allows anthropology to survive in difficult financial times, but to flourish. In supporting administrators and their need to demonstrate to their colleagues and the broader public the international relevance of specific departments (such as anthropology), anthropology is building institutional support for itself. In helping their schools to shine, anthropology is enhancing its own reputation within and beyond the academy.

5. Why Is Two Years Used for Assessing Award Winners (Rather Than One or Five Years)?

Any of the three time periods listed on the website could assess a department’s progress over time. Given the focus of the Metrics Project is on fostering structural change – highlighting the departments that are improving their public outreach through time – emphasizing one year seems to unfairly reward departments whose faculty publish on a particularly “hot” topic for a short time. Selecting five years as the standard may miss middle-range changes while reinforcing the long-term status quo. The two-year standard tries to combine the advantages of both these time periods while softening their respective disadvantages.

6. When I Click on An Article, The Cited Author Is Not Always Listed. Why Is That?

Generally, when one clicks on the publication listed, the individual referred to will be one of the publication’s authors. However, occasionally it will not. I found, for example, 67 references to Paul Farmer, dating back to 2011, in the Altmetric database. For most references, he is listed as an author.

In respect to the others, one cited him in the publication’s text, another in the publication’s references, and for one, he authored a foreword. There were two awkward cases. One publication involved a Paul Hamilton and a Madison Farmer as authors. Presumably, the search engine combined them into Paul Farmer. The other is more problematic: The Year That Ebola Virus Took Over West Africa by Bausch in the American Journal of Tropical Medicine and Hygiene, 2015. The article does not make specific mention of Paul Farmer. As its title suggests, the article is about the Ebola epidemic in West Africa. Paul Farmer, in his role as Chief Strategist for Partners in Health, was quite involved in treating Ebola in these countries during this period. I am unsure how Paul Farmer’s name became associated with this article. But the error rate, if you exclude the name merger above, is less than one and a half percent (1.5%). I would stress that none of the problematic cases discussed here made their way into the data displayed on the website.
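The error-rate figure quoted above follows from the counts given: one problematic match (excluding the merged-name case) out of 67 references.

```python
# Error rate implied by the counts above: 1 problematic match
# (the Bausch article) out of 67 references to Paul Farmer,
# excluding the Hamilton/Farmer name-merger case.
error_rate = 1 / 67
print(round(100 * error_rate, 2))  # 1.49 percent, i.e. under 1.5%
```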

For further clarification of how Altmetric collects its data, readers might wish to refer to: Altmetric website, About Our Data: How it works.

7. Can the Altmetric Measures Used Here Be Applied in Other Disciplines?

Yes. It is simply a matter of applying to Altmetric for a license. A school administrator could apply for a number of departments. To gain a sense of how altmetrics might prove useful, see: How to Use Altmetrics to Showcase Engagement Efforts for Promotion and Tenure.

If you or your institution is interested in learning more about Altmetric, go to the Altmetric website and click on Who are we for? Or click on the Products heading and then Explorer for Institutions.

8. Who Sponsors the Metrics Project?

The Metrics Project is sponsored by the Center for a Public Anthropology. Please refer to the organization’s website for further information.


References

Borofsky, Robert

2019 An Anthropology of Anthropology: Is It Time to Shift Paradigms? Kailua, HI: Center for a Public Anthropology.

Chronicle Staff

2020 “As Covid-19 Pummels Budgets, Colleges Are Resorting to Layoffs and Furloughs.” The Chronicle of Higher Education, May 13.

Grafton, Anthony

1997 The Footnote: A Curious History. Cambridge: Harvard University Press.

Jump, Paul

2015 Grant Income Targets Set at One in Six Universities, THE Poll Suggests. Times Higher Education, June 12. (accessed August 6, 2018).

Murakami, Kery

2020 State Cuts Grow Deep. Inside Higher Ed, May 15.

Rhode, Deborah L.

2006 In Pursuit of Knowledge: Scholars, Status, and Academic Culture. Stanford, CA: Stanford University Press.
