
The Project

Robert Borofsky

Center for a Public Anthropology

Hawaii Pacific University

Public Anthropology’s Metrics Project highlights the degree to which an anthropology department’s faculty produce publications that attract broad public attention. In contrast to Google Scholar – which tracks how often academics cite each other in academic publications – the Altmetric database used here reports on the attention anthropological publications receive in news outlets, policy documents, and blogs from around the world. It allows us to see which anthropological publications, on which topics, receive widespread public attention.

Like Google Scholar, the Altmetric database starts with articles and books published by academically oriented journals and presses. But its focus is on how others beyond the discipline, and beyond the academy, respond to these publications. Altmetrics – short for alternative metrics – track this broader attention and help highlight the public value of anthropological work. At metrics.publicanthropology.org, readers can click on the red and orange rectangles to the left of a publication to see which news outlets, policy documents, and blogs refer to that publication.

The Metrics Project focuses on anthropology departments. Since promotions and tenure are initiated at the departmental level, departments play an important role in fostering public engagement. Using an arbitrary baseline of 5 mentions in news outlets and policy documents, the Project explores which departments have the greatest percentage of faculty with publications at or above this level. Certainly, individual faculty have publications which reach higher levels – some have over 200 mentions in the public media. But the Project is concerned with departments. Rather than the status claims of individual academics, it seeks to support the departmental structures that foster public engagement in anthropology.

To make the competition between departments more even-handed, additional parameters are added. First, large departments are separated from smaller ones, since smaller departments can reach high faculty percentages far more easily with fewer faculty. This parameter limits the disadvantage larger departments face. Second, to provide a perspective on how various departments are doing over time, percentages are calculated for the past year, the past two years, and since 2011 (the year Altmetric was founded). The $1,000 competition applies only to the past two years: a particular research topic might be temporarily highlighted in the media, and given the Project’s concern with supporting departmental structures, it looks for a slightly longer trend. Third, looking at the data, readers will see that a few departments tend to dominate the rankings year in, year out. To ensure additional departments have a chance to win the competition (and grow their public engagement), a particular department is limited to winning the $1,000 award once every five years.
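For readers who want a concrete picture of these rules, the sketch below shows how such a ranking might be computed. It is an illustrative outline only: the field names, the 20-faculty cutoff between large and small departments, and the input format are assumptions made for the example, not the Project’s actual code or data.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

# A minimal sketch of the ranking rules described above. The data layout,
# the 20-faculty size cutoff, and all names below are illustrative
# assumptions, not the Metrics Project's own implementation.

BASELINE_MENTIONS = 5    # the stated baseline: 5 mentions in news outlets/policy documents
LARGE_DEPT_CUTOFF = 20   # assumed threshold separating "large" from "small" departments


@dataclass
class Faculty:
    name: str
    mentions: int        # news-outlet + policy-document mentions in the window of interest


def engagement_percentage(faculty: List[Faculty]) -> float:
    """Percentage of a department's faculty with a publication at or above the baseline."""
    if not faculty:
        return 0.0
    engaged = sum(1 for f in faculty if f.mentions >= BASELINE_MENTIONS)
    return 100.0 * engaged / len(faculty)


def rank_departments(departments: Dict[str, List[Faculty]],
                     recent_winners: set) -> Dict[str, List[Tuple[str, float]]]:
    """Rank departments by engagement percentage, separately for large and small
    departments, excluding any department that won the award in the past five years."""
    tiers: Dict[str, List[Tuple[str, float]]] = {"large": [], "small": []}
    for name, faculty in departments.items():
        if name in recent_winners:
            continue  # a department may win the award only once every five years
        tier = "large" if len(faculty) >= LARGE_DEPT_CUTOFF else "small"
        tiers[tier].append((name, engagement_percentage(faculty)))
    for tier in tiers:
        tiers[tier].sort(key=lambda pair: pair[1], reverse=True)
    return tiers
```

In the actual rankings, separate percentages would be calculated for each time window (the past year, the past two years, and since 2011); the sketch shows a single window for brevity.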

Differing Perspectives

Anthropologists often address their publications to a small coterie of colleagues. Reinforcing this tendency, they frequently assess a publication’s value by the number of times these colleagues cite it. Such assessments convey an intellectual insularity. They suggest anthropology is primarily meant for anthropologists.

However, agencies and foundations outside academia fund most anthropological research. (Anthropologists rarely fund their own work.) A key criterion for these funders is that the research they fund have value for the broader public, not just a set of academics. The National Science Foundation (NSF), for example, requires all proposals and final reports to specify the “broader impacts” of their research, defined as encompassing “the potential of the proposed activity to benefit society and contribute to the achievement of specific, desired societal outcomes . . . and [be], insofar as possible, understandable to a scientifically or technically literate lay reader” (http://www.nsf.gov/pubs/policydocs/pappguide/nsf13001/gpg_2.jsp#IIC2d).

The National Institutes of Health (NIH) Office of Behavioral and Social Sciences Research states in its 2017-2021 strategic plan: “As established by Congress more than 20 years ago, the role of the Office of Behavioral and Social Sciences Research (OBSSR) at the NIH is to coordinate the health-relevant behavioral and social sciences and to identify challenges and opportunities to advance these sciences in the service of the nation’s health” (https://obssr.od.nih.gov/wp-content/uploads/2016/12/OBSSR-SP-2017-2021.pdf). Paralleling these perspectives, the United Kingdom’s Research Councils (RCUK) “supports world-leading research and innovation to create a more prosperous, healthy and sustainable society” (https://www.ukri.org/public-engagement/).

Two Key Audiences: Funding Agencies and University Administrators

Given funding agencies’ concern with public outreach, having an applicant’s research cited in news outlets and/or policy documents beyond the discipline offers an advantage in the grant application process. A recent commentary in Nature notes: “In the digital age, a growing number of researchers and publishers are using more than just [academic] citation counts to track the impact of their publications. In an essay in PLoS Biology, three authors from a major UK research-funding agency argue that alternative metrics — or altmetrics, such as social-media mentions — can help funders to measure the full reach of the research that they support” (http://www.nature.com/news/funders-drawn-to-alternative-metrics-1.16524).

Impact Versus Attention

A common frame of reference for assessing a publication’s value involves its “impact factor” – a standard devised by Eugene Garfield and calculated annually since 1975 (see https://en.wikipedia.org/wiki/Impact_factor). According to Webster’s Third New International Dictionary, impact, as a noun, refers to “the force of impression of one thing on another.” Some anthropological publications (following Webster) may “arouse and hold attention and interest.” But they rarely constitute a “concerted force producing change.”

In An Anthropology of Anthropology (2019), Borofsky shows that while academics often cite prominent anthropologists – such as Lévi-Strauss, Geertz, and Wolf – most citations lack extensive discussion of their work. The citations mostly mention them in passing, discussing their work in a sentence or two. The implication is that authors rarely engage seriously with such figures’ ideas, preferring simply to cite or briefly mention them.

As noted, the Metrics Project focuses on the degree to which a publication attracts public attention – especially in news outlets and policy documents. More mentions in more media outlets suggest more public attention. The Project is not concerned with a publication’s “impact factor” as the term is usually used.

The Center

The Center for a Public Anthropology, which supports this project, is a non-profit – a 501(c)(3) organization under the U.S. tax code. It encourages scholars and students to address public problems in public ways. It operates two websites: publicanthropology.org and publicanthropology.net.

For further information, please refer to publicanthropology.org.

Relevant References:

Williams, Catherine and Danielle Padula
2015   The Evolution of Impact Indicators: From Bibliometrics to Altmetrics. London: Altmetric and Chicago: Scholastic.

Borofsky, Robert
2019   An Anthropology of Anthropology. Kailua, HI: Center for a Public Anthropology. https://books.publicanthropology.org/an-anthropology-of-anthropology.html.

Borofsky, Robert (ed.)
2019   Showing Anthropology Matters: Public Anthropology in Action. Kailua, HI: Center for a Public Anthropology. https://books.publicanthropology.org/showing-anthropology-matters.pdf
