
The Data

Robert Borofsky

Center for a Public Anthropology

Hawaii Pacific University

The data for this project come from two sources: (a) a list of full-time faculty in 145 anthropology departments and (b) the Altmetric database. Let me discuss each in turn to provide a more transparent understanding of how the Project’s data are collected and scored.


The faculty database is limited to anthropology departments in the United States with ten or more full-time faculty members. Admittedly, this cutoff point is arbitrary, but the Center lacks the resources to cover every department, and stopping at 145 departments – those with ten or more faculty – seemed reasonable for the Project’s purposes. The percentage associated with each department is based on (a) the number of faculty who, according to the Altmetric database, have five or more mentions of one of their publications in the public media, divided by (b) the number of full-time members in that department.
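As a minimal sketch, the percentage described above can be computed as follows (the function name and the example counts are illustrative, not part of the Project’s actual tooling):

```python
def department_percentage(qualifying_faculty: int, fulltime_faculty: int) -> float:
    """Percentage of a department's full-time faculty who, per Altmetric,
    have five or more public-media mentions for at least one publication."""
    if fulltime_faculty <= 0:
        raise ValueError("a department must list at least one full-time faculty member")
    return 100.0 * qualifying_faculty / fulltime_faculty

# A hypothetical department: 6 qualifying faculty out of 20 full-time members.
print(department_percentage(6, 20))  # 30.0
```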

To facilitate the data’s scoring, two additional rules are applied. First, the searches include only a department’s full-time faculty as listed on its departmental website; adjunct and affiliate faculty are excluded. Second, the exact names of faculty used in the searches are drawn from their departmental websites. If an individual’s middle initial or name is listed on that website, it is included in the search; if only the first and last names are listed, then those are used.

The collected data provide a general sense of a department’s size and faculty, but I would stress that they are not perfect. Faculty sizes and affiliations change from year to year, and sometimes even from semester to semester. Keeping up with these changes, given the Center’s limited staff, has proved difficult. Since this can affect a department’s ranking, I have included a link to the full data set so readers can review the data most relevant to them. Please email me with any mistakes you find so I can correct them. Thank you. [List of Departmental Faculty]


Altmetric, which provides the other data set for this project, monitors a range of online sources on a continuing basis to track and collate attention relating to academic content. It searches for mentions of scholarly outputs (items with DOIs, arXiv or SSRN IDs, or unique URI identifiers) in order to present this information in a structured, coherent way. While this website focuses on mainstream news outlets, policy documents, and blogs, it should be noted that Altmetric also tracks other sources such as Wikipedia, Twitter, Reddit, Sina Weibo, and Mendeley. For a full list of sources tracked by Altmetric, refer to:

Collection of Data

Mainstream news outlets: Altmetric receives a real-time feed of news stories via Moreover (part of LexisNexis). Each news story is searched for links to scholarly outputs and is also text-mined (English language only) for mentions of an author name and journal title. These pieces of information are then cross-referenced with publications from CrossRef to determine whether a publication fitting these criteria was published within 15 days on either side of the news item. If a positive match is made, the reference is added to Altmetric’s details page for that publication. For more information, including details on the news sources tracked by Altmetric, refer to:
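The 15-day matching window can be sketched as a simple date comparison (a hedged illustration of the rule as described above; Altmetric’s internal implementation is not public):

```python
from datetime import date

def within_matching_window(news_date: date, publication_date: date, days: int = 15) -> bool:
    """True if the publication appeared within `days` days on either side
    of the news story, mirroring the cross-referencing rule described above."""
    return abs((news_date - publication_date).days) <= days

# A story on 20 January matched against a paper published on 10 January.
print(within_matching_window(date(2024, 1, 20), date(2024, 1, 10)))  # True
print(within_matching_window(date(2024, 3, 1), date(2024, 1, 10)))   # False
```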

Public policy documents: Altmetric monitors a curated list of policy sources (such as the World Health Organization) and, as new documents appear online, searches them daily for references to publications. These references are then added to a publication’s details page, usually within a day. For further information on their public policy document tracking, refer to:

Blogs: Altmetric scans RSS feeds from a manually curated list of blogs once a day, looking for links to published research outputs. When a mention is found, it is matched to the research output and added to the appropriate details page within 24 hours. For further information on the tracking of blogs, see:

How Data Are Displayed and Scored on the Website

A publication’s mentions are displayed in the red and orange rectangles to the left of a publication’s title, publisher, and author. These numbers represent the number of times various news outlets, policy documents, and/or blogs take note of that publication. Readers can click on a link to see the specific sources mentioning the publication. It is important to note that only publications referenced in news outlets and policy documents are included in the departmental percentages; blogs are not.

There are three reasons for not including blogs in a publication’s overall score. (a) The Project focuses on anthropological publications that attract public attention. Generally speaking, news outlets and policy documents tend to garner more attention than blogs, especially with respect to two key audiences the Project is concerned with – funders and university administrators. (b) A number of blogs are produced by journals as a way of publicizing their content. A publication’s blog score may therefore depend on whether it appears in a journal that writes blogs to promote its publications, since such blogs are distributed to thousands of readers.

(c) Blogs vary in garnering public attention. Some have large readerships, larger perhaps than certain newspapers; many do not. Trying to account for these differences by giving media outlets and blogs different scoring weights creates complications. Exactly how many points a news outlet should be worth vis-à-vis a blog is unclear. Should the ratio between them, for example, be 10 to 2, 12 to 3, or 9 to 3? On what basis can one reasonably decide that one ratio is more appropriate than another? Given the Project’s focus on news outlets and policy documents, it seems best to score each mention of a publication by a news outlet or policy document as one point and to set aside a publication’s blog mentions when computing its summarizing public-mentions score. The blog mentions are listed in the orange block below the red one.
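The scoring rule just described – one point per news-outlet or policy-document mention, with blog mentions set aside – can be sketched as follows (an illustrative function with made-up counts, not the Project’s actual code):

```python
def public_mentions_score(news: int, policy: int, blogs: int) -> int:
    """One point per news-outlet or policy-document mention.
    Blog mentions are displayed separately and deliberately excluded."""
    return news + policy  # `blogs` is intentionally ignored in the score

# Hypothetical publication: 4 news mentions, 2 policy mentions, 7 blog mentions.
print(public_mentions_score(news=4, policy=2, blogs=7))  # 6
```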

Missing mentions? If you believe that Altmetric has missed a mention of a publication or has presented it incorrectly, please let Altmetric know via this form: The Altmetric support team will review your submission and update the details page for the relevant publication where appropriate.

More information about why mentions of a publication are sometimes missed can be found at:

A Pattern to Ponder:  Perusing the data, readers will note that archeologists and biological anthropologists tend to be cited in the media more often than cultural anthropologists. One likely reason derives from the journals the discipline’s subfields publish in. Cultural anthropologists tend to publish in a set of subfield journals, while archeologists and biological anthropologists often publish in more interdisciplinary journals likely to reach a wider audience. There is no reason why cultural anthropologists could not publish in PLOS ONE, Science, or Nature. But many prefer publishing in American Anthropologist, American Ethnologist, or Cultural Anthropology, thereby attracting less attention from those beyond their subfield. Current Anthropology, which crosses the discipline’s subfields, tends to attract less attention than the major interdisciplinary journals, but comparatively more than the American Anthropological Association’s journals focusing on a specific subfield.
