Questionably Measuring Success

The University of Oregon is consistently evaluated by Academic Analytics, a group critics say uses inaccurate measures of academic performance

Academic Analytics, a company that measures the performance, productivity and, ultimately, the success of individual faculty members and entire institutions, has been collecting data on the University of Oregon since 2013, despite widespread controversy over the group’s accuracy and merit.

Under a contract the UO and Academic Analytics signed in 2015, the UO pays nearly $100,000 annually for Academic Analytics data.

Since the company was founded in 2005, a debate has emerged over whether data from Academic Analytics are accurate or useful. In early 2015, the American Association of University Professors (AAUP) released a statement about Academic Analytics and research metrics, warning universities to be cautious about the group.

The AAUP cited a 2015 study by the Higher Education Funding Council for England that found “indicators can be misused or ‘gamed,’ that the data infrastructure underpinning use of metrics and information about research remains fragmented, with insufficient interoperability between systems; and that ‘it is not currently feasible to assess research outputs or impacts … using quantitative indicators alone.’”

Academic Analytics gathers data from universities worldwide. The company’s mission, according to Tricia Stapleton, its chief communications officer, is “to provide university administrators and faculty members with high quality, accurate data and tools for decision making, collaboration and research promotion to the public.”

Among other services, Academic Analytics works one-on-one with universities to measure success in terms of the productivity of individual faculty members and departments. The data gathered feed into its comparative database, which reportedly holds information on more than 270,000 faculty members associated with more than 9,000 Ph.D. programs and 10,000 departments at more than 385 universities globally, so that universities can compare themselves against one another.

Brad Shelton, UO executive vice provost of academic operations, says Academic Analytics measures faculty productivity by considering several factors: how many research papers a faculty member has published, where those papers were published, how many times they have been cited, and so on.

“Those are a set of metrics that very accurately measures the productivity of a math professor, for example,” Shelton says. “It’s not easy to collect, but it’s straightforward. But another discipline, such as music, is on a completely different landscape for measuring productivity, and Academic Analytics ends up missing a lot.”

Bill Harbaugh, UO economics professor and creator of the blog UO Matters, says he believes the way Academic Analytics measures data “encourages the wrong kind of science. It encourages faculty to publish quickly, publishing in a way not to have greatest intellectual impact but to get the best numbers,” he says. “Faculty will purposely split something into two papers that would normally have been one, just to get more numbers of publications.”

According to the AAUP, Rutgers University signed a $492,500 four-year contract with Academic Analytics in 2013. By 2015, the school had abandoned use of the data, saying they “hardly capture the range and quality of scholarly inquiry, while utterly ignoring the teaching, service, and civic engagement that faculty perform,” and that “the Academic Analytics database frequently undercount[s], overcount[s], or otherwise misrepresent[s] the achievements of individual scholars.”

In late 2015, Spike Gildea, a linguistics professor at the UO, also questioned the accuracy of Academic Analytics data, comparing the figures the company provided against what individual faculty members in the linguistics department had recorded about their own work.

“I found that the Academic Analytics data were embarrassingly wrong, so I did an analysis of the kinds of errors I saw, emphasizing that these didn’t look like random errors that happened to hit a lot of linguists, but rather they potentially revealed systematic problems in their data-collection algorithms,” he says.

“This was especially important because, at that time, ours was the only department that had tested the data, and the results were so bad that somebody else needed to do some due diligence,” Gildea adds. “If the numbers for other departments were anywhere near as bad as they were for our department, then we were paying good money for bad information — which might mislead us into making bad decisions.”

Even though the data focus heavily on individual faculty members, the UO restricts access to department heads alone, something Harbaugh calls “a major transparency issue.”

Robert Berdahl, a longtime UO professor and the university’s interim president in 2012, works with Academic Analytics on a consulting basis as one of four “senior academic advisors,” Stapleton says. Berdahl is also a former president of the Association of American Universities, a separate group from the AAUP.

In early 2016, Gildea presented his findings to faculty and administrators around the university until the matter drew wider attention, and the provost and academic affairs office ultimately decided to conduct its own study of the accuracy of Academic Analytics’ data.

The study, Shelton says, compared the curricula vitae of 30 randomly chosen faculty members against the corresponding Academic Analytics data.

“What we found is that Academic Analytics data is very accurate — it’s always accurate. If there are small errors, they fix them right away,” Shelton says. “The question becomes: How close does that represent [how] the faculty in that field represent themselves? And as we move further from the sciences, that mismatch grows and the data doesn’t reflect so well what the faculty is doing.”

Shelton says the UO does not use the data to make hard-and-fast decisions. Rather, the overall expected productivity of individual departments is always considered, and the data are examined only in conjunction with traditional faculty measurements and reviews, a standard practice among universities.

The AAUP’s statement also advised that universities and faculty members should “exercise extreme caution in deciding whether to subscribe to external sources of data like Academic Analytics and should always refrain from reliance on such data in tenure, promotion, compensation or hiring decisions.”