Transcript: Altmetrics Under The Microscope

Interview with Prof. Cassidy Rose Sugimoto

For podcast release Monday, June 22, 2015

KENNEALLY: Scientific research seeks answers to questions large and small, from the composition of atoms to the age of the universe. The answers, of course, can lead to wider understanding, professional advancement, and sometimes commercial success.

Welcome to Copyright Clearance Center’s podcast series. I’m Christopher Kenneally for Beyond the Book. Scientists, funders, and institutions may value research for its own sake, yet they are practical, too. They seek reliable measurements of the impact their work has in the world. Over the last several years, the rise of Web publishing and the accompanying proliferation of data has spawned an explosion in the metrics population. Professor Cassidy Rose Sugimoto devotes her research to the domain of scholarly communication and scientometrics, examining the formal and informal ways in which knowledge producers consume and disseminate scholarship. She joins me now from her office at Indiana University Bloomington. Welcome to Beyond the Book, Professor Sugimoto.

SUGIMOTO: Hi, Chris, thanks for having me on the show.

KENNEALLY: We’re delighted to have you join us today. We can tell people that you have presented on this topic at numerous conferences and received research funding from the National Science Foundation, the Institute of Museum and Library Services, and the Sloan Foundation, among other agencies. We heard you speak at the annual U.S. conference of STM, a global trade association for academic and professional publishers, where you examined the proliferation, reliability, and validity of so-called alternative metrics. I guess a place to start here is really what’s the goal that researchers have in all this? They’re trying to measure their impact, but what do they mean by impact?

SUGIMOTO: I think we have largely quantified impact through traditional scientometric measures, that is, the impact of science on science – how science is cited by other scientists and used in publications by other scientists. But I think that there is a growing need, not only from external pressures from research funding agencies, but also a need coming from the scientists themselves, to ensure that their work is having impact on the public beyond academia. They’re looking to these so-called altmetrics and other forms of metrics as ways to quantify that impact outside of the scientific realm.

KENNEALLY: Right, so that impact on science itself was measured by, famously, the so-called impact factor. We could devote an entire program to impact factor. But for those in our audience who are not familiar, tell us briefly what the impact factor measures. We’ll start there, and then we’ll talk about why alternative metrics seem to be so necessary.

SUGIMOTO: Sure. The impact factor is an interesting metric because it applies to journals, and it was used, frankly, as a collection development tool for librarians to be able to identify which journals were being used the most by scientists. It was misappropriated, in many ways, to become a quality measure. People then began to identify individuals as having greater quality when those individuals were associated with those journals. So we took a metric that was really supposed to be used for institutional decision-making and applied it to individuals. So that’s one issue. Altmetrics has largely worked at the article level rather than the journal level, again taking impact and equating it with quality – the quality of an individual article, and then of the individuals associated with that article.
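
For reference, the commonly used two-year journal impact factor that Professor Sugimoto describes is, in its standard formulation, computed as:

\[
\mathrm{IF}_{Y} \;=\; \frac{\text{citations received in year } Y \text{ to items published in years } Y-1 \text{ and } Y-2}{\text{number of citable items published in years } Y-1 \text{ and } Y-2}
\]

So, for example, a journal that published 100 citable items across 2013–2014 and whose items from those two years were cited 250 times in 2015 would have a 2015 impact factor of 2.5. As the discussion notes, it is a journal-level average, not a property of any individual article or author.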

KENNEALLY: The other aspect of IF as they call it – impact factor – that is, I think, interesting and important to the discussion is the way that IF has proven to be a kind of commercial tool, as well, for the journals themselves.

SUGIMOTO: Absolutely, and a branding mechanism. I think it is quite common for researchers to stake their reputation on the impact factor of the journals in which they publish. In that way, it has become the goal in and of itself – to publish in a journal with a high impact factor can establish an individual’s reputation.

KENNEALLY: So altmetrics are really fairly new. I understand there is a so-called manifesto for altmetrics that goes back only a couple of years. Tell us briefly what these alternative metrics were promising. What’s the vision of altmetrics?

SUGIMOTO: The vision of altmetrics was the idea that we have metrics that show us – that indicate – the use of documents, but they don’t show us the broader realm of what scholars can do and the kinds of activities they are doing. The ideal is to make manifest all of those activities – the fact that scholars produce data, that scholars teach classes, that scholars attend conferences, that scholars mentor students. All of those things were invisible to these metrics. We were focusing on a single metric for a single genre, and that was seen as inadequate for a holistic and comprehensive understanding of a scholar’s value and work to their institution. One of the ideas behind the altmetrics movement was to make visible that which was previously invisible.

KENNEALLY: I wonder if you think that we’ve moved from impact factor to ego factor. This is essentially about the scientists and their reputations.

SUGIMOTO: Absolutely. I’m very concerned about the type of goal displacement activities that happen within altmetrics. We know that there are abuses of citation measures and metrics built upon citations, but they are much harder to game than altmetrics. Altmetrics are still quite feral, quite unstandardized, and we lack a lot of understanding of what these measures mean. There’s also the fact that scholars are spending a lot of time curating their online portfolios, and this includes altmetrics. The degree to which they’re investing time in that activity rather than in scholarship itself becomes potentially damaging to the scientific enterprise.

KENNEALLY: Interesting. Scientists, then, are no different from the rest. They are curating their online presence, as you say. But in favor of altmetrics, though, is the idea that scholarly communication, via journal publishing, has been a kind of slow-moving process; rather rigid, rather formal. There’s a recognition here in this digital age that science, that the advancement of knowledge, is something that is very fast-moving and, in fact, very messy.

SUGIMOTO: Absolutely, and that’s essentially Jason Priem’s idea behind altmetrics –

KENNEALLY: Jason Priem was the man who declared this new age of altmetrics.

SUGIMOTO: Right, right. He was one of the coauthors of that manifesto, and he said three things about this. He said that altmetrics could uncover previously invisible traces, that citations miss impact, and then, as you said, that citations are part of this slow, rigid, and formal communication system, and that science itself is messy and fast-moving. To that end, absolutely, altmetrics has promise. Altmetrics gives us timely indicators of impact. Now, a greater understanding of what that impact actually is will be necessary before we can more firmly utilize these measures.

KENNEALLY: What has happened, though – and this is what your research has looked at and uncovered to help us better understand – is that these new forms that have appeared only in the last couple of years have yet to replace the old forms. So what we have is this proliferation, this explosion of metrics. Really, you’re looking at how successful they are at measuring what they seek to measure.

SUGIMOTO: Absolutely. And I have to cite the Leiden Manifesto, which came out this year, and one of the great quotes from that is that metrics have proliferated; they’re usually well-intentioned, they’re not always well-informed, and they’re often ill-applied. That’s where we are (overlapping conversation; inaudible).

KENNEALLY: That’s three for three, it would seem to me.

SUGIMOTO: Right. So they’re well-intentioned. I think the idea of uncovering all of these forms of impact that a scholar has is very important, and it has different levels of importance for different disciplines. There are some disciplines where the impact upon society is very important, and we need to measure that. But the use of these metrics, and the calculation of indicators from them, still leaves a bit to be desired.

KENNEALLY: We are speaking today with Professor Cassidy Rose Sugimoto, who is on the faculty at Indiana University Bloomington, researches so-called scientometrics, and has done a fair amount of research about alternative metrics and their impact on the scholarly community. One of the things that you have looked at and helped to clarify for us is the way that some of these metrics – well, the goal was to reveal things which were previously invisible, and indeed they are revealing things, some of them troubling. For example, with regard to gender differences in authorship.

SUGIMOTO: Absolutely. One of the things we’re looking at is the way in which the Internet can possibly become a space for equal communication, where under-represented minorities, where people whose voices were previously unheard in a discipline, could have a space for discourse. But what we’re finding is that the gender imbalances and other disparities that we’ve seen in the scientific workforce continue in these online environments. So we must take a lot of care in interpreting these metrics, to make sure that they are not exacerbating disparities that we’ve already seen.

KENNEALLY: It seems to me, Professor, as someone who’s outside of the faculty world – the world of the university and so forth – that what you’re seeking to measure sounds rather, shall I say, squishy. But your research, though, really relies upon some theorems, some laws that you have helped to identify as kind of the rules of the road for these types of activities, for metrics themselves. So there is something called Campbell’s law; perhaps you can tell us about that. And then there’s also the law of requisite variety. I don’t want you to get into all the jargon, but I would want you to help the people who are listening understand that the type of research you’re doing is itself rather formal and, frankly, quite rigorous.

SUGIMOTO: Well, thank you, I appreciate that. We rely a lot on large-scale, heterogeneous data sets, and one of the exciting movements right now is the availability of those data sets through open data policies and regulations. The more data that we can have and combine, the better we can understand the scientific system, which, in and of itself, is a very complex system with a lot of moving parts.

One of the laws that you mentioned was the law of requisite variety, which was developed in the area of cybernetics. Simply put, it states that for a system to be stable, the number of states in its control mechanism must be greater than or equal to the number of states in the system being controlled. That is, if we have a complex system with many moving parts, we need to make sure that the control mechanism is taking all of those parts into account. What we’ve done a lot with previous scientometric measures is rely on single variables, and that really reduces the complexity of the scientific system in ways that are harmful for making proper evaluations. So one of the things that we try to do is get the big picture, look at the macro level, and try to piece together what’s happening in this highly complex and highly interdependent system.
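
In symbols, one simplified statement of the law of requisite variety as described here (following W. Ross Ashby’s work in cybernetics) is:

\[
V(\text{controller}) \;\geq\; V(\text{system being controlled})
\]

where \(V(\cdot)\) denotes variety, that is, the number of distinguishable states. Read against research evaluation, a control mechanism built on a single indicator cannot match the variety of a scientific system with many moving parts.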

KENNEALLY: Campbell’s law also intrigued me, because what it states, again in précis, is that the more any indicator is used for decision-making, the more it is open to corruption and corruption pressures. So really the gaming of the system is an important thing to measure, as well as the rest of the system.

SUGIMOTO: Absolutely. This is one of those laws that I keep in the back of my mind throughout all of my research, with the idea that the more indicators we create, the more distortions we will see in the system, and they will distort towards the indicator. We can call this goal displacement, and there are a lot of other sociological terms for the way people, when given an incentive, will change their behavior. Now – and this may seem cynical – there are fantastic ways to socially engineer the process. If our indicators change behaviors in desirable ways, then the end is good. But what we’ve seen is that a lot of these incentive structures change behaviors in ways that are not desirable. We’ve seen an increase in scientific fraud, in gaming, and in other forms of research misconduct as a result of these incentive structures. So we need to ensure that, as we add yet another measure quantifying scholarly output, we are not distorting the scientific system in the process.

KENNEALLY: Professor Sugimoto, I’m going to ask you to do what you always have to do at the end of every semester, which is to give out some grades. The question I have is, how well are we doing now in this age of altmetrics? What grade would you give, not altmetrics itself, but just the effort that is being made in the scientific community to better measure its impact on global engagement?

SUGIMOTO: Well, that’s a hard question, Chris, because I’m usually not allowed to give grades on effort. For effort I would give it an A. This is a very important conversation that we need to have. We need to ask what the value of higher education is, and we need to ask what the value of research is, and we have to ask what the value of the current publishing system is. Those are very important questions and the altmetrics movement has done a lot to spur that conversation.

Is this metric doing well right now? I give it probably a C. I think that it’s telling us some information, but what it’s telling us, and how accurate that information is, is not yet known.

KENNEALLY: Very interesting, and we can certainly say that you are not someone who is prone to grade inflation. We didn’t hear any As except for effort, and that’s a very kind one.

Professor Cassidy Rose Sugimoto of Indiana University Bloomington, thank you so much for joining us today on Beyond the Book.

SUGIMOTO: Thank you, Chris.

KENNEALLY: Beyond the Book is produced by Copyright Clearance Center, a global rights broker for the world’s most sought after materials, including millions of books and e-books, journals, newspapers, magazines, and blogs, as well as images, movies and television shows. You can follow Beyond the Book on Twitter, find us on Facebook, and subscribe to the free podcast series on iTunes, or at our Website, beyondthebook.com. Our engineer and co-producer is Jeremy Brieske of Burst Marketing. My name is Christopher Kenneally. For all of us at Copyright Clearance Center, thanks for listening to Beyond the Book.
