Editors' Update is your one-stop online resource to discover more about the latest developments in journal publishing, policies and initiatives that affect you as an editor, as well as other services and support available. Discover and participate in upcoming events and webinars and join in topical discussions with your peers.
Our Journal Insights tool now offers increased information on metrics, publication speed and ‘reach’ for participating titles.
Hans Zijlstra | Project Manager, Marketing Communications & Researcher Engagement department, Elsevier
We know that when searching for the right home for their research, authors aren’t just interested in standard information such as Aims & Scope – they also want data on the performance of that journal.
To help meet that need, around two years ago we developed the Journal Insights tool, which brings together several important journal indicators on a journal’s homepage.
Based on researchers' positive feedback and suggestions, we have continued to develop the technology and are now happy to announce some additions to the information provided.
How to find Journal Insights
Clicking on the image brings authors to a landing page where they can select data visualizations of three key groups of metrics developed to aid their decision making – Impact, Speed and Reach. Please note that not all metrics may be available for every journal.
To view Journal Insights information, an HTML5-compatible browser (Firefox, Chrome, Internet Explorer 9 or higher) is required.
Until recently, visitors could choose to view definitions, graphs, and relevant supporting data for the following five metrics:
We have now added a sixth option - Acceptance Rate. This displays the percentage of all articles submitted to that journal in a calendar year accepted for publication that year. Both the number of submitted articles and the number of accepted articles are shown.
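The arithmetic behind the Acceptance Rate metric is straightforward; a minimal sketch follows (the figures in the example are illustrative, not real journal data):

```python
def acceptance_rate(submitted: int, accepted: int) -> float:
    """Percentage of articles submitted to a journal in a calendar year
    that were accepted for publication in that same year."""
    if submitted <= 0:
        raise ValueError("submission count must be positive")
    return 100.0 * accepted / submitted

# e.g. 300 acceptances out of 1,200 submissions -> 25.0 percent
```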
Please note, the 2014 datasets for Impact will be added around July, when Scopus and Thomson Reuters release their latest metrics.
The metrics provided are:
Journals can now choose to exclude special issue articles – which can take longer to produce than an average journal article – from the Online Article Publication Time metric. If they have chosen this option, it will be made clear on the page. 2014 data has also been added.
Originally called ‘Authors’, this section displays a world map indicating the number of primary corresponding authors at the country level over the past five years.
The author map remains and we have added a world map displaying the number of full-text downloads from ScienceDirect on a country level over the past five years (if available). 2014 data has also been added.
Learn more about our plans to expand the use of altmetrics on our platforms and the other metrics-related projects we are working on.
Elsevier’s altmetrics program is growing. This year will see us expand our deployment of this data on our various platforms and introduce new methods of classification and display.
Elsevier’s relationship with altmetrics – measurements of online engagement – dates back to the beginning of this exciting field. In November 2011, Elsevier announced the winner of its Apps for Science Challenge, Euan Adie. His idea for measuring the attention that research articles receive via social media and online news sites won the Grand Prize of $15,000.
The Altmetric.com score and donut visualization that Euan developed have been displayed in our abstract and citation database of peer-reviewed literature, Scopus, since 2012. They appear in the sidebar of document and abstract pages when data is available for the article being viewed.
In November 2013, we launched pilots using similar visualizations on our research content platform ScienceDirect, and on Elsevier’s journal homepages; you can read more about these in the Editors’ Update article “Elsevier expands metrics perspectives with launch of new altmetrics pilots”. Around this time, Cell and The Lancet also began to display a brick version of the same score on the online versions of their articles. These pilots are now ending and the results have been analyzed. Based on the results and your positive feedback, ScienceDirect and the journal homepages will roll out altmetrics data to a wider set of journals over the coming months.
Elsevier’s 2013 acquisition of Mendeley, the free reference manager and academic social network tool, also put Elsevier at the forefront of altmetrics data providers. We continue to make the Mendeley readership statistics freely available for use in applications and on websites. You can visit http://dev.mendeley.com/ and become a member of the community. Mendeley also hosts regular developer meetings at its offices, where you can learn more about the API that allows you to develop very complex applications using the Mendeley infrastructure.
In its pilot for 27 large journals, ScienceDirect tested alternating altmetrics images at the article level. Visitors landing on the relevant pages had a 50 percent chance of seeing either the information presented in The Lancet/Cell brick format or the donut. User interaction with the two visualizations was similar, which indicated that the information was interesting regardless of how it was presented.
The journal homepages pilot involved 30 journals – both large and small – from various fields. The donut and article title for each journal’s top three rated articles appeared in an ‘altmetrics pod’ on the homepage (Figure 1). By clicking on the ‘view all’ option beneath the top three list, visitors could review the altmetrics score for the top 10 articles. Interestingly, 86 percent of visitors chose to find out more information by clicking on the article title, which took them to ScienceDirect, rather than the donut link to the Altmetric.com detail page.
Elsevier is certainly playing its part in the wider community. Dr. Lisa Colledge, Director of Research Metrics, has been working with university research offices to agree on a set of standard metrics that can inform their university strategies. The Snowball Metrics group has endorsed altmetrics, which are now included in its open, community-led institutional metrics program. The four Snowball altmetrics 'buckets' cluster together data types that result from similar activity: Scholarly Activity, for example, is the number of times publications have been posted in online tools typically used by academic scholars, such as Mendeley and CiteULike, while Social Activity counts the number of social media posts stimulated by publications, such as those on Facebook, Twitter and Pinterest.
William Gunn of Mendeley and Michael Habib, Elsevier’s Senior Product Manager for Scopus, are at the forefront of the NISO (National Information Standards Organization) Alternative Metrics program, which aims to advance standards and/or best practices in this area.
Michael Taylor of Elsevier Labs plays an active part in the community, working with external groups on metrics formulations; attending and organizing conferences; funding postdoctoral research; and publishing various articles. Mike recently guest edited a special altmetrics edition of Research Trends, our free online magazine for insights into scientific trends, which contains a great deal of valuable contextual material, research summaries and thoughts on the future from key players in the field.
Elsevier has also recently launched its Metrics Development Program to provide data and financial sponsorship to individuals and research groups working on research metrics.
In the year ahead, we will start experimenting with displays of the ‘buckets’ of altmetrics data. Inevitably, as more research is undertaken, and more people become aware of the potential for exploring and sharing content offered by public and scholarly engagement indicators, this field will move on. There is already discussion on the future of the “alt” (for ‘alternative’) in altmetrics, indicating that they are increasingly perceived as mainstream.
Researchers can already monitor the impact of their own papers published in Elsevier journals, and get more specific insights, via free tools such as ScienceDirect Usage Alerts and the weekly CiteAlert service. They can also, of course, discover who is talking about their articles by following the links from the altmetrics information on Scopus. But above all, researchers can get involved by starting to share their research via the social media channels that best serve their network, and by starting to include altmetrics information on CVs.
Based in Oxford, Mike Taylor has worked at Elsevier for 18 years, the past four as a Technology Research Specialist for the Elsevier Labs group. In that role, he has been involved with the ORCID Registry. His other research interests include altmetrics, contributorship and author networks. Details of his research work can be found on the Elsevier Labs website. He is currently producing a series of three plays about scientists for Oxford-based theater company www.111theatre.co.uk.
As Marketing Project Manager, Hans Zijlstra is responsible for projects focusing on journal and article metrics with the aim of improving our service to authors. He joined Elsevier in 1996 after working as an expert in database marketing for a German mail order company and Visa. He held various senior marketing positions before leaving to take on management roles in medical communications, telecom, finance and sailing. He also set up his own consultancy, Zijlvaart Advies & Actie, specializing in cultural heritage and museum marketing. He returned to Elsevier in 2008, initially doing marketing for magazines and webinars in horticulture and infosecurity. Via portfolios in Computer Science and Biochemistry he landed in his current role. In his spare time he researches and writes about maritime history, genealogy and 17th-century paintings, alongside sailing the Dutch Wadden Sea.
Dr. Lisa Colledge is an expert in the use of research metrics that provide insights into the research performance of universities, researchers, journals, and so on. She developed this knowledge “on the job” by working with editors and learned societies to build understanding of journal Impact Factors relative to their competitors, and to develop strategies to improve journals’ standings. She then joined the Elsevier Research Intelligence product group, which develops tools to support performance measurement, and most recently launched SciVal. Lisa currently applies her expertise to Snowball Metrics, for which she is Program Director; in this initiative, a group of universities agrees a single robust method to generate metrics that are commonly understood, draw on a combination of all data sources available to the universities generating them, and thus support international benchmarking. Prior to joining Elsevier, Lisa conducted postdoctoral research at the University of Edinburgh. She holds both a DPhil and an MA from the University of Oxford.
4 Aug 2014
As interest in measurement metrics continues to grow, Elsevier launches a new program to fund research in this area.
Gali Halevi | Senior Research Analyst and Program Director, Elsevier
Elsevier has launched a Metrics Development Program to provide data and financial sponsorship to individuals and research groups working on scientific evaluation metrics.
The aim of the program is to encourage research into a range of impact measures such as bibliometrics, altmetrics, and individual- or institution-level metrics.
Researchers, administrators and managers across industries and countries are constantly seeking ways to measure the impact of their funded projects and research in a systematic manner.
For more information about submissions and proposals, please visit the Metrics Development Program website.
14 May 2014
While there is much publishers and editors can do to ensure good research is highlighted, there’s no doubt authors also have an important role to play.
Elsevier has developed several initiatives designed to help authors promote their papers. Below you can learn more about two of the most recent – Kudos and Share Link. We also touch on a longer-standing project, AudioSlides. And with researchers increasingly evaluated not only by the number of articles they have published but also by their impact, initiatives like these have never been more important.
What is Kudos?
Traditionally, the impact of publications is measured by citations. However, citations take a while to begin accumulating, and they provide only a limited picture of an article's reach. For that reason, other metrics – such as readership figures, social media mentions, and captures and shares on academic networks – are proving increasingly popular.
This is where a new service called Kudos comes in. In the words of its founders, Kudos was developed to help researchers, their institutions and funders "measure, monitor and maximize" the visibility and impact of their published articles. It does this by focusing on three core principles:
Kudos provides a platform for:
After a successful alpha release phase in partnership with AIP Publishing, the Royal Society of Chemistry and Taylor & Francis, Kudos is ready to take the next step and has signed up additional publishers, including Elsevier, for their beta phase. During this beta phase, which runs from April–December this year, we will test the tool with 22 journals.
Elsevier journals participating in the Kudos initiative are:
Resuscitation
American Heart Journal
Vaccine
Evolution and Human Behavior
Virology
Journal of Molecular Biology
Journal of Adolescent Health
The Journal of the Economics of Ageing
Fertility and Sterility
Journal of Consumer Psychology
Journal of Human Evolution
Leukemia Research Reports
Science of the Total Environment
Thrombosis Research
Journal of Archaeological Science
Journal of Functional Foods
Journal of Research in Personality
Appetite
How Kudos works
Following publication of their articles, authors from participating journals will receive an email asking them to log on to the Kudos platform. On the platform, they will be led through various steps that prompt them to explain their article; add context via links to other content such as images and data; and share their article via social networks and email.
The Kudos platform, which is free for authors, allows authors to see the effect of their actions on altmetrics (via Altmetric.com) and data about the usage of their article on Elsevier’s ScienceDirect.
The alpha pilot site for Kudos was launched in September last year and during the three-month pilot period more than 5,500 authors registered. They have claimed articles, enhanced them with additional metadata (such as a short title and lay summary) and links to related resources, and shared them via email and social networks, which has led to increased usage of those articles.
During the beta phase, Kudos is working with a much wider group of publishers, articles and authors, which will enable them to undertake more rigorous analysis of the effectiveness of the service, and explore variables such as subscription versus open access.
For more than a decade, we have provided authors publishing in an Elsevier journal with an ‘e-offprint’ of their article – a PDF version they can share with their colleagues and peers.
But times and technologies have changed, and this year we are rolling out a new functionality: Share Link. Instead of a PDF, authors will receive a personalized link providing 50 days’ free access to their newly-published article on ScienceDirect.
Each customized link is ideal for sharing via email and social networks such as Facebook, Twitter, LinkedIn, Mendeley, and ResearchGate. Users clicking on the Share Link will be taken directly to the article with no sign up or registration required.
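Conceptually, a Share Link is a tokenized URL with a fixed free-access window. The sketch below is a generic illustration: the token scheme, URL pattern, and article identifier are hypothetical (the article does not document them); only the 50-day window comes from the text above.

```python
import secrets
from datetime import date, timedelta

ACCESS_WINDOW_DAYS = 50  # free-access period stated for Share Link

def make_share_link(article_id: str, published: date) -> dict:
    """Build a hypothetical personalized link that expires 50 days
    after the article's publication date."""
    token = secrets.token_urlsafe(16)  # illustrative; real tokens may differ
    return {
        "url": f"https://example.com/article/{article_id}?token={token}",
        "expires": published + timedelta(days=ACCESS_WINDOW_DAYS),
    }

def link_is_active(link: dict, today: date) -> bool:
    """True while the free-access window is still open."""
    return today <= link["expires"]
```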
When will this technology be available?
In December 2013, a trial mailing was sent to 26,000 authors whose articles were published in October of that year. Feedback was encouraging, with authors welcoming the opportunity to share their research.
After publication of a paper in Journal of Human Evolution, David J. Nash, Professor of Physical Geography at the School of Environment and Technology, University of Brighton, tweeted his enthusiasm about Share Link.
He later commented: “I’m very supportive of making research as widely available to end-users and the interested public as possible. Not everyone has access to academic journals, especially those in sub-Saharan Africa where I do much of my research. Rather than having to send my contacts personal copies of papers, it is much more useful to have free access, even if only for a limited period. The research I published last year in the Journal of Human Evolution was funded internally by my institution. As such, I did not have the resources to pay for full open access to my article. Any move to improve this situation would be welcomed.”
Following an expansion of the initial Share Link trial, the decision has now been taken to roll out the program and we expect to make it available to all eligible Elsevier titles by mid-2014.
Which journals will offer Share Link?
The majority of Elsevier’s journals will benefit from the program. Journals operating outside the Elsevier Production Tracking System (PTS), from which the Share Link data is extracted, and a selection of other titles are currently not included.
What are the benefits?
What will happen to the current e-offprint program?
Once the Share Link program has been rolled out to all eligible journals, the current e-offprint program will be closed. For those journals not using Share Link we are working on an alternative solution.
AudioSlides allow authors to make mini-webcasts about their papers
AudioSlides are 5-minute webcast-style presentations created by the authors of journal articles. Using a blend of slides (PDF and PowerPoint) and voice-over, authors can explain their research in their own words. The resulting presentation appears alongside their published article on ScienceDirect and, like the abstract, can be viewed by subscribers and non-subscribers alike.
Because AudioSlides presentations are made available under a Creative Commons open-access license, authors can also embed them on their personal or institutional websites. The team has also recently made it possible for authors to download their presentations in mp4/movie format so they have the option to promote them through other channels, such as YouTube, or in presentations at workshops and conferences.
Since the initiative was launched, more than 1,760 AudioSlides presentations have been created.
Inez van Korlaar
DIRECTOR OF PROJECT MANAGEMENT
Inez van Korlaar (@InezvKorlaar) joined Elsevier in 2006. After three years in publishing, she moved to the marketing communications department of STM Journals. In her current role she is responsible for global marketing communication projects, which includes outreach to researchers in their role as an author. She has a PhD in health psychology from Leiden University in The Netherlands and is based in Amsterdam.
In this Sharing Research Special Issue, I am delighted to welcome as Guest Editor the Senior Vice President of Elsevier’s Marketing Communications & Researcher Engagement department, Nicoline van der Linden. After gaining an MSc in Medical Biology from the University of Amsterdam and an MBA from the Rotterdam School of Management at Erasmus University, she began her career as a researcher in Life Sciences. She worked as a molecular biologist in the pharmaceutical industry in Basel before joining Elsevier’s Amsterdam office two decades ago. Since then, she has held various roles in publishing, product development, marketing communications and researcher engagement.
I hope you enjoy this issue. We’ll be back in September with a focus on technology and how it can support you in your role.
It was author Isaac Asimov who wrote in the 1970s: “It is change, continuing change, inevitable change, that is the dominant factor in society today. No sensible decision can be made any longer without taking into account not only the world as it is, but the world as it will be.”
Those words ring just as true in 2014. We see the role of our editors and authors changing – not only in terms of how they must manage their journals or craft their papers, but in their day-to-day lives as academics. The world is digitizing at a very fast pace; this has greatly influenced how we search for information and has broadened the possibilities for dissemination and visualization of content.
The role of publishers is evolving too. While we have long needed to ensure that manuscripts are publishable and protected, in recent years it has become increasingly important that we make them searchable, retrievable, citable, and suitable for archiving on all our platforms – and, to some extent, other platforms – and for this we need the latest technologies.
At Elsevier, we operate an integrated marketing communications policy designed to ensure that messaging and communication strategies are unified across all channels and are focused on the researchers we serve. We combine more traditional media with newer avenues and allow the strengths of one to support the weaknesses of the other.
Increasingly, that integrated marketing communication is being driven by technology, and the pull is almost irresistible. Just look at popular author services such as Journal Insights, CiteAlert and Article Usage Reports, where automation/IT and promotion go hand in hand. This rising focus on technology will allow our marketing to become more and more targeted as we embrace databases and new electronic delivery systems. We hope this means we will be able to deliver more meaningful information to our research communities. To assist with that process, we have developed an online Customer Preference Center, where recipients can choose which communications they would like to receive.
As well as the journal-specific campaigns – highlighted in the Marketing Overview you receive from your marketing communications manager each year – we also run ‘global’ campaigns, which cater for large numbers of titles. These allow us to deliver consistent, timely information to authors and/or editors, no matter which journal they are associated with. Examples include:
Each year, we open more than 18,500 articles to the public through promotional access. Via Elsevier funding (e.g. by waiving OA article fees) we open up another 2,440+. Together, that is more than 20,000 articles. The majority of these receive our support because the editor has indicated they are special in some way, or analysis of reader behavior has led us to do so. We also make journal articles openly available to the press. In addition, Elsevier is actively supporting open data. While we have already been leading in linking our articles to open data at various data repositories, we are now investigating how we can open up all supplementary materials on ScienceDirect that contain original research data.
It is worth noting that with the introduction of new tools, techniques and business models, responsibilities are changing. As editors, there is still much you can do to make noteworthy or novel research more visible to our readers, as we explain in How to promote research in your journals (and why you should).
But for authors, it is no longer the case that publishing their article will ensure people read their research. They have an important role to play in raising the profile of their article. This is especially true with the rise in the number of OA articles, which sees some of the promotional responsibilities (and possibilities) for sharing divested to the authors themselves. In New tools help authors boost the visibility and impact of their research, we outline some of the avenues we have available to support them.
Interestingly, this new emphasis on author self-promotion may leverage the already shifting focus from the Impact Factor to other measures such as downloads, social media shares, SNIP, Eigenfactor, and the h-index. If it does, you can rest assured that Elsevier’s Marketing Communications & Researcher Engagement department will be ready to respond…
With the increasingly interconnected internet and developments in ‘big data’ analysis, there are now many ways available to measure research impact.
Traditional bibliometrics may be supplemented by usage data (pageviews and downloads), while the success of online communities and tools has led to more widespread visibility for learned commentary, readership statistics, and other measures of online attention. Altmetrics encompass social activity (mentions on social media sites), scholarly activity (captures in online libraries and reference managers), and scholarly commentary (for instance, in scientific blogs and mass media references).
Elsevier has long been an advocate for robust informetrics and we are particularly interested in understanding how these new measures can be used in conjunction with usage and citation data, to provide new meaningful indicators to the research community. Mendeley statistics, for instance, appear to provide more insights into the academic use of a paper than Twitter.
Altmetrics – the measurement of scholarly activity on social networks and tools – is a major buzzword at the moment, and although it is a very new discipline, interest in it is growing fast, as demonstrated by the relative search volumes in the graph below.
It’s not only general interest that is growing – scholarly interest appears to show high growth too, as demonstrated by a simple keyword search on Scopus for altmetric* or alt-metric* in the title, abstract, or keywords of any paper, the results of which are visible in the graph below.
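For readers who want to reproduce the count, the Scopus Advanced Search query takes the form TITLE-ABS-KEY(altmetric* OR alt-metric*). The snippet below approximates that wildcard match locally; the sample records in the test are made up for illustration, not drawn from Scopus.

```python
import re

# Local approximation of the Scopus wildcard query
#   TITLE-ABS-KEY(altmetric* OR alt-metric*)
PATTERN = re.compile(r"\balt-?metric\w*", re.IGNORECASE)

def matches_query(record: dict) -> bool:
    """True if 'altmetric*' or 'alt-metric*' appears in the record's
    title, abstract, or keywords."""
    text = " ".join([
        record.get("title", ""),
        record.get("abstract", ""),
        " ".join(record.get("keywords", [])),
    ])
    return bool(PATTERN.search(text))
```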
The first two papers appeared in the Proceedings of the ASIST Annual Meeting in 2011; in 2012 the number of papers published jumped to eight, and in 2013 to 28 (data for 2013 may still be incomplete due to publication and indexing delays). Even though these are still low numbers in absolute terms, the growth is substantial.
Altmetrics offer an alternative insight into the use and readership of scholarly articles, and this information has driven authors, researchers, editors, and publishers to try to understand the data. To that end, Elsevier employees are engaged in the NISO Altmetrics project, have spoken at conferences around the world (e.g. altmetrics12, ALM workshop 2013), have written academic papers, and have conducted webinars in this multi-faceted field.
The last year has seen us launch several initiatives – three of which are explored in further detail below: pilots on Elsevier.com journal homepages and on ScienceDirect, and an existing Scopus project. Joining these is an article usage alert program that informs authors in participating journals how their article is being viewed. Mendeley continues to provide an invaluable, free source of data on the discipline, location, and status of researchers.
Elsevier has also formed partnerships with altmetrics start-ups:
This year will see even more metrics activity. With a special altmetrics issue of Research Trends planned, more products in the pipeline, and our partnerships beginning to produce solid results, we are continuing to actively support research in this fascinating field.
At the end of last year, we began displaying the Altmetric.com colorful donuts for a journal's top three rated articles on the Elsevier.com homepages of 33 Elsevier titles.
This rating is based on a social media traffic score given by Altmetric.com and an article must have received at least one social media mention within the last six months to qualify. By clicking on the "view all" option beneath the top three list, visitors can review the donuts for the top 10 articles. In both lists, the article name links to the full-text article on ScienceDirect, while the donut links to a breakdown of the news and social media mentions.
The pilot is led by Hans Zijlstra, Project Manager for Elsevier's STM Journals Project Management department. He worked closely with Elsevier's e-marketing team in cooperation with Altmetric.com — a company founded by Euan Adie (@Stew), who won Elsevier's Apps for Science Challenge in 2011.
Although it is still early days, Zijlstra will be closely monitoring how much traffic the donuts receive over the coming months. Based on those results and the feedback he receives, the aim is to make this available to all Elsevier journals.
He said: "These additional article metrics are intended to provide authors with extra insight into the various flavors of impact their research may achieve. We believe altmetrics will help them select a journal for article submission by giving a clearer indication of where a journal's strengths and weaknesses lie."
The donuts have also provided useful insights to publishers and editors. He explained: “They help publishers and marketeers determine which media they should engage with more often and publishers and editors can identify hot topics that might merit a special issue.”
Zijlstra and his colleagues are still working on adding to the journal homepage the names of the authors for the top ranked articles. In addition, they plan to include the donuts for participating health and medical titles on their homepages on the Health Advance platform.
Altmetric.com’s colorful donut explained
The Altmetric.com algorithm computes an overall score taking into account the volume, source and author of the mentions a paper receives. This includes mentions of academic papers on social media sites (e.g. Twitter, Facebook, Pinterest, Google+), science blogs, many mainstream media outlets (including The New York Times, The Guardian, non-English-language publications like Die Zeit and Le Monde, and special interest publications like Scientific American and New Scientist), the peer-review site Publons, and reference managers.
News items are weighted more than blogs, and blogs are weighted more than tweets. The algorithm also factors in the authoritativeness of the authors, so a mention by an expert in the field is worth more than a mention by a lay person. The visual representation – the Altmetric.com donut – shows the proportional distribution of mentions by source type. Each source type displays a different color – blue for Twitter, yellow for blogs, and red for mainstream media sources. Links to the source data are also available. Altmetric.com tracks around a hundred thousand mentions a week, with some 3,000 new articles seen each day.
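Described in code, the weighting scheme looks roughly like the sketch below. The numeric weights and the authority multiplier are illustrative assumptions – Altmetric.com's actual coefficients are proprietary – but the ordering (news over blogs over tweets, expert over lay author) follows the description above.

```python
# Illustrative relative weights; only the ordering reflects the text above.
SOURCE_WEIGHTS = {"news": 8.0, "blog": 5.0, "tweet": 1.0}

def attention_score(mentions):
    """Sum weighted mentions.

    mentions: iterable of (source_type, authority) pairs, where
    authority is a multiplier >= 1.0 for more authoritative authors
    (e.g. a recognized expert in the field).
    """
    return sum(SOURCE_WEIGHTS.get(source, 0.5) * authority
               for source, authority in mentions)
```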
Elsevier’s ScienceDirect platform, home to one-quarter of the world’s STM journal and book content, launched a six-month altmetrics pilot in December 2013.
Until June this year, 26 journals – including The Lancet – will display alternating altmetrics images at the article level. Visitors landing on the relevant pages have a 50 percent chance of seeing either the traditional Altmetric.com donut or the same information presented as a bar chart. This reflects ScienceDirect’s A/B testing approach – the results will be monitored to discover which design is the clearest and most engaging for users. The pilot also includes sharing buttons to promote social media mentions of the covered articles and will provide access to the individual article detail pages, which enables users to explore the actual mentions of the paper.
During the pilot we will be assessing the popularity of the altmetrics score with users. We will also be trying to determine how far the scores promote use of the article sharing buttons.
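A 50/50 split of the kind used in this pilot is typically implemented by bucketing visitors deterministically, so the same visitor always sees the same variant. The sketch below is a generic illustration of that technique, not a description of ScienceDirect's actual implementation; the identifiers and variant names are made up.

```python
import hashlib

def variant(visitor_id: str) -> str:
    """Assign a visitor to one of two variants by hashing a stable ID.
    Hash-based bucketing keeps assignment consistent across visits."""
    digest = hashlib.sha256(visitor_id.encode()).digest()
    return "donut" if digest[0] % 2 == 0 else "bar_chart"

# Over many visitors the split approaches 50/50:
ids = [f"visitor-{n}" for n in range(10_000)]
share = sum(variant(i) == "donut" for i in ids) / len(ids)
print(f"{share:.0%} saw the donut")
```

Engagement metrics (clicks, shares) can then be compared between the two buckets to decide which design wins.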
Todd Vaccaro, ScienceDirect Product Manager, commented: “The journals chosen for the pilot represent a good mix. We’ve included titles with a range of Impact Factors, types of attention on the social web, average altmetrics scores and subjects. We’ve also included some society journals and a recent OA title.”
Since June 2012, Elsevier’s Scopus – the largest abstract and citation database of peer-reviewed literature – has offered the Altmetric.com donut in the sidebar of document and abstract pages. It can be found on the right hand side of the screen when data is available for the article being viewed. Visitors can click through to scan the content mentioning the article and click on any entry to navigate to the original site. A “demographics” tab will also show a breakdown of where in the world the attention paid to the article is coming from.
Michael Habib, Senior Product Manager for Scopus, said: “Customers and users alike have found this a useful supplement to traditional citations. The primary point of interest hasn’t necessarily been the metrics themselves, but the underlying content. Discovering that a respected science blogger has given a positive review of your article is much more important than knowing how many people have blogged about it. This pilot has proven a powerful tool for uncovering these previously hidden citations from non-scholarly articles.”
SENIOR PUBLISHING INFORMATION MANAGER
As part of the Scientometrics & Market Analysis team in Research & Academic Relations at Elsevier, Sarah Huggett provides strategic and tactical insights to colleagues and publishing partners, and strives to inform the research evaluation debate through various internal and external discussions. Her specific interests are in communication and the use of alternative metrics for evaluating impact. After completing an M.Phil in English Literature at the University of Grenoble (France), including one year at the University of Reading (UK) through the Erasmus programme, she moved to the UK to teach French at the University of Oxford. She joined Elsevier in 2006 and the Research Trends editorial board in 2009.
TECHNOLOGY RESEARCH SPECIALIST
Based in Oxford, Mike Taylor has worked at Elsevier for 18 years, the past four as a technology research specialist for the Elsevier Labs group. In that role, he has been involved with the ORCID Registry. His other research interests include altmetrics, contributorship and author networks. Details of his research work can be found on the Elsevier Labs website. He is currently producing a series of three plays about scientists for Oxford-based theater company www.111theatre.co.uk.
The colorful altmetric donut used to indicate an article’s impact in news and social media is now featured on various journal homepages and ScienceDirect
Linda Willems | Senior Researcher Communications Manager, Elsevier
The academic community has traditionally looked to citation analysis to measure the impact of scientific and medical research. But with journal articles increasingly disseminated via online news and social media channels, new measures are coming to the fore.
Alternative metrics – or altmetrics – represent one of the innovative ways the reach of articles is now being assessed, and Elsevier has just launched two pilots featuring the highly-recognizable altmetric "donut."
The first pilot will feature donuts for a journal's top three rated articles displayed on the Elsevier.com homepages of 33 Elsevier titles.
This rating is based on a social media traffic score given by Altmetric.com; an article must have received at least one social media mention within the last six months to qualify. By clicking on the "view all" option beneath this list, visitors can review altmetric donuts for the top 10 articles.
In both lists, the article name links to the full-text article on ScienceDirect, while the donut links to a breakdown of the news and social media mentions.
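The ranking rule described above (score articles by Altmetric.com, but only count those with at least one recent social media mention) can be sketched as a simple filter-and-sort. The field names below are hypothetical; they are not the actual data structures behind the pilot.

```python
from datetime import datetime, timedelta

def top_articles(articles, n=3, now=None):
    """Keep articles with a social media mention in the last six months,
    then rank them by altmetric score (highest first)."""
    now = now or datetime.now()
    cutoff = now - timedelta(days=182)  # roughly six months
    eligible = [a for a in articles if a["last_mention"] >= cutoff]
    return sorted(eligible, key=lambda a: a["score"], reverse=True)[:n]

now = datetime(2014, 3, 1)
articles = [
    {"title": "A", "score": 50, "last_mention": datetime(2014, 2, 1)},
    {"title": "B", "score": 80, "last_mention": datetime(2013, 1, 1)},  # stale mention
    {"title": "C", "score": 30, "last_mention": datetime(2014, 1, 15)},
]
print([a["title"] for a in top_articles(articles, n=3, now=now)])  # ['A', 'C']
```

Note that article "B" is excluded despite its higher score, because its last mention falls outside the six-month window.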
The pilot is led by Hans Zijlstra, Project Manager for Elsevier's STM Journals Project Management department. He said his team will be closely monitoring how much traffic the donuts receive over the coming six months and, depending on uptake, their aim is to make the feature available to all Elsevier journals.
A parallel altmetric pilot for 25 journals will run on ScienceDirect, Elsevier's scientific database of journal articles and book chapters. The ScienceDirect pilot will have a greater focus on medical journals but there will be some overlap in titles between the two trials.
For some time now, Scopus, Elsevier's abstract and citation database of peer-reviewed literature, has been offering donuts on articles for which the relevant metrics are available.
This article first appeared in Elsevier Connect, an online magazine and resource center for the science and health communities with a broad and active social media community. It features daily articles written by experts in the field as well as Elsevier colleagues.
The altmetric algorithm computes an overall score taking into account the number of mentions the article receives and the importance of the sources. For example, news is weighted more than blogs, and blogs are weighted more than tweets. It also factors in the authoritativeness of the authors, so a mention by an expert in the field is worth more than a mention by a lay person. The visual representation — the altmetric donut — shows the proportional distribution of mentions by source type. Each source type displays a different color – blue for Twitter, yellow for blogs, and red for mainstream media sources. Links to the source data are also available.
The most famous traditional metric, the Impact Factor, divides the number of citations a journal receives by the number of scholarly articles it published. However, citations can take years to accrue.
One of the advantages of altmetrics is that the impact begins to be assessed from the moment the article is first posted online.
The pilot Altmetric pod for journal homepages has been developed by Elsevier's e-marketing team in cooperation with Altmetric.com — a company founded by Euan Adie (@Stew), who won Elsevier's Apps for Science Challenge in 2011.
For further details on the social media reports, and to see the score for any article containing a DOI, download the Altmetric Bookmarklet from Altmetric.com.
American Journal of Medicine
Applied Catalysis B: Environmental
Bioorganic & Medicinal Chemistry Letters
Journal of Archaeological Science
Computers in Human Behavior
Evolution and Human Behavior
Geochimica et Cosmochimica Acta
Earth and Planetary Science Letters (EPSL)
International Journal of Radiation Oncology Biology Physics
Journal of Econometrics
Journal of Experimental Social Psychology
Public Relations Review
Science of the Total Environment
Social Science & Medicine
European Journal of Cancer
Computers & Education
Journal of Hazardous Materials
Journal of Catalysis
Food Quality and Preference
International Journal of Surgery Case Reports
American Heart Journal
Earlier this year we unveiled the Journal Insights pilot. The initiative has now been rolled out to more than 800 journals and we highlight some recent improvements.
Hans Zijlstra | Marketing Project Manager, STM Journals Project Management department, Elsevier
Back in the March edition of Editors’ Update, we discussed the newly-launched Journal Insights project in the article Increased Transparency benefits both authors and journals.
The Elsevier.com and Health Advance homepages of all journals participating in the project (and the number has now reached 850) feature a new section, ‘Journal Insights’. Authors clicking on this link arrive at a landing page where they can select data visualizations of three key groups of metrics, developed to aid their decision making. When the project pilot was launched earlier this year, those three groups were Impact, Speed and Reach.
We have continued to work on improving the information displayed. Below I have outlined some recent changes which take on board lessons learned during the pilot phase and your early feedback.
For 2014, we would like to add new datasets and visualizations, and we will further improve the interface on the basis of author feedback. But we have to be realistic too: there is no such thing as a perfect dataset, which is why not all journals can display all metrics. The fact that Journal Insights was built in HTML5 also remains an issue for some older browsers: HTML5 is optimized for mobile devices but does not display well in Internet Explorer 8 or lower. Fortunately, access via these older browsers is diminishing and currently comprises less than 10% of our traffic.
We want to continue improving – if there are changes you would like to suggest, please contact me at firstname.lastname@example.org. And if you would like to see the Journal Insights information introduced on your journal homepage, please contact your publisher or marketing manager.
The 2012 Journal Citation Reports® (JCR) published by Thomson Reuters is now available.
Linda Willems | Senior Researcher Communications Manager, Elsevier
The 2012 Journal Citation Reports® (JCR) has been published by Thomson Reuters. Elsevier journals top the rankings in 59 subject categories, up from 57 in 2011. Of Elsevier's 1,600 journals in the JCR, 44% are in the top 25% of their subject category. In 2012, 15 Elsevier journals have risen to the top of their subject category, including Cell, reclaiming the top position in "Biochemistry & Molecular Biology".
Other highlights include:
"Another year of positive results reinforces my view that we're on the right track as a Publisher, but most of the credit should go to the world-class authors, editors and reviewers," said Philippe Terheggen, Managing Director Journals at Elsevier. "When we look at these scores, it's important to keep in mind that while the Impact Factor is an important measure for overall journal influence, it is not to be used for evaluation of individual researchers or articles. We're increasingly looking at additional metrics, including so called Altmetrics, as a measure of influence of journals and authors. Meanwhile, we will continue to invest in enhancing the quality of our content, for example by increased support of peer review, by enriching the online article, and through linking articles to research data sets in external repositories. That's the journey we're on."
All of The Lancet journals' Impact Factors increased. The Lancet rose from 38.278 to 39.060, while The Lancet Oncology saw an increase from 22.589 to 25.117, moving up its subject category ranking from 4th to 3rd position. The Lancet Neurology increased from 23.462 to 23.917, and The Lancet Infectious Diseases went up from 17.391 to 19.966, both journals maintaining the top rankings in their respective categories.
The journals of Cell Press, an imprint of Elsevier, mostly saw stable trends in Impact Factor, with highlights including strong growth from Molecular Cell (an 8% increase, to 15.280) and Trends in Cognitive Sciences (a 27% increase over 2011, to 16.008). The flagship journal Cell continues to lead its field with an Impact Factor of 31.957, and is the number one research journal in both the Cell Biology and the Biochemistry & Molecular Biology categories.
Of the 420 titles in the JCR that Elsevier publishes on behalf of societies, 261, or 62%, showed a rise in their Impact Factors. Nine of these rank number one in their subject categories, including Evolution and Human Behavior, which rose from 4th position in the category "Social Sciences, Biomedical". Two society journals reached a number one position for the first time: European Urology in "Urology & Nephrology", and Forensic Science International: Genetics, which ranked highest in the category "Medicine, Legal".
The Impact Factor helps evaluate a journal's impact compared to others in the same field by measuring the frequency with which recent articles in a journal have been cited in a particular year: the 2012 Impact Factor takes into account citations in 2012 to papers published in 2010 and 2011.
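As a worked example of that two-year window, the sketch below computes a hypothetical 2012 Impact Factor from made-up citation and article counts (the numbers do not refer to any real journal):

```python
# Citations received in 2012, broken down by the year of the cited paper,
# divided by the number of citable items published in those two years.
citations_2012 = {2010: 1200, 2011: 900}  # cites in 2012, by publication year
citable_items = {2010: 150, 2011: 160}    # scholarly items published per year

impact_factor_2012 = sum(citations_2012.values()) / sum(citable_items.values())
print(round(impact_factor_2012, 3))  # 2100 / 310 = 6.774
```

In practice the numerator and denominator are defined over slightly different document types, which is one of the frequently discussed subtleties of the metric.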
Of interest to: Journal editors (key), additionally authors and reviewers
Recent improvements to SNIP and SJR aim to make the metrics more intuitive and easy to understand.
Sarah Huggett | Publishing Information Manager, Elsevier
In recent years, computational advances have accelerated the field of bibliometrics. While the journal evaluation landscape was long dominated by a scarcity of measures, there are now many journal metrics available, providing a more varied and complete picture of journal impact. Editors may find these useful for comparing their journal to competitors in systematic ways.
Scopus features two such citation indicators to measure a journal's impact: SNIP (Source Normalised Impact per Paper) and SJR (SCImago Journal Rank). These indicators use the citation data captured in the Scopus database to reveal two different aspects of a journal's impact:
These two indicators use a three-year window, are freely available on the web and are calculated for all journals indexed in the Scopus database. The metrics have article-type consistency, i.e. only citations to and from scholarly papers are considered.
Some editors may have noticed changes to both SNIP and SJR values for their and other journals around mid-October 2012. These changes were introduced to make the metrics more intuitive and easy to understand. Following these improvements, the values are now computed and released once a year in the summer.
Further information on both metrics is available on the Journal Metrics website.
SNIP was developed by Henk Moed, Senior Scientific Advisor at Elsevier, who was then part of the CWTS bibliometrics group at the University of Leiden, The Netherlands. It is a ratio, with a numerator and a denominator. SNIP's numerator gives a journal's raw impact per paper (RIP). This is simply the average number of citations received in a particular year (e.g. 2013) by papers published in the journal during the three preceding years (e.g. 2010, 2011 and 2012).
SNIP's denominator, the Database Citation Potential (DCP), is calculated as follows. There are large differences between scientific subfields in the frequency with which authors cite papers. In view of this, an indicator of citation potential is calculated for each journal in the subject field it covers, and this citation potential forms SNIP's denominator.
SNIP is RIP divided by DCP.
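The RIP / DCP definition above can be sketched numerically. The per-paper citation counts and the citation-potential value below are invented; in reality the DCP is derived from the citation behaviour of the journal's subject field across the whole database.

```python
def raw_impact_per_paper(citation_counts):
    """Average citations received this year by papers published in the
    three preceding years (SNIP's numerator, RIP)."""
    return sum(citation_counts) / len(citation_counts)

# Citations received in 2013, per paper published in 2010-2012 (made up):
papers_2010_2012 = [4, 0, 7, 2, 3, 8, 0, 5, 6, 5]

rip = raw_impact_per_paper(papers_2010_2012)  # 40 / 10 = 4.0
dcp = 2.5  # hypothetical citation potential of the journal's field
snip = rip / dcp
print(snip)  # 1.6
```

Dividing by the field's citation potential is what lets SNIP values be compared across fields: a RIP of 4.0 means more in a sparsely citing field (low DCP) than in a heavily citing one.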
In October 2012, the following changes were applied:
Further details are available on ScienceDirect.
Dr Ludo Waltman, Researcher at the Centre for Science and Technology Studies (CWTS) of Leiden University, commented: “SNIP allows the impact of journals to be compared across fields in a fair way, and has been updated following the most recent insights in the fields of bibliometrics and scientometrics. The recent changes ensure the most balanced treatment of journals from different fields, with minimal implications for users.”
SJR was developed by the SCImago research group from the University of Granada, dedicated to information analysis, representation and retrieval by means of visualization techniques.
SJR looks at the prestige of a journal, as indicated by considering the sources of citations to it, rather than its popularity as measured simply by counting all citations equally. Each citation received by a journal is assigned a weight based on the SJR of the citing journal. A citation from a journal with a high SJR value is worth more than a citation from a journal with a low SJR value.
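This "a citation is worth what its source is worth" idea is in the same family as PageRank, and a toy version of such a prestige iteration is sketched below. This is emphatically not the actual SJR formula (which adds normalisation by article counts and further refinements); the citation matrix is made up.

```python
def prestige(cites, iterations=50, damping=0.85):
    """Toy PageRank-style prestige scores.
    cites[i][j] = number of citations from journal i to journal j."""
    n = len(cites)
    scores = [1.0 / n] * n
    for _ in range(iterations):
        new = []
        for j in range(n):
            # Each citing journal passes on its own score, split in
            # proportion to where its outgoing citations go.
            inflow = sum(
                scores[i] * cites[i][j] / max(sum(cites[i]), 1)
                for i in range(n)
            )
            new.append((1 - damping) / n + damping * inflow)
        scores = new
    return scores

# Journal 0 receives most of the citations, so it ends up most prestigious.
cites = [[0, 1, 1],
         [3, 0, 1],
         [4, 1, 0]]
scores = prestige(cites)
print(max(range(3), key=lambda j: scores[j]))  # 0
```

The key property to note: two journals with the same citation count can end up with very different scores, depending on who is citing them.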
In October 2012, the following changes were applied:
Further details are available on ScienceDirect.
Bollen J, Van de Sompel H, Hagberg A, Chute R (2009) A Principal Component Analysis of 39 Scientific Impact Measures. PLoS ONE 4(6): e6022. doi:10.1371/journal.pone.0006022
Waltman L, van Eck N J, van Leeuwen T N, Visser M S; Some modifications to the SNIP journal impact indicator, Journal of Informetrics, Volume 7, Issue 2, April 2013, http://dx.doi.org/10.1016/j.joi.2012.11.011
Guerrero-Bote V P, Moya-Anegón F; A further step forward in measuring journals' scientific prestige: The SJR2 indicator, Journal of Informetrics, Volume 6, Issue 4, October 2012, http://dx.doi.org/10.1016/j.joi.2012.07.001
The recent Research Trends and Elsevier Labs virtual seminar, The Individual and Scholarly Networks, is now available to view in archive.
Sarah Huggett | Publishing Information Manager, Elsevier
Research Trends and Elsevier Labs recently co-hosted their first virtual seminar: The Individual and Scholarly Networks. The event, held on 22nd January, attracted more than 500 attendees from all over the world and featured six compelling external speakers. We used a novel format aimed at maximising engagement: in addition to audio and slides, we showed videos of the speakers and a Twitter feed.
Materials from the event, including recordings of each session and discussion, presentations, and a Q&A transcript covering the questions we were unable to address live, are now all freely available on the Research Trends website, although unfortunately we were not able to resolve some technical issues affecting audio in the second part of the event. A summary of the event and highlights of the discussion are also available.
There were two components to the event. The first part focussed on building networks and the ways in which relationships are formed and maintained, as well as how they are changing the nature of scholarly relationships. In this session, Professor Jeremy Frey discussed how varying degrees of openness aid scientific collaboration, while Gregg Gordon presented an overview of the Social Science Research Network. Then, Dr William Gunn talked about building networks through information linking, using Mendeley as an example. The second part was about evaluating network relationships, exploring the related areas of alternative metrics, contributorship and the culture of reference. In this session, Dr Gudmundur Thorisson discussed digital scholarship and the recently launched ORCID initiative, while Kelli Barr questioned the purpose and objectivity of evaluations. Finally, Dr Heather Piwowar explored various impact flavours, in particular ImpactStory. Each session was followed by lively discussions amongst the presenters, spurred by questions and comments from our remote audience.
30 Oct 2012
For several decades now, a principal measure of an article's impact¹ on the scholarly world has been the number of citations it has received.
An increasing focus on using these citation counts as a proxy for scientific quality provided the catalyst for the development of journal metrics, including Garfield's invention of the Impact Factor in the 1950s². Journal-level metrics have continued to evolve and be refined; for example, relative newcomers SNIP and SJR³ are now used on Elsevier's Scopus.
In recent years, however, interest has grown in applications at author, institute and country level. These developments can be summarized as follows (see Figure 1):
The Journal Impact Factor (JIF) was born at a time when there was one delivery route for scholarly articles – paper publications – and computational power was expensive. The migration from paper to electronic delivery (particularly online) has enabled better understanding and analysis of citation count-based impact measurements, and created a new supply of user-activity measurements: page views and downloads.
Over the past few years, the growing importance of social networking – combined with a rising number of platforms making their activity data publicly available – has resulted in new ways of measuring scholarly communication activities, encapsulated by the term altmetrics⁵. Although we have added these new metrics to Figure 1, there is no suggestion that each generation necessarily replaces the earlier ones. In fact, the Relative Impact Measure is still used substantially, even though network analysis exists. The choice of metric is often influenced by the context and the question, and first-, second- or third-generation metrics may still prove the more suitable options.
Although the word altmetrics is still relatively new (not yet three years old), several maturing applications already rely on such data to give a sense of the wider impact of scholarly research. Plum Analytics is a recent commercial newcomer, whereas Digital Science's Altmetric.com is a better-established, partially commercial solution. A third mature product is ImpactStory (formerly total-impact.org), an always-free, always-open application.
Altmetrics applications acquire the broadest possible set of data about content consumption. This includes HTML page views and PDF downloads, social usage (e.g. tweets and Facebook comments), as well as more specialized researcher activities, such as bookmarking and reference sharing via tools like Mendeley, Zotero and CiteULike. A list of the data sources used by ImpactStory appears below. As well as counting activities surrounding the full article, there are also figure and data re-use totals. Altmetric.com also takes into account mass media links to scholarly articles.
To get a feel for how altmetrics work, you can visit www.impactstory.it or www.altmetric.com and enter a publication record. Alternatively, if you have access to Elsevier’s Scopus, you will find many articles already carry an Altmetric.com donut in the right hand bar (the donut may not be visible in older versions of Microsoft Internet Explorer). If there is no data yet available, an Altmetric.com box will not appear on the page. Elsevier also supplies data to ImpactStory, sending cited-by counts to the web-platform.
Although there is some evidence linking social network activity, such as tweets, with ultimate citation count (Priem & Piwowar et al, 2012⁶; Eysenbach, 2011⁷), this field is still in its early stages, and a considerable number of areas still require research. Further investigation aims to uncover patterns and relationships between usage data and ultimate citation, allowing users to discover papers of interest and influence they might previously have failed to notice. Planned areas of research include:
Altmetrics is still in its infancy, both as a field of study and a commercial activity. Currently only a handful of smaller organizations are involved and there is no engagement from major web players such as Google or Microsoft. On the publisher front, while all are active with altmetrics in some form, only Macmillan has chosen to get involved via Digital Science's Altmetric.com. That means there is a great deal to play for. We expect to see more emergent platforms and research, and it's not impossible to envisage the development of professional advisers who work with institutions to increase their altmetrics counts – especially now that impact is increasingly tied to funding decisions (e.g. Government funding in the UK via the Research Excellence Framework).
Elsevier is fully engaged with the altmetrics movement. For example, in 2013 the Elsevier Labs team aims to co-publish large scale research which will begin to explore the relationship between the different facets and to establish a framework for understanding the meaning of this activity. It aims to build on the current work to found an empirically-based discipline that analyses the relationship between social activity, other factors and both scholarly and lay consumption and usage. By working together to combine knowledge at Elsevier, we intend to show that no single measurement can provide the whole picture and that a panel of metrics informed by empirical research and expert opinion is typically the best way to analyze the performance of a journal, an author or an article.
TECHNOLOGY RESEARCH SPECIALIST
Mike has worked at Elsevier for 16 years. He has been a research specialist in the Labs group for the last four years, and has been involved with ORCID (and previous projects) throughout that time. Mike's other research interests include altmetrics, contributorship and author networks. Details of his research work can be found on http://labs.elsevier.com.
MANAGER STRATEGIC RESEARCH INSIGHTS & ANALYTICS
Judith focuses on demonstrating Elsevier’s bibliometric expertise and capabilities by connecting with the research community. She is heavily involved in analyzing, reporting and presenting commercial research performance evaluation projects for academic institutes, as well as governments. Judith has worked within several areas at Elsevier including bibliographic databases, journal publishing, strategy, sales and, most recently, within Research & Academic Relations. Judith has a PhD from Utrecht Institute of Linguistics and holds Masters Degrees in Corporate Communications and French Linguistics & Literature.
The 2011 Journal Citation Reports® (JCR), have just been published by Thomson Reuters. Learn how Elsevier’s journals fared.
Sarah Huggett | Publishing Information Manager, Elsevier
We are pleased to announce the highlights of our recent journal Impact Factor performance. According to the 2011 Journal Citation Reports® (JCR), published by Thomson Reuters, 58% of our journal Impact Factors increased from 2010 to 2011, compared to 54% for non-Elsevier journals.
Our journals top the rankings in 57 subject categories, and nearly 40% of our titles indexed in the JCR are in the top 10 of their subject category, with 188 journals ranked in the top 3. Five of our journals have even seen a decade of continuous Impact Factor increases: Journal of Ethnopharmacology; Journal of Materials Processing Technology; Carbon; Electrochemistry Communications; Renewable Energy.
In a press release announcing the good news, Martin Tanke, Managing Director Journals at Elsevier, said: “We value these results as they are an acknowledgement of the excellent performance by the authors, reviewers and editors we work with. In addition, we believe these outcomes are the result of our continuous focus on quality. We will continue to invest time and resources into quality enhancing initiatives such as increased support and enhancement of the peer review process to speed up review times, and further innovations on the publishing process to deliver faster publication for our authors.”
The Impact Factor helps to evaluate a journal’s impact compared to others in the same field by measuring the frequency with which recent articles in a journal have been cited in a particular year: the 2011 Impact Factor takes into account citations in 2011 to papers published in 2009 and 2010. It is important to note that the Impact Factor is just one perspective on a journal’s quality and influence, and that other metrics (such as SJR or SNIP) provide different perspectives and address some of the shortcomings of the Impact Factor.
The emphasis on the Impact Factor as a measure of journal evaluation can lead to potentially unethical behaviours aiming to inflate a journal’s Impact Factor, a topic discussed in our Impact Factor Ethics for Editors article in the June 2012 issue of Editors’ Update.
4 Jun 2012
How Impact Factor engineering can damage a journal’s reputation
We’ve all noticed that science has been accelerating at a very fast rate, resulting in what has been called ‘information overload’ and, more recently, ‘filter failure’. There are now more researchers and more papers than ever, which has heightened the importance of bibliometric measures. Bibliometrics is a fairly new discipline, but it has seen impressive growth in recent years thanks to advances in computation and data storage, which have improved the accessibility and ease of use of bibliometric measures (for instance through interfaces such as SciVerse Scopus or SciVal). Bibliometrics are increasingly used to systematically compare diverse entities (authors, research groups, institutions, cities, countries, disciplines, articles, journals, etc.) in a variety of contexts. These include an author deciding where to publish, a librarian working on changes in their library’s holdings, a policy maker planning funding budgets, a research manager putting together a research group, and a publisher or Editor benchmarking their journal against competitors.
In this perspective, journal metrics can play an important role for Editors, and we know it’s a topic of interest given the high attendance at our recent webinar on the subject. There are many different metrics available, and we always recommend looking at a variety of indicators to yield as thorough a bibliometric picture as possible, providing insights into the diverse strengths and weaknesses of any given journal¹. However, we are well aware that one metric in particular seems to be considered especially important by most Editors: the Impact Factor. Opinions on the Impact Factor are divided, but it has long been used as a prime measure in journal evaluation, and many Editors see it as part of their editorial duty to try to raise the Impact Factor of their journal².
There are various techniques through which this can be attempted, some more ethical than others, and it is an Editor’s responsibility to stay within the bounds of ethical behavior in this area. It might be tempting to try to improve one’s journal’s Impact Factor ranking at all costs, but Impact Factors are only as meaningful as the data that feed into them³: if an Impact Factor is exceedingly inflated as a result of a high proportion of gratuitous self-citations, it will not take long for the community to identify this (especially in an online age of easily accessible citation data). This realisation can damage the reputation of a journal and its Editors, and might lead to a loss of quality manuscript submissions, which in turn is likely to affect the journal’s future impact. The results of a recent survey⁴ draw attention to the frequency of one particularly unethical editorial activity in business journals: coercive citation requests (Editors demanding authors cite their journal as a condition of manuscript acceptance).
Elsevier’s philosophy on the Impact Factor
“Elsevier uses the Impact Factor (IF) as one of a number of performance indicators for journals. It acknowledges the many caveats associated with its use and strives to share best practice with its authors, editors, readers and other stakeholders in scholarly communication. Elsevier seeks clarity and openness in all communications relating to the IF and does not condone the practice of manipulation of the IF for its own sake.”
This issue has already received attention from the editorial community in the form of an editorial in the Journal of the American Society for Information Science and Technology5. Although some Elsevier journals were highlighted in that editorial, our analysis of 2010 citations to 2008-2009 scholarly papers (replicating the 2010 Impact Factor window using Scopus data) showed that half of all Elsevier journals have less than 10% journal self-citations, and 80% have less than 20%. We attribute this to the strong work ethic of the Editors who work with us, and it is reflected in our philosophy on the Impact Factor (see text box on the right) and our policy on journal self-citations (see text box below): Elsevier takes a firm position against any ‘Impact Factor engineering’ practices.
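The self-citation percentages quoted above follow from a straightforward ratio over the same citation window. A minimal sketch, with invented counts for illustration only:

```python
def self_citation_rate(self_citations, total_citations):
    """Share of a journal's incoming citations, within the Impact
    Factor window, that originate from the journal itself."""
    return self_citations / total_citations

# Hypothetical journal: of 500 citations received in 2010 to its
# 2008-2009 papers, 40 came from articles in the journal itself.
rate = self_citation_rate(40, 500)
print(f"{rate:.0%}")  # 8%
```

A journal with an 8% self-citation share would fall in the half of Elsevier journals with less than 10% self-citations mentioned above.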
As to what constitutes a ‘normal’ journal self-citation rate, there are probably as many answers as there are journals. Journal self-citation rates vary between scientific fields, and a highly specialised journal is likely to have a larger proportion of journal self-citations than a journal of broader scope. A new journal is also prone to a higher journal self-citation rate, as it needs time to grow in awareness amongst the relevant scholarly communities.
Elsevier’s policy on journal self-citations
“An editor should never conduct any practice that obliges authors to cite his or her journal either as an implied or explicit condition of acceptance for publication. Any recommendation regarding articles to be cited in a paper should be made on the basis of direct relevance to the author’s article, with the objective of improving the final published research. Editors should direct authors to relevant literature as part of the peer review process; however, this should never extend to blanket instructions to cite individual journals. […] Part of your role as Editor is to try to increase the quality and usefulness of the journal. Attracting high quality articles from areas that are topical is likely the best approach. Review articles tend to be more highly cited than original research, and letters to the Editor and editorials can be beneficial. However, practices that ‘engineer’ citation performance for its own sake, such as forced self-citation are neither acceptable nor supported by Elsevier.”
As mentioned in a Thomson Reuters report on the subject: “A relatively high self-citation rate can be due to several factors. It may arise from a journal’s having a novel or highly specific topic for which it provides a unique publication venue. A high self-citation rate may also result from the journal having few incoming citations from other sources. Journal self-citation might also be affected by sociological factors in the practice of citation. Researchers will cite journals of which they are most aware; this is roughly the same population of journals to which they will consider sending their own papers for review and publication. It is also possible that self-citation derives from an editorial practice of the journal, resulting in a distorted view of the journal’s participation in the literature.”6
There are various ethical ways an Editor can try to improve the Impact Factor of their journal. Through your publishing contact, Elsevier can provide insights into the relative bibliometric performance of keywords, journal issues, article types, authors, institutes, countries, etc., all of which can inform editorial strategy. Journals may have the option to publish official society communications, guidelines, taxonomies, methodologies, special issues on topical subjects, invited content from leading figures in the field, interesting debates on currently relevant themes, etc., all of which can help to increase the Impact Factor and other citation metrics. A high quality journal targeted at the right audience should enjoy a respectable Impact Factor in its field, which should be a sign of its value rather than an end in itself. Editors often ask me how they can raise their journal’s Impact Factor, but the truth is that as they work to improve the quality and relevance of their journal, they are likely to reap rewards in many areas, including a rising Impact Factor. And this is the way it should be: a higher Impact Factor should reflect a genuine improvement in a journal, not a meaningless game that reduces the usefulness of available bibliometric measures.
1 Amin, M & Mabe, M (2000), “Impact Factors: use and abuse”, Perspectives in Publishing, number 1
2 Krell, FK (2010), “Should editors influence journal impact factors?”, Learned Publishing, Volume 23, issue 1, pages 59-62, DOI: 10.1087/20100110
3 Reedijk, J & Moed, HF (2008), “Is the impact of journal impact factors decreasing?”, Journal of Documentation, Volume 64, issue 2, pages 183-192, DOI: 10.1108/00220410810858001
4 Wilhite, AW & Fong, EA (2012), “Coercive Citation in Academic Publishing”, Science, Volume 335, issue 6068, pages 542-543, DOI: 10.1126/science.1212540
5 Cronin, B (2012), “Do me a favor”, Journal of the American Society for Information Science and Technology, early view, DOI: 10.1002/asi.22716
6 McVeigh, M (2002), “Journal Self-Citation in the Journal Citation Reports – Science Edition”, Thomson Reuters
Publishing Information Manager, Research & Academic Relations
As part of the Scientometrics & Market Analysis team, Sarah provides strategic and tactical insights to colleagues and publishing partners, and strives to inform the bibliometrics debate through various internal and external discussions. Her specific interests are in communication and the use of alternative metrics, such as SNIP and usage, for journal evaluation. After completing an M.Phil in English Literature at the University of Grenoble (France), including one year at the University of Reading (UK) through the Erasmus programme, Sarah moved to the UK to teach French at Oxford University before joining Elsevier in 2006.
23 Mar 2012
Does open access publishing increase citation rates?
Studies conducted in this area have not yet adequately controlled for various kinds of sampling bias.
Henk F Moed | Senior Scientific Advisor, Elsevier
The debate about the effects of open access on the visibility or impact of scientific publications started with the publication by Steve Lawrence (2001) in the journal Nature, entitled ‘Free online availability substantially increases a paper's impact’, which analyzed conference proceedings in the field of computer science. Here, open access is not used to indicate the publisher business model based on the ‘authors pay’ principle but, more generally, in the sense of being freely available via the Web. From a methodological point of view, the debate focuses on biases, control groups, sampling, and the degree to which conclusions from case studies can be generalized. This note does not give a complete overview of studies published during the past decade, but highlights key events.
In 2004, Stevan Harnad and Tim Brody (2004) claimed that physics articles submitted as pre-prints to ArXiv (a preprint server covering mainly physics, hosted by Cornell University), and later published in peer reviewed journals, generated a citation impact up to 400% higher than papers in the same journals that had not been posted in ArXiv. Michael Kurtz and his colleagues (Kurtz et al., 2005) found in a study on astronomy evidence of a selection bias – authors post their best articles freely on the Web – and an early view effect – articles deposited as preprints are published earlier and are therefore cited more often. Henk Moed (2007) found for articles in solid state physics that these two effects may explain a large part, if not all, of the differences in citation impact between journal articles posted as pre-prints in ArXiv and papers that were not.
In a randomized controlled trial of open versus subscription-based access to articles in psychological journals from a single publisher, Phil Davis and his colleagues (Davis et al., 2008) did not find a significant effect of open access on citations. To correct for selection bias, a new study by Harnad and his team (Gargouri et al., 2010) compared self-selective self-archiving with mandatory self-archiving at four particular research institutions. They argued that, although the first type may be subject to a quality bias, the second can be assumed to occur regardless of the quality of the papers. They found that the OA advantage was just as high for both, and concluded that it is real, independent and causal. It is greater for more citable articles than for less significant ones, resulting from users self-selecting what to use and cite. But they also found that the percentage of the four institutions’ publication output actually self-archived was, at most, 60%, and that for some it did not increase when their OA regime was transformed from non-mandatory to mandatory. Therefore, what the authors labeled ‘mandated OA’ is in reality, to a large extent, subject to the same type of self-selection bias as non-mandated OA.
On the other hand, it should be noted that all the citation-based studies mentioned above seem to share the following bias: they were based on citation analysis carried out in a citation index with selective coverage of the good, international journals in their fields. Analyzing citation impact in such a database is, in a sense, a bit like measuring the extent to which people are willing to leave their car unused during the weekend by interviewing mainly people on a Saturday in the car park of a large warehouse outside town. Those who publish in the selected set of good, international journals – a necessary condition for citations to be recorded in the OA advantage studies mentioned above – will tend to have access to these journals anyway. In other words: there may be a positive effect of OA upon citation impact, but it is not visible in the database used. The use of a citation index with more comprehensive coverage would enable one to examine the effect of the citation impact of covered journals upon the OA citation advantage; for instance: is such an advantage more visible in lower impact or more nationally oriented journals than it is in international top journals?
Analyzing article downloads (usage) is a complementary and, in principle, valuable method for studying the effects of OA. In fact, the study by Phil Davis and colleagues mentioned above did apply this method and reported that OA articles were downloaded more often than papers with subscription-based access. However, significant limitations of this method are that not all publication archives provide reliable download statistics, and that different publication archives that do generate such statistics may record and/or count downloads differently, so that results are not directly comparable across archives. The implication seems to be that usage studies comparing OA with non-OA articles can be applied only in ‘hybrid’ environments, in which publishers offer authors who submit a manuscript both an ‘authors pay’ and a ‘readers pay’ option. But this type of OA may not be representative of OA in general, as it disregards self-archiving in OA repositories that are being created in research institutions all over the world.
An extended version of this paper will be published soon in the Elsevier publication Research Trends.
Davis, P.M., Lewenstein, B.V., Simon, D.H., Booth, J.G., Connolly, M.J.L. (2008). Open access publishing, article downloads, and citations: Randomised controlled trial. BMJ, 337 (7665), 343-345.
Gargouri, Y., Hajjem, C., Larivière, V., Gingras, Y., Carr, L., Brody, T., Harnad, S. (2010). Self-selected or mandated, open access increases citation impact for higher quality research. PLoS ONE, 5 (10), art. no. e13636.
Harnad, S., Brody, T. (2004). Comparing the impact of open access (OA) vs. non-OA articles in the same journals. D-Lib Magazine, 10(6).
Kurtz, M.J., Eichhorn, G., Accomazzi, A., Grant, C., Demleitner, M., Henneken, E., Murray, S.S. (2005). The effect of use and access on citations. Information Processing & Management, 41, 1395–1402.
Lawrence, S. (2001). Free online availability substantially increases a paper's impact. Nature, 411 (6837), p. 521.
Moed, H.F. (2007). The effect of “Open Access” upon citation impact: An analysis of ArXiv’s Condensed Matter Section. Journal of the American Society for Information Science and Technology, 58, 2047-2054.
Love it or loathe it, the journal Impact Factor remains a widely used benchmark for authors deciding which journal to submit to. Whilst we recognize the Impact Factor at Elsevier, we also nurture the idea of using other indicators of journal performance.
Of interest to: Journal editors (key), additionally authors and reviewers
Archive views to date: 310+
Average feedback: 4.4 out of 5