


Finding reviewers in EES just got easier…

Improvements to the Find Reviewers tool in EES have simplified the process of searching for potential referees. Find out more…



Egbert van Wezenbeek | Director Publication Process Development, Elsevier

The Find Reviewers application in the Elsevier Editorial System (EES) was built to help you locate appropriate reviewers.

Since the tool launched in 2010, editor feedback has been positive, but we know that many of you have been keen to see it better integrated with EES.

We are pleased to report that a recent update to EES has made this possible, resulting in the new workflow outlined below.

[Figure: the new Find Reviewers workflow in EES]

In addition to the improved workflow outlined above, visibility of the Find Reviewers tool on the 'Search for Reviewers' page has been improved by adding a logo and hyperlinking the entire phrase that follows.

[Figure: the Find Reviewers link on the 'Search for Reviewers' page]

More detailed information on how to use the Find Reviewers tool can be found on our support pages and your publisher can also help with any queries you might have.



An author’s experience of peer review

Researcher Mounir Adjrad reflects on why constructive reviews are so important to the peer-review process.



Mounir Adjrad is currently a researcher in the Engineering and Design department at London South Bank University (LSBU). His current research interests are the exploitation of Ultra Wideband (UWB) technology for biomedical engineering and communication applications. He has multidisciplinary research experience in industry and academia, working on topics such as Global Navigation Satellite Systems (GNSS), satellite engineering, radar and transport engineering applications.


Reviewers need to remember that they are on a mission: evaluating others’ work. Simply stated, to evaluate is to examine the worth of the authors’ efforts. Besides indicating whether the work is (or is not) acceptable for publication, the review needs to state whether the author is encouraged to continue his/her effort, list areas for improvement and explain how those improvements could be implemented. The review report needs to be positive, in a broad sense, and the end message to the author should be as far as possible from a savage one.

On that latter point, a couple of years ago while working in industry, I submitted an article to a specialized technology magazine. The article presented the validation results of a product we had developed, with quantified efficiency figures. The reviews referred to the presented figures as the result of “black magic” and called the product “snake oil”. It was obvious that the reviewers did not believe the reported figures, but the review fell short of explaining the rational reasons behind this sceptical attitude. The only explanation the reviewers could offer was that they had dealt with similar claims from other companies in the past, and when those results were scrutinised it had turned out that the figures were falsified.

Thankfully, the review process was an open one, which allowed me to identify who the reviewers were; perhaps this was the only useful information I could take away from those reviews. It turned out that these reviewers were involved in consultancy work for companies that were already marketing a similar product (function-wise). The article was published following a reply to the editor highlighting the issue with the choice of reviewers and a request for a second review of the article by an impartial reviewing panel.

The issue of hostility in reviews was addressed by Robert J. Sternberg in his 2002 Observer column On Civility in Reviewing. He rightly pointed out that hostile and savage reviews violate the fundamental ethical Golden Rule: act toward others as we would have them act toward us. This brings me to conclude that hostility expressed in any review, regardless of the field of research, is totally unacceptable.

But how do we prevent the occurrence of negative reviews? I believe this is a joint effort between editors and reviewers: the editor needs to address the question of whether the reviewers have subject matter expertise and experience qualifying them to thoughtfully evaluate the work; whereas the reviewers need to honestly assess whether they can provide a fair and unbiased review of the work based solely on its merits. Specifically, for the reviewers, I believe a key element in avoiding the hostility trap is to assess, at an early stage of the review process, whether they will be able to evaluate the work with an open mind, and they should decline the review if they feel negatively predisposed to the submitted work.

In conclusion, across academic generations and disciplines, there is a cycle of negative reviews that needs to be broken to make peer review a healthy process.

P.S. The “snake oil” product and the company behind it are both enjoying great success.

 


How small changes can influence reviewer behavior

Discover the lessons learnt during a reviewer experiment on the Journal of Public Economics.



Raj Chetty, PhD, is the Bloomberg Professor of Economics at Harvard University Department of Economics and Editor-in-Chief of Journal of Public Economics.

Together with two colleagues - Emmanuel Saez, E. Morris Cox Professor of Economics and Director of the Center for Equitable Growth at the University of California Berkeley, and László Sándor, a PhD candidate in Economics at Harvard University - Chetty ran an experiment evaluating the effects of cash incentives, social incentives, and nudges on the behavior of referees at the Journal of Public Economics. The interesting, and sometimes surprising, results of their study will be published this summer in the Journal of Economic Perspectives. Here Chetty talks about the rationale behind the trial and lessons learnt.

SeparatorFINAL

Since I took on the role of Editor-in-Chief, I have been interested in how we can better serve authors and improve the review times on our journal. And, as an author myself, I would love to see my papers reviewed more quickly.

To design the trial, we began by reading the literature and thinking about what might prove effective in motivating reviewers to submit high-quality reports more quickly. We formulated three interventions.

First, naturally, as economists, we thought that paying people might prove an incentive. But psychologists suggest that payment can crowd out “intrinsic motivation” and actually lead to worse performance. So these contradictory hypotheses seemed very natural to test.

Second, the psychology literature suggests that simple nudges and reminders can affect people’s behavior, so we decided to try changing the deadline by which reports were due.

Third, sociologists have suggested that social incentives – namely, how people are perceived by their peers – may be a key determinant of behavior.

To test these hypotheses, we randomly assigned referees to four groups:

  • Group one was the control and participants had a six-week deadline to submit their reports.
  • Group two was given only four weeks to provide their reports.
  • Group three also had only four weeks and if they met that deadline they received $100.
  • Group four was offered a social incentive, i.e. we informed them that their turnaround times would be publicly posted.

In total, the experiment included 1,500 referees who submitted nearly 2,500 reports from February 2010 to October 2011.
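As a rough sketch of how such a four-arm random assignment can be reproduced (arm names and referee IDs below are invented for illustration; this is not the authors' actual code):

```python
import random

# Illustrative sketch of the four-arm randomization described above.
ARMS = [
    "control_6_week_deadline",      # six-week deadline
    "short_4_week_deadline",        # four-week deadline
    "cash_4_week_deadline_100usd",  # four-week deadline + $100 if met
    "social_times_posted",          # turnaround times publicly posted
]

def assign_referees(referee_ids, seed=2010):
    rng = random.Random(seed)  # fixed seed makes the assignment reproducible
    shuffled = list(referee_ids)
    rng.shuffle(shuffled)
    # Deal referees round-robin into the four arms for balanced group sizes.
    return {arm: shuffled[i::len(ARMS)] for i, arm in enumerate(ARMS)}

groups = assign_referees([f"ref{n:04d}" for n in range(1500)])
print({arm: len(ids) for arm, ids in groups.items()})  # four groups of 375
```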

I should mention that all the interventions we tested have been used by other journals, but until now there has never really been a clear examination of which factors work best.  

Key takeaways

First, a change in timeframe is very effective; if you shorten the deadline by two weeks, you receive reviews two weeks earlier on average. In fact, we noticed that whatever timeframe you give, most people submit their review just prior to the deadline. Editors might worry that if you ask reviewers to review more quickly, they will submit lower-quality reviews. However, we found no significant changes in the quality of referee reports, as judged, for instance, by the editor’s propensity to follow the advice in the report.

Second, if a journal has the money available, cash incentives also work very well. The $100 payment reduced review times by about 10 days on average.  Hence, it is clear that the “crowd-out of intrinsic motivation” that psychologists have been concerned about is actually not a serious concern in this context.

Third, the social incentive was less effective but still surprisingly successful in reducing review times, particularly with tenured professors, who were less sensitive to cash and deadlines.  This confirms that people care about how they are perceived and suggests that gentle, personalized reminders from editors could be very effective in improving referee performance.

Overall, my biggest takeaway was that, as editors, we shouldn’t believe that the performance of our journals is something we can’t change.  We can greatly improve the quality of our journals’ review process through simple policy changes and active editorial management. 

Personally, I was surprised by how effective the shorter deadline was. There was no consequence for reviewers who didn’t meet it, yet they were still very receptive. The advantage for journals is that this approach is cost-free. I would probably be less responsive to the cash incentive, so I was also quite surprised by how successful that proved to be. However, if I have to do something anyway and by doing it today I get $100 then perhaps it’s not so surprising it has some effect.

Going forward, we would love to see other journals adopting some of these policies.  And for reviewers, I would suggest that often what is useful to editors, especially if you are going to recommend rejection, is a short, clear - and on time - report, rather than something that is more detailed which takes longer to draft. By focusing on the big picture, you not only save yourself time but better serve editors and the author community too.

Author biography

Chetty's research combines empirical evidence and economic theory to help design more effective government policies. His work on tax policy, unemployment insurance, and education has been widely cited in media outlets and Congressional testimony.

Chetty was recently awarded a MacArthur "Genius" Fellowship and the John Bates Clark medal, given by the American Economic Association to the best American economist under age 40. He received his PhD from Harvard in 2003 at the age of 23 and is one of the youngest tenured professors in the university's history.


Establishing new revision times for Elsevier journals

New reviewer and revision deadlines have been established for a number of Elsevier journals. Find out why.



Arnout Jacobs | Business Development Director, Elsevier

Sometimes, even simple things can make a big difference. An Elsevier project designed to reassess reviewer and revision times is large-scale, involving 1,100 journals. Yet we are focused on one small aspect: optimizing the deadlines we give authors and reviewers.  The idea was born at Cell Press, where we decided to look at what would happen if a journal set its review deadline a few days earlier. Results were encouraging: reviews did indeed come in earlier, and there was no difference in reviewer response rates. 

Recently, via a controlled experiment on the Journal of Public Economics, we also received confirmation from the scientific community that shorter review deadlines can work. You can find out more in the Short Communication How small changes can influence reviewer behavior.

Building on the lessons learnt at Cell Press, we made an inventory of deadlines across all of our titles. Some journals did not mention any. We also found titles where contributions were routinely received well before the stated deadline. And then there were journals that were still using timeframes from the days when manuscripts were physically sent around the world and back, with deadlines extending up to a year! For journals publishing on arctic geology in the 1980s, this may well have been understandable, but with today’s instant communication, a new policy was due.

While speed has always been important, it matters even more in today’s publishing environment. In the past, even if an article was ready, it might still have had to wait for backlogs to clear and issues to be completed. Today, we aim to publish articles online as quickly as possible after they are accepted. So a day saved in peer review means a day earlier online!

So, how did we set our new deadlines? Our first principle was not to disrupt existing practices. Some fields are slower than others, and usually there is a good reason for this. Other journals are already very fast, and there is little gain in asking contributors to submit within 4 days instead of the existing 5. So we looked at actual reviewing and author revision times, and at stated deadlines, and we used these as a starting point. The biggest gains were to be found in author revision times, where articles can sometimes linger for months. We then came up with proposed new deadlines, and consulted with you as editors. As a result, new reviewer and revision deadlines were implemented at the beginning of the year for around 600 titles.

It’s still early days, so we do not yet know what the results will be. We are keeping a close eye on measurable items, such as response rates, compliance, and submission-to-acceptance times. But qualitative feedback is equally important. Ultimately, we hope that this initiative will speed up the publication process, while keeping all participants satisfied.


How we can better support and recognize reviewers

We know that finding, retaining and rewarding reviewers are long-term pain points for editors. Find out more about the peer-review pilots now underway…



We know that finding, retaining and rewarding reviewers are long-term pain points for editors. Scientists are increasingly busy and often find it difficult to free up time to do reviews. At the same time, new approaches to peer review are being developed, for example, working in a more open and collaborative manner or making use of the latest technology. That makes these times challenging, as well as exciting, and this is reflected in the enthusiasm and energy with which new experiments are being launched within our organization.

My team is behind a number of these peer-review pilots and our decision to carry them out in an experimental setting, i.e. test the concepts with a limited number of journals, is deliberate. It means we can learn quickly and be flexible. If a pilot proves unsuccessful, we can swiftly shift our attention to other areas. However, if the results are encouraging, we can upscale and roll it out to more journal titles. Below I outline a few of the pilots currently taking place.

New platform will provide reviewer rewards

This experiment addresses reviewers’ need to be better recognized for their work. Reviewers indicate that they like to review manuscripts; they feel it is an important service to their communities and it keeps them abreast of the latest developments. At the same time, we know they often feel that they are not fully recognized for their work.

Simon Gosling

With this in mind, Elsevier set up a Peer Review Challenge in 2012. We asked entrants to submit an original idea that would significantly improve or add to the current peer-review process. The winner was Simon Gosling, a Lecturer in Climate Change and Hydrology at The University of Nottingham. He proposed the creation of a ‘reviewer badges and rewards scheme’ as an incentive for reviewers. Elsevier has since been working with him to implement his vision and, in early February, we began piloting a digital badge system with a selection of journals in our Energy portfolio. Via Mozilla OpenBadges, reviewers are issued with badges that they can display on their Twitter, Facebook and Google+ pages.
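For the technically curious, an Open Badge is essentially a small, verifiable JSON “assertion” hosted by the issuer. The sketch below follows the field names of the public Open Badges 1.0 specification, but every URL and identifier is an invented placeholder, not Elsevier’s actual implementation:

```python
import hashlib
import json

# Minimal sketch of an Open Badges 1.0 assertion (hosted verification).
# All URLs and identifiers are hypothetical placeholders.
def make_badge_assertion(reviewer_email: str, salt: str = "pepper") -> dict:
    # The spec recommends hashing the recipient's email for privacy.
    digest = hashlib.sha256((reviewer_email + salt).encode()).hexdigest()
    return {
        "uid": "reviewer-badge-0001",
        "recipient": {"type": "email", "hashed": True, "salt": salt,
                      "identity": "sha256$" + digest},
        "badge": "https://example.org/badges/energy-reviewer.json",  # BadgeClass URL
        "verify": {"type": "hosted",
                   "url": "https://example.org/assertions/0001.json"},
        "issuedOn": "2014-02-01",
    }

print(json.dumps(make_badge_assertion("reviewer@example.org"), indent=2))
```

Because the assertion lives at a stable public URL, any site displaying the badge (Twitter, Facebook, Google+, a personal page) can link back to it for verification.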

A second phase of the pilot is due to be launched this month: a ‘reviewer recognition’ platform for approximately 40 journals. Upon completion of a review for one of these titles, reviewers are provided with a link to a personal page on the platform that displays their reviewer activity. Based on their contributions to the journal, they are awarded statuses – for example, ‘recognized reviewer’ for those completing one review within two years, and ‘outstanding reviewer’ for those who have completed the most reviews. They can also download certificates and discount vouchers based on their achievements. We hope the platform will make the important work of reviewers more visible and encourage them to engage with Elsevier journals. Following the pilot, our aim is to make the platform available to all Elsevier titles.

We are continuously looking at how we can increase the visibility of the contribution made by reviewers; in another pilot, the journal Agricultural and Forest Meteorology has been making its review reports accessible on ScienceDirect. We now want to extend the experiment to more journals and see if we can provide the reports with DOIs (Digital Object Identifiers). In this way, the reports will be better acknowledged as an essential part of the scientific literature.

ScienceDirect visitors to articles from the journal Agricultural and Forest Meteorology can download the reviewer reports in the right-hand bar.

 

Article Transfer Service for soil science journals

As an editor, you may frequently be confronted with manuscripts that are out of scope or simply not suitable for your journal, yet still contain sound research. For some time now, we have been offering the complimentary Article Transfer Service (ATS), which is currently active for more than 300 of our journals. ATS allows editors to recommend that authors transfer their submitted papers – and any accompanying reviews – to another Elsevier journal in the field, without the need to reformat them.

A new experiment with six Elsevier soil science journals aims to improve on this service. If participating editors decide not to accept a paper, they can now choose between two options in the Elsevier Editorial System (EES):

  • They can choose a new option, ‘decline’, which means the paper is not suitable for their title. If this option is chosen, the author will always have the option to transfer the article, with the review reports, to another journal.
  • They can decide to ‘reject’ the paper. If they choose this option, the author will not be invited to submit to any other journal in the pilot.

Gilles Jonker, Executive Publisher for soil science, explained: “The editors of these journals were confronted with a strong growth in submitted articles and found it increasingly difficult to find reviewers. To help address these issues, an agreement was reached to harmonize the editorial policies of the six journals, honor another editor’s decision to reject a paper, as well as give authors more autonomy in finding an alternative journal.”

Early pilot results show a good uptake by editors of the ‘decline’ decision option. Authors are also embracing the concept and are accepting transfers to journals within the cluster that better fit the scope of their articles. “Later this year we should be able to see whether this pilot study has indeed addressed reviewer fatigue and improved the quality of submitted articles,” said Jonker.

Experimenting with the peer-review process via Mendeley

Last, but not least, Elsevier is exploring ways in which Mendeley can be used to improve the peer-review process. Mendeley, a London-based company that operates a global research management and collaboration platform, was acquired by Elsevier in April 2013. Researchers worldwide use Mendeley’s desktop and cloud-based tools to manage and annotate documents, create citations and bibliographies, collaborate on research projects and network with fellow academics. These advanced collaborative features could benefit the peer-review process. Manuscripts can be annotated online, and these annotations can be shared in private groups. Moreover, editors and reviewers can discuss manuscripts in discussion forums. We are curious to see whether peer review within this environment will streamline the peer-review process, increase its efficiency and, in the end, lead to a better manuscript review. As part of this experiment, papers will be brought within the Mendeley environment - naturally only with the consent of the reviewers and editors. This pilot began with a few titles earlier this year. If it proves successful we will look to make it more widely available.

If you have any comments or suggestions for new peer-review pilots, I would really like to hear from you. You can contact me at j.p.rossum@elsevier.com.

Author biography

Dr. Joris van Rossum
DIRECTOR PUBLISHING INNOVATION
For the past 12 years, van Rossum has been involved in the launch and development of products and initiatives within Elsevier. From its inception he worked as a Product Manager on Scopus, the largest abstract and citation database of peer-reviewed literature, and he worked on Elsevier’s search engine for scientific information as Head of Scirus. Later, he developed the Elsevier WebShop, which offers support and services for authors at many stages of the publication workflow. In his current role, van Rossum is focused on testing and introducing important innovations with a focus on peer review. He holds a Master of Science in biology from the University of Amsterdam and a PhD in philosophy from VU University Amsterdam.


New ‘Streamline’ peer-review process piloted by Virology

A new peer-review system piloted by the journal Virology avoids the need to ‘start over’ with new reviews if a paper is rejected from a high-impact journal.



Dr Michael Emerman | Editor-in-Chief of the Elsevier journal Virology

Established in 1954, Virology is one of the oldest journals in its field and publishes the results of basic research in all branches of virology. Dr Michael Emerman, who researches HIV replication at the Fred Hutchinson Cancer Research Center in Seattle, is a long-serving editor of Virology and took on the role of Editor-in-Chief in January. Since then, he has instigated some big changes, dramatically increasing submissions. Recent changes include the launch of a blog, Virology Highlights; bringing on board new editors; and the introduction of Streamline Reviews. Here he explains why he has high hopes that the latter will contribute to the journal’s success.

In January, Virology introduced a new program — Streamline Reviews — with the aim of capturing and publishing manuscripts that have been rejected by journals with high Impact Factors. The idea came from one of our editors who described the frustration of resubmitting a rejected manuscript from one high-impact journal to another because of the need to respond to a completely new set of reviewers.

The way Streamline Reviews works is simple. If an author has a manuscript that has been reviewed and rejected by a journal with an Impact Factor higher than 8 that publishes papers on the basic science of viruses (such as Cell Host & Microbe, Nature, PLOS Pathogens, Proceedings of the National Academy of Sciences and Science), they can send us the original reviews, their rebuttal and a revised manuscript, including these extra items as part of their cover letter.

We will then consider the manuscript based on the reviews and usually send the manuscript, reviews and response to one additional expert for an opinion. In theory, this should speed up the review process for these manuscripts — authors do not need to start over at the beginning, and it is easier for someone to give an opinion on the paper with reviews already to hand. This option works best for those manuscripts rejected for perceived reasons of impact, novelty or significance.

The program is still in its infancy. We have received a handful of Streamline Review submissions, but we believe more papers will be submitted this way once the initiative becomes better known. What has been interesting is the very positive feedback we have received from editorial board members and community members, many of whom have experienced the long process of resubmitting a very good manuscript that has just missed the mark at a high-impact journal. In fact, they wonder why Streamline Reviews is not already standard practice amongst journals.

As I mentioned, we have set our bar at an Impact Factor of higher than 8. We decided on that figure after identifying which of our competitor journals featured the kinds of papers we are interested in.

In practice, it has worked well so far. In one case, the additional expert reviewer had also reviewed the paper for the high-impact journal and recommended it be accepted right away since the authors had addressed all previous concerns. In other cases, we have asked for additional changes, but these mostly related to the way the paper had been written and didn’t require the author to carry out additional experiments.

While there was initially some concern that we would not know the identities of the reviewers for the high-impact journal, this has not proved a problem when it comes to evaluating the manuscripts.

Despite complaints, I think the peer-review system serves a wonderful purpose. The role of the editor is to weed out the poor reviews and to use the peer-review system to turn out better papers, and I have seen many papers over the years become vastly improved by reviews. I think that the Streamline Review process is a means to help good papers get published in a faster and more efficient manner without sacrificing any of the benefits of stringent peer review.

This article first appeared in Elsevier Connect.


Recognizing your top reviewers

We know that finding and retaining good reviewers is one of the greatest challenges our editors face. Find out how we are recognizing the contributions of ‘top’ reviewers…



We know that finding and retaining good reviewers is one of the greatest challenges our editors face.

In March this year, we collaborated with editors on 32 journals to find a simple way to recognize the contributions of ‘top’ reviewers - those who have really gone that extra mile for a journal. The result was the Certificate of Excellence in Reviewing, featured in Figure 1 below.

How does it work?

Editors from each of the participating journals nominated their 25 best reviewers. Those reviewers then received a personalized HTML email containing a link to a high-resolution PDF file of their certificate, suitable for printing. Each certificate was created using a unique PDF generation tool developed by Elsevier WebShop for the Top25 Hottest Articles and Certificate of Publication.
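The Elsevier WebShop tool itself is proprietary, but as a rough illustration of what personalized certificate generation involves, here is a minimal sketch using the open-source reportlab library; the names, wording and file names are placeholders, not the actual tool:

```python
from reportlab.lib.pagesizes import A4, landscape
from reportlab.pdfgen import canvas

# Minimal illustration of generating a personalized certificate PDF.
# This is not Elsevier's tool; all text and names are placeholders.
def make_certificate(reviewer: str, journal: str, outfile: str) -> None:
    width, height = landscape(A4)
    c = canvas.Canvas(outfile, pagesize=landscape(A4))
    c.setFont("Helvetica-Bold", 28)
    c.drawCentredString(width / 2, height * 0.60,
                        "Certificate of Excellence in Reviewing")
    c.setFont("Helvetica", 16)
    c.drawCentredString(width / 2, height * 0.48, f"awarded to {reviewer}")
    c.drawCentredString(width / 2, height * 0.42,
                        f"in recognition of outstanding reviews for {journal}")
    c.save()  # writes a print-ready, high-resolution vector PDF

make_certificate("Dr A. Reviewer", "Journal of Example Studies", "certificate.pdf")
```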

The response from reviewers was immediate and encouraging. One of the participating editors, Dr Sandra Shumway, co-Editor-in-Chief of the Journal of Experimental Marine Biology and Ecology (JEMBE), admitted: “I didn’t expect it to do anything. But I had a couple of people write back and say thank you.”

 

Figure 1. Example of a Certificate of Excellence in Reviewing, presented to a journal’s ‘top’ reviewers.


Comments from the reviewers who contacted Shumway included, “Thank you… a really nice surprise!” and “Just returned from a trip and was cleaning up email when I came across the Excellence in Reviewing Certificate. Nice to be appreciated. Many thanks!” Shumway added: “People need to publish and people need to review. My list included those who I knew had really come through for me; those who responded when I needed them for reviews. I believe the newer and younger reviewers will be attracted by this recognition.”


The journal Midwifery also participated in the project, and Editor-in-Chief Professor Debra Bick is keen to continue using the certificate. She said: “This initiative worked very well with our reviewers, many of whom contacted me or Sarah (Sarah Davies, her Elsevier Publisher) to say how pleased they were to be identified in this way. We would certainly do it again, as the journal feedback is also an excellent way for our reviewers to reflect in their CVs how they are contributing to research and scholarship.”

Across the board, the response to this initiative was positive, in both quantitative and qualitative terms. The data show that more than 65% of the email recipients went on to download their certificate.

Next steps

Based on these early results, we plan to turn the study into an annual initiative available to every journal, beginning in 2014. The input required from editors will be minimal. As the time approaches for the certificates to be distributed, we will approach you to ask for your list of ‘exceptional’ reviewers – those who have really excelled that year. The rest of the process will be administered by your journal’s Marketing Manager. While we recommend that you choose 25 reviewers, that number will remain flexible. Shumway said: “I found it hard to create a shortlist of 25 reviewers, but this number seems suitable. When it’s ready to go again, I’ll be ready with my list.”


Philippe Terheggen, Executive Vice President of Science, Technology and Medicine Journals, has been a strong supporter of the project within Elsevier. He explained: “I’m delighted that this new initiative has landed so well with both our reviewers and the editors who participated. It shows how small steps toward formal recognition can be a valuable tool for retaining these sought-after people. Perhaps there are some framed Certificates of Excellence in Reviewing hanging on a few walls around the world right now.”

No reviewer left behind

Elsevier understands that even one review is a great contribution. To this end, there is also an annual ‘Thank you Reviewers!’ initiative. At the beginning of each year, a special announcement is placed on the homepages of all participating journals, together with full-page print adverts in the journals, thanking reviewers for their valued contributions. In addition, the initiative links to the reviewer benefits page on Elsevier.com, which reminds reviewers that we provide free Scopus and ScienceDirect access and outlines the other benefits Elsevier offers.

These programs are just two of the initiatives we are exploring to help editors find or retain reviewers. Find out more about some of these in Exploring Improvements to the Peer-Review System featured in issue 36 of Editors’ Update.

Hot off the press!

As of this week, Elsevier reviewers can feature the journal for which they review in their email signature or on their personal webpage using a new badge we have created. The journal-specific badge can be claimed within seconds via an online tool at http://www.elsevier.com/reviewerbadge.

Figure 2. Example of a reviewer badge

What do you think about these initiatives? Do you have suggestions for your colleagues on how to attract and retain reviewers? Please take a few moments to post a comment below.

Author biography

Ursula van Dijk
HEAD OF MARKETING COMMUNICATIONS
Ursula has more than 20 years of experience in Science, Technology and Medicine journal marketing.  She is based in Amsterdam and leads a team of marketers within the Physical, Formal and Applied Sciences area with a focus on supporting publishing initiatives by communicating and interacting with our editors, authors and reviewers.



Journal Cortex launches Registered Reports

Editors of the journal Cortex are experimenting with an innovative approach that splits the peer-review process into two stages. Find out more…



Dr Chris Chambers and Professor Sergio Della Sala | Associate Editor and Editor-in-Chief of the Elsevier journal Cortex

On May 1st, Cortex launched a new innovation in scientific publishing called the Registered Report. Unlike conventional publishing models, Registered Reports split the review process into two stages. Initially, experimental methods and proposed analyses are pre-registered and reviewed before data are collected. Then, if peer reviews are favourable, we offer authors “in-principle acceptance” of their paper. This guarantees publication of their future results provided that they adhere precisely to their registered protocol. Once their experiment is complete, authors resubmit their full manuscript for final consideration.

Cortex is an international journal devoted to the study of cognition and of the relationship between the nervous system and mental processes.

Why should we want to review papers before data collection? The reason is simple: because the editorial process is too easily biased by the appearance of data. Rather than valuing innovative hypotheses or careful procedures, too often we find ourselves applauding impressive results or being bored by non-significant effects. For most journals, issues such as statistical power and technical rigor are outshone by novelty and originality of findings.

Dr Chris Chambers

By venerating findings that are eye-catching, we incentivize the outcome of science over the process itself, forcing aside other vital issues. One of these sacrificial lambs is statistical power – the likelihood of detecting a genuine effect in a sample of data. Several studies in neuroscience suffer from insufficient statistical power, so – driven by the need to publish – scientists inevitably mine their underpowered datasets for statistically significant results. Many will p-hack, cherry pick, and even reinvent study hypotheses to ‘predict’ unexpected findings.
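For reference, the formal definition of the power being discussed here (standard statistical notation, not specific to Cortex): if β is the probability of a Type II error, that is, of failing to detect a true effect, then

```latex
% Statistical power: the probability that a test detects a genuine effect.
% \beta is the Type II error rate (missing a true effect).
\mathrm{power} = 1 - \beta = P\left(\text{reject } H_0 \mid H_1 \text{ true}\right)
```

An underpowered study is one in which this probability is low, so even real effects often fail to reach significance, feeding exactly the selective-reporting behaviours described above.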

Such practices cause predictable phenomena in the literature, such as poor repeatability of results, a prevalence of studies that support stated hypotheses, and a preponderance of articles in which obtained p values fall just below the significance threshold. Furthermore, an anonymous survey recently showed that these behaviours are not the actions of a naughty minority – in psychology and neuroscience they are the norm. We ourselves are guilty.

Professor Sergio Della Sala

Registered Reports will help minimise these practices by making the outcome of experiments almost irrelevant in reaching editorial decisions. Cortex is the first journal to adopt this approach, but our underlying philosophy is as old as the scientific method itself: If our aim is to advance knowledge then editorial decisions must be based on the strength of the experimental design and the likelihood of a study revealing definitive results – and never on how the results themselves appeared.

We know that other journals are watching Cortex to gauge the success of Registered Reports. Will the format be popular with authors? Will peer reviewers be engaged and motivated? Will the published articles be influential? We have good reasons to be optimistic. In the lead-up to Registered Reports, many scientists have told us that they look forward to letting go of the toxic incentives that drive questionable research practices. And our strict peer review will ensure that our published findings are among the most definitive in cognitive neuroscience.



Update on EES User Profile Consolidation

Since the launch of the EES user consolidation project in December last year, thousands of researchers have responded. Find out more…



Edward O'Breen | Marketing and Brand Manager, Elsevier

In a recent post on the Short Communications Board, our Vice President of Corporate Relations, Tom Reller, discussed the hacking of EES, our online platform for managing the submission and peer-review process.

He explained that in late October last year, one of the editors of Optics & Laser Technology (JOLT) alerted our EES team that reviewers for two of his assigned submissions had been invited but not by him. Our team immediately launched an investigation and discovered that someone had been able to retrieve the EES username and password information for this editor.

Tom went on to outline the various steps we are taking to reduce these risks, noting that one of these innovations - user profile consolidation - had become available to all EES users on December 3, 2012.

Consolidation of user profiles was a project the EES team was working on prior to the hacking.  A regular audit of EES had identified the many advantages that enabling researchers to use a single username and password across all EES journal sites would provide. Not only would it streamline their workflow, it would increase security levels too.

Since December 3, about 350,000 users have consolidated more than 950,000 individual EES accounts into about 350,000 consolidated user profiles.

Alongside the user profile consolidation, we have also introduced enhancements in security and user data protection. EES users can now reset their passwords via a self-chosen security question. They will receive a confirmation by email and only the user will have access to the password and security question.  This makes the end user responsible for his/her own data and helps to avoid abuse of EES accounts.

On December 19 last year, we surveyed those EES users who had consolidated their accounts since the December 3 launch. More than 400 researchers provided their feedback, which revealed:

  • 85% consolidated their accounts immediately after logging into EES
  • 83% needed less than 10 minutes to consolidate their accounts
  • 88% were satisfied to have a consolidated account, while 3.5% were dissatisfied. Those who recorded a dissatisfied reaction identified the main drawback as being that they still have to log into each EES site separately – they would like to log in once and view their tasks across journals. This will be fixed in Evise, the next-generation editorial system Elsevier is working on.
  • 87% approved of the fact that the end user is now solely responsible for updating their personal information; 3% disapproved.

For a few days following the December 3 launch, EES servers were slow to respond due to the large number of users consolidating their profiles. We appreciate that this was very frustrating for users and have worked on improving the situation. Thankfully, very few users still experience this problem, and we have seen calls to our Elsevier Customer Services team fall from 1.6% to 0.2%.


Faking Peer Reviews

Someone found a way to infiltrate the Elsevier Editorial System; Tom Reller, Vice President of Global Corporate Relations at Elsevier, explains what happened and what we’ve done.



Tom Reller | Vice President of Global Corporate Relations, Elsevier

This article originally appeared in Elsevier Connect

Yesterday, Ivan Oransky of Retraction Watch reported that the Elsevier Editorial System (EES), our online platform for managing the submission and peer-review process, had been hacked in November. His article, “Elsevier editorial system hacked, reviews faked, 11 retractions follow,” is an accurate account of what happened and a good example of the positive role Retraction Watch can play in monitoring the scientific literature. The Retraction Notices posted by the Elsevier journals themselves provided details about the falsified reports:

A referee’s report on which the editorial decision was made was found to be falsified. The referee’s report was submitted under the name of an established scientist who was not aware of the paper or the report, via a fictitious EES account. Because of the submission of a fake, but well-written and positive referee’s report, the Editor was misled into accepting the paper based upon the positive advice of what he assumed was a well-known expert in the field. This represents a clear violation of the fundamentals of the peer-review process, our publishing policies, and publishing ethics standards. The authors of this paper have been offered the option to re-submit their paper for legitimate peer review.


What happened here is that in late October, one of the editors of Optics & Laser Technology (JOLT) alerted our EES team that reviewers for two of his assigned submissions had been invited, but not by him. Our team immediately launched an investigation and discovered that someone had been able to retrieve the EES username and password information for this editor.

Fake reviews are becoming an increasingly challenging issue for publishers, but one we’re prepared to confront. We participated in a story in The Chronicle of Higher Education back in September, also stemming from someone creating fake reviewer accounts. In that case, the editors noticed the reviews were coming in from emails with generic email contacts (i.e., yahoo or gmail) and not institutional emails. Here, it was clear the author himself had created the fake reviewer accounts.

What is Elsevier doing to protect EES users?

We regularly conduct an audit of EES tools and processes to determine where improvements can be made. The major recommendations from the most recent audit prompted a security change that was introduced: User Profile Consolidation. Consolidated profiles in EES are protected from the malicious use that occurred in this scenario because the registered user has total control over the personal information in the user profile. More information about the benefits of User Profile Consolidation can be found on this Profile Consolidation FAQ.

In July, we ran a pilot making user profile consolidation in EES available to almost 1,000 “very active” users. The first pilot was successful, with 90 percent of these pilot users consolidating approximately 4,000 entitlements. Pilot users were surveyed for feedback on the process, including the level of effort required and the provision of help and support. This pilot ran for 10 weeks, and the process itself, the supporting documentation and the communications were improved before a second pilot was introduced on October 10. This second pilot extended user profile consolidation to 16,500 additional users and has also proven very successful.

After the successful pilots, user profile consolidation became available to all users on December 3. Elsevier encourages all EES users to complete this process as soon as possible; we’ve already seen more than 100,000 unique users consolidate their accounts. In the coming weeks, we will proactively support larger numbers of frequent users through this process as necessary.

In addition to User Profile Consolidation, we have implemented other changes that were recommended by Elsevier’s internal Security and Data Protection team, not all of which would be wise for us to discuss publicly. It has also been suggested that the new ORCID program has the potential to reduce this type of fraud.

The challenge for us is not so different from that of other companies: finding the right balance between security control and customer ease of use. One result is that editors may have to do more to keep their accounts safe, much like people have to do more to access their online bank accounts, though clearly there are differences here. Another important aspect of fraud detection in academic publishing is that no matter how strong we make protocols and controls, there is always going to be a human element – a role for editors and publishers to flag when something looks out of line.

Scientific fraud and misconduct are a growing concern in the scientific community and something Elsevier commits significant resources to confronting. That includes an information security team that is acutely aware of the risks and vulnerabilities of any online system. The reality today is that hacking and spoofing can and will occur, but we believe we acted quickly here, that the impact is minimal, and that we have taken the necessary steps to eliminate the threat posed, at least through this method.

We’ll be paying close attention to the discussion surrounding this incident and will try to address any questions that arise.


Peer Review Challenge Winners Unveiled

“This Challenge has shown that the scientific community is keen to publicly and systematically acknowledge reviewers’ work…” – Philippe Terheggen, Senior Vice President, Physical Sciences II. Find out who won…



"This Challenge has shown that the scientific community is keen to publicly and systematically acknowledge reviewers’ work..." Philippe Terheggen, Senior Vice President, Physical Sciences II

Since the 1660s, peer review has proved an essential dividing line for judging the difference between science and speculation.

Over the years, however, pressure on the system has continued to grow as the global academic community expands and manuscript submissions rise.

We know that it can be tough for you as Editors to not only find appropriate reviewers willing to review for your journal, but to motivate them to keep reviewing, time and time again. Together with the research community, we are keen to explore ways to ease that pressure and are currently working on a number of peer-review pilots and innovations. One avenue we recently explored was a global competition seeking researchers’ thoughts on the future of peer review. This was inspired by the success of similar competitions such as the Elsevier Grand Challenge and the Executable Paper Grand Challenge.

In an issue of Reviewers’ Update earlier this year, we threw down the gauntlet to readers, asking them to submit ideas that could significantly add to the current peer-review system. We also invited entries that explored how Publishers and Editors can help early career researchers become reviewers, or how reviewers can be recognized by either their institutes or Publishers.

More than 800 readers took up the challenge, and entries embraced all aspects of peer review, from improvements to the actual ‘mechanics’ of the process to how to reward reviewers. It was interesting to note that a great many entries detailed suggestions for rewarding reviewers in ways that can be noted on CVs. In addition, there was an intriguing dichotomy between entries that were firmly in favour of extending double-blind peer-review systems and those that advocated completely open peer review.

The judges chose a list of finalists, whose ideas were published on the Peer Review Challenge website with an invitation to the academic community to post their comments. More than 300 of you responded, and those comments were taken into account by the judges when making their final decision. Many teleconference calls and emails later, the final three winners were chosen; their winning entries can be found below.

Philippe Terheggen, Senior Vice President Physical Sciences II, sat on the judging panel, and was impressed by the quality of the entries. “Thank you to everyone who took the time to enter, or provide us with their feedback. We were really pleased by the number of entries, as well as the thought and initiative invested in them. The next stage is to work with our winners to develop their ideas. We need to determine whether they can be integrated into existing systems, like the peer review annotation idea, or develop the framework necessary to manage and deliver the reviewer points system in a logical way.”

He added: “This Challenge has shown that the scientific community is keen to publicly and systematically acknowledge reviewers’ work, as well as to utilize platforms that make reviewing easier and more transparent for all players.  It is our task as Publishers to work with the community, and your task as Editors to make sure that these needs are met.”

If you would like to get involved in any of the pilots mentioned below, please do let me know at c.lehane@elsevier.com.

Overall winner


Elsevier Reviewer Badges and Rewards

Simon Gosling,  Lecturer in Physical Geography, Nottingham University, UK

Simon suggested introducing a standardized way to recognize the cumulative effort of a particular reviewer.  He explains: “Peer reviews are an important service to the academic community and fundamental to academic publishing. I entered the Challenge because I feel strongly that reviewers should be recognized and rewarded in some way for the time and effort they invest – often on top of a very busy schedule – in preparing article reviews for journal Editors. I see the future of peer review as a positive feedback cycle whereby reviewers (especially early career researchers) – encouraged by an opportunity to enhance their reputation as a reliable and good reviewer and to receive journal and/or book discounts – prepare high quality helpful reviews that in turn help to make the job of journal Editors more straightforward. This way, authors, Editors and reviewers all benefit from the peer-review process. To date, reviewers have received little benefit, let alone anything tangible. I envisage the acknowledgment of reviewers’ time and efforts through a widely recognized tangible accrediting system that would be well-known and citeable on a CV, which I have termed “Elsevier Reviewer Badges”.

Visit the Peer Review Challenge website for further details of Simon’s winning entry.

Runner up


Top Reviewer Incentives

Michael Muthukrishna, Graduate student, Department of Psychology, University of British Columbia, Canada

Michael’s idea also concentrates on reviewer rewards and develops the idea of a public points system for reviewers, inspired by websites such as Reddit and Slashdot.  Michael elaborates: “As a researcher and a technologist, I was thrilled to hear about Elsevier’s Peer Review Challenge. Technology affords new and often better ways of approaching old problems. The peer-review process consists of established and largely successful practices that support good science and I am cognizant that caution is required when tampering with it. Nevertheless, I see the peer-review process gradually taking advantage of the advancements tested in public, peer-reviewed forums, such as Reddit, Slashdot.org, StackOverflow, Amazon, and many open source software projects. The successful processes in these domains have converged on solutions supported by psychological research, including the power of reputation, which I focused on in my entry.”

Visit the Peer Review Challenge website for further details of Michael’s entry.

Runner up


Peer Review Annotation Application

Koen Hufkens, Postdoctoral research associate, Faculty of Bioscience Engineering, Ghent University, Belgium

In contrast to the previous ideas, Koen’s entry suggests that the online platform facilitating peer review could be improved. He suggests creating an integrated review application which would, among other things, include a PDF reader with annotation capabilities and generate a review template listing the changes to be made based upon the annotations on the PDF. This would save reviewers’ time. Of his idea, Koen says: “As a scientist, reviewing for journals is considered an integral part of your work. However, given the voluntary nature of this task, often little time is budgeted towards it. Talking to colleagues, it became apparent to me that reviewing for journals is something done when one has a few minutes to spare, e.g. on a flight, train or bus to the next meeting or while commuting home. I entered the competition with the idea that a lot of time could be saved by integrating some of today’s technology into a reviewing platform.

“This platform would integrate reading and annotating on the original document, summarizing these ‘in line’ notes into a concise summary for the authors to read. Ideally, this platform would be online and cloud-based so you could easily pick up and continue where you left off the day before. It’s easy to see how trends such as tablet applications could extend this basic idea to provide even greater flexibility and ease of use for the reviewer on the go, or on the couch. As a good work-life balance is limited by available free time, I feel that providing the tools to free up extra time should be considered a major incentive for most scientists to continue reviewing for Elsevier.”

Note from Ed: Koen’s ideas chime perfectly with current plans for Evise, Elsevier’s next-generation editorial system. The roll-out of Evise will begin in the second half of 2013 and will include online viewing and annotation of submissions. Reviewers and Editors will be able to view the manuscript and add comments to it online. These comments will then be saved with the submission and can be made available to the author. Koen’s winning entry will be shared with the Evise team.
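Purely to illustrate the core of Koen’s idea, and assuming the open-source PyMuPDF (fitz) library rather than anything in his entry or in Evise, a draft review template could be harvested from a manuscript’s PDF annotations like this:

```python
import fitz  # PyMuPDF

# Sketch of the annotation-to-review-template idea; function and file
# names are illustrative, not part of any Elsevier system.
def review_template_from_annotations(pdf_path: str) -> str:
    """Collect reviewer annotations from a PDF into a plain-text review draft."""
    doc = fitz.open(pdf_path)
    lines = ["Comments to the authors:", ""]
    for page in doc:
        for annot in page.annots():
            note = annot.info.get("content", "").strip()
            if note:
                # Keep the page number so authors can locate each comment.
                lines.append(f"- p.{page.number + 1}: {note}")
    return "\n".join(lines)

print(review_template_from_annotations("manuscript_under_review.pdf"))
```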

Visit the Peer Review Challenge website for further details of Koen’s entry.

Author Biography

Clare Lehane
PUBLISHER, ENERGY AND PLANETARY SCIENCES
Clare graduated from University College Cork, Ireland, with a PhD in Marine Ecology in 2004 and since then she has worked in various aspects of publishing. She has been working with Elsevier since 2006 and is a Publisher on the Energy and Planetary Sciences portfolio where she has responsibility for 11 journals, across the nuclear energy, solar materials, bioenergy and greenhouse gas areas, along with three planetary sciences journals.


 

 


An Update on Research Regarding Reviewer Expertise

Dr Michael L Callaham, Editor-in-Chief of Annals of Emergency Medicine, writes about his research looking into methods of educating peer reviewers.



Dr Michael L Callaham, MD | Editor-in-Chief, Annals of Emergency Medicine

Dr Michael L Callaham, MD, is Chair of the Department of Emergency Medicine and Professor of Emergency Medicine at the University of California, San Francisco (UCSF) School of Medicine. He is also Editor-in-Chief of Annals of Emergency Medicine, the official journal of the American College of Emergency Physicians. He received his MD from UCSF in 1970 and carried out his residency in Emergency Medicine at the USC Medical Center, Los Angeles, CA. He is a member of the Institute of Medicine, National Academy of Sciences.

As a result of his editing and publishing experience, his research interests have turned to trying to better understand the scientific peer review publication process through research into methods of educating peer reviewers, as well as research into bias and its impact on scientific publication.

 

Good peer reviewers play a major role in the quality of the science a journal publishes, and many journals have trouble finding a sufficient supply of reliable ones. It would therefore be valuable for journals to know what characteristics identify a good reviewer in advance, and/or how to improve reviewers’ skills once they are reviewing. In the past decade our understanding of this topic has deepened, but the results are not encouraging.

It would be very desirable for editors to be able to identify high-quality reviewers to target for recruitment or, at the time of recruitment, to weed out those who will not perform well. Several studies, one including 308 reviewers and 32 editors, showed that factors such as special training and experience (including taking courses on peer review, academic rank, experience with grant review, etc.) were not reflected in the quality of reviews subsequently performed. There was a trend towards better performance among those who had a degree in epidemiology or statistics, as well as those who had already served on an editorial board. Several papers found that more experienced reviewers (more than 10 years out of residency) performed more poorly, but for all these variables the relationship was weak and the odds ratios were less than 2.
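For readers less familiar with the metric: if p₁ and p₀ are the probabilities of producing a high-quality review with and without a given characteristic, the odds ratio quoted above is

```latex
% Odds ratio for a binary reviewer characteristic:
% the odds of a high-quality review with the characteristic,
% divided by the odds without it.
\mathrm{OR} = \frac{p_1 / (1 - p_1)}{p_0 / (1 - p_0)}
```

An OR of 1 means no association at all, so values below 2 indicate only a modest link between the characteristic and review quality.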

Therefore, if we cannot identify good reviewers in advance, perhaps we can train them to perform good reviews once on board. A number of studies have examined the impact of formal reviewer training, most of them focusing on the traditional half-day voluntary interactive workshop format. In all these studies, attendees were enthusiastic about the workshop training, felt it would improve the quality of their subsequent reviews, and performed better on a post-test of their understanding of peer review. Unfortunately, even when compared to controls with similar previous volume and quality ratings, none of these predictions came true and the objective quality scores of attendees did not change at all. At the journal in these studies, this led to the abandonment of these methods, although review quality subsequently rose steadily thanks to other interventions.

These failures led to the study of more substantial interventions that would still be logistically reasonable for a journal to implement. One involved increased feedback to reviewers, who were not only given explicit information about what was expected in the review, but also received copies of other reviews of the same manuscript with the editor’s rating of each of those reviews, a copy of a truly superb review of a different manuscript, and the rating they received on their own review. These interventions (carried out on about four reviews for each subject) had no significant impact on subsequent quality performance. Finally, a recent study identified volunteer mentors among the reviewers with the highest performance ranking for review quality, matched them with randomly selected reviewers new to the journal, and encouraged the pairs to discuss each review by phone or email. Like the previous studies, for reasons of practicality this typically involved only three or four reviews per subject, and like the other interventions it had no effect compared to the control group, who received no special attention.

We can conclude that so far none of the fairly easy approaches to reviewer training have been shown to have any effect, probably because the amount of feedback and interaction needed to teach the complex skills of critical appraisal is much greater than the time allotted to this task by editors and senior reviewers.

What then is a poor editor to do? We cannot identify good reviewers in advance, and we cannot train them in any relatively easy, low-resource fashion. This makes it all the more crucial to adopt a validated and standardized editor rating of review quality and to use it on all reviews. This allows editors to identify reviewers by the quality of their performance; periodically stratifying those reviewers and steering more reviews to the good ones has been shown to have a significant effect on the quality and timeliness of reviews as a whole. All this, of course, assumes that one has enough reviewer raw material to make choices, which unfortunately is a luxury many smaller journals do not possess.
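As a rough illustration of that stratification step, here is a minimal sketch in Python. It is purely illustrative: the reviewer ids and the 1-5 rating scale are invented, and this is not the journal's actual system. It simply ranks reviewers by their mean editor-assigned quality score and pulls out the top tier to steer more reviews towards.

    from collections import defaultdict
    from statistics import mean

    # Hypothetical data: one (reviewer_id, editor_rating) pair per completed
    # review, scored by the editor on an assumed standardized 1-5 scale.
    ratings = [
        ("rev-001", 4.5), ("rev-001", 4.0),
        ("rev-002", 2.5), ("rev-002", 3.0),
        ("rev-003", 5.0), ("rev-003", 4.5),
    ]

    def top_tier(ratings, top_fraction=0.25):
        """Rank reviewers by mean quality score and return the top tier,
        to whom an editor would steer more (and more important) reviews."""
        by_reviewer = defaultdict(list)
        for reviewer, score in ratings:
            by_reviewer[reviewer].append(score)
        ranked = sorted(by_reviewer, key=lambda r: mean(by_reviewer[r]), reverse=True)
        cutoff = max(1, round(len(ranked) * top_fraction))
        return ranked[:cutoff]

    print(top_tier(ratings))  # ['rev-003']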

This article first appeared in Reviewers' Update, Issue 12, October 2012



Improving the Reporting of Clinical Research – An Editor’s View

Editor Dr David L Schriger muses on what can be done to improve both the quality of science and its reporting.

Read more >


Dr David L Schriger | Deputy Editor, Annals of Emergency Medicine

In addition to his role as Deputy Editor of Annals of Emergency Medicine, Dr Schriger is a member of the CONSORT and EQUATOR initiatives. His research focuses on improving the credibility of medical literature through the detailed presentation of results via figures and tables.

The last 20 years have seen much written about the poor quality of medical literature [1]. Recent endeavors such as EQUATOR (and its component reporting guidelines) and the Peer Review Congress (which has fostered interest in journal quality) have sparked considerable improvement. However, there is more to be done to improve both the quality of the science and the quality of the reporting of the science. Journals can play an important role in both areas. The first step for journals interested in doing so is to step beyond three common misconceptions.

First, there is a misguided obsession with statistics, a misconception that distracts authors, reviewers, and readers from more fundamental issues [2]. Classical statistics is concerned with differentiating observations expected by chance alone from those unlikely to be due to chance, thereby suggesting a potentially important association. While random error is a legitimate concern, particularly for small studies with positive results, in clinical research concerns about random error are dwarfed, or should be dwarfed, by concerns about non-random error, also known as confounding or bias [3].

When problems occur in clinical studies they are typically related to the methodology of the study, not the statistics. In reviewing more than 2,500 papers for Annals of Emergency Medicine and other journals over the past 25 years, I have seldom found a paper for which the main deficiency was the use of the wrong statistic or the miscalculation of a statistic. In contrast, I routinely read studies that are poorly designed or fail to account for the presence of confounding in their analyses or conclusions. I also commonly find studies that devote multiple paragraphs in the Methods and Results sections to statistical concerns but fail to include even a single sentence about non-random error. A skeptic might think that the obsession with statistics is a diversionary smokescreen designed to distract readers from fundamental problems with confounding and bias.

Second, journals often have ill-defined goals for their review process. Review processes can ask several questions, including:

a) Is the topic of the paper appropriate for our audience?

b) Is the reporting of the science complete? Does the paper provide all of the information that a knowledgeable, critical reader needs to reach a conclusion about the work?

c) Is the science correct?

A common misconception is that c) is a legitimate goal of peer review. While it is certainly appropriate that the peer-review process filters out abject garbage (papers whose claims are unsubstantiated or ludicrous), caution should be taken to ensure that reviewers are critiquing the research design, analytic methods, and the quality of the reporting of the results, not the conclusion. Otherwise, journals will reject articles that conclude that ulcers are caused by bacteria just because the conclusion is unexpected. Instead, peer review should focus on ensuring that readers have all the information they need to reach their own decisions about the paper's conclusion. From this perspective, peer review's purpose is to bring readers complete presentations that meet methodological standards and standards for comprehensive reporting. Don't worry whether the authors have found truth; worry about whether they have told a complete story. The scientific process will take care of the rest [4].

The third misconception is that article quality is the responsibility of the authors, not the journal. While it is certainly true that better journals tend to get better papers, there is ample evidence that even the papers in the highest-impact journals have problems with incomplete or suboptimal reporting [5,6]. Research suggests that these problems are only corrected if the journal identifies them and insists that they be fixed [7,8]. A journal must take an active role in setting expectations and enforcing them if the reporting of science is to be improved.

At Annals of Emergency Medicine, we recognized these issues and have taken a series of steps to improve our journal. I share with you a number of them so you may consider whether they would be appropriate for your journal.

In 1997, the editors recognized that bias was the greatest threat to the veracity of the work being published and decided that all research papers would be reviewed by one of a small cadre of ‘methodology/statistics’ reviewers in addition to the typical content reviewers. Experience had shown us that the ideal person to perform this function is not a full-time statistician but a clinician-researcher who thoroughly understands methodology and knows enough statistics to know when formal statistical review is needed. This program has proved successful: the quality of reviews has improved, as has the quality of the published papers [9-11]. Starting six years ago, this program was supplemented by a check of the appropriateness and quality of tables and figures in papers about to be offered acceptance or revision [12,13].

These two programs have improved the journal and have slowly trained the author community in the journal's standards (which are stated in detailed Instructions for Authors initially composed in 2003 [14,15]). Over time, the methodology/statistical reviewers have had an easier time because papers come in with many of our requirements already met. In summary, our experience leads me to offer the following guidance to journals trying to improve their quality:

1) The main problem is study methodology, not statistics. Put your efforts into carefully critiquing each paper's methodology. Do not assume that regular reviewers will do this well. Identify reviewers who are capable of doing this job and use them. With more and more physicians getting clinical epidemiology training in public health and other graduate programs, finding such reviewers is getting easier. If you want them to do lots of reviews, compensate them.

2) The second problem is the quality of reporting. Get familiar with EQUATOR-network.org and the reporting guidelines for different types of research (CONSORT, STARD, PRISMA, STROBE...). Recognize, however, that these guidelines may be insufficiently detailed regarding specific nuances of your field and are not as strong on the presentation of results as they are on the presentation of methods. Augment them as needed.

3) Discourage papers that hide behind a torrent of statistics and models instead of showing readers the actual data. Editors and reviewers should ask: "Are methods and results presented in sufficient detail that learned readers can decide whether they agree or disagree with the conclusion?" Focus on whether the paper is fully reported rather than on whether the science is correct.

By refocusing peer review on the paper's methodology - as opposed to its statistics - and on the quality of the reporting of the science, editors can improve the quality of research articles in their journals.

References:

1. Altman DG. The scandal of poor medical research. BMJ 1994;308:283–284.

2. Schriger DL. Problems with current methods of data analysis and reporting, and suggestions for moving beyond incorrect ritual. Eur J Emerg Med 2002;9:203–207.

3. Goodman SN. Toward evidence-based medical statistics. 1: The p-value fallacy. Ann Intern Med 1999;130:995–1004.

4. Ziman JM. Reliable Knowledge: An Exploration of the Grounds for Belief in Science. Cambridge: Cambridge University Press, 1978.

5. Glasziou P, Meats E, Heneghan C, Shepperd S. What is missing from descriptions of treatment in trials and reviews? BMJ 2008;336:1472–1474.

6. Hopewell S, Dutton S, Yu LM, Chan AW, Altman DG. The quality of reports of randomised trials in 2000 and 2006: comparative study of articles indexed in PubMed. BMJ 2010;340:c723.

7. Plint AC, Moher D, Morrison A, Schulz K, Altman DG, Hill C, Gaboury I. Does the CONSORT checklist improve the quality of reports of randomised controlled trials? A systematic review. Med J Aust 2006;185:263–267.

8. Goodman SN, Berlin J, Fletcher SW, Fletcher RH. Manuscript quality before and after peer review and editing at Annals of Internal Medicine. Ann Intern Med 1994;121:11–21.

9. Schriger DL, Cooper RJ, Wears RL, Waeckerle JF. The effect of dedicated methodology and statistical review on published manuscript quality. Ann Emerg Med 2002;40:334–337.

10. Goodman SN, Altman DG, George SL. Statistical reviewing policies of medical journals: caveat lector? J Gen Intern Med 1998;13:753–756.

11. Day FC, Schriger DL, Todd C, Wears RL. The use of dedicated methodology and statistical reviewers for peer review: a content analysis of comments to authors made by methodology and regular reviewers. Ann Emerg Med 2002;40:329–333.

12. Cooper RJ, Schriger DL, Tashman D. An evaluation of the graphical literacy of Annals of Emergency Medicine. Ann Emerg Med 2001;37:13–19.

13. Cooper RJ, Schriger DL, Close RJ. Graphical literacy: the quality of graphs in a large-circulation journal. Ann Emerg Med 2002;40:317–322.

14. Cooper RJ, Wears RL, Schriger DL. Reporting research results: recommendations for improving communication. Ann Emerg Med 2003;41:561–564.

15. Schriger DL. Suggestions for improving the reporting of clinical research: the role of narrative. Ann Emerg Med 2005;45:437–443.

Journal of Public Economics

Refereeing Behavior and the Determinants of Altruism

The Editors of the Journal of Public Economics conducted an innovative experiment exploring how to motivate pro-social behavior. Here they share their findings…

Read more >


Raj Chetty, Emmanuel Saez and László Sándor | Editors, Journal of Public Economics

Study Overview

Raj Chetty

We chose to collaborate with Elsevier on a study into how we can motivate pro-social behavior. Timely and useful refereeing is usually understood to be a favor to colleagues, an unpaid but important task for the profession. As such, it is a good candidate for comparing financial incentives with other, less expensive alternatives in situations where the work benefits the public more than it rewards the worker.

The study has highlighted some key factors:

  • Deadlines are extremely effective.
  • Moderate financial incentives elicit a large response.
  • Public comparison to peers has a moderate effect.
  • Effects differ between younger referees and established, tenured ones.
  • There is no backlash when financial rewards are absent or phased out.

Conducting the Experiment

In 2010, we randomly assigned approximately 1,000 referees to one of four groups:

  • A control group with the usual 6-week due date.
  • A short deadline group with a 4-week due date.
  • A cash incentive group where a $100 payment was made for reports submitted by the shorter deadline.
  • A social incentive group where referees’ turnaround times were publicly posted on the journal’s website.
Emmanuel Saez

Following the same process throughout, the majority of referees received multiple invitations during the study, which ended in November 2011. Using data retrieved from the Elsevier Editorial System (EES), the team studied refereeing times both before and after the experiment, as well as comparing them to those of other Elsevier journals. The differing approaches had small but significant impacts on whether referees agreed to review a paper, but do not appear to have generated differential selection. The shorter, four-week deadline reduced turnaround times by an average of 10 days. The cash offer doubled this effect, reducing referee times by a further 10 days, and the social incentive treatment reduced turnaround times by approximately 5 days. Less surprising in hindsight, the study showed that tenured professors were most responsive to social pressure, while untenured referees were most responsive to deadlines and cash offers.
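For readers who like to see the mechanics, here is a minimal sketch in Python of the two core steps described above: random assignment to the four arms, and a simple difference-in-means readout against the control group. The arm labels and function names are invented for illustration; this is not the study's actual analysis code.

    import random
    from collections import defaultdict
    from statistics import mean

    ARMS = ["control_6wk", "deadline_4wk", "cash_100", "social_posting"]

    def assign_arms(referee_ids, seed=2010):
        """Randomly assign each referee to one of the four experimental arms."""
        rng = random.Random(seed)
        return {ref: rng.choice(ARMS) for ref in referee_ids}

    def effects_vs_control(assignment, turnaround_days):
        """Each arm's mean turnaround (in days) minus the control mean:
        a simple difference-in-means estimate of the treatment effects."""
        by_arm = defaultdict(list)
        for ref, days in turnaround_days.items():
            by_arm[assignment[ref]].append(days)
        control = mean(by_arm["control_6wk"])
        return {arm: mean(days) - control for arm, days in by_arm.items()}

On the numbers reported above, such a readout would come out at roughly -10 days for the four-week deadline arm, -20 days for the cash arm and -5 days for the social posting arm, relative to the six-week control.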

The study also showed that a common misgiving about financial incentives - that they may crowd out intrinsic (or altruistic) motivation - does not appear to apply here: referees were no slower at concurrent or later unpaid jobs. Finally, all the approaches showed only modest impacts on the quality (length) of reports, the recommendations of the referees and/or the final decision of the editors.

Presenting the study

László Sándor

This study was presented at the National Tax Association Annual Conference in November 2011 and will be part of a session on journals and academic publishing at the Summer Institute of the National Bureau of Economic Research in July 2012. The session will be attended by editors of leading journals, who may decide to re-evaluate their own journal's policies based on the evidence from this innovative experiment conducted by the research team, along with Elsevier.

This article first appeared in Reviewers' Update, Issue 11, July 2012.


New guide shines a light on peer-review process

A desire to understand the inner workings of the peer-review system has led a group of early career researchers to publish a new guide on the topic.

Read more >


Julia Wilson | Development Manager, Sense About Science
Sense About Science is a UK charity that equips people to make sense of science and evidence.

A desire to understand the inner workings of the peer-review system has led a group of early career researchers to publish a new guide on the topic.

Peer review: The nuts and bolts

Members of Sense About Science’s Voice of Young Science (VoYS) network, an active group of early career researchers who stand up for science in public debates and inspire their peers to do the same, were behind the guide. They were keen to discover how to get involved in peer review and what is being done to address some of the criticisms of the system, such as bias from reviewers. So, armed with a collection of concerns raised by their peers, they set off to interview scientists, journal editors, grant body representatives, patient group workers and journalists worldwide. The end result is the new guide, Peer review: the nuts and bolts, which is aimed at early career researchers. It received its official launch at the EuroScience Open Forum (ESOF) in Dublin this July.

In 2009, Sense About Science partnered with Elsevier to conduct one of the largest international surveys of authors and reviewers, which highlighted how dedicated the scientific community is to peer review: 90% of respondents review articles because they like playing their part as a member of the academic community; 85% enjoy seeing papers and being able to improve them; and 91% believe their own last paper was improved through the peer-review process.

Just as a washing machine has a quality kite mark, peer review is a kind of quality mark for science. It tells you that the research has been conducted and presented to a standard that other scientists accept. At the same time, peer review is not saying that the research is perfect (nor that a washing machine will never break down). I’m surprised that such an integral and valuable contribution from scientists is often given so little recognition in academia, and that early career researchers receive so little training in how to do it.

In writing the guide, the authors of Peer review: the nuts and bolts have not avoided criticisms of the peer-review process. They have asked journal editors and reviewers some challenging questions about scientific fraud and plagiarism going undetected; issues of trust and bias; ground-breaking research taking years to publish; and the system benefiting a closed group of scientists.

What became clear was that early career researchers are frustrated by the lack of formal recognition for reviewing. With so many pressures to secure grant funding and publish research, there is a risk that reviewing will become marginalised and, inevitably, inconsistent and shoddy.

Reviewing is currently not included in the Research Excellence Framework (REF) in the UK (the new system for the allocation of funding to higher education institutions). Members of the VoYS network decided to do something about this and wrote an open letter to Sir Alan Langlands, Chief Executive of the Higher Education Funding Council for England, calling for formal recognition of reviewing in the REF. In the letter, the early career researchers told Sir Alan: “Recognising reviewing as part of the REF would ensure that it is prioritised and safeguarded by university departments, [...] and approached professionally and seriously, enabling senior researchers to spend time mentoring early career researchers like ourselves in these activities.” A copy of their letter can be found on the Sense About Science website.

Dr Irene Hames

Their call was supported by high profile editors and experts in the field, including Dr Irene Hames, Editorial Consultant and author of Peer Review and Manuscript Management in Scientific Journals, who spoke at our discussion on peer review at ESOF 2012 to mark the launch of our peer-review guide. Dr Hames said in support of the early career researchers’ open letter: “Peer reviewing involves a lot of time and effort by researchers [...] There is, however, currently no formal recognition of peer reviewing as a professional activity. Better recognition would be especially important for early career researchers, to demonstrate not only their contribution to this important activity, but their recognition as experts in their research areas.”

Peer review: the nuts and bolts is available to download from the Sense About Science website. For hard copies, please send requests to publications@senseaboutscience.org.


Planned 2012 Innovations Promise Easier-to-Use EES

As many of you know, Elsevier is currently building Evise, our next generation online submission and peer-review system.  The rollout of Evise is planned to begin in the second half of 2013 and to prepare for a smooth transition, 2012 will see the introduction of new features to our current system, EES. These include something […]

Read more >


As many of you know, Elsevier is currently building Evise, our next-generation online submission and peer-review system. The rollout of Evise is planned to begin in the second half of 2013 and, to prepare for a smooth transition, 2012 will see the introduction of new features to our current system, EES.

These include something we know you have been keen to see – a single username and password across all EES journal sites.

Single login across EES journal sites

Researchers have multiple roles in publishing: many authors are also reviewers; many Editors are also authors and reviewers. And researchers can perform these roles for multiple journals. We know that EES does not recognize this sufficiently so, later this year, we will begin the task of consolidating all user accounts.

How to consolidate your account

Once the change has been rolled out, when you log into EES you will receive a prompt to consolidate your accounts. EES looks for matching associated email addresses when deciding which accounts to group together. If you have used different email addresses per EES site, you can indicate this during consolidation. Once you have selected the accounts to consolidate, you will receive a confirmation email. This is sent to ensure that only the account owner can give approval.

During consolidation, you will also be asked to choose a security question and answer. You will need these to reset your password if you forget it.

You will have 30 days to consolidate your accounts. After this period, you will only be able to use EES if you have consolidated your accounts.
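Conceptually, the matching step described above boils down to grouping accounts that share an email address. Here is a minimal sketch of that grouping logic in Python (illustrative only: the account ids are invented, and this is not Elsevier's actual implementation):

    from collections import defaultdict

    def group_accounts(accounts):
        """Group accounts that share at least one email address.
        `accounts` maps an account id to the email addresses registered
        on it; returns sets of account ids that are candidates for
        consolidation (a classic union-find grouping)."""
        parent = {}

        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]  # path compression
                x = parent[x]
            return x

        def union(a, b):
            root_a, root_b = find(a), find(b)
            if root_a != root_b:
                parent[root_a] = root_b

        first_owner = {}  # email address -> first account seen with it
        for account, emails in accounts.items():
            parent.setdefault(account, account)
            for email in emails:
                if email in first_owner:
                    union(account, first_owner[email])
                else:
                    first_owner[email] = account

        groups = defaultdict(set)
        for account in accounts:
            groups[find(account)].add(account)
        return list(groups.values())

For example, group_accounts({"journal-a": ["a@uni.edu"], "journal-b": ["a@uni.edu", "alt@home.net"], "journal-c": ["x@lab.org"]}) would propose consolidating the first two accounts and leaving the third alone. Accounts registered under entirely different addresses per site, which the real prompt lets you declare manually, would not be caught by automatic matching of this kind.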

Figure 1. The consolidation notification screen.

Logging in to EES after consolidation

After you have followed the consolidation procedure, you will be able to use the same username and password to access each EES journal site you use. Your primary email address in EES will be your username. You will continue to log into each EES journal site separately, and if you have multiple roles for a single journal, you will need to log off and log in again to switch your user role.

Rollout timing

The new user consolidation functionality will be piloted in July and August 2012, with rollout activity ramping up from September 2012 onwards. We will keep you informed of our progress by email.

Online support consolidation

We are also working on consolidating the online support available for EES. This is currently spread across the Elsevier website but, going forward, generic information on EES will be available on Elsevier.com, while EES support information will be presented within EES itself. That means that if you click on Help in EES, a pop-up window will open in which you can quickly access the right support content. The content will be presented per role and per phase in the editorial process to make it easier for you, and the search function will also be available in the window.


Figure 2. The new help window.

Future improvements

Elsevier has a number of user feedback programs and the results of these, along with the questions end users ask Elsevier customer support, are just some of the sources we call on when determining which improvements we should introduce. You can also provide feedback via evise@elsevier.com.

Author Biography

Edward O'Breen
MARKETING AND BRAND MANAGER, EES AND EVISE
Edward has worked on the development and launch of new products and services since 1997. Prior to joining Elsevier in 2011, he worked for telecom operators, utilities and publishers. He has an MSc degree in Business Administration from the Rotterdam School of Management, Erasmus University Rotterdam.



Exploring Improvements to the Peer-Review System

Peer review has a long history; it has been a part of scientific communication since the appearance of the first journals in the 1660s. The Philosophical Transactions of the Royal Society is credited with being the first journal to introduce peer review. Each year more than 1.3 million learned articles are published in peer-reviewed journals. Such is its […]

Read more >


Peer review has a long history; it has been a part of scientific communication since the appearance of the first journals in the 1660s. The Philosophical Transactions of the Royal Society is credited with being the first journal to introduce peer review.

Each year more than 1.3 million learned articles are published in peer-reviewed journals. Such is its importance that, according to Ziman (1968) [1], it is ‘the lynchpin about which the whole business of science is pivoted.’

However, the expansion of the global research community and the year-on-year increase in the number of papers published mean the pressure on the peer-review system has grown. Moreover, as the pressure has increased, so too has the number of voices questioning peer review’s effectiveness. Some are worried by bias and are concerned it is not objective; others are anxious about the length of time it takes for an article to go through the peer-review process; and some worry about its efficiency. Richard Smith [2], former Editor of the BMJ, said the following about peer review in 2006:

“… it is slow, expensive, profligate of academic time, highly subjective, something of a lottery, prone to bias, and easily abused.”

In response to the perceived challenges, peer review has evolved and continues to do so. Working with you, the Editor, we hope to be able to improve and streamline the peer-review process, ultimately easing the burden on both reviewers and Editors. In this article, we take a closer look at initiatives in Elsevier that tackle some of the challenges in peer review and evaluate the progress of some of these pilots.


Peer Review Grand Challenge

Running from March to May 2012, this web-based Challenge invited submissions on any idea that could significantly improve the current peer-review system. Entries could range from designing a completely new system to working within an existing peer-review method (like the single-blind system).

The Challenge also welcomed entries that explored how publishers and Editors can help early career researchers become reviewers, or how reviewers can be recognized by either their institutes or publishers.

The entry phase of the Challenge closed on 7th May and the judges are now going through the submissions to pick out up to 10 finalists, whose ideas will be posted on the Challenge website.  We will be inviting comments from the community on these ideas before the judges make their final decisions, taking into account any relevant community comments. Please do check the Peer Review Challenge website from 12th June onwards for details of the finalists!

For more information on this initiative, please contact Clare Lehane, Executive Publisher, STM Publishing, c.lehane@elsevier.com

Cascading of manuscripts

Figure 1. The Article Transfer Service at a glance.

As an Editor, you may frequently be confronted with manuscripts that are out of scope or simply not suitable for your journal, yet still contain sound research. With this in mind, we have developed the complementary Article Transfer Service (ATS), which allows a paper to be moved to a more appropriate journal. Currently, Editors within the fields of Pharma Sciences, Physics and Immunology are able to offer authors this option and, if the author agrees, we can promptly transfer the manuscript on their behalf.

Key advantages of the Article Transfer Service:

  • Editors can make faster, more informed decisions on manuscripts;
  • authors receive faster decisions without the need to reformat or resubmit;
  • reviewers benefit from a lighter burden due to a reviewer sharing policy where reviews have already taken place; and
  • authors can publish in a journal that maximizes the impact of their research.

Results so far (a worked example follows the list):

  • Editors have offered to transfer up to 35% of rejected manuscripts and up to 35% of offered transfers have been taken up by authors.
  • Up to 20% of those transfers have been accepted by the receiver journals.
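Putting those three ‘up to’ percentages together gives a feel for the end-to-end funnel. A quick back-of-the-envelope calculation in Python (the 1,000-manuscript starting point is invented for illustration):

    # Upper-bound ATS funnel implied by the figures above.
    rejected = 1000
    offered = rejected * 0.35      # editors offer a transfer: up to 350
    transferred = offered * 0.35   # authors take up the offer: up to ~122
    accepted = transferred * 0.20  # receiver journals accept: up to ~24
    print(round(offered), round(transferred), round(accepted))

So, at these upper bounds, roughly 2-3% of rejected manuscripts would find a home through the service.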

We also surveyed a number of participants in the ATS scheme and discovered the following:

  • 67% of Editors think that the ATS benefits the authors, while 75% agree that having reviewer reports is beneficial.
  • 55% of authors are active promoters of the scheme.
  • 86% of the reviewers are willing to recommend an alternative journal to the Editor.

For more information on this pilot, please contact John Lardee, Senior Project Manager, Publishing Services, j.lardee@elsevier.com.


The Reviewer Guidance Program

From feedback we know that reviewers, especially those new to the task, would value more guidance on how to peer review. This program, which is still in the developmental stages, has been created to answer that need and will consist of both theory and hands-on practice.

Theory: By attending a Reviewer Workshop, participants will be introduced to the concept and background of peer review as well as peer-review fundamentals, publication ethics and the role of a reviewer. They will also examine a specific case study. Reviewer Workshops have been taking place for a while now and participants have told us that they feel more confident after attending one. Since it is not always possible to physically attend a workshop, we are now looking into the possibility of offering a distance learning (online) alternative.


Hands-on practice under mentorship: This part of the Reviewer Guidance Program aims to provide participants with the experience of independently reviewing at least two manuscripts inside a specially created EES (Elsevier Editorial System) site. Each trainee is supported by a mentor who discusses the reviews with them and gives feedback and guidance. The mentor ultimately decides when a trainee has gained enough experience to start reviewing live manuscripts. After the program, each trainee receives a certificate of participation from Elsevier. We began piloting this module at the end of last year and the first feedback is promising. One trainee commented: “I’m now more familiar with rating papers and I’m more critical when I read papers.” The mentors involved in this module, often journal Editors, also see the benefits of this initiative; one remarked: “This module is a nice opportunity to learn how to efficiently review manuscripts. Often junior scientists have no idea how it works. As well, they can better understand how their manuscripts will be reviewed.”

During the Reviewer Guidance Program we will guide participants in how to write review reports in such a way that they answer the needs of both the Editor and the author. Another expected benefit is that the program should contribute to increasing the number of trusted – and usually enthusiastic – reviewers available for Editors to call on. Irene Kanter-Schlifke is a Publisher for Pharmacology and Pharmaceutical Sciences and is closely involved in the pilot. She adds: “The Reviewer Guidance Program is not only an experience that helps early career researchers become better reviewers, but also to be more critical in analyzing their own papers before submitting. In addition, this is a great opportunity for junior scientists to network with their more senior peers.”

If you are interested in organizing a Reviewer Workshop at your institute, please contact your publisher.

Results so far: We are currently evaluating feedback and expect to do a further pilot in due course.

For more information on this program, please contact Irene Kanter-Schlifke, Publisher Pharmacology & Pharmaceutical Sciences, STM Publishing, i.kanter@elsevier.com, or Angelique Janssen, Project Manager, Publishing Services, a.janssen@elsevier.com.


Published reviewer reports

Reviewers play a vital role in the peer-review process, yet their contribution often remains hidden. Open reviewer reports increase peer-review transparency and help good articles gain authority. With that in mind, we thought: why not publish reviewer reports alongside the final article on SciVerse ScienceDirect?

At the beginning of this year, we began doing just that on the journal Agricultural and Forest Meteorology.

We know from the feedback we have received that Editors welcome such a public acknowledgement of reviewers’ contributions, and we hope this step will enhance the quality of the review reports and help to attract good reviewers for the journal.

How does it work?

Both authors and reviewers for the journal are informed about the new process and reviewers can indicate whether they want their name disclosed on ScienceDirect. Editors then decide if the reviewer reports are appropriate to publish alongside the article as supplementary material.

Results so far: The pilot launch attracted positive international media attention. It was also suggested that open reviewer reports could play a useful role in training early career researchers as reviewers.  So far, reviewer reports have been published alongside around 13 manuscripts.

For more information on this pilot, please contact Gilles Jonker, Executive Publisher, Physical Sciences, g.jonker@elsevier.com.

Open peer commentary format

In this pilot, we have asked experienced researchers to submit a one-page comment on a (review) article for the journal Physics of Life Reviews. These comments are published in the same issue as the article. On average, five comments are published with each article, and the author can write a rebuttal article.

Figure 2. An example of an article with open peer commentary in SciVerse ScienceDirect.

Results so far: Since the pilot was launched in January 2010, the journal has seen an increase in papers (85 in 2011 and 74 in 2010; previously the journal received around 12 papers per year). There has also been a sharp increase in usage – roughly 3,000 downloads per month compared to 2,000 per month in 2009.

For more information please contact Charon Duermeijer, Publishing Director Physics, c.duermeijer@elsevier.com.

PeerChoice

Traditionally in peer review, Editors have chosen to approach reviewers they consider suitably qualified to comment on a manuscript, or who would find the subject matter interesting.

But what if reviewers could select manuscripts themselves? For a year now, we have been experimenting with this additional peer-review system on the journal Chemical Physics Letters. Each week, a selected pool of reviewers receives an overview of the new submissions. If a paper matches their expertise and interest, they can decide to review it. Because they make the decision themselves, we ask them to review the manuscript within a week.
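A toy sketch of the self-selection step in Python. Everything here - the helper name, the two-reviewer cap and the data shapes - is invented for illustration; the article only specifies the weekly overview and the one-week due date.

    from datetime import date, timedelta

    REVIEW_WINDOW = timedelta(weeks=1)  # the pilot's one-week turnaround

    def claim_review(reviewer, manuscript, claims, today=None, max_claims=2):
        """Record a reviewer's self-selected claim on a manuscript and
        return the one-week due date; `max_claims` caps how many
        self-selected reviewers a paper can have (an assumption)."""
        today = today or date.today()
        claimants = claims.setdefault(manuscript, [])
        if len(claimants) >= max_claims:
            return None  # enough reviewers have already picked this paper
        due = today + REVIEW_WINDOW
        claimants.append((reviewer, due))
        return due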

Martin Tanke, Managing Director of Elsevier’s STM Journals, explains: “The 2009 Peer Review Survey, which we conducted with our partner Sense About Science, showed that a significant number of reviewers were sometimes hesitant to review an article because of a lack of expertise in that particular field. In addition, researchers made clear they want to improve peer review by improving article relevancy and speeding up turnaround time. PeerChoice can contribute to solving both issues.”

Figure 3. An example of the email overview a reviewer receives.

Results so far: The time taken to review the manuscript has been slightly reduced, while the time taken to accept an invitation has been halved.

For more information on this pilot, please contact Egbert van Wezenbeek, Director Publication Process Development, Publishing Services, e.wezenbeek@elsevier.com.

All these pilots have been launched with one aim in mind: to support and improve the peer-review process to the benefit of Editors, authors and reviewers.

We would love to hear your thoughts on these new approaches and your suggestions for improvements. If you have a story you would like to share, you can post it on our new Short Communications bulletin board.

1. Ziman JM. Public Knowledge: An Essay Concerning the Social Development of Science. London: Cambridge University Press, 1968.

2. Smith R. Peer review: a flawed process at the heart of science and journals. J R Soc Med 2006;99(4):178–182.


Author Biographies

John Lardee
SENIOR PROJECT MANAGER, PUBLISHING SERVICES
For the last 15 years, John has been involved in managing projects to improve author, Editor and reviewer experiences with Elsevier’s products and services. Recent projects include the Article Transfer Service and the Find Reviewers tool. John’s approach to project management is an agile one: “To develop services and products iteratively together with our Editors, authors and reviewers”. John has a Master of Science in Informatics from Delft University of Technology.


Adrian Mulligan
DEPUTY DIRECTOR, RESEARCH & ACADEMIC RELATIONS
Adrian has 14 years of experience in STM publishing. The last 10 of those years, spent in research, have given him a unique opportunity to study the scholarly community. Recently, in partnership with Sense About Science, Adrian worked on a large-scale study that examined the attitudes of researchers towards peer review. He has presented on peer review at various conferences, including STM, ESOF, AAP and APE. Adrian’s background is in archaeology, with a BA Honours degree and a Master of Science from Leicester University. He also has a diploma in Market Research from the Market Research Society.



Elsevier Peer Review Challenge is Now Open for Entries!

On March 28th, Elsevier launched the ‘How do you see the future of peer review?’ challenge. We hope that this challenge will help inform the ongoing discussions on peer review.

Read more >


Clare Lehane | Executive Publisher, Energy & Planetary Sciences, Elsevier

On Wednesday 28th March, Elsevier launched the ‘How do you see the future of peer review?’ challenge. The aim of the challenge is to invite our reviewing community to submit ideas on any of the following three aspects of the peer-review system (for journals):

  • The peer review process itself – new approaches or enhancements of current approaches
  • Approaches to help early career researchers to become reviewers
  • Improving the recognition and rewarding of reviewers by their institutions and/or journal publishers

The challenge website will remain open to entries until midnight on Monday 7th May, 2012 (CET).

We will work with the overall winners of the challenge to determine if their idea could be piloted with a suitable Elsevier journal, and in cooperation with the editors of that pilot journal. The winning ideas will be announced around 15th August via the challenge website. 

We hope that this challenge will help inform the ongoing discussions on peer review and help us, as your publishing partners, to work more closely with the reviewing community.

You are welcome to forward this challenge announcement to your colleagues and editorial network to encourage submissions.


Finding and Retaining Reviewers

Discover new ways to identify and retain the best reviewers in your field, how to motivate them to do a good job, and how to encourage them to review for you again.

Read more >


Discover new ways to identify and retain the best reviewers in your field, how to motivate them to do a good job, and how to encourage them to review for you again.


Peer Review and the Role of the Editor

Editors today are confronted with a number of challenges to the peer-review process, for example, finding reviewers. That means new and different approaches are required, Frank H Arthur writes.

Read more >


Frank H Arthur | USDA, Agricultural Research Service, Center for Grain and Animal Health Research | Regional Editor-in-Chief, Journal of Stored Products Research

I have been a Regional Editor of the Journal of Stored Products Research since November 2006, and continue to serve as a reviewer for other scientific journals. Editors today are being confronted with a number of challenges to the peer review process, including obtaining the peer reviews necessary to evaluate scientific studies for journal publication. New and different approaches are necessary to cultivate and maintain a solid base of reviewers.

First, editors must become more active in pre-screening manuscripts before they are sent out for review. As a reviewer, I regularly receive manuscripts that are severely deficient in English grammar and construction, along with the stated or implicit assumption that it is also my responsibility to re-write these manuscripts in addition to evaluating the scientific content. This expectation places an unfair burden on reviewers and editors, who are usually serving on a volunteer basis. Related issues include being sent manuscripts that are obviously lacking in scientific quality for that journal, out of scope, or in a completely different format from what is specified.  Receiving these types of manuscripts increases frustration on the part of reviewers, and editors can, and should, simply return those manuscripts to the authors and let them address the deficiencies. The authors are ultimately responsible for the quality of the manuscript.      

Second, editors should focus on obtaining reviews from scientists who are actively publishing in their journal. Every month I receive several automatic “invitation to review” emails from journals where I have not published in the past, nor am I likely to in the future, including various new online journals. Many scientists will decline those invitations unless there is overwhelming interest in the topic of the paper. I also receive numerous requests for reviews from journals where I have published only sporadically as a submitting or lead author, and often not at all for the past several years. Regular contributors have a more vested interest in the journal but, at the same time, editors must not continually ask the same people to review because “they cannot find anyone else”. Efforts must be made to broaden the review base and increase participation in the review process.

Third, assuming reviews are being solicited from regular contributors to a journal, editors should first make personal contact with reviewers instead of just generating an “invitation to review” email. However, if the reviewer declines a review because of their current workload, the editor should go to someone else, rather than asking the reviewer for a suggested alternative. In my experience, many scientists will not do a review if they know a colleague has declined because he or she was “too busy”, because they are busy as well. I do not suggest colleagues when I decline a review unless that person is more appropriate because of their expertise, and I generally let them know that I have, or will, recommend them as a reviewer.         

Within many biological disciplines, the number of professional scientists is declining, pressure to obtain outside funding is increasing, and research scientists are being required to perform administrative functions as well. The steps discussed above are just a few ways editors can facilitate the peer review process to ease the burden on journal reviewers.


Fact-finding Mission to China Provides Key Insights

“In China, the research community is gaining year on year in resources and ability. That is very exciting to be around.” — Tracey Brown, Managing Director, Sense About Science Tracey Brown believes peer review is vital to good science and the society that uses it. And it’s a conviction the Managing Director of Sense About […]

Read more >


"In China, the research community is gaining year on year in resources and ability. That is very exciting to be around.” — Tracey Brown, Managing Director, Sense About Science

Tracey Brown believes peer review is vital to good science and the society that uses it.

And it’s a conviction the Managing Director of Sense About Science shares with members of the Chinese Academy of Sciences, as she discovered during a trip to the research-rich country in March this year.

Brown embarked on the fact-finding mission with two key aims in mind: she was keen to test out views advanced about the integration of Chinese authors and reviewers into international STM publishing, and to explore future collaborations to help researchers, policy makers and journalists identify the best science.

During the two-week visit, which was supported by Elsevier, she met not only with the Chinese Academy of Sciences (CAS), but also with Science.net, journalists, post docs and publishers.

Brown says: “It was clear that CAS is keen to discuss the best ways to evaluate research and to explore their concerns about what peer-reviewed publishing can - and can't - deliver. In an effort to avoid cronyism and subjective assessment in China, there has been a shift towards using flatter measurements; for example, the Impact Factor. There is a feeling, however, that these do not reveal enough about individual papers or the research output of an institution. Most people, including CAS, are coming to the conclusion that what we really need is a mix of the two.”

Gaining new understanding

Asked to highlight some of her key learnings during the trip, Brown says:

“People raised many interesting points and some quite contradictory ones. The early career researchers I spoke with viewed international journals as motivated by quality and fairness, and in some cases compared them favourably with Chinese journals, which can be seen as wedded to the relationships and prestige of individuals and institutions.

“On the other hand, some of the more editorially-experienced people had stories of less than positive attitudes among international editors to Chinese papers. They were concerned about a head-in-the-sand approach to such a major research base and that valuable new insights could be missed.”

Other key take-aways for Brown include:
  • General agreement that reviewing is an important part of the role of a researcher. However, involvement in it varies enormously.
  • As in many other countries, a researcher's day is structured in a way that makes it difficult to find time to review and their career progresses in response to grants and publications, not time spent reviewing. Views differed widely about the problems this posed and whether it inhibited the reviewing effort.
  • A strong interest in training, both for authors and reviewers.
  • Interest in other metrics for evaluating research output, the respective contributions of regions/countries and the performance of individual institutions.
Visiting the CAS

Tracey Brown; David Ruth, Elsevier Senior Vice President Global Communications; and Hugo Zhang, Elsevier Managing Director S&T China (left) meet with Mr Jinghai Li, Vice President of the Chinese Academy of Sciences (right)

Peer-review progress

There were also a few eye-opening moments for Brown.

She explains: “I had not expected people’s personal experiences to differ so widely. For example, I was speaking to two post docs at Shanghai Jiao Tong University. Both had published very successfully early in their careers in some of the top journals - the elite of the elite. One was receiving almost weekly requests to review while the other had received only one request in a year. That may reflect the different nature of their papers, but I heard their stories repeated elsewhere. It is perhaps to be expected that peer-review requests from international journals are still a bit hit and miss in China.”

She adds: “Each time I was about to draw a conclusion about anything I would meet someone who took me in a different direction – a symptom, I imagine, of things being a work in progress there.

“Another surprising thing for me was the high level of confidence in the research community in contrast to the UK, and perhaps the US, where universities face straitened circumstances. In China, the research community is gaining year on year in resources and ability. That is very exciting to be around.”

The pressure to publish

Commenting on the quality – and quantity – of papers submitted by Chinese researchers, Brown says: “There is some concern, internationally, about filtering the sheer weight of papers produced by China. A big sea of papers makes it difficult to pick out the best.

“The thing is, there is a large pressure to publish in China and doing so in international journals brings career breaks and prestige. While lead institutions no longer pay incentives for this, some second-tier universities still appear to, which may contribute to journals being overwhelmed by unsuitable papers.

“We discovered that inappropriate submissions also stem from a lack of local knowledge about international journals, with younger researchers copying where their supervisors have published. Library services can play a very important role in widening the pool of journals considered.”

She adds: “Since returning I have been in touch with members of the Publishing Research Consortium to discuss the prospect of looking at how these new regions, such as China and India, are being integrated. Do editors now need something different from publishers with regard to support and advice? These are questions I know publishers are asking too. There is clearly some opportunity for international publishers to improve the availability of information about how to publish and where to publish, probably via librarians in those institutions where library services are developing and pro-active.”

Looking to the future

And what does Brown think the next five years will hold for the Chinese research community?

“Because of the volume of research and population size, even minority behaviors in China are likely to have a significant effect.  If just a proportion of the new generation of researchers are trained and engaged with reviewing, it could have a big impact on sharing the reviewing burden. I know that there are already programs underway, such as Elsevier’s Reviewer Workshops and Reviewer Mentorship Program. The value of their contribution to the research output cannot be overstated – just like so many other things in China at the moment!”

What is Sense About Science?

Sense About Science is a UK charitable trust that equips people to make sense of science and evidence on issues that matter to society. With a network of more than 4,000 scientists, the organization works with scientific bodies, research publishers, policy makers, the public and the media, to lead public discussions about science and evidence. Through award-winning public campaigns, it shares the tools of scientific thinking and the peer-review process. Sense About Science’s growing Voice of Young Science network engages hundreds of early career researchers in public debates about science. Sense About Science will be publishing a Chinese edition of its public guide to peer review I Don’t Know What to Believe early in 2012 in collaboration with learned societies, patient groups and journalists.


Author Biography 

Tracey Brown
MANAGING DIRECTOR OF SENSE ABOUT SCIENCE
Tracey has been the Director of Sense About Science since shortly after it was established in 2002. Tracey is a trustee of Centre of the Cell and MATTER. In 2009 she became a commissioner for the UK Drugs Policy Commission. She sits on the Outreach Committee of the Royal College of Pathologists and in 2009 was made a Friend of the College.


A Helping Hand for Early Career Reviewers

“A real-life, hands-on approach like this equips future reviewers like never before.” — Irene Kanter-Schlifke, Publisher In many areas of research, the growth of paper submissions is outpacing the growth of qualified reviewers and resulting in pressure on the peer review system. As an editor, you will be only too aware of the challenge of […]

Read more >


"A real-life, hands-on approach like this equips future reviewers like never before." — Irene Kanter-Schlifke, Publisher

In many areas of research, the growth in paper submissions is outpacing the growth in the number of qualified reviewers, putting pressure on the peer-review system. As an editor, you will be only too aware of the challenge of finding good reviewers. Together with our editorial community, journal publishers at Elsevier have been working on a number of programs to develop and nurture your future pool of reviewers.

Reviewer Guidelines

Following a request from reviewers for increased support and guidance, and testing by current journal editors, the Reviewer Guidelines are now available on all Elsevier journal homepages and on our Reviewers’ homepage.

A step-by-step guide through the various stages of the peer-review process, the guidelines begin with the ‘purpose of peer review’ (addressing why reviewers should review); move on to conducting the review itself (what criteria the reviewer should take into account); and finish with submitting the report to the editor. They cover key topics relevant to peer review, such as conducting the review, originality of research, the structure of a paper and ethical issues, together with a sample peer-review report.

Reviewer Workshops – the next step

Reviewer Workshops allow participants to put the Reviewer Guidelines into context. “They aim to promote and explain the fundamentals and techniques that reviewers should adhere to when reviewing manuscripts for academic journals,” explains Andrea Hoogenkamp-O’Brien, Customer Communications Manager. Such workshops have been taking place across China with input from some Elsevier journal editors, giving young Chinese scientists the opportunity to review scientific papers for international journals and to get hands-on training.

Reviewer Workshops held in China

During a workshop, reviewers receive practical information on Elsevier publishing policies and procedures, together with advice from other reviewers and editors, all with the aim of expediting the process of reviewing papers. Throughout the sessions, there is thorough discussion of the philosophy of peer review, the various steps of the review process and examples from recent journals.

“The result is that reviewers get a real opportunity to better understand the principles and methods involved in reviewing for an international journal,” notes Hoogenkamp-O’Brien. "This is invaluable experience for the next step in our program."

Reviewer Mentorship Program

This program aims to extend the help given to reviewers during workshops by also providing coaching and direct feedback on the reports that trainees have submitted. Elsevier Publisher Irene Kanter-Schlifke has been piloting this program at two institutions: Lille University in France and Saarland University in Saarbrücken, Germany. Each program involved 10-12 trainees.

The Reviewer Mentorship Program consists of two parts:

  • Part one – organization of the workshop itself at an institute or university. The journal publisher works together with an editor who is affiliated to the institute or university.
  • Part two – the setting up of a support EES site (our online submission, peer-review and editorial system) which is populated with original manuscripts selected by the editor. This is due to go live shortly.

Before the workshop, trainees must:

  • review an original manuscript;
  • complete the journal’s reviewer checklist; and
  • submit their report to the workshop tutors (the publisher and the editor).

“It is important that the trainees review a manuscript that is both controversial and in their area of expertise. During the workshop, an introduction on reviewing is given, followed by a discussion of the review and disclosure of the original ‘fate’ of the paper (the reviews and the final article, if accepted for publication). A real-life, hands-on approach like this equips future reviewers like never before,” explains Kanter-Schlifke.

After the workshop, trainees are invited through the support EES site to review at least two manuscripts within a given timeframe. Each trainee is supported by a mentor, who discusses the reviews with the trainee and gives feedback and guidance. The mentor ultimately decides when a trainee has gained enough experience to review live manuscripts. After the program, each trainee receives a certificate of participation from Elsevier.

“There are a few thoughts on what defines a good reviewer,” adds Hoogenkamp-O’Brien. “The definition I particularly like is: A good reviewer should know the journal and should have the knowledge to be able to fairly and objectively give a good report of the manuscript they are reviewing. They should concentrate on offering useful advice to authors rather than giving summary reports to editors.”

If you are interested in running either a Reviewer Workshop or Reviewer Mentorship Program at your institute or would like some further information, please email Editors' Update.

We want to hear your views on these and other issues surrounding the challenges faced by editors and peer review. Please share your thoughts by posting a comment at the bottom of this page.

View a videocast of a Reviewer Workshop in China

Author Biographies

Irene Kanter-Schlifke
PUBLISHER
In 2008, Irene began work as a Publisher for Elsevier’s Pharmacology and Pharmaceutical Sciences portfolio of journals. In this role, she has been working on a number of initiatives with her editors and colleagues, one of which is helping to organize and run a mentorship program for new reviewers. She holds a PhD in Neurology from the Wallenberg Neuroscience Centre in Lund, Sweden. Before joining Elsevier, she worked at Centocor (now Janssen Biologics), part of Johnson & Johnson pharmaceuticals, in The Netherlands.

Andrea Hoogenkamp-O'Brien
CUSTOMER COMMUNICATIONS MANAGER
Andrea recently started working in the Strategy and Journal Services department of Elsevier in Amsterdam, where she is part of a team responsible for developing new initiatives to improve services for authors, editors and reviewers. She joined Elsevier from FEMS in Delft, where she worked as Editorial Coordinator, responsible for managing the publications unit that publishes five FEMS Microbiology journals. Prior to that, Andrea was a Postdoctoral Research Fellow at the University of Amsterdam.


Related Articles


A 20:20 Vision on the Future of Peer Review


Read more >


Of interest to: Journal editors (key), additionally authors and reviewers
Archive views to date: 845+
Average feedback: 4.4 out of 5

Poll: If experiments don't produce positive results they are often not published, yet they can help to progress research. Should we publish them?


Short Communications

  • Registrations are now open for the first Altmetrics Conference

    A conference dedicated to altmetrics - the first of its kind - will take place in London this September. Find out how you can register. Learn more

  • Registrations open for journal editor webinar series

    Registrations are now open for the remaining webinars in our 2014 series for journal editors. Learn more

  • Finding reviewers in EES just got easier…

Improvements to the Find Reviewers tool in EES have simplified the process of searching for potential referees. Learn more

  • An author’s experience of peer review

    Researcher Mounir Adjrad dwells on why constructive reviews are so important to the peer-review process. Learn more

  • The importance of gender balance for editorial teams

    We look at why a gender-balanced editorial team can be beneficial not only to your journal but to the research community at large. Learn more

  • Why I dedicated my journal editorial to open access

    UK-based editor, Professor Peter Griffiths, on why it's so important editors understand open access. Learn more

  • Warning regarding fraudulent call for papers

    Information from our Legal department about a fraudulent call for papers sent out in Elsevier's name. Learn more

  • Publishing skills webinar series for early career researchers

    Registrations are now open for our webinar series for ECRs which focuses on developing their publishing skills. Learn more

  • Discover the latest re. EES user profile consolidation

    Learn what a consolidated EES user profile can mean for ORCID profiles and managing EES accounts. Learn more

Other articles of interest

Webinars & webcasts

Upcoming webinars

Trends in journal publishing
Thursday 18th September, 2014

How to make your journal stand out from the crowd
Tuesday 21st October, 2014

Discover our webinar archive. This digital library features both Elsevier and external experts discussing, and answering questions on, a broad spectrum of topics.

Learn more about our growing library of useful bite-sized webcasts covering a range of subjects relevant to your work as an editor, including ethics, peer review and bibliometrics.

Editors’ Conferences

Boston, USA (program TBC)
21-23 November, 2014

Learn more about these forums for dialogue with, and between, our senior editors.