An Interview with Editor in Chief Eric Eich
One big thing is Enhanced Reporting on Methods, which includes a disclosure statement along these lines:
For each study reported in your manuscript, check the boxes below to:
(1) Confirm that (a) the total number of excluded observations and (b) the reasons for doing so have been reported in the Method section(s) [ ]. If no observations were excluded, check here [ ].
(2) Confirm that all independent variables or manipulations, whether successful or failed, have been reported in the Method section(s) [ ]. If there were no independent variables or manipulations, as in the case of correlational research, check here [ ].
(3) Confirm that all dependent variables or measures that were analyzed for this article’s target research question have been reported in the Method section(s) [ ].
(4) Confirm (a) how sample size was determined and (b) your data-collection stopping rule have been reported in the Method section(s) [ ] and provide the page number(s) on which this information appears in your manuscript:
Then they go on:
"Several points merit attention. First, as shown above, the four-item Disclosure Statement applies only “to each study reported in your manuscript.” Originally, we considered adding a fifth item covering additional studies, including pilot work, that were not mentioned in the main text but that tested the same research question. However, feedback from several sources suggested that this would open a large can of worms. To paraphrase one commentator (Leif Nelson), it is all too easy for a researcher to think that an excluded study does not count. Furthermore, this actually puts a meaningful burden on the “full disclosure” researcher. The four items in the Disclosure Statement shown above are equally easy for everyone to answer; either that information is already in the manuscript or they can go back and add it. But a potential fifth item, covering additional studies, is different. The researcher who convinces himself or herself that one or more excluded studies don’t count has now saved the hours it might take to write them up for this query. File-drawering studies is damaging, but we are not convinced that this will solve that problem. A better solution involves preregistration of study methods and analyses — an approach we also take up."
Another big item seems to be Promoting Open Practices. They go on:
"Over the past several months, a group of 11 researchers led by Brian Nosek has been grappling with these and other issues. The result is an Open Practices document that proposes three forms of acknowledgment:
- Open Data badge, which is earned for making publicly available the digitally shareable data necessary to reproduce the reported result.
- Open Materials badge, which is earned for making publicly available the digitally shareable materials/methods necessary to reproduce the reported results.
- Preregistered badge, which is earned for having a preregistered design and analysis plan for the reported research and reporting results according to that plan. An analysis plan includes specification of the variables and the analyses that will be conducted. Preregistration is an effective countermeasure to the file-drawer problem alluded to earlier in connection with Disclosure Statements.
The criteria for each badge — and the processes by which they are awarded — are described in the Open Practices document along with answers to frequently asked questions. The document proposes two ways for certifying organizations to award badges for individual studies: disclosure or peer review. For now, PS will follow the simpler disclosure method.
Manuscripts accepted for publication on or after 1 January, 2014, are eligible to earn any or all of the three aforementioned badges. Journal staff will contact the corresponding authors with details on the badge-awarding process.
Psychological Science is the first journal to implement the badge program, so changes are sure to come as editors and authors gain experience with it in the field. Again, I welcome comments and suggestions for improvement from our community."
I'll be curious about their usage of, and success with, the registry: I have blogged before about similar attempts in economics here.
A big impetus has been the work of Simmons, Nelson, and Simonsohn, some of which I talked about here.