ReproducibiliTea

About ReproducibiliTea

Join us at the ReproducibiliTea journal club for an informal discussion about how we can improve transparency, reproducibility and openness in research – with tea (and other refreshments)!

We welcome colleagues and research students from any discipline, regardless of experience or expertise.

We meet on the third Thursday of each month from 12.00 - 13.00 in the Philip Robinson Library Collaborative Learning Hub (unless indicated otherwise). Participants who are unable to attend in person are welcome to join us via Teams.

The programme of events and articles for discussion is set out below. You're welcome to just drop in to any of these meetings, but if you want to join online or stay up-to-date with these and other events, please join us on Teams or on our mailing list.

ReproducibiliTea is run by our UKRN local network leads, Nicola Howe and Clement Lee. You can contact them both via ukrn@newcastle.ac.uk.

Programme for 2024/25

Date | Speaker(s) and Topic
19/09/24

Franziska Hartung, Centre for Behaviour and Evolution.

"The Hitchhiker's guide to Open Science"   

17/10/24

Chris Moreh, School of Geography, Politics and Sociology.

Has the Credibility of the Social Sciences Been Credibly Destroyed? Reanalyzing the “Many Analysts, One Data Set” Project 

21/11/24

Cong Zhang, School of Education, Communication and Language Sciences.

Open Science in Linguistics and Beyond.

The movement toward open science is transforming research across disciplines, and linguistics is no exception. This seminar will explore how my collaborators and I have promoted open science in linguistics to enhance transparency and reproducibility. Key practices, such as sharing data, tools, scripts, and manuscripts, will be highlighted as ways to incorporate open science into linguistic research. Additionally, I will introduce the concept of "Big Team Science", exploring how collaborative, large-scale research models can be applied to linguistics to address complex questions and cultivate a more diverse research environment.

05/12/24 

Vivek Nityananda, Biosciences Institute.

Incentivising good research practices: what we can do to encourage open and more rigorous research.

16/01/25

Kirsten Brandt, Population Health Sciences Institute; Caroline Allen, School of Psychology.

Personality traits and acceptance of a subject-specific disruptive hypothesis 

20/02/25

Sam Orange, School of Biomedical, Nutritional and Sport Sciences. 

Change of venue: This meeting will be held in the Philip Robinson Library Committee Room. Please follow the signage from reception.

20/03/25

Chris Harrison, School of Mathematics, Statistics and Physics.

Audio Universe: An Open Approach to Turning the Universe into Sound 

17/04/25

Helen Gray, School of Natural and Environmental Sciences.

15/05/25

Clement Lee, School of Mathematics, Statistics and Physics. 

[Retracted] High replicability of newly discovered social-behavioural findings is achievable.

19/06/25

Jordan Cuff, School of Natural and Environmental Sciences.

The case for open research in entomology: Reducing harm, refining reproducibility and advancing insect science.

This programme will be updated as soon as speakers and topics are confirmed for the remaining journal club events.

Programme for 2023/24

Date | Paper
21/09/23 Bishop, DVM. (2018). Fallibility in Science: Responding to Errors in the Work of Oneself and Others. Advances in Methods and Practices in Psychological Science, 1(3), 432-438. DOI: 10.1177/2515245918776632. (Accepted manuscript)
19/10/23 Allen, C, Mehler, DMA (2019) Open science challenges, benefits and tips in early career and beyond. PLoS Biology 17(5): e3000246. DOI: 10.1371/journal.pbio.3000246
16/11/23 Kowalczyk, OS. et al. (2022) What senior academics can do to support reproducible and open research: a short, three-step guide. BMC Research Notes 15, 116. DOI: 10.1186/s13104-022-05999-0
15/02/24 Hagger, MS (2022) Developing an open science ‘mindset’, Health Psychology and Behavioural Medicine, 10:1, 1-21, DOI: 10.1080/21642850.2021.2012474 
21/03/24 Gabelica, M et al. (2022) Many researchers were not compliant with their published data sharing statement: a mixed-methods study. Journal of Clinical Epidemiology, 150, 33-41. DOI: 10.1016/j.jclinepi.2022.05.019
18/04/24 Lakens, D (2023) Is my study useless? Why researchers need methodological review boards. Nature 613, 9, DOI: 10.1038/d41586-022-04504-8
16/05/24 Schwab, S et al. (2022) Ten simple rules for good research practice. PLoS Computational Biology 18(6): e1010139. DOI: 10.1371/journal.pcbi.1010139
20/06/24 Bring a paper from your discipline for a roundtable discussion of how open research practices were applied and what we can learn from it.

Previous events

Date | Paper
20/10/22 Ioannidis JPA (2005) Why Most Published Research Findings Are False. PLoS Med 2(8): e124. https://doi.org/10.1371/journal.pmed.0020124 
17/11/22 Wagenmakers, E.-J., Dutilh, G., & Sarafoglou, A. (2018). The Creativity-Verification Cycle in Psychological Science: New Methods to Combat Old Idols. Perspectives on Psychological Science, 13(4), 418-427. https://doi.org/10.1177/1745691618771357
15/12/22 Klein, Richard A., et al. (2018) Many labs 2: investigating variation in replicability across sample and setting. Advances in Methods and Practices in Psychological Science. 1(4), 443-490. https://doi.org/10.1177/2515245918810225
Postponed due to UCU strike action: Wilson G, Bryan J, Cranston K, Kitzes J, Nederbragt L, et al. (2017) Good enough practices in scientific computing. PLOS Computational Biology 13(6). https://doi.org/10.1371/journal.pcbi.1005510
20/04/23 Zwaan, R. A., Etz, A., Lucas, R. E., & Donnellan, B. (2018, August 1). Making Replication Mainstream. Preprint. https://doi.org/10.31234/osf.io/4tg9c
18/05/23 Tennant JP, Waldner F, Jacques DC et al. The academic, economic and societal impacts of Open Access: an evidence-based review. F1000Research 2016, 5:632. https://doi.org/10.12688/f1000research.8460.3
Postponed due to UCU strike action: Hussey, I., & Hughes, S. (2018, November 19). Hidden invalidity among fifteen commonly used measures in social and personality psychology. Preprint. https://doi.org/10.31234/osf.io/7rbfp
Moved to 2024: Bishop, DVM. (2018). Fallibility in Science: Responding to Errors in the Work of Oneself and Others. Advances in Methods and Practices in Psychological Science, 1(3), 432-438. https://doi.org/10.1177/2515245918776632. (Accepted manuscript)