A need for science to ‘slow down’? Experiences from the British Neuroscience Association Festival of Neuroscience

By Alice Stephenson (PhD Student)

BNA 2019 Festival of Neuroscience, Dublin

In April, I was fortunate to attend the annual and prestigious British Neuroscience Association (BNA) Festival of Neuroscience. The conference provided a unique opportunity to engage with contemporary interdisciplinary neuroscience research from across the UK and internationally. Spread over four days, the event hosted workshops, symposia, keynote lectures and poster presentations. The festival covered 11 neuroscience themes; as an experimental psychologist, I was particularly interested in those relating to attention, motivation and behaviour; sensory and motor systems; and neuroendocrinology and autonomic systems.

Not only was this a chance to engage with cutting-edge interdisciplinary neuroscience research, it was also a chance to develop the skills to become a more meticulous researcher. It was clear that an overall goal of the conference was to encourage high standards of scientific rigour by embracing the open science movement.

“Fast science… too much rubbish out there we have to sift through.”

Professor Uta Frith

The open science movement encompasses a range of practices that promote transparency and accessibility of knowledge, including data sharing and open access publishing. The Open Science Framework is one tool that enables users to create research projects and encourages sharing hypotheses, data, and publications. These practices encourage openness, integrity, and reproducibility in research, something particularly important in the field of psychology. 

An especially striking claim was made by Ioannidis (2005): “most published research findings are false.” Ioannidis argued that there is a methodological crisis in science, particularly apparent in psychology, but also in cognitive neuroscience, clinical medicine, and many other fields (Cooper, 2018). If an effect is real, any researcher should be able to obtain it using the same procedures with adequate statistical power. However, many scientific studies are difficult to replicate. Open science practices have been suggested to enable accurate replications and facilitate the dissemination of scientific knowledge, improving scientific quality and integrity.

The BNA has a clear positive stance on open science practices, and I was lucky enough to be a part of this. Professor Uta Frith, a world-renowned developmental neuropsychologist, gave a plenary lecture on the three R’s: reproducibility, replicability, and reliability, which was arguably one of the most important and influential lectures of the conference.

Professor Frith summed up the scientific crisis in two words: “Fast Science.” Essentially, science is progressing too fast, leading to lower-quality research. Could this be due to an increase in people, labs, and journals? Accelerated communication via social media? Pressure and career incentives for increased output? The sheer volume of pre-prints available to download? Professor Frith argued that there is “too much rubbish one has to sift through.”

A potential solution to this is a ‘Slow Science’ movement: the notion of “resisting quantity and choosing quality.” Professor Frith argued the need for the system to change. We often hear about the pitfalls of the peer review process, yet Professor Frith provided us with some novel ideas. She argued for a limited number of publications per year; this would encourage researchers to spend quality time on one piece of research, improving scientific rigour, while excess work would find a home in other outlets. Only one grant at a time should be allowed. She also discussed the need for continuous training programmes.

A lack of statistical expertise in the research community?

Professor Frith argued that there is a clear lack of statistical knowledge in the research community. With increasing computational advances, it is becoming easier and easier to plug data into a function and accept the outcome. Yet we must understand how these algorithms work so we can spot errors and notice illogical results.

This is something that spoke to me. I love working with EEG data. Analysing time-series data allows us to capture cognitive processes during dynamic and fast-changing situations. However, working with such rich and temporally complex data is technically challenging. The EEG signal is very small at the surface of the scalp, and the signal-to-noise ratio is poor. Artefacts, both non-physiological (e.g. computer hum) and physiological (e.g. eye movements), contaminate the recording, meaning that the EEG picks up not only neural activity but also other electrical signals we are not interested in. We therefore apply mathematical algorithms to clean the data and improve the signal-to-noise ratio. Once the data are cleaned, we apply further algorithms to transform them from the time domain (in which they are recorded) to the frequency domain. The number of EEG analysis techniques has risen hugely, partly thanks to growing computational power, and there is now a whole host of computational methods, including machine learning, that can be applied to EEG data.
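To make this concrete, here is a minimal sketch in Python of the kind of cleaning-and-transforming pipeline I mean, using NumPy and SciPy. The recording is simulated, and the sampling rate, filter settings, and 50 Hz mains-hum artefact are illustrative assumptions rather than my actual analysis pipeline.

```python
import numpy as np
from scipy import signal

# Illustrative assumptions: 1000 Hz sampling rate, 10 s of a single EEG channel
fs = 1000
t = np.arange(0, 10, 1 / fs)

# Simulated recording: a 10 Hz alpha oscillation buried in noise,
# plus 50 Hz mains hum standing in for a non-physiological artefact
rng = np.random.default_rng(0)
eeg = (2.0 * np.sin(2 * np.pi * 10 * t)
       + 1.5 * np.sin(2 * np.pi * 50 * t)
       + 3.0 * rng.standard_normal(t.size))

# Cleaning: notch out the 50 Hz hum, then band-pass 1-40 Hz
b_notch, a_notch = signal.iirnotch(w0=50, Q=30, fs=fs)
cleaned = signal.filtfilt(b_notch, a_notch, eeg)
b_band, a_band = signal.butter(4, [1, 40], btype="bandpass", fs=fs)
cleaned = signal.filtfilt(b_band, a_band, cleaned)

# Transform from the time domain to the frequency domain (Welch's method)
freqs, psd = signal.welch(cleaned, fs=fs, nperseg=2 * fs)
print(f"Peak power at {freqs[np.argmax(psd)]:.1f} Hz")  # should sit near 10 Hz
```

Even in this toy example, every step (notch filter, band-pass filter, Welch transform) changes the numbers that come out at the end – which is exactly why understanding what the algorithms are doing matters.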

Each time an algorithm is applied to the EEG data, the data change. How can an EEG researcher trust the output? How can an EEG researcher sensibly interpret the data and draw informed conclusions? Having an underlying understanding of what the mathematical algorithms are doing to the data is, no doubt, paramount.

Professor Frith is right: there is a need for continuous training, as data analysis is moving at an exhaustingly fast pace.

Pre-registration posters – an opportunity to get feedback on your method and planned statistical analyses

I also managed to contribute to the open science movement during the conference. On the second-to-last day, I presented a poster on my research looking at the temporal neural dynamics of switching between a visual perceptual task and a visuomotor task. This was not an ordinary poster presentation; it was a pre-registration poster. I presented planned work to be carried out, with a clear introduction, hypotheses and method, and I included a plan of the statistical analyses. There were no data, graphs, or conclusions.

The poster session was an excellent opportunity to get feedback from the neuroscience community on my method and statistical analyses. This is arguably the most useful time for feedback – before the research is carried out. It was particularly beneficial for me: coming from a very small EEG community, seeking outside expertise is vital. A post-doctoral researcher, who had looked at something similar during her PhD, gave me honest and informative feedback on my experimental design. In addition, I uploaded my poster to the Open Science Framework, and the abstract was published in the open access journal Brain and Neuroscience Advances. I also received a Preregistered badge for my work. These badges act as an incentive to encourage researchers to engage with open science practices. Check out cos.io/badges for more information.

So, what next?

Practical tools and significant support are coming together to allow open science to blossom. It is now our responsibility to be part of this. I’ve created an Open Science Framework account and plan to start there, detailing my aims, methods and data to improve transparency in my research. I’m making the most of the last year of my PhD to attend data analysis workshops, and I would like to pre-register my research in the near future. How do I contribute to the slow science movement? I can start by slowing down (perhaps saying no to additional projects?!), improving my statistical knowledge, and embracing open science practices.

Not only was the conference an incredible insight into multidisciplinary neuroscience research (I did not realise you could put a mouse in an MRI scanner, anaesthetised of course, as it would never keep its head still!), it also had an influential and motivating atmosphere. Thank you, British Neuroscience Association. Now, who else wants to join me in advocating open science, becoming a rigorous researcher, and improving scientific practices?!

References

Cooper, M. M. (2018). The replication crisis and chemistry education research. Journal of Chemical Education, 95, 1–2.

Ioannidis, J. P. (2005). Why most published research findings are false. PLoS Medicine, 2(8), e124.
