The authors argue that considering the test-retest reliability of a perception or attention task is crucial if researchers wish to use the task to assess the impact of individual differences (e.g., traits, experience) on performance. The issue is rooted in the historical development of vision science tasks, which were often designed to minimise differences between participants in order to understand a cognitive mechanism more generally. With increased interest in the influence of individual differences on perception and attention, researchers are now using these same tasks, but the tasks may not elicit sufficient variability between participants to tell us anything meaningful about individual differences.
Test-retest reliability is the degree to which a participant’s performance is similar from one completion of a task to the next. When differences between individuals on a task are small, test-retest reliability tends to be low: if participants’ measures of accuracy or response time are all quite similar to one another, the degree to which one individual’s performance predicts their performance on a second test will be small. A task with low test-retest reliability therefore does not produce a consistent index of performance for any given individual (i.e., where they fall on a spectrum from “poor” to “excellent”) and cannot be used to assess individual differences in performance.
To assess test-retest reliability, Dr Clark and her team tested 160 undergraduate psychology participants on four commonly used tasks in vision science. The tasks measured a range of perceptual and attentional faculties, such as sustained attention, motion perception, and peripheral processing, and each participant was tested twice, 1–3 weeks apart. The results demonstrate a range of reliabilities (as measured by the intraclass correlation coefficient, or ICC), indicating that some tasks (and some measures within these tasks) are more suitable for the exploration of individual differences than others. As expected, higher ICCs were associated with higher between-participant variability. The authors also reviewed a wide range of vision science tasks with known reliabilities and summarised these statistics in a useful reference table for future researchers. Finally, they provide detailed guidelines and recommendations for how to appropriately assess test-retest reliability.
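For readers unfamiliar with the ICC, the calculation is straightforward once the data are arranged as a subjects-by-sessions matrix. The sketch below implements the widely used Shrout and Fleiss ICC(2,1) (two-way random effects, absolute agreement, single measures); the data and function name are illustrative, not taken from the paper.

```python
import numpy as np

def icc2_1(scores):
    """Shrout & Fleiss ICC(2,1): two-way random effects, absolute
    agreement, single measures -- a common choice for test-retest data.

    scores: (n_subjects, n_sessions) array of performance measures.
    """
    X = np.asarray(scores, dtype=float)
    n, k = X.shape
    grand = X.mean()
    row_means = X.mean(axis=1)   # each subject's mean score
    col_means = X.mean(axis=0)   # each session's mean score

    # Mean squares from a two-way ANOVA decomposition
    ms_rows = k * np.sum((row_means - grand) ** 2) / (n - 1)
    ms_cols = n * np.sum((col_means - grand) ** 2) / (k - 1)
    resid = X - row_means[:, None] - col_means[None, :] + grand
    ms_err = np.sum(resid ** 2) / ((n - 1) * (k - 1))

    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
    )

# Hypothetical test-retest data: rows are subjects, columns are sessions.
# High between-subject spread relative to session-to-session noise
# yields a high ICC, mirroring the pattern the authors report.
data = [[10, 11], [12, 13], [15, 16], [20, 21]]
print(round(icc2_1(data), 3))  # → 0.974
```

Because ICC(2,1) penalises absolute disagreement, even a uniform practice effect across sessions lowers the coefficient slightly, which is why it is often preferred over a simple Pearson correlation for test-retest designs.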
Having received a First-class BSc (Hons) in Psychology from the University of the West of England, I then completed my PhD at Cardiff University. My PhD explored the potential applications of learning theory and other engineering frameworks to improve human identification of previously unfamiliar faces.
Following the completion of my PhD, I held postdoctoral positions within the Brain and Cognition group at Anglia Ruskin University and within the Visual Experience Laboratory at the University of Birmingham. During these roles, I was trained in a variety of techniques, including eye tracking, transcranial direct current stimulation (tDCS), and EEG. My position at Birmingham was funded by the International Banknote Designers Association. This role involved collaborating with central banks (e.g., the Bank of England, the US Federal Reserve) and other stakeholders to design perception studies. The main focus of these studies was to investigate how individuals navigate and use the security features on banknotes.
More recently, I have held the positions of Lecturer and then Senior Lecturer at Bath Spa University teaching and leading on a range of modules including: Introduction to Cognitive and Comparative Neuroscience and Advanced Biological and Cognitive Psychology.
Broadly, my research interests focus on visual perception/attention and how our interpretation of the world can be influenced by prior experience, what we are viewing, and our current internal state. I am interested in the basic mechanisms and associated brain areas that underpin learning through simple exposure (i.e., perceptual learning). I am also interested in applying this knowledge to investigate how we learn to identify previously unfamiliar faces and other frequently encountered objects.
Broadly, my research focuses on two strands of investigation: (i) information elicitation and lie-detection, and (ii) mental health and offending behaviour.
After gaining a BSc (Hons) in Psychology from Bangor University I became interested in the forensic aspects of psychology. I then completed an MSc in Forensic Psychology at the University of Portsmouth. During this time, I started working in the NHS with offenders who had complex mental health disorders. I worked primarily with adult males but spent considerable time on the female ward. I then transferred to Bluebird House – the highest level of security in the UK for adolescents – where I worked for a year and trained in dialectical behaviour therapy (DBT) for frontline staff.
My PhD research focused on ethical methods for encouraging suspects and eyewitnesses to say more during investigative interviews. When interviewees provide more information, liars produce more detectable cues to deception. From this work I developed a new interviewing protocol called the Asymmetric Information Management (AIM) technique.
My academic career started part-time in the Psychology department at Portsmouth while I was working in the NHS. I then left the NHS to work full-time in the Institute of Criminal Justice Studies, where I supported the embedding of well-being within the curriculum. I have taught on various psychology and criminology modules and have led a range of undergraduate modules, such as Understanding Criminology, Essential Skills for Criminologists, and an optional practice-based Forensic Psychology and Mental Health module. I have also been module coordinator of the Masters dissertation module.
Most recently, I held the post of Principal Lecturer at the University of Portsmouth, where I was the Programme Area Leader managing all distance-learning Masters courses, including the MSc Criminal Psychology, MSc Victimology, MSc Crime Science, MSc Counter Fraud and Counter Corruption, and MSc International Criminal Justice.
I have recently joined the UWE Department of Health and Social Sciences as a Senior Lecturer in Psychology. I am a developmental psychologist who has studied family relationships for the past 15 years.
At UWE, I will be teaching on the undergraduate Psychology programme. I will deliver teaching on research design and analysis, as well as drawing on my expertise in family relationships. My teaching reflects the central aim of my research: to understand families as they are, rather than how they could or should be. I engage students in the key debates in the field of family psychology, for example: Do children have an obligation to maintain an active relationship with their parents? How important is genetic relatedness for family functioning?
I completed my PhD and postdoctoral research at the Centre for Family Research at the University of Cambridge. My research examined family functioning in new and non-traditional families, such as those created through the use of assisted reproductive technologies. My specific contribution to the field has been to explore how parents explain their use of donated sperm or eggs to their children and what children think, feel and understand about how they were conceived.
Most recently, my research has explored family estrangement, which is a term that is increasingly used to refer to relationships between parents, children and siblings in adulthood that are characterised by distance and negativity. In 2015 I published a report exploring the experiences of approximately 800 people who identified as being estranged from a family member. Respondents were members of the Stand Alone community, a UK-based charity which aims to support those experiencing family estrangement. The findings of this study featured in a variety of media outlets such as The New York Times and were estimated to have reached an audience of 9 million readers.
This work is particularly relevant to the ‘Promoting Psychological Health’ theme of the PSRG. For example, I have recently conducted an evaluation of the therapeutic groups run by Stand Alone for those experiencing family estrangement. I have also conducted qualitative research exploring people’s experiences of accessing counselling for family estrangement. As well as publishing articles in academic journals, I write about research on family estrangement for a general audience, publishing articles in Psychology Today and The Conversation. I have also written a book, “No Family is Perfect: A Guide to Embracing the Messy Reality”, which will be published by Welbeck in January 2022.
I look forward to collaborating with colleagues and to expanding my research on family relationships in such a vibrant department.
I have recently joined the Department of Social Sciences as a Lecturer in Psychology and am very excited to be part of Team UWE and the Psychological Sciences Research Group!
I am an experimental psychologist with a strong interest in methods and Open Science. I completed my undergraduate and master’s degrees in Biology at the University of Vienna, specialising in biological anthropology and human behaviour. In 2011, I moved to Scotland to do my PhD with Prof. Dave Perrett in the Perception Lab at the University of St Andrews. After receiving my PhD in Psychology in 2015, I completed a post-doc at the University of Glasgow, where I worked on a five-year ERC-funded project on human kin recognition with Prof. Lisa DeBruine in the Face Research Lab.
While I have a broader interest in human behaviour and social cognition, the bulk of my work has focused on social face perception. Faces have a crucial role in social interactions: they provide a rich source of information, as well as a canvas onto which traits, attitudes and behavioural tendencies are ascribed, often with consequential real-world outcomes. I am interested in understanding how facial cues affect social interactions, and why: many of the judgments we make are inaccurate, yet they are extremely quick and show significant consensus across observers, suggesting that there is more at play than mere idiosyncrasies. In particular, I am interested in the evolutionary, neurobiological, and socio-cultural influences that shape our preferences and underpin our responses to facial cues. I take a data-driven and functional approach in my work (“perceiving is for doing”), and my research is inherently interdisciplinary, drawing on models and methods from experimental psychology, evolutionary biology, and computer science.
Oosterhof and Todorov’s prominent model of face perception suggests faces are evaluated along two main dimensions that have an adaptive origin—dominance, and trustworthiness. My PhD work investigated facial cues to body physique and their relation to perceptions of dominance and attractiveness, while my post-doctoral work has explored the role of kinship cues in perceptions of trustworthiness and attractiveness. Here at UWE, I plan to continue to draw on a functional framework to investigate social perception, focusing on questions around face preferences and impression formation, and how these are affected by individual differences, environmental pressures (such as scarcity) as well as perceptual biases and stereotypes.
This past autumn, we were fortunate to welcome five new lecturers to our Department who have joined PSRG. We are excited to have these phenomenal researchers join our team. All of our new members are early-career researchers who are looking to expand their research profiles, so please do reach out to them if you see potential for collaboration!
My name is Trang, and I recently joined the UWE Department of Health and Social Sciences as a Lecturer in Psychology in Individual Differences.
I originally came from Hồ Chí Minh City, Vietnam. I completed my undergraduate degree in Psychology at Aberystwyth University in 2012. After completing my MEd in Psychology in Education (University of Bristol) and MSc in Psychoanalytic Theories (UCL), I started my PhD research in 2014 (University of Bristol), which focused on the psychological well-being and socio-cultural adjustment of EU/international PhD students in the UK. I collected longitudinal questionnaire and interview data over 15 months to understand significant factors and changes over time, using mixed effects modelling, thematic analysis, and narrative analysis. As a mixed methodologist, my primary research interests include mental health and well-being in HE, staff and student well-being, and intercultural practices in education.
My academic work and experience centre around individual differences in well-being and socio-cultural adaptation, and how they can be studied within multicultural social contexts. My research projects between 2014 and 2020 centred on HE student well-being and transitions, with foci on aspects such as extenuating circumstances, BAME student experience and the attainment gap, and assessment and feedback. Currently, I am collaborating on a project looking at undergraduate students’ psycho-social well-being and their sense of community during COVID-19 and online learning.
I also work as a research volunteer for a foundation for student mental health, and I previously served as a mental health champion for staff mental health at universities (as part of the Mentally Healthy Universities initiative with Mind). I love teaching and learning about teaching, and while I am aware of the stress that moving to blended/online teaching has placed on everyone in our sector, I have also found the experience very useful for rethinking my approach to teaching and research in general, and in particular how we can address issues of equality and diversity through the new platforms.
I recently joined the UWE Department of Health and Social Sciences as a Lecturer in Occupational Psychology.
I completed my MSc in Social and Applied Psychology at the University of Kent in 2016, and stayed on at Kent to complete my PhD. My PhD focused on perceived leadership potential, exploring a preference for leadership potential over leadership performance in leadership evaluations and the extent to which this preference is influenced by a pro-youth bias. My thesis took a mixed-methods approach, employing thematic analysis, correlational research, and experimental studies. My research interests also include representations of age and ageing, age and gender stereotypes in leadership, and the impact of workplace wellbeing initiatives on employee attitudes.
Previously, I worked as a Business Consultant for Bailey & French. In this role I worked with organisations to develop and implement workplace solutions founded on positive psychology research, covering areas such as leadership, wellbeing, and performance. Before that, I worked in learning and development in the Financial Services industry, specialising in leadership development.
I am an ‘applied’ social psychologist, with an expertise in verbal lie-detection using psychologically-based, ‘proactive’ interviewing protocols.
I studied BSc Forensic Psychology at the University of Portsmouth (UoP), obtaining a 1st class classification in 2013 and winning the Department’s John Denis Award for best undergraduate dissertation. This project applied metacognitive theory to verbal lie-detection.
My PhD examined the effects of sub-optimal recall settings (i.e., reporting events after delays, or in contexts where events were incidentally – rather than intentionally – encoded) upon the popular verbal veracity cue ‘richness of detail’. My PhD uncovered evidence of a ‘stability bias’-like effect impacting liars’ statements after delays, a finding that has since been independently replicated.
In 2015 I became a full-time Research Associate at the UoP, working on two core projects: a High-Value Detainee Interrogation Group (HIG) funded memory-based lie-detection project, and a Centre for Research and Evidence on Security Threats (CREST) funded project developing the Verifiability Approach (VA). The outputs from these projects have been published in peer-reviewed academic journals including Law and Human Behavior, the Journal of Applied Research in Memory and Cognition (JARMAC), and Acta Psychologica. I have presented our findings internationally and won the first-place (student) prize at the first Decepticon international conference, held in Cambridge, for our research applying the VA to insurance fraud settings.
In 2019 I became a PTHP Lecturer (and tutor) in the School of Education and Sociology (UoP), before being appointed to Lecturer of Social Psychology in early 2020. In September 2020 I became Lecturer in Social Psychology at UWE Bristol.
I recently joined the UWE Department of Health and Social Sciences as a Lecturer in Psychology.
I previously studied at Coventry University (CU) for my BSc and MScR in Psychology. I was then awarded a teaching and research scholarship at the University of Bristol (UoB). I moved to Bristol in 2016 and joined the Tobacco and Alcohol Research Group (TARG). My PhD explored the effects of acute and chronic alcohol consumption on emotional face processing. I completed work that aimed to extend our understanding of how alcohol impairs our ability to process key social information that has the potential to influence behaviour (especially important given the social context in which alcohol is typically consumed).
My research interests include psychopharmacology, social drugs, alcohol-related aggression, alcohol policy, health outcomes, and emotional face processing. I am a big advocate of open science and reproducibility and believe strongly in the accurate dissemination of research findings. Because of this, I actively involve myself in public engagement events. Examples include the yearly Bristol Neuroscience festival, Women in STEMM (Ada Lovelace), and an Alcohol Labelling event hosted by TARG.
I am delighted to join UWE Bristol, and the excellent researchers at the PSRG.
I recently joined the UWE Department of Health and Social Sciences as a Senior Lecturer in Sport and Exercise Psychology. Prior to joining UWE, I was a lecturer at Middlesex University for 3 years, where I was the curriculum lead for sport and exercise psychology, led a football science degree programme, and developed an MSc in Sport and Exercise Psychology. Before academia, I worked in elite sport, first with Fulham FC across their senior and junior teams and then with Sunderland AFC, supporting their senior team. I studied for a BSc in Sport and Exercise Science and an MSc in Sport and Exercise Science (Psychology) at Brunel University (London) between 2006 and 2011.
Currently, my primary research interest is in mental health and wellbeing in elite sport. My PhD project is investigating perceptions and engagement with mental health support services in English elite football. I therefore have taken interest in areas around mental health stigma, literacy and support systems, which I imagine would map on to other research being carried out in the department.
I am also currently engaged with The Royal Marines, looking into the development and measurement of psychology programmes within the force. This work is in its early stages; however, it covers a breadth of occupational and performance psychology themes.
As an applied practitioner I am accredited by the British Association of Sport and Exercise Sciences (BASES) and am currently on an accelerated pathway to obtaining HCPC status.
Thanks to Matt at Housecat Productions, we have videos about PSRG and each of our themes (footage recorded pre-pandemic). Check out each of the videos and the work that we do. Feel free to get in touch in the comments below or by email (firstname.lastname@example.org).
An overview of our whole research group:
About our Ageing Well theme:
About our Applied Cognition and Neuroscience theme:
About our Optimising Performance and Engagement theme:
This blog post provides a reflective account of my own experiences of participating in Dry January – an alcohol abstinence challenge initiated by Alcohol Change UK that encourages people to reduce their levels of alcohol consumption. As an experimental social psychologist, my research investigates the influences of contextual and social factors on alcohol consumption and related behaviours. Working alongside collaborators at Edge Hill University and Aston University, our work to date has suggested that frequent alcohol consumption is associated with our motivations for and expected outcomes of drinking, as well as heightened attention towards alcohol-related cues. It was never my intention to blog about my experiences of Dry January; it was simply a personal endeavour that I aimed to complete. However, as the days went on, I noticed many parallels with my own research, and I became more cognisant of wider issues embedded in the UK’s drinking culture. I hope that my reflections below will speak to other people’s experiences of taking part, as well as highlighting cultural issues in our relationship with alcohol, and how we might best support those who wish to reduce their consumption.
What is Dry January?
“Dry January” is a month-long challenge in which people give up alcohol for the month of January. Typically completed with the aid of a phone app, the overarching goal is to refrain from drinking alcoholic beverages for 31 days, with badges awarded for day streaks, drinking in moderation, reducing alcohol intake, and total dry days. At the end of each day, the participant completes an online calendar, recording whether they ‘stayed dry’, ‘drank’, or ‘drank as planned’. Given that the challenge only began in 2013, there has been little research on its benefits and potential drawbacks. However, the available research suggests that Dry January can have a range of positive health-related and psychological benefits, including improved sleep, weight loss, and enhanced self-control (see de Visser et al., 2016). Further, whilst some have proposed that Dry January may lead to a ‘rebound effect’ (i.e., a binge February), the majority of available research suggests that a period of abstinence can encourage longer-term reductions in drinking (Bray et al., 2010; de Visser et al., 2016). This is because, after a person has made a commitment to engage in behaviour change, they are more likely to maintain these changes in the future (de Visser et al., 2017).
Reflections on Dry January
Learning about the reasons I drink
According to the Alcohol Use Disorders Identification Test (AUDIT), I am categorised as a ‘low risk drinker’ with an overall score of 7 out of 40 (a score of 8 or above indicates higher risk). Personally, I would classify myself as an ‘occasional social drinker’, who rarely binge drinks but instead has ‘one or two every now and again’. Over the Christmas period, I found myself overindulging in unhealthy foods and drinking more alcohol, and I therefore decided to take part in Dry January to regulate my behaviour and explore the benefits. Early in the New Year, I found it relatively easy to abstain from alcohol, simply because I felt I’d had my fill over the festive period (my AUDIT score would have been temporarily higher!). As the days turned into weeks, however, I found myself thinking about reaching for an alcoholic drink a lot more. I then stopped to think about when and why I wanted an alcoholic drink, and realised that I tend to drink to alleviate stress or to relax in social situations. In the alcohol literature, these reasons are known as ‘drinking motives’ (see Kuntsche et al., 2006), which are the valued outcomes that people associate with drinking alcohol. The examples I give are known as ‘coping’ (i.e., to deal with negative emotions) and ‘social’ motives (i.e., to enhance interactions), but there are also enhancement motives (i.e., to heighten mood) and conformity motives (i.e., to avoid social pressure or a need to fit in). Interestingly, research has shown that these drinking motives are a unique predictor of alcohol consumption and related behaviours (Kuntsche et al., 2014; Merrill & Read, 2010).
Another insight was that my tendency to drink alcohol in low quantities, but rather frequently, may lead me to underestimate my true alcohol consumption. The limitations of self-report measures of alcohol consumption could be discussed at length, but the main point here is that such behaviour may be easily forgotten and unreliably reported. In addition, it may prevent consumption from being viewed as a ‘problem’ and be an obstacle to behaviour change. Taking part in Dry January made me realise how I might reach for a drink without really counting it or thinking that I need to cut down.
Alcohol cues are everywhere
Seeing an advertisement for alcohol in a drinking establishment (e.g., pubs and bars) comes as no surprise, particularly in UK culture. However, abstaining from alcohol made me evaluate the quantity of alcohol advertisements we see in our everyday lives and the appropriateness of their placement. During a conference visit in January, I stayed at a well-known hotel chain and found promotional offers for alcohol in the reception lobby, as well as in leaflets in my room. Throughout Dry January I became more and more aware of the number of alcohol adverts aired on television and the decorative signs in shops that glamorised drinking. I found this heightened awareness very interesting, and it led me to think more about something we, as researchers, call ‘cue reactivity’. Research shows that alcohol-related cues capture and hold the attention of those who drink alcohol and appear to increase subjective cravings for alcohol (see Field et al., 2009). Moreover, such attentional processing seems to be heightened in heavy drinkers and even abstinent alcoholics (Field et al., 2013). These advertisements challenged my self-control by heightening my craving for alcohol, which led me to think about the implications that such advertising has for those with problematic alcohol use or alcohol-related disorders. In the UK, alcohol adverts are regulated so that they do not condone or encourage irresponsible or immoderate drinking. Unlike cigarette advertising, however, which is banned on television and heavily regulated in supermarkets (e.g., cigarettes hidden behind a screen), alcohol advertising is much more lightly regulated. Compare a packet of cigarettes and a bottle of alcohol, for example: the packaging of cigarettes includes large health warnings covering 65% of the front and back, a brand name in standard font, and drab colours.
The packaging of alcohol, on the other hand, includes limited (or no) health warnings, bright colours and attractive images, and beverages themselves come in many different colours and flavours. It therefore seems that more work needs to be done to regulate the branding and advertisement of alcohol to make them less attractive and ‘wanted’. Health warnings and nutritional labels would raise awareness of the health implications of consumption, and help people to make informed decisions with regards to drinking.
Challenging conversations around drinking
There were a few occasions during Dry January when my choice not to drink during social occasions was questioned by others, and people tried to influence this decision. Statements such as “just have one and then don’t drink tomorrow”, or “have a beer now and then a glass of water” were voiced, perhaps with the aim of testing my self-control. This made me think about the wider conversations we have about drinking and the societal norms associated with alcohol; we wouldn’t ask someone why they are drinking, so why is it okay to ask someone why they’re not? I found that the most effective way of dealing with this was to have open conversations about the benefits of Dry January and to engage in discussion with people about wider problems regarding the UK’s binge drinking culture (see Pincock, 2003). It was interesting to outline the many different reasons why people may choose to moderate their alcohol consumption, or not to drink at all, spanning choice (e.g., not feeling compelled to drink in social circles), finances (e.g., deciding to drive to a venue rather than drinking, to save money), and health (e.g., weight loss, better sleep, concentration, and avoiding addiction).

In relation to this, another challenging experience of taking part in Dry January concerned situations in which others expected me to pay for a round of drinks, or to split the bill, when they had been drinking alcohol and I hadn’t. In some establishments, an alcoholic beverage costs up to four times as much as a soft drink, so the bill can be quite surprising! Again, this may have been overcome with a simple conversation, but the stronger message here is that we need to be more aware of how we treat people who are not drinking and think more carefully about how we can support them.
So how did I get on?
Out of the 31 days in January, I managed 26 days dry; of the remaining 5 days, 1 was a ‘drank as planned’ day, whilst the other 4 were days on which I drank in moderation. Using the phone app was extremely helpful for monitoring and managing my behaviour; over the course of my alcohol-free days, my best streak was 14 days, and I saved substantial money and calories. Some people have said “so you didn’t complete Dry Jan?!”, and again I think this rhetoric is problematic. Although I didn’t complete 31 whole days, the challenge allowed me to regulate my consumption and cut down significantly. It also helped me to think about helpful strategies for moderating my drinking, such as adding an extra “dry” day to my calendar after an unplanned drinking session, and not giving up on the challenge if I had drunk. The most positive experience of Dry January for me, however, has been reflecting on the conversations we have about alcohol, and thinking about ways in which we can support people who choose to reduce their intake or abstain altogether. It has also opened my eyes to the cultural and societal factors that influence alcohol consumption (e.g., alcohol advertising), which may act as obstacles to reducing intake. For me, Dry January has been a fundamentally interesting reflective experience, both as a participant and as an alcohol researcher, and it has deepened my awareness of the benefits and barriers that people face when making the choice to cut down or abstain from drinking, and how we might best support them.
References

Bray, R. M., Brown, J. M., Pemberton, M. R., Williams, J., Jones, S. B., & Vandermaas-Peeler, R. (2010). Alcohol use after forced abstinence in basic training among United States Navy and Air Force trainees. Journal of Studies on Alcohol and Drugs, 71, 15–22.

de Visser, R. O., Robinson, E., & Bond, R. (2016). Voluntary temporary abstinence from alcohol during “Dry January” and subsequent alcohol use. Health Psychology, 35, 281–289.

de Visser, R. O., Robinson, E., Smith, T., Cass, G., & Walmsley, M. (2017). The growth of ‘Dry January’: Promoting participation and the benefits of participation. European Journal of Public Health, 27.

Field, M., Munafò, M. R., & Franken, I. H. (2009). A meta-analytic investigation of the relationship between attentional bias and subjective craving in substance abuse. Psychological Bulletin, 135, 589.

Field, M., Mogg, K., Mann, B., Bennett, G. A., & Bradley, B. P. (2013). Attentional biases in abstinent alcoholics and their association with craving. Psychology of Addictive Behaviors, 27, 71–80.

Kuntsche, E., Knibbe, R., Gmel, G., & Engels, R. (2006). Who drinks and why? A review of socio-demographic, personality, and contextual issues behind the drinking motives in young people. Addictive Behaviors, 31.

Kuntsche, E., Gabhainn, S. N., Roberts, C., Windlin, B., Vieno, A., Bendtsen, P., … & Aasvee, K. (2014). Drinking motives and links to alcohol use in 13 European countries. Journal of Studies on Alcohol and Drugs, 75.

Merrill, J. E., & Read, J. P. (2010). Motivational pathways to unique types of alcohol consequences. Psychology of Addictive Behaviors, 24.

Pincock, S. (2003). Binge drinking on rise in UK and elsewhere. The Lancet, 362.
In April, I was fortunate to attend the annual and prestigious British Neuroscience Association (BNA) Festival of Neuroscience. The conference provided a unique opportunity to engage with contemporary interdisciplinary neuroscience research across the UK and internationally. Spread over four days, the event hosted workshops, symposia, keynote lectures and poster presentations covering 11 neuroscience themes. As an experimental psychologist, I was particularly interested in the themes relating to attention, motivation and behaviour, sensory and motor systems, and neuroendocrinology and autonomic systems.
Not only was this a chance to embrace cutting-edge interdisciplinary neuroscience research, it was also a chance to develop the skills to become a meticulous researcher. It was clear that an overall goal of the conference was to encourage high standards of scientific rigour by embracing the open science movement.
“Fast science… too much rubbish out there we have to sift through.”
The open science movement encompasses a range of practices that promote transparency and accessibility of knowledge, including data sharing and open access publishing. The Open Science Framework is one tool that enables users to create research projects and encourages sharing hypotheses, data, and publications. These practices encourage openness, integrity, and reproducibility in research, something particularly important in the field of psychology.
An especially striking claim was made by Ioannidis (2005): “most published research findings are false.” Ioannidis argued that there is a methodological crisis in science, particularly apparent in psychology, but also in cognitive neuroscience, clinical medicine, and many other fields (Cooper, 2018). If an effect is real, any researcher should be able to obtain it using the same procedures with adequate statistical power. However, many scientific studies are difficult to replicate. Open science practices have been suggested to enable accurate replications and facilitate the dissemination of scientific knowledge, improving scientific quality and integrity.
The BNA has a clear positive stance on open science practices, and I was lucky enough to be a part of this. Professor Uta Frith, a world-renowned developmental neuropsychologist, gave a plenary lecture about the three R’s: reproducibility, replicability, and reliability, which was arguably one of the most important and influential lectures over the course of the conference.
Professor Frith summed up the scientific crisis in two words, “Fast Science.” Essentially, science is progressing too fast, leading to lower quality research. Could this be due to an increase in people, labs, and journals? Speeded communication via social media? Pressure and career incentives for increased output? Sheer volume of pre-prints available to download? Professor Frith argued that there is “too much rubbish one has to sift through.”
A potential solution to this is a ‘Slow Science’ movement: the notion of “resisting quantity and choosing quality.” Professor Frith argued the need for the system to change. Often we hear about the pitfalls of the peer review process, yet Professor Frith provided us with some novel ideas. She argued for a limited number of publications per year; this would encourage researchers to spend quality time on one piece of research, improving scientific rigour, while excess work would be appropriate for other outlets. Only one grant at a time should be allowed. She also discussed the need for continuous training programmes.
A lack of statistical expertise in the research community?
Professor Frith argued that there is a clear lack of statistical knowledge. With increasing computational advancements, it is becoming easier and easier to plug data into a function and accept the outcome. Yet we must understand how these algorithms work so that we can spot errors and notice illogical results.
This is something that spoke to me. I love working with EEG data. Analysing time-series data allows us to capture cognitive processes during dynamic and fast-changing situations. However, working with such rich and temporally complex data is technically challenging. The EEG signal is very small at the surface of the scalp, and the signal-to-noise ratio is poor. Artefacts, both non-physiological (e.g. computer hum) and physiological (e.g. eye movements), contaminate the recording, meaning that the EEG picks up not only neural activity but also other electrical signals we are not interested in. Therefore, we apply mathematical algorithms to clean the data and improve the signal-to-noise ratio. Once the data are cleaned, we also apply algorithms to transform the data from the time domain (in which they are recorded) to the frequency domain. The number of EEG analysis techniques has risen hugely, partly thanks to computational power, and there is now a whole host of computational techniques, including machine learning, that can be applied to EEG data.
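To make the time-to-frequency step concrete, here is a minimal illustration (not my actual analysis pipeline) using NumPy’s FFT to recover a simulated 10 Hz alpha-band oscillation from noisy data; the sampling rate and signal below are made up for the example:

```python
import numpy as np

# Assumed parameters for illustration only
fs = 250                                    # sampling rate in Hz
t = np.arange(0, 1, 1 / fs)                 # one second of samples

# Simulated "EEG": a 10 Hz oscillation buried in random noise
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

# Transform from the time domain to the frequency domain
freqs = np.fft.rfftfreq(signal.size, d=1 / fs)      # frequency of each bin
power = np.abs(np.fft.rfft(signal)) ** 2 / signal.size

peak_freq = freqs[np.argmax(power)]
print(peak_freq)   # the 10 Hz component dominates the spectrum
```

Even in this toy case, interpreting the output sensibly requires knowing what the transform does: the noise spreads across all frequency bins while the oscillation concentrates in one, which is exactly why an informed eye can spot when a result is illogical.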
Each time algorithms are applied to the EEG data, the EEG data change. How can an EEG researcher trust the output? How can an EEG researcher sensibly interpret the data, and make informed conclusions? Having an underlying understanding of what the mathematical algorithms are doing to the data is no doubt paramount.
Professor Frith is right: there is a need for continuous training, as data analysis is moving at an exhaustingly fast pace.
Pre-registration posters – an opportunity to get feedback on your method and planned statistical analyses
I also managed to contribute to the open science movement during the conference. On the second-to-last day, I presented a poster on my research looking at the temporal neural dynamics of switching between a visual perceptual and visuomotor task. This was not an ordinary poster presentation; this was a pre-registration poster presentation. I presented planned work to be carried out, with a clear introduction, hypotheses and method. I also included a plan of the statistical analyses. There were no data, graphs, or conclusions.
The poster session was an excellent opportunity for feedback from the neuroscience community on my method and statistical analyses. This is arguably the most useful time for feedback – before the research is carried out. It was particularly beneficial for me, coming from a very small EEG community in which seeking outside expertise is vital. A post-doctoral researcher, who had looked at something similar during her PhD, provided me with honest and informative feedback on my experimental design. In addition, I uploaded my poster to the Open Science Framework, and the abstract was published in the open access journal Brain and Neuroscience Advances. I also received a preregistered badge for my work. These badges act as an incentive to encourage researchers to engage with open science practices. Check out cos.io/badges for more information.
So, what next?
Practical tools and significant support are coming together to allow open science to blossom. It is now our responsibility to be part of this. I’ve created an Open Science Framework account and plan to start there, detailing my aims, methods and data, to improve transparency in research. I’m making the most of my last year of my PhD to attend data analysis workshops. I would like to pre-register my research in the near future. How do I contribute to the slow science movement? I can start by slowing down (perhaps saying no to additional projects?!), improving my statistical knowledge, and embracing open science practices.
Not only was the conference an incredible insight into multidisciplinary neuroscience research (I did not realise you could put a mouse in an MRI scanner, anaesthetised of course, as it would never keep its head still!), it was an influential and motivating atmosphere. Thank you, British Neuroscience Association. Now, who else wants to join me in advocating open science, becoming a rigorous researcher, and improving scientific practices?!
Cooper, M. M. (2018). The Replication Crisis and Chemistry Education Research. Journal of Chemical Education, 95, 1– 2.
Ioannidis, J. P. (2005). Why most published research findings are false. PLoS Medicine, 2(8), e124.
In everyday life, you will be asked to report your attitudes and opinions towards a whole host of different things. When buying a TV, you may be asked retrospectively to provide ratings of the product, or even the person who sold it to you. The reporting of attitudes has become so sought after by companies that specific websites have been developed providing people with an open forum to post their opinions and evaluations of accommodation, restaurants, and services, and even receive arbitrary points and badges for their reviews (e.g., Trip Advisor). Given the plethora of surveys and questionnaires utilised on a daily basis, you may therefore think that measuring attitudes is relatively easy. Simply ask someone what they think and they will respond with an honest answer. However, psychology has shed light on the limitations posed by self-report tools, such as questionnaires and surveys, which are so readily used by companies and organisations alike.
Gauging attitudes: A problem of measurement or construct?
Studies have consistently shown that people’s attitudes can
be altered by systematic factors, such as how the questions are framed and
even the order in which they are presented.
For example, a recent study
demonstrates how the number of scale points in a questionnaire affects the
extent to which gender stereotypes of brilliance are expressed. Specifically,
female course instructors were more likely to receive a top rating on a 6-point
scale relative to a 10-point scale, whereas this difference did not emerge for
male instructors. The authors reason that this effect occurs because of
cultural meanings assigned to the number ‘10’ – perfection. As such, a
top-score on a 6-point scale does not carry such strong performance
expectations. To me, this is a landmark study demonstrating how the features of
tools that are frequently used to judge merit can powerfully affect people’s
responses. Who knew that something which appears meaningless can shape our
answers in a way that tells a completely different story?
Another issue plaguing questionnaires is that psychologists – or whoever uses them – need to trust that the questionnaire can tap into exactly what we want to measure. When asking people about socially sensitive topics, such as prejudice or discriminatory behaviour, this is rarely the case. Consider how you would answer the following questions when asked by a researcher, someone you barely know: “Do you treat people from other races the same as you treat people from your own race? Do you willingly give to charity or those who need it the most?” Think hypothetically about your answers for a minute. Now, reflect on your previous behaviour and try to gauge whether the answers given provide an accurate representation of how you really act. What you might uncover about yourself here is called the ‘willing and able’ problem; people may not be willing to report their honest attitudes, and when put on the spot, may not be able to accurately reflect on and report what they truly feel. Answers to questions are usually influenced by self-presentational motives – that is, people’s desire to look good in someone else’s eyes.
A more intriguing possibility is that we might not know what we actually believe. To a lay audience with no psychological training, this may sound surprising. How can we hold attitudes that we are unaware of? Psychology holds the answer. The past three decades of psychological research have revealed the frailties of introspection (examining the inner workings of our own minds), and how little control we possess over our own thoughts. This has led researchers to coin the term ‘implicit attitudes’: introspectively unidentified traces of past experience that mediate favourable or unfavourable feeling towards social objects. The general argument is that individuals harbour attitudes that they are not aware of, and these can manifest as judgements or actions.
How do we measure attitudes that people aren’t aware of?
The development of implicit measures has afforded
remarkable insight into the human mind, and opened up a new research field
termed implicit social
cognition. This may leave you wondering: how do we measure such
attitudes, and how do they develop in the first place?
Whereas explicit attitudes are measured by asking people directly about their thoughts and feelings (e.g., through questionnaires), implicit attitudes are assessed indirectly through tasks that typically measure response times towards various stimuli and compare systematic variations in people’s performance. One of the most well-known tasks of this kind is the Implicit Association Test (IAT), which tests how quick (or slow) people are at pairing different social categories with various attributes. The race IAT, for example, requires test-takers to categorise pictures of White and Black faces with positive and negative terms as quickly as possible. The underlying theory is that people will be quicker to pair concepts with attributes that are strongly associated in memory, compared to those weakly associated. In order to understand this better, think about learning a new language for the first time; you will always be quicker to think about words from your own language compared to those from a newly learned language because of the automaticity of your native tongue. Going back to the race IAT, research has consistently shown that White people are quicker to associate pictures of White faces with positive terms and Black faces with negative terms. This is referred to as implicit bias.
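To illustrate the logic (this is a simplified sketch, not the IAT’s official scoring algorithm, and the response times below are invented), a bias score can be derived by comparing mean response times between the strongly associated (“congruent”) and weakly associated (“incongruent”) pairings, scaled by the overall variability of responses:

```python
import statistics

# Made-up response times in milliseconds for one hypothetical test-taker
congruent_rts = [612, 580, 645, 598, 630]      # e.g. own-language words
incongruent_rts = [720, 755, 698, 740, 710]    # e.g. new-language words

# Faster responses on congruent pairings yield a positive score
mean_diff = statistics.mean(incongruent_rts) - statistics.mean(congruent_rts)
pooled_sd = statistics.stdev(congruent_rts + incongruent_rts)
d_score = mean_diff / pooled_sd   # a d-like effect size: larger = stronger association

print(round(d_score, 2))
```

Dividing by the standard deviation rather than reporting the raw millisecond difference means that generally slow and generally fast responders can be compared on the same scale.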
However, implicit measures have also received their fair
share of criticism. Research indicates a weak
relationship between explicit and implicit attitudes, suggesting
that they may reflect separate attitude representations. An alternative theory,
however, is that explicit and implicit measures allow people to edit their
responses to varying degrees. In 2016, as a PhD student I wrote my first
commentary reflecting on what
exactly do implicit measures assess?
In addition, although the IAT has shown some predictive validity (e.g.,
voting behaviour), other research indicates that for more socially sensitive
attitudes, the IAT does not
predict resulting discriminatory behaviour. Although the IAT was heralded to provide new insights into
human cognition and behaviour, some researchers believe this test has been
oversold. Nevertheless, I argue that the failure of implicit attitudes to predict real-world behaviour may stem from the same
issues that plague self-report measures – social desirability. That is, people
may think negatively about a certain
out-group member, but that doesn’t necessarily mean they will act upon this. The same may be true for
weak correlations between explicit and implicit attitude measures; people
distort their attitudes on self-report questionnaires, whereas implicit
measures aren’t susceptible to these self-presentational motives. Should we
expect correlations between these two measures when one is tapping into
controllable beliefs and the other is uncovering introspectively unidentified
traces of past experience?
In order to answer these questions, I was awarded funding through the Vice Chancellor’s Early Career Research Awards (VC ECR Award) at UWE Bristol to investigate other implicit socio-cognitive mechanisms that may predict implicit bias. The blue-sky thinking behind this research is to develop other tools that can capture implicit behavioural manifestations of bias. At this stage, we are too early in our research endeavour to reveal any findings; however, other influential and impactful avenues have already stemmed from this research.
At the same time as I have been conducting my research, Ellie Bliss (Adult Nurse
Lecturer) and Alisha
Airey (BME Project Officer) have been running staff workshops at UWE
Bristol, reflecting upon how implicit (unconscious) bias can play out in the
higher education classroom. I am now involved in supporting these workshops,
providing research-led guidance on how we assess implicit bias, and answering
the many questions that staff have about this rather ambiguous construct. One
interesting discussion centres on whether implicit biases can be viewed as unconscious when we are increasingly
acknowledging them through teaching and training. The majority of attendees
come away from the workshop with new reflections on how teaching practice is
orientated towards Western culture, and with classroom strategies to implement
to prevent implicit bias playing out. However, a handful of attendees are surprised
and doubtful of the concept of implicit bias and the tools that purportedly
measure it. They have difficulty in accepting that they may hold certain
biases. But the truth is, we all do.
Where is implicit social cognition headed?
In this blog post I hope I have demonstrated that we are
shining a light on what implicit bias really is and the nature of our unconscious
attitudes. Such research has paved the way for training workshops which teach
people to acknowledge their deep-rooted attitudes and reflect upon how these
may impact our thinking and behaviour towards other people. But what’s next for
this research arena? There are still lots of unanswered questions and
controversies surrounding implicit bias, which makes it an exciting topic to
study. Do implicit measures really provide a window into the unconscious mind? Is
implicit bias relatively stable when measured at different time points? Can
implicit bias be changed, and if so, are such changes short or long-term? Are
attitudes towards some social groups easier to change than others? Can we, as a
field, develop other (implicit) behavioural measures that predict implicit
attitudes better than self-reports do? Such investigations will
represent the future of implicit social cognition and I, for one, am extremely
excited to see what’s to come.
I’m a second-year psychology student at UWE, and throughout my first year I found myself developing a keen interest in psychological research. The more I engaged with my degree, the more interested I became, and I started actively seeking opportunities to gain research experience towards the end of first year. I was interested in learning more about the research process, and I also knew how valuable experience can be for postgraduate applications.
In May of this year I went on an animal behaviour research trip to the island of Lundy. This was shortly after applying for my first research role, a paid summer internship with Drs Kait Clark and Charlotte Pennington. I learnt a lot on Lundy and made friends with the other student researchers. Towards the end, we realised we were on the same wavelength: three fellow Lundy attendees and I had been invited to interview for the same position. The interviews were scheduled for the week after our return from Lundy, and we were now friends competing against each other. All we could do was wish each other luck in the interview and hope for the best.
The interview was competitive, and we were all given a short programming task to attempt in advance. Maybe there was something in the sea air, but when an email came through from Kait offering the job, all four of our names were on it. Taking the extracurricular opportunity to learn and conduct psychological research on Lundy perhaps gave us an edge in the interview, and we now had the chance to contribute to a legitimate paper together.
The main aim of the project was to develop a set of visual and social cognition tasks for the purposes of establishing test-retest reliability, building on a recent study by Hedge, Powell, & Sumner (2018). Our first task was to complete a comprehensive review of visual cognition literature. Although I had experience of examining research papers to get references for essays, this was much more in depth and specific. The process of comparing the different papers took a while to get used to, but it has been eye-opening to review papers with a view towards designing our own study rather than evaluating a proposition for an essay. It highlights different issues within and between papers that I would not have considered otherwise, and I feel like it has helped me develop a more complete approach to evaluating research papers in general. We were given lots of freedom to conduct the review and research – this was hugely beneficial as it left a lot of potential for creative ideas and individual contribution.
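For readers unfamiliar with the statistic at the heart of the project: test-retest reliability is commonly indexed by the intraclass correlation coefficient (ICC). A minimal sketch of a two-way, single-measure ICC(2,1) for two testing sessions, using entirely made-up scores rather than our data, might look like this:

```python
import numpy as np

# Hypothetical scores: five participants, two sessions (e.g. mean RT in ms)
scores = np.array([
    [510, 525],
    [620, 600],
    [455, 470],
    [580, 595],
    [540, 520],
], dtype=float)

n, k = scores.shape
grand = scores.mean()

# Two-way ANOVA decomposition: participants (rows) and sessions (columns)
ss_rows = k * ((scores.mean(axis=1) - grand) ** 2).sum()
ss_cols = n * ((scores.mean(axis=0) - grand) ** 2).sum()
ss_error = ((scores - grand) ** 2).sum() - ss_rows - ss_cols

ms_rows = ss_rows / (n - 1)
ms_cols = ss_cols / (k - 1)
ms_error = ss_error / ((n - 1) * (k - 1))

# ICC(2,1): between-participant variance relative to total variance
icc = (ms_rows - ms_error) / (
    ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n
)
print(round(icc, 2))
```

The decomposition makes the key point of the project visible in the formula itself: when between-participant variance (ms_rows) is small relative to error, the ICC shrinks, and the task cannot rank individuals consistently.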
We chose measures for which the test-retest reliability had not already been established so our research could have the most impact. Each of us then chose one measure and worked through writing the Python code to implement parameters in alignment with previous studies. We are using PsychoPy, open-source software, to program our measures. I have limited coding knowledge (but enough to pass the interview stage!) so using Python has been a learning experience. Although frustrating at times, help has always been available and through a combination of initiative, trial and error, and advice, the measures shaped up nicely. I developed a motion coherence task, and piloting it on my friends has been interesting – explaining what the task is for and the wider context requires a thorough knowledge of it, and I am genuinely passionate about it. I never thought I’d be excited about a spreadsheet.
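As a rough illustration of the core logic of a motion coherence display (a hypothetical sketch, not our actual PsychoPy task, which uses PsychoPy’s built-in dot-stimulus component), a fixed proportion of dots is assigned a common signal direction while the remainder move in random directions:

```python
import numpy as np

rng = np.random.default_rng(42)

def dot_directions(n_dots, coherence, signal_dir_deg):
    """Assign a motion direction (in degrees) to each dot in one frame."""
    n_signal = int(round(n_dots * coherence))
    signal = np.full(n_signal, float(signal_dir_deg))   # coherent dots
    noise = rng.uniform(0, 360, n_dots - n_signal)      # random directions
    dirs = np.concatenate([signal, noise])
    rng.shuffle(dirs)   # mix signal and noise dots together
    return dirs

# 30% of 100 dots drift upwards (90 degrees); the rest move randomly
dirs = dot_directions(n_dots=100, coherence=0.3, signal_dir_deg=90)
print((dirs == 90.0).sum())
```

Varying the coherence parameter changes task difficulty: at high coherence the global direction is obvious, while at low coherence the observer must integrate motion across many dots.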
During our summer internship we also had an opportunity to meet with Dr Craig Hedge, whose recent paper has inspired our current work. We got to hear about his research first hand and discuss our project and how it related to his paper. It was interesting and insightful to talk about his work and how our test-retest reliability project came about.
Now we’ve finished the development stage of the project, and with all the tasks up and running, it’s time for data collection. I’m continuing to work on this project as my work-based learning placement for my Developing Self and Society (DSAS) module. Time slots are available on UWE’s participant pool for students to book in, and so we have all been running sessions for up to four participants at once. This involves briefing, setting up the experiments on the computers, giving instructions, addressing issues that arise, and ensuring that the conditions are the same for every session. It’s fun to discuss the study when debriefing the participants, to raise awareness of what is being investigated and help them understand why they did the tasks involved. The integration of my internship with one of my second-year modules shows how beneficial an opportunity like this can be. In isolation, it is good experience on its own, but linking it with my regular studies and incorporating my experience into university work has made it invaluable.
It’s been great working closely with Kait and Charlotte in addition to Austin, Triin, and Kieran. Chatting with staff as well as students in a different year to me has given me insight into the university and the course itself. I have learnt a lot already and will continue to do so. The project will also help me with my own research project and my degree in general. I’m excited to see what the rest of it brings.