By Josh Lee
I’m a second-year psychology student at UWE, and throughout my first year I found myself developing a keen interest in psychological research. The more I engaged with my degree, the more interested I became, and I started actively seeking opportunities to gain research experience towards the end of first year. I wanted to learn more about the research process, and I also knew how valuable research experience can be for postgraduate applications.
In May of this year I went on an animal behaviour research trip to the island of Lundy. This was shortly after applying for my first research role, a paid summer internship with Drs Kait Clark and Charlotte Pennington. I learnt a lot on Lundy and made friends with the other student researchers. Towards the end we realised we were on the same wavelength…three fellow Lundy attendees and I had been invited to interview for the same position. The interviews were scheduled for the week after our return from Lundy, and we were now friends competing against each other. All we could do was wish each other luck in the interview and hope for the best.
The interview was competitive, and we were all given a short programming task to attempt in advance. Maybe there was something in the sea air, but when an email came through from Kait offering the job, all four of our names were on it. Taking the extracurricular opportunity to learn and conduct psychological research on Lundy perhaps gave us an edge in the interview, and we now had the chance to contribute to a legitimate paper together.
The main aim of the project was to develop a set of visual and social cognition tasks for the purposes of establishing test-retest reliability, building on a recent study by Hedge, Powell, & Sumner (2018). Our first task was to complete a comprehensive review of the visual cognition literature. Although I had experience examining research papers to find references for essays, this was much more in-depth and specific. The process of comparing the different papers took a while to get used to, but it has been eye-opening to review papers with a view towards designing our own study rather than evaluating a proposition for an essay. It has highlighted issues within and between papers that I would not have considered otherwise, and I feel it has helped me develop a more complete approach to evaluating research papers in general. We were given lots of freedom to conduct the review and research – this was hugely beneficial as it left a lot of potential for creative ideas and individual contribution.
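For readers unfamiliar with the term, test-retest reliability asks whether a measure gives similar scores when the same people complete it on two occasions, and it is often quantified with a correlation between the two sessions. This is a minimal, hypothetical sketch of that idea (the function name and the example scores are invented for illustration, not taken from our study):

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Invented scores: one value per participant at each of two sessions.
session_1 = [0.62, 0.71, 0.55, 0.80, 0.66]
session_2 = [0.60, 0.74, 0.53, 0.78, 0.69]

# A correlation near 1 suggests the task ranks people consistently
# across sessions; a low correlation suggests poor test-retest reliability.
print(pearson_r(session_1, session_2))
```

In practice researchers often report an intraclass correlation (ICC) rather than a simple Pearson correlation, but the underlying question is the same: do the two sessions agree?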
We chose measures for which test-retest reliability had not already been established so our research could have the most impact. Each of us then chose one measure and worked through writing the Python code to implement parameters in alignment with previous studies. We are using PsychoPy, an open-source software package, to program our measures. I have limited coding knowledge (but enough to pass the interview stage!), so using Python has been a learning experience. Although frustrating at times, help was always available, and through a combination of initiative, trial and error, and advice, the measures shaped up nicely. I developed a motion coherence task, and piloting it on my friends has been interesting – explaining what the task is for and the wider context requires a thorough knowledge of it, and I am genuinely passionate about it. I never thought I’d be excited about a spreadsheet.
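To give a flavour of what a motion coherence task involves: on each frame, some proportion of the dots (the coherence level) move in a common “signal” direction while the rest move randomly, and the participant judges the signal direction. PsychoPy’s built-in dot stimulus handles this for you; the hypothetical sketch below (function name and parameters are my own, not from our actual task code) just illustrates the core idea of assigning directions by coherence:

```python
import random

def dot_directions(n_dots, coherence, signal_direction, rng=None):
    """Assign a direction (in degrees) to each dot for one frame:
    a `coherence` proportion of dots move in the signal direction,
    and the remainder move in random directions."""
    rng = rng or random.Random()
    n_signal = round(n_dots * coherence)
    directions = [signal_direction] * n_signal
    directions += [rng.uniform(0.0, 360.0) for _ in range(n_dots - n_signal)]
    rng.shuffle(directions)
    return directions

# e.g. 100 dots at 30% coherence, signal moving rightward (0 degrees):
dirs = dot_directions(100, 0.3, 0.0, rng=random.Random(1))
print(sum(d == 0.0 for d in dirs))  # 30 dots share the signal direction
```

Lower coherence makes the signal direction harder to detect, which is what lets the task estimate each participant’s perceptual threshold.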
During our summer internship we also had an opportunity to meet with Dr Craig Hedge, whose recent paper inspired our current work. We got to hear about his research first-hand and discuss our project and how it related to his paper. It was interesting and insightful to talk about his work and how our test-retest reliability project came about.
Now we’ve finished the development stage of the project, and with all the tasks up and running, it’s time for data collection. I’m continuing to work on this project as my work-based learning placement for my Developing Self and Society (DSAS) module. Time slots are available on UWE’s participant pool for students to book in, and so we have all been running sessions for up to four participants at once. This involves briefing, setting up the experiments on the computers, giving instructions, addressing issues that arise, and ensuring that the conditions are the same for every session. It’s fun to discuss the study when debriefing the participants, raising awareness of what is being investigated and helping them understand why they did the tasks involved. The integration of my internship with one of my second-year modules shows how beneficial an opportunity like this can be. The internship is good experience on its own, but linking it with my regular studies and incorporating my experience into university work has made it invaluable.
It’s been great working closely with Kait and Charlotte in addition to Austin, Triin, and Kieran. Chatting with staff as well as students in a different year to me has given me insight into the university and the course itself. I have learnt a lot already and will continue to do so. The experience will also help me with my own research project and my degree in general. I’m excited to see what the rest of it brings.