DETI is a research, innovation and skills initiative created to develop and accelerate digital engineering across multiple industry sectors, to ultimately benefit future generations of engineers and engineering products, and to help tackle global challenges.
UWE’s Engineering, Design and Mathematics (EDM) department will play a central role in DETI, leading the skills development branch of the centre. EDM will work with other DETI partners to:
Inspire the next generation of diverse engineers
Transform the further and higher education landscape
Innovate lifelong learning of specialised digital engineering skills
Dr Lisa Brodie, Head of UWE Bristol’s Department of Engineering Design and Mathematics (EDM), who led UWE’s bid, said: “This is a vitally important investment for our region and we are pleased to be leading on the skills and workforce development element of the centre’s work. It comes at a perfect time as we prepare to open our new engineering building where we will have state-of-the-art digital engineering facilities and an increased focus on digital engineering to train our graduates for emerging roles in the sector.”
In October, FET awarded the team at the Bioenergy Centre a public engagement and outreach award. The Centre are using this fund to support the production of an interactive workshop for schools – Mud-Powered Robots!
Research Associate Pavlina Theodosiou, who led the project until her move to Newcastle University in January, provides an update here on this exciting workshop.
Before Christmas, Pavlina worked hard alongside electronics technician Ugnius Barajunas to obtain quotes from various companies for the prototype motors – the most expensive part of the robot.
At the same time, the two assembled different electronic boards with the 3D-printed parts and borrowed motors to create three robots. The robots were tested with live Microbial Fuel Cells in the lab and ran well on urine (but don’t worry, they won’t be run on urine in school!).
They presented their results at BotTalks, a best-in-class overview of robotics and automation hosted at the Watershed in November. The team are now excited about trialling the robots on mud for the first time!
“Overall the project received a lot of interest from the public and investors at BotTalks.”
Later in the year, the workshop will be taken into Sea Mills Primary School for the Year 6s to get stuck into.
The Lightyear Foundation works hard to break down barriers to getting more disabled people into Science, Technology, Engineering, Maths and Medicine. One of the ways they inspire children with SEN (special educational needs) is through work inspiration trips.
This is what New Fosseway School had to say about the trip:
“What a unique experience for our students and interesting place to visit! It was a real delight watching them so interested in all the different robots from the very tiny to the huge car simulators.
They were especially interested in the social robots designed to help disabled people. Being able to have a go and manipulate some of the robots was really exciting and they also enjoyed the coding session where they got to programme some of the robots.
The trip most definitely inspired curiosity!”
Jo Payne, Transitions Lead, New Fosseway School.
Thanks to Severin, this trip has opened up the possibility of more SEN schools visiting the BRL. Hopefully schools will be back in the summer term and these visits can go ahead!
Technology from the Centre for Machine Vision (CMV) is helping to improve animal welfare and maximise crop harvesting.
First off, Herdvision, a 3D imaging system that helps farmers assess cows’ wellbeing, was featured on the BBC Six O’Clock News in 2019 as it began a trial with Arla UK 360 farmers.
The technology, developed in collaboration with Kingshay and AgsenZe, uses visual monitoring, data recording and automated intelligence to identify changes in each cow’s physical wellbeing, mobility and weight before they are visible to the human eye.
Facial recognition used to assess pigs’ emotions
Animal behaviourists from Scotland’s Rural College in Edinburgh are using technology provided by machine vision experts at UWE to picture a range of pig facial expressions. The hope is that emotions can be identified and facial recognition used to improve pig welfare.
The BBC reported on the study in spring last year, and the work is due to appear as part of a Netflix programme in 2020.
The potato harvester-based data capture system, HarvestEye, provides insight on size, count and crop variation in unwashed potatoes as they are harvested. Its integrated data analytics show precisely what is being lifted and from where in the field – insights that will help maximise marketable yield and reduce crop imbalance.
The technology’s utility was recognised at the Potato Industry Event 2019/20, where it picked up second prize (out of 15 nominations).
HarvestEye was developed by CMV for B-hive, who then patented the technology in collaboration with CMV. B-hive / Branston have now established a new company, HarvestEye Ltd, to supply the technology to Grimme, a major manufacturer of root crop harvesters.
But the team at CMV aren’t stopping there.
“We’re working on a new funding bid right now to add functionality.”
Melvyn Smith, CMV
Mark Hansen, who led development of the technology, represented CMV as part of the team that picked up the award.
A new project studies how to investigate accidents involving social robots. Alan Winfield explains why this is needed…
Originally posted on September 17th, 2019 by Alan Winfield on his blog.
Imagine that your elderly mother, or grandmother, has an assisted living robot to help her live independently at home. The robot is capable of fetching her drinks, reminding her to take her medicine and keeping in touch with family. Then one afternoon you get a call from a neighbour who has called round and sees your grandmother collapsed on the floor. When the paramedics arrive they find the robot wandering around apparently aimlessly. One of its functions is to call for help if your grandmother stops moving, but it seems that the robot failed to do this.
Fortunately your grandmother recovers but the doctors find bruising on her legs, consistent with the robot running into them. Not surprisingly you want to know what happened: did the robot cause the accident? Or maybe it didn’t but made matters worse, and why did it fail to raise the alarm?
Although this is a fictional scenario it could happen today. If it did you would be totally reliant on the goodwill of the robot manufacturer to discover what went wrong. Even then you might not get the answers you seek; it’s entirely possible the robot and the company that made it are just not equipped with the tools and processes to undertake an investigation.
Right now there are no established processes for robot accident investigation.
Of course accidents happen, and that’s just as true for robots as for any other machinery.
Finding statistics is tough, but this web page shows serious accidents with industrial robots in the US since the mid-1980s. Driverless car fatalities of course make the headlines; there have been five (that we know about) since 2016. But we have next to no data on accidents in human-robot interaction (HRI), that is, for robots designed to interact directly with humans. Here is one – a security robot – that happened to be reported.
But a Responsible Roboticist must be interested in all accidents, whether serious or not. We should also be very interested in near misses; these are taken very seriously in aviation, and there is good evidence that reporting near misses improves safety.
First we will look at the technology needed to support accident investigation.
In a paper published two years ago, Marina and I argued the case for an Ethical Black Box (EBB). Our proposition is very simple: that all robots (and some AIs) should be equipped by law with a standard device which continuously records a time-stamped log of the internal state of the system, key decisions, and sampled input or sensor data (in effect the robot equivalent of an aircraft flight data recorder). Without such a device, finding out what the robot was doing, and why, in the moments leading up to an accident is more or less impossible. In RoboTIPS we will be developing and testing a model EBB for social robots.
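To make the idea concrete, here is a minimal sketch of the kind of bounded, time-stamped logger the EBB proposal describes. The class name, field names and capacity are illustrative assumptions, not the EBB specification from the paper.

```python
import time
from collections import deque

class EthicalBlackBox:
    """Illustrative flight-recorder-style logger for a robot.

    Keeps a bounded, time-stamped log of sensor samples and key
    decisions so the moments before an incident can be replayed.
    (All names and fields here are hypothetical.)
    """

    def __init__(self, capacity=10000):
        # Bounded buffer: the oldest entries are dropped first,
        # like the fixed-duration loop of a flight data recorder.
        self._log = deque(maxlen=capacity)

    def record(self, kind, payload):
        """Append a time-stamped entry, e.g. kind='decision'."""
        self._log.append({"t": time.time(), "kind": kind, "data": payload})

    def replay(self, since=0.0):
        """Return entries recorded at or after the given timestamp."""
        return [e for e in self._log if e["t"] >= since]

# Example: log a sensor sample, then the decision it triggered.
ebb = EthicalBlackBox(capacity=1000)
ebb.record("sensor", {"motion_detected": False})
ebb.record("decision", {"action": "call_for_help", "reason": "no_motion"})
```

An investigator could then call `replay()` with the approximate time of the accident to reconstruct what the robot sensed and decided.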
But accident investigation is a human process of discovery and reconstruction. So in this project we will be designing and running three staged (mock) accidents, each covering a different application domain: assisted living robots, educational (toy) robots, and driverless cars. In these scenarios we will be using real robots and will be seeking human volunteers to act in three roles, as the subject(s) of the accident, witnesses to the accident, and as members of the accident investigation team. Thus we aim to develop and demonstrate both technologies and processes (and ultimately policy recommendations) for robot accident investigation. And the whole project will be conducted within the framework of Responsible Research and Innovation; it will, in effect, be a case study in Responsible Robotics.
Dhillon BS (1991) Robot Accidents. In: Robot Reliability and Safety. Springer, New York, NY.
Macrae C (2014) Close Calls: Managing Risk and Resilience in Airline Flight Safety. Palgrave Macmillan.
Winfield AFT and Jirotka M (2017) The Case for an Ethical Black Box. In: Gao Y, Fallah S, Jin Y, Lekakou C (eds) Towards Autonomous Robotic Systems. TAROS 2017. Lecture Notes in Computer Science, vol 10454. Springer, Cham.
Senior Research Fellow Severin Lemaignan, from the Bristol Robotics Laboratory, took a team of students and plenty of robots to Bristol’s Teen Tech Fair earlier this month.
Teen Tech festivals pop up across the UK to inspire the innovators of tomorrow: teenagers! On Thursday 10th October, local businesses turned up to Bristol’s Pavilion to help young people understand the opportunities in the science, technology and engineering industries.
Lemaignan was enthusiastic about how his robotic programming activity was received. “About 60 children came and visited us. They all went through a bit of robot exploration with the Thymios, trying to guess their different behaviours, and relate them to the sensors and actuators that the robots have; followed by a short introduction to programming with the Vectors: how can we get the robot to avoid a wall?”
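The wall-avoidance question the children worked on boils down to a simple sensor-to-motor rule. As a hedged sketch (these robots are normally programmed in their own visual environments; the function name, sensor scale and speeds below are invented for illustration):

```python
def avoid_wall(front_sensor, threshold=2000):
    """Map a front proximity reading to (left_speed, right_speed).

    If the reading exceeds the threshold an obstacle is close, so
    the robot turns in place; otherwise it drives straight ahead.
    Units and values are illustrative, not any robot's real API.
    """
    if front_sensor > threshold:
        return (-100, 100)  # obstacle close: spin to the left
    return (200, 200)       # path clear: drive forward
```

Calling this in a loop with live sensor readings is all it takes for the robot to wander without hitting walls, which is exactly the behaviour-guessing exercise the children did with the proximity sensors.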
Students Ranvir Bhogal, Bethany Mackey and Jiangyin Sun helped facilitate the short, 15-minute activities.
“All of the instructors, activity leaders and ambassadors were tirelessly energetic with infectious enthusiasm. They used language to explain concepts to the pupils in an accessible way. Not all of mine are regular users of technical vocabulary but I felt that they understood all that they needed to and learnt loads! They have come away inspired and really excited about entering the TeenTech Awards. I also had a lovely day!”
Comment from a Teacher who attended.
You can find out more about Teen Tech below and read the report from the day here.
On Tuesday 21st May, an autonomous vehicle carried older people around the St Monica Trust’s Cote Lane retirement village, bringing a world-leading research project to a close.
The £5.5M project, Flourish, is delivered by a consortium of organisations including UWE Bristol and is the only Innovate UK-funded project focused on older people. Launched in 2016, the project aimed to develop a driverless vehicle integrating older people’s mobility needs with a secure and connected infrastructure.
The project works across three specialist areas at UWE Bristol, including the Bristol Robotics Laboratory. The demonstration explored how driverless vehicles, known as CAVs (connected and autonomous vehicles) could make a difference to older people’s everyday lives.
Professor Alan Winfield will be starting a new five-year EPSRC-funded project with Professor Marina Jirotka (University of Oxford), staging mock human-robot accidents in order to explore in depth the problem of robot accident investigation and develop both technical solutions (i.e. data logging and explainer systems) and process solutions (i.e. frameworks for responsibly conducting such investigations).
The team will explore three scenarios, likely to be assisted living (care) robots, robot toys and autonomous vehicles, with human volunteers role-playing as the subjects of, witnesses to, and investigators of the accident. Alan believes this will be the first research project in the world to fully and systematically study this important aspect of real-world robotics.
Through these two activities, RIFBristol attracted businesses from a diverse range of sectors to its dedicated workspace in the Bristol Robotics Laboratory (from civil engineering and manufacturing, to the brewing and creative industries).