Kids with special needs visit the BRL

Posted on

Katie Sparkes from the Lightyear Foundation thanks Severin Lemaignan and his team for enabling the special educational needs (SEN) trip to the Bristol Robotics Laboratory in January.

The Lightyear Foundation works hard to break down the barriers to getting more disabled people into Science, Technology, Engineering, Maths and Medicine. One of the ways they inspire children with SEN is through work inspiration trips.

This is what New Fosseway School had to say about the trip:

“What a unique experience for our students and interesting place to visit! It was a real delight watching them so interested in all the different robots from the very tiny to the huge car simulators.

They were especially interested in the social robots designed to help disabled people. Being able to have a go and manipulate some of the robots was really exciting, and they also enjoyed the coding session where they got to program some of the robots.

The trip most definitely inspired curiosity!”

Jo Payne, Transitions Lead, New Fosseway School.

Thanks to Severin, this trip has opened up the possibility of more SEN schools visiting the BRL. Hopefully, schools will be back in the summer term and these visits can go ahead!

Machine Vision Impacts Farming

Posted on

Technology from the Centre for Machine Vision (CMV) has been making moves to improve animal welfare and maximize crop harvesting.

Herdvision

First off, Herdvision, a 3D imaging system that helps farmers assess cows’ wellbeing, was featured on the BBC Six O’Clock News in 2019 as it began a trial with Arla UK 360 farmers.

The technology, developed in collaboration with Kingshay and AgsenZe, uses visual monitoring, data recording and automated intelligence to identify changes in each cow’s physical wellbeing, mobility and weight before they are visible to the human eye.

Facial recognition used to assess pigs’ emotions

Animal behaviourists from Scotland’s Rural College in Edinburgh are using technology provided by machine vision experts at UWE to capture a range of pig facial expressions. The hope is that emotions can be identified and facial recognition used to improve pig welfare.

The BBC reported on the study in spring last year and the work is due to appear as part of a Netflix program in 2020.

Harvest Eye

The potato harvester-based data capture system – HarvestEye – provides insight on size, count and crop variation in unwashed potatoes as they are harvested. The integrated data analytics show precisely what is being lifted and from where in the field – insights that will help maximise marketable yield and reduce crop imbalance.

The technology’s utility was recognised at the Potato Industry Event 2019/20, when it picked up second prize (out of 15 nominations).

HarvestEye was developed by CMV for B-hive, who then patented the technology in collaboration with CMV. B-hive / Branston have now established a new company, HarvestEye Ltd, to supply the HarvestEye technology to Grimme, a major manufacturer of root crop harvesters.

But the team at CMV aren’t stopping there.

“We’re working on a new funding bid right now to add functionality.”

Melvyn Smith, CMV

Mark Hansen, who led development of the technology, represented CMV as part of the team that picked up the award.

Robots. What could possibly go wrong?

Posted on

A new project studies how to investigate accidents with social robots; Alan Winfield explains why this is needed…

Originally posted on September 17th, 2019 by Alan Winfield on his blog.

Imagine that your elderly mother, or grandmother, has an assisted living robot to help her live independently at home. The robot is capable of fetching her drinks, reminding her to take her medicine and keeping in touch with family. Then one afternoon you get a call from a neighbour who has called round and found your grandmother collapsed on the floor. When the paramedics arrive they find the robot wandering around apparently aimlessly. One of its functions is to call for help if your grandmother stops moving, but it seems that the robot failed to do this.

Fortunately your grandmother recovers, but the doctors find bruising on her legs, consistent with the robot running into them. Not surprisingly, you want to know what happened: did the robot cause the accident? Or maybe it didn’t cause it but made matters worse. And why did it fail to raise the alarm?

Although this is a fictional scenario, it could happen today. If it did, you would be totally reliant on the goodwill of the robot manufacturer to discover what went wrong. Even then you might not get the answers you seek; it’s entirely possible the robot and the company that made it are just not equipped with the tools and processes to undertake an investigation.

Right now there are no established processes for robot accident investigation. 

Of course accidents happen, and that’s just as true for robots as for any other machinery [1].

Finding statistics is tough, but this web page shows serious accidents with industrial robots in the US since the mid-1980s. Driverless car fatalities, of course, make the headlines: there have been five (that we know about) since 2016. But we have next to no data on accidents in human-robot interaction (HRI) – that is, for robots designed to interact directly with humans. Here is one – a security robot – that happened to be reported.

But a Responsible Roboticist must be interested in all accidents, whether serious or not. We should also be very interested in near misses; these are taken very seriously in aviation [2], and there is good evidence that reporting near misses improves safety.

So I am very excited to introduce our 5-year EPSRC funded project RoboTIPS – responsible robots for the digital economy. Led by Professor Marina Jirotka at the University of Oxford, we believe RoboTIPS to be the first project with the aim of systematically studying the question of how to investigate accidents with social robots.

So what are we doing in RoboTIPS?

First we will look at the technology needed to support accident investigation.

In a paper published two years ago, Marina and I argued the case for an Ethical Black Box (EBB) [3]. Our proposition is very simple: that all robots (and some AIs) should be equipped by law with a standard device which continuously records a time-stamped log of the internal state of the system, key decisions, and sampled input or sensor data – in effect, the robot equivalent of an aircraft flight data recorder. Without such a device, finding out what the robot was doing, and why, in the moments leading up to an accident is more or less impossible. In RoboTIPS we will be developing and testing a model EBB for social robots.
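As a rough illustration of the idea (not the actual RoboTIPS design – all names and fields here are hypothetical), an EBB can be thought of as an append-only, time-stamped log that the robot writes to continuously and an investigator can later read back:

```python
import json
import time


class EthicalBlackBox:
    """Minimal sketch of an Ethical Black Box: an append-only,
    time-stamped log of internal state, decisions and sensor samples."""

    def __init__(self):
        self._records = []

    def log(self, kind, payload):
        """Record an event ('state', 'decision' or 'sensor') with a timestamp."""
        self._records.append({
            "t": time.time(),
            "kind": kind,
            "payload": payload,
        })

    def dump(self):
        """Export the full log for an accident investigator, oldest first."""
        return json.dumps(self._records, indent=2)


# Example: the robot logs what it sensed and what it decided.
ebb = EthicalBlackBox()
ebb.log("sensor", {"proximity_cm": 12})
ebb.log("decision", {"action": "stop", "reason": "obstacle ahead"})
```

After an accident, replaying such a log in order is what lets an investigator reconstruct what the robot was doing, and why, in the moments that matter.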

But accident investigation is a human process of discovery and reconstruction. So in this project we will be designing and running three staged (mock) accidents, each covering a different application domain: assisted living robots, educational (toy) robots, and driverless cars.

In these scenarios we will be using real robots and will be seeking human volunteers to act in three roles: as the subject(s) of the accident, as witnesses to the accident, and as members of the accident investigation team.

Thus we aim to develop and demonstrate both technologies and processes (and ultimately policy recommendations) for robot accident investigation. And the whole project will be conducted within the framework of Responsible Research and Innovation; it will, in effect, be a case study in Responsible Robotics.

References:

[1] Dhillon BS (1991) Robot Accidents. In: Robot Reliability and Safety. Springer, New York, NY
[2] Macrae C (2014) Close Calls: Managing Risk and Resilience in Airline Flight Safety. Palgrave Macmillan.
[3] Winfield AFT and Jirotka M (2017) The Case for an Ethical Black Box. In: Gao Y, Fallah S, Jin Y, Lekakou C (eds) Towards Autonomous Robotic Systems. TAROS 2017. Lecture Notes in Computer Science, vol 10454. Springer, Cham.

Bristol Technology Showcase this Friday!

Posted on

The one-day conference and expo, coming to Aerospace Bristol this Friday (8th November), focuses on how new and emerging technologies will affect businesses and wider society.

Industry-leading experts will take part in panel discussions and lead talks about the future of various technologies and industries, while local Bristol tech companies showcase their work in the expo.

Find out more on the Bristol Technology Showcase website.

Find discounted tickets on Eventbrite here.

And here’s a video from one of the speakers, who is leading a session on the future of vertical farming.

UWE introduce teenagers to robots and programming

Posted on

Senior Research Fellow at the Bristol Robotics Laboratory, Severin Lemaignan, took a team of students and plenty of robots to Bristol’s Teen Tech Fair earlier this month.

Teen Tech festivals pop up across the UK to inspire the innovators of tomorrow: teenagers! On Thursday 10th October, local businesses turned up to Bristol’s Pavilion to help young people understand the opportunities in the science, technology and engineering industries.

Lemaignan was enthusiastic about how his robotic programming activity was received. “About 60 children came and visited us. They all went through a bit of robot exploration with the Thymios, trying to guess their different behaviours, and relate them to the sensors and actuators that the robots have; followed by a short introduction to programming with the Vectors: how can we get the robot to avoid a wall?”
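In the workshop the robots are programmed through their own environments, but the idea behind wall avoidance can be sketched in a few lines of plain Python (the sensor scale and speed convention here are illustrative assumptions, not the Thymio or Vector API):

```python
def avoid_wall(front_proximity, threshold=0.5):
    """Decide wheel speeds from a front proximity reading in [0, 1],
    where 1.0 means the wall is right against the sensor.

    Returns (left_speed, right_speed): drive straight while the way
    is clear, otherwise turn in place until the reading drops."""
    if front_proximity < threshold:
        return (1.0, 1.0)   # path clear: both wheels forward
    return (1.0, -1.0)      # wall detected: spin to face away


# The robot's control loop would call this every few milliseconds:
assert avoid_wall(0.1) == (1.0, 1.0)   # far from the wall: keep going
assert avoid_wall(0.9) == (1.0, -1.0)  # close to the wall: turn
```

Guessing a rule like this from the robot’s behaviour, then writing it down, is exactly the sensors-to-actuators reasoning the activity aims to teach.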

Students Ranvir Bhogal, Bethany Mackey and Jiangyin Sun helped facilitate the short 15-minute activities.

“All of the instructors, activity leaders and ambassadors were tirelessly energetic with infectious enthusiasm. They used language to explain concepts to the pupils in an accessible way. Not all of mine are regular users of technical vocabulary but I felt that they understood all that they needed to and learnt loads! They have come away inspired and really excited about entering the TeenTech Awards. I also had a lovely day!”

Comment from a teacher who attended.

You can find out more about Teen Tech below and read the report from the day here.

First UK study into driverless cars for older people draws to a close

Posted on


On Tuesday 21st May an autonomous vehicle was used by older people around the St Monica Trust’s Cote Lane retirement village, bringing a world-leading research project to a close.

The £5.5M project, “Flourish”, was delivered by a consortium of organisations including UWE Bristol and was the only Innovate UK-funded project focused on older people. Launched in 2016, the project aimed to develop a driverless vehicle integrating older people’s mobility needs with a secure and connected infrastructure.

The project worked across three specialist areas at UWE Bristol, including the Bristol Robotics Laboratory. The demonstration explored how driverless vehicles, known as connected and autonomous vehicles (CAVs), could make a difference to older people’s everyday lives.

More information is available through UWE Bristol news.

Developing Responsible Robots for the Digital Economy

Posted on

Professor Alan Winfield will be starting a new five-year EPSRC-funded project with Professor Marina Jirotka (University of Oxford), staging mock human-robot accidents in order to explore in depth the problem of robot accident investigation and develop both technical solutions (i.e. data logging and explainer systems) and process solutions (i.e. frameworks for how to responsibly conduct such investigations).

The team will explore three scenarios, likely to be assisted living (care) robots, robot toys and autonomous vehicles, with human volunteers role-playing as the subjects of, witnesses to, and investigators of the accident. Alan believes this will be the first research project in the world to fully and systematically study this important aspect of real-world robotics.

Robotics Workshop for Businesses

Posted on

Engineers from the Robotics Innovation Facility (RIFBristol) recently delivered two workshops for businesses interested in exploring the potential benefits of robotics and automation. 

As part of the Facility’s new £1m ERDF-funded initiative, the SABRE Programme, a free two-day ‘Introduction to Robotics’ workshop was offered to small and medium-sized enterprises operating in the West of England. The same introductory workshop was also delivered exclusively for clients of the Somerset Energy Innovation Centre by PhD students from the FARSCOPE Centre for Doctoral Training.

Through these two activities, RIFBristol attracted businesses from a diverse range of sectors to its dedicated workspace in the Bristol Robotics Laboratory (from civil engineering and manufacturing, to the brewing and creative industries).

Work up to running 5 km with the Pepper Robot!

Posted on

In this guest post, Katie Winkle of the Bristol Robotics Laboratory tells us about an opportunity to get involved in her exciting research.

I’m going to be running an ambitious (and exciting!) research study over the summer period and am putting out a first call for participants. This will be my final big study as a PhD student and brings together all of my previous work to date – some of you may have taken part in my previous experiments e.g. doing wrist turns with Pepper or arm exercises with the NAO robot. I would really appreciate it if you would consider taking part and/or share with friends and family etc. who may be interested. 

We will be installing a Pepper robot (pictured below, from one of my previous studies) in the Wallscourt Gym here on campus and using it to guide people through the NHS-designed ‘Couch to 5k’ programme (https://www.nhs.uk/live-well/exercise/get-running-with-couch-to-5k/), which helps people work up to running 5 km. Over the course of the programme, we will be investigating the use of supervised machine learning to have a human fitness instructor train the robot on how to be an encouraging ‘coach’.

The programme is made up of 3x ~30 minute exercise sessions per week over 9 weeks – so participants should be available and able to visit campus over the summer period (approx. 3rd June up until 18th August), but there is time built in for participants to take a week or two off for holidays etc. We will make the robot and exercise instructor available at set times each week and set up an online booking system for participants to choose slots from. One reason for starting to recruit now is so that we can make sure these time slots work for as many people as possible.

An initial information sheet is attached above, but essentially we are looking for participants who are:

– over 18, fluent in English, with no health conditions that might prevent engaging in the Couch to 5K programme

– generally available and able to attend 3x ~30 minute weekly exercise sessions at the Wallscourt Gym on Frenchay Campus from early June to mid-August

– interested in signing up for a long-term exercise programme to get running!

If you are interested in taking part, please drop me an email and/or go ahead and complete this poll to give an idea of what days/times you might be available to work out! https://forms.gle/PG7zjHA1DVBqUmEx9