UWE Bristol Active Living Architecture: Controlled Environment (ALICE) project selected to be showcased on EU Innovation Radar Website

Posted on

The Active Living Architecture: Controlled Environment (ALICE) project has been recognised by the European Commission’s Innovation Radar team as an Innovation Highlight and will be showcased on their website.

The project, which follows on from the Living Architecture research programme, is a joint venture between UWE Bristol, Newcastle University and Translating Nature.

The aim of ALICE is to familiarise sustainably minded promoters, such as architects, designers, engineers, “green” businesses and their clients, with the use of live microbes as processors of waste within our homes and cities, and to encourage them to advocate for it.

ALICE aims to provide a publicly accessible interface that is activated by household waste, namely urine and grey water. It exploits the properties of the integrated bioreactor system developed for the Living Architecture (LIAR) project, creating a usable context and habitat that can be exhibited at biennales or festivals and explored by these audiences. ALICE catalyses a conversation about the future of sustainability in homes and public buildings, as well as the lifestyle changes implicit in adopting this new generation of utilities.

ALICE is a highly personal experience through which ‘users’ may understand how waste can be dealt with differently in the home by putting it to good use. ALICE takes the form of a cabin, and through a digital interface that translates data into graphical animations, participants will be able to see how their waste ‘enlivens’ the cabin’s performance, for example by turning on LEDs or charging small mobile devices.

Conceptually, ALICE may be likened to the Tamagotchi, a digital pet that flourishes through its owner’s care and attention. In this way, ‘care’ for ALICE takes the form of feeding it and engaging with audiences. The system will also collect data to help the innovators better understand the performance and potential usage of such a system outside the laboratory, so that appropriate prototypes for market can be developed.

UWE Bristol lead for the project Ioannis Ieropoulos, Professor of Bioenergy and Self-Sustainable Systems and Director of the Bristol BioEnergy Centre at the Bristol Robotics Laboratory, commented on the project: “We are delighted with this recognition by the European Commission, which is an important milestone in our endeavour to make this technology widely available. The work of our partners has enabled the successful translation of a complex technology into a visual representation that is highly appealing to a wide audience, and this could only have been achieved through open-minded collaboration. We very much look forward to seeing this installed in everyone’s home.”

Congratulations to Ioannis and the team for the recognition of their project.

Case study: How happy is your pig?

Posted on

Professor Melvyn Smith was recently interviewed by KTN about his research into the emotional state of pigs. The case study below was written by Alan Cowie at KTN as part of their annual report:

Innovation in agriculture has advanced significantly over the past century. In 1920 it would take a farmer an hour and a half to till one acre of land; in 2020, it takes five minutes. And it’s not just technology that has effected change: ever-changing societal attitudes continue to revolutionise not only agriculture but other industries too.

Today’s farmer is interested not only in their animals’ physical health but also in their emotional wellbeing. We’re not pretending these animals are not being reared for food, but we all have a responsibility to ensure animals are content, happy and healthy throughout their lives, and healthier animals deliver higher yields.

One person who is doing that more than most is Mel Smith, a professor at the Centre for Machine Vision (CMV), based at the Bristol Robotics Laboratory, which is jointly run by the University of the West of England (UWE) and the University of Bristol. It was originally set up in 1999 to study industrial inspection, metrology, surface analysis and quality control. Over the years, and with extensive support from KTN, the CMV has completed projects in defence, health and, more recently, AgriFood. Projects have included an EPSRC[1]-funded trial using 3D imaging technology for facial recognition, examining the colon and the oesophagus for tumours and polyps, and 2D imaging of grass fields using a convolutional neural network to locate and identify species of weed.

Where Mel wants to make a real impact is in animal welfare. “Tagging a pig’s ear can cause pain and distress to the pig,” explains Mel. “Tags can also get ripped off and they get dirty. So what if there was a way of identifying the pig without even touching it?” This is where Mel’s photometric stereo technology comes in. In a recent trial[2], a drinker was adapted and fitted with a motion-activated webcam, which takes thousands of pictures of the pigs’ faces every day, feeding a computer algorithm that successfully identifies the animal with 97% accuracy. But this goes beyond facial recognition. Mel believes his work shows that pigs are revealing their emotional state through facial expression. Are they happy? Are they content? Are they nervous?

“You can interrogate the neural network to ask it which parts of the image it’s using to tell whether it’s a happy face or not. It produces a heat map showing the areas of the face it’s using to assess happiness. For pigs’ faces, it is around the eyes, ears and the top of the snout which relate to expression.” 
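The heat-map probing Mel describes can be pictured with a simple occlusion-sensitivity sketch: slide a grey patch across the image and record how much the classifier’s score drops at each position. The snippet below only illustrates that general idea, with a toy “happiness” scorer standing in for the CMV’s actual network, which is not published here; it is not their code.

```python
import numpy as np

def occlusion_heatmap(image, score_fn, patch=8, stride=4):
    """Slide a grey patch over the image; where the score drops most,
    the classifier is relying most heavily on that region."""
    h, w = image.shape
    base = score_fn(image)
    heat = np.zeros(((h - patch) // stride + 1, (w - patch) // stride + 1))
    for i, y in enumerate(range(0, h - patch + 1, stride)):
        for j, x in enumerate(range(0, w - patch + 1, stride)):
            occluded = image.copy()
            occluded[y:y + patch, x:x + patch] = image.mean()  # grey out patch
            heat[i, j] = base - score_fn(occluded)             # score drop
    return heat

# Toy scorer: "happiness" depends only on the top-left (eye) region.
def toy_score(img):
    return img[:16, :16].mean()

img = np.zeros((32, 32))
img[:16, :16] = 1.0                      # bright "eye" patch
heat = occlusion_heatmap(img, toy_score)
# The hottest cells should cluster in the top-left of the map.
print(np.unravel_index(heat.argmax(), heat.shape))
```

With a real network, `score_fn` would be a forward pass; the resulting map is exactly the kind of “which parts of the face matter” picture described in the quote.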

Mel has been collaborating with other researchers on the potential of taking existing technologies and applying them in new ways. In one example, he explains how a system originally designed to analyse aggregate particles in the construction industry has found new uses in agriculture, checking the body condition score of livestock, a measure of the health and welfare of animals. The system involves a camera that takes a normal image and a 3D depth image; mounted looking down, it captures data as the cow walks underneath. “We’re looking at how bony the animal is – around its hindquarter, where you have its hook and pin bones. If they’re sticking through, they have a low body condition and if they’re nice and fat and rounded, they have a high body condition.”
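The body-condition measurement can be pictured as a depth-profile analysis: take a slice of the depth image across the hindquarter and ask how far it departs from a smooth, rounded shape. The sketch below is a hypothetical illustration of that idea only, not the system described above; the “boniness” proxy and all numbers are invented for the example.

```python
import numpy as np

def boniness(depth_profile, window=5):
    """Toy proxy for body condition: smooth the depth slice with a moving
    average, then measure how sharply the real profile departs from it.
    Bony animals (protruding hook/pin bones) score high; rounded ones low."""
    kernel = np.ones(window) / window
    smooth = np.convolve(depth_profile, kernel, mode="same")
    return float(np.abs(depth_profile - smooth).mean())

x = np.linspace(-1, 1, 101)
rounded = 1 - x**2          # smooth, well-conditioned back
bony = rounded.copy()
bony[30] += 0.3             # protruding hook bone
bony[70] += 0.3             # protruding pin bone

# The bony profile should score clearly higher than the rounded one.
assert boniness(bony) > boniness(rounded)
```

A production system would work on full 3D surfaces and a calibrated scoring scale, but the underlying signal is the same: local sharp departures from a smooth body surface.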

Happier animals are more productive and deliver higher yields, so there is a commercial advantage as well as a social one. In an industry where profit margins are often very tight, new practices which promote efficiency or boost productivity are usually welcomed. We may be some way off seeing widespread livestock facial recognition on all farms, but attention to our environment and ecology is only increasing. Who knows where we’ll be in the next century? We already have ‘free range’ and ‘organic’ stickers on our food. Will we have ‘certified pig happy’ too?

Be it for commercial or animal welfare benefits, it’s clear Mel is passionate about using this technology for good. Mel says “It’s about finding a niche where we can make a contribution and machine vision technology has real value for the wellbeing of animals. If we can be at the forefront of this, and do something that’s cutting edge, that’s quite a motivation for me.”

Whilst many of the technologies Mel describes are not necessarily new, they are being applied in novel ways, and KTN has played a key role in creating new opportunities and new connections for Mel.

“KTN have had a transformational impact on helping us to deploy our Machine Vision skills to collaborate with agriculture and food industry partners. We have really benefitted from the ability to network through KTN. Their funding expertise and knowledge of the AgriFood industry have led us to many new innovation opportunities that we would not have identified ourselves. Several of these projects have resulted in products that are now reaching a commercial stage.”


[1] Engineering and Physical Sciences Research Council

[2] Watch “Connected – the hidden science of everything”, episode one, on Netflix.

Alan Cowie is the Partnership, PR and Communications Lead at KTN.

UWE Bristol’s Centre for Machine Vision receive funding to create augmented reality picking aid for farm workers

Posted on

UWE Bristol’s Centre for Machine Vision team and other members of a consortium have been awarded Innovate UK funding to develop a low-cost, augmented reality picking aid that will display information about berry maturity through the use of machine learning and spectral imaging cameras.

The consortium includes AR developers Opposable Games; environment, food and science research organisation NIAB EMR; and leading industry grower-owned co-operative Berry Gardens Growers Limited, alongside the Centre for Machine Vision, which is part of the Bristol Robotics Laboratory.

The concept and commercial opportunity were identified by Richard Harnden, Director of Research at Berry Gardens Growers Ltd, who has wanted for several years to improve the consistency of the eating quality of the co-operative’s premium berry lines, which include a sweet dessert blackberry.

“It is very hard for pickers, especially new pickers, to really understand the correct stage of ripeness in the blackberry before picking it”, he said.  “Pick it too early and, although the berry will be black in colour, it won’t have accumulated enough sugars and so it will still taste acidic. Pick it too late, and the berry will be too soft to withstand the supply chain and will leak juice in the punnet.” 

He continued, “There is a small correct window for picking the fruit that delivers an exquisite combination of sweetness and flavour, which can be done by eye but it takes time for pickers to achieve the correct level of perception. The proposed picking aid, using novel technology, will deliver a maturity indicator, which will guide new and experienced pickers alike to quickly make the right decision every time.”      

Bo Li, a machine vision specialist in the Centre for Machine Vision at Bristol Robotics Laboratory at UWE Bristol, who devised the project, said: “By developing a low cost multispectral camera for detecting the real time ripeness of fruit, we can enhance the efficiency of picking, reduce the requirement for pickers to be experienced, and shorten the training time required. This step forward will improve the consistency of fruit quality and customer satisfaction.”
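Bo Li’s maturity indicator is not specified in detail, but multispectral ripeness detection generally rests on the fact that reflectance in some wavelength bands changes faster with ripening than in others, so a normalised ratio of band readings can serve as an index. The sketch below illustrates that principle only; the band names, thresholds and advice labels are invented for the example and are not the project’s actual model.

```python
def ripeness_index(red, nir):
    """Normalised-difference index from two spectral band readings
    (reflectance in the 0-1 range). Purely illustrative, not calibrated."""
    return (nir - red) / (nir + red)

def picking_advice(red, nir, pick_low=0.2, pick_high=0.6):
    """Map the index onto the 'small correct window' the grower describes.
    The thresholds here are invented for illustration."""
    idx = ripeness_index(red, nir)
    if idx < pick_low:
        return "too early"   # sugars not yet accumulated; still acidic
    if idx > pick_high:
        return "too late"    # berry too soft for the supply chain
    return "pick now"

print(picking_advice(red=0.30, nir=0.50))  # → pick now
```

In the proposed aid, a reading like this would be computed per berry from the camera feed and surfaced to the picker through the AR display.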

In an industry already experiencing difficulties in accessing experienced staff, the impact of Covid-19 is putting additional strain on farms and farm workers. Restrictions on labour movement, new safety measures and risk-mitigation procedures mean that the horticultural and agricultural industries must look to novel solutions to train new workers and meet existing and future labour requirements. Global demand for high-quality, healthy food such as soft fruit is increasing, and to meet it farms are looking to technological solutions that increase quality, yields and productivity while reducing environmental impacts. This project will contribute towards the UK government’s Transforming Food Production objectives, part of the Industrial Strategy Challenge Fund.

The Innovate UK funded project will commence in September 2020, with the development of a prototype device building on the experience of the consortium, then moving on to field trials.  Members of Berry Garden Growers Ltd will trial the harvesting aid on their farms as the project progresses.

The Centre for Machine Vision is part of the Bristol Robotics Laboratory (BRL). It solves real-world practical computer vision problems, with particular excellence in three-dimensional reconstruction and surface inspection. It is recognised as one of only three UK centres with expertise in Photometric Stereo (PS), which it has pioneered in industry, medicine and defence/security. Its laboratory supports REF (Research Excellence Framework) level research activities and research-led teaching in machine vision. Find out more here.

Bristol Robotics Laboratory and Future Space trial Robot Tours

Posted on

Future Space, in partnership with the Bristol Robotics Laboratory (BRL), recently trialled an innovative new approach to providing tours of its facility, enabling people to view its workshop, laboratory and networking spaces from the comfort of their own homes and offices.

Using their personal IT devices to remotely control the movements of a self-driving, two-wheeled videoconferencing robot, potential new Future Space members were given the freedom to explore the unique, state-of-the-art space, while also being able to communicate with staff through a live video link.

Developed by Double Robotics Inc, this exciting technology helps people to feel more connected to colleagues, friends or patients, by having a physical presence, even if they are unable to attend an event or meeting in person. The robot is involved in several UWE Bristol research projects currently underway at BRL.

“We start by co-designing and trialling the technology in our purpose-built Assisted Living Studio,” says Professor Praminda Caleb-Solly, BRL’s Assistive Robotics and Intelligent Health Technologies lead. “We develop, test and implement various assistive robots and heterogeneous sensor systems in this realistic environment before taking them into real-world settings. The next stage, as we are doing with the Double telepresence robot, is evaluating its use in health and social care settings. We are particularly interested in how it can allow nurses, social workers and doctors to remotely interact with patients and are exploring this as part of our partnership with North Bristol Trust.”

Read the full story.

Future Space resident Homelync joins forces with Aico to expand market leading social landlord Internet of Things platform

Posted on

Homelync, who started in the Bristol Robotics Laboratory (BRL) Hardware Incubator before graduating to Future Space, have been acquired by Aico.

Homelync are an award-winning, innovative technology firm specialising in smart home integration and analytics technology. With industry-leading expertise in the Internet of Things (IoT), software development and integration, the Homelync team are at the forefront of this progressive market. Homelync have widely rolled out integrated IoT solutions for social landlords across the UK to help them tackle challenges associated with cost savings, maintenance efficiency, decarbonisation, tenant safety, fuel poverty and social care. This is done by leveraging an ecosystem of leading IoT devices, including temperature, humidity and CO2 sensors, fire-safety and water-leak detectors, and boiler and energy monitors.
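As a concrete illustration of the kind of analytics such an IoT platform can run, temperature and humidity readings alone are enough to flag condensation and damp risk, a common concern for social landlords. The sketch below is generic, not Homelync’s product logic; it uses the standard Magnus dew-point approximation with an invented safety margin.

```python
import math

def dew_point_c(temp_c, rel_humidity):
    """Magnus approximation to the dew point (degrees C).
    rel_humidity is a percentage, e.g. 65.0."""
    a, b = 17.62, 243.12
    gamma = (a * temp_c) / (b + temp_c) + math.log(rel_humidity / 100.0)
    return (b * gamma) / (a - gamma)

def damp_risk(room_temp_c, rel_humidity, wall_temp_c, margin_c=1.0):
    """Flag a room if a cold surface sits within `margin_c` of the dew
    point, i.e. condensation (and eventually mould) is likely."""
    return wall_temp_c <= dew_point_c(room_temp_c, rel_humidity) + margin_c

# A 20 C room at 75% RH has a dew point around 15 C, so a 14 C wall is at risk.
print(damp_risk(room_temp_c=20.0, rel_humidity=75.0, wall_temp_c=14.0))
```

A landlord-facing platform would run this kind of check continuously over streams of readings per room; the point is that even two inexpensive sensors support actionable analytics.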

Aico is the UK market leader in domestic Fire and Carbon Monoxide protection, pioneering new technologies and offering high-quality Fire and Carbon Monoxide alarms. Known for their focus on education, quality, service and innovation, they are the premium brand in social housing. After the successful roll-out of their popular SmartLINK Gateway, and in response to customer demand, they have ambitious plans to expand further into the social housing Internet of Things (IoT) and connected-home market.

This acquisition represents a significant step forward for IoT in social housing at a time of rapid market growth. It gives landlords the opportunity to invest in IoT over a longer-term investment cycle, adding a layer of confidence that couples Homelync’s innovation with the resources of a well-established brand that is trusted in the sector. Aico’s 30-year heritage provides great strength for all landlords.

Read the full story here. The BRL Hardware Incubator and Future Space are part of the University Enterprise Zone. They connect entrepreneurs and tech innovators with scientists, researchers and graduate talent – to spark collaboration, innovation and growth. Find out more here.

Bristol Robotics Laboratory manufacture visors for NHS staff

Posted on

The original post appeared on the UWE Bristol website.

Technicians at Bristol Robotics Laboratory (BRL) are using laser cutting technology to produce protective visors for NHS staff during the coronavirus pandemic.

A team are manufacturing an initial batch of 200 for staff working at Avon and Wiltshire Mental Health Partnership (AWP) NHS Trust. They plan to expand production with the support of 3D printing facilities and technicians across three UWE Bristol faculties.

The cleanable visors are being created using an approved design by University College London. The team also plans to manufacture surgical mask straps, which help prevent masks rubbing against the ears of clinical staff.

Gareth Griffiths, a Senior Engineering Manager in BRL’s Robotics Innovation Facility (RIF), said: “The trust approached us asking if we could supply PPE and we were very happy to use our facilities and expertise to help with their request.

“The visors can be made very quickly, with the manufacture process taking about two-and-a-half minutes for each visor. They are made from smooth laser-cut plastic so they can be easily cleaned and reused if necessary.”

Read the full post here.

UWE Researchers test driverless pods at The Mall Cribbs Causeway

Posted on

Adapted from this UWE Bristol news article.

Researchers from the Centre for Transport and Society and the Bristol Robotics Laboratory (BRL) at UWE Bristol are currently partners on the Capri Project, the first UK project to trial driverless pods on public roads.

From 20 to 25 January, the driverless pods were at The Mall, Cribbs Causeway, transporting members of the public, enabling them to experience connected and autonomous vehicles (CAVs) and understand how they might operate in the future.

Capri is a consortium comprising 17 partners, including lead organisation AECOM, South Gloucestershire Council and UWE Bristol. The Capri trial is the first in the UK to invite members of the public to turn up and travel alone in an autonomous pod, without a supervisor on board.

The research from this trial will help reduce potential barriers limiting the uptake of commercially ready autonomous vehicle services. This includes overcoming technical challenges, raising public awareness and ensuring sustainable integration into the wider transport system. The pilot will support the local and UK economy by helping regional and national businesses become more competitive in a growing international market.

Read the full article here.

UWE Academics help in public trial of driverless pods

Posted on

As part of a research project involving UWE Bristol roboticists, driverless pods helped transport members of the public around London’s Queen Elizabeth Olympic Park.

The project aims to pave the way for the use of connected and autonomous vehicle (CAV) transport services at public transport hubs and around private estates, including tourist and shopping centres, hospitals, business parks and airports.

With Queen Elizabeth Olympic Park already a testbed for smart mobility activity, alongside a wide range of other innovation projects, an important element of this trial assessed people’s behaviours and attitudes towards driverless pods. With little existing research on how people interact with CAVs in public spaces, representatives from UWE Bristol and Loughborough University observed how people behaved when confronted by the pods, as well as surveying passengers who took a ride on them.

Conducting the trial in the park allowed the UWE Bristol team to speak to users of the park to explore how they felt about the pods being in the same space, and if that raised concerns. Talking to groups such as cyclists, e-scooter users and families provided feedback on how accepting the public might be of driverless vehicles in off-road spaces like the park, and in other locations such as shopping centres, hospitals or airports.

The trial at Queen Elizabeth Olympic Park earlier this month was the first public appearance for the Capri pods, which picked up and dropped off passengers at a number of points on a circular route. The Capri pods will be at The Mall in South Gloucestershire in early 2020, returning to the park next year with a final trial that will extend their route and further test the on-demand technology.

Blog post adapted from UWE Bristol news article, which can be found here.

The startup using tech to deliver a personal message

Posted on

Taken and adapted from The Pitch. Author: Hannah Jolliffe

UWE Bristol Enterprise Zone residents The Handwriting Company are currently taking part in The Pitch, a competition to identify top start-ups:

Robert Van Den Burgh is co-founder of The Handwriting Company, a startup that helps organisations better engage with their customers through the power of the handwritten letter. But, while the name suggests a gang of people scribbling away, the reality couldn’t be further from the truth.

“We’ve created technology that can mimic handwriting. It can then be printed at scale on a good quality office printer or on robotics based in our facility. So instead of an organisation hiring 100 people to handwrite notes and pay a huge amount of money, we can fully automate the process.”

Do your research first

As with most innovators, the idea was born out of a realisation that a solution was needed to fix a broken system.

Van Den Burgh was on a marketing internship two and a half years ago when his manager got him involved in a handwritten marketing campaign.

“I was the fool who took three weeks writing out all the letters,” he laughs. “But it was one of the most successful marketing campaigns they’d ever run.”

This prompted Van Den Burgh to research the market. He found about 30 companies offering a similar service, but they all had people writing letters by hand. “I could see that the economics behind it didn’t make sense and it just wasn’t efficient enough to make it an effective tool.”

I could see that the economics behind it didn’t make sense

Van Den Burgh joined forces with Alex Robinson, an AI engineer with a background in computer science. The two founded Scribeless, which has since been renamed The Handwriting Company.

Together they began a major research effort to see if they could use AI, algorithms and robotics to optimise and automate the process of handwriting. It took time, but they developed a program that could learn someone’s handwriting at a level indistinguishable from human writing.

They then equipped robots with classic fountain pens and ink that can even mimic the physics of pen pressure and variation and deliver thousands of letters in hours.

Taking a punt at entering The Pitch

The pair reached the stage where they had a rough idea of the market needs, technology and where they wanted to take the business, when a friend recommended that they enter The Pitch.

“It was about this time last year. I thought we didn’t really stand a chance because we were still a very new company, but we gave it a go. We were lucky enough to get to the semi-final and then the final!”

I thought we didn’t really stand a chance because we were still a very new company

For Van Den Burgh, the day at the boot camp helped them to better understand how to articulate what they offer.

“The format of pitching is very short and sharp and about getting your main points across. We spent a day discussing our concept with the boot camp coaches. They gave us feedback to really help us understand how to better articulate our story, the problem and how we could help solve that problem.”

Staying ahead of their own game

It’s been a busy year for the company since then. The model has moved on from robotics to Advanced Printing Technology, which can print a handwritten note indistinguishable from human handwriting. This has helped them create handwriting campaigns at scale.

The company’s client base includes banks, churches, charities and corporate gifting companies across the UK, US and Germany. They’ve also established their place within the greetings card space. Things are looking healthy, but one of the biggest hurdles they still need to overcome is funding.

It’s really hard to do everything on a shoestring budget

“Until now we’ve been fully self-funded. It’s been really hard to do everything on a shoestring budget. It’s part of being a startup, but it has been a strain on resources – only having 10% of the funds you need is difficult.”

This has led The Handwriting Company to raise investment, with the aim of building the team and scaling into the US.

“We’re just about to close our investment round, with a mixture of angels and venture capitalists, so we’d like that to fund five or six new people across sales, tech and marketing to allow us to keep innovating and build a more scalable model.”

It’s important for Van Den Burgh to get more competitive and to “out-innovate” their own technology. His key objectives are to make sure they can deliver quickly and at an affordable rate. At the moment, it takes a couple of days for the software to mimic handwriting, but the aim for the near future is to get this working in real-time.

Original post can be viewed here

UWE Bristol secure new Knowledge Transfer Partnership with Reusaworld

Posted on

The UWE Bristol Knowledge Transfer Partnership (KTP) team has secured another KTP, this time between Reusaworld and the Centre for Machine Vision, bringing the number of live KTPs at UWE Bristol to 11. The partnership, based in Gloucester, will bring innovative changes to the world of second-hand books.

This KTP will be with Reuseabook, a part of Reusaworld.

Reuseabook was founded in 2008 by Rob Hollier and Ami Hollier with the following mission: NEVER to allow a single book to go to landfill.

Strong believers in conscientious capitalism, they wanted to create an earth-friendly, sustainable business model while helping others. After much hard work, what emerged was the Reusaworld group: an award-winning, ethical, environmentally friendly and technology-savvy enterprise that uses the internet to sell second-hand books worldwide.

Working with the Centre for Machine Vision, the aim of the 30-month KTP is to develop innovative machine vision techniques and deep learning methodologies to test the viability of the data outputs of a 3D Book Vision System and its application to the book-grading process, ultimately increasing the speed and quality of inbound book sorting, in-house data management and book cataloguing.

The UWE Lead for the KTP is Professor Lyndon Smith and the Academic Supervisor is Dr Abdul Farooq, both part of the Centre for Machine Vision at UWE Bristol. The Centre for Machine Vision is part of the Bristol Robotics Laboratory (BRL); it solves real-world practical computer vision problems, with particular excellence in three-dimensional reconstruction and surface inspection.

Innovate UK scored the proposal very highly (4th out of 60 applications) so congratulations to all involved!

This partnership received financial support from the Knowledge Transfer Partnerships (KTP) programme. KTP aims to help businesses improve their competitiveness and productivity through the better use of the knowledge, technology and skills that reside within the UK knowledge base. This successful Knowledge Transfer Partnership project, funded by UK Research and Innovation through Innovate UK, is part of the government’s Industrial Strategy.