Gestural musical gloves now available on pre-order

Posted on

Gestural musical gloves, technology originally developed at UWE Bristol by Dr Tom Mitchell, are now available for pre-order through a company called MI.MU. The gloves use motion capture and AI to enable wearers to create music with their movements.
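
To illustrate the general idea (this is not MI.MU's actual implementation, and the function, thresholds and mapping are purely hypothetical), a gestural music system maps sensor readings from the hand to musical parameters. A minimal sketch, with a simple tilt-to-note rule standing in for the motion-capture and AI components:

```python
# Illustrative sketch only, not MI.MU's actual system: mapping a glove's
# pitch/roll orientation readings to a MIDI-style note and velocity.
# Real gestural instruments use motion-capture sensors plus trained
# gesture classifiers; a threshold rule stands in for the AI here.

def orientation_to_note(pitch_deg, roll_deg, base_note=60):
    """Map hand tilt to a MIDI note number and a velocity (0-127)."""
    # Raising the hand raises the pitch: one semitone per 15 degrees of tilt.
    note = base_note + int(pitch_deg // 15)
    # Roll controls loudness: a flat hand is silent, a rolled hand is loud.
    velocity = max(0, min(127, int(abs(roll_deg) / 90 * 127)))
    return note, velocity
```

A tilt of 30 degrees with a 45-degree roll would then sound two semitones above middle C at roughly half volume.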

The technology, which has been developed in partnership with Grammy Award-winning musician Imogen Heap, has already produced a small run of bespoke and handmade gloves for a select few musicians.

The product’s commercialisation means the gloves are now half their original price, currently costing £2,500 a pair. They have been designed around the needs of musical artists and feature enhanced build quality and gesture control, improved electronics, and faster wireless communication.

In 2014, Ms Heap founded MI.MU, a partnership with UWE Bristol that also comprises fashion designer Rachel Freire, E-textiles designer Hannah Perner-Wilson, electronic engineer Sebastian Madgwick, scientist and musician Kelly Snook, musician and UX designer Chagall van den Berg, as well as Managing Director Adam Stark.

The technology was then made available to the public, and a burgeoning community of performers grew up around the gloves’ potential – from classical pianists to film composers, beatboxers and pop stars including Ariana Grande, who used the gloves on her 2015 ‘Honeymoon’ world tour.

Since 2014, Dr Mitchell and colleagues have refined the technology, streamlining designs with initial support from private investors and a range of academic and enterprise support including the EU Commission and Innovate UK.

Dr Mitchell said: “It’s exciting that we have managed to get to a point where the gloves will soon be available to all musicians. The gloves bring a new creative dimension to music performance, enabling musicians to create the movements that perform their music. I can’t wait to see what people will do with the technology.”

Imogen Heap, who uses the gloves as part of her performances, said: “So happy that we are finally able to extend the incredible superhuman feeling of having music in our hands out to a wider audience. You just have to remember to open your eyes during a performance, as it becomes so second nature!”
Adam Stark, Managing Director of MI.MU, said: “We are hugely proud to release the MI-MU gloves to musicians everywhere, and we can’t wait to see what they do with them.

“They are the result of years of research and development into new ways to compose and perform music. We believe they will enable musicians to discover new forms of expression, leading to new ideas, new performances and, ultimately, new forms of music.”

Facial recognition technology aims to detect emotional state in pigs

Posted on

State-of-the-art facial recognition technology is being used in an attempt to detect different emotional states in pigs.

Machine vision experts at the University of the West of England (UWE Bristol) have teamed up with animal behaviourists from Scotland’s Rural College (SRUC) in Edinburgh for the study, which it is hoped will lead to a tool that can monitor individual animals’ faces and alert farmers to any health and welfare problems.

Pigs are highly expressive and SRUC research has previously shown they can signal their intentions to other pigs using different facial expressions. There is also evidence of different expressions when they are in pain or under stress.

At SRUC’s Pig Research Centre in Midlothian, scientists are capturing 3D and 2D facial images of the breeding sow population under various typical commercial situations that are likely to result in different emotional states. For example, sows can experience lameness and could show different facial expressions relating to pain before and after being given pain relief. Detecting positive emotional states is more novel, but sows are highly food-motivated and appear calm and content when satiated. The researchers hope this mood could be reflected in the sows’ facial expressions.

Images are then processed at UWE Bristol’s Centre for Machine Vision, where various state-of-the-art machine learning techniques are being developed to automatically identify different emotions conveyed by particular facial expressions. After validating these techniques, the team will develop the technology for on-farm use with commercial partners where individual sows in large herds will be monitored continuously.
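
As a toy sketch of the classification step only (the team's actual pipeline uses convolutional neural networks trained on 2D/3D face images, per Hansen et al., 2018; the feature vectors, labels and nearest-centroid rule below are illustrative assumptions):

```python
# Toy sketch, not UWE Bristol's actual pipeline: each face image is assumed
# to be pre-reduced to a small feature vector, and a nearest-centroid rule
# stands in for the learned convolutional model.

def train_centroids(samples):
    """samples: {emotion_label: [feature_vector, ...]} -> {label: centroid}"""
    centroids = {}
    for label, vectors in samples.items():
        n = len(vectors)
        centroids[label] = [sum(v[i] for v in vectors) / n
                            for i in range(len(vectors[0]))]
    return centroids

def classify(centroids, vector):
    """Return the emotion label whose centroid is closest to the vector."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(centroids[label], vector))
```

Validating a real model works the same way in outline: hold out labelled images, predict their emotion labels, and compare against the ground truth.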

Professor Melvyn Smith from UWE Bristol’s Centre for Machine Vision, part of the Bristol Robotics Laboratory, said: “Machine vision technology offers the potential to realise a low-cost, non-intrusive and practical means to biometrically identify individual animals on the farm. Our work has already demonstrated a 97% accuracy at facial recognition in pigs. Our next step will be, for the first time, to explore the potential for using machine vision to automatically recognise facial expressions that are linked with core emotion states, such as happiness or distress, in the identified pigs.”

Dr Emma Baxter from SRUC said: “Early identification of pig health issues gives farmers the potential to improve animal wellbeing by tackling any problems quickly and implementing tailored treatment for individuals. This will reduce production costs by preventing impact of health issues on performance.

“By focussing on the pig’s face, we hope to deliver a truly animal-centric welfare assessment technique, where the animal can ‘tell’ us how it feels about its own individual experiences and environment. This allows insight into both short-term emotional reactions and long-term individual ‘moods’ of animals under our care.”

The study, which is being funded by the Biotechnology and Biological Sciences Research Council (BBSRC), is also being supported by industry stakeholders JSR Genetics Ltd and Garth Pig Practice as well as precision livestock specialists Agsenze.

Notes and links for editors:
https://bbsrc.ukri.org/research/

Relevant papers:

Hansen, M.F., Smith, M.L., Smith, L.N., Salter, M.G., Baxter, E.M., Farish, M. and Grieve, B., 2018. Towards on-farm pig face recognition using convolutional neural networks. Computers in Industry, 98, pp.145-152.
https://www.sciencedirect.com/science/article/pii/S0166361517304992

Camerlink, I., Coulange, E., Farish, M., Baxter, E.M. and Turner, S.P., 2018. Facial expression as a potential measure of both intent and emotion. Scientific reports, 8(1), p.17602.
https://www.nature.com/articles/s41598-018-35905-3

Knowledge Transfer Partnership with ExtraCare Charitable Trust introduces smart technologies to retirement villages

Posted on

In April 2018, UWE Bristol announced a two-year Knowledge Transfer Partnership (KTP) with ExtraCare Charitable Trust to help incorporate innovative technologies into its properties for the benefit of residents.

ExtraCare Charitable Trust’s recently opened ‘Stoke Gifford Village’ in Frenchay is home to an innovation apartment, which showcases devices such as smart kettles, a body dryer, remote-controlled blinds and video doorbells, integrated with intelligent sensing, to develop and demonstrate practical smart solutions that support active ageing.

The innovation apartment, next to UWE Bristol’s Frenchay campus, allows ExtraCare Charitable Trust and the KTP team to trial the technology and gather data on how users interact with the systems.

ExtraCare Charitable Trust’s Executive Director of Marketing and Innovation Henriette Lyttle, said, “Our vision is to enable better lives for older people and to create sustainable communities that provide homes older people want and lifestyles they can enjoy. This KTP is an opportunity to pioneer the integration of technologies into our retirement villages in order to increase quality of life and prolong independent living.”

Prof Praminda Caleb-Solly, Professor of Assistive Robotics and Intelligent Health Technologies at Bristol Robotics Laboratory, part of UWE Bristol, and the academic supervisor leading the KTP, commented:

“We are privileged to be working with ExtraCare Charitable Trust and to have the opportunity of testing, trialling and co-designing with residents and carers. This project will enable us to make a positive impact on supporting people as they age.”

ExtraCare Charitable Trust is the UK’s leading not-for-profit developer of housing for over-55s. Since 1988, it has operated retirement villages and smaller housing developments.

Find out more about KTPs here.

Eliminating Uncertainties and Improving Productivity in Mega Projects using Big Data and Artificial Intelligence

Posted on

A series of projects at the Bristol Business School combining cutting-edge digital technologies could potentially revolutionise the way industry tackles management of Mega Projects at the bidding stage. These innovative technologies include Artificial Intelligence (AI), Big Data, Virtual Reality (VR) and Augmented Reality (AR).

Professor Lukumon Oyedele and his team of developers have created software that harnesses the power of big data and artificial intelligence to help companies accurately plan and execute Mega Projects (large-scale, complex ventures that typically cost hundreds of millions of pounds).

The software uses advanced analytics to predict a whole range of complex project parameters, such as three-point estimates, tender summaries, cash flow, project plans, risks, innovations and opportunities, as well as health and safety incidents.

The project, whose flagship simulation tool is called Big-Data-BIM, is part of a partnership with leading UK construction contractor Balfour Beatty, to help it plan better power infrastructure projects involving the construction of overhead lines, substations and underground cabling. By using the software, the company is able to improve productivity and maximise profit margins.

“When planning a tender for a project, companies often plan for a profit of 10 to 15 percent, but on finishing the project, many struggle to make two percent profit margin,” says Professor Oyedele, who is Assistant Vice-Chancellor and Chair Professor of Enterprise and Project Management.

“The reason is that there are many unseen activities, which are hard to capture during the early design stage. Besides, the design process itself is non-deterministic. This is why, when you ask two quantity surveyors how much a project is likely to cost, they often produce different figures.

“With Big-Data-BIM, we are bringing in objectivity to plan the projects and taking care of uncertainties by engaging advanced digital technologies, so that a tender estimate remains accurate until project completion, with minimal deviation from what was planned at the beginning.”

The tool taps into 20 years of Balfour Beatty’s data on power infrastructure projects and learns predictive models that inform optimal decisions for executing the given work. It tells the business development team at the start of a project whether it is likely to succeed or fail.

One of the functions of the software is to create a 3D visual representation of project routes to understand complexity, associated risks (like road and river crossings) and opportunities (such as shared yards and local suppliers). For this purpose, the software taps into Google Maps data and integrates data from the British Geological Survey and Ordnance Survey to discover automatically the number of roads, rivers, and rail crossings.

The tool performs extensive geospatial analysis to find out the optimal construction route and measure distances between route elements with a high degree of accuracy. “This all happens within a twinkle of an eye. Without leaving your office, you can determine the obstacles on the planned route of the cables, or whether there is a river in the way,” says Professor Oyedele.
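
The distance-measurement part of such an analysis can be sketched very simply. The following is a minimal illustration (not Big-Data-BIM's actual code, which integrates Google Maps, British Geological Survey and Ordnance Survey data): summing great-circle distances between hypothetical route waypoints.

```python
# Minimal sketch of route-length measurement: great-circle (haversine)
# distances summed along a polyline of (lat, lon) waypoints. Purely
# illustrative; the real tool works over rich mapping and geological data.
import math

def haversine_km(p1, p2):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p1, *p2))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(a))  # Earth radius ~6371 km

def route_length_km(waypoints):
    """Total length of a route given as a list of (lat, lon) waypoints."""
    return sum(haversine_km(a, b) for a, b in zip(waypoints, waypoints[1:]))
```

One degree of latitude or longitude at the equator works out to roughly 111 km, which is a quick sanity check for any such implementation.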

By mining the huge datasets of health and safety incidents, the software can also determine what kind of injuries might occur on a project, and even produce a detailed analysis of the most probable body parts that could be prone to injury. This can help prepare an accurate health and safety risk assessment before the work begins.
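
At its simplest, this kind of mining is a frequency analysis over historical records. A hedged sketch (the record fields, categories and function below are hypothetical; the real system works over far richer health-and-safety datasets):

```python
# Illustrative sketch of the incident-mining idea: tally which body parts
# appear most often in historical injury records for a given project type.
# Field names and categories are invented for the example.
from collections import Counter

def injury_risk_profile(incidents, project_type):
    """Rank body parts by how often they were injured on similar projects."""
    counts = Counter(rec["body_part"] for rec in incidents
                     if rec["project_type"] == project_type)
    return counts.most_common()
```

The ranked output can then feed directly into a pre-work health and safety risk assessment.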

The software provides an intuitive dashboard called “Opportunity on a page” where all predictions are visualised to facilitate data-driven insights for designers to make critical planning decisions.

As a contractor, Balfour Beatty uses the tool to submit the best possible bids to clients, giving it a high chance of winning them. The software is also set to be offered to other industries carrying out linear projects, including water distribution networks and the rail, road, and oil and gas sectors.


£6.5m project aims to drive digital innovation in the South West

Posted on

A project worth £6.5million is being launched across the South West to expand the use of digital technologies throughout the region’s creative, health and manufacturing sectors.

The new Creative Technology Network will bring together universities and industrial partners, pooling their research and innovation expertise to develop cutting-edge practices, techniques and products in creative digital technologies.

Supported by a grant from Research England, and led by the University of the West of England (UWE Bristol), the three-year project is a partnership with Watershed in Bristol, Kaleider in Exeter, Bath Spa University, the University of Plymouth and Falmouth University.


As new technology, including automation and big data, raises new challenges and opportunities for businesses, this partnership is designed to respond to industry needs across the health and manufacturing sectors and the creative industries, driving productivity and resilience.

The grant is part of Research England’s Connecting Capabilities Fund, which supports university collaboration and encourages commercialisation of products made through partnerships with industry. The funding will kick-start the project, which begins in April.

Professor Martin Boddy, who is Pro Vice-Chancellor for Research and Business Engagement at UWE Bristol, said, “We are immensely proud to be taking the lead on this exciting project which builds on UWE Bristol’s vision to work with partners to enhance innovation across the region and nationally. This new network will stimulate the regional economy and will undoubtedly lead to new products and new ways of working, all thanks to shared research experience and technical expertise.”

Professor Jon Dovey, who is Professor of Screen Media at the Faculty of Arts, Creative Industries, and Education at UWE Bristol and leading the project for the Digital Cultures Research Centre (DCRC) said, “This project will bring together the best and the brightest researchers in creative arts, technology and design to work with companies old and new to show what new kinds of value can be unlocked by the application of creative technologies.

“We are going to be working with immersive media, processes of automation and the new availability of big data to support business to find new ways of working with their customers and our citizens. Watch this space for the amazing new products and services we invent in the next three years.”


Passenger-carrying drones among us by 2030, says UWE Bristol expert

Posted on

Drone technology is in its infancy, but in the not-too-distant future we are likely to see unmanned air vehicles (UAVs) perform tasks such as painting or cleaning, with the ability to visualise very small features such as a hairline fracture in a building’s structure. This is according to Dr Steve Wright, Associate Professor in Aerospace Engineering at UWE Bristol and a drone expert. He also predicts that we could see freight-carrying and even passenger-carrying drones by 2030.

But there are still many challenges to overcome before these autonomous aircraft are reliable and trustworthy enough to be an integral part of our society, says Steve. In fact, he believes the technology needs to be improved by a factor of one million before it is safe enough.

The current big challenge is to programme a drone to navigate and fly autonomously through a cluttered environment, like a city, in a safe way, and we are still a long way off.

The MAAXX (Micro Aero Autonomous Extremes Europe) drone racing contest that Steve organises is the second iteration of Europe’s only indoor drone flying contest. The two-day UWE Bristol event on 23-24 March takes place in the University’s exhibition centre and sees several teams programme their UAVs to fly unaided around a designated track, with a hackathon for budding coders to programme a ‘house’ drone provided by the organisers.

As well as giving industry a chance to meet peers and offering families a fun day out (on the second day only), the event helps drive the technology forward. Because they are competing, the teams push drone development to its limits, finding solutions to UAVs veering off course or failing to stop in time.

Before moving into academia, Steve worked in the aerospace industry for over two decades. “I look at drones with the eye of someone who, for 25 years, has been helping to build systems in conventional aircraft, and these are exciting times for UAV development.”

He explains that UAVs today are at the same point in history that conventional aircraft development had reached in 1918, exactly 100 years ago. Using the comparison, he says: “We are at the equivalent point in time where we know how the Wright Brothers were able to fly their plane, and have already built a Sopwith Camel [a warplane used in the First World War]. We can glimpse what a Spitfire looks like, but still have no idea what aircraft will look like in 30 or 40 years.” As a result, we have a clean slate with UAVs, he adds, and still have much to learn and improve.

Interestingly, the technology is being developed from the bottom up, says Steve. “Some other similar technologies have been driven from the top down by large corporations, but this one is from the bottom up, by consumers, very much like the early days of the electronics operations.”

As for the future, says Steve, we are moving towards close-up imaging, whereby a drone will soon be able to detect minute structural faults on a bridge or building. We could also soon see drones that clean surfaces such as solar panels in the desert that become covered in sand.

Steve also predicts that as soon as 2030, we are likely to see drones carrying passengers as well as freight over short distances.

His biggest fear about the drone industry? “The trouble is that there are many people who know how to fly a drone and, although they are often not reckless, many are unaware of the safety issues. Those of us involved in the drone industry live in terror that somebody will cause a horrendous accident – this would shut us all down in a single afternoon.”