

Computer vision and mobile technology could help blind people ‘see’

Computer scientists are developing new adaptive mobile technology which could enable blind and visually-impaired people to ‘see’ through their smartphone or tablet.

Funded by a Google Faculty Research Award, specialists in computer vision and machine learning based at the University of Lincoln, UK, are aiming to embed a smart vision system in mobile devices to help people with sight problems navigate unfamiliar indoor environments.

Based on preliminary work on assistive technologies done by the Lincoln Centre for Autonomous Systems, the team plans to use the colour and depth sensor technology inside new smartphones and tablets, such as Google’s recent Project Tango, to enable 3D mapping and localisation, navigation and object recognition. The team will then develop the best interface to relay that information to users – whether through vibrations, sounds or the spoken word.

Project lead Dr Nicola Bellotto, an expert on machine perception and human-centred robotics from Lincoln’s School of Computer Science, said: “This project will build on our previous research to create an interface that can be used to help people with visual impairments.

“There are many visual aids already available, from guide dogs to cameras and wearable sensors. Typical problems with the latter are usability and acceptability. If people were able to use technology embedded in devices such as smartphones, it would not require them to wear extra equipment which could make them feel self-conscious. There are also existing smartphone apps that are able to, for example, recognise an object or speak text to describe places. But the sensors embedded in the device are still not fully exploited. We aim to create a system with ‘human-in-the-loop’ that provides good localisation relevant to visually impaired users and, most importantly, that understands how people observe and recognise particular features of their environment.”

The research team, which includes Dr Oscar Martinez Mozos, a specialist in machine learning and quality of life technologies, and Dr Grzegorz Cielniak, who works in mobile robotics and machine perception, aims to develop a system that will recognise visual cues in the environment. This information would be captured through the device’s camera and used to identify the type of room as the user moves around the space.
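The article does not say how room identification would work, but one common approach is to let detected objects vote for the room type they are most associated with. Below is a minimal illustrative sketch of that idea; the object-to-room mapping and all names are hypothetical, not taken from the project:

```python
from collections import Counter

# Hypothetical associations between detected objects and room types.
# A real system would learn these from data rather than hand-code them.
ROOM_EVIDENCE = {
    "kettle": "kitchen", "sink": "kitchen", "fridge": "kitchen",
    "sofa": "living room", "television": "living room",
    "bed": "bedroom", "wardrobe": "bedroom",
    "desk": "office", "monitor": "office",
}

def classify_room(detected_objects):
    """Return the room type most consistent with the detected objects."""
    votes = Counter(
        ROOM_EVIDENCE[obj] for obj in detected_objects if obj in ROOM_EVIDENCE
    )
    if not votes:
        return "unknown"
    return votes.most_common(1)[0][0]

print(classify_room(["sink", "kettle", "chair"]))  # kitchen
```

In practice the ‘human-in-the-loop’ element described above could refine such a model over time, for example by reweighting the evidence whenever the user confirms or corrects an identification.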

A key aspect of the system will be its capacity to adapt to individual users’ experiences, modifying the guidance it provides as the machine ‘learns’ from its surroundings and from human interaction. The more accustomed the user becomes to the technology, the quicker and easier it would be for the system to identify the environment.

The research team will work with a Google sponsor and will be collaborating with specialists at Google throughout the ‘Active Vision with Human-in-the-Loop for the Visually Impaired’ project.

Below is an interview with Dr Bellotto on BBC Radio Lincolnshire:

Posted in Research.



Linda the robot stars on TV’s Gadget Man

A robot called Linda developed by computer scientists at the University of Lincoln, UK, has appeared on Channel 4’s Gadget Man.

In the fourth series of the technology show, presenter Richard Ayoade test-drives new technological devices designed to make life easier.

In an episode exploring the theme Health and Safety, aired at 8.30pm on Monday 22nd June, Ayoade tests security devices with actor Keith Allen and comedian Bill Bailey, including a post-apocalypse survival kit that also works at festivals.

Fearing that the world is a dangerous place for Gadget Man, Ayoade employs the services of Linda the robot to guard his home.

Linda, who is based in the School of Computer Science at the University of Lincoln, is named after the city’s Roman roots as Lindum Colonia. The specialist mobile robot is currently being programmed to act intelligently in real-world environments, with the ultimate aim of being able to support security guards or staff in care homes.

She is one of six robots involved in the £7.2 million collaborative STRANDS project aimed at creating mobile robots that are able to operate independently, based on an understanding of 3D space and how this space changes over time.

Funded by the European Union’s Seventh Framework programme (FP7), the research project involves six academic partners, a security company and an Austrian care home provider, where the technology will be tested.

The robots have just finished a month-long deployment at the Haus der Barmherzigkeit care facility in Austria, where they continued to develop an understanding of how the world should appear and to identify deviations from their normal environment.
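The idea of flagging deviations from a learned ‘normal’ environment can be illustrated with a toy model: each location stores how often it has historically been observed occupied, and cells whose current state strongly disagrees with that history are reported as surprising. This sketch is a simplification for illustration only, not the STRANDS method:

```python
def flag_deviations(baseline, current, threshold=0.8):
    """Flag locations whose current occupancy is surprising.

    baseline: dict mapping cell -> historical occupancy rate in [0, 1]
    current:  dict mapping cell -> 0/1 occupancy observed now
    Returns the list of cells whose observation deviates strongly
    from the learned model of how the world normally appears.
    """
    flagged = []
    for cell, occupied in current.items():
        expected = baseline.get(cell, 0.5)  # unseen cells: no strong opinion
        surprise = abs(occupied - expected)
        if surprise > threshold:
            flagged.append(cell)
    return flagged

# A corridor that is almost always clear is suddenly blocked:
baseline = {"corridor": 0.05, "doorway": 0.95}
current = {"corridor": 1, "doorway": 1}
print(flag_deviations(baseline, current))  # ['corridor']
```

A real deployment would model 3D space and its change over time, as the project description says, rather than a flat occupancy table; the principle of comparing observations against learned expectations is the same.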

The trial tested how long the robots could autonomously complete simple tasks in a real-life hospital environment without human support. Besides frequent patrols through the corridors, the robots also guided visitors, residents and members of staff to offices or seminar rooms, and accompanied physiotherapeutic walking groups twice a week.

Linda’s TV debut is not her first high profile public appearance. The robot was also chosen to be part of Universities Week 2014 which aims to increase public awareness of the wide and varied role of the UK’s universities. She greeted and interacted with visitors to the Natural History Museum in London during the week-long event in June 2014.

Dr Marc Hanheide, from the University of Lincoln’s School of Computer Science, who is working on the STRANDS project with colleague Professor Tom Duckett, said: “It’s fantastic that Linda is still getting out and about, as a key aim for this project is to show people how this sort of technology could help us in our everyday lives.”


Linda on Gadget Man set


 

Posted in Human-Robot interaction, School news.



Robotic harvesting of broccoli could be coming to a field near you

A project involving 3D camera technology currently being developed at the University of Lincoln, UK, could result in a fully automatic robotic harvesting system for broccoli.

The University of Lincoln was one of more than 70 UK businesses and universities to share funding through the £70 million Agri-Tech Catalyst, which aims to improve the development of agricultural technology in the UK.

The project, which is jointly funded by BBSRC and Innovate UK, will test whether 3D camera technology can be used to identify and select when broccoli is ready for harvesting. This will be a key step towards the development of a fully automatic robotic harvesting system for broccoli, which will significantly reduce production costs. It has been praised as ‘world leading’ by UK Farming Minister George Eustice.
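The article does not detail how readiness would be judged, but selective harvesting is often framed as segmenting each broccoli head from the 3D camera data, estimating its size, and cutting only heads that meet the market specification. A minimal sketch under those assumptions (thresholds and names are hypothetical):

```python
import numpy as np

def head_diameter_mm(points):
    """Estimate a broccoli head's diameter (mm) from its segmented 3D
    points, taken here as the mean spread in the ground plane (x, y)."""
    xy = np.asarray(points)[:, :2]
    extents = xy.max(axis=0) - xy.min(axis=0)
    return float(extents.mean())

def ready_to_cut(points, min_diameter_mm=100.0):
    """Select heads whose estimated size meets the specification."""
    return head_diameter_mm(points) >= min_diameter_mm

# A synthetic segmented head roughly 120 mm across:
rng = np.random.default_rng(0)
head = rng.uniform([-60, -60, 0], [60, 60, 80], size=(500, 3))
print(ready_to_cut(head))  # True
```

A fielded system would also need robust segmentation of heads from foliage and calibration of the camera to real-world units before a size test like this could be trusted.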

The research team comprises academics Professor Tom Duckett and Dr Grzegorz Cielniak from Lincoln’s School of Computer Science and Dr Simon Pearson from the University’s National Centre for Food Manufacturing (NCFM) at Holbeach.

The main industry partner is R. Fountain & Son Ltd, horticultural consultants based in Boston, Lincolnshire, who will be responsible for creating the broccoli-cutting device.

Project lead Professor Tom Duckett, group co-ordinator of the Agri-Food Technology Research Group at the University of Lincoln, said: “Broccoli is one of the world’s largest vegetable crops and is almost entirely manually harvested, which is costly. This technology is seen as being an important move towards developing fully automatic robot harvesting systems, which could then be used for a variety of different crops.

“In all our agri-related research work, our mission is to develop new technological solutions for the business of producing food through agriculture. The long-term impact of our research includes safer food, less waste, more efficient food production and better use of natural resources, as well as promoting human health and happiness.”

Listen to Professor Duckett speaking about the project on Siren FM:

 

Head of Agriculture and Food at Innovate UK Ian Meikle said: “The Agri-Tech Strategy aims to make the UK a world leader in agricultural technology, innovation and sustainability. The funding decisions are expert-led and evidence-based. They support great ideas that address challenges of the future in food and farming. With business, research and government working together, these investments can unlock potential and deliver major benefits for society and the economy.”

Another project benefiting from the University of Lincoln’s expertise in this area is the early detection and biocontrol of prevalent diseases of mushrooms and potatoes.

Also funded by Innovate UK, this project addresses challenges associated with the identification, prevention and management of disease by developing diagnostic tools for farm use and alternatives to chemical pesticides. This will enable the primary producers in these industries to rapidly diagnose the existence of disease and facilitate earlier decision making.

It is anticipated that this project will develop a long-needed alternative to the use of pesticides by the mushroom and potato industries, thereby ensuring their future sustainability.

Principal Investigator Dr Bukola Daramola, from the University’s NCFM, said: “Food loss from farm to fork, due to disease and spoilage, causes considerable environmental and economic effects. The outputs of this project have the potential to significantly address the challenges presented to the mushroom and potato sectors by pathogenic bacteria and fungi, their detection and resistance to treatment. At the heart of the project is a drive to develop robust solutions for bio-monitoring and bio-control, leading to scientific advancement and the marketing of products which will ultimately have significant economic and societal benefit for the UK and beyond.”

The project also involves Monaghan Mushrooms, Queen’s University Belfast, AHDB Potato Council, RoboScientific and Rationale Biopesticide Strategists.

Posted in Research.



Lincoln to host Europe’s leading robotics conference

Robotics experts from around the world will present ground-breaking research on how robots are moving out of the laboratory and into homes and workplaces at a major international conference hosted by the University of Lincoln, UK.

The European Conference on Mobile Robots (ECMR) 2015 takes place on 2-4 September 2015. It is the first time the conference has been hosted in the UK, following previous meetings in Spain, Sweden, Croatia, Germany, Italy and Poland.

This year’s event is being organised by the Lincoln Centre for Autonomous Systems, a research centre within the University of Lincoln’s School of Computer Science which specialises in the integration of perception, learning, decision‐making and control capabilities in autonomous systems such as mobile robots and smart devices. The group applies its research in fields such as personal robotics, food and agriculture, security and surveillance, and intelligent transportation.

Conference organiser Professor Tom Duckett, who leads the Lincoln Centre for Autonomous Systems, said: “Hosting ECMR in 2015 is a fabulous opportunity to showcase our research in mobile robotics and to help to put Lincoln firmly on the map in the international scientific community. As today’s robots leave the laboratory and start entering many different real-world applications, it is a very exciting time to be working in robotics research, and I feel privileged to be hosting this important international event together with my colleagues in the robotics research team here at Lincoln. Our colleague Professor Adriana Tapus from ENSTA ParisTech, France, is leading the technical programme of the conference, and we have already had nearly 100 paper submissions, so it promises to be a fantastic event with contributions from all over Europe and beyond.”

ECMR is a biennial European forum, internationally open, that allows roboticists throughout Europe to become acquainted with the latest research accomplishments and innovations in mobile robotics and mobile human-robot systems. The keynote speakers this year are Maja Pantic, Professor of Affective and Behavioural Computing at Imperial College London; Roland Siegwart, Professor for Autonomous Systems at ETH Zurich; and Ingmar Posner, Associate Professor in Engineering Science at the University of Oxford.

Professor Pantic, who is leader of the i•BUG group, works on machine analysis of human non-verbal behaviour and has published more than 200 technical papers in the areas of machine analysis of facial expressions, machine analysis of human body gestures, audio-visual analysis of emotions and social signals, and human-centred machine interfaces. In 2011, she was awarded the BCS Roger Needham Award, presented annually to a UK-based researcher for a distinguished research contribution in computer science within ten years of their PhD.

Professor Siegwart’s research focusses on the design and control of systems operating in complex and highly dynamical environments. His major goal is to find new ways to deal with uncertainties and enable the design of highly interactive and adaptive systems. Prominent application examples are personal and service robots, planetary exploration robots, autonomous micro-aircraft and driver assistance systems.

Professor Posner, who is co-lead of the Oxford Mobile Robotics Group (MRG), focuses on the application of machine learning techniques to emerging mobile robotics tasks such as semantic mapping, active exploration and life-long learning.

Posted in Human-Robot interaction, School news.



Research seminar on “Multisensory Perception of Soft Objects”, Dr Massimiliano Di Luca, University of Birmingham

On Wednesday 3rd June 2015, Dr Massimiliano Di Luca from the School of Psychology, University of Birmingham, will give a research seminar in the School of Computer Science. Everyone is welcome!

When: Wednesday 3rd June 2015 @ 4:00pm
Where: MC0024, MHT Building

 

Title: Multisensory Perception of Soft Objects

Abstract
Softness is the subjective impression of the physical deformability of an object. When we interact with a deformable object like a pillow, sensory signals from multiple sense modalities provide information related to its compressibility (i.e. proprioceptive position, tactile force, visual deformation). Such signals dynamically depend on the way we interact with the object. Our brain has specialised mechanisms that process this information to create a coherent perceptual representation of the compressibility of the object and to adjust motor actions accordingly. In this work I will present psychophysical experiments that employ visual-haptic virtual reality setups to investigate how we perceive the compressibility of an object while we squeeze it or while we press against its surface. The results of these studies form the basis of a computational model of softness perception where signals are combined into perceptual estimates that are then integrated according to the rules of Bayesian inference.
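The Bayesian integration mentioned in the abstract is commonly illustrated with independent Gaussian cue estimates, which are fused by weighting each cue by its reliability (the inverse of its variance). The sketch below shows that textbook rule only; it is not code from the talk, and the numbers are made up:

```python
def fuse_cues(estimates, variances):
    """Fuse independent Gaussian cue estimates by inverse-variance
    weighting: more reliable cues (smaller variance) get larger weight,
    and the fused estimate is more reliable than any single cue."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    mean = sum(w * e for w, e in zip(weights, estimates)) / total
    fused_variance = 1.0 / total
    return mean, fused_variance

# Visual and haptic compliance estimates (arbitrary units); vision is
# assumed four times more reliable here, so it dominates the fusion.
mean, var = fuse_cues([0.8, 0.6], [0.01, 0.04])
print(round(mean, 3), round(var, 4))  # 0.76 0.008
```

Note that the fused variance (0.008) is smaller than either input variance, which is the signature benefit of combining cues this way.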

Posted in Events, Research.