Just got news that our paper on the ARTY smart paediatric wheelchair was accepted to the IROS 2012 Workshop on Progress, Challenges and Future Perspectives in Navigation and Manipulation Assistance for Robotic Wheelchairs.
Abstract: Standard powered wheelchairs are still heavily dependent on the cognitive capabilities of users. Unfortunately, this excludes disabled users who lack the required problem-solving and spatial skills, particularly young children. For these children to be denied powered mobility is a crucial setback; exploration is important for their cognitive, emotional and psychosocial development. In this paper, we present a safer paediatric wheelchair: the Assistive Robot Transport for Youngsters (ARTY). The fundamental goal of this research is to provide a key enabling technology to young children who would otherwise be unable to navigate independently in their environment. In addition to the technical details of our smart wheelchair, we present user trials with able-bodied individuals as well as one 5-year-old child with special needs. ARTY promises to provide young children with “early access” to the path towards mobility independence.
More information about ARTY (with video).
Just got the news this morning: ARTY was selected as a Dyson Award Finalist! The UK winner is an amazing project by Dan Watson that tackles the problem of sustainable fishing.
Although I’m disappointed that ARTY didn’t win the national award, there were many exceptional projects this year and I’m delighted ARTY was a finalist. Congratulations to Dan and the other National Finalists! I’m looking forward to the International results.
After watching this tutorial,
Kyu Hwa had the great idea to replicate it. So, we gave it a go:
It didn’t taste good. In our experience, marshmallows and watermelon do not go together; neither does peanut butter.
We ended up eating the ingredients separately.
But maybe we did it wrong. For example, we cut the watermelon the wrong way. Also, we made only two or three holes; Mr Willett made four. Perhaps we used the wrong kind of watermelon, marshmallows, or peanut butter.
P.S. This is what happens when you have PhD students working 10-12 hours a day in the summer.
The 2013 UK Space Design Competition Request for Proposals (RFP) is now out!
The UK Space Design Competition 2013 is open to all UK secondary school students in years 9-13. Teams must consist of between 8 and 12 students, plus a supervising adult, but need not be affiliated with any particular institution. This means that schools, colleges, science clubs, and societies are all free to enter a team, provided that the above criteria are satisfied. All team members must be specified in the initial application.
Check out http://uksdc.org for details!
Had a great time with the gang watching the Olympics closing ceremony. Coincidentally, we had another steak night. Yummy. Hats off to Kyu Hwa the Chef!
In other news, just submitted a paper on ARTY (and a trial run by a child with special needs) to the IROS Workshop on robotic wheelchairs — fingers crossed that it’ll get accepted. Will put up a pre-print soon. If you are in the field, I recommend checking out the workshop on shared control (organised by Tom Carlson at EPFL) at this year’s SMC.
Next week will be busy for us as we gear up for the Summer School.
Just submitted an IROS camera-ready copy of some recent work on online spatio-temporal learning:
In this work, we are primarily concerned with robotic systems that learn online and continuously from multivariate data streams. Our first contribution is a new recursive kernel, which we have integrated into a sparse Gaussian Process to yield the Spatio-Temporal Online Recursive Kernel Gaussian Process (STORK-GP). This algorithm iteratively learns from time-series, providing both predictions and uncertainty estimates. Experiments on benchmarks demonstrate that our method achieves high accuracies relative to state-of-the-art methods. Second, we contribute an online tactile classifier which uses an array of STORK-GP experts. In contrast to existing work, our classifier is capable of learning new objects as they are presented, improving itself over time. We show that our approach yields results comparable to highly-optimised offline classification methods. Moreover, we conducted experiments with human subjects in a similar online setting with true-label feedback and present the insights gained.
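For readers unfamiliar with online GP learning, the per-sample predict-then-update loop described in the abstract can be sketched with a toy example. This is *not* STORK-GP or its recursive kernel — the class, window size, and kernel below are invented for illustration; it is just a naive sliding-window GP regressor that, like any online learner, processes one sample at a time and outputs both a mean prediction and an uncertainty estimate.

```python
import numpy as np

def rbf(a, b, ls=1.0):
    # Squared-exponential kernel between two input sets (shapes: (n,d), (m,d)).
    d = a[:, None, :] - b[None, :, :]
    return np.exp(-np.sum(d**2, axis=-1) / (2 * ls**2))

class WindowedOnlineGP:
    """Toy online GP regressor: keeps only a sliding window of recent samples."""
    def __init__(self, capacity=50, noise=1e-2):
        self.capacity, self.noise = capacity, noise
        self.X, self.y = [], []

    def update(self, x, y):
        # Absorb one new (input, target) pair; discard the oldest if full.
        self.X.append(x); self.y.append(y)
        if len(self.X) > self.capacity:
            self.X.pop(0); self.y.pop(0)

    def predict(self, x):
        # Return predictive mean and variance at input x.
        if not self.X:
            return 0.0, 1.0 + self.noise   # prior before any data arrives
        X, y = np.asarray(self.X), np.asarray(self.y)
        K = rbf(X, X) + self.noise * np.eye(len(X))
        k = rbf(np.asarray([x]), X)[0]
        mean = k @ np.linalg.solve(K, y)
        var = 1.0 + self.noise - k @ np.linalg.solve(K, k)
        return float(mean), float(var)

# Stream a noisy sine wave one sample at a time: predict first, then update.
gp = WindowedOnlineGP()
rng = np.random.default_rng(0)
errs = []
for t in range(200):
    x = np.array([t * 0.1])
    y = np.sin(x[0]) + 0.05 * rng.standard_normal()
    mean, var = gp.predict(x)
    errs.append((mean - np.sin(x[0]))**2)
    gp.update(x, y)
print("late-stream MSE:", np.mean(errs[100:]))
```

The predict-before-update ordering is the point: each prediction is made on data the model has never seen, which is how online learners are evaluated. STORK-GP avoids the crude sample-dropping used here by maintaining a sparse basis-vector set instead.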
This work was nominated as a finalist for the 2012 CoTeSys Cognitive Robotics Best Paper Award.
After a couple of weeks of putting it off, I’m releasing an alpha version of the Online Temporal Learning (OTL) C++ library, which can be used for learning time-series in an online environment (samples are processed one at a time). The Online Echo State Gaussian Process (OESGP) and Spatio-Temporal Online Recursive Kernel Gaussian Process (STORK-GP) algorithms are part of this library.
Find the code on Bitbucket: https://bitbucket.org/haroldsoh/otl
And be sure to check out the Getting Started Guides on the wiki.
Comments/Criticisms/Bug reports are welcome! Though, depending on my workload, replies may be slow in coming. Have fun!
P.S.: If you’re looking for the OESGP Experimental setup scripts, you can find them here.