Well, my first working day at SMART is almost over. Four minutes and 35 seconds to the stipulated end of the work day. But who’s counting? So far, it’s been interesting: I met the friendly folks here and saw the cool toys (autonomous vehicles). I’m one of the “early birds” and managed to land a desk with a great view of the NUS campus. That said, I might move down to “The Garage”, where all the robots/machines are. Hopefully, I’ll sort out all my administrative stuff soon and get on to working (who am I kidding: playing) with the vehicles and some new learning methods I have in mind.
It’s been a while since my last post. Excuse: thesis write-up. Update: Thesis submitted!
In other news, our recent work on Learning Assistance by Demonstration was accepted to this year’s IROS! It’ll be a fun and interesting conference in Tokyo, Japan! You can find a preprint here.
Abstract: Crafting a proper assistance policy is a difficult endeavour but essential for the development of robotic assistants. Indeed, assistance is a complex issue that depends not only on the task at hand, but also on the state of the user, the environment and competing objectives. As a way forward, this paper proposes learning the task of assistance through observation; an approach we term Learning Assistance by Demonstration (LAD). Our methodology is a subclass of Learning-by-Demonstration (LbD), yet directly addresses difficult issues associated with proper assistance, such as when and how to appropriately assist. To learn assistive policies, we develop a probabilistic model that explicitly captures these elements and provide efficient, online training methods. Experimental results on smart mobility assistance — using both simulation and a real-world smart wheelchair platform — demonstrate the effectiveness of our approach; the LAD model quickly learns when to assist (achieving an AUC score of 0.95 after only one demonstration) and improves with additional examples. Results show that this translates into better task performance; our LAD-enabled smart wheelchair improved participant driving performance (measured in lap seconds) by 20.6s (a speedup of 137%) after a single teacher demonstration.
Download Draft PDF
Going back in time…
This Wired article on the Internet Archive reminded me of a talk I once attended (at UCD) by the founder, Brewster Kahle. In addition to being technologically impressive, the Archive’s Wayback Machine is tremendously fun. Try visiting Google back in 1999.
Most PhD students know that at some point, the dreaded PhD Avoidance Syndrome sets in. A few WhatsApp and real-world conversations led to this:
Slightly Longer Version: http://www.youtube.com/watch?v=rTX9WcgjZqM
YARP is another robot development platform, similar to ROS. I had to code up a simple data reader in Python (operating over YARP ports) and couldn’t find any good examples. After some experimenting, I found a solution that worked for me. The following is a simple code snippet for other YARP Python newbies:
import yarp

#initialise the YARP network (a yarp name server must be running)
yarp.Network.init()

#create a new input port and open it (port names here are just examples)
self.in_port = yarp.BufferedPortBottle()
self.in_port.open("/my_reader")

#connect the data source's output port to our input port
yarp.Network.connect("/some_writer", "/my_reader")

#in this example, I assume the data is a single integer;
#we use read(), where the parameter determines if it is
#blocking (True) or not
btl = self.in_port.read(True)
my_data = btl.get(0).asInt()

#if you have doubles, you can use asDouble()
#or strings can be obtained using asString()
I did the unthinkable and upgraded my OS (in the final year of my PhD!). And surprise, surprise: some of my code wouldn’t compile anymore. I figured I needed to rebuild my MacPorts-installed *nix software but ran into problems with gcc45 and libstdcxx. The issue is an ld64 bug, which I fixed using user adrian’s solution (replicated here):
sudo port uninstall ld64
sudo port -v install ld64
sudo port clean libstdcxx
sudo port -d build libstdcxx build.jobs=1
sudo port install libstdcxx
Just got news that our paper on the ARTY smart paediatric wheelchair was accepted to the IROS 2012 Workshop on Progress, Challenges and Future Perspectives in Navigation and Manipulation Assistance for Robotic Wheelchairs.
Abstract: Standard powered wheelchairs are still heavily dependent on the cognitive capabilities of users. Unfortunately, this excludes disabled users who lack the required problem-solving and spatial skills, particularly young children. For these children to be denied powered mobility is a crucial setback; exploration is important for their cognitive, emotional and psychosocial development. In this paper, we present a safer paediatric wheelchair: the Assistive Robot Transport for Youngsters (ARTY). The fundamental goal of this research is to provide a key enabling technology to young children who would otherwise be unable to navigate independently in their environment. In addition to the technical details of our smart wheelchair, we present user trials with able-bodied individuals as well as one 5-year-old child with special needs. ARTY promises to provide young children with “early access” to the path towards mobility independence.
More information about ARTY (with video).