Preliminary Python
Monday, 16 July 2007

I recently showed you my Media Server project, which allows phrases to be spoken and sounds to be played on the Bot by connecting to it over the network and sending command strings.
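
As a rough sketch of the idea from the Python side (the address, port and the 'SAY'/'PLAY' command strings below are made-up placeholders rather than the server's real protocol), a client looks something like this:

import socket

# Hypothetical Media Server address and port - placeholders only.
MEDIA_SERVER = ("192.168.1.10", 4500)

def send_command(command):
    # Open a connection, send one command string, then close.
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.connect(MEDIA_SERVER)
    s.sendall((command + "\n").encode("ascii"))
    s.close()

send_command("SAY Hello from the 914")
send_command("PLAY startup.wav")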

I've now also written something similar which allows the 914 itself to be driven around. This was written in C# using the White Box Robotics .NET controls to talk to the hardware. The sensor data and the 'safety' and 'drop' events are served up on a socket, and it also accepts left and right wheel velocity data to drive the Bot around.
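
To give an idea of how a client talks to it, here is a minimal sketch from the Python end. The port number, the 'VEL left right' command format and the 'SENSORS'/'SAFETY'/'DROP' message prefixes are assumptions for illustration, not the server's actual protocol.

import socket

DRIVE_SERVER = ("192.168.1.10", 4501)  # hypothetical address and port

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.connect(DRIVE_SERVER)

def set_wheel_velocities(left, right):
    # Send a left/right wheel velocity pair as a command string.
    sock.sendall(("VEL %.2f %.2f\n" % (left, right)).encode("ascii"))

set_wheel_velocities(0.2, 0.2)           # drive forwards
event = sock.recv(1024).decode("ascii")  # e.g. sensor readings or a 'safety'/'drop' event
if event.startswith("SAFETY") or event.startswith("DROP"):
    set_wheel_velocities(0.0, 0.0)       # the hardware has already stopped; acknowledge in the script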


My intention for this is to build a modular control framework that I could interface to from a variety of other components and languages - primarily Python. I also intend to allow interfacing to Bots running Linux and Player - there's more of an update about that in my Python project progress blog.

I've made a short video of the Bot doing another auto roam around the room. On the surface this looks quite a bit like the previous video I posted, but the key differences here are that the Bot is running Windows this time, the .NET sockets server is being used to interface to the hardware instead of Player, the speech is coming from my Media Server, and lastly, the overall control is coming from a Python script running over a wireless network instead of locally on the Bot - you can see a brief glimpse of that running on my laptop in the video.

As Python is interpreted and, in this case, also running at the end of a wireless connection, it seemed safer to let the native 'drop' and 'safety' support built into the .NET controls actually stop the Bot when needed. The sensor data is read by the Python script along with the nature of the event that stopped it, and then a decision is made based on the data about what to do next. At the moment it can only turn away from the object and carry on again, but the next step is to take other inputs such as vision data from RoboRealm to help with the decision-making process. There will also be a 'Pseudo AI life loop' running at some point which will add its two cents about what to do next - hopefully resulting in some emergent behaviour.
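
In outline, the roam logic in the Python script is along these lines. This is only a simplified sketch: get_event() and set_wheel_velocities() stand in for the socket reads and writes shown above, and the way the event string encodes which side triggered is an assumption.

import random
import time

def auto_roam(get_event, set_wheel_velocities):
    # Cruise forwards until the .NET 'safety' or 'drop' support stops the
    # Bot, then decide what to do next from the event/sensor data.
    while True:
        set_wheel_velocities(0.2, 0.2)
        event = get_event()
        if event.startswith("SAFETY") or event.startswith("DROP"):
            # The hardware has already halted the Bot; just pick a new heading.
            turn = 0.15 if "LEFT" in event else -0.15
            set_wheel_velocities(turn, -turn)   # spin away from the obstacle
            time.sleep(1.0 + random.random())   # turn for a semi-random time
            set_wheel_velocities(0.0, 0.0)
        time.sleep(0.1)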

I would also like to try some very rudimentary mapping using sensor data, perhaps with some additional sensors as I discussed recently. This will likely take the form of a 'wall follower' that takes crude measurements and populates a 2D array to draw out an approximate map of the environment. I guess this will be a bit like measuring the room from different points with an ultrasonic tape measure - but with the lights out, so you have to follow the walls with your hands to find the corners.
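
As a very rough sketch of that idea (the grid size, cell size and the pose inputs below are all placeholder assumptions, with pose eventually coming from odometry), each range reading taken while wall-following could be projected out from the Bot's position and marked in a 2D array:

import math

GRID_SIZE = 100        # 100 x 100 cells
CELL_SIZE = 0.1        # 10 cm per cell (assumed)
grid = [[0] * GRID_SIZE for _ in range(GRID_SIZE)]

def mark_obstacle(x, y, heading, range_reading):
    # Project the range reading out from the Bot's pose and mark that cell.
    ox = x + range_reading * math.cos(heading)
    oy = y + range_reading * math.sin(heading)
    col = int(ox / CELL_SIZE) + GRID_SIZE // 2
    row = int(oy / CELL_SIZE) + GRID_SIZE // 2
    if 0 <= row < GRID_SIZE and 0 <= col < GRID_SIZE:
        grid[row][col] = 1   # 1 = wall/obstacle seen here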
