Several example clients written in Python are provided with the MDK. You can find these at
~/mdk/bin/shared. Generally, they will work with any of the development Profiles (though some features may be missing in some configurations).
Using examples on board
With the release of the 2020 edition, the MDK has moved to Python 3, and all the example scripts here are now configured to run in Python 3 by default. The robot itself runs ROS Kinetic, which has compatibility issues with Python 3; to run the examples on board, start them in Python 2 explicitly (e.g. by invoking them with python2 rather than python3).
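As a quick illustration (this guard is not part of the MDK, and the function name is ours), a script intended for on-board use could check the interpreter before doing anything ROS-related:

```python
import sys

def interpreter_ok(version_info, required_major=2):
    """Return True if the running interpreter matches the major
    version required (2 for the on-board ROS Kinetic stack)."""
    return version_info[0] == required_major

# On the robot, ROS Kinetic links against Python 2, so fail early
# with a clear message rather than with an obscure import error.
if not interpreter_ok(sys.version_info):
    sys.stderr.write("run this script on board with python2\n")
```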
General control examples
Each of these examples illustrates how to use some parts of the interface of the robot (or simulator), exchanging data and, in some cases, additionally providing some small example of data processing.
- Receive, extract and display various sensor (or control) topics. Not exhaustive, but a good example of a minimal client for some straightforward sensors and actuators.
- Receive audio data from the microphones, and send audio to the speaker.
- Manage streaming of audio to the speaker, keeping the buffer topped up to ensure continuous playback (see also client_audio, which uses much the same approach).
- Receive images from the cameras, do some processing using OpenCV, and either display them or record them to video files.
- This client is used by our development team for testing various elements of the robot—a support engineer may ask you to run this client to perform diagnostics. However, it may also be useful to explore the client to see how to control those aspects of the robot.
- Drive MiRo in a square, repeatedly, with or without using the included position controller to improve the accuracy of the movement.
- A short client showing the minimal structure required to read data from the robot and control its movement.
- An even shorter example that just reads back the live battery voltage.
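The square-driving example above comes down to timed open-loop commands (optionally corrected by the position controller). The underlying arithmetic can be sketched in a few lines; note that the wheel track below is an assumed illustrative value, not the MDK's real constant:

```python
import math

# Assumed wheel separation for illustration only; the real value is
# defined by the MDK, not here.
WHEEL_TRACK_M = 0.16

def wheel_speeds(lin_vel, ang_vel, track=WHEEL_TRACK_M):
    """Convert body linear/angular velocity (m/s, rad/s) to
    left/right wheel speeds for a differential drive."""
    left = lin_vel - ang_vel * track / 2.0
    right = lin_vel + ang_vel * track / 2.0
    return left, right

def square_legs(side_m, lin_vel, ang_vel):
    """Return (duration_s, lin_vel, ang_vel) command tuples that trace
    one square: four straight legs interleaved with four 90-degree turns."""
    legs = []
    for _ in range(4):
        legs.append((side_m / lin_vel, lin_vel, 0.0))           # straight leg
        legs.append(((math.pi / 2.0) / ang_vel, 0.0, ang_vel))  # 90-degree turn
    return legs
```

In practice each tuple would be streamed to the robot's velocity command topic for its duration; without feedback, timing errors accumulate, which is exactly why the example offers the position controller as an option.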
Special purpose examples
- This is by far the most comprehensive interfacing example, showing how to access all of the upstream signals and to command all of the downstream signals. However, it's quite complex, so you may prefer to start by looking at one of the simpler examples.
- This example interacts with the demo controller, which must therefore be running. Run without arguments to see the options.
- This client allows live control of the configuration of the demo controller. It is still in development and is not currently documented.
- This client shows how to drive MiRo's voice controller, which produces the vocalisations heard when running the demo.
Kinematic chain examples
- Simple use of the KC model to transform locations and directions between frames of the robot body model. This is not a live example, and does not connect to the robot.
- This more complex client illustrates how to use the KC and camera models to map a pixel location in image space all the way through the system to a location in the WORLD frame. It uses the ROS topics sensors/kinematic_joints and sensors/body_vel to configure the mapping, and is live, so the output will update if you manually move the robot (or if you control it to move). The camera model is used to reverse the particular distortion introduced by the lenses of MiRo's cameras, and then the KC model is used to map from HEAD space to WORLD space.
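This is not the MDK's kc interface (see the examples themselves for the real model), but the core idea, chaining frame-to-frame transforms until a point expressed in a local frame lands in WORLD, can be sketched with 2-D homogeneous transforms; all values below are made up for illustration:

```python
import math

def make_transform(yaw, tx, ty):
    """2-D homogeneous transform: rotate by yaw, then translate by (tx, ty)."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [[c, -s, tx],
            [s,  c, ty],
            [0.0, 0.0, 1.0]]

def compose(A, B):
    """Matrix product A @ B: apply B first, then A."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def apply(T, point):
    """Map a 2-D point through a homogeneous transform."""
    x, y = point
    return (T[0][0] * x + T[0][1] * y + T[0][2],
            T[1][0] * x + T[1][1] * y + T[1][2])

# A point in a HEAD-like frame reaches WORLD by composing per-link
# transforms, just as the kc model does in 3-D with live joint angles.
body_to_world = make_transform(math.pi / 2.0, 1.0, 0.0)  # robot at (1, 0), facing +y
head_to_body = make_transform(0.0, 0.1, 0.0)             # head 0.1 m ahead of body
head_to_world = compose(body_to_world, head_to_body)
```

Because the joint angles enter through the per-link transforms, recomputing the chain whenever sensors/kinematic_joints updates is what makes the mapping live.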
The examples above all use the native Python ROS tools (rospy) to communicate with the robot. In the sub-directory robot, you'll find examples that use the provided RobotInterface object, which handles ROS for you. Some users may prefer this way of driving the robot.
- Does what it says on the tin.
- Control MiRo using arrow keys!
- An example of how to receive camera images using this interface.