This section presents a detailed analysis of the following sensors:
- Proximity infrared sensors
- Infrared floor sensors
- Camera and microphones
- Object recognition via programming interface
We initialise the robot's sensors, calibrate the infrared sensors and plot the calibration data. For now, we are only interested in the two proximity infrared sensors at the front of the robot (PROX LEFT FRONT and PROX RIGHT FRONT). The relevant information can be found in the GCTronic specification¹.
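The calibration step can be sketched as follows. This is a minimal illustration, not the e-puck library's actual API: the sensor-reading call is replaced by plain lists, and the helper names (`calibrate`, `corrected`) are hypothetical. The idea is to average several ambient readings per sensor and use that as a zero offset.

```python
def calibrate(samples):
    """Average several ambient readings per sensor to obtain a zero offset.

    samples: list of readings, each a list with one raw value per sensor
    (e.g. [PROX LEFT FRONT, PROX RIGHT FRONT]).
    """
    n = len(samples)
    return [sum(s[i] for s in samples) / n for i in range(len(samples[0]))]

def corrected(raw, offsets):
    """Subtract the ambient offset so an empty field reads approximately 0."""
    return [max(0, r - o) for r, o in zip(raw, offsets)]

# Example with simulated ambient readings for the two front sensors:
offsets = calibrate([[10, 12], [14, 8]])   # -> [12.0, 10.0]
values = corrected([50, 9], offsets)       # obstacle left, nothing right
```

With the offsets applied, a reading near zero means no obstacle, and larger values indicate increasing reflected IR from a nearby object.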
There are three infrared floor sensors in the lower section at the front of the e-puck. We use the `robot.init_ground()` function to test the response behaviour of the sensors.
Floor sensors front view | Floor sensors bottom view | Coloured lines for ground sensor testing
---|---|---
![]() | ![]() | ![]()
The values of these sensors range between 1000 (pure white) and 0 (black). A white sheet with coloured lines glued to it serves as our test setup: we place the robot on the bottom white edge of the paper and then let it drive vertically over all the lines.
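Counting the lines crossed during such a run can be done with a simple edge detector over the stream of ground-sensor values. The sketch below assumes the 0 (black) to 1000 (white) range described above; the threshold of 500 and the function name are illustrative choices, not part of the e-puck API.

```python
def count_lines(readings, threshold=500):
    """Count dark lines crossed in a sequence of ground-sensor values.

    A line begins when the reading drops below the threshold and ends
    when it rises back above it, so each line is counted exactly once
    even though it spans several consecutive readings.
    """
    count, on_line = 0, False
    for v in readings:
        if v < threshold and not on_line:
            count += 1
            on_line = True
        elif v >= threshold and on_line:
            on_line = False
    return count

# Simulated drive over two coloured lines on white paper:
print(count_lines([900, 850, 200, 150, 880, 300, 900]))  # -> 2
```

In practice the threshold would be tuned per surface, since coloured lines reflect more IR than pure black and may sit well above 0.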
We let the robot scan an environment of differently coloured blocks at different distances, a black ball and another e-puck. The robot rotates around itself, saves the resulting images and creates a CSV document with 350 measurements (approximately two measurements per step). Each measurement contains the following fields: x_centre, y_centre, width, height, conf and label, where x_centre and y_centre describe the detected centre of the respective object.
In addition to the self-explanatory width and height attributes, conf and label are of particular interest. The label takes the values red block, green block, blue block or black block, but black ball and epuck can also be recognised.
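Filtering these measurements by confidence can be sketched as follows. The CSV layout matches the fields listed above; the function name and the default threshold of 0.7 are assumptions for illustration, not fixed by the detection interface.

```python
import csv
import io

def parse_detections(csv_text, conf_threshold=0.7):
    """Parse detection rows (x_centre,y_centre,width,height,conf,label)
    and keep only those at or above the confidence threshold."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return [
        {**row, "conf": float(row["conf"])}
        for row in rows
        if float(row["conf"]) >= conf_threshold
    ]

# Example: the low-confidence epuck detection is discarded.
sample = (
    "x_centre,y_centre,width,height,conf,label\n"
    "120,80,40,40,0.91,red block\n"
    "200,75,35,60,0.34,epuck\n"
)
kept = parse_detections(sample)  # -> one row, label "red block"
```

A lower threshold admits more (and noisier) detections, so the cut-off has to be chosen against the environment at hand.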
Recognition of blue and yellow blocks | |
---|---|---
![]() | ![]() | ![]()

The threshold (here: 0.7) is crucial for object recognition and must be determined heuristically.