Francesco Melpignano

Autonomous Mower Conversion

☕️☕️ 12 min read

Redwing’s First Product

In the summer of 2016 I started a company whose mission was to bring golf course maintenance into the 21st century by creating autonomous vehicles to perform jobs traditionally carried out by low-skill labourers.

My first move in this space was to strike an initial partnership with the Italian National Research Center for Robotics, working specifically with Prof. Ettore Stella and his team of talented researchers. Their expertise, like mine, lay more on the software side of robotics as opposed to mechatronics. We decided the best course of action would be for them to start developing a machine-learning navigation algorithm purpose-built for the terrain and surroundings of a golf course. The hardware was to be kept as simple as possible, so we used a golf cart, driven programmatically through its control unit and a robotic steering column, to test the navigation software.

Concurrently, I had to decide which product I wanted to bring to market, given the myriad of different jobs that could be automated. After thorough market research and some preliminary tinkering, I decided that our first project would be to explore the feasibility of retrofitting existing mowers, already owned by golf courses, with self-driving kits. The alternative idea at the time was designing a grass-mowing autonomous vehicle from the ground up, but that seemed financially unfeasible due to the high cost of building a prototype (~500k).

In order to parallelize the development of the navigation software and this new self-driving conversion kit, I decided to hire three summer interns from the robotics program at the University of Pennsylvania, my alma mater. We split the work as follows: Alex and Ahmed, who had mechanical engineering backgrounds, would focus on the mechanics of converting the mower, whilst Kevin and I, both with CS backgrounds, would focus on the navigation and vision algorithms.

The following hardware was put in place to drive the mower: a linear actuator on the accelerator, a chain-and-motor system on the steering column, and a pulley-and-motor system on the cutter lever. Once this part of the project was done, our designated roles quickly disintegrated, as the problems that arose needed all of our combined mental power to fix. We spent the bulk of our time integrating the large number of hardware devices we were using with ROS (the Robot Operating System), which acted as the “brain” of the whole operation (see image below).

ros-image

I was personally most involved with integrating and interpreting the signals coming from our stereo camera to dynamically navigate the mower close to the edge of the grass. This complemented my machine-learning skills well, since I had to figure out how to properly use the OpenCV Python and C++ libraries to feed the system the right data for correct navigation decisions. It also turned out to be the most important part of our navigation algorithm, because the system being developed by Prof. Stella was largely based on GPS, which could not provide enough accuracy to cut the edge of the grass to the desired precision of 1 cm. Here is our turnaround algorithm when close to the edges:

ros-image
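
For the curious, here is a minimal sketch of the kind of OpenCV processing involved in finding a grass edge in a single camera frame. The HSV thresholds and the simple colour-based segmentation below are illustrative assumptions, not our exact pipeline.

import cv2
import numpy as np

def estimate_grass_edge_column(frame_bgr):
    """Return an approximate pixel column of the grass edge in one frame.

    Illustrative only: the HSV range below is a placeholder and would need
    tuning for the actual lighting, camera, and turf conditions.
    """
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Hypothetical HSV range for the unmowed side of the edge (placeholder values)
    mask = cv2.inRange(hsv, np.array([35, 60, 40]), np.array([85, 255, 200]))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    # For every 10th image row, take the left-most masked pixel as the edge
    columns = []
    for row in mask[::10]:
        nonzero = np.flatnonzero(row)
        if nonzero.size:
            columns.append(nonzero[0])
    return float(np.mean(columns)) if columns else None

An edge estimate like this, expressed as an offset from the image centre, is the kind of signal that would feed the steering decision when mowing along the boundary.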

Despite the successful realization of the project, we were unable to bring this product to market due to the complications of controlling unpredictable diesel engines and the variety of models from all the different brands on the market.

I decided to rethink Redwing Technologies' strategy and create a small robot, designed by us from the ground up with electric motors, to mitigate all the problems we encountered while building and coding this prototype. That experience will be outlined in a later blog post.

Here is a description of how we made the magic happen and a deeper look at our tech stack:

System

  • Computer: Latitude E6400 XFR
  • Operating System: Ubuntu 14.04
  • ROS Distro: Indigo
  • Mower: Toro ReelMaster 3100-D
  • Microcontrollers: (x2) Fishino Mega
  • IMU: Xsens MTi-G-700
  • Encoders: (x3) Kubler incremental TTL rotary encoder, 6000 rpm, 1024 ppr, 5 V DC, IP67
  • Stereo Camera: Bumblebee XB3 1394b
  • Steering Motor: Robotzone Gearmotor 6RPM 12VDC
  • Cutter Lever Motor: DC 12V 150mA 10RPM 20.9Kg-cm High Torque Permanent Magnetic Gear Motor
  • Linear Actuator: SKF 500N Electric Linear Actuator, 24V dc, 100mm Stroke
  • Motor Controller (Linear Actuator): Pololu High-Power Motor Driver 24v12
  • Motor Controller (Steering & Cutter): TB6612FNG Dual Motor Driver Carrier

Software Requirements

apt-get

sudo apt-get install ros-indigo-rosserial
sudo apt-get install ros-indigo-rosserial-arduino
sudo apt-get install ros-indigo-robot-localization

Git

$ cd catkin_ws/src
$ git clone https://bitbucket.org/AndyZe/pid.git
$ git clone https://github.com/lukscasanova/mtig_driver.git

Binaries

Pip

pip install shapely
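
shapely backs the fairway-boundary geometry used by the path planner. Here is a minimal sketch of the kind of containment check it enables; the coordinates are made up, and the real boundaries live in the map files under turf_slayer/maps.

from shapely.geometry import Point, Polygon

# Hypothetical fairway boundary in the map frame (metres); real boundaries
# are loaded from the files in turf_slayer/maps.
fairway = Polygon([(0, 0), (40, 0), (45, 120), (5, 125)])

def inside_fairway(x, y, margin=0.5):
    """True if (x, y) lies inside the boundary shrunk by a safety margin."""
    return fairway.buffer(-margin).contains(Point(x, y))

print(inside_fairway(20.0, 60.0))   # True
print(inside_fairway(-3.0, 60.0))   # False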

Arduino

Description: We run two different scripts, one for encoder reading and one for PWM output, on two separate Fishino Megas. Both Arduinos act as ROS nodes via rosserial_arduino. The script left_right_wheel_encoder.ino uses edge-triggered interrupts on INT4 and INT2 to read the left and right wheel encoders respectively and publishes at a rate of ~195 Hz. pwm_with_back_encoder.ino subscribes to control gains and outputs the corresponding direction and duty cycles as 12-bit-resolution PWM to the motor controllers. It also toggles the cutters with a set routine that can be called from ROS.
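
For reference, here is a minimal sketch of what the host side publishes to the PWM Arduino; it mimics what controls_routing_node does. The ActuatorSignal field names (gain, direction) and values below are assumptions for illustration, not the actual message definition.

#!/usr/bin/env python
import rospy
from turf_slayer.msg import ActuatorSignal  # custom message; field names below are assumed

def main():
    rospy.init_node("pwm_command_example")
    pub = rospy.Publisher("/steering_controller/output", ActuatorSignal, queue_size=1)
    rospy.sleep(1.0)              # give rosserial time to establish the connection

    msg = ActuatorSignal()
    msg.gain = 2048               # roughly half of the 12-bit duty-cycle range (illustrative)
    msg.direction = 1             # motor direction flag (illustrative)
    pub.publish(msg)              # pwm_with_back_encoder.ino turns this into PWM output

if __name__ == "__main__":
    main()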

Include

Copy the directories in /arduino/include to the ~/Arduino/libraries directory.

$ cp -r pins ~/Arduino/libraries
$ cp -r CumulativeMovingAverage ~/Arduino/libraries

When adding new ROS messages to the Arduino code for the rosserial connection, do the following:

$ cd ~/Arduino/libraries
$ rm -rf ros_lib
$ rosrun rosserial_arduino make_libraries.py ./

Current Arduino scripts used:

left_right_wheel_encoder.ino
pwm_with_back_encoder.ino

File Structure

/arduino                  (all necessary arduino code)
    /include              (code to include in Arduino libraries)
    /{arduino scripts}    (various scripts for running arduino code)
/turf_slayer              (ROS package directory)
    /launch               (Launch files)
    /maps                 (Map files for fairway boundaries)
    /msg                  (custom ROS messages)
    /scripts              (ROS scripts and nodes to run project)
    /tests                (Unit tests and other testing files)

ROS

Frames

Units and frames conform to REP 103 and REP 105.

Currently there are five frames used.

  • map (global frame, see REP 105)
  • utm
  • odom (global frame, see REP 105)
  • base_link (defined as the center of the front wheel axle at ground level, with x forward, y left, z up)
  • imu_link (defined as the center of the Xsens IMU, with x forward, y left, z up)

The utm -> map transform is computed by navsat_transform.launch as a static transform in tf2.

The imu_link -> base_link transform is a static tf transform defined in start.launch.
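
For illustration, the snippet below does in Python what that static transform declaration amounts to (assuming the tf2_ros StaticTransformBroadcaster available in Indigo). The parent/child assignment and the offsets are placeholders, not our measured mounting values.

#!/usr/bin/env python
import rospy
import tf2_ros
from geometry_msgs.msg import TransformStamped

if __name__ == "__main__":
    rospy.init_node("imu_static_tf_example")
    broadcaster = tf2_ros.StaticTransformBroadcaster()

    t = TransformStamped()
    t.header.stamp = rospy.Time.now()
    t.header.frame_id = "base_link"    # parent/child choice here is illustrative
    t.child_frame_id = "imu_link"
    t.transform.translation.x = 0.30   # placeholder: IMU 30 cm ahead of the axle centre
    t.transform.translation.z = 0.50   # placeholder: IMU 50 cm above ground level
    t.transform.rotation.w = 1.0       # identity rotation: both frames are x forward, y left, z up

    broadcaster.sendTransform(t)
    rospy.spin()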

The other transforms are computed by the two extended Kalman filter nodes launched from continuous_ekf_localization.launch and gps_ekf_localization.launch (robot_localization package).

Note on GPS: Although the GPS data comes from the Xsens IMU, the frame_id associated with the GPS is the base_link frame because of its mounting location.

Nodes and Topics

ros-image

Nodes

  • mtig_driver (mtig_driver package)
    • Routes raw data from the Xsens IMU to ROS.
  • filter_imu_data_node (filter_imu_data_node.py)
    • Filters and re-routes data from the IMU.
  • turf_slayer_odometry_node (odometry_node.py)
    • Calculates odometry using encoder data, pressure, and temperature.
  • continuous_ekf_localization (robot_localization package)
    • Filters continuous odometry and IMU data.
  • gps_ekf_localization (robot_localization package)
    • Filters and outputs discrete odometry data.
  • navsat_transform (robot_localization package)
    • Outputs positioning and transforms from GPS data.
  • path_planner_node (path_planner_node.py and test_pure_pursuit.py)
    • Creates a grid surrounding the fairway boundary and computes a global_path for the robot.
  • path_tracking_node (path_tracking_node.py)
    • Implements the pure-pursuit proportional controller for path tracking (a minimal sketch follows this list).
  • controls_routing_node (controls_routing_node.py)
    • This node routes the correct topics to the steering and actuator controllers.
  • steering/controller (pid package)
    • PID controller for steering by controlling rear wheel angle.
  • actuator/controller (pid package)
    • PID controller for the actuator by controlling linear velocity in base_link frame x direction.
  • serial_node_encoder (rosserial)
    • Sends encoder data to the turf_slayer_odometry_node and controls_routing_node.
  • serial_node_pwm (rosserial)
    • Receives gains from the controls_routing_node and sends them to the Arduino.
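
As a reference for the path_tracking_node above, here is a minimal sketch of a single pure-pursuit step. The lookahead distance and wheelbase are placeholders, and the sketch treats the mower as a front-steered bicycle model, ignoring the rear-wheel-steering geometry of the ReelMaster.

import math

def pure_pursuit_steering(x, y, yaw, path, lookahead=2.0, wheelbase=1.5):
    """One pure-pursuit step: chase the first path point at least `lookahead`
    metres away and return a desired steering angle in radians.

    The parameters here are placeholders, not the values used on the mower.
    """
    # Pick the first waypoint beyond the lookahead distance
    target = path[-1]
    for px, py in path:
        if math.hypot(px - x, py - y) >= lookahead:
            target = (px, py)
            break

    # Bearing to the target expressed in the vehicle frame
    alpha = math.atan2(target[1] - y, target[0] - x) - yaw

    # Classic pure-pursuit curvature converted to a steering angle
    return math.atan2(2.0 * wheelbase * math.sin(alpha), lookahead)

# e.g. the result would be published on /steering/desired_angle for the PID controller
desired_angle = pure_pursuit_steering(0.0, 0.0, 0.0, [(1.0, 0.0), (3.0, 0.5), (6.0, 2.0)])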

Topics

  • /xsens/*
    • Data output from the mtig_driver.
  • /filtered/*
    • Outputs from the filter_imu_data_node after data filtering has occurred.
  • /wheel_encoder/data
    • Encoder output from the Arduino, using the custom EncoderTick message type.
  • /rear_wheel_angle
    • The angle of the back wheel in radians from the center line using tricycle geometry. This is controlled by the steering controller.
  • /odometry/raw
    • Odometry calculated from the encoders in the odom frame. The node also tracks orientation from its initial position, which is used for the position calculations (a sketch of this dead-reckoning step follows the topic list).
  • /odometry/ekf
    • Fused odometry using IMU data and odometry data with ekf_localization from robot_localization. This is an accurate representation of localization within the odom frame.
  • /odometry/gps
    • Odometry calculated from the navsat_transform in the map frame. This is used to fuse a more globally accurate position.
  • /odometry/filtered
    • Calculated by fusing all of the above odometry calculations. This is the most globally accurate odometry in the map frame.
  • /global_path
    • Path message output from the path_planner_node.
  • /steering/desired_angle
    • This is the desired angle of the back wheel which is sent to the pid controller.
  • /steering_controller/gain
    • The gain output from the steering controller.
  • /actuator/desired_velocity
    • This is the desired velocity of the vehicle which is sent to the pid controller.
  • /actuator/measured_velocity
    • The current measured velocity that is controlled.
  • /actuator_controller/gain
    • The gain output from the actuator velocity controller.
  • /steering_controller/output
    • The gain and direction sent to the Arduino using the custom ActuatorSignal message.
  • /actuator_controller/output
    • The gain and direction sent to the Arduino using the custom ActuatorSignal message.
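
As referenced under /odometry/raw, here is a minimal sketch of the dead-reckoning step that turns wheel-encoder tick deltas into a pose update in the odom frame. The wheel radius and track width are placeholders, and the real node also uses pressure and temperature, which this sketch omits.

import math

TICKS_PER_REV = 1024      # Kubler encoders: 1024 ppr (before any quadrature multiplication)
WHEEL_RADIUS = 0.25       # metres, placeholder for the drive wheel radius
TRACK_WIDTH = 1.2         # metres, placeholder distance between the left and right wheels

def integrate_odometry(x, y, yaw, d_left_ticks, d_right_ticks):
    """Advance the pose (x, y, yaw) by one differential-drive dead-reckoning step."""
    d_left = 2.0 * math.pi * WHEEL_RADIUS * d_left_ticks / TICKS_PER_REV
    d_right = 2.0 * math.pi * WHEEL_RADIUS * d_right_ticks / TICKS_PER_REV

    d_center = (d_left + d_right) / 2.0
    d_yaw = (d_right - d_left) / TRACK_WIDTH

    x += d_center * math.cos(yaw + d_yaw / 2.0)
    y += d_center * math.sin(yaw + d_yaw / 2.0)
    return x, y, yaw + d_yaw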

Launching

Please follow these steps to ensure proper startup.

  1. Disconnect all Arduino cords.
  2. Attach the actuator, steering chain, and cutter lever (see redwing_mechanics/Descriptions.md).
  3. Turn the key on the machine halfway so that the lights come on.
  4. Wait for the lower-right light to turn off.
  5. Turn the ignition to start the machine.
  6. Plug in all cords from the Arduinos, IMU, and other devices to the computer.
  7. Check which ports correspond to the devices by running ls /dev/ttyUSB*.
  8. From a terminal, run the following command, or rosrun turf_slayer script_you_want_to_run.py:
$ roslaunch turf_slayer start.launch encoder_port:="/dev/ttyUSB0" pwm_port:="/dev/ttyUSB1"

The encoder_port and pwm_port parameters are optional and default to the values above, but they should match the correct Arduino ports; these may switch because of port and cord issues.

Note on absolute heading: On startup, the absolute heading is sometimes wrong because the Xsens algorithms only reduce error once the vehicle starts moving around. The EKF filters take some time (about 15 s) to correct for this and for the covariances associated with the values. There also appears to be a slight offset.

Hardware

See the datasheets and product websites for detailed information.

  • The maximum duty cycle for the linear actuator is 25%.
  • Stalling the motor may break the gearbox.

Known Issues

USB Ports

The USB ports on the computer have been giving us trouble, stemming from a combination of hardware and software issues. The ports on the Latitude E6400 XFR are finicky and seem to disconnect frequently when jiggled, which causes the port name to switch: for example, /dev/ttyUSB1 will switch to /dev/ttyUSB3 or disappear entirely, disconnecting rosserial. In addition, the mtig_driver uses code that searches for and attempts to connect to USB ports with the Xsens IMU. To combat this, we added a shell script called scripts/timed_roslaunch.sh that delays the launching of rosserial and takes USB port arguments, as described above in Launching.
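
Conceptually the workaround amounts to something like the following Python sketch (illustrative only, not the actual shell script): wait for the ports to settle, then run the requested launch command.

#!/usr/bin/env python
"""Illustrative Python equivalent of scripts/timed_roslaunch.sh."""
import subprocess
import sys
import time

if __name__ == "__main__":
    delay = float(sys.argv[1])   # seconds to wait, e.g. 5
    command = sys.argv[2:]       # e.g. roslaunch turf_slayer start.launch encoder_port:=/dev/ttyUSB0
    time.sleep(delay)            # let the flaky USB ports enumerate before rosserial connects
    subprocess.call(command)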

CPU Usage and Battery Life

We have noticed that the Latitude E6400 XFR struggles to handle running many threads and processes at once; we were unable to properly test the path-tracking algorithms because of this. It also only has a battery life of 1 to 3 hours when used outside, and when the battery level is low there is significant lag.

Magnetic Disturbances

Because the Toro ReelMaster 3100-D is constructed from iron and steel, we have observed large hard-iron and soft-iron distortions. These have been calibrated for using the Xsens MT software, and the appropriate offsets have been uploaded to the IMU. The machine also experiences magnetic fluctuations when it is first turned on, because the battery outputs a large amount of current; these fluctuations subside about 20 seconds after turning the machine on.

Vibrations

The Toro ReelMaster 3100-D experiences a large amount of vibration and disturbance while moving. These vibrations affect both the IMU and encoder data. We have implemented moving-average filters for the IMU data. The encoders experience drift when stationary, but we have not accounted for this.
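
The filtering itself is nothing exotic; a minimal sketch of the kind of moving-average filter applied to the IMU streams is below. The window length is a placeholder, not the value tuned on the mower.

from collections import deque

class MovingAverage(object):
    """Fixed-window moving average, applied sample-by-sample to an IMU stream."""

    def __init__(self, window=20):        # window length is a placeholder
        self.samples = deque(maxlen=window)

    def update(self, value):
        self.samples.append(value)
        return sum(self.samples) / float(len(self.samples))

# e.g. smooth each incoming yaw-rate reading before it is used downstream
yaw_rate_filter = MovingAverage(window=20)
smoothed = yaw_rate_filter.update(0.12)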

Moving Forward

Move Controls to Arduino

We can achieve a faster control loop by moving the controls onto the Arduino, which would also keep unnecessary processes from using CPU and RAM on the laptop.

Implement Advanced Controls

We’re using a basic PID controller with a pure pursuit algorithm to control our vehicle. A more advanced path-tracking algorithm could increase the efficiency of our machine and at the same time alleviate issues that come with simple controllers, such as integral windup.

Robust Steering

The steering motor currently has a no-load speed of 6 rpm and takes ~7 s to reach a back-wheel angle of 30 degrees. Replacing it with a stronger motor and a friction-drive system would give us more robust and reliable steering control.

Tune Covariances

Localization works well (though with unknown error). The process noise covariances, initial state covariances, and measurement covariances are all values that should be tuned for higher precision and accuracy.

Perception

To give the system higher precision, a perception system using point clouds and lasers should be implemented to build a 3D map.

Obstacle Detection

Obstacle detection should be implemented both in front of and behind the vehicle, since it moves in both directions.

Mapping

3D mapping and reference points are recommended to increase precision.

More general algorithms should be implemented for navigating known spaces, such as moving between fairways.