Laura: Self-driving public transportation. Prototype II.

Started on Sep 2010 and finished on Aug 2011.

Abstract: Discover how this 12-ton truck was automated to drive itself with Computer Vision. This was the second prototype, after a Citroën C3, in a project led by Siemens to develop a self-driving public transportation system.

Technologies: OpenCV, C++, Windows, C#, .NET, IEEE 1394, LWS1, 7 more...

Tags: Computer Vision, Autonomous Robotics, Unmanned Vehicles, Dynamic Control, 3 more...

This article is about the second prototype of a self-driving public transportation system: an IVECO Eurocargo ML150E18. Before going on, you might like to read about the first prototype here.

The project aimed to create a driverless vehicle, but with a different goal than the Google car, which had not been unveiled yet. Our system targeted the public transportation market, so it was not supposed to drive freely but on predefined marked routes: something like a tram, but with no rails and the possibility of switching to another predefined route at certain points. The propulsion system, on which I cannot elaborate, required constant, accurate control of the vehicle's position on the lane, far beyond what a human driver could achieve. This was the main reason to automate driving.

The Laura family. First prototype, Citroën C3 [right], and second prototype, IVECO Eurocargo ML150E18 [left].

New responsibilities

As in prototype I, Siemens was the project leader and the propulsion system developer. Vehicle maintenance and the installation of sensors and actuators were handled by the University Institute of Automobile Research (INSIA). At the Computer Vision Group (CVG), we continued providing Computer Vision localization and vehicle trajectory control and, in addition, became responsible for the full in-vehicle architecture, network and low-level controllers.

In the previous prototype, I worked as a project engineer for the CVG. This time, I also played the roles of system architect and CVG project manager. I had already attended many meetings since prototype I, but now I was also in charge of managing the CVG project team members and organizing the workflow.

Prototype II: IVECO Eurocargo ML150E18. Approx. length: 10 meters. Approx. total weight: 12 tons (truck + 2-ton load).

New hardware diagram

Prototype II hardware is distributed across five physical networks. A WiFi access point wirelessly connects the vehicle with external elements, like a base station or a portable user interface. The other four networks are cabled and interconnect the in-vehicle devices. The two system brains, the SIMATIC computers, communicate with each other through a 1 Gbps Ethernet network. One of them executes the vision algorithms and is connected to the camera through a FireWire 800 bus. The other computer performs all high-level and low-level control and is connected through USB to an IXXAT USB-to-CAN II unit, a gateway to the CAN bus, to which all sensors and actuators are connected. The hardware diagram is shown in the figure below.

Hardware architecture of Laura prototype II.

For steering and brake actuation, INSIA installed Maxon motors. The motors were driven by two EPOS2 units from Maxon. These units connect directly to the CAN network and feature internal controllers for position, speed and electric current control. In the control computer, I programmed modules to communicate with the EPOS2 through the CAN bus and perform real-time control actions.

    

EPOS2 units [left] to control steering and brake motors [right].
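To give a concrete idea of what those CAN modules do on the wire, here is a minimal, hypothetical C# sketch of packing a CANopen expedited SDO write, the kind of frame used to set an object such as Target Position (0x607A) on an EPOS2 node. The transport delegate, the node ID and the values are illustrative assumptions, not the project's actual driver calls or parameters.

```csharp
// Minimal sketch (not the project's actual code): packing a CANopen
// expedited SDO download to write a 32-bit object on an EPOS2 node.
// The transport (IXXAT USB-to-CAN) is abstracted behind a hypothetical
// delegate; object 0x607A ("Target Position") belongs to the CiA 402
// profile that the EPOS2 implements.
using System;

static class SdoSketch
{
    // Hypothetical stand-in for the real CAN driver call.
    public delegate void SendFrame(uint canId, byte[] data);

    public static void WriteObject32(SendFrame send, byte nodeId,
                                     ushort index, byte subIndex, int value)
    {
        var frame = new byte[8];
        frame[0] = 0x23;                          // expedited download, 4 data bytes
        frame[1] = (byte)(index & 0xFF);          // object index, little-endian
        frame[2] = (byte)(index >> 8);
        frame[3] = subIndex;
        BitConverter.GetBytes(value).CopyTo(frame, 4); // data, little-endian
        send(0x600u + nodeId, frame);             // SDO receive COB-ID of the node
    }

    // Example: command a target position on node 1 (illustrative values).
    public static void Example(SendFrame send) =>
        WriteObject32(send, nodeId: 1, index: 0x607A, subIndex: 0, value: 20000);
}
```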

The Guppy Pro camera, the FireWire 800 bus and the SIMATIC IPC647C together made video capture and processing at 60 fps possible with my vision library, the same one as in prototype I. For further details on the vision algorithm, go here. The isolation structure was made bigger and the camera was placed a bit higher to capture a larger ground rectangle. In this way, wider marks with more bits per mark could be detected and decoded, allowing for more complex routes than the original test circuit.

New camera isolator, much larger than the first prototype's, allows capturing wider visual marks with a higher bit count.
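As a rough illustration of why raising the camera helps, consider a simplified pinhole camera looking straight down from height h with horizontal field of view θ: the ground footprint width is about 2·h·tan(θ/2), so a higher mount widens the strip of road, and hence the marks, that fit in the image. The heights and field of view in this sketch are made up, not the real Laura geometry.

```csharp
// Illustrative only: approximate ground footprint width of a downward-looking
// camera as a function of mounting height and horizontal field of view.
// The heights and FOV below are made-up values, not the real Laura geometry.
using System;

static class FootprintSketch
{
    static double FootprintWidthMeters(double heightM, double hFovDegrees) =>
        2.0 * heightM * Math.Tan(hFovDegrees * Math.PI / 360.0); // 2·h·tan(FOV/2)

    static void Main()
    {
        foreach (var h in new[] { 1.0, 1.5, 2.0 })   // hypothetical mounting heights
            Console.WriteLine($"h = {h:0.0} m -> width ≈ {FootprintWidthMeters(h, 60):0.00} m");
        // Raising the camera from 1.0 m to 2.0 m roughly doubles the visible
        // ground width, which is what allows wider, higher-bit-count marks.
    }
}
```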

The LWS1 was the same absolute encoder that was used to sense the steering wheel position in prototype I, whereas the custom CAN interface was a new development for prototype II. We needed a way to read the pulse train from the tachometer and transmit the speed measurement through the CAN bus. For this purpose I developed a custom CAN node with a dsPIC33FJ128GP804 microcontroller for pulse-to-analog signal conversion, an MCP25050 for CAN node management, an MCP2551 as CAN transceiver and some op amps and analog parts for level shifting and signal conditioning. I soldered the microcontroller to a basic interface board that I had designed a year before for a personal hobby project. For the rest, as it was a one-off development for that prototype, I used prototyping PCB. Everything was encapsulated in a plastic case with DB9 connectors. The development of the whole custom CAN interface took about five days, including testing on the truck, because I already had experience with the microcontroller.
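The speed computation behind that interface is simple; as a hypothetical illustration (pulses per revolution and wheel circumference below are invented values, not the truck's actual tachometer parameters):

```csharp
// Illustrative only: turning a tachometer pulse frequency into vehicle speed.
// All constants are hypothetical; the real values depend on the truck's
// tachometer and drivetrain and are not documented here.
using System;

static class TachoSketch
{
    const double PulsesPerRevolution = 8.0;   // hypothetical tachometer resolution
    const double WheelCircumferenceM = 3.0;   // hypothetical rolling circumference

    // Pulse frequency in Hz -> vehicle speed in km/h.
    static double SpeedKmh(double pulseFrequencyHz)
    {
        double revolutionsPerSecond = pulseFrequencyHz / PulsesPerRevolution;
        double metersPerSecond = revolutionsPerSecond * WheelCircumferenceM;
        return metersPerSecond * 3.6;
    }

    static void Main() =>
        Console.WriteLine($"40 Hz -> {SpeedKmh(40):0.0} km/h"); // example conversion
}
```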

Distributed software architecture

The software architecture in prototype II was completely redesigned from scratch. That was one of my tasks. The system load was balanced between two computers running Microsoft Windows 7: one for visual processing and one for control and decision making. Although there is no assurance that delays are bounded in a non-real-time operating system, the application's low-frequency needs made our clean Windows installation suitable for the prototyping environment.

The software in both computers was conceived as a modular system, distributed among several asynchronous components on a message-passing intercommunication infrastructure programmed in C#. That approach really helped isolate the independent parts of the system, making software maintenance easier, improving scalability and reducing programming errors thanks to C# managed code. It also opened the possibility of freely moving any component from one processing platform to another, maximizing resource usage and increasing overall system performance.

To implement the system, I used a component library whose core I had developed for some of my past personal projects. In the library architecture, every active component has a thread that serially dispatches messages from a queue. A priority can be assigned to any message, and messages carrying data with real-time requirements can be flagged to be treated accordingly (high priority and no buffering). Redesigning the whole system around this simple component model avoided many of the thread synchronization flaws of the previous prototype. Regarding dynamic stability, the typical intercomponent message delay was on the order of a tenth of a microsecond, negligible compared to the time response of any of the dynamic subsystems, so the benefits of the component library easily justified its extra overhead.
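The following is a minimal sketch of that active-component pattern, not the actual library API: a component owns one thread that serially dispatches queued messages, and urgent messages jump ahead of normal ones.

```csharp
// Minimal sketch of the active-component pattern (NOT the actual library API):
// each component owns one thread that serially dispatches messages from its
// queue, and urgent messages jump ahead of normal ones. The real library also
// supported a "no buffering" mode (keep only the latest sample) for real-time
// data, which is omitted here for brevity.
using System;
using System.Collections.Concurrent;
using System.Threading;

class ActiveComponent : IDisposable
{
    private readonly ConcurrentQueue<Action> _urgent = new ConcurrentQueue<Action>();
    private readonly ConcurrentQueue<Action> _normal = new ConcurrentQueue<Action>();
    private readonly SemaphoreSlim _pending = new SemaphoreSlim(0);
    private readonly Thread _worker;
    private volatile bool _running = true;

    public ActiveComponent(string name)
    {
        _worker = new Thread(DispatchLoop) { IsBackground = true, Name = name };
        _worker.Start();
    }

    // Other components call Post; the handler always runs on this component's thread.
    public void Post(Action handler, bool highPriority = false)
    {
        (highPriority ? _urgent : _normal).Enqueue(handler);
        _pending.Release();
    }

    private void DispatchLoop()
    {
        while (_running)
        {
            _pending.Wait();
            if (_urgent.TryDequeue(out var msg) || _normal.TryDequeue(out msg))
                msg();                       // messages are handled one at a time
        }
    }

    public void Dispose()
    {
        _running = false;
        _pending.Release();                  // wake the worker so it can exit
        _worker.Join();
    }
}
```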

Some components from prototype I could be reused, specifically the computer vision libraries and some of the communication modules. They were written in C++, which provided extra execution speed, and their integration into the C# component architecture was achieved through .NET interop, by writing custom wrappers.
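As an illustration of that kind of bridge, here is a hypothetical P/Invoke wrapper; the DLL name, exported function and data layout are invented for the example and do not reproduce the real library.

```csharp
// Illustrative only: a P/Invoke wrapper exposing a native C++ vision routine
// to the C# components. "laura_vision.dll" and "DetectLine" are hypothetical
// names; the real library exports and data layout are not reproduced here.
using System;
using System.Runtime.InteropServices;

[StructLayout(LayoutKind.Sequential)]
struct LineMeasurement
{
    public double LateralOffsetM;   // lateral distance from the line, in meters
    public double HeadingErrorRad;  // angle between vehicle axis and the line
    public int    Valid;            // nonzero if a line was detected in the frame
}

static class VisionInterop
{
    // Hypothetical native export: processes one grayscale frame and fills the result.
    [DllImport("laura_vision.dll", CallingConvention = CallingConvention.Cdecl)]
    private static extern int DetectLine(IntPtr pixels, int width, int height,
                                         out LineMeasurement result);

    public static LineMeasurement? Detect(IntPtr pixels, int width, int height)
    {
        return DetectLine(pixels, width, height, out var m) == 0 && m.Valid != 0
            ? m
            : (LineMeasurement?)null;   // null when the call fails or no line is seen
    }
}
```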

System diagram of software components.

Improved guidance controller

Because of the physical differences between the two prototypes, some improvements were necessary in the guidance controller. Whereas the Citroën C3 was 2.5 meters long and weighed about 1.2 tons, the truck was 10 meters long and weighed 12 tons (truck + 2-ton load). Those characteristics were chosen because of their similarity to the final production vehicle. Obviously, we were facing something with much higher inertia and tighter kinematic constraints than prototype I. Therefore, a more complex dynamic control was required.

In relation to prototype I, the main improvements were:

  • The fuzzy controller was trained in a simulator before going to field tests
  • The integrator effect was embedded in the fuzzy rules, so it could also be tuned during automatic training (see the simplified sketch after this list)
  • Any trajectory with arbitrary shape could be specified using the line as reference, thanks to a kinematic circuit generator
  • Real-time full vehicle state estimation thanks to a route model and multiple sensor fusion: line, visual marks, speed, steering angle
  • Truck and wheel dynamics were taken into account
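
The actual lateral controller was the trained fuzzy controller mentioned above; as a much simplified stand-in to show what embedding the integrator effect in the control law means in practice, here is a PI-style sketch whose integral term is tuned together with the rest. All gains and limits are invented.

```csharp
// Much simplified stand-in for the trained fuzzy guidance controller
// (NOT the actual algorithm): a PI-style lateral controller in which the
// integral action is part of the same control law, so its effect can be
// tuned together with the rest. Gains and limits below are invented.
using System;

class LateralControllerSketch
{
    const double Kp = 0.8;               // hypothetical proportional gain
    const double Ki = 0.05;              // hypothetical integral gain
    const double MaxSteeringRad = 0.5;   // hypothetical steering command limit

    private double _integral;            // accumulated lateral error

    // lateralErrorM: signed distance from the reference line, in meters.
    // dt: control period in seconds. Returns a steering command in radians.
    public double Step(double lateralErrorM, double dt)
    {
        _integral += lateralErrorM * dt;
        double command = Kp * lateralErrorM + Ki * _integral;

        // Clamp and stop integrating when saturated (basic anti-windup).
        if (Math.Abs(command) > MaxSteeringRad)
        {
            _integral -= lateralErrorM * dt;
            command = Math.Sign(command) * MaxSteeringRad;
        }
        return command;
    }
}
```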

Regarding the last item in the list above, because of the large tires, rubber physical properties and friction, we realized that there was some extra delay between the steering motor turning and the wheels actually reaching the corresponding angle. That delay changed with vehicle speed and, at very low speeds, some non-linearities appeared in the relation between tire angle and steering angle. The effect was noticeable enough to keep us from meeting the requirements.

My proposal consisted in characterizing the steering-wheel-to-tire dynamics (including steering motor and gears) and compensating for them in the controller. An absolute encoder (the LWS1) was attached to the steering wheel, so measuring its angle would be a piece of cake. The tire angle that mattered was the angle of the lower part, the one in contact with the ground that was imposing the kinematic constraint. I suggested attaching a laser range finder under the truck cabin and pointing it at the lower part of the tire. The range finder would give a distance measure from which the angle could be found by trigonometry. With that setup, several tests could be run at different truck speeds to model the steering-wheel-to-tire subsystem and compensate its dynamics.
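A sketch of that trigonometry under a simple assumed geometry (the real mounting offsets are not documented here, so the numbers are illustrative): if the laser beam points laterally at the tire sidewall and hits it at a lever arm r from the steering axis, and d0 is the reading with the wheels straight, a steering angle δ changes the reading by roughly r·sin(δ), so δ ≈ asin((d0 − d)/r).

```csharp
// Illustrative sketch of estimating tire angle from a laser range finder,
// under a simplified assumed geometry (NOT the measured truck geometry):
// the beam points laterally at the tire sidewall and hits it at a lever
// arm r from the steering axis, so steering by an angle delta changes
// the measured distance by roughly r·sin(delta).
using System;

static class TireAngleSketch
{
    const double LeverArmM = 0.35;          // hypothetical distance from steering axis to laser spot
    const double StraightDistanceM = 0.60;  // hypothetical reading with the wheels straight

    // Returns the estimated tire angle in radians from one range reading.
    static double TireAngleRad(double measuredDistanceM)
    {
        double ratio = (StraightDistanceM - measuredDistanceM) / LeverArmM;
        ratio = Math.Max(-1.0, Math.Min(1.0, ratio));   // guard asin's domain against noise
        return Math.Asin(ratio);
    }

    static void Main()
    {
        // Example: a reading 5 cm shorter than the straight-ahead distance
        // corresponds to roughly 8 degrees of tire angle with these numbers.
        double deltaRad = TireAngleRad(0.55);
        Console.WriteLine($"{deltaRad * 180.0 / Math.PI:0.0} deg");
    }
}
```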

Greetings

I have especially elaborated on my own tasks (after all, it is my portfolio, right?), but the results are obviously the fruit of a common effort by the whole team. Special greetings and thanks go to all the CVG members that participated in the project: José Luis Sánchez López, David Galindo, Miguel A. Olivares Méndez and Pascual Campoy. Greetings also go to Antonio Sanz Gil, Marta Robles and all other collaborators from Siemens and INSIA who made it possible.