The Software that Brings Self-Driving Cars to “Life”
There are five general types of programs that work together to create self-driving cars:
- Computer Vision
- Sensor Fusion
- Localization
- Path Planning
- Control
Let’s get familiar with each one.
1. Computer Vision
Computer Vision is a type of code that uses camera hardware and deep learning to build digital 3D maps of an environment. It helps the car understand and “see” what’s around it.
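One of the first things vision code does with a camera frame is look for edges, like the boundary between dark asphalt and bright lane paint. Here's a minimal sketch of that idea in plain Python, sliding a simple gradient kernel across a made-up toy "image" (the pixel values are illustrative, not from a real camera):

```python
# Toy edge detection: slide a [-1, 1] gradient kernel across each row
# of a tiny grayscale image. Big responses mark boundaries between
# dark road (10) and bright lane paint (200).

image = [
    [10, 10, 10, 200, 200, 200],
    [10, 10, 10, 200, 200, 200],
    [10, 10, 10, 200, 200, 200],
    [10, 10, 10, 200, 200, 200],
]

def horizontal_edges(img):
    """Return per-row gradient magnitudes between neighboring pixels."""
    return [[abs(row[i + 1] - row[i]) for i in range(len(row) - 1)]
            for row in img]

edges = horizontal_edges(image)
# The strongest response in each row sits at the dark-to-bright boundary.
print(edges[0])  # [0, 0, 190, 0, 0]
```

Real self-driving pipelines use deep neural networks over millions of frames, but at the bottom they're built from pattern-finding operations much like this one.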
2. Sensor Fusion
What do you call it when eyes, ears, and noses all work together? That’s not a setup for a corny joke, it’s just Sensor Fusion. The car intelligently combines the data from different sensors (gyroscope, accelerometer, pressure sensor…) to make informed decisions.
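To make this concrete, here's a sketch of one classic fusion trick, a complementary filter. A gyroscope reports smooth angle changes but slowly drifts; an accelerometer gives a noisy but drift-free absolute angle. Blending the two gives a better estimate than either sensor alone. The 0.98/0.02 split and the sample readings below are illustrative, not from a real car:

```python
# Complementary filter: blend the gyro's integrated rate (smooth, drifts)
# with the accelerometer's angle (noisy, drift-free).

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend integrated gyro rate with the accelerometer's angle."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

angle = 0.0
# (gyro deg/s, accel angle deg) pairs sampled every 0.01 s
readings = [(5.0, 0.1), (5.0, 0.2), (4.0, 0.15)]
for gyro_rate, accel_angle in readings:
    angle = complementary_filter(angle, gyro_rate, accel_angle, dt=0.01)
print(round(angle, 4))
```

Each new reading nudges the estimate: mostly trusting the gyro's smooth motion, with a small correction pulling it back toward the accelerometer's truth.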
3. Localization
This code helps the car know where it is with respect to the 3D virtual environment the sensors and cameras have been capturing.
One of the technologies used here is GPS (Global Positioning System), just like Google Maps and Waze, but GPS alone is not enough for self-driving cars. That is because standard GPS is only accurate to within about 3 to 6 feet.
While that may not seem like all that much, imagine what would happen if your car thought it was 6 feet to the left of where it actually was. On any normal road, a car with this degree of error could find itself driving down a sidewalk or into a guardrail!
To solve that issue, self-driving cars use additional localization technologies like high-def maps and mathematical equations to improve upon the accuracy of normal GPS systems.
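One of those mathematical ideas can be sketched in a few lines: keep a running position estimate, and fold in each new noisy GPS fix weighted by how much you trust it (a bare-bones, 1D Kalman-style update). The noise figures and readings below are made up for illustration:

```python
# Bare-bones 1D position filter: each noisy reading nudges the estimate,
# and our uncertainty (variance) shrinks as evidence accumulates.

def fuse(estimate, est_var, reading, reading_var):
    """Blend a new noisy reading into the current position estimate."""
    gain = est_var / (est_var + reading_var)   # how much to trust the reading
    new_estimate = estimate + gain * (reading - estimate)
    new_var = (1 - gain) * est_var             # uncertainty shrinks
    return new_estimate, new_var

position, variance = 0.0, 36.0                 # rough first guess, in feet
for gps_reading in [4.0, 5.5, 4.8, 5.1]:       # noisy fixes, in feet
    position, variance = fuse(position, variance, gps_reading, 9.0)
print(round(position, 2), round(variance, 2))
```

Notice that the variance keeps dropping: combining many imperfect readings (plus landmarks from those high-def maps, in a real car) is how localization beats raw GPS accuracy.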
4. Path Planning
Okay, so we’ve got a map and we know where we are on the map. Now what? Now we gotta navigate through our map to get to where we need to go. That is Path Planning.
A self-driving car needs to constantly reanalyze its environment and update the path it wants to take based on where the objects around it are and where the car thinks those objects are heading. If there’s construction blocking a route or other vehicles are merging into your lane, your car will need to keep changing its path to accommodate these elements.
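The core of "find a route around a blockage" can be sketched with a classic search algorithm, breadth-first search, on a toy grid where 1s are blocked cells (think construction, or another vehicle). Real planners also weigh speed limits, comfort, and moving obstacles; this only shows the routing core:

```python
# Breadth-first search on a small grid: find the shortest path of cells
# from start to goal, detouring around blocked (1) cells.
from collections import deque

grid = [
    [0, 0, 0],
    [1, 1, 0],   # construction blocking the direct route
    [0, 0, 0],
]

def plan(start, goal):
    """Return the shortest list of cells from start to goal, or None."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        r, c = path[-1]
        if (r, c) == goal:
            return path
        for nr, nc in [(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)]:
            if (0 <= nr < len(grid) and 0 <= nc < len(grid[0])
                    and grid[nr][nc] == 0 and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append(path + [(nr, nc)])
    return None

print(plan((0, 0), (2, 0)))   # detours right, down, then back left
```

When the world changes, say a new obstacle appears, the car simply re-runs the planner on the updated map, which is exactly the "constantly reanalyzing" loop described above.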
5. Control
This is how the car actually takes action on its plans: hitting the accelerator, pumping the brake, moving the steering wheel. In the robot car that we are making, it’s the servo that receives electrical signals in the Control Stage from the Raspberry Pi in order to turn the front wheels left and right.
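Here's a hedged sketch of what that signal looks like. A hobby servo is driven by a PWM pulse whose width sets the angle (commonly around 1.0 ms for full left, 1.5 ms centered, 2.0 ms for full right). On a real Raspberry Pi you'd feed this pulse width to a PWM library such as RPi.GPIO or pigpio; here we just compute the mapping so the idea is clear. The ±45° range and pulse widths are typical hobby-servo values, not measured from this specific car:

```python
# Map a steering angle to a servo pulse width. Typical hobby servos:
# ~1.0 ms = full left, 1.5 ms = centered, 2.0 ms = full right.

def steering_pulse_ms(angle_deg, max_angle=45.0):
    """Map a steering angle (-45..+45 degrees) to a servo pulse width."""
    angle_deg = max(-max_angle, min(max_angle, angle_deg))  # clamp to range
    return 1.5 + 0.5 * (angle_deg / max_angle)              # 1.0..2.0 ms

print(steering_pulse_ms(0))     # centered wheels
print(steering_pulse_ms(45))    # full right
print(steering_pulse_ms(-90))   # out-of-range input gets clamped to full left
```

Clamping the input is a small but important habit in control code: even if the planner asks for an impossible angle, the servo only ever receives a pulse it can safely act on.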