Learn autonomous systems through practical guides on perception, planning, localization, and system integration.
This section is for engineers who want to understand how self-driving and autonomous systems are built as connected systems, not as isolated algorithms. The guides below are organized to help you move from perception basics into mapping, planning, and full-system thinking.
Who this section is for
- Readers learning the architecture of self-driving systems
- Engineers who want practical explanations of perception and planning
- Students who need a clearer path through autonomous-system topics
Start here
- Camera perception in self-driving cars
- System integration for self-driving cars
- Occupancy grid maps in practice
Top guides
- Computer vision in practice
- Vision-based navigation in practice
- Region masking for lane detection
- Trajectory generation basics
- YOLO for real-time detection
Learning path
- Start with perception and understand how cameras and sensors represent the world.
- Move into mapping and localization to understand where the vehicle is.
- Learn planning and trajectory generation to understand how the vehicle decides what to do next.
- Finish with system integration so the full architecture makes sense as one system.
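The four stages above can be sketched as one loop: perceive, localize, plan, repeat. The sketch below is a minimal illustration of that control flow, not a real self-driving stack; every class and function name in it (`Pose`, `perceive`, `localize`, `plan`, `tick`) is an assumption made for this example.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    x: float
    y: float

def perceive(sensor_frame):
    """Perception: turn a raw sensor frame into obstacle positions."""
    return [(obj["x"], obj["y"]) for obj in sensor_frame["objects"]]

def localize(pose, odometry):
    """Localization: update the estimated pose from odometry deltas."""
    return Pose(pose.x + odometry[0], pose.y + odometry[1])

def plan(pose, obstacles, goal):
    """Planning: step toward the goal, holding position near obstacles."""
    for ox, oy in obstacles:
        if abs(ox - pose.x) < 1.0 and abs(oy - pose.y) < 1.0:
            return pose  # obstacle too close: hold position
    step = 0.5
    dx = step if goal.x > pose.x else -step
    return Pose(pose.x + dx, pose.y)

def tick(pose, sensor_frame, odometry, goal):
    """System integration: run one cycle of the full pipeline."""
    obstacles = perceive(sensor_frame)
    pose = localize(pose, odometry)
    return plan(pose, obstacles, goal)

# One cycle with no obstacles: the vehicle advances toward the goal.
pose = tick(Pose(0.0, 0.0), {"objects": []}, (0.1, 0.0), Pose(5.0, 0.0))
```

Real systems replace each of these stubs with an entire subsystem (neural perception, probabilistic localization, trajectory optimization), but the shape of the loop is the same, which is why the guides end with system integration.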
Perception path
- Camera perception in self-driving cars
- Computer vision in practice
- Region masking for lane detection
- YOLO for real-time detection
Planning and system path
- Occupancy grid maps in practice
- Trajectory generation basics
- System integration for self-driving cars
Read next
The best first article here is Camera perception in self-driving cars. If you already know the perception basics, move to System integration for self-driving cars to see how the pieces fit together.