Indeed, I’ve been bullish on autonomous vehicles for a while now. But though the industry’s developments have been promising, self-driving cars have remained ‘five years away from being five years away’…
Until now, that is.
Thanks to the rapid expansion of autonomous ride-hailing services in places like Phoenix and San Francisco, the rollout of autonomous trucking in Texas and Arizona, and the upcoming launch of Elon Musk’s Robotaxi on Oct. 10, I believe the stage is set for self-driving cars to begin transforming the $11 trillion transportation services industry.
Now, that’s all great information to know. But it doesn’t mean much if we don’t grasp how these vehicles actually work.
After all, understanding a burgeoning megatrend is key to finding the best stocks to buy to profit from it.
Therefore, to potentially turn the Age of AVs into a massive payday in the long run, we must first understand how a self-driving car works.
A Tech Trifecta
At its core, a self-driving car is operated by a combination of sensors – the ‘hardware stack’ – and AI-powered software – unsurprisingly called the ‘software stack.’
In short, the car’s sensors gather information about its surroundings. Then the AI software processes that data to determine whether the car accelerates, brakes, changes lanes, turns, etc. And this all happens in real time.
Usually, the ‘hardware stack’ comprises three sensors: cameras, radar, and lidar. A typical self-driving car uses all three sensors as each has strengths and weaknesses that complement the others nicely.
Cameras are used to collect visual data. They capture high-resolution images of the vehicle’s environment, much like a human driver’s eyes. These cameras help to recognize various signs, lane markings, and traffic lights and can distinguish between different objects, like pedestrians, cyclists, and vehicles. They are very good at providing detailed visual information, which helps the car understand the context of its surroundings. But they tend to perform poorly in bad visual environments, like when there’s low light or inclement weather.
An AV’s radar sensors emit radio waves that bounce off objects and return to the sensor, providing information about the distance, speed, and movement of obstacles in the car’s vicinity. These sensors work well in all weather conditions (complementing cameras nicely), but they provide limited resolution and detail (where cameras excel).
Lidar – which stands for light detection and ranging – is essentially radar powered by lasers. These sensors emit laser pulses that also bounce off surrounding objects and return to the sensor. By measuring the time it takes for the light to return, lidar can create a high-resolution 3D map of the vehicle’s environment. This provides accurate depth perception, enabling the car to understand the exact shape, size, and distance of surrounding objects. However, lidar doesn’t capture color or texture information (like cameras do).
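The time-of-flight math behind lidar is simple enough to sketch. Here's a minimal, hypothetical example (the function name and the example timing are illustrative, not from any real lidar system):

```python
# Hypothetical sketch: how a lidar unit converts a laser pulse's
# round-trip time into a distance. Numbers are illustrative only.

SPEED_OF_LIGHT_M_S = 299_792_458  # meters per second

def lidar_distance_m(round_trip_time_s: float) -> float:
    """Distance to an object, given a pulse's round-trip travel time.

    The pulse travels to the object and back, so we halve the total
    path length to get the one-way distance.
    """
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2

# A pulse returning after roughly 200 nanoseconds implies an object
# about 30 meters away.
print(round(lidar_distance_m(200e-9), 1))  # → 30.0
```

Repeating this calculation millions of times per second across a sweeping field of laser pulses is what builds the 3D “point cloud” map described above.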
In other words, cameras are used to see things. Radar is used to sense how fast those things are going. And lidar helps to calculate the exact position of those things.
In this sense, it is easy to see how these three sensors work together within a self-driving car.
Self-Driving Cars: Driven By Robust Next-Gen Software
Self-driving cars use what is called ‘sensor fusion’ to combine camera, radar, and lidar data, creating a complete, accurate, and reliable model of a vehicle’s environment.
For example, if a person crosses the road in front of an AV:
- The camera identifies the figure as a pedestrian.
- The radar tracks the pedestrian’s speed to predict potential collisions.
- The lidar measures the pedestrian’s exact distance, shape, and movement.
Together, these sensors allow the car to make informed decisions, such as slowing down, stopping, or rerouting, ensuring safe and efficient navigation.
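The pedestrian scenario above can be sketched in a few lines of code. This is a deliberately simplified, hypothetical illustration – the class names, thresholds, and decision logic are my own assumptions, not any carmaker's actual software:

```python
# Hypothetical sketch of sensor fusion for the pedestrian scenario.
# All names and thresholds here are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class CameraReading:
    object_class: str         # e.g. "pedestrian", from image classification

@dataclass
class RadarReading:
    closing_speed_m_s: float  # how fast the object is approaching the car

@dataclass
class LidarReading:
    distance_m: float         # precise range from the 3D point cloud

def decide_action(cam: CameraReading,
                  radar: RadarReading,
                  lidar: LidarReading) -> str:
    """Fuse the three sensor readings into a single driving decision."""
    if cam.object_class == "pedestrian":
        # Rough time-to-collision = distance / closing speed
        seconds_to_impact = lidar.distance_m / max(radar.closing_speed_m_s, 0.1)
        if seconds_to_impact < 3.0:
            return "brake"
        return "slow_down"
    return "continue"

print(decide_action(CameraReading("pedestrian"),
                    RadarReading(closing_speed_m_s=2.0),
                    LidarReading(distance_m=4.0)))  # → brake
```

Note how each sensor contributes what it does best: the camera supplies the *what*, the radar the *how fast*, and the lidar the *how far*.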
But it can only make those decisions with the help of its ‘software stack.’
An AV utilizes a variety of software and methods to provide real-time intelligence about its surroundings. And there are essentially five components to its software stack: perception, localization, prediction, planning, and control.
In short:
- The perception software uses sensor fusion, object classification, and semantic segmentation to create a comprehensive picture of a car’s environment.
- The localization software uses highly detailed maps and location data to place the car precisely in its environment.
- The prediction software leverages machine learning models to predict how things in the environment may act in different scenarios.
- The planning software takes the outcomes of the perception, localization, and prediction software to decide an optimal path for the car.
- The control software executes the planned action, controlling the car’s steering, acceleration, braking, etc.
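The five stages above can be sketched as a simple pipeline. Again, this is a toy illustration under my own assumptions – real autonomous-driving stacks run these stages continuously and in parallel on dedicated hardware, and every function here is a placeholder:

```python
# Hypothetical sketch of the five-stage software stack as a pipeline.
# Every function body is a stand-in for far more complex software.

def perceive(sensor_data: dict) -> dict:
    """Perception: fuse sensor data into a model of the environment."""
    return {"objects": sensor_data.get("detections", [])}

def localize(env: dict) -> dict:
    """Localization: place the car precisely on a detailed map."""
    env["car_position"] = (0.0, 0.0)  # placeholder map coordinates
    return env

def predict(env: dict) -> dict:
    """Prediction: estimate how each detected object may move next."""
    env["forecasts"] = [f"{obj} continues straight" for obj in env["objects"]]
    return env

def plan(env: dict) -> str:
    """Planning: choose an optimal action given the world model."""
    return "slow_down" if env["objects"] else "maintain_speed"

def control(action: str) -> str:
    """Control: translate the plan into steering/throttle/brake commands."""
    return f"executing: {action}"

# Run one "tick" of the pipeline: perception feeds localization,
# which feeds prediction, whose output the planner and controller act on.
env = predict(localize(perceive({"detections": ["cyclist"]})))
print(control(plan(env)))  # → executing: slow_down
```

The key design idea is the strict ordering: each stage consumes the previous stage's output, so the control commands at the end are always grounded in the freshest picture of the world.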
Together, these sensors and software processes form the technological backbone of self-driving cars.
The Final Word
Of course, every company attacks the self-driving problem differently. But this is the general framework most follow.
As such, when looking to invest in the Autonomous Vehicle supply chain, it makes sense to look for stocks providing AVs’ critical components.
Find the strongest camera, radar, and lidar providers. Focus on the most promising software plays.
They’ll likely be the biggest winners in the Age of AVs.
In fact, if you’re hoping to get positioned for an era of AV-powered market gains, join my special briefing next Monday, Oct. 7 at 10 a.m. Eastern, to learn all about the quickly unfolding Autonomous Vehicle Revolution. (Go here to sign up and save your seat.)
This upcoming event is all about getting you prepared for Tesla’s Robotaxi launch next week (which we expect will be huge). Though it’s about much more than that upcoming debut.
Indeed, in this broadcast, I’ll detail all the recent groundbreaking developments in the autonomous vehicle industry, including how robotaxis are set to completely transform transportation, save millions of lives, and potentially put up to $30,000 a year in passive income in your pocket.
And that includes my playbook on the best AV stocks to buy right now.
Click here to reserve your seat now.
On the date of publication, Luke Lango did not have (either directly or indirectly) any positions in the securities mentioned in this article.
P.S. You can stay up to speed with Luke’s latest market analysis by reading our Daily Notes! Check out the latest issue on your Innovation Investor or Early Stage Investor subscriber site.