Technical Lead Manager - Perception, Self-Driving Systems
Full-time
Lead
Posted 2 days ago
About Applied Intuition
Applied Intuition, Inc. is powering the future of physical AI. Founded in 2017 and now valued at $15 billion, the Silicon Valley company is creating the digital infrastructure needed to bring intelligence to every moving machine on the planet. Applied Intuition serves the automotive, defense, trucking, construction, mining, and agriculture industries in three core areas: tools and infrastructure, operating systems, and autonomy. Eighteen of the top 20 global automakers, as well as the United States military and its allies, trust the company’s solutions to deliver physical intelligence. Applied Intuition is headquartered in Sunnyvale, California, with offices in Washington, D.C.; San Diego; Ft. Walton Beach, Florida; Ann Arbor, Michigan; London; Stuttgart; Munich; Stockholm; Bangalore; Seoul; and Tokyo. Learn more at applied.co.
We are an in-office company, and our expectation is that employees primarily work from their Applied Intuition office 5 days a week. However, we also recognize the importance of flexibility and trust our employees to manage their schedules responsibly. This may include occasional remote work, starting the day with morning meetings from home before heading to the office, or leaving earlier when needed to accommodate family commitments.
About the role
Applied Intuition builds the software infrastructure for autonomous vehicles across passenger cars, trucking, mining, and defense. Our Self-Driving Systems (SDS) team develops production-grade autonomy stacks deployed on real vehicles across multiple continents, from highway trucking in Japan to urban ADAS in the United States and Europe.
We are looking for a Technical Lead Manager to own the perception model at the core of our autonomy stack. This is a single combined model: a shared backbone with multi-task heads, serving every SDS program from the same codebase. The same model runs on a passenger car in Los Angeles, a truck in rural Japan, and an offroad vehicle in the Philippines — different sensor configurations, different road geometries, different weather distributions, one model. You will lead the team that trains, evaluates, and ships this model, and you will be hands-on in the architecture and training decisions that drive its performance.
At Applied Intuition, you will:
Own the perception model end-to-end: architecture, training, evaluation, and deployment. The core challenge is building a model that generalizes across geographies, road types, sensor setups, and environmental conditions without per-vertical forks.
Drive a camera-first perception strategy. The goal is to progressively reduce dependencies on HD maps and lidar. How to get there is part of the job.
Lead training and iteration cycles hands-on. You will be in the data, the eval dashboards, and the failure analysis. When perception regresses in a new geography or road type, you own understanding why and fixing it.
Own model performance across the full deployment surface: highway, urban, residential, ramps, complex intersections, poor weather, hilly terrain. You care about on-vehicle driving outcomes, not just offline metrics.
Manage the model lifecycle from training through quantization and deployment on embedded compute, including device-specific optimizations. Close the gap between how the model performs offboard and how it performs on the vehicle.
Work directly with OEM customer programs to understand sensor configurations, target ODDs, and performance requirements. Translate these into model architecture and data strategy.
Recruit, develop, and technically lead a team of perception engineers. Set high technical standards and create a culture of rigorous experimentation and measurement.
We’re looking for someone who has:
5+ years in ML/deep learning for perception or 3D scene understanding. Deep hands-on experience training and deploying vision models at scale.
2+ years managing or technically leading a perception team, with the ability to both set direction and contribute directly to architecture and training decisions.
Experience building production perception systems, especially camera-only or camera-first solutions.
Track record deploying perception models to embedded hardware under real-time latency and compute constraints, including device-specific optimizations.
Strong software engineering in Python and C++, comfortable across the stack from training code to onboard inference integration.
Experience scaling perception models across multiple geographies, sensor setups, or vehicle platforms.
Nice to have:
Deep familiarity with transformer-based architectures for 3D perception, BEV representations, multi-task learning, and dense prediction.
Familiarity with occupancy-based scene representations, sparse query-based architectures, or temporal aggregation approaches.
Experience reducing or removing map dependencies in perception systems.
Background in autolabel pipelines, data quality.