At General Motors, our product teams are redefining mobility. Through a human-centered design process, we create vehicles and experiences that are designed not just to be seen, but to be felt. We're turning today's impossible into tomorrow's standard, from breakthrough hardware and battery systems to intuitive design, intelligent software, and next-generation safety and entertainment features.
Every day, our products move millions of people as we aim to make driving safer, smarter, and more connected, shaping the future of transportation on a global scale.
Role:
As a Senior Machine Learning Engineer for Perception within the Embodied AI organization, you will own the end-to-end pipeline for safety-critical ML perception models, from initial research and large-scale data curation to optimization and real-time deployment on the vehicle's compute platform. Your primary mission is to enable the vehicle to accurately and reliably see, classify, and track everything in its environment—from rare road markings and complex intersections to vulnerable road users (VRUs).
Responsibilities:
- End-to-End Model Lifecycle: Own the design, training, validation, and deployment of deep learning models for core perception tasks such as:
  - 3D Object Detection and Tracking (vehicles, pedestrians, cyclists).
  - Real-time map detection of the drivable world (lanes, road boundaries, traffic signs).
  - Multi-Modal Sensor Fusion (Camera, LiDAR, Radar).
- Production Pipeline: Build and scale the ML training infrastructure, including data mining, data loading, and multi-stage training and evaluation, to streamline model development.
- Performance Optimization: Improve model performance through data iteration, parameter tuning, and updates to training strategy and architecture, producing reliable models that meet the strict real-time, low-latency requirements of the vehicle's embedded hardware.
- Model Debugging: Conduct rigorous, data-driven analysis to identify, debug, and resolve performance degradations and failures, specifically targeting long-tail and adversarial scenarios (e.g., adverse weather, sensor noise, occlusions).
- Metric Definition: Define and implement robust model-level metrics to guide model development.
- System Integration: Work closely with the Safety, Systems, and other engineering functions to integrate Perception outputs.
Skills & Experience:
- BS, MS, or PhD in Computer Science, Machine Learning, Robotics, or a related quantitative field.
- 5+ years of professional experience with a focus on Computer Vision, Deep Learning, and Perception in a production environment.
- Deep hands-on experience with modern deep learning frameworks (e.g., PyTorch or TensorFlow) for training, experimentation, and debugging complex DNNs.
- Proven experience working with and fusing data from multiple sensor modalities (Camera, LiDAR, and/or Radar).
- Practical experience deploying and optimizing ML models for resource-constrained, real-time embedded systems.
- Demonstrated ability to drive model improvements through large-scale data analysis, error logging, and data curation.
Bonus:
- Expertise with Transformer-based models for 3D detection, tracking, and scene understanding.
- Technical leadership experience, including mentoring junior engineers and leading major feature development from concept to launch.
- Contributions to relevant academic publications (CVPR, ECCV, ICCV, IROS, NeurIPS, etc.).
Compensation:
The compensation information is a good faith estimate only. It is based on what a successful applicant might be paid in accordance with applicable state laws. The compensation may not be representative of positions located outside of the California Bay Area.
- Benefits: GM offers a variety of health and wellbeing benefit programs. Benefit options include medical, dental, vision, Health Savings Accounts, Flexible Spending Accounts, a retirement savings plan, sickness and accident benefits, life insurance, paid vacation and holidays, tuition assistance programs, an employee assistance program, GM vehicle discounts, and more.
This job may be eligible for relocation benefits.