Open Data Science job portal

Software Engineer, Radar Perception (Autonomy)

At Lyft, community is what they are and it’s what they do. It’s what makes them different. To create the best ride for all, they start in their own community by creating an open, inclusive, and diverse organization where all team members are recognized for what they bring.

From day one, Lyft’s mission has been to improve people’s lives with the world’s best transportation. Self-driving cars are critical to that mission: they can make streets safer, cities greener, and traffic a thing of the past. That’s why Lyft started Level 5, their self-driving division, where they’re building a self-driving system to operate on the Lyft network.

Level 5 is looking for doers and creative problem solvers to join them in developing the leading self-driving system for ridesharing. Their team members come from diverse backgrounds and areas of expertise, and each has the opportunity to have an outsized influence on the future of their technology. Their world-class software and hardware experts work in brand-new garages and labs in Palo Alto, California, and offices in London, England, and Munich, Germany. And they’re moving at an incredible pace: their test vehicles are currently serving employee rides on the Lyft app. Learn more at lyft.com/level5.

As part of the Autonomy Team, you will work daily with other software engineers to tackle highly advanced AI challenges. Eventually they expect all Autonomy Team members to work on a variety of problems across the autonomy space; with a focus on perception, however, your work will initially involve turning their constant flow of sensor data into a model of the world. For this position, they are looking for a software engineer able to master the problem of perception using advanced radar processing algorithms.

Responsibilities:

  • Develop core perception algorithms, such as object detection and tracking, using raw radar sensor data.
  • Develop sensor fusion algorithms for radar, LiDAR, and vision modalities.
  • Implement real-time algorithms (< 10 milliseconds) on CPU/GPU in C++.
  • Build tools and infrastructure to evaluate the performance of the perception stack and track it over time.
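
To give a flavor of the first bullet, here is a minimal, illustrative sketch of grouping raw 2D radar detections into object candidates via single-linkage Euclidean clustering. All names, types, and thresholds here are hypothetical, not part of Lyft's actual stack:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// A single radar detection projected into the vehicle frame (illustrative).
struct Detection {
    double x;  // meters, forward
    double y;  // meters, left
};

// Naive single-linkage Euclidean clustering: detections closer than `eps`
// meters (directly or through a chain of neighbors) share a cluster.
// Returns one cluster id per detection. O(n^2) — fine for a sketch, far too
// slow for a < 10 ms real-time budget on dense scans.
std::vector<int> cluster(const std::vector<Detection>& dets, double eps) {
    const std::size_t n = dets.size();
    std::vector<int> id(n, -1);  // -1 means "not yet assigned"
    int next_id = 0;
    for (std::size_t i = 0; i < n; ++i) {
        if (id[i] != -1) continue;   // already absorbed into a cluster
        id[i] = next_id;
        // Flood fill: grow the cluster by repeatedly scanning for neighbors
        // of every detection already inside it.
        std::vector<std::size_t> stack{i};
        while (!stack.empty()) {
            const std::size_t cur = stack.back();
            stack.pop_back();
            for (std::size_t j = 0; j < n; ++j) {
                if (id[j] != -1) continue;
                const double dx = dets[cur].x - dets[j].x;
                const double dy = dets[cur].y - dets[j].y;
                if (std::hypot(dx, dy) < eps) {
                    id[j] = next_id;
                    stack.push_back(j);
                }
            }
        }
        ++next_id;
    }
    return id;
}
```

A production system would replace the quadratic neighbor scan with a spatial index (e.g. a grid or k-d tree) and fuse Doppler velocity into the distance metric; this sketch only shows the shape of the problem.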

Experience & Skills:

  • M.S. or Ph.D. degree in Computer Science, Electrical Engineering, or related field.
  • Ability to write high-quality C++ code.
  • Strong background in mathematics, linear algebra, geometry, and probability.
  • Strong skills in algorithm design and complexity analysis.
  • Ability to work in a fast-paced environment and collaborate across teams and disciplines.
  • Openness to new / different ideas. Ability to evaluate multiple approaches and choose the best one based on first principles.

Nice To Have:

  • 3+ years of experience in a related role.
  • 5+ years of experience developing in C++ / Python.
  • Hands-on experience applying deep learning to sensor data.
  • Experience with ADAS (advanced driver-assistance systems).
  • Experience in radar signal processing, such as clustering, false-target removal, and frequency hopping.
  • Knowledge of advanced imaging radar hardware.


Here at the Open Data Science Conference we gather the attendees, presenters, and companies that are working on shaping the present and future of AI and data science. ODSC hosts one of the largest gatherings of professional data scientists with major conferences in the USA, Europe, and Asia.