At Mobileye, we foster a hybrid-friendly environment, combining work from the office and from home.
Which team will you join?
Our team is responsible for porting Mobileye's computer-vision, deep-learning, and sensor-fusion algorithms (and their building blocks) to Mobileye's EyeQ hardware, implementing them on dedicated EyeQ hardware accelerators. This code must be highly efficient to run in an autonomous vehicle, so much of our work centers on performance analysis and optimization, both of these implementations and of the end-to-end system.
The development environment is mostly based on C programming with extensions.
We get to see our code running in actual consumer cars and autonomous vehicles, as part of the most advanced algorithms and use cases developed by both Mobileye and its customers.