Apple AR/VR Job | Embedded Systems Camera Engineer – Imaging & Sensing Technology Group
City: Herzliya, Israel
Imagine what you could do here. At Apple, new ideas have a way of becoming extraordinary products, services, and customer experiences very quickly. Bring passion and dedication to your job and there's no telling what you could accomplish. Dynamic, hard-working people and inspiring, innovative technologies are the norm here. The people who work here have reinvented entire industries with Apple products.
Do you love working on challenges that no one has solved yet? The Imaging and Sensing Technology group at Apple develops depth and image sensing systems such as the ground-breaking TrueDepth camera that powers Face ID and the recently released LiDAR Scanner. The group is responsible for the architecture, design, and development of these highly complex sensing systems across all Apple products.
Apple’s Camera/ISP Firmware team brings together extraordinary professionals to explore, develop, and ship state-of-the-art technologies for Apple products. As part of the team, you will work on core camera, ISP, and camera-peripheral technologies, including the Apple-designed image signal processing pipeline and its hardware components, and you will help define the way Apple develops, tests, and manufactures all of its products. In this role, you will have the opportunity to work on AR/VR and ML projects. The ideal candidate has strong communication and interpersonal skills.
Experience developing device drivers for peripherals such as image sensors: I2C, SPI, GPIOs, MIPI/LPDP, DMAs.
Proficiency in multi-threaded RTOS environments.
Proficiency in C/C++ programming.
Excellent problem-solving and debugging skills.
Experience building high quality production software.
Familiarity with image processing and camera pipelines is a bonus.
You will design and develop features for image sensors, peripheral devices, and the latest image processing pipelines on Apple products. Device driver functionality will include controlling camera peripherals such as voice-coil motors (VCMs), lens actuators, LED strobes, and power management units (PMUs). You will also have the opportunity to collaborate cross-functionally with the Sensor Hardware, Silicon Design, and Machine Learning Algorithm teams to craft the roadmap for future Apple products.
This is a highly multi-functional product development role, and you will work closely with various teams, such as Silicon Design, EE, and System/Controls. The environment is dynamic and fast-paced and requires a self-starter attitude.