Job Information
Google Software Engineer, Pixel Camera 3A in New Taipei City, Taiwan
Google welcomes people with disabilities.
Minimum qualifications:
Bachelor’s degree in Computer Science, Engineering, or equivalent practical experience.
2 years of experience in camera auto focus or Android mobile development.
Experience with coding in C++ or Python.
Preferred qualifications:
Master's degree or PhD in Computer Science, Image Processing, or a related technical field.
Experience in camera software domains such as camera drivers, the hardware abstraction layer (HAL), and algorithms.
Experience with machine learning and TensorFlow.
Experience working with mobile chipset vendors.
Google's software engineers develop the next-generation technologies that change how billions of users connect, explore, and interact with information and one another. Our products need to handle information at massive scale, and extend well beyond web search. We're looking for engineers who bring fresh ideas from all areas, including information retrieval, distributed computing, large-scale system design, networking and data storage, security, artificial intelligence, natural language processing, UI design and mobile; the list goes on and is growing every day. As a software engineer, you will work on a specific project critical to Google’s needs with opportunities to switch teams and projects as you and our fast-paced business grow and evolve. We need our engineers to be versatile, display leadership qualities and be enthusiastic to take on new problems across the full-stack as we continue to push technology forward.
As a Camera 3A Software AF (Auto Focus) Engineer, you will develop auto focus algorithms and work with Software Engineers to integrate them into the Android platform and improve performance. You will be in charge of launching the algorithms on mobile devices, working with Image Quality Engineers to fine-tune them for quality, and developing automated tuning and evaluation methodologies to streamline the process and reduce manual effort.
In this role, you will also work with technologies related to auto focus functions (e.g., Time of Flight, Phase Detection). You will develop and simulate the in-house algorithm and integrate the complete solution into the Android platform.
Google's mission is to organize the world's information and make it universally accessible and useful. Our Devices & Services team combines the best of Google AI, Software, and Hardware to create radically helpful experiences for users. We research, design, and develop new technologies and hardware to make our users' interactions with computing faster, more seamless, and more powerful. Whether finding new ways to capture and sense the world around us, advancing form factors, or improving interaction methods, the Devices & Services team is making people's lives better through technology.
Develop PDAF (Phase Detection Auto Focus) algorithms and relevant infrastructure tools, such as simulators, debuggers, and evaluators.
Engage with different sensor and module vendors and incorporate new technologies into Google products.
Implement, improve, and integrate the algorithms onto device platforms.
Design methodologies for automated tuning, testing, and simulation to reduce manual effort.
Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also https://careers.google.com/eeo/ and https://careers.google.com/jobs/dist/legal/OFCCPEEOPost.pdf. If you have a need that requires accommodation, please let us know by completing our Accommodations for Applicants form: https://goo.gl/forms/aBt6Pu71i1kzpLHe2.