Senior Computer Vision Researcher/Engineer, Hand Tracking

Zurich, Switzerland

Full-time


About the Role

Hand tracking is a critical feature that enables people to use their hands as input to interact with the environment on Magic Leap devices. We have an exciting opportunity on our Perception team to work on the next-generation, world-leading hand tracking system.

We are looking for individuals with exceptional development and research skills in Computer Vision, Machine Learning, and Deep Learning. On the hand tracking team, you will face an exciting array of opportunities and challenges. The problems we tackle include, but are not limited to, 3D hand reconstruction, 2D/3D hand keypoint estimation and meshing, self-supervised 3D hand pose estimation, neural rendering, and gesture recognition. The primary responsibility of the role is to contribute to the research and development of multiple core components for hand pose/shape estimation and gesture recognition. The role also involves working closely with the executive team to establish the scope and schedule of product-critical projects, driving the formation of technical teams, and ensuring cohesive alignment of all essential technical expertise by setting optimal communication strategies. Qualified candidates will be driven self-starters, robust thinkers, strong collaborators, effective leaders, and adept at operating in a highly dynamic environment. We look for colleagues who are passionate about our product and embody our values.


Responsibilities

  • Contribute to the research and development effort of advanced, product-critical computer vision components such as hand tracking and gesture recognition

  • Actively contribute to Magic Leap Intellectual Property and publish research findings in peer-reviewed conferences

  • Work hand-in-hand with key stakeholders and developers across the company who use computer vision components

  • Support overall research engineering and architecture efforts of computer vision and machine learning components

  • Write maintainable, reusable code, leveraging test-driven principles to develop high-quality geometric vision and machine learning modules

  • Troubleshoot and resolve software defects and other technical issues

  • Act as a mentor and subject matter expert within the computer vision group and with other key stakeholders

  • Review developers' code within the team to ensure the highest code quality in Computer Vision and Deep Learning components


Qualifications

  • 5+ years of work experience in Computer Vision targeted at product development

  • Experience/specialization in at least one of the following domains:

    • Sensor Calibration: Design and implement algorithms for online and offline intrinsic and extrinsic calibration of complex devices composed of multiple sensors, such as cameras, IMUs, depth sensors, and imagers. Collaborate with other engineers on the design and deployment of fully automatic, robotics-aided calibration processes targeted at factory production.

    • 2D/3D Pose Estimation and Tracking: Research, architecture, and hands-on experience in hand or body pose estimation and tracking.

    • Hand Meshing and Hand Reconstruction: Familiarity with standard parameterized 3D hand or body shape models.

    • On-device computer vision algorithms: Research, architect, and implement high-performance, state-of-the-art computer vision software on device.

  • Strong knowledge of Python programming

  • Sound knowledge of C/C++ (programming and debugging)

  • Experience in Deep Learning with knowledge of at least one of PyTorch or TensorFlow

  • Experience working with OpenCV

  • Knowledge of parallel computing (OpenCL, CUDA, GPGPU) is a plus

  • Knowledge of software optimization and embedded programming is a plus


Education

  • MS in Computer Science or Electrical Engineering (with a minimum of 5 years of relevant experience)

  • Ph.D. is preferred (with a minimum of 2 years of relevant experience)

