🚀 ML Compiler Engineer II - Neuron Kernel Interface, Annapurna Labs


Amazon

  • 📍 Location: Asti
  • 📅 Posted: Oct 28, 2025

The AWS Neuron Compiler team is seeking skilled compiler engineers to develop a state-of-the-art deep learning compiler stack. This stack optimizes application models across diverse domains, including large language models and vision, originating from leading frameworks such as PyTorch, TensorFlow, and JAX.

Your role will involve working closely with our custom-built Machine Learning accelerators, including Inferentia/Trainium, which represent the forefront of AWS innovation for advanced ML capabilities, powering solutions like Generative AI.

Key Job Responsibilities

  • Develop and maintain best-in-class tooling that raises the bar for the Neuron Compiler's accuracy and reliability.
  • Help lead efforts to build fuzzers and specification-synthesis tooling for our LLVM-based compiler.
  • Work on a science-focused team, pushing our techniques to the edge of what is known to best deliver for our customers.

Strong software development skills in C++ and Python are critical to this role. A scientific background in compiler development is strongly preferred. A background in Machine Learning and AI accelerators is preferred, but not required.

Basic Qualifications

  • 3+ years of experience leading the design or architecture (design patterns, reliability, and scaling) of new and existing systems
  • 2+ years of experience in developing compiler features and optimizations
  • Proficiency in C++ and Python programming, applied to compiler or verification projects
  • Familiarity with LLVM, including knowledge of abstract interpretation and polyhedral domains
  • Demonstrated scientific approach to software engineering problems

Preferred Qualifications

  • Master's degree or PhD in computer science, or equivalent
  • Experience with deep learning frameworks like TensorFlow or PyTorch
  • Understanding of large language model (LLM) training processes
  • Knowledge of CUDA programming for GPU acceleration

Amazon is an equal opportunity employer and does not discriminate on the basis of protected veteran status, disability, or other legally protected status.

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, please visit for more information.

The base pay for this position ranges from $129,300/year in our lowest geographic market up to $223,600/year in our highest geographic market. Pay is based on a number of factors including market location and may vary depending on job-related knowledge, skills, and experience.

👉 Apply Now
