
Scala/Spark Optimization Engineer (HPC)

Remote - UK

Contract

Fully Remote - £450 to £500 per day - Outside IR35 (if UK Based) - 6 Month Contract

V-216766

Jonathan Courtney

C++ | Python | Machine Learning | Gen AI | LLM | Web Contracts

Job Title: Scala Engineer – Apache Spark & HPC

Job Description:

We are seeking an experienced Scala Engineer to accelerate Apache Spark workloads and develop high-performance solutions. The role requires strong Scala expertise, a deep understanding of the JVM and domain-specific languages (DSLs), and experience with compilers or high-performance computing (HPC).

Key Responsibilities:

  • Optimise and accelerate Apache Spark applications for complex, large-scale data processing tasks.
  • Develop and enhance JVM-based solutions to ensure performance, scalability, and reliability.
  • Design and implement domain-specific languages (DSLs) and contribute to compiler optimisation pipelines.
  • Collaborate on integrating solutions with Apache Spark engine components and distributed systems.
  • Troubleshoot, profile, and resolve performance bottlenecks in big data and HPC environments.

Required Skills and Experience:
  • Strong proficiency in Scala programming and a deep understanding of the JVM ecosystem.
  • Hands-on experience with Apache Spark, particularly in performance optimisation.
  • Solid knowledge of compilers, domain-specific languages (DSLs), or high-performance computing (HPC).
  • Proven experience in designing, developing, and optimising distributed systems.
  • Strong analytical skills and a focus on delivering efficient, high-quality solutions.

Nice-to-Have:
  • Experience with other distributed computing frameworks or big data technologies.
  • Knowledge of cloud-based platforms and containerisation (e.g., Kubernetes, Docker).

This is an exciting opportunity for a skilled engineer to work on cutting-edge projects at the intersection of big data and high-performance computing.

Apply now