Data Engineer (Spark)

Location:

Quantori is an international team: our colleagues work not only from our offices but also remotely from all over the world.

What we expect:

  • Expertise in Spark
  • Expertise in Databricks
  • Experience in working with one of these clouds: AWS, Google Cloud Platform, or Azure
  • Ability to write robust code in Python
  • Strong SQL skills
  • Experience building data lakes, data warehouses, and ETL solutions
  • Understanding of data structures, data modeling, and software architecture
  • Analytical and problem-solving skills

Nice to have:

  • Experience with big data solutions built on the AWS cloud
  • Experience using Apache Airflow (AWS MWAA)
  • Knowledge of either AWS Glue or Apache NiFi

We offer:

  • Highly competitive compensation
  • Remote or office work
  • Flexible working hours
  • Continuous education, mentoring, and professional development programs
  • Strong management and technical expertise
  • Certifications paid by the company

If you don't see an open position that suits your skill set and/or professional background but you are interested in working with us, please send your CV to career@quantori.com. We will try to find something special and interesting for you!