THIS JOB OFFER IS NO LONGER ACTIVE
Click the "Job offers" tab to see other positions available on our website.

Senior Data Engineer

  • Hybrid/On-site
  • English
  • HealthCare
  • Expert/Senior
  • Agile/Scrum

Join us and continue moving humanity forward by tirelessly shaping what’s possible!

Warsaw-based opportunity with the possibility of partly remote work

As a Senior Data Engineer, you will work for our Client – a global enterprise using science and imagination to advance health and nutrition. You will work on cutting-edge solutions in the area of crop science – monitoring yields, the environmental impact of products, and irrigation status, and analysing data that translates into environmental improvements. Join us and be part of a global transformation toward a world where farms are more sustainable and plants are more adaptive and resilient, helping to improve life for families and communities.

Your main responsibilities:

  • Create and maintain an optimal data quality, structure, pipeline, and processing architecture
  • Build infrastructure required to optimally integrate, cleanse, and load data from diverse data sources using various modern cloud and big data technologies
  • Address data platform quality attributes such as security, high availability, reliability, and performance
  • Develop cloud data platform design skills and pass certification training

You’re ideal for this role if you have:

  • Minimum 7 years of experience in the area of Data Engineering
  • Minimum 3 years of experience working with cloud platforms (ideally GCP)
  • Experience working with large amounts of data using SQL, Python, and version control (git)
  • Familiarity with data governance tools
  • Extensive knowledge of design principles and architectural best practices for designing and running workloads in the cloud
  • Knowledge of monitoring and alerting concepts in a distributed system
  • Experience in building and optimizing data integrations, data pipelines, data architectures, and data sets with Apache Airflow or a similar tool
  • Ability to create and maintain an optimal data pipeline and data access architecture
  • SQL and NoSQL database design and data warehouse design skills
  • Knowledge of data integration patterns and tools (Kafka, pub-sub, SQS, or similar)
  • Good analytical capabilities
  • Ability to architect, build, and manage robust and scalable data pipelines
  • Excellent communication skills
  • Fluent English

It is a strong plus if you have:

  • Knowledge of ML algorithms and familiarity with ML frameworks such as TensorFlow and PyTorch
  • Cloud certification
  • Knowledge of R language

#GETREADY to meet with us!

We would like to meet you. If you are interested, please apply and attach your CV in English or Polish, including a statement that you agree to our processing and storing of your personal data. You can also apply by sending us an email at recruitment@itds.pl.

Internal number #4656

Benefits

  • Access to 100+ projects
  • Access to Healthcare
  • Access to Multisport
  • Training platforms
  • Access to Pluralsight
  • Make your CV shine
  • B2B or Permanent Contract
Flexible & remote work
  • Flexible hours and remote work