Join us, and architect the future of financial data intelligence!
Kraków-based opportunity with hybrid work model (2 days/week in the office).
As a Data Engineer (ESG), you will be working for our client, a leading global financial institution, on an ambitious data transformation project aimed at enhancing ESG (Environmental, Social, and Governance) reporting and analytics. The client is modernizing their data infrastructure to support large-scale data processing and migration to the cloud, with a strong focus on ensuring data accuracy, availability, and performance across distributed systems. You will be part of a hybrid team contributing to the delivery of robust, scalable data pipelines and architecture, combining cutting-edge technologies in cloud computing and big data engineering.
Your main responsibilities:
- Developing and maintaining scalable data pipelines using Spark and Scala
- Designing and implementing workflows in Apache Airflow
- Managing large datasets using Hadoop ecosystem tools like HDFS and Hive
- Migrating and processing data across Google Cloud components such as BigQuery and Dataflow
- Collaborating with data architects and analysts to build efficient data models
- Automating data integration and deployment processes using Jenkins and Git
- Monitoring and troubleshooting data pipelines to ensure high reliability
- Conducting code reviews and ensuring best practices in data engineering
- Working closely with stakeholders to gather data requirements and support reporting needs
- Participating in Agile ceremonies and contributing to sprint deliverables
You’re ideal for this role if you have:
- 5+ years of experience in data engineering or a related field
- Strong knowledge of Apache Spark and Scala
- Hands-on experience with Hadoop, Hive, and HDFS
- Experience working with Google Cloud Platform, especially BigQuery and Dataflow
- Proficiency with SQL and data modeling techniques
- Familiarity with CI/CD tools such as Jenkins and version control with Git
- Experience with Apache Airflow or similar orchestration tools
- Understanding of DevOps practices and Agile methodologies
- Strong debugging and problem-solving skills
- Excellent communication and interpersonal skills
It is a strong plus if you have:
- Experience with Cloud Dataproc, Cloud Pub/Sub, and Cloud Composer
- Hands-on experience with Tableau or other data visualization tools
- Familiarity with Enterprise Data Warehouse technologies
- Exposure to customer-facing roles or working with enterprise clients
- Experience with automated testing frameworks for data pipelines
- Knowledge of Cloud design patterns and architecture
- Google Cloud certification
- Experience using Jira for project tracking
- Understanding of both relational and non-relational big data modeling
- Prior involvement in ESG or sustainability data projects
#GETREADY to meet with us!
We would like to meet you. If you are interested, please apply and attach your CV in English or Polish, including a statement that you agree to our processing and storing of your personal data. You can also apply by sending us an email at recruitment@itds.pl.
Internal number #6922
Address:
SKYLIGHT BUILDING | ZŁOTA 59 | 00-120 WARSZAWA
BUSINESS LINK GREEN2DAY BUILDING | SZCZYTNICKA 11 | 50-382 WROCŁAW
Contact:
INFO@ITDS.PL
+48 883 373 832