Join us and create cutting-edge pipelines for seamless data transformation!
Kraków-based opportunity with a hybrid work model (2 days/week in the office).
As a Data Engineer, you will be working for our client, a global financial institution that is driving DevOps transformation through data analytics and engineering. You will be part of a team that provides key metrics and analytical products to enhance software engineering practices across the organization. Your role will focus on developing data transformation pipelines, ensuring data quality, and supporting a cloud data platform to improve the overall DevOps experience. You will collaborate with diverse global teams to deliver enriched datasets, dashboards, and insights that enable strategic decision-making.
Your main responsibilities:
- Designing, developing, testing, and deploying data ingest, quality, refinement, and presentation pipelines
- Operating and iterating on a cloud data platform to support internal goals
- Building and maintaining ETL processes and data transformation pipelines
- Ensuring data quality and implementing automated data validation solutions
- Developing data marts and optimizing schema designs for performance and usability
- Collaborating with business stakeholders to understand data needs and deliver actionable insights
- Working with cloud-based big data technologies, particularly Google Cloud Platform (GCP) and BigQuery
- Utilizing orchestration and scheduling tools such as Airflow and Cloud Composer
- Supporting continuous integration and continuous delivery (CI/CD) processes
- Following Agile methodologies and working within a product-oriented culture
You’re ideal for this role if you have:
- At least 7 years of professional experience in SQL development
- Strong experience in data engineering and ETL processes
- Expertise in GCP, BigQuery, and dbt (data build tool)
- Hands-on experience with Apache Airflow and Cloud Composer
- Proficiency in data modeling and designing optimized data schemas
- Experience with data streaming technologies such as Kafka
- Familiarity with BI tools, especially Looker Studio
- Understanding of DevOps principles and working in a DevOps environment
- Experience with Continuous Integration and Continuous Delivery (CI/CD) practices
- Strong communication skills and ability to work with global teams
It is a strong plus if you have:
- Experience in building and operating a cloud data platform
- Knowledge of data architecture and data marts
- Proficiency in Git, Shell scripting, and Python
- Ability to quickly learn and adapt to new technologies
- Experience collaborating with technical staff and project managers for efficient delivery
- Proactive approach to identifying improvement opportunities and solving issues
- Comfort in working in fast-paced, changing, and ambiguous environments
#GETREADY to meet with us!
We would like to meet you. If you are interested, please apply and attach your CV in English or Polish, including a statement that you agree to our processing and storing of your personal data. You can also apply by sending us an email at recruitment@itds.pl.
Internal number #6763
Address:
SKYLIGHT BUILDING | ZŁOTA 59 | 00-120 WARSZAWA
BUSINESS LINK GREEN2DAY BUILDING | SZCZYTNICKA 11 | 50-382 WROCŁAW
Contact:
INFO@ITDS.PL
+48 883 373 832