Join us, and influence transformative solutions in a dynamic, global environment!
Krakow-based opportunity with the possibility to work 80% remotely!
As a Data Engineer, you will work for our client, a global financial institution leading the way in creating a cutting-edge, data-driven organization within the Wholesale Chief Data & Analytics Office. This revolutionary data analytics ecosystem aims to generate business insights and provide an exceptional customer experience through well-managed, trusted data assets. The client’s global team collaborates with IT to deliver a vast ecosystem of curated, enriched, and protected datasets. The Wholesale Big Data Lake, with over 300 sources, reflects their commitment to using the latest technologies to solve complex business problems and deliver unique insights.
Your main responsibilities:
- Designing software, developing in Python and GCP, and writing automated tests in a dynamic Agile/DevOps environment
- Promoting development standards, conducting code reviews, and facilitating knowledge sharing
- Contributing to product and feature design through scrum story writing
- Handling Data Engineering tasks, ensuring product support, and troubleshooting
- Implementing tools and processes to manage performance, scale, availability, accuracy, and monitoring
- Collaborating with BAs to interpret and implement requirements, and working with Testers to ensure effective testing
- Participating in regular planning and status meetings, providing input into system architecture and design, and conducting peer code reviews
- Providing third-line support when needed
You’re ideal for this role if you have:
- Strong communication skills, with the ability to translate effectively between technical and non-technical stakeholders
- Proficiency in data analysis and synthesis, including data profiling and system analysis
- Experience in the data development process, designing, building, and testing complex or large-scale data products
- Knowledge of data integration design and the ability to select and implement appropriate technologies for resilient and scalable solutions
- Competence in data modeling and an understanding of industry-recognized patterns and standards
- Expertise in metadata management and problem resolution in databases and data processes (proficiency in SQL and relational database design)
- Programming and build skills in data engineering, using standards and tools to design, code, test, and document programs and scripts (Python)
- Experience with Google Cloud Platform or another cloud vendor
- Fluent English
It is a strong plus if you have:
- Experience with data pipeline building technologies, including BigQuery, Dataproc, and Python
- Knowledge of and experience with the GCP ecosystem and data management frameworks
- Familiarity with the HSBC data governance framework
- CI/CD experience
- Knowledge of and experience with Hadoop, PySpark, Scala, Hive, and Java
#GETREADY to meet with us!
We would like to meet you. If you are interested, please apply and attach your CV in English or Polish, including a statement that you agree to our processing and storing of your personal data. You can also apply at any time by sending us an email at firstname.lastname@example.org.
Internal number #4619