Join us, and transform ideas into high-performance digital platforms!
Krakow-based opportunity with the possibility to work 60% remotely!
As a Data Analyst, you will be working for our client, a renowned global financial institution leading the way in data engineering and digital transformation. In this role, you will play a pivotal part in building, optimizing, and scaling high-performance data pipelines and applications. By leveraging your experience in data processing, integration, and emerging technologies, you will support large-scale projects that transform data management frameworks, enhance business insights, and shape data-driven decision-making across the organization.
Your main responsibilities:
- Designing and implementing data products that meet complex, large-scale needs
- Analyzing data sources, performing profiling, and synthesizing findings for stakeholders
- Collaborating to enhance data integration processes and ensure resilience and scalability
- Building and optimizing data pipelines using Hadoop, PySpark, and Hive
- Developing data models across multiple domains and applying industry standards
- Ensuring high-quality metadata management for improved accessibility and usage
- Monitoring and resolving issues within data products and services proactively
- Applying programming standards to code, test, and document software efficiently
- Reviewing specifications to identify risks and testing conditions for robust outcomes
- Advising on emerging data technologies and their impact on data tools and solutions
You’re ideal for this role if you have:
- Proven experience in data analysis, profiling, and synthesis for technical and non-technical audiences
- Proficiency in building data integration processes and managing large data sets
- Strong skills in designing resilient data models and integration solutions
- Practical expertise with the Hadoop ecosystem, PySpark, Scala, and Hive
- Familiarity with CI/CD practices and data pipeline management
- Background in metadata management, from design to documentation
- Experience in resolving issues within databases and data engineering environments
- Technical understanding of data engineering and programming fundamentals
- Ability to conduct thorough testing and report analysis with attention to risk
- Effective communication skills with diverse technical teams
It is a strong plus if you have:
- Knowledge of Java Spring Boot, BigQuery, or Dataproc
- Hands-on experience with Airflow for data pipeline orchestration
- Exposure to DevOps practices and automated deployment strategies
- Experience in managing large-scale data systems in financial or regulated industries
#GETREADY to meet with us!
We would like to meet you. If you are interested, please apply and attach your CV in English or Polish, including a statement that you agree to our processing and storing of your personal data. You can also apply by sending us an email at recruitment@itds.pl.
Internal number #6038
Address:
SKYLIGHT BUILDING | ZŁOTA 59 | 00-120 WARSZAWA
BUSINESS LINK GREEN2DAY BUILDING | SZCZYTNICKA 11 | 50-382 WROCŁAW
Contact:
INFO@ITDS.PL
+48 883 373 832