Join us and shape the data landscape with our team!
Krakow-based opportunity with the possibility to work in a hybrid model!
As a Data Engineer, you will be working for our client, a leading financial institution, within the HSBC Market and Security Services (MSS) division. The MSS team provides tailored financial solutions to major government, corporate, and institutional clients worldwide. Your role will be part of the Global Business Insights (GBI) Transformation, a large and complex data integration program serving diverse users and data visualization needs.
Your main responsibilities:
- Designing, building, testing, and deploying Google Cloud data models and transformations in BigQuery and the Datafusion environment
- Reviewing, refining, and implementing business and technical requirements
- Actively participating in the refinement of User Stories, Epics, and Backlogs in Jira to maintain ongoing productivity and clear priorities
- Managing code artifacts and CI/CD processes using tools like Git, Jenkins, and Google Secrets Manager
- Estimating, committing, and delivering requirements with a focus on scope, quality, and time expectations
- Writing automated unit and regression tests in line with a test-centric development approach
- Delivering non-functional requirements, IT standards, and developer and support tools to ensure secure, compliant, scalable, reliable, and cost-effective applications
- Adhering to consistent logging, monitoring, error handling, and automated recovery practices as per HSBC standards
- Developing procedures and scripts for data migration, back-population, and initialization
- Maintaining a high-quality, up-to-date knowledge base, wiki, and admin pages for the solution
You’re ideal for the role if you have:
- 4 years of experience in the design, development, and administration of traditional and cloud databases, data warehouses, procedures, and products
- 1 year of experience developing, refactoring, and optimizing SQL/T-SQL procedures in BigQuery or equivalent cloud databases
- Good understanding of GCP core and data products, architecture, and design patterns
- Data preparation, wrangling, and refactoring skills as part of a data science pipeline
- Expertise in data visualization and editing for web, dashboard, or other user interfaces
- Strong knowledge of IT methodologies and practices, including Agile/Scrum
- Experience with collaboration tools such as JIRA and Confluence
- BS/MS degree in Computer/Data Science, Engineering, Data, or related field
- Excellent communication and interpersonal skills in English, with the ability to learn rapidly and independently
It is a strong plus if you have:
- Experience developing BI/MI reports and dashboards in a popular tool such as Qlik (VizLib Library, VizLib Collaboration, Mashups, etc.), Tableau, or Looker
- Experience with GCP-based big data/ETL solutions in a DevOps/DataOps model
- Experience deploying and operating Datafusion/CDAP-based solutions
- Experience building and operating CI/CD lifecycle management with tools such as Git, Jenkins, Groovy, Checkmarx, Nexus, and Sonar IQ
- Expertise in Java, Python, and Dataflow
- Broad experience with IT development and collaboration tools
- An understanding of IT security and application development best practices
- An understanding of, and interest in, investment products, their life cycles, and the nature of the investment banking business
- Experience working with infrastructure teams to deliver the best architecture for applications
- Experience working in a global team across different cultures
#GETREADY to meet with us!
We would like to meet you! If you are interested, please apply and attach your CV in English or Polish, including a statement consenting to our processing and storing of your personal data. You can also apply by emailing us at recruitment@itds.pl.
Internal number #3999
Address:
SKYLIGHT BUILDING | ZŁOTA 59 | 00-120 WARSZAWA
BUSINESS LINK GREEN2DAY BUILDING | SZCZYTNICKA 11 | 50-382 WROCŁAW
Contact:
INFO@ITDS.PL
+48 694 564 593