Unleash the Power of Data!
Warsaw-based opportunity with remote work possibilities.
As a Big Data Developer, you will work for our client, a leading company in data-driven solutions. Your role involves overseeing the IT infrastructure linked to the Hadoop cluster for Big Data. Daily tasks include resolving issues related to processing large datasets, collaborating closely with analysts and users of Big Data ecosystem tools, and supporting platform development projects. You will be responsible for monitoring and ensuring the robustness of Hadoop systems (Python, Spark, Hive, Impala), along with Linux system administration.
Your main responsibilities:
- Manage the IT infrastructure associated with the Hadoop cluster for Big Data
- Address daily challenges related to processing large datasets
- Provide continuous support to, and collaborate with, analysts and users of Big Data tools
- Support platform development projects
- Monitor and support Hadoop systems (Python, Spark, Hive, Impala)
- Administer Linux systems
You’re ideal for this role if you have:
- Minimum 3 years of practical experience in IT infrastructure management
- Knowledge of the architecture of the distributed Hadoop environment
- Minimum 3 years of experience in Hadoop environment administration (Cloudera, Hortonworks)
- Minimum 3 years of experience in monitoring and supporting applications using Apache Spark, Hive, HBase, Impala, Kudu
- Familiarity with monitoring tools (Zabbix, Grafana, SMM, etc.)
- Basic understanding of relational databases (Oracle, MS SQL, Postgres)
- Linux Red Hat administration expertise
#GETREADY to meet with us!
We would like to meet you. If you are interested, please apply and attach your CV in English or Polish, including a statement that you agree to our processing and storing of your personal data. You can also apply by sending us an email at email@example.com.
Internal number #4571