Employment type: Full-time
Contract type: Permanent (indefinite term)
Number of vacancies: 1
Minimum experience: 1 year
Education: Bachelor's degree
Industry: Finance, IT Consulting, IT Project Management
Description
We are a member of the French ALTEN Group, present in 25 countries around the world and employing over 34,000 engineers and IT specialists. Since 1988 we have delivered advanced IT systems for well-known brands and supported development in medicine and the renewable energy industry. ALTEN drives innovation in aeronautics and astronautics, trains, electric and autonomous vehicles, and even space rockets. We are currently looking for a Big Data Developer/Architect to join our team.
Main tasks:
Administer Hadoop clusters, users, and cluster resources
Plan and size cluster capacity
Perform regular operations such as adding and removing nodes and troubleshooting failures
Secure Hadoop clusters using Kerberos, Knox, and LDAP integration
Set up security policies, including ACLs and RBAC, using Ranger, Sentry, etc.
Monitor, tune, and improve Hadoop clusters to keep them healthy
Evaluate and enable new Hadoop components
Install and upgrade R, Python, Spark, JupyterHub, and other analytics tools and related packages
Integrate data sources such as Oracle and SQL Server, and enterprise tools such as OBIEE and Tableau
Support the user base through adequate resource management on Hadoop clusters
Develop best practices and train users
Deploy Hadoop components in Docker containers
Requirements:
- At least 4 years' working experience as a Hadoop Administrator in the Cloudera or Hortonworks Hadoop ecosystem
- Deep understanding and strong conceptual knowledge of Hadoop architecture components
- Strong hands-on experience with and knowledge of Hadoop core components such as HDFS, YARN, Hive, Spark, Kafka, etc.
- Hands-on experience and knowledge of Linux and hardware
- Strong hands-on scripting experience, including Bash and Python
- Strong analytical mindset for solving complicated problems
- Drive to resolve issues and dig into potential problems
- Self-starter who works with minimal supervision; ability to work in a team with diverse skill sets
- Ability to understand customer needs and requests and provide the right solution
Optional skills:
- Any contribution to the Hadoop open-source community
- Experience with tools such as Hive, Spark, HBase, Sqoop, Impala, Kafka, Flume, Oozie, etc.
- Basic experience with and knowledge of an automation tool such as Chef, Puppet, or Ansible
- Hands-on programming experience in Python
- Experience in the end-to-end design and build of near-real-time and batch data pipelines
We offer:
- Stable and long-term cooperation
- Certification and training opportunities
- Choice of contract type (including B2B)
- Benefit package (MultiSport card, Medicover, MyBenefit)
- Unlimited growth and development opportunities
- Relocation support & relocation package
Do not hesitate and join our team!