Big Data Solution Engineer
Polska
NR REF.: 1148986
HAYS IT Contracting matches IT Contractors with the best employers.
Our passion lies in helping people develop their professional careers in the IT sector - quite simply, we power the world of work.
For our Client we are currently looking for qualified Candidates for the position of:
Big Data Solution Engineer
Location: 100% remote
Job type: contract B2B
Length: long-term cooperation
Rate: 230 - 270 zł / hour net
Branch: Insurance
English: C1 or higher
Responsibilities:
- Roll out the Client's Global Data Platform (GDP) to customers and support the adoption of its technical services and data capabilities
- Design solutions for complete use-case pipelines, from data ingestion through data processing and software deployment to the feedback loop
- Understand and be able to explain the advantages and disadvantages of proposed solutions to internal and external stakeholders
- Build and maintain data-driven applications on top of the GDP
- Define and apply best practices based on the client's platform
- Contribute to improving the GDP stack and to defining the final production environment
- Keep up with trends and evolving technology in the big data and analytics world
- Identify opportunities to improve performance, reliability and automation
- Write technical documentation, announcements, and blog posts about our platform on internal channels
Requirements:
- Theoretical and practical competency in running big data workloads in production at scale
- Familiarity with data engineering patterns (ingestion, transformation, storage, consumption, etc.)
- Event-based systems (deep knowledge of the Confluent Kafka stack)
- Relational databases (e.g. Postgres)
- Graph databases (e.g. Neo4j, Stardog)
- Cloud storage (Azure Datalake or AWS S3 with EMR)
- Distributed systems (e.g. Spark)
- Knowledge graph (e.g. Stardog)
Software Development
- Proficiency in at least one programming language (we have components written in Python, Angular, Clojure, Scala, Elm, etc.)
- Software engineering (microservices, design patterns, version control, automated testing, concurrent programming, etc.)
- Continuous integration, deployment, and delivery
General IT Skills
- Advanced experience with Linux
- Containerization and container orchestration (Docker & Kubernetes)
- General understanding of IT infrastructure, orchestration and IT security principles, especially on enterprise level
Other Skills
- Bachelor's (BSc), Master's (MSc) or equivalent experience in a technical field (for example, Computer Science, Engineering...)
- DevOps mindset (you build it, you run it; taking responsibility for your work)
- Willingness and ability to learn new technologies and tools
- Team player, open to working in an agile environment
- Ability to communicate in a results-oriented way with people from different departments and with different skill sets
Our offer:
- Very attractive rate
- A variety of projects to participate in, working closely with big brands
- Real impact on the project – if you are good, you make decisions independently
- Fast learning opportunities
- Work for an international client - 100% work in English
Please apply using the button on the right-hand side of this advertisement.