Westhouse is one of the leading international recruitment agencies for the procurement of highly qualified experts in fields such as IT lifecycle management, SAP, engineering, commerce and specialist consultancy.

Please note that we can only present candidates to this client who are employed by a company and subject to social security contributions. Managing partners are classified as freelancers under this client's definition and therefore cannot be considered.

Data Engineer (m/f/d) - Munich & Remote

General information

Reference:
146618
Start:
22.03.2021
Duration:
31.12.2021
Job Type:
Project
Location:
Munich & Remote
Volume:
Full-time, 5 days/week
Languages:
German, English

Your Tasks

  • Develop new functionalities, especially for the core data capabilities provided by the Global Data Platform (GDP):
    o Real Time Ingestion Service
    o Data Governance Solution
    o Knowledge Graph
    o Data Supermarket (UI for data consumers based on the knowledge graph or ingested data to search and consume data sets available within Allianz)
  • Understand and explain advantages and disadvantages of the proposed solutions to internal and external stakeholders
  • Participate in Scrum ceremonies (e.g. daily stand-up, sprint review)
  • Evaluate new technologies in the field of data engineering, processing and management
  • Contribute to improving the Allianz big data software stack and to defining the final production environment
  • Look for opportunities to improve performance, reliability and automation
  • Resolve incidents and change requests
  • Support and interact with data consumers and data engineers
  • Write technical documentation, announcements, blog posts, and best practices

Your Skills

  • Proficiency in at least one programming language (we have components written in Angular, Clojure, Python, Scala, Elm etc.)
  • Software engineering (design patterns, version control, automated testing, concurrent programming etc.)
  • Continuous integration, deployment, and delivery
  • Competence in running big data workloads in production at scale
  • Data Engineering Pattern (Ingestion, Transformation, Storage, Consumption etc.)
  • Event-based systems (deep knowledge of the Confluent Kafka stack)
  • Databases (e.g. PostgreSQL, Neo4J, Stardog)
  • Cloud Storage (Azure Datalake or AWS S3 with EMR)
  • Distributed systems (e.g. Spark)
  • Knowledge Graph (e.g. Stardog, Metaphactory)
  • Data Governance Frameworks (e.g. Informatica)
  • Data Virtualization Technologies (e.g. Denodo)
  • Advanced experience with Linux
  • Containerization / Container Orchestration (Docker & Kubernetes)
  • General understanding of Infrastructure, Orchestration, Distributed Systems and IT Security Principles (Data Access Control)

Interested?

We look forward to receiving your application documents in electronic form.