Azure Big Data Architect - Bonn, Germany - DHL

DHL
Verified company
Bonn, Germany

2 weeks ago

Posted by:

Lena Wagner

beBee Recruiter


Description
DO YOU WANT TO WORK & LEARN ON STATE-OF-THE-ART TECHNOLOGY IN A TEAM WITH STRONG ENGINEERING KNOWLEDGE AND ENJOY THE BENEFITS OF A CORPORATE ENVIRONMENT?

JOIN OUR TEAM IN BONN, BERLIN, PRAGUE, CHENNAI, CYBERJAYA OR FULLY REMOTE IN GERMANY, CZECH REPUBLIC OR INDIA FOR A FULL-TIME OR PART-TIME POSITION, STARTING AS SOON AS POSSIBLE

#Cloud, #CloudArchitecture, #DataLake, #Python, #Scala, #Spark, #DeltaLake, #Kubernetes, #Azure, #GCP, #DevOps, #GitHubActions, #Terraform

DO YOU WANT TO MAKE A DIFFERENCE?

WE OFFER EXCELLENT OPPORTUNITIES FOR PROBLEM SOLVERS.

Deutsche Post DHL Group is the world's leading mail and logistics services provider. As one of the planet's largest employers, operating in over 220 countries and territories, we see the world differently.

Join our team and discover how an international network that's focused on service, quality and sustainability is able to connect people and improve lives through the power of global trade.

And not just for our customers, but for every member of our team, too.


Join a great international team (> 15 nationalities) of data engineers, cloud engineers, DevOps engineers, data scientists and architects, to learn from them and to share your experiences.

The team language is English, so you don't need to speak any German. In our family-friendly environment, we offer part-time working, flextime and sabbaticals.


Your tasks

You will:

  • Run requirements workshops and analyses for Azure data platforms and related solutions with business departments and product owners
  • Create and optimize best-in-class technical architectures for our Azure data platforms (data lakes, data lakehouses) and our solutions on top of these platforms, catering to data engineers, business analysts and data scientists
  • Drive the implementation of these architectures by leading data engineers, DevOps engineers and architects and by doing hands-on technical engineering work
  • Investigate emerging technology trends, concepts and solutions, e.g. by building PoCs and prototypes (see the sketch after this list), to enhance our data platforms and to supplement our best practices
  • Create technical documentation on our platforms, related solutions and best practices in general
  • Mentor junior architects and engineers
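
To give a flavor of the hands-on engineering behind these tasks, here is a minimal PySpark + Delta Lake sketch of a typical curation step on a data lake. It is purely illustrative: the storage paths, column names and job name are hypothetical and not taken from this posting.

```python
# Illustrative PySpark + Delta Lake sketch; all paths and column
# names below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("shipment-events-poc")  # hypothetical PoC job name
    # Enable Delta Lake (bundled in Azure Databricks runtimes; for
    # open-source Spark, add the delta-spark package to the classpath).
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Read raw JSON events from the landing zone of a data lake
# (hypothetical ADLS Gen2 path).
raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/shipments/")

# Typical wrangling step: derive a date column and aggregate per route.
daily = (
    raw.withColumn("event_date", F.to_date("event_ts"))
       .groupBy("route_id", "event_date")
       .agg(F.count("*").alias("events"))
)

# Persist the result as a partitioned Delta table in the curated zone.
(daily.write.format("delta")
      .mode("overwrite")
      .partitionBy("event_date")
      .save("abfss://curated@examplelake.dfs.core.windows.net/shipments_daily/"))
```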

Your profile

  • Strong expertise in data platform and solution architecture, specifically:
  • Experience in architecting, designing and building enterprise-grade, scalable and robust big data platforms and solutions in the cloud, preferably on Azure; on-premises experience (Cloudera, MapR) is a plus
  • Ability to gather and assess business and technical requirements and to map them to technical architectures
  • Ability to evaluate architecture alternatives (e.g. Spark Streaming vs. Flink, Databricks vs. open-source Spark, Cassandra vs. HBase, Synapse vs. Cosmos DB)
  • Knowledge of best practices for selecting the optimal mix of managed services (e.g. ADF, Databricks, Synapse) and open-source components
  • Strong expertise in data engineering:
  • 6+ years of experience working as a data engineer or software developer and/or demonstrable involvement in open-source projects

  • Expert in designing, building and maintaining large-scale data pipelines, incl. processing (transforming, aggregating, wrangling) data.
  • Ability to understand and write complex SQL queries in data analytics projects.
  • Experience in at least 2 of the following programming languages: Python, Scala, Java, Kotlin, Go, Rust, C#, F#, C, C++.
  • Strong hands-on expertise in platform and DevOps engineering:
  • Cloud computing concepts and Azure cloud platform concepts (e.g. networking, security and monitoring)
  • Strong hands-on expertise in specific big data technologies:
  • At least 3 Azure data services: ADLS Gen2, Azure Data Factory, Azure Databricks, Cosmos DB, Synapse Serverless or Synapse Dedicated SQL Pool, Azure Data Explorer, Stream Analytics, etc.
  • At least 2 big data technologies and frameworks: HDFS/S3/ADLS Gen2/GCS, Apache Spark/EMR/Dataproc, Delta Lake/Iceberg, Flink, Hive/Impala, Presto/Drill, etc.
  • NoSQL technologies like HBase, MongoDB, Cassandra, Azure Cosmos DB, etc.
  • Streaming technologies like Kafka, Spark Structured Streaming, Flink or equivalent cloud services like Azure Event Hubs, Azure Stream Analytics, Confluent, etc. (see the sketch after this list)
  • Personal skills:
  • Experience in customer communication and in managing business stakeholders
  • Ability to work effectively both independently and in a team.
  • Strong verbal and written communication and presentation skills, flexibly adaptable to different target audiences (business, technical, developers).
  • Strong analytical skills.
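
As an illustration of the streaming technologies listed above, the following minimal Spark Structured Streaming sketch consumes a Kafka topic and prints the decoded messages. The broker address and topic name are hypothetical; a production pipeline would sink to Delta Lake or a similar store instead of the console.

```python
# Illustrative Spark Structured Streaming sketch; broker address and
# topic name are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-stream-poc").getOrCreate()

# Subscribe to a Kafka topic (requires the spark-sql-kafka package
# on the classpath).
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "shipment-events")
    .load()
)

# Kafka delivers keys/values as bytes; cast the value to a string
# so it can be parsed downstream.
parsed = events.select(F.col("value").cast("string").alias("json"))

# Print micro-batches to the console for a quick prototype.
query = (
    parsed.writeStream
    .format("console")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```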

Your benefits


We offer excellent employee benefits, a competitive salary package and great development opportunities, such as attending conferences or paid trainings.

We welcome full-time (40 hours) and part-time work and offer hybrid working, both at home and in our offices.

If you move to Bonn or Berlin to join one of our offices, we support your relocation.
