Big Data Solutions Architect - Munich, Germany - Databricks

    Databricks
    Description
    Big Data Solutions Architect - Munich, Germany

    CSQ225R53

    As a Big Data Solutions Architect (Resident Solutions Architect) on our Professional Services team, you will work with clients on short- to medium-term engagements to address their big data challenges using the Databricks Data Intelligence Platform. You will deliver data engineering, data science, and cloud technology projects that require integration with client systems, training, and other technical work to help customers get the most value out of their data. RSAs are billable and know how to complete projects according to specification with excellent customer service. You will report to the regional Manager/Lead.

    The impact you will have:

  • You will work on a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-tos, and productionizing customer use cases
  • Work with engagement managers to scope a variety of professional services work with input from the customer
  • Guide strategic customers as they implement transformational big data projects and third-party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications
  • Consult on architecture and design; bootstrap or implement customer projects, leading to customers' successful understanding, evaluation, and adoption of Databricks
  • Provide an escalated level of support for customer operational issues
  • Work with the Databricks technical team, Project Manager, Architect, and Customer team to ensure the technical components of the engagement are delivered to meet the customer's needs
  • Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution of engagement-specific product and support issues

    What we look for:

  • Proficient in data engineering, data platforms, and analytics with a strong track record of successful projects and in-depth knowledge of industry best practices
  • Comfortable writing code in either Python or Scala
  • Enterprise Data Warehousing experience (Teradata, Synapse, Snowflake, or SAP)
  • Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one
  • Deep experience with distributed computing using Apache Spark and knowledge of Spark runtime internals
  • Familiarity with CI/CD for production deployments
  • Working knowledge of MLOps
  • Design and deployment of performant end-to-end data architectures
  • Experience with technical project delivery, including managing scope and timelines
  • Documentation and whiteboarding skills
  • Experience working with clients and managing conflicts
  • Ability to build skills in technical areas that support the deployment and integration of Databricks-based solutions to complete customer projects
  • Travel to customers 30% of the time
  • Databricks Certification

    Benefits

  • Private medical insurance
  • Life, accident & disability insurance
  • Pension plan
  • Vision reimbursement
  • Equity awards
  • Enhanced parental leave
  • Fitness reimbursement
  • Annual career development fund
  • Home office & work headphones reimbursement
  • Business travel accident insurance
  • Mental wellness resources
  • Employee referral bonus