VetJobs - The Leading Military Job Board

Job Information

Dynatrace Senior DevOps Engineer for Data Engineering team (m/f/x) in Vienna, Austria

Your role at Dynatrace

Data = information = knowledge = power. Do you want to hold the keys to that power? Are you motivated by solving challenging problems, where creativity is as crucial as your ability to write code, deliver solutions, and bring valuable data sets together to answer business questions?

If this sounds like an environment where you will thrive, come and join our Data Engineering team. Interested? Because we are!

About the role

At Dynatrace we are all about automation, self-healing, and a NoOps approach. We preach automation wherever possible, and we live by what we preach.

The Data Engineering team we are hiring for provides the data that drives the world-class Application Intelligence platform that is Dynatrace.

As a DevOps Engineer on the Data Engineering team, you will help us automate our Data Platform, both by providing the necessary tooling and by designing processes.

It is quite a unique situation: Dynatrace delivers one of the best tools for DevOps, and in this role you would put your experience to work driving that product, dogfooding it whenever possible and building a tool for other DevOps engineers as well.

You will be building tools to automate installation at scale, accelerating time-to-value and enhancing the reliability of the Data Platform. That includes scripts, but we may also need to integrate with existing mechanisms via APIs or provide ways to reconfigure an already deployed product. You will have an impact on how we shape our ETL pipeline and ensure that deployments of new pipeline builds are automatic, predictable, and transparent. All of this works toward eliminating data downtime and building trust in the data that your fellow Dynatracers, at all levels of seniority, use to build the product that our customers love.

This is an exciting opportunity to make a direct, tangible impact on our product and work on our crucial Digital Business Platform.

As a member of the Data Engineering team, you will be at the center of Dynatrace product innovation.

In a company organized as agilely as Dynatrace, you are always encouraged to explore new areas when you find them interesting, move into new positions, and build a career with Dynatrace.

We guarantee plenty of challenges and scope to grow.

What will help you succeed

Main responsibilities

  • Creating deployment integrations for cloud platforms, primarily AWS and Azure.

  • Deployment automation with Jenkins and Terraform

  • Designing and automating processes for ETL data pipeline(s)

  • Proactively ensuring continuous and smooth execution of data-related processes

  • Collaboration in international cross-lab teams (mostly in the same time zone, across Europe) on the delivery of current objectives.

Priority skills & experience

  • 3+ years of professional experience with process automation, preferably as a DevOps engineer, SRE, or sysadmin

  • 3+ years of experience working with cloud solutions, preferably AWS, covering configuration, deployment management, and automation

  • Experience with deployment automation and CI/CD pipelines, preferably using Jenkins

  • Good English communication skills.

Desired skills & experience

  • Experience with Cloud databases, preferably Snowflake

  • Experience with DB services administration (PostgreSQL, AWS RDS, Aurora, Snowflake) and practical knowledge of SQL

  • Practical knowledge of IaC tools such as CloudFormation and Terraform

  • Mindset focused on monitoring and observability

Nice-to-haves

  • Experience in CI/CD support for MS Power BI development

  • Experience with working on data pipelines (ETL / ELT) automation as well as supporting Data Engineering and Data Science team(s)

  • Experience with multiple cloud platforms (AWS, GCP, Azure)

  • Good command of scripting languages such as Python, shell script, or PowerShell

  • Practical knowledge of IaC tools such as Ansible, Chef, Puppet, PowerShell DSC, SaltStack, CloudFormation, and Terraform

  • Java literacy and experience with other programming languages

  • Familiarity with Docker and Kubernetes.

Why you will love being a Dynatracer

  • Dynatrace is a leader in unified observability and security.

  • We provide a culture of excellence with competitive compensation packages designed to recognize and reward performance.

  • Our employees work with the largest cloud providers, including AWS, Microsoft, and Google Cloud, and other leading partners worldwide to create strategic alliances.

  • The Dynatrace platform uses cutting-edge technologies, including our own Davis hypermodal AI, to help our customers modernize and automate cloud operations, deliver software faster and more securely, and enable flawless digital experiences.

  • Over 50% of the Fortune 100 companies are current customers of Dynatrace.

Compensation and Rewards

  • We offer attractive compensation packages and stock purchase options with numerous benefits and advantages.

  • Due to legal reasons, we are obliged to disclose the minimum salary for this position, which is € 56,000 gross per year based on full-time employment. We offer a higher salary in line with qualifications and experience.
