1
Analyze and verify: examine the processes and systems of banks and IT service providers directly on site, verify statements, and ensure that all risks are captured and addressed without gaps. Make risks visible: document identified vulnerabilities in a structured way and clearly …
2
Tasks - Administer, monitor and optimize our Big Data environment based on Apache Hadoop from Cloudera (AWS Cloud) - Manage and maintain services such as Kafka, Flink, NiFi, DynamoDB and Iceberg Tables - IaC deployment via Terraform
3
Cloudera CDH - Python - Apache Kafka - Apache Kudu - NiFi - SQL - Scala - Our expectations of you: Qualifications - Proven expertise in real-time data technologies such as Kafka and Spark Structured Streaming, plus solid SQL skills (including query tuning and optimization)
4
for Wien, Linz, Wels, Salzburg, Graz, St. Pölten - Tasks: Administer, monitor and optimize our Big Data environment based on Apache Hadoop from Cloudera (AWS Cloud) - Manage and maintain services such as Kafka, Flink, NiFi, DynamoDB and Iceberg Tables
5
Experience developing data pipelines with the Cloudera stack is essential - Some experience developing data pipelines in Databricks is a plus - Proven expertise in real-time data technologies such as Kafka and Spark Structured Streaming, plus solid SQL skills (including query tuning and optimization)
6
Several years of data engineering experience building batch and real-time data pipelines - 3+ years of hands-on experience with Databricks is essential - Experience developing data pipelines with the Cloudera stack is a plus - Strong knowledge of the Azure cloud ecosystem, including Azure Data Lake and Azure DevOps
7
THIS IS US - The Data Science department uses statistical modeling, machine learning and optimization techniques to continuously improve business processes at NEW YORKER … Work closely with the data engineering and data platform teams to maintain, operate and improve the infrastructure and pipelines that run critical data …
8
THIS IS US - The Data Science department uses state-of-the-art machine learning and optimization techniques to continuously improve business processes at NEW YORKER … Operate and optimize Hadoop (Cloudera) clusters with support of the DevOps and Infrastructure teams
9
Job Details - Must Haves: A passion for optimizing the Software Development Lifecycle to speed delivery of solutions - Min. 3 years of relevant professional experience with DevOps-related topics - Public Cloud experience, preferably GCP - Experience with the Cloudera platform
10