1. Support and continuous development of the SCADA systems, with a focus on data-driven optimization and integration of the renewable energy assets - Collection, analysis, and interpretation of plant data from the hybrid parks (wind, PV, and storage) for targeted fault analysis and upkeep of the maintenance database
2. Administer, monitor, and optimize our Big Data environments based on Apache Hadoop (AWS Cloud) - Manage and maintain services like Kafka, Flink, NiFi, DynamoDB, and Iceberg Tables - IaC deployment via Terraform - Plan and execute updates and upgrades
3. Administration, monitoring, and optimization of the Big Data environment based on Apache Hadoop in the AWS Cloud - Management and maintenance of services like Kafka, Flink, NiFi, DynamoDB, and Iceberg Tables - Deployment using Infrastructure as Code (Terraform)
4. Tasks: Administer, monitor, and optimize our Big Data environment based on Apache Hadoop from Cloudera (AWS Cloud) - Manage and maintain services like Kafka, Flink, NiFi, DynamoDB, and Iceberg Tables - IaC deployment via Terraform
5. Support our clients in areas such as data strategy, architecture, intelligence, big data, analytics, data warehousing, and business intelligence … Hands-on project experience in SAP BI - either frontend, backend, or both. Experience leading (or co-leading) projects as a (partial) project manager
6. Administration, monitoring, and optimization of big data environments based on Apache Hadoop (AWS Cloud) - Management and maintenance of services such as Kafka, Flink, NiFi, DynamoDB, and Iceberg Tables - Deployment using Infrastructure as Code (IaC) with Terraform
7. Experience in system administration of Linux systems (Red Hat) - Expertise in building and operating Big Data environments based on Apache Hadoop clusters - Interest in continuously learning and improving one's skillset - Our features: state-of-the-art technologies
8. For Vienna, Linz, Wels, Salzburg, Graz, and St. Pölten - Tasks: Administer, monitor, and optimize our Big Data environment based on Apache Hadoop from Cloudera (AWS Cloud) - Manage and maintain services like Kafka, Flink, NiFi, DynamoDB, and Iceberg Tables
9. Teaching and conducting examinations independently (the standard teaching load for this position is 5 units/week per semester; 1 unit = 45 minutes of teaching). Teaching courses such as entrepreneurship, new venture creation, and managing high-growth firms; public policy and ecosystem development