... structured tables is done through Spark jobs in Databricks or data ... focus is on rearchitecting existing Spark applications to either Cloud Dataproc ... , including a Kubernetes cluster with Apache Kafka, Apache Airflow and Apache Flink, are used to meet ...
www.adzuna.pl
15 000 - 20 000 PLN net/month
... structured tables is done through Spark jobs in Databricks or data ... focus is on rearchitecting existing Spark applications to either Cloud Dataproc ... , including a Kubernetes cluster with Apache Kafka, Apache Airflow and Apache Flink, are used to meet ...
www.adzuna.pl
20 000 - 30 000 PLN net/month
... making full use of the Spark cluster's potential; cleaning, transforming, and analyzing huge ... experience in these areas: Apache Spark 2.x, Apache Spark RDD API, Apache Spark SQL DataFrame API, Apache Spark Streaming API, Scala, query tuning ...
www.goldenline.pl
... requirements: experience working as a developer, application developer
pl.jooble.org
... structured tables is done through Spark jobs in Databricks or data ... focus is on rearchitecting existing Spark applications to either Cloud Dataproc ... , including a Kubernetes cluster with Apache Kafka, Apache Airflow and Apache Flink, are used to meet ...
pl.talent.com
... structured tables is done through Spark jobs in Databricks or data ... focus is on rearchitecting existing Spark applications to either Cloud Dataproc ... , including a Kubernetes cluster with Apache Kafka, Apache Airflow and Apache Flink, are used to meet ...
pl.jooble.org