PySpark Jobs in Yorkshire

4 PySpark Jobs in Yorkshire

Data Engineer

North Yorkshire, England, United Kingdom
KDR Talent Solutions
… a complete greenfield re-architecture from the ground up in Microsoft Azure. The tech you'll be playing with: Azure Data Factory, Azure Databricks, PySpark, SQL, DBT. What you need to bring: 1-3 years' experience building data pipelines, SQL experience in data warehousing; Python experience would …
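The stack named above (Azure Data Factory, Azure Databricks, PySpark, SQL, DBT) centres on building data pipelines. Purely as an illustrative sketch of that kind of work, a minimal PySpark transformation might look like the following; the paths, column names and output location are hypothetical, not details from the listing.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

# Hypothetical raw files landed in the lake (e.g. by an ADF copy activity).
raw = spark.read.parquet("/mnt/landing/orders/")

# Light cleansing plus a warehouse-style daily aggregate.
cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount") > 0)
)

daily_revenue = (
    cleaned.groupBy("order_date")
           .agg(F.sum("amount").alias("revenue"))
)

# Written as Parquet here; on Databricks this would typically be a Delta table
# consumed downstream by SQL or DBT models.
daily_revenue.write.mode("overwrite").parquet("/mnt/curated/daily_revenue/")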

Lead Data Engineer

Leeds, England, United Kingdom
HCLTech
Job description: Lead Data Engineer. We need some strong data engineer profiles… they need good experience with PySpark, Python, SQL, ADF and preferably Databricks. Building new data pipelines and optimizing data flows using the Azure cloud stack; building data products …
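"Optimizing data flows" in a role like this usually comes down to controlling shuffles and partitioning. As a rough sketch only (the table paths, join key and columns are made up), one common lever is broadcasting a small dimension table so the large side of a join is not shuffled:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("optimise_flow").getOrCreate()

# Hypothetical large fact table and small dimension table.
transactions = spark.read.parquet("/mnt/curated/transactions/")
customers = spark.read.parquet("/mnt/curated/customers/")

# Broadcasting the small side avoids shuffling the large table across the cluster,
# a common first step when a join dominates a pipeline's runtime.
enriched = transactions.join(F.broadcast(customers), on="customer_id", how="left")

# Partitioning the output by date keeps downstream reads selective.
enriched.write.mode("overwrite").partitionBy("txn_date").parquet("/mnt/curated/enriched/")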

Spark Architect

Leeds, England, United Kingdom
PRACYVA
… months to begin with & it's extendable. Location: Leeds, UK (min 3 days onsite). Context: legacy ETL code (for example DataStage) is being refactored into PySpark using Prophecy low-code/no-code and available converters. Converted code is causing failures/performance issues. Skills:
Spark Architecture – component understanding around Spark Data Integration (PySpark, scripting, variable setting etc.), Spark SQL and Spark explain plans.
Spark SME – able to analyse Spark code failures through Spark plans and make correcting recommendations.
Spark SME – able to review PySpark and Spark SQL jobs and make performance improvement recommendations.
Spark SME – able to understand DataFrames/Resilient Distributed Datasets and understand any … there are cluster-level failures.
Cloudera (CDP) – knowledge of how Cloudera Spark is set up and how the runtime libraries are used by PySpark code.
Prophecy – high-level understanding of the low-code/no-code Prophecy set-up and its use to generate PySpark code. …
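Since the role centres on reading Spark plans to diagnose converted jobs, here is a minimal sketch of how one might inspect a plan during such a review. The datasets and filter are hypothetical, and explain(mode="formatted") assumes Spark 3.x.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("plan_review").getOrCreate()

# Hypothetical outputs of the converted (ex-DataStage) job under review.
orders = spark.read.parquet("/mnt/converted/orders/")
customers = spark.read.parquet("/mnt/converted/customers/")

joined = orders.join(customers, "customer_id").filter(F.col("amount") > 100)

# Print the physical plan: check the join strategy (SortMergeJoin vs BroadcastHashJoin),
# count the Exchange (shuffle) nodes, and confirm filters are pushed down to the scans.
joined.explain(mode="formatted")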

PySpark in Yorkshire – salary percentiles:
10th Percentile: £57,500
25th Percentile: £77,500
Median: £80,000
75th Percentile: £82,500
90th Percentile: £85,000