Mathematics, Finance, Accounting, Economics or a related field or equivalent work experience (3+ years) Experience in: Some knowledge of database orchestration technologies + ETL (Airflow, DBT, Databricks) Working understanding of financial concepts and systems Ability to recognize and diagnose potential errors or data inconsistencies between multiple reports Working knowledge more »
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
BlackRock
DevOps automation, idempotent deployment testing, and continuous delivery pipelines Networking and security protocols, load balancers, API Gateways ETL tooling and workflow engines (e.g., Spark, Airflow, Dagster, Flyte) Data modeling, and strategies for cleaning and validating data at scale Performance tuning on RDBMS or Big Data tools for row- and more »
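The idempotency the listing above asks for means a data load can be re-run safely with no duplicate rows. A minimal sketch of an idempotent upsert, using stdlib sqlite3 as a stand-in engine; the `prices` table and its columns are illustrative, not from any listing:

```python
import sqlite3

def idempotent_load(conn, rows):
    """Upsert (ticker, price) rows so re-running the same load is a no-op.

    The PRIMARY KEY plus ON CONFLICT clause is what makes the load
    idempotent: replays update in place instead of inserting duplicates.
    """
    conn.execute(
        "CREATE TABLE IF NOT EXISTS prices (ticker TEXT PRIMARY KEY, price REAL)"
    )
    conn.executemany(
        "INSERT INTO prices VALUES (?, ?) "
        "ON CONFLICT(ticker) DO UPDATE SET price = excluded.price",
        rows,
    )
    return dict(conn.execute("SELECT ticker, price FROM prices"))
```

Running the same batch twice yields the same table state, which is the property deployment tests for such pipelines typically assert.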
bonus. Components used in our Data Stack: Fivetran, Prefect, Snowflake, dbt and Periscope. Experience in writing ETL pipelines with SQL and Python, using orchestration tools like Airflow or Prefect. Experience working in fast-paced venture-backed startup environments. At Fresha, we value passion and potential as much as specific skills. If you're more »
few of the open-source libraries we use extensively. We implement the systems that require the highest data throughput in Java and C++. We use Airflow for workflow management, Kafka for data pipelines, Bitbucket for source control, Jenkins for continuous integration, Grafana + Prometheus for metrics collection, ELK for log more »
testing, and maintenance of data pipelines and data storage systems on Google Cloud Platform (GCP). You will be working with technologies such as Apache Airflow, BigQuery, Python, and SQL to transform and load large data sets, ensuring high data quality and accessibility for business intelligence and analytics more »
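Transform-and-load steps of the kind listed above can be sketched with Python plus SQL. A minimal example, with stdlib sqlite3 standing in for BigQuery; the `raw_orders` table, its columns, and the cleaning rule are all invented for illustration:

```python
import sqlite3

def transform_and_load(rows, db_path=":memory:"):
    """Load raw rows, clean them in SQL, and materialize a reporting table."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS raw_orders (id INTEGER, amount TEXT, region TEXT)"
    )
    conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)", rows)
    # Transform: cast text amounts to numbers, drop malformed rows,
    # then aggregate per region into a clean reporting table.
    conn.execute("""
        CREATE TABLE orders_by_region AS
        SELECT region, SUM(CAST(amount AS REAL)) AS total
        FROM raw_orders
        WHERE amount GLOB '[0-9]*'
        GROUP BY region
    """)
    return dict(conn.execute("SELECT region, total FROM orders_by_region"))
```

In a real GCP pipeline the same pattern would run as an Airflow task issuing the SQL against BigQuery; the filter step is where the "high data quality" requirement is enforced.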
have Terraform experience SQL & NoSQL experience Have built out Data Warehouses & built Data Pipelines Strong Databricks & Snowflake experience Docker, ECS, Kubernetes & Orchestration tools like Airflow or Step Functions are nice to have Contracts are running for 6 months initially, paying up to £450p/day (Outside IR35) and will more »
load data to Snowflake. Converting SAS-based modules to Python-based solutions. Candidates who have data management experience in Snowflake with expertise in Python, DBT, Airflow or similar technologies. Must have hands-on experience in DBT & SQL. Snowflake (added advantage more »
Manchester Area, United Kingdom Hybrid / WFH Options
Maxwell Bond
successful Lead Data Engineer will have: Experience leading a Data Engineering team. Extensive working experience with GCP, SQL and DBT. Proficient in: Kafka, Dataform, Airflow, Tableau, PowerBI, Redshift, Snowflake, Terraform and BigQuery. What's in it for the successful Lead Data Engineer: Hybrid working for a better work/ more »
converting SAS-based modules to Python-based solutions. Strong understanding of data management principles and experience working with Snowflake. Proficiency in Python, DBT, and Airflow or similar technologies. Excellent problem-solving skills and ability to troubleshoot complex issues. Experience working in an Agile environment and collaborating with cross-functional more »
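Converting a SAS module to Python typically means replacing a row-wise DATA step with a plain-Python (or vectorized) transform. A minimal sketch; the SAS step and the `income`/`band` column names are hypothetical, invented only to illustrate the mapping:

```python
# Hypothetical SAS DATA step being converted:
#   data out; set in;
#     if income > 50000 then band = 'HIGH';
#     else band = 'LOW';
#   run;

def band_income(records):
    """Python equivalent of the row-wise SAS IF/THEN logic above."""
    out = []
    for rec in records:
        rec = dict(rec)  # copy so the caller's rows are not mutated
        rec["band"] = "HIGH" if rec["income"] > 50000 else "LOW"
        out.append(rec)
    return out
```

In practice such conversions land on pandas or Snowflake SQL for volume, but the row-wise logic maps over one-to-one as shown.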
Greater London, England, United Kingdom Hybrid / WFH Options
Agora Talent
early-stage B2B SaaS experience involving client-facing projects • Experience in front-end development and competency in JavaScript • Knowledge of API development • Familiarity with Airflow, DBT, Databricks • Experience working with Enterprise Resource Planning (e.g. Oracle, SAP) and CRM systems. If this role sounds of interest, please apply using the more »
Oxfordshire, England, United Kingdom Hybrid / WFH Options
Mirus Talent
mandatory, familiarity with the following technologies and tools would be advantageous: Dagster (or similar Orchestration Tools): Experience with Dagster or other orchestration tools like Airflow for managing complex data workflows and pipelines. Qlik Sense Cloud (or similar Reporting Tools): Knowledge of Qlik Sense Cloud or similar reporting tools such more »
London, England, United Kingdom Hybrid / WFH Options
Jobleads-UK
Continuous Delivery Continuous integration pipelines Strong Python AWS or Azure with large-scale streaming data (Pulsar, Kafka, Kinesis, etc.) ETL management; structured or custom (Airflow, Luigi, etc.) Bonus Robust experience managing and developing an engineering team Delta Lake or Iceberg Trino or Presto GraphQL Good salary, bonus, stock options more »
Proven experience in MLOps and deploying machine learning models on Kubernetes. Proficiency in cloud technologies (AWS, GCP, Azure) Experience with data orchestration tools (e.g., Apache Airflow). Familiarity with Terraform for CI/CD and infrastructure as code. Strong programming skills in software development. A cloud-agnostic mindset more »
Southsea, Hampshire, United Kingdom Hybrid / WFH Options
Checkatrade
intelligence, and data warehousing Proficiency in BI tools (e.g., PowerBI, ThoughtSpot) Expert in SQL and Python with knowledge of how to leverage dbt and Airflow Data modelling and governance expertise Knowledge of how to run experimentation platforms and CoE Inspirational leadership styles with the ability to influence senior stakeholders more »
/BS or MA/MS degree in Mathematics, Statistics, Computer Science, Data Science, or a related field. Familiarity with data tools like Druid, Hadoop, Airflow, and Superset is beneficial. Experience with Tableau is a plus. Strong communication skills to collaborate effectively with cross-functional teams. Kraken is powered more »
Greater London, England, United Kingdom Hybrid / WFH Options
Cera
an autonomous environment. 5+ years SQL experience developing data pipelines in a cloud-based database environment (e.g. BigQuery), including scheduling your own queries (e.g. using Airflow or Stitch) 3+ years experience delivering data products in a modern BI technology (ideally Looker) or open source data frameworks Experience managing end-to more »
Manchester, England, United Kingdom Hybrid / WFH Options
Vermelo RPO
sees challenges as development opportunities not problems Desirable Skills Experience of SAS Viya Experience of SAS Visual Analytics Experience of SQL Server Experience with Apache Airflow Experience using MS DevOps for workflow and CI/CD pipelines. Educated to degree standard more »
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
Synchro
in Python or PySpark, we encourage you to apply. Python App Developer Requirements: Proficiency with Python or PySpark. Exposure to cloud technologies such as Airflow, Astronomer, Kubernetes, AWS, Spark, Kafka. Experience with Big Data solutions or Relational DB. Demonstrated knowledge of software applications and technical processes within a cloud more »
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Set2Recruit
proficiency GIT proficiency Linux use and admin Experience deploying cloud services (AWS is a bonus) Experience with Docker and Kubernetes Using frameworks such as Airflow. ML background: PyTorch for computer vision. This is a fully remote role which comes with: Budget for WFH set up. Stock options. 25 days more »
City Of London, England, United Kingdom Hybrid / WFH Options
Cititec Talent
a leading commodities trading firm. Outside of IR35. Hybrid working - 2/3 days in London office. Experience Required: Commodities industry experience; Weather forecasting; Airflow (or equivalent data orchestration platforms); Data Modeling (Star Schema); Data Warehousing; Data Pipeline Orchestration (Kafka); On-Prem SQL more »
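The star schema named above is a fact table joined to surrounding dimension tables. A minimal runnable sketch using stdlib sqlite3; the trading-flavoured tables, columns, and sample data are all illustrative, not from the listing:

```python
import sqlite3

# Illustrative star schema: one fact table keyed to two dimensions.
DDL = """
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, day TEXT);
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_trades (
    trade_id   INTEGER PRIMARY KEY,
    date_id    INTEGER REFERENCES dim_date(date_id),
    product_id INTEGER REFERENCES dim_product(product_id),
    volume     REAL
);
"""

def total_volume_by_product(conn):
    # Typical star-schema query: join the fact table to a dimension
    # and aggregate a measure by a dimension attribute.
    return dict(conn.execute("""
        SELECT p.name, SUM(f.volume)
        FROM fact_trades f JOIN dim_product p USING (product_id)
        GROUP BY p.name
    """))

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)
conn.execute("INSERT INTO dim_date VALUES (1, '2024-01-02')")
conn.execute("INSERT INTO dim_product VALUES (1, 'Brent'), (2, 'WTI')")
conn.execute(
    "INSERT INTO fact_trades VALUES (1, 1, 1, 100.0), (2, 1, 1, 50.0), (3, 1, 2, 75.0)"
)
```

Keeping measures in the fact table and descriptive attributes in dimensions is what makes such warehouse queries a single join-and-group per dimension.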
working on very complex systems Strong experience with Computer vision Longevity in their previous roles Experience with Remote Sensing highly desirable Stack: Python, PyTorch, Airflow, PySpark (equivalent tools are fine more »
knowledge on key technologies like BigQuery/Redshift/Synapse/Pub Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF etc. Strong proven knowledge of Kimball/Dimensional data modelling and/or Data Vault. If you are interested in applying for more »
Experienced creating data pipelines on a cloud (preferably AWS) environment CI/CD experience Containerization experience (Docker, Kubernetes, etc.) Experience with SQS/SNS, Apache Kafka, RabbitMQ Other interesting/bonus skills – Airflow, Trino, Apache Iceberg, Postgres, MongoDB You *must* be eligible to work in your chosen more »