London, England, United Kingdom Hybrid / WFH Options
Aventum Group
Profisee), Snowflake Data Integration, Azure Service Bus, Delta Lake, BigQuery, Azure DevOps, Azure Monitor, Azure Data Factory, SQL Server, Azure Data Lake Storage, Azure App Service, Apache Airflow, Apache Iceberg, Apache Spark, Apache Hudi, Apache Kafka, Power BI, Azure ML is a plus Experience with Azure more »
up and learn new technologies and frameworks Nice to have: Knowledge of databases, SQL Familiarity with Boost ASIO Familiarity with data serialization formats such as Apache Arrow/Parquet, Google Protocol Buffers, FlatBuffers Experience with gRPC, HTTP/REST and WebSocket protocols Experience with Google Cloud/AWS and/ more »
that incorporate various data backends, query languages and ORM frameworks. Experience designing and building ETL pipelines built around libraries and frameworks like Pandas and Apache Spark. Strong API design skills and a familiarity with building web applications. A proponent of great testing, first-class observability and automating everything. Familiarity more »
Java, Python, and Ruby. Experience in database design under MS SQL, MySQL, Firebird, or similar servers. Experience with web servers such as IIS and Apache, or similar servers. Experience in Web design using HTML, JSON, JavaScript, etc. Experience in API design. Degree in electronics engineering/IT - Programming or more »
Greenford, London, United Kingdom Hybrid / WFH Options
Indotronix Avani UK Ltd
developing and optimising ETL pipelines. Version Control: Experience with Git for code collaboration and change tracking. Data Pipeline Tools: Proficiency with tools such as Apache Airflow. Cloud Platforms: Familiarity with AWS, Azure, Snowflake, and GCP. Visualisation: Tableau or Power BI. Delivery Tools: Familiarity with agile backlogs, code repositories, automated builds, testing, and releases. more »
pipeline solutions for the ingestion, transformation, and serving of data, as well as solutions for the orchestration of pipeline components (e.g. AWS Step Functions, Apache Airflow). Good understanding of data modelling, algorithms, and data transformation techniques to work with data platforms. Working knowledge of cloud development practices (AWS/GCP more »
have a valid visa as we are not able to sponsor. Technical Stack: Python, PostgreSQL, Azure Databricks, AWS (S3), Git, Azure DevOps CI/CD, Apache Airflow. Skills: years of experience in Python scripting; in developing applications in Python; to Python-oriented algorithm libraries such as NumPy, pandas more »
South East London, England, United Kingdom Hybrid / WFH Options
Client Server
to production, providing subject matter expertise on the .Net stack and contributing to technical design discussions. You'll use a range of technology including Apache Flink with Java for large scale data processing and will be able to assess and recommend new and emerging technologies, using the best tool more »
Terraform/Docker/Kubernetes. Write software using either Java/Scala/Python. The following are nice to have, but not required: Apache Spark jobs and pipelines. Experience with any functional programming language. Database design concepts. Writing and analysing SQL queries. Application over. VIOOH: Our recruitment team will more »
Flask, Tornado or Django, Docker Experience working with ETL pipelines is desirable e.g. Luigi, Airflow or Argo Experience with big data technologies, such as Apache Spark, Hadoop, Kafka, etc. Data acquisition and development of data sets and improving data quality Preparing data for predictive and prescriptive modelling Hands on more »
South East London, England, United Kingdom Hybrid / WFH Options
McGregor Boyall
Processing: Work with Hadoop, Spark, and other platforms for large-scale data processing. Real-Time Data Streaming: Develop and manage pipelines using CDC, Kafka, and Apache Spark. Database Management: Handle SQL databases like Oracle, MySQL, or PostgreSQL. Data Governance: Ensure data quality, security, and compliance with best practices. Ideal Candidate: Significant experience in more »
London, England, United Kingdom Hybrid / WFH Options
McGregor Boyall
Work with Hadoop, Spark, and other platforms for large-scale data processing. Real-Time Data Streaming: Develop and manage pipelines using CDC, Kafka, and Apache Spark. Database Management: Handle SQL databases like Oracle, MySQL, or PostgreSQL. Data Governance: Ensure data quality, security, and compliance with best practices. Ideal Candidate more »
management and data governance open source platform that we will teach you. Other technologies in use in our space: RESTful services, Maven/Gradle, Apache Spark, Big Data, HTML5, AngularJS/ReactJS, IntelliJ, GitLab, Jira. Cloud Technologies: You'll be involved in building the next generation of finance systems more »
Greater London, England, United Kingdom Hybrid / WFH Options
Understanding Recruitment
use Java (for a very small amount of scripting work) Have public cloud experience with AWS or other cloud providers Have an understanding of Apache products such as Kafka and Flake Good knowledge of development using CI/CD Bonus points if you have knowledge of: Web products Financial markets more »