Developer certification. Very desirable to have hands-on experience with ETL tools, Hadoop-based technologies (e.g., Spark), and batch/streaming data pipelines (e.g., Beam, Flink, etc.). Proven expertise in designing and constructing data lake and data warehouse solutions utilising technologies such as BigQuery, Azure Synapse, Redshift, Oracle, and Teradata …
Azure/GCP for our cloud backend. Skills we’d like to hear about: experience working with large data pipelines (using technologies such as Beam or Kafka); experience with LLMs using OpenAI, Gemini, or open-source models; exposure to other programming languages (such as Java); experience of working on …
experience in data engineering. Experienced in building ETL data pipelines. Relational database experience with PostgreSQL. Understanding of the tech within our stack: AWS/Apache Beam/Kafka. Experience with object-oriented programming. A desire to work in the commodities/trading sector. Permanent/full-time employment.
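The ETL-pipeline and PostgreSQL skills listed above can be illustrated with a minimal extract-transform-load sketch. This is a hedged, stdlib-only example: SQLite stands in for PostgreSQL, and the table names (`trades_raw`, `trades_clean`) and columns are invented for illustration, loosely themed on the commodities/trading sector mentioned in the listing:

```python
import sqlite3

def extract(conn):
    """Extract raw trade rows from the (hypothetical) source table."""
    return conn.execute("SELECT symbol, price, qty FROM trades_raw").fetchall()

def transform(rows):
    """Normalise symbols and compute notional value per trade."""
    return [(symbol.upper(), price * qty) for symbol, price, qty in rows]

def load(conn, rows):
    """Load transformed rows into the target table."""
    conn.executemany(
        "INSERT INTO trades_clean (symbol, notional) VALUES (?, ?)", rows
    )
    conn.commit()

# Demo with an in-memory database standing in for PostgreSQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades_raw (symbol TEXT, price REAL, qty INTEGER)")
conn.execute("CREATE TABLE trades_clean (symbol TEXT, notional REAL)")
conn.executemany(
    "INSERT INTO trades_raw VALUES (?, ?, ?)",
    [("brn", 80.5, 10), ("wti", 76.0, 5)],
)
load(conn, transform(extract(conn)))
print(conn.execute("SELECT symbol, notional FROM trades_clean").fetchall())
# → [('BRN', 805.0), ('WTI', 380.0)]
```

In a production Beam or Kafka pipeline each stage would be a distributed transform rather than a plain function, but the extract/transform/load decomposition is the same.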
Cardiff, Wales, United Kingdom Hybrid / WFH Options
Maclean Moore
basis. ROLE: GCP DATA ENGINEER. LOCATION: NEWPORT OR CARDIFF (HYBRID). IR35 STATUS: INSIDE. LENGTH: 6 MONTHS. Required experience: expertise in Python and Dataflow/Apache Beam; experience handling streaming data; strong experience in database replication using message-based CDC; experience using Kafka implementations in a secured …
development and deployment of large-scale data streaming pipelines in GCP. Work on a data streaming POC. Experience required: expertise in Python and Dataflow/Apache Beam; experience handling streaming data; strong experience in database replication using message-based CDC; experience using Kafka implementations in a secured …
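The "message-based CDC" requirement above refers to replicating a database by consuming change-data-capture events from a message bus such as Kafka. Kafka and Dataflow/Beam themselves are omitted here; this is a minimal sketch, under an assumed message shape ({"op", "key", "row"} — real CDC tools such as Debezium use richer envelopes), of applying a stream of change events to an in-memory replica:

```python
def apply_cdc_event(replica, event):
    """Apply one change-data-capture message to a key-value replica.

    `event` uses an assumed shape: {"op": ..., "key": ..., "row": ...}.
    """
    op = event["op"]
    if op in ("insert", "update"):
        # Upsert: inserts and updates both set the row for the key.
        replica[event["key"]] = event["row"]
    elif op == "delete":
        # Deletes remove the key; tolerate out-of-order deletes.
        replica.pop(event["key"], None)
    else:
        raise ValueError(f"unknown op: {op}")
    return replica

# Replay a small stream of change events, as they might arrive from a topic.
replica = {}
events = [
    {"op": "insert", "key": 1, "row": {"name": "alice"}},
    {"op": "insert", "key": 2, "row": {"name": "bob"}},
    {"op": "update", "key": 1, "row": {"name": "alicia"}},
    {"op": "delete", "key": 2, "row": None},
]
for e in events:
    apply_cdc_event(replica, e)
print(replica)  # → {1: {'name': 'alicia'}}
```

In a Dataflow deployment this applier would sit behind a Kafka source transform, with the event envelope defined by the CDC connector rather than assumed as here.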
experience in Java. Strong experience with data quality standards and contribution to defining and monitoring data quality metrics and KPIs. Key skills needed: Flink, Beam, Kafka Connect, Java.