Azure and Snowflake, enhancing data capabilities for analytics and science. What you need: 3+ years of hands-on experience as a Data Engineer, building ETL pipelines and managing data lifecycle. Strong Python coding experience. 2+ years of commercial experience developing in Snowflake. Good understanding of cloud principles (ideally Azure but more »
the infrastructure that supports their cutting-edge trading strategies. Key Responsibilities: Design and manage scalable data pipelines. Integrate data from various sources and develop ETL processes. Optimise database systems for high performance. Work with data scientists and analysts to meet data needs. Automate data processing tasks to enhance efficiency. Implement more »
to combine the latest thinking with traditional military functions. *Must have Active SC Clearance* Role requirements Leveraging Azure cloud technologies for tasks such as ETL pipeline development, data warehousing, data lake creation, and data movement. Utilizing Azure data and analytics services, including but not limited to Azure Data Factory, Azure more »
and processes to enhance data operations. Requirements: Bachelor's or Master's degree in Computer Science, related field, or equivalent commercial experience. Proficiency in ETL/ELT processes and database systems. Strong programming skills in Python and experience with data operations within AWS. Background in event-based data operations. Excellent more »
DY5, Woodside, Dudley, West Midlands (County), United Kingdom Hybrid / WFH Options
Technical Placements Ltd
data analysis, particularly using data manipulation tools, Excel, PowerBI, ERP, and MRP systems. Strong proficiency in data preparation and data visualization. Familiarity with data integration, ETL processes, and data warehousing concepts. Excellent analytical, problem-solving, and communication skills. For a full job description or an informal chat about the role, please more »
Employment Type: Permanent
Salary: £30000 - £40000/annum Good benefits and Prospects
with real-time data analysis and financial systems (preferred). Knowledge of database design principles, performance optimization, and data modeling. Familiarity with data integration, ETL processes, and data warehousing. Excellent problem-solving skills and the ability to work effectively in a fast-paced environment. Strong communication and teamwork skills. A more »
in Python to work within a fast-paced Data team. You will work very closely with Quant Research, Trading, Quant Development and traditional ETL/ELT Data Engineering. In this latency-sensitive trading environment, your work is intrinsic to the firm's PnL. Your core focus will be on more »
a call on 0191 3387551. Keywords: Azure Data Factory, Azure Databricks, Databricks Lakehouse, MS Power BI, Power BI, Spark, Delta Lake, T-SQL, DevOps, ETL, Data Modelling, DAX, Data Warehousing, London more »
and manipulation · Analytical capability/Analytical skills: · Issues & risks identification & mitigation · Root cause analysis · Qualitative & quantitative analysis · End to end development of PowerBI Reports (ETL, data modelling, expression writing and report writing) · Experience with data mastery · Good practices in SQL development · Good understanding of relational databases · Advanced Excel (Power Query more »
knowledge in distributed systems, cloud architecture, and data pipelines. Proficiency in Python programming (knowledge of Scala or Rust is a plus). Familiarity with ETL principles in contemporary data applications (Dagster, Airflow, Prefect). Familiarity with AWS services such as Glue, Redshift, Athena, and S3. Proficiency with Terraform, Kubernetes, and more »
Northamptonshire, England, United Kingdom Hybrid / WFH Options
Capgemini
technologies like Python, Scala, Spark, PySpark or Ab Initio, Glue, Starburst, Snowflake, Redshift, Hybrid (on-prem to cloud) for building data pipelines and ETL processes. Experience with data pipeline orchestration tools like Apache Airflow. Knowledge of SQL and ability to write complex queries for data transformation and analysis. more »
Knutsford, England, United Kingdom Hybrid / WFH Options
Capgemini
technologies like Python, Scala, Spark, PySpark or Ab Initio, Glue, Starburst, Snowflake, Redshift, Hybrid (on-prem to cloud) for building data pipelines and ETL processes. Experience with data pipeline orchestration tools like Apache Airflow. Knowledge of SQL and ability to write complex queries for data transformation and analysis. more »
Ability to think strategically and assess options across short, medium, and long-term timeframes. Experience with data technologies including MS SQL, data warehousing, and ETL processes. Familiarity with microservices architecture, APIs, and integration patterns. Knowledge of DevOps practices and CI/CD pipelines. Understanding of security standards and optimal practices more »
contribute to the technical aspects of the upkeep of the data estate, focusing on hands-on development, enhancement, and maintenance of data models and ETL processes within the data estate. You will emphasise the importance of data quality and governance in your project work, contributing to the organisational data model more »
Central London, London, United Kingdom Hybrid / WFH Options
Recruitment Revolution
analytic role, with a focus on analysis, reporting and visualisation in Looker Studio + Highly proficient in GCP/GBQ, SQL, data modelling and ETL processes + Experience with Supermetrics, GitHub, DBT, Google BigQuery and Shopify + Strong communication and presentation skills to effectively convey insights to both technical more »
and Machine Learning engineers and it is responsible for supporting data scientists in deploying, maintaining and monitoring an increasing number of Python-based microservices, ETL pipelines, SaaS models, databases and vector stores. The MLOps Lead would need to act as an interface between data scientists, the data & analytics team and more »
Central and North West London NHS Foundation Trust
and social care data sets (MHSDS, CSDS, CDS, RAP etc.) · Excellent knowledge of concepts of data analysis and visualisation · Excellent knowledge of database management, ETL processes and transactional SQL · Understanding of information technology infrastructure requirements in relation to data flows · Excellent time and workload management skills · Able to manage and more »
Burton-On-Trent, Staffordshire, Burton upon Trent, United Kingdom
Michael Page
company data acquisition strategy allowing for near real-time process reporting and full support of contractual reporting to customers. Key Responsibilities: Monitor and maintain ETL processes to ensure the continuation of an accurate data reporting platform for the business. Explore solutions for optimising the performance of the strategic data platform … Extensive SQL knowledge (Microsoft SQL Server 2005+) and experience working with relational databases, query authoring (T-SQL). Microsoft SQL Server Integration Services (SSIS), including ETL/ELT design and development experience. Microsoft DevOps source control software and development lifecycle software. Developing and maintaining objects within Data Warehouses/Lakehouses. Experience more »
a strategic vision for data management information reporting within our organisation. Responsibilities Develop, support, and maintain our business reporting platform, including DataBlend for integrations and ETL, along with gathering requirements for business reporting across a range of systems. Create and maintain reports and dashboards in PowerBI to support business decisions and more »
Mathematics, or related field 📊 Experience : Proven Data Engineer experience, preferably in finance 💻 Skills : Python, SQL, NoSQL, big data tools; cloud platforms (AWS, Azure, GCP); ETL processes and database design Join us and make a difference more »
engineers. Proficiency in Azure (including services like Azure Data Factory, Azure Databricks, etc.). Strong programming skills in Python. Familiarity with data modeling, ETL processes, and data warehousing. Excellent communication and leadership abilities. Additional Information: This role offers £100k plus a performance-based bonus and a comprehensive benefits package. Candidates more »
key stakeholders and other business units to understand business needs and gather requirements. Required Experience Track record of designing and building data infrastructure and ETL pipelines. Experience with Azure Platform including Data Factory, Synapse, and Data Lake. DB development experience with SQL. Software development experience in relevant languages like Python more »
strategy constraints. Essential Skills: The ideal candidates will have a proven Senior Data & Analytics Development background, with the following skills/experience: Knowledge of ETL/ELT, data warehousing/business intelligence methodologies and best practice including dealing with big data, cloud technology and unstructured data and the relative required more »
a 50/50 split between BI reporting/analysis and data ingestion, and would suit someone with a couple of years' experience across Python, ETL, (AWS, Azure or GCP) and any BI tool (PowerBI, Tableau, Qlik, QuickSight, or GoodData - it's more about the transferable skills!). This is an more »
Python and its data processing, analysis, and visualization libraries Experienced with SQL and time-series databases Skilled in AWS services: S3, EC2, RDS Knowledgeable in ETL tools like Airflow Proficient in Git, CI/CD, testing tools, and documentation best practices Adheres to quality engineering practices including TDD and BDD Nice more »