HAVE SC CLEARANCE a. Overall 5-7 years of IT experience b. Strong experience with DevOps principles c. Strong experience in Spark, Tableau, Hadoop, PL/SQL d. Good experience working with the AWS platform e. Good exposure to ITIL processes, including incident, problem, change management etc. f. more »
South East London, England, United Kingdom Hybrid / WFH Options
J&C Associates
Job Description Title: DevSecOps Engineer Skills: DevOps, Spark, Hadoop, PL/SQL, Tableau Location: London - Hybrid Duration: 11 Months We are IT Recruitment Specialists partnered with a prestigious Global Consultancy who requires a DevSecOps Engineer for one of their Clients based in London (Hybrid). IR35: This role is … Inside of IR35. Job Description: a. Overall 5-7 years of IT experience b. Strong experience with DevOps principles c. Strong experience in Spark, Tableau, Hadoop, PL/SQL d. Good experience working with the AWS platform e. Good exposure to ITIL processes, including incident, problem, change management etc. f. Good experience using more »
manage multiple tasks and projects simultaneously. Preferred Qualifications AWS Certified Solutions Architect or other relevant AWS certifications. Experience with big data technologies such as Hadoop, Spark, or similar. Knowledge of data governance and data quality best practices. Familiarity with machine learning and AI concepts and tools. more »
Experience of Data Lake/Hadoop platform implementation Hands-on experience in implementation and performance tuning of Hadoop/Spark implementations Experience with Apache Hadoop and the Hadoop ecosystem Experience with one or more relevant tools (Sqoop, Flume, Kafka, Oozie, Hue, Zookeeper, HCatalog, Solr, Avro) Experience with one or more SQL-on-Hadoop technologies (Hive, Impala, Spark SQL, Presto) Experience developing software code in one or more programming languages (Java, Python, etc.) Preferred Qualifications Master's or PhD in Computer Science, Physics, Engineering or Math Hands-on experience leading large-scale global data warehousing and analytics projects Ability more »
Company Description Petrolink is a global independent and neutral wellsite data solutions company that provides services in major oil and gas regions worldwide. Our specialties include visualization, data analytics, and data interoperability. Our technologies and services drive down the cost more »
modern data engineering technology stack compatible with AWS. Experience with web scraping and other data ingestion methods and tools. Knowledge of distributed computing frameworks (Hadoop, Spark, Hive, Presto). Experience with data orchestration tools (Airflow, Orchestra, Azkaban). Expertise in cloud data warehousing and core data modelling concepts. Proficiency more »
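Purely as an illustrative sketch (not part of the listing above), the snippet below shows the sort of thing the orchestration-tool requirement refers to: a minimal Apache Airflow DAG with two dependent tasks. The DAG id, schedule and task bodies are hypothetical placeholders, assuming Airflow 2.x.

from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Placeholder: pull raw data from a source system.
    return "raw_rows"

def load():
    # Placeholder: write transformed data to the warehouse.
    pass

with DAG(
    dag_id="example_daily_ingest",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task        # run extract before load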
Surrey, England, United Kingdom Hybrid / WFH Options
The JM Longbridge Group
e.g., PostgreSQL), NoSQL databases (e.g., MongoDB), and implement data streaming solutions (e.g., Kafka) for efficient data handling. Work with big data technologies such as Hadoop to manage and analyse large datasets. Qualifications : Bachelor’s degree in computer science, Engineering, or related field (or equivalent experience). Experience with cloud more »
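As a hedged illustration of the Kafka streaming requirement (not drawn from the listing itself), here is a minimal kafka-python producer/consumer pair, assuming a broker on localhost:9092 and a hypothetical "events" topic.

import json
from kafka import KafkaProducer, KafkaConsumer

# Produce one JSON-encoded record to the hypothetical "events" topic.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("events", {"user_id": 42, "action": "login"})
producer.flush()

# Consume from the beginning of the topic and print the first record.
consumer = KafkaConsumer(
    "events",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for message in consumer:
    print(message.value)
    break  # stop after one record for this sketch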
Key Skills 3+ years of Python experience Highly statistical and analytical Exposure to Google Cloud Platform (BigQuery, GCS, Datalab, Dataproc, Cloud ML) (desirable) Spark & Hadoop experience Strong communication skills Good problem-solving skills Qualifications Bachelor's degree or equivalent experience in a quantitative field (Statistics, Mathematics, Computer Science, Engineering … and classification techniques, and algorithms Fluency in a programming language (Python, C, C++, Java, SQL) Familiarity with Big Data frameworks and visualization tools (Cassandra, Hadoop, Spark, Tableau) This is a permanent position and offers flexibility with hybrid working, 2-3 days per week in the office, depending on workload more »
large-scale data science/data analytics projects Ability to lead effectively across organizations Hands-on experience with Data Analytics technologies such as AWS, Hadoop, Spark, Spark SQL, MLlib or Storm/Samza Implementing AWS services in a variety of distributed computing, enterprise environments Proficiency with at least one more »
modeling, data access, and data storage techniques. Excellent problem-solving skills and the ability to think algorithmically. Desirable Skills: Knowledge of big data technologies (Hadoop, Spark, Kafka) is highly desirable. Familiarity with data governance and compliance requirements. more »
/CD. Strong design and coding skills (e.g. Python, Scala, JavaScript). Experience with Microsoft or AWS data stack e.g. Microsoft Azure Data Lake, Hadoop (preferably with Spark), Cosmos DB, HDInsight/HBase, MongoDB, Redis, Azure Table/Blob stores etc. Exposure to tools like SAP technologies and Alteryx more »
as DBT, FiveTran, etc. Understanding of Agile Delivery best practice Good knowledge of the relevant technologies e.g. SQL, Oracle, PostgreSQL, Python, ETL pipelines, Airflow, Hadoop, Parquet. Strong problem-solving and analytical abilities. Ability to present solutions and limitations to non-IT business experts ABOUT YOU Integrity, respect, intellectually curious more »
integration, transformation, and visualization capabilities. - Strong programming skills in Python, SQL, and other relevant languages. - Experience with big data technologies and tools such as Hadoop, Spark, and Kafka. - Familiarity with cloud platforms (AWS, Azure, GCP) and containerization technologies (Docker, Kubernetes). - Soft Skills: - Excellent problem-solving and analytical skills. more »
ideas • Ability to set the direction and deliver on a vision with forward planning to achieve results • Technical knowledge of big data platforms (e.g., Hadoop and Hive) as well as knowledge of ML, Data science and advanced modelling techniques, technologies, and programming languages • Possess a high degree of self more »
and classification techniques, and algorithms Fluency in a programming language (Python, C, C++, Java, SQL) Familiarity with Big Data frameworks and visualization tools (Cassandra, Hadoop, Spark, Tableau more »
HAVE SC CLEARANCE a. Overall 5-7 years of IT experience b. Strong experience with DevOps principles c. Strong experience in Spark, Tableau, Hadoop, PL/SQL d. Good experience working with the AWS platform e. Good exposure to ITIL processes, including incident, problem, change management etc. f. more »
Red Hat Decision Central Key Responsibilities: Manage operational procedures. Transform and process data using Apache Spark. Administer AWS RDS with MySQL. Work with the Hadoop platform. Create reports using Tableau. Utilize Red Hat Decision Central. About Capgemini Capgemini is a global leader in partnering with companies to transform and more »
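For illustration only (the listing contains no code), a minimal PySpark sketch of the "transform and process data using Apache Spark" responsibility; the input path and column names are hypothetical placeholders.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example_transform").getOrCreate()

# Read raw CSV data (hypothetical path and columns), cast and aggregate.
df = spark.read.option("header", True).csv("/data/raw/events.csv")
summary = (
    df.withColumn("amount", F.col("amount").cast("double"))
      .groupBy("customer_id")
      .agg(F.sum("amount").alias("total_amount"))
)

# Write the curated result back out as Parquet (hypothetical path).
summary.write.mode("overwrite").parquet("/data/curated/customer_totals")

spark.stop()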
Leeds, Yorkshire, United Kingdom Hybrid / WFH Options
LSA Recruit
integration, transformation, and visualization capabilities. - Strong programming skills in Python, SQL, and other relevant languages. - Experience with big data technologies and tools such as Hadoop, Spark, and Kafka. - Familiarity with cloud platforms (AWS, Azure, GCP) and containerization technologies (Docker, Kubernetes). Soft Skills: - Excellent problem-solving and analytical skills. more »
3 Development resources (London) with experience in designing and building platforms, and supporting applications both in cloud environments and on-premises. These resources are expected to be open-source contributors to Apache projects, have an in-depth understanding of the more »
consulting environment • Current or previous consulting experience highly desirable • Experience of working with companies in the finance sector highly desirable • Platform implementation experience (Apache Hadoop, Kafka, Storm and Spark, Elasticsearch and others) • Experience around data integration & migration, data governance, data mining, data visualisation, database modelling in an agile delivery more »
with JIRA, strong knowledge of Python for automation, and expertise in test strategy, test management, and defect management. Additionally, familiarity with SQL, Big Data (Hadoop), ETL, and a basic understanding of AWS is desirable. Responsibilities: Configure and manage JIRA projects, dashboards, defects, and test cases. Develop and maintain in-house … on experience with Python scripting. Strong understanding of test management and defect tracking. Knowledge of SQL for querying and reporting. Familiarity with Big Data (Hadoop) and ETL processes. Basic understanding of AWS services. If you are interested, apply here more »
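As a hedged sketch of the JIRA-plus-Python automation this listing describes (not taken from the advert), using the third-party jira package; the server URL, credentials and project key are placeholders.

from jira import JIRA

# Hypothetical server and credentials.
client = JIRA(server="https://example.atlassian.net",
              basic_auth=("user@example.com", "api-token"))

# Pull open defects from a hypothetical project and print a short report.
for issue in client.search_issues("project = QA AND issuetype = Bug AND status != Done"):
    print(issue.key, issue.fields.status.name, issue.fields.summary)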
Romsey, Hampshire, United Kingdom Hybrid / WFH Options
CBSbutler Holdings Limited trading as CBSbutler
Experience with cloud technologies e.g. AWS or Azure Programming language experience e.g. Java, Python, Node.js or SQL Data technologies experience e.g. PostgreSQL, MongoDB, Kafka, Hadoop If you are interested in discussing this DevOps Engineer role further, please apply or send a copy of your updated CV to (url removed more »
Woking, Surrey, United Kingdom Hybrid / WFH Options
CBSbutler Holdings Limited trading as CBSbutler
Experience with cloud technologies e.g. AWS or Azure Programming language experience e.g. Java, Python, Node.js or SQL Data technologies experience e.g. PostgreSQL, MongoDB, Kafka, Hadoop If you are interested in discussing this DevOps Engineer role further, please apply or send a copy of your updated CV to (url removed more »
looking for you to demonstrate include: Experience of data storage technologies: Delta Lake, Iceberg, Hudi Sound knowledge and understanding of Apache Spark, Databricks or Hadoop Ability to take business requirements and translate these into tech specifications Knowledge of Architecture best practices and patterns Competence in evaluating and selecting development more »
Bristol, England, United Kingdom Hybrid / WFH Options
Made Tech
how one could deploy infrastructure into different environments. Knowledge of handling and transforming various data types (JSON, CSV, etc.) with Apache Spark, Databricks or Hadoop Good understanding of possible architectures involved in modern data system design (Data Warehouses, Data Lakes, Data Meshes) Ability to create data pipelines on a more »