Freelance Senior Data Engineer

Job Description

Senior Data Engineer


Start - ASAP


Duration - 3 months


Rates - TBC


Location - Chancery Lane


Hybrid - 3 days onsite / 2 days remote


We are seeking a proactive and self-motivated Senior Data Engineer with a proven track record of building scalable, cloud-based data solutions across multiple cloud platforms to support our work architecting, building and maintaining our data infrastructure. The initial focus of this role will be GCP; however, experience with Snowflake and Databricks is also required.


As a senior member of the data engineering team, you will play a pivotal role in designing scalable data pipelines, optimising data workflows, and ensuring data availability and quality for production systems.


The ideal candidate brings deep technical expertise in AWS, GCP and/or Databricks, alongside essential hands-on experience building pipelines in Python, analysing data requirements with SQL, and applying modern data engineering practices. Your ability to work across business and technology functions, drive strategic initiatives, and solve problems independently will be key to success in this role.

Qualifications:


Experience:


  • 7+ years of experience in data engineering and solution delivery, with a strong track record of technical leadership.
  • Deep understanding of data modeling, data warehousing concepts, and distributed systems.
  • Excellent problem-solving skills and the ability to independently design, build and validate output data.
  • Deep proficiency in Python (including PySpark), SQL, and cloud-based data engineering tools.
  • Expertise in multiple cloud platforms (AWS, GCP, or Azure) and managing cloud-based data infrastructure.
  • Strong background in database technologies (SQL Server, Redshift, PostgreSQL, Oracle).


Desirable Skills:


  • Familiarity with machine learning pipelines and MLOps practices.
  • Additional experience with Databricks and specific AWS services such as Glue, S3 and Lambda.
  • Proficient in Git, CI/CD pipelines, and DevOps tools (e.g., Azure DevOps).
  • Hands-on experience with web scraping, REST API integrations, and streaming data pipelines.
  • Knowledge of JavaScript and front-end frameworks (e.g., React).


Key Responsibilities:


  • Architect and maintain robust data pipelines (batch and streaming) integrating internal and external data sources (APIs, structured streaming, message queues etc.).
  • Collaborate with data analysts, scientists, and software engineers to understand data needs and develop solutions.
  • Understand requirements from operations and product teams to ensure data and reporting needs are met.
  • Implement data quality checks, data governance practices, and monitoring systems to ensure reliable and trustworthy data.
  • Optimize performance of ETL/ELT workflows and improve infrastructure scalability.

Location:
City of London
Category:
Technology