
Flashfood

Data Engineer

In-Office or Remote
Hiring Remotely in Toronto, ON

Location - Toronto Hub

Summary

We are seeking a talented and experienced Data Engineer to join our growing team. The Data Engineer will play a key role in designing, building, and maintaining our data infrastructure, ensuring scalability, reliability, and performance.

Who You Are: 

  • An engineer at heart who takes pride in a clean and powerful code base.
  • Deep knowledge of Data Architecture and Data Engineering best practices, with a passion for making data accessible.
  • Enjoys sharing knowledge and working in a collaborative team environment.
  • Ready to take ownership and make decisions.
  • Data governance, security, and compliance are always top of mind.
  • Previous experience in ETL/ELT Architecture: Expertise in designing and optimizing ETL/ELT pipelines to handle various data formats (CSV, JSON, Parquet, etc.) and integrating data from multiple sources (e.g., APIs, cloud storage, client Snowflake shares).
  • Strong understanding of REST API principles, experience with high-volume API requests, and ability to optimize API calls and data ingestion strategies.
  • Proficiency with orchestration tools such as Apache Airflow to automate and manage data workflows (a brief sketch follows this list).
  • Expertise in building efficient ETL/ELT workflows to enable scalable feature engineering.
  • Previous experience in performance testing and optimization (data load testing, performance tuning, and monitoring) for various databases and ETL pipelines.
  • Experience building and testing resilient infrastructure using IaC and cloud-specific features for disaster recovery.
  • Experience working in an Agile environment.
  • Experience building data products in large-scale distributed systems.
  • Knowledge of industry best practices and compliance standards such as DAMA, CCPA, and PIPEDA.
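
To make the orchestration expectations concrete, below is a minimal sketch of the kind of ETL/ELT workflow described above, assuming Airflow 2.4+ with the TaskFlow API. The DAG name, endpoint URL, and staging logic are hypothetical placeholders for illustration, not Flashfood's actual pipeline.

```python
# Hypothetical daily ELT DAG: pull rows from a REST API and stage them.
# Assumes Airflow 2.4+ (for the `schedule` argument) and the requests library.
from datetime import datetime

import requests
from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def orders_elt():
    @task
    def extract() -> list[dict]:
        # Single-page pull for brevity; a real pipeline would paginate,
        # retry on transient failures, and throttle high-volume requests.
        resp = requests.get("https://api.example.com/v1/orders", timeout=30)
        resp.raise_for_status()
        return resp.json()["results"]

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder staging step; in production this might write Parquet
        # to cloud storage and hand off to Snowpipe or COPY INTO.
        print(f"staging {len(rows)} rows")

    load(extract())


orders_elt()
```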

What You Will Do:

  • Work with business partners and stakeholders to understand data requirements and support data-related projects.
  • Work with engineering, product teams, and third parties to collect required data.
  • Drive data modeling and warehousing initiatives to enable and empower data consumers.
  • Develop ETL/ELT pipelines to ingest, prepare and store data for the product, analysis, and data science.
  • Develop and implement data quality checks, conduct QA, and implement monitoring routines (see the sketch after this list).
  • Improve the reliability and scalability of our ETL processes.
  • Develop and manage backup and recovery strategies to ensure data integrity and availability.
  • Ensure our system is architected to balance cost and latency.
  • Collaborate with partners to execute compliance, security, and privacy reviews/audits.
  • Deploy data infrastructure to support product, analysis, and data science.
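
As one illustration of the quality checks and monitoring mentioned above, here is a minimal, self-contained Python sketch. The column names, rules, and sample batch are hypothetical; a real deployment would more likely wire a framework such as Great Expectations or dbt tests into the pipeline.

```python
# Hypothetical batch-level data quality check: flag empty batches,
# duplicate keys, and invalid amounts before loading downstream.
def check_orders(rows: list[dict]) -> list[str]:
    """Return human-readable failures; an empty list means the batch passes."""
    failures = []
    if not rows:
        failures.append("batch is empty")
    seen_ids = set()
    for row in rows:
        order_id = row.get("order_id")
        if order_id in seen_ids:
            failures.append(f"duplicate order_id {order_id}")
        seen_ids.add(order_id)
        amount = row.get("amount")
        if amount is None or amount < 0:
            failures.append(f"invalid amount on order {order_id}")
    return failures


if __name__ == "__main__":
    sample = [{"order_id": 1, "amount": 9.99}, {"order_id": 1, "amount": -5.0}]
    for problem in check_orders(sample):
        print("QUALITY CHECK FAILED:", problem)
```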

Qualifications: 

  • Education: Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • Experience: Minimum of 4 years working with databases, preferably in a platform tooling and automation environment.
  • Technical Skills: Programming languages and tools (PySpark, Scala, Python, SnowSQL, Snowpipe, SQL, Terraform), data orchestration and automation (Airflow, Kubernetes, or similar), cloud infrastructure and data management systems (MongoDB, Snowflake, Databricks, and Azure or similar).
  • Problem-Solving: Strong analytical and problem-solving skills.
  • Communication: Excellent verbal and written communication skills.
  • Team Player: Ability to work effectively in a collaborative team environment.
  • Knowledge of DevOps and mobile software development practices and tools.
