Docker, Inc

Senior Data Engineer

Posted 13 Days Ago
Remote
7 Locations
Mid level
The Senior Data Engineer will design and manage data pipelines, ETL processes, and data models, ensuring compliance and supporting machine learning infrastructure.

Docker is a remote-first company with employees across Europe, APAC, and the Americas that simplifies the lives of developers who are making world-changing apps. We continued to see exponential revenue growth last year. Join us for a whale of a ride!

Docker is looking for a Senior Data Engineer to join our Data Engineering team, which is led by our Director of Data Engineering. The team transforms billions of data points generated by Docker products and services into actionable insights that directly influence product strategy and development. You'll leverage both software engineering and analytics skills as part of the team responsible for managing data pipelines across the company: Sales, Marketing, Finance, HR, Customer Support, Engineering, and Product Development.

In this role, you'll help design and implement event ingestion, data models, and ETL processes that support mission-critical reporting and analysis while building in mechanisms to support our privacy and compliance posture. You will also lay the foundation for our ML infrastructure to support data scientists and enhance our analytics capabilities. Our data stack consists of Snowflake as the central data warehouse, dbt/Airflow as the orchestration layer, and Looker for visualization and reporting. Data flows in from Segment, Fivetran, S3, Kafka, and a variety of other cloud sources and systems. You'll work with other data engineers, analysts, and subject matter experts to deliver impactful outcomes to the organization. As the company grows, ensuring reliable and secure data flows to all business units and surfacing insights and analytics is a huge and exciting challenge!
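To make the pattern concrete, the flow described above (land raw events, then materialize an aggregated model for reporting) can be sketched as a toy example. This is purely illustrative: `sqlite3` stands in for a warehouse like Snowflake, and all table and column names are invented for the sketch, not Docker's actual schema.

```python
import sqlite3

# Illustrative sketch only: sqlite3 stands in for a cloud warehouse,
# and the table/column names here are hypothetical.
def load_events(conn, events):
    """Ingestion step: land raw product events in a staging table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS raw_events (user_id TEXT, event TEXT, ts TEXT)"
    )
    conn.executemany("INSERT INTO raw_events VALUES (?, ?, ?)", events)

def build_daily_model(conn):
    """Transform step: aggregate raw events into a derived reporting
    table, the kind of model a dbt run would materialize."""
    conn.execute("DROP TABLE IF EXISTS daily_event_counts")
    conn.execute(
        """CREATE TABLE daily_event_counts AS
           SELECT substr(ts, 1, 10) AS day, event, COUNT(*) AS n
           FROM raw_events
           GROUP BY day, event"""
    )

conn = sqlite3.connect(":memory:")
load_events(conn, [
    ("u1", "pull", "2024-05-01T10:00:00"),
    ("u2", "pull", "2024-05-01T11:00:00"),
    ("u1", "push", "2024-05-02T09:00:00"),
])
build_daily_model(conn)
rows = conn.execute(
    "SELECT day, event, n FROM daily_event_counts ORDER BY day, event"
).fetchall()
print(rows)
# [('2024-05-01', 'pull', 2), ('2024-05-02', 'push', 1)]
```

In the real stack, an orchestrator such as Airflow would schedule these steps as DAG tasks and the transform would live in versioned dbt models rather than inline SQL.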

Responsibilities:

  • Manage and develop ETL jobs, warehouse, and event collection tools to process, validate, transport, collate, aggregate, and distribute data

  • Build and manage the Central Data Model that powers most of our reporting

  • Integrate emerging methodology, technology, and version control practices that best fit the team

  • Build data pipelines and tooling to support our ML and AI projects

  • Help enforce SOC 2 compliance across the data platform

  • Support and enable our stakeholders and other data practitioners across the company

  • Write and maintain documentation of technical architecture

Qualifications:

  • 4+ years of relevant industry experience

  • Experienced in data modeling and building scalable data pipelines involving complex transformations

  • Proficiency working with a Data Warehouse platform (Snowflake or BigQuery preferred)

  • Experience with data governance, data access, and security controls. Experience with Snowflake and dbt is strongly preferred

  • Experience creating production-ready ETL scripts and pipelines using Python and SQL and using orchestration frameworks such as Airflow/Dagster/Prefect

  • Experience designing and deploying high-performance systems with reliable monitoring and logging practices

  • Familiarity with at least one cloud ecosystem: AWS, Azure, or Google Cloud

  • Experience with a comprehensive BI and visualization framework such as Tableau or Looker

  • Experience working in an agile environment on multiple projects and prioritizing work based on organizational priorities

  • Strong verbal and written English communication skills

What to expect in the first 30 days:

  • Onboard and meet data engineers, analysts, and key stakeholders and attend team meetings

  • Develop an understanding of the current data architecture and pipelines 

  • Review current projects, roadmap, and priorities

  • Identify areas for quick wins for improving data engineer and analyst experience 

  • Understand our privacy and compliance requirements and current design/workflows

What to expect in the first 90 days:

  • Contribute meaningfully to data engineering projects

  • Recommend opportunities for continuous improvement for data pipelines and infrastructure


We use Covey as part of our hiring and/or promotional process for jobs in NYC, and certain features may qualify it as an AEDT. As part of the evaluation process, we provide Covey with job requirements and candidate-submitted applications. We began using Covey Scout for Inbound on April 13, 2024.

Please see the independent bias audit report covering our use of Covey here.

Perks (for Full-Time Employees Only)

  • Freedom & flexibility; fit your work around your life

  • Designated quarterly Whaleness Days

  • Home office setup; we want you comfortable while you work

  • 16 weeks of paid parental leave

  • Technology stipend equivalent to $100 net/month

  • PTO plan that encourages you to take time to do the things you enjoy

  • Quarterly, company-wide hackathons

  • Training stipend for conferences, courses and classes

  • Equity; we are a growing start-up and want all employees to have a share in the success of the company

  • Docker Swag

  • Medical benefits, retirement and holidays vary by country

Docker embraces diversity and equal opportunity. We are committed to building a team that represents a variety of backgrounds, perspectives, and skills. The more inclusive we are, the better our company will be.

Due to the remote nature of this role, we are unable to provide visa sponsorship.

#LI-REMOTE

Top Skills

Airflow
AWS
Azure
dbt
Docker
GCP
Looker
Python
Snowflake
SQL
