
Marcura

Data Engineer

In-Office or Remote
3 Locations
Mid level
The Data Engineer will focus on data engineering best practices, data transformation using DBT, managing data pipelines, ensuring data governance, and maintaining data security within a remote team environment.

About Marcura:

Marcura is a global leader in maritime technology and operations, supporting nearly one‑third of the world’s seaborne commodity trade. Our trusted platforms, which span software, data intelligence, and payments, sit at the centre of digital transformation across the maritime industry. We are now seeking a Data Engineer to join our high‑impact team and contribute to the success of one of the sector’s most forward‑looking organisations.

About the Role:

You will bring domain expertise in data engineering to the team, including ownership of ETL processes built with modern tools and methodologies. You will play a key role in building scalable data structures, with a specific focus on implementing Data Vault 2.0 to ensure a flexible and auditable data foundation.

Roles and Responsibilities:

1. Data engineering best practices

  • You will help the data team adhere to data engineering best practices across pipeline design, data quality monitoring, storage, versioning, security, testing, documentation, cost, and error handling (a small sketch of this defensive style follows below).
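
As an example of the testing and error-handling flavour of these practices, here is a minimal sketch of a pipeline step that validates a batch before publishing it. The function name, key field, and threshold are illustrative assumptions, not part of Marcura's actual codebase.

```python
# Minimal sketch: validate a batch before loading it to the warehouse.
# Illustrative only; field names and thresholds are assumptions.
import logging

logger = logging.getLogger("pipeline")

def publish_if_valid(rows: list[dict], min_rows: int = 1) -> None:
    # Error handling: fail loudly with context rather than loading bad data.
    if len(rows) < min_rows:
        raise RuntimeError(f"Expected >= {min_rows} rows, got {len(rows)}")
    # Data quality: reject records missing their primary key.
    missing_key = [r for r in rows if not r.get("id")]
    if missing_key:
        raise ValueError(f"{len(missing_key)} rows missing 'id'; aborting load")
    logger.info("Validated %d rows; publishing to warehouse", len(rows))
```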

2. Data transformation in DBT

  • Ensure that the daily DBT build is successful, including full test coverage of existing models (see the sketch after this list).
  • Create new data models in collaboration with the data analysts, utilizing Data Vault 2.0 principles where appropriate to handle complex data relationships and historical tracking.
  • Add new tests to enhance data quality and maintain the integrity of the data warehouse.
  • Incorporate new data sources into the warehouse architecture.
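
To make "the daily DBT build is successful" concrete, here is a minimal sketch of invoking the build programmatically. It assumes dbt-core 1.5+ (which exposes the dbtRunner API); the project directory and flags are illustrative, not Marcura's actual configuration.

```python
# Minimal sketch: run the daily `dbt build` (models, tests, seeds, snapshots)
# from Python. Assumes dbt-core >= 1.5; paths and flags are illustrative.
from dbt.cli.main import dbtRunner, dbtRunnerResult

def run_daily_build() -> bool:
    runner = dbtRunner()
    # `build` executes models and their tests in DAG order, so a green
    # build means the existing test coverage passed as well.
    result: dbtRunnerResult = runner.invoke(
        ["build", "--project-dir", "warehouse", "--fail-fast"]
    )
    return result.success

if __name__ == "__main__":
    raise SystemExit(0 if run_daily_build() else 1)
```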

3. Data extraction

  • Develop and maintain our data pipelines in Stitch, Fivetran, Segment, and Apache Airflow (Google Cloud Composer); a minimal DAG sketch follows this list.
  • Evaluate when it's appropriate to use managed tools versus building custom data pipelines in Cloud Composer.
  • Ensure that data extraction jobs run successfully daily.
  • Collaborate with engineers from MarTrust to add new data sets to our data extraction jobs.
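
Below is a minimal sketch of the kind of daily extraction DAG you would maintain in Cloud Composer. It assumes Airflow 2.4+ with the TaskFlow API; the source name and load logic are hypothetical placeholders.

```python
# Minimal sketch of a daily extraction DAG for Cloud Composer.
# Assumes Airflow 2.4+; source names and load logic are placeholders.
from datetime import datetime, timedelta

from airflow.decorators import dag, task

@dag(
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
    tags=["extraction"],
)
def daily_extraction():
    @task
    def extract_source(source: str) -> int:
        # Placeholder: pull from the source system and land raw rows.
        rows_loaded = 0
        return rows_loaded

    @task
    def check_freshness(rows_loaded: int) -> None:
        # Fail the run (and trigger alerting) if nothing was extracted.
        if rows_loaded == 0:
            raise ValueError("Extraction loaded 0 rows; check upstream source.")

    check_freshness(extract_source("example_source"))

daily_extraction()
```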

4. Data warehousing in BigQuery

  • Ensure that the data in our data warehouse is kept secure and that daily jobs in BigQuery run successfully.
  • Support the evolution of our BigQuery schema to accommodate Data Vault 2.0 structures (Hubs, Links, and Satellites) for long-term scalability, as sketched below.
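
For reference, the sketch below shows the hash-key pattern underlying Data Vault 2.0 Hubs, Links, and Satellites. The delimiter, normalisation rules, and example business keys (an IMO ship number and a voyage reference) are illustrative conventions, not Marcura's actual schema.

```python
# Minimal sketch of Data Vault 2.0 hash keys. Conventions (delimiter,
# normalisation, example keys) are illustrative assumptions.
import hashlib

DELIM = "||"  # separator between business-key parts

def hash_key(*business_keys: str) -> str:
    """MD5 over normalised business keys; used as a Hub/Link primary key."""
    normalised = DELIM.join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalised.encode("utf-8")).hexdigest()

# Hub: one row per unique business key (e.g. an IMO ship number).
hub_vessel_hk = hash_key("9321483")

# Link: hash over the combined keys of the Hubs it connects.
link_voyage_hk = hash_key("9321483", "VOYAGE-2024-001")

# Satellite: rows hang off a Hub/Link key and carry a hashdiff of their
# descriptive attributes, so only genuine changes add history rows.
sat_vessel_hashdiff = hash_key("PANAMAX", "82000", "2015")
```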

5. Data Governance and Security

  • Data Quality (DQ): Implement and monitor automated data quality checks and observability to ensure the accuracy and reliability of downstream reporting.
  • Access Control: Manage and enforce granular access control policies (IAM) within BigQuery and GCP to ensure data is only accessible to authorized users (sketched after this list).
  • Governance: Ensure all data processes comply with security standards and data privacy regulations, maintaining clear documentation of lineage and metadata.
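
As one concrete shape this work can take, the sketch below grants dataset-level read access with the google-cloud-bigquery client. The project, dataset, and user are hypothetical placeholders; in practice such policy would typically live in infrastructure-as-code rather than ad-hoc scripts.

```python
# Minimal sketch of granular, dataset-level access control in BigQuery.
# Project, dataset, and user below are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")
dataset = client.get_dataset("example-project.finance_mart")

# Grant read-only access to one authorised user, nothing broader.
entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role="READER",
        entity_type="userByEmail",
        entity_id="analyst@example.com",
    )
)
dataset.access_entries = entries
client.update_dataset(dataset, ["access_entries"])
```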

Requirements
  • Data Modeling: Solid understanding and hands-on experience with Data Vault 2.0 methodologies.
  • GCP Infrastructure: Experience with Google BigQuery and Cloud Composer (Apache Airflow).
  • Modern Data Stack: Proficiency in DBT for data transformation and data quality testing.
  • Governance & Security: Practical experience managing data access controls, security best practices, and DQ frameworks.
  • Pipeline Tools: Experience with managed ELT services like Fivetran, Stitch, or Segment.
  • Remote Work: Ability to work effectively in a fully remote, distributed team environment.

Benefits

Competitive Salary and Bonus: We reward your expertise and contributions.

Inclusive Onboarding Experience: Our onboarding program is designed to set you up for success right from day one.

Marcura Wellness Zone: We value your work-life balance and well-being.

Global Opportunities: Be part of an ambitious, expanding company with a local touch.

Diverse, Supportive Work Culture: We’re committed to inclusion, diversity, and a sense of belonging for all team members.

Top Skills

Apache Airflow
Cloud Composer
Data Vault 2.0
DBT
Fivetran
Google BigQuery
Segment
Stitch
