
Lantern (lantern.ai)

Senior Data Engineer

Posted Yesterday
Hybrid
London, Greater London, England
Senior level
As a Senior Data Engineer, you will create and maintain data pipelines, optimize data delivery, and build analytics models to provide insights, while working closely with various teams to enhance private market data intelligence.

About us

Our mission is to transform how private markets use business intelligence. The Lantern vision is for GPs to use data to drive better, faster decision-making through deeper insights built on centralised, validated data.

To support our ambitious plans, we are looking for an exceptional Data Engineer to join us as we innovate private capital software, reshape its landscape, and bring our products to market.

Our Data Stack

  • Python - powers our EL services, alongside a modern data stack (e.g. Fivetran, Databricks)
  • Snowflake and PostgreSQL - run our databases
  • dbt Core - hosts our extensive library of data models
  • Azure - hosts our infrastructure outside the data stack (e.g. Databricks)

About you

As a Data Engineer, you will join a team with ambitious plans to power Private Market data intelligence and insights. Reporting to the Head of Data and working with our Data Engineering, Data Science, Product, and Development squads, you will become a key member in our growth journey.

Key Responsibilities

  • Create and maintain data pipelines
  • Assemble large, complex data sets to meet product and business requirements
  • Identify, design, and implement internal process improvements, e.g.
    • automating manual processes
    • optimising data delivery
    • re-designing infrastructure for greater scalability
  • Construct the pipelines necessary for efficient extraction, transformation, and loading of data from diverse sources
  • Build analytics models or tools that utilise our data pipelines to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
  • Work with stakeholders to assist with data-related infrastructure needs and issues
  • Design pipelines with a security-minded focus to protect end-user data
  • Work with data and analytics experts to strive for greater functionality in our data systems

Requirements

On-the-job Experience

  • 4+ years of experience in a Data Engineer or Analytics Engineering role
  • Plus experience:
    • Manipulating, processing, modelling, and extracting value from large, disconnected datasets
    • With relational databases and complex query authoring
    • With data warehouse design and modelling
    • Building extract processes from APIs, SFTP, and databases
    • With project management and documentation software (Jira and Confluence preferred)

Technical Capabilities

  • Expert proficiency in Python, with a strong focus on code quality, unit testing, and E2E testing
  • Proficiency in object-oriented development practices
  • Experience building and maintaining dbt models
  • Advanced SQL knowledge and experience working with a variety of SQL databases (preferably Snowflake and Postgres)
  • Experience with version control software (preferably Git and GitHub)
  • Familiarity with cloud data services (preferably Azure)

Nice to Have

  • Experience in financial services, preferably private equity
  • Experience with dbt and analytics engineering best practices
  • Familiarity with Snowflake performance and modelling patterns
  • Experience with data governance, metadata, or lineage tooling
  • Experience with Dagster or Prefect
  • Experience with Snowflake cost optimisation

Character

  • Humble – aware of your strengths and open about your learning areas.
  • Growth-minded – committed to improving your skills and raising the bar as a team.
  • Detail-oriented – accuracy matters, especially with financial data.

Top Skills

Azure
Databricks
dbt Core
Fivetran
Postgres
Python
Snowflake
