
interactive investor

Data Engineer

Posted 16 Days Ago
In-Office
Manchester, Greater Manchester, England
Junior

WHO WE ARE:

interactive investor is an award-winning investment platform that puts its customers in control of their financial future.

We’ve been helping investors for nearly 30 years. We’ve seen market highs and lows and been resilient throughout. We’re now the UK’s number one flat-fee investment platform, with assets under administration approaching £75 billion and over 450,000 customers.

For a simple, flat monthly fee we provide a secure home for your pensions, ISAs and investments. We offer a wide choice of over 20,000 UK and international investment options, including shares, funds, trusts and ETFs.

We also bring impartial, expert content from our award-winning financial journalists, highly engaged community of investors, and daily newsletters and insights.

PURPOSE OF ROLE:

The Data Engineer role will help ii design, build, and continually improve the firm's Data Platform, consolidating datasets such as customer, transaction, marketing, web analytics, and market data into a trusted, comprehensive source for analytics and Data Science/ML/AI. You will design, build, and run robust Python/SQL pipelines, orchestrated with Dagster, to deliver and transform data in our Snowflake Data Warehouse.
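The extract-transform-load flow described above can be sketched schematically. This is a library-free illustration of the pattern, not ii's actual code: in practice each step would be a Dagster asset materialising into Snowflake, and all names (`fetch`, `with_retries`, the sample records) are hypothetical.

```python
# Schematic extract -> transform -> load flow with per-step retries,
# the kind of policy an orchestrator such as Dagster applies for you.
# All function and field names are illustrative.
import time

def with_retries(fn, attempts=3, delay=0.0):
    """Call fn, retrying on failure up to `attempts` times."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except Exception:
            if attempt == attempts:
                raise
            time.sleep(delay)

def extract():
    # Stand-in for pulling raw records from a source system or API.
    return [{"id": 1, "amount": "12.50"}, {"id": 2, "amount": "3.99"}]

def transform(rows):
    # Cast raw string fields into warehouse-ready types.
    return [{"id": r["id"], "amount": float(r["amount"])} for r in rows]

def load(rows, warehouse):
    # Stand-in for an INSERT/MERGE into a warehouse table.
    warehouse.extend(rows)
    return len(rows)

warehouse = []
loaded = with_retries(lambda: load(transform(extract()), warehouse))
print(loaded)  # 2
```

In a real deployment the retry/backoff settings, schedules, and alerting would live in the orchestrator's configuration rather than hand-rolled helpers like this one.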

The role also partners with the wider Data and Innovation team and business stakeholders on Intelligent Automation—embedding AI agents within data workflows to replace manual, data‑heavy processes—while maintaining high standards of data quality, security, and governance.

While our Data Analysts primarily build and maintain reporting outputs, you should be comfortable presenting data via Streamlit and occasionally Power BI.

Work may span Microsoft Azure, Amazon Web Services, and Google Cloud, leveraging their AI agent feature sets where appropriate.

We are seeking candidates with a range of experience levels, from Junior to Senior and Lead positions.


Requirements

KEY RESPONSIBILITIES:
  • Support and monitor the daily Data Platform build; investigate and resolve issues from overnight jobs
  • Orchestrate reliable ELT/ETL pipelines using Dagster (assets, jobs, schedules, sensors), implementing dependency management, retries, backfills, SLAs, and alerting to populate the warehouse (star schemas, snapshot tables, slowly changing dimensions)
  • Provide reusable SQL queries and data extracts to Data Analysts and business users; promote self‑service patterns
  • Monitor and triage Data Request tickets for ad‑hoc data needs across business functions, including legacy transaction record requests
  • Maintain clear data lineage and up‑to‑date data cataloguing and dictionaries; advise on the most appropriate fields for specific use cases
  • Partner with business stakeholders to identify manual, data‑heavy processes and redesign them as automated pipelines, incorporating AI agents into data workflows (e.g., classification, enrichment, reconciliation, document processing, alerting)
  • Integrate pipelines with business systems, APIs, and webhooks; implement scheduling, retries, and alerting through orchestration
  • Apply data quality checks, validation rules, and observability (e.g., automated tests, SLAs, anomaly detection)
  • Tune performance and cost (query optimisation, partitioning/clustering in Snowflake, indexing, caching, efficient storage formats such as Parquet/Delta)
  • Practice strong DataOps: Git‑based version control, pull requests/code reviews, CI/CD, environment promotion
  • Ensure compliance with data privacy, security, and regulatory requirements; uphold access controls, encryption, and auditability
  • Maintain documentation for the Data Warehouse, including design documentation to accompany new scripts/processes and the corresponding Data Dictionaries
  • As required, develop and support new and existing data outputs via Streamlit and occasionally Power BI dashboards/reports for operational MI and ad‑hoc analysis
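One concept from the responsibilities above, the Type 2 slowly changing dimension, can be illustrated compactly. This is a pure-Python sketch over dicts for clarity; in the warehouse itself this would typically be a MERGE against a Snowflake dimension table, and all keys and values here are invented.

```python
# Type 2 SCD sketch: when an attribute changes, close out the current row
# (set valid_to, clear is_current) and append a new current version,
# preserving full history. Keys/values are illustrative.
def scd2_apply(dimension, incoming, as_of):
    """Close changed rows and append new current versions; returns the dimension."""
    current = {r["key"]: r for r in dimension if r["is_current"]}
    for rec in incoming:
        existing = current.get(rec["key"])
        if existing is None or existing["value"] != rec["value"]:
            if existing is not None:
                existing["valid_to"] = as_of
                existing["is_current"] = False
            dimension.append({
                "key": rec["key"], "value": rec["value"],
                "valid_from": as_of, "valid_to": None, "is_current": True,
            })
    return dimension

dim = [{"key": "C1", "value": "Bronze", "valid_from": "2024-01-01",
        "valid_to": None, "is_current": True}]
scd2_apply(dim, [{"key": "C1", "value": "Gold"}], as_of="2025-01-01")
# dim now holds the closed Bronze row plus a current Gold row
```

The same close-and-insert logic is what a warehouse MERGE statement expresses declaratively over the snapshot and dimension tables.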
SKILLS & EXPERIENCE REQUIRED:
  • In-depth knowledge of fundamental database concepts (design and queries) and strong knowledge of advanced topics (management and optimisation)
  • Advanced SQL knowledge: can write new and interpret existing complex multi-join aggregation queries
  • Advanced understanding of data mart concepts: star/snowflake schemas, snapshot tables, slowly changing dimensions
  • Strong knowledge of Python with a focus on scripting data collection and transformation queries
  • Experience with Dagster (preferred) for orchestration: assets, jobs, schedules, sensors for event‑based and scheduled pipelines; familiarity with comparable cloud services is a plus
  • Experience with Snowflake (or similar cloud data warehouses)
  • Strong DataOps practices: Git-based version control, pull requests, code reviews, CI/CD for data pipelines and infrastructure, environment promotion
  • Experience developing data outputs using Streamlit (primary) and BI visualization tools such as Power BI (occasional)
  • Understanding of data quality frameworks and observability and operational monitoring/alerting
  • Exposure to Intelligent Automation in data workflows, including safe use of AI agents for enrichment, classification, or document processing; familiarity with cloud AI agent feature sets (e.g., Azure AI/Agents including Azure OpenAI, AWS Agents for Bedrock, Google Vertex AI Agents)
  • Familiarity with security, governance, and privacy concepts
  • Able to translate high‑level business requirements into clear data requirements and robust technical designs
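The data quality and observability skills listed above can be sketched as a small batch validator: null checks on required fields plus a simple z-score anomaly flag. All names and thresholds are illustrative; production frameworks would report these failures into monitoring/alerting rather than returning a list.

```python
# Hedged sketch of batch data quality checks: required-field nulls and
# a basic statistical anomaly flag. Field names and thresholds are invented.
import statistics

def run_checks(rows, required=("id", "amount"), z_threshold=3.0):
    """Return human-readable failures; an empty list means the batch passes."""
    failures = []
    for i, row in enumerate(rows):
        for field in required:
            if row.get(field) is None:
                failures.append(f"row {i}: missing {field}")
    amounts = [r["amount"] for r in rows if r.get("amount") is not None]
    if len(amounts) >= 2 and statistics.stdev(amounts) > 0:
        mean, sd = statistics.mean(amounts), statistics.stdev(amounts)
        for a in amounts:
            if abs(a - mean) / sd > z_threshold:
                failures.append(f"value {a}: anomalous amount")
    return failures

batch = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 11.0},
         {"id": 3, "amount": None}]
print(run_checks(batch))  # ['row 2: missing amount']
```

In an orchestrated pipeline, a non-empty failure list would typically fail or warn on the asset materialisation so bad data never reaches downstream consumers.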

Benefits
  • Group Personal Pension Plan – 8% employer contribution and 4% employee contribution
  • Life Assurance and Group Income Protection
  • Private Medical Insurance – Provided by Bupa
  • 25 Days Annual Leave, plus bank holidays
  • Staff Discounts on our investment products
  • Personal & Well-being Fund – Supporting your physical and mental wellness
  • Retail Discounts – Savings at a wide range of high street and online retailers
  • Voluntary Flexible Benefits – Tailor your benefits to suit your lifestyle

Please Note: We will make every effort to respond to all applicants. However, due to the high volume of applications we are currently receiving, if you have not been contacted within 30 days of applying, please consider your application unsuccessful.

interactive investor operates in accordance with the UK Equality Act 2010. We welcome applications from individuals of all ages, disabilities, gender identities, marital status, pregnancy/maternity, race, religion or belief, sex, and sexual orientation. We are committed to treating all applicants fairly and to making reasonable adjustments where needed to support disabled applicants. We actively prevent all forms of discrimination, harassment, and victimisation, whether direct, indirect, associative, or perceptive.


Top Skills

AWS
Azure
Dagster
GCP
Power BI
Python
Snowflake
SQL
Streamlit
HQ

interactive investor Manchester, England Office

201 Deansgate, Manchester, United Kingdom, M3 3NW


