
Kroll

Data Engineer I

Posted 5 Days Ago
Remote
Hiring Remotely in Canada
Mid level
The Data Engineer I will design and implement data solutions on Azure, developing ETL processes, managing data pipelines, and ensuring data quality and security, while collaborating with various teams.

Kroll’s Private Capital Markets (PCM) platform is transforming private asset valuation and portfolio workflows for alternative asset managers. We’re seeking a Data Engineer to design and implement secure, scalable data solutions across the PCM platform on Azure.

You will collaborate closely with Product and Implementation teams to deliver client-ready analytics, robust APIs, and high-performance data pipelines that power financial workflows spanning private equity, fixed income, derivatives, and structured products.

You’ll also help establish engineering standards and communities of practice across a global team of data professionals and developers.

This is a hybrid role, requiring 2–3 days of on-site presence each week.

Day-to-Day Responsibilities

  • Data Pipeline Construction: Design, build, and maintain reliable data pipelines to move, transform, and integrate data from diverse sources into data warehouses or lakes.
  • ETL and Data Integration: Develop and optimize ETL/ELT processes using tools such as Azure Data Factory, Databricks, Synapse, DBT, Airflow, or Informatica.
  • Data Warehousing: Model and manage data warehouses to ensure efficient querying, high performance, and data quality using platforms like Azure Synapse, Snowflake, Redshift, or BigQuery.
  • Data Quality & Monitoring: Implement validation, cleaning, and monitoring processes to ensure data accuracy, consistency, and reliability.
  • Data Security: Apply robust data governance practices, manage access permissions, and ensure compliance with privacy regulations.
  • Performance & Scalability: Optimize systems to handle growing data volumes and support evolving business needs.
  • Leadership & Mentoring: Lead and mentor cross-functional teams, driving adoption of modern data technologies and best practices.
  • Greenfield Initiatives: Spearhead greenfield initiatives that align with strategic business objectives, including innovation to support revenue growth and market expansion.
  • Platform Ownership: Own key functional areas of the PCM platform to ensure operational efficiency, reliability, and peak performance.
  • Engineering Culture: Promote collaboration and excellence by participating in architectural reviews, defining technical standards, and contributing to a culture of continuous improvement.
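To give candidates a concrete sense of the pipeline-construction and data-quality work described above, here is a minimal, hypothetical sketch of an extract-transform-load pass with a validation gate. It uses SQLite as a stand-in for a warehouse table; the table name, column names, and rejection rules are illustrative assumptions, not details of the PCM platform.

```python
import sqlite3

def run_pipeline(conn: sqlite3.Connection, raw_rows: list[tuple]) -> int:
    """Load raw position rows into a warehouse-style table, returning
    the number of rows that pass validation. All names are hypothetical."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS positions ("
        "  asset_id TEXT NOT NULL,"
        "  market_value REAL NOT NULL,"
        "  as_of_date TEXT NOT NULL)"
    )
    # Transform + validate: drop rows with missing IDs or negative values --
    # the kind of data-quality gate the responsibilities above describe.
    clean = [
        (asset_id, value, as_of)
        for asset_id, value, as_of in raw_rows
        if asset_id and value is not None and value >= 0
    ]
    conn.executemany("INSERT INTO positions VALUES (?, ?, ?)", clean)
    conn.commit()
    return len(clean)

conn = sqlite3.connect(":memory:")
raw = [("PE-001", 1_250_000.0, "2024-06-30"),
       (None, 900_000.0, "2024-06-30"),   # rejected: missing asset ID
       ("FI-002", -1.0, "2024-06-30")]    # rejected: negative value
loaded = run_pipeline(conn, raw)
print(loaded)  # 1 row passes validation
```

In production this shape is typically expressed as an orchestrated job (e.g. an Airflow DAG or an Azure Data Factory pipeline) rather than a single script, with the validation step emitting metrics for monitoring.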

Essential Traits

  • Technical Expertise
    • Proven experience building ETL/ELT pipelines using Azure, AWS, or Databricks platforms.
    • Strong proficiency in SQL (T-SQL, PL/pgSQL, Spark SQL) for data transformation and optimization.
    • Skilled in Python, C#/.NET, or Java for data engineering and backend services.
    • Hands-on experience with REST API development, Python SDKs, and containerization tools such as Docker and Kubernetes.
    • Working knowledge of CI/CD pipelines, Git, and Azure DevOps.
  • Data Systems & Architecture
    • Experience with Microsoft SQL Server, PostgreSQL, and cloud-native databases.
    • Understanding of data warehousing, dimensional modeling, and data lake architectures.
    • Hands-on experience with data pipeline orchestration tools like Airflow, Ascend, or Azure Synapse.
    • Exposure to data quality frameworks and monitoring best practices.
  • Collaboration & Delivery
    • Partner effectively with Product Owners and end users in an agile environment.
    • Participate in code reviews, technical design sessions, and architecture discussions.
    • Demonstrated ability to manage multiple priorities, solve complex problems, and deliver scalable solutions.
  • Master’s degree in Computer Science, Data Science, Mathematics, Statistics, or a related field.
  • Minimum 3 years of hands-on data engineering experience, ideally within financial services.
  • Relevant Cloud (Azure/AWS) or Data Engineering certifications preferred.
  • Ability to handle confidential and sensitive information with discretion.
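The dimensional-modeling knowledge listed above can be illustrated with a minimal star schema: one fact table keyed to descriptive dimension tables. This is a hedged, hypothetical sketch (the table names `fact_valuation`, `dim_asset`, and `dim_date` are invented for illustration, not taken from the PCM platform), again using SQLite so it runs anywhere.

```python
import sqlite3

# Hypothetical star schema: a fact table joined to two dimensions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_asset (
    asset_key   INTEGER PRIMARY KEY,
    asset_name  TEXT,
    asset_class TEXT        -- e.g. private equity, fixed income
);
CREATE TABLE dim_date (
    date_key   INTEGER PRIMARY KEY,
    as_of_date TEXT,
    quarter    TEXT
);
CREATE TABLE fact_valuation (
    asset_key  INTEGER REFERENCES dim_asset(asset_key),
    date_key   INTEGER REFERENCES dim_date(date_key),
    fair_value REAL
);
INSERT INTO dim_asset VALUES (1, 'Fund A', 'private equity');
INSERT INTO dim_date  VALUES (1, '2024-06-30', '2024-Q2');
INSERT INTO fact_valuation VALUES (1, 1, 1250000.0);
""")

# Typical analytical query: slice the fact table by dimension attributes.
row = conn.execute("""
    SELECT a.asset_class, d.quarter, SUM(f.fair_value)
    FROM fact_valuation f
    JOIN dim_asset a ON a.asset_key = f.asset_key
    JOIN dim_date  d ON d.date_key  = f.date_key
    GROUP BY a.asset_class, d.quarter
""").fetchone()
print(row)  # ('private equity', '2024-Q2', 1250000.0)
```

The same modeling pattern carries over directly to Synapse, Snowflake, Redshift, or BigQuery; only the DDL dialect and distribution/clustering options change.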

About Kroll

Join the global leader in risk and financial advisory solutions—Kroll. With a nearly century-long legacy, we blend trusted expertise with cutting-edge technology to navigate and redefine industry complexities. As a part of One Team, One Kroll, you'll contribute to a collaborative and empowering environment, propelling your career to new heights. Ready to build, protect, restore and maximize our clients’ value? Your journey begins with Kroll.

We are proud to be an equal opportunity employer and will consider all qualified applicants regardless of gender, gender identity, race, religion, color, nationality, ethnic origin, sexual orientation, marital status, veteran status, age or disability.

To be considered for a position, you must formally apply via careers.kroll.com.


Top Skills

Airflow
Azure
Azure Data Factory
Azure DevOps
C#/.NET
Databricks
dbt
Docker
Git
Informatica
Java
Kubernetes
Microsoft SQL Server
PostgreSQL
Python
REST APIs
SQL
Synapse

Kroll Manchester, England Office

The Chancery, 58 Spring Gardens, Manchester, United Kingdom, M2 1EW


