We are seeking a Senior Data Engineer to join our dynamic Tech teams. The ideal candidate is a self-motivated, problem-solving individual with strong analytical thinking. You will work as part of a team to build, develop, enhance, and support the SCOR Data ecosystem (SCOR Data Platform, data warehouses…), including but not restricted to the Syndicate Data Hub. The Senior Data Engineer will promote good data hygiene across the Syndicate and the wider SCOR group.
Responsibilities
Reporting to a Data Architect, your mission will be to:
Build, deliver, test, and maintain data artefacts such as data pipelines, datasets, cubes, models, and services (APIs) to serve data distribution, following standard best practices and state-of-the-art approaches (testing, reconciling, and documenting changes are a key part of the role).
Document data artefacts (code, diagrams, wiki-like documentation) to ensure your work is comprehensible and maintainable.
Collaborate within and outside your squad by participating in workshops and rituals (daily stand-ups, sprint reviews, design sessions) and promoting good practices across the SCOR group.
Support the delivery of trustworthy data pipelines that serve both transactional and analytical needs by implementing adequate tests and controls, and by monitoring scheduled data outputs and ad hoc inputs when required.
Review and coach data engineers, supporting their professional growth.
Contribute to ICS (Internal Control System) and support audits when required.
Qualifications
- Several years of experience as a data engineer and a data-oriented mindset
- Proven experience developing and maintaining data pipelines, preferably in agile projects (Scrum and/or Kanban)
- Good knowledge of the Lloyd's of London market
Technical Skills:
- Strong command of T-SQL and Azure Data Factory; ability to develop data pipelines on various platforms using good practices such as parallelization, distributed programming techniques, dimensional modeling, slowly changing dimensions, change data capture management…
- Good knowledge of CI/CD pipelines and Git flow best practices
- Good knowledge of Power BI, MDX, and DAX
- Good knowledge of Python and PySpark
- Experience with Databricks and/or Palantir Foundry is a strong plus
- Knowledge of REST API development is a plus
Behavioral & Management Skills:
- Strong analytical thinking; a rigorous, solution-oriented mindset; proactive in making proposals
- Team player with commitment, curiosity, and a willingness to challenge ideas
- Ability to navigate an international environment
- Communication & people skills, ability to speak to a wide community of stakeholders (business, data, IT) and understand their needs
Required Education
Bachelor's degree in computer science, software or computer engineering, applied mathematics, physics, statistics, or a related field, or equivalent experience
As a leading global reinsurer, SCOR offers its clients a diversified and innovative range of reinsurance and insurance solutions and services to control and manage risk. Applying “The Art & Science of Risk,” SCOR uses its industry-recognized expertise and cutting-edge financial solutions to serve its clients and contribute to the welfare and resilience of society in around 160 countries worldwide.
Working at SCOR means engaging with some of the best minds in the industry – actuaries, data scientists, underwriters, risk modelers, engineers, and many others – as we work together to find solutions to pressing challenges facing societies.
As an international company, our common culture is defined by “The SCOR Way.” Serving both to build momentum that drives the Group forward and as a compass to guide our actions and choices, The SCOR Way is anchored by five core values, reflecting the input of employees at all levels of the Group. We care about clients, people, and societies. We perform with integrity. We act with courage. We encourage open minds. And we thrive through collaboration.
SCOR supports inclusion and the diversity of talents, and all positions are open to people with disabilities.

