
Blankfactor

Senior Data Engineer

Posted 24 Days Ago
Sofia, Sofia-grad
Senior level
As a Senior Data Engineer at Blankfactor, you will design and maintain scalable ETL pipelines for financial data, ensuring integration from various sources while implementing real-time processing systems. Your role involves working closely with product and engineering teams to maintain data integrity and compliance standards in a fintech environment.

What we do 

At Blankfactor, we are dedicated to engineering impact. We are passionate about creating value by building best-in-class tech solutions for companies looking to transform, innovate, and scale. In every project, we aim to deliver work that moves the needle and drives measurable outcomes for our partners and clients. Our full-stack development, data engineering, digital product, and enterprise AI solutions cater to a range of industries, including payments, banking, capital markets, and life sciences.

We are headquartered in Miami, Florida, have offices in Bulgaria, Colombia, and Romania, and are rapidly expanding our global footprint. Our culture of engineering excellence, technical expertise, and care for both our clients and our talented workforce has made us one of the fastest-growing companies in America.

We only hire the best and brightest. If you have talent and ambition, join us and be part of an environment that fosters innovation, collaboration, and growth. Welcome to Blankfactor!

What to expect in this role

As a Senior Data Engineer, you will play a key role in developing and optimizing data pipelines, ETL processes, and analytics solutions that handle large volumes of transaction data in real-time. You will collaborate closely with product, engineering, and compliance teams to ensure data reliability, accuracy, and security. This is an exciting opportunity to work on mission-critical systems in the financial industry and leverage your skills in big data, cloud infrastructure, and modern data engineering practices.

  • Design, build, and maintain scalable and efficient ETL/ELT pipelines to process high volumes of financial transaction data from various sources including card networks, partner banks, and APIs.

  • Ensure seamless integration of structured and unstructured data from multiple sources (issuers, card networks, payment platforms) into centralized data warehouses and data lakes.

  • Implement real-time data streaming and processing systems to handle large-scale transactions and events using tools such as Kafka, Kinesis, and Spark.

  • Manage and optimize data storage solutions (e.g., Snowflake, Redshift) to ensure efficient querying and reporting.

  • Implement robust data validation, error handling, and auditing mechanisms to ensure the accuracy and integrity of financial data. 

  • Collaborate with compliance and security teams to ensure data governance standards are met.

  • Implement monitoring, alerting, and logging for data pipelines and systems to ensure high availability and performance.

  • Ensure all data handling meets the stringent security and compliance standards (e.g., PCI DSS, GDPR) required in our business.
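
To make the validation and auditing responsibility above concrete, here is a minimal sketch in Python; the field names, known-network set, and error messages are illustrative assumptions, not details from this role:

```python
from dataclasses import dataclass
from decimal import Decimal, InvalidOperation

@dataclass
class Transaction:
    txn_id: str
    card_network: str  # e.g. "visa", "mastercard" (hypothetical values)
    amount: str        # raw string as received from the source feed
    currency: str      # ISO 4217 three-letter code

KNOWN_NETWORKS = {"visa", "mastercard", "amex"}

def validate(txn: Transaction) -> list[str]:
    """Return a list of validation errors; an empty list means the record is clean."""
    errors = []
    if not txn.txn_id:
        errors.append("missing txn_id")
    if txn.card_network not in KNOWN_NETWORKS:
        errors.append(f"unknown network: {txn.card_network!r}")
    try:
        # Decimal avoids float rounding on monetary amounts
        if Decimal(txn.amount) <= 0:
            errors.append("non-positive amount")
    except InvalidOperation:
        errors.append(f"unparseable amount: {txn.amount!r}")
    if len(txn.currency) != 3:
        errors.append(f"bad currency code: {txn.currency!r}")
    return errors
```

In a pipeline of this kind, clean records would flow downstream while failing records are routed to a quarantine store with their error list, giving the audit trail the bullet above calls for.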

Our stack:

  • Cloud platform – AWS

  • Execution engine – Glue (Spark)

  • Real-time solution – MSK/Kinesis

  • Data discovery/observability – dbt

  • Data orchestration – Apache Airflow

  • Data warehouse – TBC

  • IaC – Terraform/Terramate

  • CI/CD – Bitbucket Pipelines
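
The real-time side of this stack (MSK/Kinesis events processed by Spark on Glue) can be approximated in miniature as a tumbling-window aggregation; the event shape and 60-second window here are illustrative assumptions, with a plain iterable standing in for the event stream:

```python
from collections import defaultdict

def tumbling_window_totals(events, window_seconds=60):
    """Aggregate (timestamp, merchant, amount) events into per-window, per-merchant totals.

    In production this role would be played by Spark Structured Streaming or a
    Kinesis consumer; the windowing logic is the same idea at sketch scale.
    """
    totals = defaultdict(float)
    for ts, merchant, amount in events:
        # Floor each timestamp to the start of its window
        window_start = (ts // window_seconds) * window_seconds
        totals[(window_start, merchant)] += amount
    return dict(totals)
```

For example, events at t=0 and t=30 for the same merchant land in the same 60-second window, while an event at t=61 starts a new one.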

Requirements and technical skills

  • 5+ years of proven experience as a Senior Data Engineer, preferably in the fintech or payments industry (though domain experience is not a deal-breaker).

  • Advanced knowledge of Python and SQL/NoSQL.

  • Experience with modern data warehousing technologies (Snowflake, Redshift, or similar).

  • A solid understanding of, and preferably hands-on experience with, real-time and event-driven systems such as Kafka, Kinesis, or similar.

  • Strong experience with ETL/ELT pipeline development using tools such as AWS Glue, Apache Airflow, and dbt.

  • Working knowledge of Terraform for Infrastructure as Code.

  • Experience with distributed computing and big data technologies like Spark.

  • Experience with cloud platforms such as AWS, GCP, or Azure (preferably AWS) and their managed services.

  • Working knowledge of containerization (Docker).

  • Strong communication skills to collaborate with cross-functional teams and translate business needs into technical solutions.

  • Ability to troubleshoot and resolve complex data issues in high-pressure environments.

  • Ability to work with large data sets and support insight developments that drive business decisions.

  • Bachelor's or Master's degree in Computer Science, Information Systems, Data Science, Engineering, or a related field.

What you can expect as a member of the Blankfactor team

  • Fintech Expertise: Access to expertise in machine learning, data science, big data, and AI, providing opportunities for continuous learning and exposure to cutting-edge technologies.

  • Technology exams/Certifications covered by the company

  • World-class workspace for unleashing creativity

  • Lunch is provided when working from the office 

  • Fresh fruits and snacks in the office 

  • Diverse client portfolio

  • Cutting-edge high-tech stack 

  • Monthly on-site gatherings

  • Annual festivities: Participate in team-building activities, family BBQs, and end-of-year celebrations

  • Participation in Sporting Challenges and Marathons

  • Voluntary social events 

We believe that diversity of experience and background contributes to more robust ideas and a stronger team. All qualified applicants will receive consideration for employment without regard to religion, race, sex, sexual orientation, gender identity, national origin, or disability.

Top Skills

Apache Airflow
AWS
dbt
Docker
Glue
Kafka
Kinesis
NoSQL
Python
Redshift
Snowflake
Spark
Terraform


