About the Client
We are one of the world’s leading providers of reinsurance, insurance and other forms of insurance-based risk transfer, working to make the world more resilient. We anticipate and manage a wide variety of risks, from natural catastrophes and climate change to cybercrime. Combining experience with creative thinking and cutting-edge expertise, we create new opportunities and solutions for our clients. This is possible thanks to the collaboration of more than 14,000 employees across the world.
About the Role
End-of-day processing is a mission-critical function in Asset Management, ensuring seamless daily closure, PnL calculation, and regulatory reporting. As the Tech Lead for Core Transaction Processing (CTP), you will take ownership of the stability, efficiency, and transformation of this essential layer.
You will be responsible for overseeing current batch processes, managing dependencies, and being a key contributor to the transition towards next-generation event-driven and micro-batch processing architectures. This role requires a strong technical foundation, cross-functional collaboration skills, and a vision for innovation in financial market operations.
Responsibilities
Own & Evolve Core Transaction Processing – Ensure resilience, stability, and efficiency across the entire execution chain, from trade processing to regulatory reporting.
Technology Leadership – Play a pivotal role in the transformation from traditional batch-driven processes to event-driven, micro-batch, and cloud-based architectures.
Operational Excellence – Develop a deep understanding of, and drive improvements to, our existing batch landscape, including its architecture, the incumbent batch software (UC4), the data transfer tools, the support process, and the monitoring tools we use.
Data Lineage & Dependency Management – Gain deep visibility into transactional workflows, optimize critical paths, and ensure compliance with regulatory requirements.
Cross-Team Collaboration – Work across technology, operations, and business teams to align on strategy, improve processes, and introduce cutting-edge solutions.
Leverage analytical and forecasting tools to assess the performance and resilience of the end-to-end batch process (the so-called "critical path")
Maintain the batch monitoring tool
Requirements
At least 5 years of professional experience
Bachelor's or master's degree in computer science, business informatics, mathematics, or a similar field
Batch processing experience is a must; we are particularly interested in candidates with knowledge of concepts such as event-driven architecture, micro-batches, and dynamic batch monitoring
Analytical and conceptual skills to understand key business needs and design tailored solutions for specific business problems
Strong interpersonal, communication, and collaboration skills:
Stakeholder alignment – engage with department leads to ensure data outputs align with business goals
Documentation and clarity – create clear, high-level documentation that guides teams through the data processing landscape
Strategic and industry insights, including:
Familiarity with how asset management firms use data, ensuring designs reflect business-critical operations
Compliance awareness – ensuring the integration process aligns with industry regulations and standards
Ability to write SQL queries and scripts (Oracle and PostgreSQL) is required
Nice to Have Skills
Python/Java
Azure DevOps, Azure Cloud
Palantir Foundry, PowerBI