About the job

The position is within the Capital Markets IT (CMI) division, which manages information systems for capital markets activities, including Front Office (FO) booking platforms, Risk Management (RM) control and regulatory reporting systems, financial product and contract valuation systems, and more.

The team is looking for data engineers to join its growing group of analytics experts and help evolve the platform toward a more data-centric architecture, as part of the bank's global Information Systems data strategy. You will be responsible for expanding and optimizing the data and data-pipeline architecture, as well as improving data exchange and usage capabilities for all entities.

  • Around 8 years of software development experience
  • Expertise in Java 8/11, especially multithreading principles, with good knowledge of Java data structures and strong fundamentals in complexity analysis
  • Expertise in redesigning infrastructure for greater scalability, typically building solutions for optimal storage, extraction, transformation, and distribution of data from a variety of sources using SQL and AWS big data services
  • Expertise in managing large complex data sets
  • Advanced knowledge working with relational databases
  • Experience building and optimizing ‘big data’ data pipelines and data sets
  • Building processes for data transformation and visualization
  • Working knowledge of message queuing and stream processing
  • Experience with microservices-based technologies
  • Experience with Maven, Git, and writing and maintaining unit/integration tests
  • Experience leading technical studies of proposed solutions, drawing on infrastructure expertise where needed
  • At least 4 years of experience in a data-centric role, using the following software/tools:
    o Big data tools: Hadoop, Storm/Spark/Flink, Kafka, etc.
    o Relational SQL and NoSQL databases
    o AWS cloud services: Athena, EC2, Lambda, Kinesis, S3 (at least a few of these, or similar services)
    o Familiarity with Python, ELK, Kotlin, Power BI, Tableau, and data virtualization tools such as Denodo or Vyne would be an added advantage but is not mandatory
    o Performance tuning and optimization
    o Agile methodology (Scrum/Kanban)


Submit your application quoting the Job Title.

Your interest will be treated in strict confidence.

Data collected will be used for recruitment purposes only. Personal data provided will be used strictly in accordance with the relevant data protection law and WMRC’s personal information and privacy policy.