Head of Data Engineering Jobs at Jubilee Insurance
Job Ref. No. JLIL064
Position: Head of Data Engineering
Employment Terms: Permanent and Pensionable
Jubilee Insurance was established in August 1937 as the first locally incorporated insurance company based in Mombasa. Jubilee Insurance has spread its sphere of influence throughout the region to become the largest composite insurer in East Africa, handling Life, Pensions, General and Medical Insurance. Today, Jubilee is the number one insurer in East Africa with over 450,000 clients. Jubilee Insurance has a network of offices in Kenya, Uganda, Tanzania, Burundi and Mauritius. It is the only ISO-certified insurance group listed on the three East African stock exchanges – the Nairobi Securities Exchange (NSE), the Dar es Salaam Stock Exchange and the Uganda Securities Exchange. Its regional offices are highly rated on leadership, quality and risk management and have been awarded an AA- in Kenya and Uganda, and an A+ in Tanzania. For more information, visit www.JubileeInsurance.com
We currently have an exciting career opportunity for the position of Head of Data Engineering. The position holder will report to the Chief Data Officer and will be based at our Head Office in Nairobi.
Main Purpose of the Job (Job Summary)
The role holder will be responsible for leading a team of data engineers to design and deliver data engineering solutions for our internal- and external-facing business lines. The role holder will leverage strategic planning, business analysis, and technical knowledge of data engineering, tooling, and data architecture definition. In addition to managing Jubilee Group’s portfolio of datasets, the role holder will play a key role in helping our data scientists leverage these datasets effectively.
Contribute to creating value from data, from both an EDM and a business perspective:
- Using architectures and data engineering techniques to design and provide tools dedicated to data extraction, analysis, and enhancement (build common service layers as much as possible)
- Perform research and analysis (including technological watch) as needed to understand market trends and impact
- Contribute to building & maintaining the global analytics environment of SGL (which includes the Data Science & Big Data platform, Data Catalog and Data Capture tools) to ease exploitation of data
- Take part in the strategic committee for Data Analytics Solution
- Ensure compliance with policies related to Data Management and Data Protection, in close relationship with the Data Protection Officer, Security & Risk regulation teams
- Contribute to building data engineering pipelines & APIs for Data Science / Big Data applications
- Take an active part in data architecture conception, environment design, core component development based on the conceptual architecture/design, etc.
- Design, manage and support PoCs; contribute to the choice of tools (build or buy) with the whole team & the Group; test solutions. Identify and challenge partners and providers where relevant.
- Document services and produce all relevant documentation
Collaborate and communicate:
- Act as a subject-matter expert (SME) and tech lead for any data engineering questions, and manage data engineers within the Data Analytics Solution organization.
- Promote data cultural change within the division to build a data-driven company (convince people of the importance of data, how it should be managed and used, …)
- Collaborate with SGL local teams, FIT department colleagues, and IT SMEs (functional, data, solution and technical architects, data scientists, innovators, business experts…)
- Promote services, contribute to the identification of innovative initiatives within the Group, share information on new technologies in dedicated internal communities.
Qualifications and Experience
- Bachelor’s degree in Statistics, Software Engineering, Engineering, Machine Learning, Mathematics, Computer Science, Economics, or any other related quantitative field.
- Python/R integration (preferred)
- A minimum of 5 years’ experience in BI development, preferably in the finance or insurance industry
- Proven experience (5+ years) writing complex SQL / NoSQL statements and ETL processes
- Proven experience (3+ years) on Big Data technologies such as Hadoop, Spark, Elastic, etc.
- Development & coding design in parallelized/distributed environments, preferably in a cloud environment (AWS / Azure)
How to apply
If you are qualified and seeking an exciting new challenge, please apply via Recruitment@jubileekenya.com quoting the Job Reference Number and Position by 23rd May 2022. Only shortlisted candidates will be contacted.