This job has expired.
Job Details
Requirements
- Bachelor’s Degree or foreign equivalent in Computer Science, Electrical Engineering, Mathematics, Computer Applications, Information Systems, or Engineering
- Experience building high-performance, scalable distributed systems
- Experience in ETL and ELT workflow management
- Experience using Kafka as a distributed messaging system
- Experience with Kafka producer and consumer APIs
Responsibilities
- Build (1) data provisioning frameworks, (2) data integration into data warehouses, data marts, and other analytical repositories, (3) integration of analytical results into operational systems, and (4) data lakes and other data archival stores
- Leverage data integration tool components to develop efficient solutions for data management, data wrangling, data packaging, and integration; develop the overall design and determine the division of labor across architectural components
- Deploy and customize standard architecture components