This job has expired.
Job Overview
Requirements
- 5+ years of development experience with Python, Scala, or SQL
- Proven professional experience with event streaming platforms and data pipeline orchestration tools such as Apache Kafka, Fivetran, or Apache Airflow
- Proven professional experience with Databricks, Snowflake, BigQuery, Spark, Hive, Hadoop, Cloudera, or Redshift
- Experience in schema design and data ingestion/processing
- Experience developing with Docker, Rancher, or Kubernetes
- Experience in orchestrating data processing jobs using Apache Airflow
- Bachelor’s degree in Computer Science or related field
Responsibilities
- Create scalable, maintainable, and reliable data practices and data pipelines
- Identify, design, and implement internal process improvements
- Build and enhance a shared data lake
- Partner with teams across the business to develop end-to-end data solutions
- Collaborate with analysts and data scientists for exploratory analysis
- Manage and model data using visualization tools
- Build tools and processes to make correct data accessible