Role and Responsibilities
- Design, build, and maintain scalable data pipelines (batch and real-time)
- Develop and optimize ETL/ELT processes for large datasets
- Manage data storage solutions (data lakes, data warehouses)
- Ensure data quality, integrity, and availability across systems
- Collaborate with analysts, data scientists, and backend teams
- Monitor pipeline performance and troubleshoot issues
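The batch ETL work described above can be sketched as a minimal extract-transform-load step. This is an illustrative example only, not a description of any specific system here: the file names, the `sales` table, its columns, and the "drop rows with a missing amount" rule are all assumptions chosen to keep the sketch self-contained.

```python
# Minimal, hypothetical batch ETL step: extract rows from a CSV,
# transform them (normalize and filter), load them into SQLite.
# All names and the filter rule are illustrative assumptions.
import csv
import os
import sqlite3
import tempfile

def run_etl(csv_path: str, db_path: str) -> int:
    # Extract: read raw rows from the source CSV
    with open(csv_path, newline="") as f:
        rows = list(csv.DictReader(f))

    # Transform: trim/lowercase names, drop rows with no amount
    cleaned = [
        {"name": r["name"].strip().lower(), "amount": float(r["amount"])}
        for r in rows
        if r.get("amount")
    ]

    # Load: append the cleaned rows into a warehouse-style table
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS sales (name TEXT, amount REAL)")
    con.executemany(
        "INSERT INTO sales (name, amount) VALUES (:name, :amount)", cleaned
    )
    con.commit()
    count = con.execute("SELECT COUNT(*) FROM sales").fetchone()[0]
    con.close()
    return count

# Usage: build a tiny source file, run the pipeline, check the row count
tmp = tempfile.mkdtemp()
src = os.path.join(tmp, "sales.csv")
db = os.path.join(tmp, "sales.db")
with open(src, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerows([["name", "amount"], [" Alice ", "10.5"], ["Bob", ""]])

loaded = run_etl(src, db)  # Bob's row is dropped: missing amount
```

In a production pipeline the same extract/transform/load shape would typically be expressed in an orchestration framework (scheduled, retried, and monitored), with the load targeting a data lake or warehouse rather than a local SQLite file.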