Lead Data Engineer :
We are seeking a talented Lead Data Engineer to join our dynamic team. In this role, you will play a pivotal part in designing, developing, and maintaining cutting-edge data solutions. If you're passionate about leveraging technology to drive business success and thrive in a collaborative environment, this is the perfect opportunity for you.
Key Responsibilities :
Innovative Data Pipeline Development :
- Design and implement data pipelines for seamless collection, transformation, and loading of data across multiple platforms.
Robust Data Warehousing :
- Lead the establishment and maintenance of robust data warehousing and data lake solutions.
Strategic Data Modeling :
- Develop and deploy data models that cater to diverse business needs and ensure efficient operations.
Scalable Code Development :
- Craft scalable and efficient code using Python, Scala, or Java to support data processing requirements.
Quality-Driven Solutions :
- Lead the design of data solutions with a focus on quality, automation, and performance to meet evolving business needs.
Ownership and Reliability :
- Take ownership of data pipelines feeding into our Data Platform, ensuring reliability and scalability at every step.
Business-Ready Data :
- Ensure timely and accurate availability of data to empower business decision-making and analytics.
Stakeholder Collaboration :
- Collaborate closely with stakeholders and clients to understand their data requirements and provide tailored solutions.
Clear Communication :
- Effectively communicate complex solutions to both technical and non-technical stakeholders, fostering understanding and alignment.
Qualifications :
Expertise in Cloud Technologies :
- Extensive experience leading initiatives on AWS and Snowflake platforms, delivering large-scale data and analytics solutions.
End-to-End Data Expertise :
- Hands-on experience with end-to-end data pipeline implementation on AWS, covering data preparation, extraction, transformation and loading, normalization, aggregation, warehousing, data lakes, and governance.
Architectural Understanding :
- Strong grasp of modern data architecture concepts such as Data Lake, Data Warehouse, Lakehouse, and Data Mesh.
Technical Proficiency :
- Proficiency in Terraform, Snowflake, dbt, SnapLogic, Kafka, SQL, Python, and relevant technologies.
Agile Mindset :
- Experience working in Agile/Scrum environments, embracing iterative development and continuous improvement.
Continuous Learning :
- A self-driven individual with a passion for learning and adapting to new technologies in a rapidly evolving landscape.
Collaborative Spirit :
- Ability to thrive in a fast-paced, collaborative environment, tackling challenges with initiative and innovation.
Attention to Detail :
- Strong attention to detail and thoroughness in all tasks, ensuring the highest quality deliverables.
Effective Communication :
- Excellent communication skills, with the ability to articulate complex solutions clearly and guide others effectively.
Skills Required :
Primary : CI/CD, Production deployment, Client management, and Salesforce
Mandatory : Terraform, Snowflake, dbt, SnapLogic, Kafka, SQL