Company

Antal International

Address: Delhi
Form of work: Recruitment Agency
Salary: Unspecified
Category: Finance & Accounting

Job description

Job Title: GM – Data Engineering


Job Type: Permanent


Locations: Pune


Experience: 23 to 28 Years
Skill Set:
GCP, Snowflake, Airflow, DBT, SQL, Python, Data Extraction, Data Transformation, Data Architecture, Data Pipeline, Data Ingestion, data warehousing, Data Engineering, Data Lake, Data Governance, Data Flow, Data Quality


Key Responsibilities:


  • Local Line Management, Recruitment & Team Development: You will manage several multi-disciplinary data delivery teams consisting of Data & Analytics Engineers and Test Engineers with Data Scientists and Data Visualization specialists embedded as required. The data teams are expanding rapidly, and you will play a key role in recruitment across your teams and support the ongoing learning and development of your team members.
  • Data Delivery: You will be responsible for the delivery performance of your teams and will ensure key delivery metrics are closely monitored to allow you to best provide support where needed, communicate effectively on progress, and identify opportunities to improve and optimize our data delivery processes.
  • Data Architecture & Solution Design: You will support the continual improvement and optimization of our data architecture, working closely with other data managers and our data architecture function to maintain a good understanding of emerging trends in the data landscape and evaluate them as part of our longer-term data strategy. You will also support solution design for delivery through your teams, in line with data architecture standards and principles.

Requirements:


  • Communication: You should demonstrate strong written and verbal communication skills and be comfortable communicating and building relationships with stakeholders at all levels up to and including C-level.
  • Management: You should have prior experience managing a data team, ideally in a medium to large-scale organization. This would be a perfect opportunity for someone looking to extend their management remit across multiple teams and gain experience building up data teams. Experience managing multi-disciplinary, off-site, and multi-cultural teams would also be beneficial.
  • Agile Delivery: You should have experience working in an Agile delivery environment, ideally using Scrum and/or Kanban.
  • SQL (mandatory): You should be able to demonstrate a strong understanding of SQL and be comfortable reading and writing complex SQL queries, ideally across multiple platforms.
  • Cloud Platforms (mandatory): You should have experience working with key services on either GCP (preferred), AWS or Azure. Key services include cloud storage, containerization, event-driven services, orchestration, cloud functions and basic security/user management.
  • Data Warehousing (highly desirable): You should have experience working on a medium to large-scale data warehouse solution, irrespective of the underlying technology. Ideally you will have experience working on the design and data modelling stages of data warehouse projects and be comfortable with conceptual, logical and physical data modelling techniques as well as dimensional modelling techniques.
  • CI/CD & Automation (desirable): Any experience developing or supporting data CI/CD pipelines, regardless of tooling, would be beneficial. We use Microsoft Azure DevOps to run most of our CI/CD pipelines. We also rely heavily on Infrastructure as Code for cloud infrastructure deployment, so any experience with technology such as Terraform would be beneficial in this respect.
  • Data Visualization (desirable): Although we have dedicated data visualization specialists within the team, any knowledge of, or experience with, data visualization platforms such as Tableau (preferred), Power BI, Looker or QuickSight would be beneficial.
  • Data Ingestion: Ingest, cleanse and transform data from a wide variety of source systems into our cloud data lake to support advanced analytics, data warehousing and data science.

Technical Skills & Knowledge:


  • Advanced SQL knowledge with experience using a wide variety of source systems including Microsoft SQL Server
  • Experienced in Cloud Data Engineering on the Google Cloud Platform (experience with AWS and Azure is also beneficial, but delivery will be focused on GCP)
  • Specific experience with the following services running on the GCP platform:
    • Google Cloud Storage (GCS)
    • Google Cloud Composer (or Apache Airflow), including development of DAGs
    • Google Kubernetes Engine (GKE) or equivalent experience working with containerization
    • Google Cloud Functions
  • Experience working with Infrastructure as Code (IaC), specifically with Terraform (equivalent experience with similar technology also accepted)
  • Experience working in an Agile delivery team using automated build, deployment and testing (CI/CD, DevOps, DataOps)
  • Experience with one or more programming languages compatible with developing functionality on the above platform (Python or Java preferred)
  • Knowledge or experience in the field of data warehousing and advanced analytics would be beneficial but not essential, specifically any experience in the following areas:
    • Dimensional Modelling
    • Working with Snowflake’s Cloud Data Warehouse (Google BigQuery experience also beneficial)
    • Working with dbt to develop, test, deploy and monitor data transformation code

Qualification & Certification:


  • B.E./B.Tech/M.Tech in IT or Computer Science from a reputed institute (preferred), or a Master’s Degree in a quantitative subject, e.g. Mathematics, Statistics or Economics
  • Cloud Certification in GCP, with the following being preferred (AWS, Azure certifications also beneficial):
    • Google Certified Cloud Architect
    • Google Certified Data Engineer
  • Any certification or formal training in the following areas would also be highly beneficial:
    • Python
    • Snowflake Cloud Data Warehouse
    • dbt
    • Terraform
Refer code: 945801. Antal International - 2024-03-05 07:44
