Kindly go through the JD below and submit your details via the Google Form link - https://forms.gle/fS5uRJzQKkt2PCJK8
Designation - GCP Data Engineer
Skills - Google Cloud, Kubernetes, Kafka, Spark, Hive
Experience - 3 - 8 Years
6-Month Contract on TECH M Payroll; based on their performance, candidates will be converted from Contract to Permanent
Location - Noida
Below JD for your reference
You'll have the following responsibilities
• Applies specialist data expertise to develop and advise on approaches for a range of complex, high-impact data solutions and services
• Assures data availability and quality in Digital and across all CFUs for BT
• Helps to resolve technical problems
• Proactively identifies potential new data sources and assesses the feasibility of ingesting them
• Drives ongoing standardisation, simplification and efficiency of engineering processes, reviewing and acting on continuous-improvement opportunities
• Assures a high-quality and comprehensive data flow, and manages a team that provides a consumption layer through which the business has access to all the data it needs
• Ensures that all data acquired is fully described/understood and communicated using appropriate tools
• Productionises tactical data feeds, including documentation
You'll have the following skills & experience
Essential skills & experience
• Deep technical knowledge of complex (and simple) data architectures, covering all aspects (compliance, risk, security) of our requirements
• Detailed knowledge of the concepts and principles of Data Engineering
• Sound awareness of Data Management best practice, including data lifecycle management
• Extensive skills in SQL, both at production grade and at analytical level, gained through intensive application in a commercial business environment
• Can deliver complex big data solutions with structured and unstructured data
• Excellent oral and written communication skills for all levels of an organisation
• Collaborates across teams to identify how work activities are related and to highlight inefficiencies, helping to remove barriers and find the resources or support needed to improve processes
• Understanding of task orchestration frameworks (e.g. Apache Airflow) within cloud environments
• Use of Google Cloud Pub/Sub as part of a scalable real-time analytics solution
• Experience using containerisation (Kubernetes/Docker) within one or more cloud environments (e.g. GKE, EKS)
• Familiarity with Apache Kafka core concepts and previous experience interacting with a production-grade Kafka cluster
• Experience with Python development and a solid understanding of Google Cloud Platform SDKs and APIs
Desirable skills & experience
• Experienced in deploying data solutions and cloud infrastructure via CI/CD pipelines
• Experienced in deploying Infrastructure as Code (e.g. Terraform/CloudFormation)
• Knowledge of REST/Graph APIs and how they can be used in a data environment
• Knowledge of Apache Kafka/Confluent within cloud environments
• Experience of building large scale data pipelines on Google Cloud Platform
• Experience with Spark, Spark Structured Streaming, Hive and HDFS
Skills: GCP, Big Data, Hive, Hadoop, HDFS
Experience: 3-8 Years