Who we are
We are the Chief Data & Analytics Office (CDAO), responsible for Enterprise data and metrics. The CDAO charter is to enable Cisco to compete and win with data. The playing field is extremely dynamic, driven by Cisco's transition to, and expanded market leadership in, recurring revenue business models. In line with the most progressive industry trends, CDAO has a singular focus on, and sponsorship of, all things data and analytics for the Company. Key subject areas within CDAO include Analytics and Data Science, Data Architecture, Data Platforms, Data Engineering, Core Data Management, and Data Governance.

What You'll Do
We are seeking a skilled and experienced Data Engineer to lead the design, development, and implementation of a groundbreaking data integration platform using Python, Google Cloud Platform (GCP), and Snowflake. The successful candidate will collaborate with multi-functional teams of data analysts and Data Engineers to create a scalable, efficient, and secure data architecture that supports our organization's data-driven initiatives.

As a Data Engineer, your key responsibilities will include:
• Data Pipeline Development: Designing and implementing data pipelines using Python-based tools and frameworks, such as Apache Airflow, Luigi, or Apache NiFi, to extract, transform, and load data from various sources into cloud data warehouses and data lakes.
• Data Integration: Integrating and consolidating data from different sources using Python libraries and packages, ensuring data consistency, quality, and integrity.
• Data Transformation and Processing: Writing Python scripts and implementing algorithms to transform and process raw data into a usable format for analysis, including data cleansing, aggregation, and enrichment.
• Data Storage and Management: Managing and optimizing data storage systems, such as databases or data lakes, using Python libraries and frameworks like SQLAlchemy or Pandas, including setting up and supervising database clusters and implementing data retention policies.
• Data Security and Privacy: Implementing data security and privacy measures using Python libraries and frameworks, such as cryptography or data encryption packages, to ensure data protection and compliance with privacy regulations.
• Performance Optimization: Optimizing data processing workflows and Python scripts for efficient and timely data processing, including parallel processing, caching, and optimizing database queries.
• Collaboration with Data Analysts and Engineers: Collaborating with data analysts and Data Engineers to understand their data requirements, provide them with the necessary datasets, and support their data analysis and transformation tasks.
• Documentation: Documenting Data Engineering processes, codebase, and best practices, and providing documentation for the systems and tools used.
• Continuous Learning and Improvement: Staying up to date with the latest Python libraries, frameworks, and best practices in Data Engineering, and continuously improving scripting skills and Data Engineering practices.

Who You Are
• Bachelor's degree in Engineering or Computer Science.
• 7+ years of software development experience in Data Engineering/analytics with Agile and Waterfall development methodologies.
• Expert-level programming expertise in Python or other scripting languages.
• Must have hands-on experience with cloud database platforms (Google BigQuery/Snowflake).
• Exposure to application containers and runtimes such as Docker.
• Exposure to, and some experience with, Spark is a plus.
• Teradata, Oracle, and HANA experience is a plus.
• Experience and a proven track record of building, supporting, and scoping solutions in the Data Engineering domain.
• Experience with web service development.
• Exposure to test automation.
• Soft skills such as negotiation and working with diverse teams.
• You enjoy a fast-paced environment that requires high levels of collaboration.
• You are a game-changer with innovation on your mind.

Who You'll Work With
As part
of the Cisco Data and Analytics organization, you will work with the team responsible for leading and governing the Enterprise Data foundation and metrics. You'll be part of a team of data architects, analysts, and engineers, surrounded by a sophisticated, dynamic, creative, and passionate organization that is leading efforts to digitize and put data to work in ways it's never been done before.
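For illustration, the transform-and-load responsibilities described above can be sketched in miniature. This is a toy example only, using the Python standard library; the data, table, and function names are hypothetical and do not reflect the actual platform:

```python
import sqlite3

def transform(rows):
    """Cleanse and aggregate raw records: drop incomplete rows,
    then total amounts per region (a tiny stand-in for real
    cleansing/aggregation/enrichment logic)."""
    totals = {}
    for r in rows:
        if r.get("amount") is None:
            continue  # cleansing step: skip records with missing values
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

def load(totals, conn):
    """Load aggregated results into a warehouse-like target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS revenue (region TEXT PRIMARY KEY, total REAL)"
    )
    conn.executemany("INSERT OR REPLACE INTO revenue VALUES (?, ?)", totals.items())
    conn.commit()

# Hypothetical raw records extracted from a source system.
raw = [
    {"region": "EMEA", "amount": 120.0},
    {"region": "EMEA", "amount": 80.0},
    {"region": "APAC", "amount": None},   # incomplete record, dropped
    {"region": "APAC", "amount": 200.0},
]

conn = sqlite3.connect(":memory:")
load(transform(raw), conn)
print(dict(conn.execute("SELECT region, total FROM revenue")))
# → {'EMEA': 200.0, 'APAC': 200.0}
```

In production, each of these steps would typically become a task in an orchestrator such as Apache Airflow, with SQLite replaced by a cloud warehouse like BigQuery or Snowflake.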