Responsibilities
- Data Extraction and Data Processing
- Experience with Elastic Stack (ELK), data modelling, data warehousing, and ETL processes is a must.
- Proficiency in programming languages such as Python and SQL.
- Experience with cloud-based data solutions such as AWS or Azure.
- Basic understanding of Machine Learning methodologies and pipelines.
- Data Quality Assurance
- Maintain and monitor data pipelines, addressing recurring issues and deeper data problems.
- Address and fix vulnerability issues in data pipelines to ensure data security.
- Proactively create alerts and notifications to identify potential issues in data pipelines.
- Data Communication
- Communicate analytical insights to consultants through clear synthesis and packaging of results (including PPT slides and charts); collect, synthesise, and analyse case-team learnings and inputs into new best practices and methodologies.
- Collaborate with cross-functional teams, including analysts and business stakeholders, to create actionable insights.
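To make the alerting responsibility above concrete, here is a minimal, illustrative Python sketch of a pipeline data-quality check that raises alerts — the function name, field names, and threshold are hypothetical, not part of this role's actual stack:

```python
# Illustrative sketch only: flag fields whose null rate in a batch of
# pipeline records exceeds a threshold. All names are hypothetical.

def quality_alerts(records, required_fields, max_null_rate=0.05):
    """Return alert messages for fields whose null rate exceeds the threshold."""
    alerts = []
    total = len(records)
    if total == 0:
        return ["alert: pipeline produced zero records"]
    for field in required_fields:
        nulls = sum(1 for r in records if r.get(field) is None)
        rate = nulls / total
        if rate > max_null_rate:
            alerts.append(
                f"alert: {field} null rate {rate:.1%} exceeds {max_null_rate:.1%}"
            )
    return alerts

# Example batch with one missing value in "amount".
rows = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": None},
    {"id": 3, "amount": 4.5},
]
print(quality_alerts(rows, ["id", "amount"], max_null_rate=0.10))
```

In practice a check like this would run inside the pipeline orchestrator and feed a notification channel rather than `print`, but the shape — measure, compare against a threshold, emit an alert — is the same.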
Personality Profile
- Conscientiousness: Process-Oriented
- Demonstrates thoughtfulness, meticulousness, and a strong sense of responsibility.
- Highly organised, detail-oriented, and adept at managing multiple tasks and deadlines.
- Proactive in approach with reasonable impulse control.
- Consistently delivers high-quality results with a strong work ethic.
- Committed to continuous self-improvement through formal and informal learning opportunities.
- Sturdiness: Consistently Producing Quality Work
- Demonstrates resilience and perseverance in completing tasks effectively.
- Consistently produces high-quality work, even under pressure or challenging circumstances.
- Maintains focus and determination to achieve goals and meet deadlines consistently.
- Quantitative: Analytical and Data-Driven
- Possesses strong quantitative skills and analytical mindset.
- Comfortable working with numerical data and conducting statistical analysis.
- Applies quantitative methods to solve complex problems and derive actionable insights from data.
- Utilises mathematical models and tools to optimise processes and improve efficiency.
Experience and Qualifications
Minimum Qualifications
- 2+ years of data pipeline management experience.
- Proficiency in Python development (2+ years).
- Exposure to Google Cloud Platform (GCP).
- Ability to write complex SQL queries.
- Understanding of data management standards.
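As an illustration of the "complex SQL" skill listed above, the sketch below runs a window-function query against an in-memory SQLite table — the table, columns, and data are made up for the example:

```python
# Illustrative only: a windowed SQL query of the kind the role calls for,
# run against an in-memory SQLite table of made-up pipeline-run data.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE pipeline_runs (pipeline TEXT, run_date TEXT, rows_loaded INTEGER);
INSERT INTO pipeline_runs VALUES
  ('orders', '2024-01-01', 100),
  ('orders', '2024-01-02', 80),
  ('users',  '2024-01-01', 50),
  ('users',  '2024-01-02', 55);
""")

# Day-over-day change in rows loaded, per pipeline, via LAG(...) OVER.
query = """
SELECT pipeline, run_date, rows_loaded,
       rows_loaded - LAG(rows_loaded) OVER (
           PARTITION BY pipeline ORDER BY run_date
       ) AS delta
FROM pipeline_runs
ORDER BY pipeline, run_date;
"""
for row in conn.execute(query):
    print(row)
```

The same `LAG ... OVER (PARTITION BY ...)` pattern carries over to warehouse engines such as BigQuery or Redshift.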
Preferred Qualifications
- 2-4 years of Data Engineering experience.
- Consulting experience is a plus.
- Knowledge of Spark cluster management.
- Hands-on experience with tools like Airflow.
- Proficiency in Elastic Stack (ELK).
Categorisation Details
- Industry Type: Health and Wellness Technology
- Department: Operations
- Employment Type: Full Time, Permanent
- Education: Relevant Degree in Business, Data Science, Data Engineering, or Computer Science