Description
Responsibilities:
Enable and maintain data sets on-prem and in the cloud
Move data across the enterprise using packaged data containers, with access through common APIs
Integrate systems with connectors to build reliable pipelines
Monitor load-balancing processes to reduce downtime
Build queries to derive data and maintain archives related to data architecture and extraction
Develop scalable transformation packages to support data flow and analytic activities
Qualifications:
Minimum three years' experience delivering data in a clean and efficient manner; trained in database architecture, data import/export functionality, and controlling access to data
Bachelor's degree from an accredited college/university or equivalent work experience
Strong background in computer science or software engineering as well as exposure to distributed or high-performance computing; MS Azure experience helpful
Working knowledge of data manipulation and ETL using common languages such as SQL and Python
Strong verbal/written communication, problem-solving, analytical, and independent-judgment skills to support an environment driven by customer service and teamwork; ability to positively influence, mentor, and serve as a credible source of knowledge for less experienced team members