Trigyn has an immediate requirement for an Azure Data Engineer. The ideal candidate will be responsible for the design and implementation of a data infrastructure that provides standard data connectivity, storage, pipelining, and data integration services under a hub-and-spoke data architecture model.
• Collecting large, complex sets of data that meet non-functional and functional business requirements.
• Identifying, designing, and implementing internal process improvements including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes.
• Building required infrastructure for optimal extraction, transformation and loading of data from various data sources using Azure and SQL technologies.
• Building analytical tools to utilize the data pipeline, providing actionable insight into key business performance metrics including operational efficiency and customer acquisition.
• Working with stakeholders, including cloud infrastructure teams, data teams, governance, and information management officers, to collaboratively design, develop, deploy, and support data solutions and related infrastructure.
• Input for API Service – Provide input, feedback, and testing for the implementation of the API service that will standardize data exchange between components (ingestion, processing, storage, delivery).
• Implement Central API Services – Implement and configure the API service that has been developed for the central data hub. Also work with satellite (spoke) data platforms and help coordinate their use of APIs to standardize data ingestion, metadata exchange, and data delivery between hub and spoke data platforms.
• Data Synchronization – Implement a central data exchange interface that satellite (spoke) data platforms can use to exchange data between platforms.
• Data Pipelines – Design and implement optimal solutions (Azure Data Factory, Data Lake, Synapse, etc.) for data processing, storage, analysis, and delivery.
• Data Governance – Translate requirements for data governance (workflow, information sensitivity, data quality, data privacy) into a functional architecture.
• Identity and Access Management – Establish secure and smart data connectivity via advanced information protection layers and the UN's Identity and Access Management system (Azure AD).
• Proficiency in working with Azure Data Factory (extraction, transformation, and loading) is required.
• Proficiency in working with Azure Storage/ Data Lake / Data Warehouse is required.
• Experience with database architecture testing methodology, including execution of test plans, debugging, and testing scripts and tools, is required.
• Working knowledge of Azure Synapse Analytics is required.
• Strong experience in coding languages such as Python, Scala, and Java is a plus.
• Experience with Azure DevOps and Azure Purview is a plus.
• Relevant professional certifications are encouraged.
• At least 5 years of relevant experience with detailed knowledge of data warehouse technical architectures, infrastructure components, ETL/ ELT and reporting/analytic tools.
• At least 3 years of experience with the Azure portal.
• Master’s degree in Computer Science, Information Systems, Engineering, or its equivalent in education and/or work experience.