hirist

Data Engineer - Azure/Synapse Analytics

IOWeb3 Technologies
Others
3 - 5 Years

Posted on: 05/06/2025

Job Description

Key Responsibilities:

- Azure Synapse Development: Design, develop, and optimize data solutions within Azure Synapse Analytics, leveraging its capabilities for data warehousing, data lakes, and big data processing.

- Data Pipeline Development (ADF): Build, manage, and monitor scalable and efficient data pipelines using Azure Data Factory (ADF) for data ingestion, transformation, and orchestration.

- Data Warehousing & Modelling: Apply expertise in data warehousing principles and various data modelling techniques to design and implement robust data structures.

- Snowflake & Stored Procedures: Work extensively with Snowflake, including data loading, transformation, and query optimization. Develop and maintain complex Stored Procedures in various database environments.

- ETL/ELT Processes: Implement and enhance ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes to move data efficiently between disparate systems.

- Data Quality & Monitoring: Implement and ensure adherence to data quality frameworks. Utilize data monitoring tools to ensure data integrity and reliability.

- Job Scheduling: Configure and manage job scheduling for automated data workflows and pipeline execution.

- Data Format Handling: Work proficiently with various data formats including JSON, XML, CSV, and Parquet.

- Agile Collaboration: Participate actively in an Agile development environment, using tools like JIRA for task management and collaboration.

- Communication: Clearly communicate technical concepts and solutions to team members and stakeholders, maintaining formal and professional interactions.
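To illustrate the kind of data-format handling and transformation work described above, the sketch below converts a batch of JSON records into CSV — a minimal transform step of the sort an ADF or Synapse pipeline might orchestrate. All names here are hypothetical examples, not part of the role's actual codebase; a real pipeline would also add schema validation, error handling, and columnar output such as Parquet (e.g. via pandas/pyarrow).

```python
import csv
import io
import json

def json_records_to_csv(json_text: str) -> str:
    """Convert a JSON array of flat records into CSV text.

    A minimal illustrative transform step: parse the source
    format, then re-emit the same rows in the target format.
    """
    records = json.loads(json_text)
    if not records:
        return ""
    buf = io.StringIO()
    # Use the first record's keys as the CSV header row.
    writer = csv.DictWriter(buf, fieldnames=list(records[0].keys()))
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

sample = '[{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]'
print(json_records_to_csv(sample))
```

In production, a step like this would typically run inside an orchestrated pipeline activity rather than as a standalone script, with monitoring and data-quality checks applied to its output.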


Must-Have Skills:


- Azure Synapse: Solid experience with Azure Synapse Analytics.

- Azure Data Factory (ADF): Solid experience with Azure Data Factory.

- Snowflake: Solid experience with Snowflake.

- Stored Procedures: Strong experience with Stored Procedures.

- Data Engineering Fundamentals: Experience with ETL/ELT processes, data warehousing, and data modelling.

- Data Quality & Operations: Experience with data quality frameworks, monitoring tools, and job scheduling.

- Data Formats: Knowledge of data formats like JSON, XML, CSV, and Parquet.

- Agile: Experience with Agile methodology and tools like JIRA.

- Language: Fluent in English (strong written, verbal, and presentation skills).

- Communication: Strong communication skills and a formal, professional manner.


Job Views: 342
Applications: 187
Recruiter Actions: 0

Functional Area

Big Data / Data Warehousing / ETL

Job Code

1491831