Posted: 2 weeks ago
Platform: Hybrid
Full Time
Job Description

Looking for a Data Engineer with expertise in Azure SQL, data solutions, and data migrations to design, develop, and optimize data pipelines and integration processes. You will be responsible for moving data between systems, ensuring data integrity, accuracy, and efficiency in migrations and conversions. This role involves working closely with stakeholders to understand data requirements, implement scalable solutions, and optimize database performance.

Required Skills & Experience:
- 4-6 years of experience in data engineering and migrations.
- Strong expertise in Azure SQL, SQL Server, and cloud-based databases.
- Hands-on experience with ETL/ELT processes and data integration.
- Knowledge of Azure Data Factory, Synapse Analytics, and Data Lake.
- Experience in moving data between systems, data conversions, and migrations.
- Proficiency in Python, PowerShell, or SQL scripting for data manipulation.
- Understanding of data modeling, indexing, and performance optimization.

Preferred Skills:
- Experience with NoSQL databases (Cosmos DB, MongoDB).
- Familiarity with Kafka, Event Hubs, or real-time data streaming.
- Knowledge of Power BI, Databricks, or other analytical tools.
- Exposure to Azure DevOps, Git, and CI/CD pipelines for data workflows.

Key Responsibilities:

Data Engineering & Development
- Design and implement ETL/ELT pipelines for data movement across systems.
- Develop, optimize, and manage Azure SQL databases and other cloud-based data solutions.
- Ensure data integrity and consistency during migrations and conversions.
- Implement data transformation, cleansing, and validation processes.

Data Migration & Integration
- Design and execute data migration strategies between different platforms.
- Extract, transform, and load data from structured and unstructured sources.
- Work with APIs, batch processing, and real-time data movement.
- Support cross-system data integration for analytics, reporting, and operational needs.

Cloud & DevOps
- Utilize Azure Data Factory, Synapse, and Data Lake for scalable data processing.
- Implement monitoring, logging, and performance tuning for data solutions.
- Work with CI/CD pipelines for automated data deployments and version control.

Collaboration & Best Practices
- Work closely with data analysts, developers, and business teams to understand requirements.
- Ensure compliance with data governance, security, and privacy standards.
- Document data workflows, architecture, and technical decisions.
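The transformation, cleansing, and validation work described in the responsibilities above can be sketched as a minimal, self-contained Python example. The record shape, field names, and date format are hypothetical illustrations, not part of the role; a real migration would read rows from the source system (e.g. Azure SQL) via an ETL tool or database driver.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical legacy record shape for illustration only.
@dataclass
class CustomerRow:
    customer_id: str
    name: str
    signup_date: str  # raw text as exported from the source system

def cleanse(rows):
    """Trim whitespace, normalise names and dates, and split out rows
    that fail validation so they can be reviewed instead of loaded."""
    clean, rejected = [], []
    for row in rows:
        cid = row.customer_id.strip()
        name = row.name.strip().title()
        try:
            # Assume the legacy export uses DD/MM/YYYY; target wants ISO 8601.
            iso = datetime.strptime(row.signup_date.strip(), "%d/%m/%Y").date().isoformat()
        except ValueError:
            rejected.append(row)  # route to a reject table for review
            continue
        if not cid:
            rejected.append(row)  # missing key violates integrity rules
            continue
        clean.append(CustomerRow(cid, name, iso))
    return clean, rejected

rows = [
    CustomerRow(" C001 ", "  alice smith ", "05/03/2021"),
    CustomerRow("C002", "Bob Jones", "not-a-date"),
]
clean, rejected = cleanse(rows)
# clean holds the normalised C001 row; the unparseable date lands in rejected
```

Separating cleansed rows from rejects, rather than silently dropping bad records, is what makes migration runs auditable.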
GlobalLogic
Locations: Noida, Pune, Bengaluru
Salary: 15.0 - 30.0 Lacs P.A.