Posted: 3 days ago
On-site
Full Time
Job Title: AWS Data Engineer
Location: Navi Mumbai (Work From Office, 5 days a week)
Experience: 6+ Years
Budget: 18-20 LPA
Job Summary:
We are looking for a skilled AWS Data Engineer (Lead) to join our team in Navi Mumbai. The ideal candidate will have 6+ years of experience in designing, building, and optimizing data pipelines and workflows using AWS cloud technologies. You should be proficient in SQL, PySpark, and Python, and have hands-on experience with AWS services such as Redshift, EMR, Airflow, CloudWatch, and S3.
Key Responsibilities:
· Design, develop, and maintain scalable ETL/ELT data pipelines using AWS cloud technologies.
· Work with PySpark and SQL to process large datasets efficiently.
· Manage AWS services such as Apache Airflow, Redshift, EMR, CloudWatch, and S3 for data processing and orchestration.
· Implement CI/CD pipelines using Azure DevOps for seamless deployment and automation.
· Monitor and optimize data workflows for performance, cost, and reliability.
· Utilize Jupyter Notebooks for data exploration and analysis.
· Collaborate with cross-functional teams, including data scientists, analysts, and software engineers, to ensure smooth data integration.
· Use Unix commands and Git for version control and automation.
· Ensure best practices for data governance, security, and compliance in cloud environments.
Required Skills:
Programming & Scripting:
· SQL, PySpark, Python
AWS Tools & Services:
· Apache Airflow (Workflow Orchestration)
· Redshift (Data Warehousing)
· EMR (Big Data Processing)
· CloudWatch (Monitoring & Logging)
· S3 (Data Storage)
· Jupyter Notebooks (Data Exploration)
DevOps & CI/CD:
· Azure DevOps (CI/CD Pipelines)
· Git (Version Control)
· Unix Commands (Shell Scripting)
Preferred Qualifications:
· Experience in performance tuning data pipelines and SQL queries.
· Knowledge of data lake and data warehouse architecture.
· Strong problem-solving skills with the ability to troubleshoot data pipeline issues.
· Understanding of data security, encryption, and access control in cloud environments.
Job Types: Full-time, Permanent
Pay: ₹1,800,000.00 - ₹2,000,000.00 per year
Work Location: In person
 
                Varishtha Infotech