A consulting firm focused on delivering strategic insights and solutions across various sectors, including government and private enterprises.
Not specified
INR 20.0 - 35.0 Lacs P.A.
Work from Office
Full Time
Role: Datastage Tech Lead
Location: Pune (Hybrid) / Remote
Experience: 12 to 14 years
Primary Skills: Datastage, ADF, Azure, SQL, ETL

Skills & Qualifications
• 10+ years of experience as a Data Engineer or in a similar role.
• Proven expertise in an ETL tool for data pipeline development and data integration.
• Strong understanding of data warehousing principles and experience with cloud-based data warehouse solutions.
• Experience with data quality tools and techniques for data cleansing and validation.
• Proficiency in SQL scripting for data manipulation and querying; experience with cloud-based SQL solutions such as Azure SQL Database or Oracle is a plus (see the sketch after this listing).
• Familiarity with cloud platforms such as Azure for data storage and processing is a plus.
• Excellent analytical and problem-solving skills.
• Effective communication and collaboration skills.
• Ability to work independently and manage multiple tasks effectively.
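As a rough, illustrative sketch of the SQL validation skill listed above (not part of the posting itself), the snippet below runs a simple row-count and null-key check against a hypothetical Azure SQL staging table via pyodbc; the server, table, and column names are invented placeholders.

```python
# Illustrative only: a minimal data-validation query against a cloud SQL source.
# Connection details, table, and column names are placeholders, not from the posting.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=example-server.database.windows.net;"   # hypothetical Azure SQL server
    "DATABASE=staging_db;UID=etl_user;PWD=***"
)

VALIDATION_SQL = """
    SELECT COUNT(*) AS total_rows,
           SUM(CASE WHEN customer_id IS NULL THEN 1 ELSE 0 END) AS null_keys
    FROM dbo.stg_customers            -- hypothetical staging table
    WHERE load_date = CAST(GETDATE() AS DATE)
"""

def validate_daily_load() -> None:
    """Fail fast if today's load is empty or contains null business keys."""
    with pyodbc.connect(CONN_STR) as conn:
        total_rows, null_keys = conn.cursor().execute(VALIDATION_SQL).fetchone()
    if total_rows == 0 or null_keys > 0:
        raise ValueError(f"Load check failed: rows={total_rows}, null keys={null_keys}")

if __name__ == "__main__":
    validate_daily_load()
```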
Not specified
INR 20.0 - 25.0 Lacs P.A.
Work from Office
Full Time
Role & Responsibilities
• 8+ years of experience as a Snowflake DBA/Admin.
• Skills: Snowflake DBA, AWS, SQL, ETL concepts, Airflow or any orchestration tool, and data warehousing concepts (see the sketch after this listing).
• Proficiency in scripting languages (e.g., Python, Shell) and automation tools (e.g., Ansible, Terraform).
• Experience in database performance tuning, security management, and automation.
• Ability to diagnose and troubleshoot database issues effectively.
• Excellent problem-solving and communication skills.

Contact: Soniya, soniya05.mississippiconsultants@gmail.com
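As an illustrative sketch only (not from the posting) of how an orchestration tool such as Airflow might schedule a routine Snowflake admin check, the example below assumes Airflow 2.4+ and the snowflake-connector-python package; the account, credentials, warehouse, and threshold are invented placeholders.

```python
# Illustrative only: a minimal Airflow DAG running a Snowflake health check.
# Account, credentials, and object names are placeholders, not from the posting.
from datetime import datetime

import snowflake.connector
from airflow import DAG
from airflow.operators.python import PythonOperator


def check_long_running_queries() -> None:
    """Flag queries running longer than 30 minutes (hypothetical threshold)."""
    conn = snowflake.connector.connect(
        account="xy12345",          # placeholder account locator
        user="dba_monitor",
        password="***",
        warehouse="ADMIN_WH",
        database="ADMIN_DB",
    )
    try:
        rows = conn.cursor().execute(
            """
            SELECT query_id, total_elapsed_time / 60000 AS minutes
            FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY())
            WHERE execution_status = 'RUNNING'
              AND total_elapsed_time > 30 * 60 * 1000
            """
        ).fetchall()
        if rows:
            print(f"{len(rows)} long-running queries found: {rows}")
    finally:
        conn.close()


with DAG(
    dag_id="snowflake_health_check",
    start_date=datetime(2024, 1, 1),
    schedule="*/30 * * * *",   # every 30 minutes
    catchup=False,
) as dag:
    PythonOperator(
        task_id="check_long_running_queries",
        python_callable=check_long_running_queries,
    )
```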
Not specified
INR 10.0 - 16.0 Lacs P.A.
Work from Office
Full Time
Role & Responsibilities

Key Responsibilities:
• Design, develop, and maintain data pipelines and ETL processes using Microsoft Azure services (e.g., Azure Data Factory, Azure Synapse, Azure Databricks, Azure Fabric).
• Utilize Azure data storage accounts for organizing and maintaining data pipeline outputs (e.g., Azure Data Lake Storage Gen2 and Azure Blob Storage).
• Collaborate with data scientists, data analysts, data architects, and other stakeholders to understand data requirements and deliver high-quality data solutions.
• Optimize data pipelines in the Azure environment for performance, scalability, and reliability.
• Ensure data quality and integrity through data validation techniques and frameworks.
• Develop and maintain documentation for data processes, configurations, and best practices.
• Monitor and troubleshoot data pipeline issues to ensure timely resolution.
• Stay current with industry trends and emerging technologies to ensure our data solutions remain cutting-edge.
• Manage the CI/CD process for deploying and maintaining data solutions.

Qualifications:
• Learning agility
• Technical leadership
• Consulting and managing business needs
• Strong experience in Python is preferred, but experience in other languages such as Scala, Java, C#, etc. is accepted.
• Experience building Spark applications using PySpark (see the sketch after this listing).
• Experience with file formats such as Parquet, Delta, and Avro.
• Experience efficiently querying API endpoints as a data source.
• Understanding of the Azure environment and related services such as subscriptions, resource groups, etc.
• Understanding of Git workflows in software development.
• Using Azure DevOps pipelines and repositories to deploy and maintain solutions.
• Understanding of Ansible and how to use it in Azure DevOps pipelines.

Interview: Face to face
Work location: Client location, Bellandur
5 days work from office

Contact: Soniya, soniya05.mississippiconsultants@gmail.com
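As a rough sketch of the PySpark and Parquet/Delta items above (not part of the posting), the snippet below reads Parquet from a hypothetical ADLS Gen2 container, applies a basic cleansing step, and writes a partitioned Delta table; the storage account, container, and column names are invented, and Delta support is assumed to be configured on the Spark session (as it is on Azure Databricks or Synapse Spark pools).

```python
# Illustrative only: a minimal PySpark job reading Parquet from ADLS Gen2 and writing Delta.
# Storage account, container, paths, and columns are placeholders, not from the posting.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_load").getOrCreate()

# abfss://<container>@<storage-account>.dfs.core.windows.net/<path> is the ADLS Gen2 URI form;
# the names below are invented.
SOURCE = "abfss://raw@examplelake.dfs.core.windows.net/sales/orders/"
TARGET = "abfss://curated@examplelake.dfs.core.windows.net/sales/orders_clean/"

orders = (
    spark.read.parquet(SOURCE)
    .dropDuplicates(["order_id"])          # basic data-quality step
    .filter(F.col("order_amount") > 0)
    .withColumn("load_date", F.current_date())
)

(
    orders.write.format("delta")
    .mode("overwrite")
    .partitionBy("load_date")
    .save(TARGET)
)
```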
Not specified
INR 12.0 - 19.0 Lacs P.A.
Work from Office
Full Time
Role: Data Engineer
Work mode: 5 days work from office
Location: Bengaluru (Bellandur)
Notice period: Immediate joiners

Job Description:
Utilizes software engineering principles to deploy and maintain fully automated data transformation pipelines that combine a large variety of storage and computation technologies to handle a distribution of data types and volumes in support of data architecture design. A Data Engineer designs data products and data pipelines that are resilient to change, modular, flexible, scalable, reusable, and cost effective.

Key Responsibilities:
• Design, develop, and maintain data pipelines and ETL processes using Microsoft Azure services (e.g., Azure Data Factory, Azure Synapse, Azure Databricks, Azure Fabric).
• Utilize Azure data storage accounts for organizing and maintaining data pipeline outputs (e.g., Azure Data Lake Storage Gen2 and Azure Blob Storage).
• Collaborate with data scientists, data analysts, data architects, and other stakeholders to understand data requirements and deliver high-quality data solutions.
• Optimize data pipelines in the Azure environment for performance, scalability, and reliability.
• Ensure data quality and integrity through data validation techniques and frameworks.
• Develop and maintain documentation for data processes, configurations, and best practices.
• Monitor and troubleshoot data pipeline issues to ensure timely resolution.
• Stay current with industry trends and emerging technologies to ensure our data solutions remain cutting-edge.
• Manage the CI/CD process for deploying and maintaining data solutions.

Qualifications:
• Learning agility
• Technical leadership
• Consulting and managing business needs
• Strong experience in Python is preferred, but experience in other languages such as Scala, Java, C#, etc. is accepted.
• Experience building Spark applications using PySpark.
• Experience with file formats such as Parquet, Delta, and Avro.
• Experience efficiently querying API endpoints as a data source (see the sketch after this listing).
• Understanding of the Azure environment and related services such as subscriptions, resource groups, etc.
• Understanding of Git workflows in software development.
• Using Azure DevOps pipelines and repositories to deploy and maintain solutions.
• Understanding of Ansible and how to use it in Azure DevOps pipelines.
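As a minimal illustration of the "querying API endpoints as a data source" item above (not from the posting), the snippet below pulls records from a hypothetical paginated REST endpoint; the URL, parameters, and response fields are placeholders.

```python
# Illustrative only: pulling a paginated REST endpoint into records for a pipeline.
# The URL, page parameters, and field names are invented placeholders.
import requests

BASE_URL = "https://api.example.com/v1/orders"   # hypothetical endpoint
PAGE_SIZE = 500


def fetch_all_orders(token: str) -> list[dict]:
    """Iterate page by page until the endpoint returns an empty batch."""
    session = requests.Session()
    session.headers.update({"Authorization": f"Bearer {token}"})

    records: list[dict] = []
    page = 1
    while True:
        resp = session.get(
            BASE_URL,
            params={"page": page, "page_size": PAGE_SIZE},
            timeout=30,
        )
        resp.raise_for_status()            # surface HTTP errors to the orchestrator
        batch = resp.json().get("results", [])
        if not batch:
            break
        records.extend(batch)
        page += 1
    return records
```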
Not specified
INR 20.0 - 27.5 Lacs P.A.
Work from Office
Full Time
Role: Lead Data Engineer
Work mode: 5 days work from office
Location: Bengaluru (Bellandur)
Notice period: Immediate joiners

Job Description:
A Lead Data Engineer utilizes software engineering principles to deploy and maintain fully automated data transformation pipelines that combine a large variety of storage and computation technologies to handle a distribution of data types and volumes in support of data architecture design. The Lead Data Engineer designs and oversees the entire data infrastructure, data products, and data pipelines, making them resilient to change, modular, flexible, scalable, reusable, and cost effective.

Key Responsibilities:
• Design and oversee the entire data architecture strategy.
• Mentor junior data architects to ensure skill development in alignment with the team strategy.
• Design and implement complex, scalable, high-performance data architectures that meet business requirements.
• Model data for optimal reuse, interoperability, security, and accessibility.
• Develop and maintain data flow diagrams and data dictionaries.
• Collaborate with stakeholders to understand data needs and translate them into technical solutions.
• Ensure data accessibility through a performant, cost-effective consumption layer that supports use by citizen developers, data scientists, AI, and application integration.
• Ensure data quality, integrity, and security across all data systems.

Qualifications:
• Experience in Erwin, Azure Synapse, Azure Databricks, Azure DevOps, SQL, Power BI, Spark, Python, and R.
• Ability to drive business results by building cost-optimal data landscapes.
• Familiarity with Azure AI/ML services; Azure analytics services such as Event Hub and Azure Stream Analytics; scripting with Ansible.
• Experience with machine learning and advanced analytics.
• Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes).
• Understanding of CI/CD pipelines and automated testing frameworks (see the sketch after this listing).
• Certifications such as AWS Certified Solutions Architect, IBM Certified Data Architect, or similar are a plus.
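As one rough illustration of the automated-testing item above (not from the posting), the snippet below shows a pytest-style unit test for a small pandas transformation, the kind of check a CI pipeline such as Azure DevOps might run on each commit; the function and column names are invented for the example.

```python
# Illustrative only: a pytest-style unit test for a small transformation step.
# The transformation function and column names are hypothetical.
import pandas as pd


def deduplicate_customers(df: pd.DataFrame) -> pd.DataFrame:
    """Keep the latest record per customer_id, ordered by updated_at."""
    return (
        df.sort_values("updated_at")
        .drop_duplicates(subset="customer_id", keep="last")
        .reset_index(drop=True)
    )


def test_deduplicate_keeps_latest_record():
    raw = pd.DataFrame(
        {
            "customer_id": [1, 1, 2],
            "updated_at": pd.to_datetime(["2024-01-01", "2024-02-01", "2024-01-15"]),
            "city": ["Pune", "Bengaluru", "Chennai"],
        }
    )
    clean = deduplicate_customers(raw)

    assert len(clean) == 2
    assert clean.loc[clean["customer_id"] == 1, "city"].item() == "Bengaluru"
```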
1. Are certifications needed?
A. Certifications in cloud or data-related fields are often preferred.
2. Do they offer internships?
A. Yes, internships are available for students and recent graduates.
3. Do they support remote work?
A. Yes, hybrid and remote roles are offered depending on the project.
4. How can I get a job there?
A. Apply via the careers portal, attend campus drives, or use referrals.
5. How many rounds are there in the interview?
A. Usually 2 to 3 rounds including technical and HR.
6. What is the interview process?
A. It typically includes aptitude, technical, and HR rounds.
7. What is the work culture like?
A. The company promotes flexibility, innovation, and collaboration.
8. What is their average salary for freshers?
A. Freshers typically earn between 3.5 and 6 LPA, depending on the role.
9. What kind of projects do they handle?
A. They handle digital transformation, consulting, and IT services.
10. What technologies do they work with?
A. They work with cloud, AI, cybersecurity, and digital solutions.