
36 DWH Jobs - Page 2

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3 - 5 years

40 - 45 Lacs

Bhubaneshwar, Kochi, Kolkata

Work from Office


We are seeking experienced Data Engineers with over 3 years of experience to join our team at Intuit, through Cognizant. The selected candidates will be responsible for developing and maintaining scalable data pipelines, managing data warehousing solutions, and working in advanced cloud environments. The role requires strong technical proficiency and the ability to work onsite in Bangalore.

Key Responsibilities:
- Design, build, and maintain data pipelines to ingest, process, and analyze large datasets using PySpark.
- Work on Data Warehouse and Data Lake solutions to manage structured and unstructured data.
- Develop and optimize complex SQL queries for data extraction and reporting.
- Leverage AWS cloud services such as S3, EC2, EMR, Athena, and Redshift for data storage, processing, and analytics.
- Collaborate with cross-functional teams to ensure the successful delivery of data solutions that meet business needs.
- Monitor data pipelines and troubleshoot issues related to data integrity or system performance.

Required Skills:
- 3+ years of experience in data engineering or related fields.
- In-depth knowledge of Data Warehouses and Data Lakes.
- Proven experience in building data pipelines using PySpark.
- Strong expertise in SQL for data manipulation and extraction.
- Familiarity with AWS cloud services, including S3, EC2, EMR, Athena, and Redshift, and with other cloud computing platforms.

Preferred Skills:
- Python programming experience is a plus.
- Experience working in Agile environments with tools like JIRA and GitHub.
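The ingest-transform-load pattern described in the responsibilities above can be sketched in miniature. This is purely illustrative: the function names and the sample CSV are hypothetical, and a production pipeline of the kind the listing describes would use PySpark DataFrames on EMR, with S3 for storage and Redshift or Athena for serving.

```python
# Minimal sketch of an ingest -> transform -> load pipeline (hypothetical
# data and names; PySpark on EMR would be used at the scale described).
import csv
import io

RAW_EVENTS = """user_id,amount
1,10.50
2,abc
1,4.25
"""

def ingest(raw: str) -> list[dict]:
    """Parse raw CSV into records (an S3 read in a real pipeline)."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(records: list[dict]) -> dict[str, float]:
    """Clean malformed rows and aggregate spend per user."""
    totals: dict[str, float] = {}
    for rec in records:
        try:
            amount = float(rec["amount"])
        except ValueError:
            continue  # drop bad rows instead of failing the whole job
        totals[rec["user_id"]] = totals.get(rec["user_id"], 0.0) + amount
    return totals

def load(totals: dict[str, float]) -> list[tuple[str, float]]:
    """Emit sorted rows for the warehouse table (a Redshift COPY in practice)."""
    return sorted(totals.items())

rows = load(transform(ingest(RAW_EVENTS)))
print(rows)  # [('1', 14.75)] -- the malformed row for user 2 is dropped
```

The same three stages map directly onto PySpark's `read`, DataFrame transformations, and `write` steps; only the scale changes.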

Posted 2 months ago

Apply

7 - 12 years

20 - 35 Lacs

Hyderabad

Hybrid


Job Description: We are looking for a highly skilled Lead Data Engineer with 8-12 years of experience to lead a team in building and managing advanced data solutions. The ideal candidate should have extensive experience with SQL, Teradata, Ab Initio, and Google Cloud Platform (GCP).

Key Responsibilities:
- Lead the design, development, and optimization of large-scale data pipelines, ensuring they meet business and technical requirements.
- Architect and implement data solutions using SQL, Teradata, Ab Initio, and GCP, ensuring scalability, reliability, and performance.
- Mentor and guide a team of data engineers in the development and execution of ETL processes and data integration solutions.
- Collaborate with cross-functional teams (e.g., data scientists, analysts, product managers) to define data strategies and deliver end-to-end data solutions.
- Take ownership of end-to-end data workflows, from data ingestion to transformation, storage, and accessibility.
- Lead performance tuning and optimization efforts for complex SQL queries and Teradata database systems.
- Design and implement data governance, quality, and security best practices to ensure data integrity and compliance.
- Manage the migration of legacy data systems to cloud-based solutions on Google Cloud Platform (GCP).
- Ensure continuous improvement and automation of data pipelines and workflows.
- Troubleshoot and resolve issues related to data quality, pipeline performance, and system integration.
- Stay up to date with industry trends and emerging technologies to drive innovation and improve data engineering practices within the team.

Required Skills:
- 8-12 years of experience in data engineering or related roles.
- Strong expertise in SQL, Teradata, and Ab Initio.
- In-depth experience with Google Cloud Platform (GCP), including tools like BigQuery, Cloud Storage, Dataflow, etc.
- Proven track record of leading teams and projects related to data engineering and ETL pipeline development.
- Experience with data warehousing and cloud-native storage solutions.
- Strong analytical and problem-solving skills.
- Experience in setting up and enforcing data governance, security, and compliance standards.

Preferred Skills:
- Familiarity with additional cloud services (AWS, Azure).
- Experience with data modeling and metadata management.
- Knowledge of big data technologies like Hadoop, Spark, etc.
- Strong communication skills and the ability to collaborate effectively with both technical and non-technical teams.

Posted 2 months ago

Apply

15 - 17 years

17 - 19 Lacs

Pune

Work from Office


Position Overview
We are seeking a dynamic and experienced Enterprise Solution Architect to lead the design and implementation of innovative solutions that align with our organization's strategic objectives. The Enterprise Solution Architect will play a key role in defining the architecture vision, establishing technical standards, and driving the adoption of best practices across the enterprise. The ideal candidate will have a deep understanding of enterprise architecture principles, business processes, and technology trends, with a focus on delivering scalable, flexible, and secure solutions.

Responsibilities:
- Drive client conversations and solutions and build strong relationships with the client, acting as a trusted advisor and technical expert.
- Lay down the architectural roadmap, guidelines, and high-level design covering the end-to-end lifecycle of the data value chain: ingestion, integration, consumption (visualization, AI capabilities), data governance, and non-functionals (including data security).
- Deliver large-scale data platform implementations for Telecom clients; Telecom domain understanding is a must.
- Implement data applications and platforms on GCP.
- Execute a comprehensive data migration strategy for our telecom client, involving multiple source systems moving to GCP.
- Deep dive into client requirements to understand their data needs and challenges; proactively propose solutions that leverage GCP's capabilities or integrate with external tools for optimal results.
- Spearhead solution calls with the client, translating complex data architecture and engineering concepts into clear, actionable plans for data engineers; demonstrate flexibility and adaptability to accommodate evolving needs.
- Develop a robust data model for the telecom client, ensuring data is organized, consistent, and readily available for analysis.
- Leverage your expertise in Data, AI, and ML to create a future-proof blueprint for the client's data landscape, enabling advanced analytics and insights generation.
- Develop architectural principles, standards, and guidelines to ensure consistency, interoperability, and scalability across systems and applications.
- Lead the design and implementation of end-to-end solutions that leverage emerging technologies and industry best practices to address business challenges and opportunities.
- Conduct architectural reviews and assessments to validate design decisions, identify risks, and recommend mitigation strategies.
- Collaborate with vendors, partners, and external consultants to evaluate and select technology solutions that meet business requirements and align with enterprise architecture standards.
- Drive the adoption of cloud computing, microservices architecture, API management, and other emerging technologies to enable digital transformation and innovation.
- Communicate the enterprise architecture vision, principles, and roadmap to stakeholders at all levels of the organization, and advocate for architectural decisions and investments.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Total experience of 18+ years in data analytics implementations.
- Minimum 10+ years of extensive experience as a Principal Solution Architect or in a similar senior role.
- Proven success in leading large-scale data migrations, particularly to GCP.
- In-depth knowledge of data architecture principles and best practices.
- Strong understanding of data modeling techniques and the ability to create efficient data models.
- Experience working with GCP and its various data management services (e.g., BigQuery, Cloud Storage, Dataflow, dbt).
- Experience with at least one programming language commonly used in data processing (e.g., Python, Java).
- A demonstrable understanding of Data Science, Artificial Intelligence, and Machine Learning concepts.

Posted 3 months ago

Apply

5 - 10 years

20 - 25 Lacs

Bengaluru

Work from Office


Job Description: AWS Data Engineer - Hadoop Migration

We are seeking an experienced AWS Principal Data Architect to lead the migration of Hadoop DWH workloads from on-premise to AWS EMR. As an AWS Data Architect, you will be a recognized expert in cloud data engineering, developing solutions for the data processing and warehousing requirements of large enterprises. You will be responsible for designing, implementing, and optimizing the data architecture in AWS, ensuring highly scalable, flexible, secure, and resilient cloud architectures that solve business problems and help accelerate the adoption of our clients' data initiatives on the cloud.

Key Responsibilities:
- Lead the migration of Hadoop workloads from on-premise to the AWS EMR stack.
- Design and implement data architectures on AWS, including data pipelines, storage, and security.
- Collaborate with cross-functional teams to ensure seamless migration and integration.
- Optimize data architectures for scalability, performance, and cost-effectiveness.
- Develop and maintain technical documentation and standards.
- Provide technical leadership and mentorship to junior team members.
- Work closely with stakeholders to understand business requirements and ensure data architectures meet business needs.
- Work alongside customers to build enterprise data platforms using AWS data services such as Elastic MapReduce (EMR), Redshift, Kinesis, Data Exchange, DataSync, RDS, Data Store, Amazon MSK, DMS, Glue, AppFlow, AWS Zero-ETL, Glue Data Catalog, Athena, Lake Formation, S3, RMS, DataZone, Amazon MWAA, and APIs (Kong).
- Bring a deep understanding of Hadoop components, conceptual processes, and system functioning, and of the corresponding components in AWS EMR and other AWS services.
- Good experience with Spark on EMR; experience with Snowflake/Redshift.
- Good grasp of the AWS system engineering aspects of setting up CI/CD pipelines on AWS using CloudWatch, CloudTrail, KMS, IAM Identity Center, Secrets Manager, etc.
- Extract best-practice knowledge, reference architectures, and patterns from these engagements for sharing with the worldwide AWS solution architect community.

Basic Qualifications:
- 10+ years of IT experience, with 5+ years in Data Engineering and 5+ years of hands-on experience with AWS Data/EMR services (e.g., S3, Glue, Glue Data Catalog, Lake Formation).
- Strong understanding of Hadoop architecture, including HDFS, YARN, MapReduce, Hive, and HBase.
- Experience with data migration tools like Glue and DataSync.
- Excellent knowledge of data modeling, data warehousing, ETL processes, and other data management systems.
- Strong understanding of security and compliance requirements in the cloud.
- Experience with Agile development methodologies and version control systems.
- Excellent communication and leadership skills.
- Ability to work effectively across internal and external organizations and virtual teams.
- Deep experience with AWS native data services, including Glue, Glue Data Catalog, EMR, Spark on EMR, DataSync, RDS, Data Exchange, Lake Formation, and Athena.
- AWS Certified Data Analytics - Specialty; AWS Certified Solutions Architect - Professional.
- Experience with containerization and serverless computing.
- Familiarity with DevOps practices and automation tools.
- Experience with Snowflake/Redshift implementation is additionally preferred.

Preferred Qualifications:
- Technical degree in computer science, software engineering, or mathematics.
- Cloud and data engineering background with migration experience.

Other Skills:
- A critical thinker with strong research, analytics, and problem-solving skills.
- Self-motivated with a positive attitude and an ability to work independently or in a team.
- Able to work under tight timelines and deliver on complex problems.
- Must be able to work flexible hours (including weekends and nights) as needed.
- A strong team player.

Posted 3 months ago

Apply

2 - 7 years

6 - 16 Lacs

Bengaluru

Work from Office


Hi,

Greetings from Sun Technology Integrators! This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find the job description below for your reference. Kindly let me know your interest and share your updated CV to nandinis@suntechnologies.com ASAP.

Shift timings: 2:00 PM-11:00 PM (free cab drop facility + food).

Please let me know if any of your friends are looking for a job change, and kindly share references.

Please note: WFO - Work From Office (no hybrid or Work From Home).

Mandatory skills: Snowflake, SQL, ETL, Data Ingestion, Data Modeling, Data Warehouse, AWS S3, EC2
Preferred skills: Any ETL tools

Venue Details:
Sun Technology Integrators Pvt Ltd
No. 496, 4th Block, 1st Stage
HBR Layout (a stop ahead of Nagawara, towards K. R. Puram)
Bangalore 560043
Company URL: www.suntechnologies.com

Thanks and Regards,
Nandini S | Sr. Technical Recruiter
Sun Technology Integrators Pvt. Ltd.
nandinis@suntechnologies.com
www.suntechnologies.com

Posted 3 months ago

Apply

5 - 7 years

22 - 25 Lacs

Chennai, Bengaluru, Hyderabad

Hybrid


Greetings from InfoVision!

We at InfoVision are looking to fill the position of Data Engineer, with a main skill-set focus on Data Pipelines, Azure Databricks, PySpark/Python, Azure DevOps, DWH, and Azure Data Lake Storage Gen2.

Company profile: InfoVision, founded in 1995, is a leading global IT services company offering enterprise digital transformation and modernization solutions across business verticals. We partner with our clients in driving innovation, rethinking workflows, and transforming experiences so businesses can stay ahead in a rapidly changing world. We help shape a bold new era of technology-led disruption, accelerating digital with quality, agility, and integrity. We have helped more than 75 global leaders across the Telecom, Retail, Banking, Healthcare, and Technology industries deliver excellence for their customers. InfoVision's global presence enables us to offer offshore, nearshore, and onshore solutions for our customers. With our world-class infrastructure for employees and people-centric policies, InfoVision is one of the highest-rated digital services companies in Glassdoor ratings. We encourage our employees to thrive, and we are committed to providing a work environment that fosters an entrepreneurial mindset, nurtures inclusivity, values integrity, and accelerates your career by creating opportunities for promising growth.

Designation: Data Engineer
Experience Required: 5-7 years
Job Location: Hyderabad, Chennai, Coimbatore, Pune, Bangalore
The opportunity is full-time, with a hybrid work model.

As a Data Engineer in our team, you will be responsible for assessing complex new data sources and quickly turning these into business insights. You will also support the implementation and integration of these new data sources into our Azure Data platform.

Responsibilities:
- Review and analyze structured, semi-structured, and unstructured data sources for quality, completeness, and business value.
- Design, architect, implement, and test rapid prototypes that demonstrate the value of the data, and present them to diverse audiences.
- Participate in early-stage design and feature definition activities.
- Implement robust data pipelines using the Microsoft/Databricks stack.
- Create reusable and scalable data pipelines.
- Collaborate as a team player with members across multiple engineering teams to support the integration of proven prototypes into core intelligence products.
- Use strong communication skills to effectively convey complex data insights to non-technical stakeholders.
- Work collaboratively in cross-functional teams and manage multiple projects simultaneously.

Skills:
- Advanced working knowledge of and experience with relational and non-relational databases.
- Experience building and optimizing Big Data pipelines, architectures, and datasets.
- Strong analytic skills related to working with structured and unstructured datasets.
- Hands-on experience in Azure Databricks, utilizing Spark to develop ETL pipelines.
- Strong proficiency in data analysis, manipulation, and statistical modeling using tools like Spark, Python, Scala, SQL, or similar languages.
- Strong experience in Azure Data Lake Storage Gen2, Azure Data Factory, Databricks, Event Hub, and Azure Synapse.
- Familiarity with several of the following technologies: Event Hub, Docker, Azure Kubernetes Service, Azure DWH, Azure APIs, Azure Functions, Power BI, Azure Cognitive Services.
- Azure DevOps experience to deploy data pipelines through CI/CD.

Qualifications and Experience:
- Minimum 5-7 years of practical experience as a Data Engineer.
- Bachelor's degree in computer science, software engineering, information technology, or a related field.
- In-production experience with the Azure cloud stack.

You can share your updated resume to Bojja.Chandu@Infovision.com along with the details below:
Full Name / Current Company / Payroll Company / Experience / Rel. Exp. / Current Location / Preferred Location / CTC / ECTC / Notice Period / Holding offers?

You can also connect with me on LinkedIn: https://www.linkedin.com/in/chandu-b-a48b2a142/

Regards,
Chandu.B
InfoVision, Senior Executive - Talent Acquisition
Bojja.Chandu@Infovision.com

Posted 3 months ago

Apply

6 - 10 years

8 - 12 Lacs

Mumbai

Work from Office


Working hours: 2 PM to 11 PM. Hybrid is a must: 2 days a week in office from the day of joining. Work location: Mumbai, Pune, Gurgaon, Bangalore.

Must have:
- ETL Testing QA, hands-on: writing, executing, and reviewing tests, end to end.
- Strong SQL knowledge; any cloud (AWS, Azure, GCP).
- Strong communication skills; client-facing role.
- Experience in DWH.
- Ready to work as required; flexible with work timings.

Good to have:
- Knowledge of any automation tool.
- API testing.
- Agile.
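The kind of hands-on, end-to-end check this ETL Testing QA role asks for can be illustrated with a minimal source-vs-target reconciliation. This is a hypothetical sketch: sqlite3 stands in for the actual cloud warehouse (Redshift, Synapse, or BigQuery), and the table names and columns are invented for the example.

```python
# Hypothetical ETL reconciliation test: compare row counts and a column
# aggregate between a source table and its warehouse target.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (id INTEGER, amount REAL);
    CREATE TABLE dwh_orders (id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 100.0), (2, 250.0), (3, 75.0);
    INSERT INTO dwh_orders VALUES (1, 100.0), (2, 250.0), (3, 75.0);
""")

def reconcile(conn: sqlite3.Connection, src: str, tgt: str) -> dict:
    """Check COUNT(*) and SUM(amount) agree between source and target."""
    q = "SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {}"
    src_count, src_sum = conn.execute(q.format(src)).fetchone()
    tgt_count, tgt_sum = conn.execute(q.format(tgt)).fetchone()
    return {
        "count_match": src_count == tgt_count,
        "sum_match": abs(src_sum - tgt_sum) < 1e-6,  # float-safe compare
    }

result = reconcile(conn, "src_orders", "dwh_orders")
print(result)  # {'count_match': True, 'sum_match': True}
```

The same two queries, written against the real source and target systems and wrapped in a test runner's assertions, form the core of an end-to-end ETL validation suite.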

Posted 3 months ago

Apply

6 - 10 years

8 - 12 Lacs

Gurgaon

Work from Office


Working hours: 2 PM to 11 PM. Hybrid is a must: 2 days a week in office from the day of joining. Work location: Mumbai, Pune, Gurgaon, Bangalore.

Must have:
- ETL Testing QA, hands-on: writing, executing, and reviewing tests, end to end.
- Strong SQL knowledge; any cloud (AWS, Azure, GCP).
- Strong communication skills; client-facing role.
- Experience in DWH.
- Ready to work as required; flexible with work timings.

Good to have:
- Knowledge of any automation tool.
- API testing.
- Agile.

Posted 3 months ago

Apply

6 - 10 years

8 - 12 Lacs

Pune

Work from Office


Working hours: 2 PM to 11 PM. Hybrid is a must: 2 days a week in office from the day of joining. Work location: Mumbai, Pune, Gurgaon, Bangalore.

Must have:
- ETL Testing QA, hands-on: writing, executing, and reviewing tests, end to end.
- Strong SQL knowledge; any cloud (AWS, Azure, GCP).
- Strong communication skills; client-facing role.
- Experience in DWH.
- Ready to work as required; flexible with work timings.

Good to have:
- Knowledge of any automation tool.
- API testing.
- Agile.

Posted 3 months ago

Apply

6 - 10 years

8 - 12 Lacs

Bengaluru

Hybrid


Working hours: 2 PM to 11 PM. Hybrid is a must: 2 days a week in office from the day of joining.

Must have:
- ETL Testing QA, hands-on: writing, executing, and reviewing tests, end to end.
- Strong SQL knowledge; any cloud (AWS, Azure, GCP).
- Strong communication skills; client-facing role.
- Experience in DWH.
- Ready to work as required; flexible with work timings.

Good to have:
- Knowledge of any automation tool.
- API testing.
- Agile.

Posted 3 months ago

Apply

8 - 13 years

9 - 14 Lacs

Hyderabad

Work from Office


Skills:
- Python knowledge, with PySpark, Pandas, and Python objects.
- Knowledge of Google Cloud Platform; GCP Certified Professional Data Engineer.
- Google Cloud: GCP Cloud Storage, Dataproc, BigQuery.
- SQL: strong SQL and advanced SQL.
- Spark: PySpark writing skills.
- DWH: data warehousing concepts and dimension modeling.
- Git.
- Team leading experience.
- Apache Spark.
- Scheduling tools / ETL tools.

Roles & Responsibilities:
- Provide technical solutions.
- Team management: assign tasks to the team; help and guide team members with technical queries.
- Excellent troubleshooting, attention to detail, and communication skills in a fast-paced setting.
- Good understanding of technical requirements and coordination with business stakeholders.
- Design and build the enterprise data warehouse and data marts; deploy in the cloud (GCP).
- Perform descriptive analytics and reporting.
- Perform peer reviews of code, design documents, and test cases.
- Support systems currently live and deployed for customers.
- Build the knowledge repository and cloud capabilities.
- Good understanding of writing Python code.
- Understanding of Agile: user story creation.
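The dimension-modeling and descriptive-analytics skills listed above come down to querying a star schema: a fact table of measurements joined to dimension tables of attributes. A minimal, hypothetical sketch (sqlite3 stands in for BigQuery, and the product/sales schema is invented for the example):

```python
# Hypothetical star-schema query: a fact table joined to a dimension,
# aggregated by a dimension attribute (the core of descriptive analytics).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales  (product_key INTEGER, revenue REAL);
    INSERT INTO dim_product VALUES (1, 'books'), (2, 'games');
    INSERT INTO fact_sales  VALUES (1, 20.0), (2, 35.0), (1, 5.0);
""")

# Revenue by category: join fact rows to their dimension attribute.
rows = conn.execute("""
    SELECT d.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_product d USING (product_key)
    GROUP BY d.category
    ORDER BY d.category
""").fetchall()
print(rows)  # [('books', 25.0), ('games', 35.0)]
```

In BigQuery the SQL is essentially identical; the dimensional design (narrow keys in the fact table, descriptive attributes in dimensions) is what keeps such aggregations simple and fast.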

Posted 3 months ago

Apply