
12859 ETL Jobs - Page 11

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 5.0 years

5 - 7 Lacs

Gurugram

Work from Office

The ability to be a team player; the ability and skill to train other people in procedural and technical topics; strong communication and collaboration skills.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise: able to write complex SQL queries; experience with Azure Databricks.
Preferred technical and professional experience: excellent communication and stakeholder management skills.
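The "complex SQL queries" such roles ask about typically mean things like window functions. A minimal, hedged sketch using Python's built-in sqlite3 (the table, columns, and data are invented for illustration):

```python
import sqlite3

# Hypothetical sales table; names and figures are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount INTEGER);
    INSERT INTO sales VALUES
        ('North', 100), ('North', 300), ('South', 200), ('South', 50);
""")

# Rank each sale within its region and keep the top sale per region,
# a typical window-function pattern asked about in interviews.
rows = conn.execute("""
    SELECT region, amount FROM (
        SELECT region, amount,
               ROW_NUMBER() OVER (PARTITION BY region ORDER BY amount DESC) AS rn
        FROM sales
    ) WHERE rn = 1
    ORDER BY region
""").fetchall()
print(rows)  # [('North', 300), ('South', 200)]
```

The same query shape carries over to Databricks SQL, where window functions are a routine interview topic.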

Posted 1 day ago

4.0 - 6.0 years

6 - 8 Lacs

Mumbai

Work from Office

As a Data Engineer at IBM, you'll play a vital role in the development and design of applications, providing regular support and guidance to project teams on complex coding, issue resolution, and execution. Your primary responsibilities include: leading the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements; striving for continuous improvement by testing the built solution and working under an agile framework; discovering and implementing the latest technology trends to build creative solutions.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise: experience with Apache Spark (PySpark), including in-depth knowledge of Spark's architecture, core APIs, and PySpark for distributed data processing. Big data technologies: familiarity with Hadoop, HDFS, Kafka, and other big data tools. Data engineering skills: strong understanding of ETL pipelines, data modeling, and data warehousing concepts. Strong proficiency in Python: expertise in Python programming with a focus on data processing and manipulation. Data processing frameworks: knowledge of data processing libraries such as Pandas and NumPy. SQL proficiency: experience writing optimized SQL queries for large-scale data analysis and transformation. Cloud platforms: experience working with cloud platforms like AWS, Azure, or GCP, including cloud storage systems.
Preferred technical and professional experience: define, drive, and implement an architecture strategy and standards for end-to-end monitoring. Partner with the rest of the technology teams, including application development, enterprise architecture, testing services, and network engineering. Good to have: experience with detection and prevention tools for company products, platform, and customer-facing systems.
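The ETL-pipeline requirement above can be pictured as a chain of extract, transform, and load stages. A minimal plain-Python sketch (record shapes and field names are invented, not from any specific IBM system):

```python
# A toy extract-transform-load chain built from generators, so each
# stage streams records instead of materializing intermediate lists.
def extract(raw_rows):
    # Extract: parse one raw CSV-ish line into a record.
    for line in raw_rows:
        name, amount = line.split(",")
        yield {"name": name.strip(), "amount": int(amount)}

def transform(records):
    # Transform: normalize names and drop non-positive amounts.
    for rec in records:
        if rec["amount"] > 0:
            yield {**rec, "name": rec["name"].title()}

def load(records):
    # Load: collect into a list here; a real pipeline would write
    # to a warehouse table or object store instead.
    return list(records)

raw = ["alice, 120", "bob, -5", "carol, 300"]
result = load(transform(extract(raw)))
print(result)  # [{'name': 'Alice', 'amount': 120}, {'name': 'Carol', 'amount': 300}]
```

The same shape scales up directly in PySpark, where each stage becomes a DataFrame transformation instead of a generator.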

Posted 1 day ago

3.0 - 5.0 years

5 - 7 Lacs

Bengaluru

Work from Office

As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques; designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements; working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours; building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.
Required education: Bachelor's Degree. Preferred education: High School Diploma/GED.
Required technical and professional expertise: good hands-on experience with DBT is required; DataStage ETL and Snowflake are preferred. Ability to use programming languages like Java, Python, Scala, etc. to build pipelines that extract and transform data from a repository to a data consumer. Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed. Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java.
Preferred technical and professional experience: you thrive on teamwork and have excellent verbal and written communication skills; ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions; ability to communicate results to technical and non-technical audiences.

Posted 1 day ago

3.0 - 5.0 years

5 - 7 Lacs

Pune

Work from Office

As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques; designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements; working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours; building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise: expertise in designing and implementing scalable data warehouse solutions on Snowflake, including schema design, performance tuning, and query optimization; strong experience building data ingestion and transformation pipelines using Talend to process structured and unstructured data from various sources; proficiency in integrating data from cloud platforms into Snowflake using Talend and native Snowflake capabilities; hands-on experience with dimensional and relational data modelling techniques to support analytics and reporting requirements.
Preferred technical and professional experience: understanding of optimizing Snowflake workloads, including clustering keys, caching strategies, and query profiling; ability to implement robust data validation, cleansing, and governance frameworks within ETL processes; proficiency in SQL and/or shell scripting for custom transformations and automation tasks.
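The dimensional-modelling skills listed above revolve around star schemas: fact tables joined to dimension tables. A toy sketch using Python's built-in sqlite3 (table and column names are invented for the example) shows the typical fact-to-dimension aggregate:

```python
import sqlite3

# A minimal star schema: one fact table keyed to one dimension table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales  (product_id INTEGER, amount INTEGER);
    INSERT INTO dim_product VALUES (1, 'Books'), (2, 'Games');
    INSERT INTO fact_sales  VALUES (1, 100), (1, 50), (2, 70);
""")

# Typical analytics query: aggregate facts grouped by a dimension attribute.
totals = conn.execute("""
    SELECT d.category, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_id)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
print(totals)  # [('Books', 150), ('Games', 70)]
```

In Snowflake the schema design is the same; the warehouse-specific work the posting mentions (clustering keys, caching, query profiling) sits on top of this pattern.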

Posted 1 day ago

4.0 - 8.0 years

6 - 10 Lacs

Hyderabad

Work from Office

As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques; designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements; working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours; building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise: experience in the integration efforts between Alation and Manta, ensuring seamless data flow and compatibility. Collaborate with cross-functional teams to gather requirements and design solutions that leverage both Alation and Manta platforms effectively. Develop and maintain data governance processes and standards within Alation, leveraging Manta's data lineage capabilities. Analyze data lineage and metadata to provide insights into data quality, compliance, and usage patterns.
Preferred technical and professional experience: lead the evaluation and implementation of new features and updates for both Alation and Manta platforms, ensuring alignment with organizational goals and objectives. Drive continuous improvement initiatives to enhance the efficiency and effectiveness of data management processes, leveraging Alation.

Posted 1 day ago

4.0 - 6.0 years

6 - 8 Lacs

Pune

Work from Office

As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools, and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact.
Responsibilities: manage end-to-end feature development and resolve challenges faced in implementing it; learn new technologies and apply them in feature development within the time frame provided; manage debugging, root cause analysis, and fixing of issues reported on the Content Management back-end software system.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise: Tableau Desktop & Server; SQL, Oracle & Hive; communication skills; project management; multitasking; collaborative skills. Proven experience developing Tableau-driven dashboards and analytics. Ability to query and display large data sets while maximizing workbook performance. Ability to interpret technical or dashboard structure and translate complex business requirements into technical specifications.
Preferred technical and professional experience: Tableau Desktop & Server; SQL, Oracle & Hive.

Posted 1 day ago

1.0 - 5.0 years

1 - 2 Lacs

Thane

Work from Office

About The Role: Periodic extraction and publishing of MIS/reports/dashboards/infographics and data dumps. Ensure data is provided to top management on time and verify its accuracy. Maintain timely submission and circulation of data to the stakeholders of the data mentioned below. Ensure data is provided for compliance, audits, RCSA, and related requirements. Sales management: ensure data and the eligible base are provided to the sales management unit to drive sales across the contact center. Timely and accurate execution of ad-hoc requests. Comply with all control and compliance guidelines on data sharing.
Requirements: Graduate with a minimum of 1 year of MIS experience. Good communication skills. Good hands-on knowledge of infographics, MS Excel, VBA (macros), and MS Access. Proficient in advanced Excel: HLOOKUP, VLOOKUP, power pivots. Basic analytics skills. Expertise in PowerPoint, Prezi, and other slide-making platforms. Comfortable with flexible shifts and work timings.
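The HLOOKUP/VLOOKUP proficiency asked for above has a direct analogue in most programming languages: a keyed lookup with a default for missing keys. A small illustrative sketch in Python (the tiers, rates, and orders are invented):

```python
# VLOOKUP maps a key to a value in a reference table; in Python the
# same idea is a dict lookup with .get() providing the fallback.
rate_table = {"Gold": 0.10, "Silver": 0.05}  # tier -> discount rate

orders = [("Gold", 200), ("Silver", 100), ("Bronze", 50)]

# Like VLOOKUP with IFERROR: unknown tiers fall back to a 0.0 rate.
discounts = [amount * rate_table.get(tier, 0.0) for tier, amount in orders]
print(discounts)  # [20.0, 5.0, 0.0]
```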

Posted 1 day ago

3.0 - 5.0 years

5 - 7 Lacs

Bengaluru

Work from Office

As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise: AWS Data Vault 2.0 development for agile data ingestion, storage, and scaling; Databricks for complex queries involving transformation, aggregation, and business logic implementation; AWS Redshift and Redshift Spectrum for complex queries involving transformation, aggregation, and business logic implementation; DWH concepts, including star schemas and materialized views; strong SQL and data manipulation/transformation skills.
Preferred technical and professional experience: robust and scalable cloud infrastructure; end-to-end data engineering pipelines; versatile programming capabilities.

Posted 1 day ago

8.0 years

0 Lacs

Greater Kolkata Area

On-site

Role: Technical Architect. Experience: 8-15 years. Location: Bangalore, Chennai, Gurgaon, Pune, and Kolkata. Mandatory skills: Python, PySpark, SQL, ETL, pipelines, Azure Databricks, Azure Data Factory, and architecture design.
Primary roles and responsibilities: developing modern data warehouse solutions using Databricks and the AWS/Azure stack; providing forward-thinking solutions in the data engineering and analytics space; collaborating with DW/BI leads to understand new ETL pipeline development requirements; triaging issues to find gaps in existing pipelines and fixing them; working with the business to understand reporting needs and developing data models to fulfill them; helping junior team members resolve issues and technical challenges; driving technical discussions with client architects and team members; orchestrating data pipelines in a scheduler via Airflow.
Skills and qualifications: Bachelor's and/or master's degree in computer science or equivalent experience. Must have 8+ years of total IT experience and 5+ years of experience in data warehouse/ETL projects. Deep understanding of Star and Snowflake dimensional modelling. Strong knowledge of data management principles. Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture. Hands-on experience with SQL, Python, and Spark (PySpark). Experience with the AWS/Azure stack. Desirable: ETL with batch and streaming (Kinesis). Experience building ETL/data warehouse transformation processes. Experience with Apache Kafka for streaming/event-based data. Experience with other open-source big data products, including Hadoop (Hive, Pig, Impala). Experience with open-source non-relational/NoSQL data repositories (including MongoDB, Cassandra, Neo4j). Experience working with structured and unstructured data, including imaging and geospatial data. Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git. Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting. Databricks Certified Data Engineer Associate/Professional certification (desirable). Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects. Experience working in Agile methodology. Strong verbal and written communication skills. Strong analytical and problem-solving skills with high attention to detail.

Posted 1 day ago

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Title: Power BI Developer with SQL. Location: Pune (fully onsite). Job Type: Full-time.
Job Description: We are looking for a skilled Power BI Developer with strong SQL skills to join our dynamic team. The ideal candidate will have experience in developing business intelligence (BI) solutions, creating data models, and building visually appealing dashboards and reports using Microsoft Power BI. You should also have a strong understanding of SQL and experience working with relational databases to extract, transform, and load data.
Key responsibilities: experience with visual design, creating BI dashboards and reports, and working with the Power BI tool; modifying existing dashboards (shifting and adding visualizations, buttons, cards, etc.); experience with bookmarks (updating and making adjustments); experience writing DAX queries, including using DAX for filtering and selected values; a strong SQL background is a must, with the ability to write complex queries; experience with Power Automate is a plus; experience with GCP (Google Cloud Platform) is good to have, including writing the commands needed to fetch data from GCP for reports and dashboards; data visualization skills; PL-300 certification is a must-have.
Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. 3+ years of experience as a Power BI Developer or in a similar role. Proficiency in Microsoft Power BI, including Power Query, Power Pivot, and DAX. Strong experience with SQL, including query writing, optimization, and database design. Experience with ETL processes, data warehousing concepts, and data modeling. Excellent analytical and problem-solving skills with keen attention to detail. Familiarity with other BI tools (e.g., Tableau, Qlik) is a plus. Strong communication and collaboration skills, with the ability to work effectively in a team environment. Ability to work in a fast-paced, deadline-driven environment and manage multiple projects simultaneously.
Preferred skills: experience with Azure SQL, Azure Data Factory, or other cloud-based data platforms; knowledge of programming languages such as Python or R for data analysis; familiarity with data governance and data security best practices; experience developing reports for mobile platforms using Power BI Mobile.
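As a rough illustration of the "complex queries" this role expects, here is a CTE-based share-of-total query run through Python's built-in sqlite3 (the schema and figures are invented for the example):

```python
import sqlite3

# Compute each region's share of total revenue using a CTE plus an
# aggregate, a common pattern behind BI report measures.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE revenue (region TEXT, amount REAL);
    INSERT INTO revenue VALUES ('East', 60.0), ('West', 40.0);
""")

shares = conn.execute("""
    WITH total AS (SELECT SUM(amount) AS t FROM revenue)
    SELECT region, ROUND(100.0 * amount / t, 1) AS pct
    FROM revenue, total
    ORDER BY region
""").fetchall()
print(shares)  # [('East', 60.0), ('West', 40.0)]
```

In Power BI itself the same measure would usually be a DAX expression over the model rather than SQL, but the underlying share-of-total logic is identical.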

Posted 1 day ago

5.0 years

0 Lacs

Ernakulam, Kerala, India

Remote

🔹 Position: Senior Data Analyst 📍 Location: Trivandrum / Kochi / Remote 🕒 Experience: 5+ Years ⏳ Notice Period: Immediate Joiners Only 🛠️ Mandatory Skills: SQL, Power BI, Python, Amazon Athena 🔎 Job Purpose We are seeking an experienced and analytical Senior Data Analyst to join our Data & Analytics team. The ideal candidate will have a strong background in data analysis, visualization, and stakeholder communication. You will be responsible for turning data into actionable insights that help shape strategic and operational decisions across the organization. 📝 Job Description / Duties & Responsibilities Collaborate with business stakeholders to understand data needs and translate them into analytical requirements. Analyze large datasets to uncover trends, patterns, and actionable insights. Design and build dashboards and reports using Power BI. Perform ad-hoc analysis and develop data-driven narratives to support decision-making. Ensure data accuracy, consistency, and integrity through data validation and quality checks. Build and maintain SQL queries, views, and data models for reporting purposes. Communicate findings clearly through presentations, visualizations, and written summaries. Partner with data engineers and architects to improve data pipelines and architecture. Contribute to the definition of KPIs, metrics, and data governance standards. 📌 Job Specification / Skills and Competencies Bachelor’s or Master’s degree in Statistics, Mathematics, Computer Science, Economics, or a related field. 5+ years of experience in a data analyst or business intelligence role. Advanced proficiency in SQL and experience working with relational databases (e.g. SQL Server, Redshift, Snowflake). Hands-on experience in Power BI. Proficiency in Python, Excel and data storytelling. Understanding of data modelling, ETL concepts, and basic data architecture. Strong analytical thinking and problem-solving skills. 
Excellent communication and stakeholder management skills. Adherence to Information Security Management policies and procedures.
📌 Soft Skills Required: Must be a good team player with good communication skills. Must have good presentation skills. Must be a proactive problem solver and a self-driven leader. Able to manage and nurture a team of data engineers.

Posted 1 day ago

8.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: R Analytics Lead. Experience: 8-10 years in analytics (SAS/SPSS/R/Python), with at least the last 4 years in a senior position focusing on R Studio, R Server, and similar. Location: Mumbai, India [full-time office hours]. Department: Business Analytics.
Company: Smartavya Analytica Private Limited is a niche Data and AI company. Based in Pune, we are pioneers in data-driven innovation, transforming enterprise data into strategic insights. Established in 2017, our team has experience handling large datasets of up to 20 PB in a single implementation, delivering many successful data and AI projects across major industries, including retail, finance, telecom, manufacturing, insurance, and capital markets. We are leaders in Big Data, Cloud, and Analytics projects, with super-specialisation in very large data platforms. https://smart-analytica.com
Empowering Your Digital Transformation with Data Modernization and AI
Job Summary: We are seeking a highly skilled R Analytics Lead to oversee our analytics team and drive data-driven decision-making processes. The ideal candidate will have extensive experience in R programming, data analysis, and statistical modelling, and will be responsible for leading analytics projects that provide actionable insights to support business objectives.
Responsibilities and duties: lead the development and implementation of advanced analytics solutions using R; manage and mentor a team of data analysts and data scientists; collaborate with cross-functional teams to identify business needs and translate them into analytical solutions; design and execute complex data analyses, including predictive modelling, machine learning, and statistical analysis; develop and maintain data pipelines and ETL processes; ensure the accuracy and integrity of data and analytical results.
Academic qualifications: Bachelor's or Master's degree in Statistics, Computer Science, Data Science, or a related field.
Skills: Extensive experience with R programming and related libraries (e.g., ggplot2, dplyr, caret). Strong background in statistical modelling, machine learning, and data visualization. Proven experience in leading and managing analytics teams. Excellent problem-solving skills and attention to detail. Strong communication skills, with the ability to present complex data insights to non-technical audiences. Experience with other analytics tools and programming languages (e.g., Python, SQL) is a plus.

Posted 1 day ago

8.0 - 12.0 years

10 - 14 Lacs

Pune

Work from Office

Job Summary: This position provides input and support for, and performs, full systems life cycle management activities (e.g., analyses, technical requirements, design, coding, testing, and implementation of systems and applications software). He/She participates in component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements. This position provides input to applications development project plans and integrations. He/She collaborates with teams and supports emerging technologies to ensure effective communication and achievement of objectives. This position provides knowledge and support for applications development, integration, and maintenance, and provides input to department and project teams on decisions supporting projects.
Responsibilities: full-stack development with Java, Oracle, and Angular; DevOps and Agile project management experience is a plus. Plans, develops, and manages the organization's information software, applications, systems, and networks. Application containerization (Kubernetes, Red Hat OpenShift). Experience with public cloud (e.g., Google, Azure). Performs systems analysis and design. Designs and develops moderate to highly complex applications. Develops application documentation. Produces integration builds. Performs maintenance and support. Supports emerging technologies and products. Ensures UPS's business needs are met through continual upgrades and development of new technical solutions.
Qualifications: 8-12 years of experience. Bachelor's Degree or international equivalent.

Posted 1 day ago

1.0 - 2.0 years

3 - 4 Lacs

Bengaluru

Work from Office

Headquartered in Noida, India, Paytm Insurance Broking Private Limited (PIBPL), a wholly owned subsidiary of One97 Communications (OCL), is an online insurance marketplace that offers insurance products across all leading insurance companies, with products across auto, life, and health insurance, and provides policy management and claim services for our customers.
Expectations:
1. Using automated tools to extract data from primary and secondary sources.
2. Removing corrupted data, and fixing coding errors and related problems.
3. Developing and maintaining databases and data systems; reorganizing data in a readable format.
4. Preparing reports for management stating trends, patterns, and predictions using relevant data.
5. Preparing final analysis reports for stakeholders to understand the data-analysis steps, enabling them to take important decisions based on various facts and trends.
6. Supporting the data warehouse in identifying and revising reporting requirements.
7. Setting up robust automated dashboards to drive performance management.
8. Deriving business insights from data with a focus on driving business-level metrics.
9. 1-2 years of experience in business analysis or a related field.
Superpowers/skills that will help you succeed in this role:
1. Problem solving: assess what data is required to prove hypotheses and derive actionable insights.
2. Analytical skills: top-notch Excel skills are necessary.
3. Strong communication and project management skills.
4. Hands-on with SQL, Hive, and Excel, and comfortable handling very large-scale data.
5. Ability to interact with and convince business stakeholders.
6. Experience working with web analytics platforms is an added advantage.
7. An experimentative mindset with attention to detail.
8. Proficiency in advanced SQL, MS Excel, and Python or R is a must.
9. Exceptional analytical and conceptual thinking skills.
10. The ability to influence stakeholders and work closely with them to determine acceptable solutions.
11. Advanced technical skills.
12. Excellent documentation skills.
13. Fundamental analytical and conceptual thinking skills.
14. Experience creating detailed reports and giving presentations.
15. Competency in Microsoft applications including Word, Excel, and Outlook.
16. A track record of following through on commitments.
17. Excellent planning, organizational, and time management skills.
18. Experience leading and developing top-performing teams.
19. A history of leading and supporting successful projects.
Preferred industry: Fintech/E-commerce/Data Analytics. Education: any graduate; a graduate from a premium institute is preferred.
Why join us:
1. We give immense opportunities to make a difference, and have a great time doing that.
2. You are challenged and encouraged here to do meaningful work for yourself and customers/clients.
3. We are successful, and our successes are rooted in our people's collective energy and unwavering focus on the customer, and that's how it will always be.
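The "reports stating trends, patterns, and predictions" mentioned above often start with simple growth metrics. A tiny illustrative computation in plain Python (the weekly figures are invented):

```python
# Week-over-week growth from a series of weekly totals: compare each
# week to the previous one and express the change as a percentage.
weekly_orders = [1000, 1100, 1210]

growth = [
    round((cur - prev) / prev * 100, 1)
    for prev, cur in zip(weekly_orders, weekly_orders[1:])
]
print(growth)  # [10.0, 10.0]
```

The same calculation is what an Excel column of `=(B2-B1)/B1` formulas or a SQL `LAG()` window query would produce.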

Posted 1 day ago

4.0 - 9.0 years

6 - 11 Lacs

Bengaluru

Work from Office

Hello Visionary! We empower our people to stay resilient and relevant in a constantly evolving world. We're looking for people who are always searching for creative ways to grow and learn, people who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you'd make a great addition to our vibrant team. We are looking for a Snowflake Engineer. Before our software developers write even a single line of code, they have to understand what drives our customers. What is the environment? What is the user story based on? Implementation means trying, testing, and improving outcomes until a final solution emerges. Knowledge means exchange: discussions with colleagues from all over the world. Join our Digitalization Technology and Services (DTS) team based in Bangalore.
You'll make a difference by being responsible for the development and delivery of parts of a product, in accordance with the customers' requirements and organizational quality norms. Activities to be performed include: communicating well within the team as well as with all stakeholders; maintaining strong customer focus and being a good learner, highly proactive, and a team player; implementing features and/or bug fixes and delivering solutions in accordance with coding guidelines, on time and with high quality; identifying and implementing a test strategy to ensure the solution addresses customer requirements and that the quality and security requirements of the product are met.
Job / skills: 4+ years' work experience in software engineering, especially in professional software product development. Strong knowledge of Snowflake, databases, and tools. Strong knowledge of data warehousing, data visualization, BI, ETL, and analytics. Strong knowledge of RDBMS, stored procedures, and triggers. Strong knowledge of DBT. Basic knowledge of Power BI. Knowledge of software engineering processes. Basic experience with Agile. Create a better #TomorrowWithUs!
This role is in Bangalore, where you'll get the chance to work with teams impacting entire cities, countries, and the craft of things to come. We're Siemens: a collection of over 312,000 minds building the future, one day at a time, in over 200 countries. All employment decisions at Siemens are based on qualifications, merit, and business need. Bring your curiosity and creativity and help us craft tomorrow. At Siemens, we are always challenging ourselves to build a better future. We need the most innovative and diverse digital minds to develop tomorrow's reality. Find out more about the digital world of Siemens here: www.siemens.com/careers/digitalminds

Posted 1 day ago

5.0 - 10.0 years

7 - 12 Lacs

Gurugram

Work from Office

Hello Visionary! We know that the only way a business thrives is if our people are growing. That's why we always put our people first. Our global, diverse team would be happy to support you and challenge you to grow in new ways. Who knows where our shared journey will take you? We are looking for a Senior Tableau Developer. We are seeking a highly skilled Senior Tableau Developer with a minimum of 5 years of relevant experience to join our team. The ideal candidate will be responsible for designing, developing, and maintaining Tableau dashboards and reports to support business decision-making processes.
You'll make a difference by: developing and maintaining Tableau dashboards and reports; collaborating with business stakeholders to gather requirements and translate them into effective visualizations; optimizing and enhancing existing Tableau solutions for better performance and usability; providing training and support to end-users on Tableau functionalities; ensuring data accuracy and integrity in all reports and dashboards; driving report requirements and specifications, providing feasibility analysis and effort estimation; managing Tableau report access, deployment, and development with best practices; providing daily/weekly project status updates to Project Managers; creating or updating necessary documentation as per project requirements; collaborating with the team and all stakeholders.
You'll win us over by: expertise in developing innovative and complex dashboards in Tableau; a strong understanding of data visualization principles; proficiency in SQL and data manipulation; excellent analytical and problem-solving skills; working knowledge of databases like Snowflake and Oracle; extensive experience writing complex database queries using SQL; demonstrated experience creating and presenting data, dashboards, and analysis to the management team, with the ability to explain complex analytical concepts.
Good to have: Tableau certification such as Desktop Specialist/Professional.
Working experience in Agile methodologies. Experience with other BI tools like Power BI. Strong skills in Microsoft Excel and PowerPoint. AWS know – how Experience in the Finance domain. Data security and handling expertise. Create a better #TomorrowWithUs! This role, based in Bangalore, is an individual contributor position. You may be required to visit other locations within India and internationally. In return, you'll have the opportunity to work with teams shaping the future. At Siemens, we are a collection of over 312,000 minds building the future, one day at a time, worldwide. We are dedicated to equality and welcome applications that reflect the diversity of the communities we serve. All employment decisions at Siemens are based on qualifications, merit, and business need. Bring your curiosity and imagination, and help us shape tomorrow Find out more about Siemens careers at www.siemens.com/careers

Posted 1 day ago

Apply

6.0 - 10.0 years

4 - 8 Lacs

Bengaluru

Hybrid

Role: PySpark Developer
Work Mode: Hybrid
Interview Mode: Virtual (2 Rounds)
Type: Contract-to-Hire (C2H)

Job Summary
We are looking for a skilled PySpark Developer with hands-on experience building scalable data pipelines and processing large datasets. The ideal candidate will have deep expertise in Apache Spark, Python, and modern data engineering tools in cloud environments such as AWS.

Key Skills & Responsibilities
  • Strong expertise in PySpark and Apache Spark for batch and real-time data processing.
  • Experience designing and implementing ETL pipelines, including data ingestion, transformation, and validation.
  • Proficiency in Python for scripting, automation, and building reusable components.
  • Hands-on experience with scheduling tools such as Airflow or Control-M to orchestrate workflows.
  • Familiarity with the AWS ecosystem, especially S3 and related file-system operations.
  • Strong understanding of Unix/Linux environments and shell scripting.
  • Experience with Hadoop, Hive, and platforms such as Cloudera or Hortonworks.
  • Ability to handle CDC (Change Data Capture) operations on large datasets.
  • Experience in performance tuning, Spark job optimization, and troubleshooting.
  • Strong knowledge of data modeling, data validation, and writing unit test cases.
  • Exposure to real-time and batch integration with downstream/upstream systems.
  • Working knowledge of Jupyter Notebook, Zeppelin, or PyCharm for development and debugging.
  • Understanding of Agile methodologies, with experience in CI/CD tools (e.g., Jenkins, Git).

Preferred Skills
  • Experience building or integrating APIs for data provisioning.
  • Exposure to ETL or reporting tools such as Informatica, Tableau, Jasper, or QlikView.
  • Familiarity with AI/ML model development using PySpark in cloud environments.

Skills: PySpark, Apache Spark, Python, SQL, ETL pipelines, CDC, Airflow, Control-M, AWS, S3, Unix/Linux, shell scripting, Hive, Hadoop, Cloudera, Hortonworks, data modeling, data validation, unit test cases, performance tuning, CI/CD, Jenkins, Git, Jupyter Notebook, Zeppelin, PyCharm, API integration, Informatica, Tableau, Jasper, QlikView, AI/ML model development, real-time and batch integration, Agile methodologies
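One responsibility listed above, handling CDC (Change Data Capture) on large datasets, amounts to classifying each source row as an insert, update, or delete relative to the target table. A minimal standard-library Python sketch of that comparison follows; the row shapes and key names are illustrative, and a real pipeline would express the same logic over Spark DataFrames rather than in-memory dicts:

```python
# Sketch of the CDC classification step of an ETL reconciliation.
# Rows are dicts keyed by a primary key; all names are illustrative.

def cdc_diff(source_rows, target_rows, key="id"):
    """Classify source rows as inserts, updates, or deletes vs. the target."""
    src = {r[key]: r for r in source_rows}
    tgt = {r[key]: r for r in target_rows}
    inserts = [src[k] for k in src.keys() - tgt.keys()]   # new in source
    deletes = [tgt[k] for k in tgt.keys() - src.keys()]   # gone from source
    updates = [src[k] for k in src.keys() & tgt.keys()    # present in both,
               if src[k] != tgt[k]]                       # but changed
    return inserts, updates, deletes

source = [{"id": 1, "name": "a"}, {"id": 2, "name": "b2"}, {"id": 4, "name": "d"}]
target = [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}, {"id": 3, "name": "c"}]
ins, upd, dele = cdc_diff(source, target)
print(len(ins), len(upd), len(dele))  # 1 1 1
```

The same three buckets drive the downstream load: inserts are appended, updates overwrite (or version) existing rows, and deletes are either applied or soft-flagged depending on the warehouse's retention policy.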

Posted 1 day ago

Apply

7.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Job Title: Senior Technical Delivery Manager – ETL, Data Warehouse and Analytics
Experience: 15+ years in IT delivery management, with at least 7 years in Big Data, Cloud, and Analytics, spanning ETL, Data Management, Data Visualization, and Project Management
Location: Mumbai, India
Department: Big Data and Cloud – Data Analytics Delivery

Company: Smartavya Analytica Private Limited is a niche Data and AI company. Based in Pune, we are pioneers in data-driven innovation, transforming enterprise data into strategic insights. Established in 2017, our team has handled datasets of up to 20 PB in a single implementation and delivered many successful data and AI projects across major industries, including retail, finance, telecom, manufacturing, insurance, and capital markets. We are leaders in Big Data, Cloud, and Analytics projects, with a specialization in very large data platforms. https://smart-analytica.com

Empowering Your Digital Transformation with Data Modernization and AI

Job Overview: Smartavya Analytica Private Limited is seeking an experienced Senior Delivery Manager to oversee and drive the successful delivery of large-scale Big Data, Cloud, and Analytics projects. The ideal candidate will have a strong background in IT delivery management, excellent leadership skills, and a proven record of managing complex projects from initiation to completion, with the right blend of client engagement, project delivery, and data management skills.

Key Responsibilities:
• Technical Project Management:
  o Lead the end-to-end technical delivery of multiple projects in Big Data, Cloud, and Analytics; lead teams in technical solutioning, design, and development.
  o Develop detailed project plans, timelines, and budgets, ensuring alignment with client expectations and business goals.
  o Monitor project progress, manage risks, and implement corrective actions as needed to ensure timely, quality delivery.
• Client Engagement and Stakeholder Management:
  o Build and maintain strong client relationships, acting as the primary point of contact for project delivery.
  o Understand client requirements, anticipate challenges, and provide proactive solutions.
  o Coordinate with internal and external stakeholders to ensure seamless project execution.
  o Communicate project status, risks, and issues to senior management and stakeholders clearly and in a timely manner.
• Team Leadership:
  o Lead and mentor a team of data engineers, analysts, and project managers.
  o Ensure effective resource allocation and utilization across projects.
  o Foster a culture of collaboration, continuous improvement, and innovation within the team.
• Technical and Delivery Excellence:
  o Leverage data management expertise and experience to guide and lead technical conversations effectively; identify areas where the team needs technical support and work to resolve them, either directly or by networking with internal and external stakeholders to unblock the team.
  o Implement best practices in project management, delivery, and quality assurance.
  o Drive continuous improvement initiatives to enhance delivery efficiency and client satisfaction.
  o Stay updated on the latest trends and advancements in Big Data, Cloud, and Analytics technologies.

Requirements:
• Experience in IT delivery management, particularly in Big Data, Cloud, and Analytics.
• Strong knowledge of project management methodologies and tools (e.g., Agile, Scrum, PMP).
• Excellent leadership, communication, and stakeholder management skills.
• Proven ability to manage large, complex projects with multiple stakeholders.
• Strong critical-thinking skills and the ability to make decisions under pressure.

Academic Qualifications:
• Bachelor’s degree in Computer Science, Information Technology, or a related field.
• Relevant certifications in Big Data, cloud platforms (e.g., GCP, Azure, AWS, Snowflake, Databricks), project management, or similar areas are preferred.

If you have a passion for leading high-impact projects and delivering exceptional results, we encourage you to apply and be a part of our innovative team at Smartavya Analytica Private Limited.

Posted 1 day ago

Apply

0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Hello, good day! We are currently looking for an individual with a minimum of 10 years of hands-on experience in the skill set below. Kindly reach out if you possess these skills; the required profile is shared for your reference.

Responsibilities / Expectations from the Role:
  • Develop data models and patterns for data platform solutions.
  • Conceptualize and design a centralized data lake solution that brings together data from disparate sources.
  • Implement security on the data lake solution.
  • Present agreed solutions to the business to get buy-in on the strategy and roadmap.

Technical:
  • Experience in data-intensive activities (data modeling, ER diagrams, ETL, data mapping, data governance, and security).
  • Experience in end-to-end implementation of a large data lake or data warehouse solution.
  • Experience with Azure Data Lake and Unity Catalog.
  • Strong knowledge of data governance.
  • Hands-on experience with ETL tools (ADF) and Spark pools.
  • Strong knowledge of and experience in data modeling.
  • Strong knowledge of data security, encryption, and monitoring.
  • Knowledge of DevOps.
  • Analytical and problem-solving skills, with a high degree of initiative and the flexibility to be available over extended hours.
  • Ability to communicate with business SMEs.
  • Understanding of design patterns and data models.
  • Experience with the onsite–offshore delivery model.
  • Technical background, ideally within the managed services and IT outsourcing industry.
  • Certification in data platform solutions.

Behavioral:
  • Create detailed requirement analyses and provide solutions.
  • Strong analytical thinking and problem-solving skills.
  • Strong communication, presentation, and writing skills.
  • Proactive, with a can-do attitude.

Good to Have:
  • Experience with cloud tools and technologies such as ADO, ServiceNow, Azure Monitoring, Performance Management, and analytics.
  • Understanding of Agile processes.
  • A broad and current knowledge of the cloud platform industry.

Posted 1 day ago

Apply

3.0 - 5.0 years

5 - 8 Lacs

Noida

Work from Office

Primary Skills
  • SQL (advanced level)
  • SSAS (SQL Server Analysis Services), multidimensional and/or tabular models
  • MDX / DAX (strong querying capabilities)
  • Data modeling (star schema, snowflake schema)

Secondary Skills
  • ETL processes (SSIS or similar tools)
  • Power BI / reporting tools
  • Azure Data Services (optional but a plus)

Role & Responsibilities
  • Design, develop, and deploy SSAS models (both tabular and multidimensional).
  • Write and optimize MDX/DAX queries for complex business logic.
  • Work closely with business analysts and stakeholders to translate requirements into robust data models.
  • Design and implement ETL pipelines for data integration.
  • Build reporting datasets and support BI teams in developing insightful dashboards (Power BI preferred).
  • Optimize existing cubes and data models for performance and scalability.
  • Ensure data quality, consistency, and governance standards.

Top Skill Set
  • SSAS (tabular + multidimensional modeling)
  • Strong MDX and/or DAX query writing
  • Advanced SQL for data extraction and transformation
  • Data modeling concepts (fact/dimension tables, slowly changing dimensions, etc.)
  • ETL tools (SSIS preferred)
  • Power BI or similar BI tools
  • Understanding of OLAP and OLTP concepts
  • Performance tuning (SSAS/SQL)

Skills: SQL (advanced), SSAS (multidimensional and tabular models), MDX, DAX, data modeling (star schema, snowflake schema), ETL processes (SSIS or similar tools), Power BI / reporting tools, Azure Data Services, data visualization, analytical skills, collaboration, communication

Posted 1 day ago

Apply

8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At Nielsen, we are passionate about our work to power a better media future for all people by providing powerful insights that drive client decisions and deliver extraordinary results. Our talented, global workforce is dedicated to capturing audience engagement with content - wherever and whenever it’s consumed. Together, we are proudly rooted in our deep legacy as we stand at the forefront of the media revolution. When you join Nielsen, you will join a dynamic team committed to excellence, perseverance, and the ambition to make an impact together. We champion you, because when you succeed, we do too. We enable your best to power our future.

This role is part of a team that develops software to process data captured every day from over a quarter of a million computer and mobile devices worldwide, measuring panelists' activities as they surf the Internet via browsers or use mobile apps downloaded from the Apple and Google stores. The Nielsen software meter used to capture this usage data has been optimized to be unobtrusive, yet gathers many biometric data points that the backend system can use to identify who is using the device and to detect fraudulent behavior.

As an Engineering Manager, you will lead a cross-functional team of developers and DevOps engineers using a Scrum/Agile approach. You will provide technical expertise and guidance to team members, help develop designs for complex applications, and plan tasks and project phases as well as review, comment on, and approve the analysis, proposed designs, and test strategies produced by members of the team.

Responsibilities
  • Oversee the development of scalable, reliable, and cost-effective software solutions, with an emphasis on quality and best-practice coding standards.
  • Help drive business unit financials and ensure budgets and schedules meet corporate requirements.
  • Participate in corporate development of methods, techniques, and evaluation criteria for projects, programs, and people.
  • Take overall control of planning, staffing, budgeting, and managing expense priorities for the team you lead.
  • Provide training and coaching, and share technical knowledge with less experienced staff.
  • Carry out people-manager duties, including annual reviews, career guidance, and compensation planning.
  • Rapidly identify technical issues as they emerge and assess their impact on the business.
  • Provide day-to-day work direction to a large team of developers.
  • Collaborate effectively with Data Science to understand, translate, and integrate data methodologies into the product.
  • Collaborate with product owners to translate complex business requirements into technical solutions, providing leadership in the design and architecture processes.
  • Stay informed about the latest technologies and methodologies by participating in industry forums, maintaining an active peer network, and engaging actively with customers.
  • Cultivate a team environment focused on continuous learning, where innovative technologies are developed and refined through collaborative effort.

Key Skills
  • Bachelor's degree in computer science, engineering, or a relevant field.
  • 8+ years of experience in information technology solutions development and 2+ years of managerial experience.
  • Proven experience leading and managing software development teams.
  • Development background in Java.
  • Experience with AWS cloud-based environments for high-volume data processing.
  • Experience with data warehouses, ETL, and/or data lakes.
  • Experience with databases such as Postgres, DynamoDB, or Redshift.
  • Good understanding of CI/CD principles and tools; GitLab a plus.
  • Ability to provide solutions using best practices for resilience, scalability, cloud optimization, and security.
  • Excellent project management skills.

Other desirable skills
  • Knowledge of networking principles and security best practices.
  • AWS certification is a plus.
  • Experience with MS Project or Smartsheet.
  • Experience with Airflow, Python, Lambda, Prometheus, Grafana, and OpsGenie is a bonus.
  • Exposure to the Google Cloud Platform (GCP) is useful.

Please be aware that job seekers may be targeted by scammers seeking personal data or money. Nielsen recruiters will only contact you through official job boards, LinkedIn, or email with a nielsen.com domain. Be cautious of any outreach claiming to be from Nielsen via other messaging platforms or personal email addresses. Always verify that email communications come from an @nielsen.com address. If you're unsure about the authenticity of a job offer or communication, please contact Nielsen directly through our official website or verified social media channels.

Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, protected veteran status, or other characteristics protected by law.

Posted 1 day ago

Apply

15.0 - 20.0 years

17 - 22 Lacs

Chennai

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, load) processes to migrate and deploy data across systems.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum experience required: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable enough to meet the demands of the organization.

Roles & Responsibilities:
  • Expected to be an SME.
  • Collaborate with and manage the team to perform.
  • Responsible for team decisions.
  • Engage with multiple teams and contribute to key decisions.
  • Provide solutions to problems for the immediate team and across multiple teams.
  • Mentor junior team members to enhance their data engineering skills and knowledge.
  • Continuously evaluate and improve data processes to enhance efficiency and effectiveness.

Professional & Technical Skills:
  • Must-have: Proficiency in the Databricks Unified Data Analytics Platform.
  • Experience with data integration and ETL tools.
  • Strong understanding of data modeling and database design principles.
  • Familiarity with cloud platforms and services related to data storage and processing.
  • Knowledge of programming languages such as Python or Scala for data manipulation.

Additional Information:
  • The candidate should have a minimum of 7.5 years of experience with the Databricks Unified Data Analytics Platform.
  • This position is based in Chennai.
  • 15 years of full-time education is required.

Posted 1 day ago

Apply


15.0 - 20.0 years

17 - 22 Lacs

Hyderabad

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, load) processes to migrate and deploy data across systems.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum experience required: 5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting and optimizing existing data workflows to enhance performance and reliability.

Roles & Responsibilities:
  • Expected to be an SME.
  • Collaborate with and manage the team to perform.
  • Responsible for team decisions.
  • Engage with multiple teams and contribute to key decisions.
  • Provide solutions to problems for the immediate team and across multiple teams.
  • Mentor junior team members to enhance their data engineering skills and knowledge.
  • Continuously evaluate and improve data processes to ensure efficiency and effectiveness.

Professional & Technical Skills:
  • Must-have: Proficiency in the Databricks Unified Data Analytics Platform.
  • Strong understanding of data pipeline architecture and design.
  • Experience with ETL processes and data integration techniques.
  • Familiarity with data quality frameworks and best practices.
  • Knowledge of cloud platforms and services related to data storage and processing.

Additional Information:
  • The candidate should have a minimum of 5 years of experience with the Databricks Unified Data Analytics Platform.
  • This position is based in Hyderabad.
  • 15 years of full-time education is required.

Posted 1 day ago

Apply


Exploring ETL Jobs in India

The ETL (Extract, Transform, Load) job market in India is thriving with numerous opportunities for job seekers. ETL professionals play a crucial role in managing and analyzing data effectively for organizations across various industries. If you are considering a career in ETL, this article will provide you with valuable insights into the job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Chennai

These cities are known for their thriving tech industries and often have a high demand for ETL professionals.

Average Salary Range

The average salary range for ETL professionals in India varies based on experience levels. Entry-level positions typically start at around ₹3-5 lakhs per annum, while experienced professionals can earn upwards of ₹10-15 lakhs per annum.

Career Path

In the ETL field, a typical career path may include roles such as:

  1. Junior ETL Developer
  2. ETL Developer
  3. Senior ETL Developer
  4. ETL Tech Lead
  5. ETL Architect

As you gain experience and expertise, you can progress to higher-level roles within the ETL domain.

Related Skills

Alongside ETL, professionals in this field are often expected to have skills in:

  • SQL
  • Data Warehousing
  • Data Modeling
  • ETL Tools (e.g., Informatica, Talend)
  • Database Management Systems (e.g., Oracle, SQL Server)

Having a strong foundation in these related skills can enhance your capabilities as an ETL professional.
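These skills come together in the ETL cycle itself: extract raw records from a source, transform and validate them, then load them into a database for analysis. The following is a minimal, self-contained sketch using only the Python standard library; the table, columns, and sample data are invented for illustration, and production pipelines would use a dedicated ETL tool or framework:

```python
import csv
import io
import sqlite3

# Extract: read rows from a CSV source (an in-memory file stands in for a real feed)
raw = io.StringIO("order_id,amount\n1,100.5\n2,not_a_number\n3,75\n")
rows = list(csv.DictReader(raw))

# Transform: cast fields to proper types, dropping rows that fail validation
clean = []
for r in rows:
    try:
        clean.append((int(r["order_id"]), float(r["amount"])))
    except ValueError:
        pass  # a real pipeline would route bad rows to a reject/quarantine table

# Load: insert the validated rows into a SQL table and verify the result
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?)", clean)
total = con.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
print(total)  # (2, 175.5) — the malformed row was rejected during transform
```

Even at this toy scale, the three stages map directly onto the interview topics below: validation in the transform step is "data quality," and the row counts checked after the load are a basic form of reconciliation.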

Interview Questions

Here are 25 interview questions that you may encounter in ETL job interviews:

  • What is ETL and why is it important? (basic)
  • Explain the difference between ETL and ELT processes. (medium)
  • How do you handle incremental loads in ETL processes? (medium)
  • What is a surrogate key in the context of ETL? (basic)
  • Can you explain the concept of data profiling in ETL? (medium)
  • How do you handle data quality issues in ETL processes? (medium)
  • What are some common ETL tools you have worked with? (basic)
  • Explain the difference between a full load and an incremental load. (basic)
  • How do you optimize ETL processes for performance? (medium)
  • Can you describe a challenging ETL project you worked on and how you overcame obstacles? (advanced)
  • What is the significance of data cleansing in ETL? (basic)
  • How do you ensure data security and compliance in ETL processes? (medium)
  • Have you worked with real-time data integration in ETL? If so, how did you approach it? (advanced)
  • What are the key components of an ETL architecture? (basic)
  • How do you handle data transformation requirements in ETL processes? (medium)
  • What are some best practices for ETL development? (medium)
  • Can you explain the concept of change data capture in ETL? (medium)
  • How do you troubleshoot ETL job failures? (medium)
  • What role does metadata play in ETL processes? (basic)
  • How do you handle complex transformations in ETL processes? (medium)
  • What is the importance of data lineage in ETL? (basic)
  • Have you worked with parallel processing in ETL? If so, explain your experience. (advanced)
  • How do you ensure data consistency across different ETL jobs? (medium)
  • Can you explain the concept of slowly changing dimensions in ETL? (medium)
  • How do you document ETL processes for knowledge sharing and future reference? (basic)
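Several of the questions above (slowly changing dimensions, incremental loads, CDC) probe the same core idea: preserving history rather than overwriting it. As a concrete illustration, here is a standard-library Python sketch of Type 2 slowly-changing-dimension logic, where a changed attribute closes the current row and opens a new versioned one. Field names are illustrative, not tied to any specific tool:

```python
# Type 2 SCD sketch: a dimension row is never updated in place; instead the
# current row is end-dated and a new current row is appended.

def apply_scd2(dim_rows, incoming, key, attrs, load_date):
    """Apply one incoming record to a dimension table, Type 2 style."""
    for row in dim_rows:
        if row[key] == incoming[key] and row["is_current"]:
            if all(row[a] == incoming[a] for a in attrs):
                return dim_rows          # no attribute changed: nothing to do
            row["is_current"] = False    # close the existing current version
            row["end_date"] = load_date
    dim_rows.append({key: incoming[key],
                     **{a: incoming[a] for a in attrs},
                     "start_date": load_date, "end_date": None,
                     "is_current": True})
    return dim_rows

dim = [{"cust_id": 7, "city": "Pune", "start_date": "2023-01-01",
        "end_date": None, "is_current": True}]
dim = apply_scd2(dim, {"cust_id": 7, "city": "Mumbai"},
                 key="cust_id", attrs=["city"], load_date="2024-06-01")
print(len(dim))  # 2: the closed historical row plus the new current row
```

In a warehouse this would be a `MERGE`/upsert over fact keys and effective dates; the in-memory version just makes the versioning rule easy to state in an interview.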

Closing Remarks

As you explore ETL jobs in India, remember to showcase your skills and expertise confidently during interviews. With the right preparation and a solid understanding of ETL concepts, you can embark on a rewarding career in this dynamic field. Good luck with your job search!
