
10,314 ETL Jobs - Page 12

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

6.0 - 11.0 years

13 - 22 Lacs

Hyderabad, Bengaluru

Work from Office

Source: Naukri

AWS Glue - mandatory. AWS S3 and AWS Lambda - should have some experience. Must have used Snowpipe to build integration pipelines and know how to build stored procedures from scratch. Able to write complex SQL queries. Python - NumPy and Pandas.
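As a rough illustration of the Snowpipe and stored-procedure skills this listing names, here is a minimal sketch using the snowflake-connector-python package. All credentials, stages, and object names (raw_orders_pipe, s3_orders_stage, merge_orders) are hypothetical placeholders, not details from the posting.

```python
# Hypothetical sketch: define a Snowpipe and a stored procedure from Python.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder credentials
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="RAW_DB",
    schema="PUBLIC",
)
cur = conn.cursor()

# A pipe that auto-ingests files landed in an S3 external stage.
cur.execute("""
    CREATE PIPE IF NOT EXISTS raw_orders_pipe AUTO_INGEST = TRUE AS
    COPY INTO raw_orders
    FROM @s3_orders_stage
    FILE_FORMAT = (TYPE = 'JSON')
""")

# A stored procedure built from scratch in Snowflake Scripting.
cur.execute("""
    CREATE OR REPLACE PROCEDURE merge_orders()
    RETURNS STRING
    LANGUAGE SQL
    AS
    $$
    BEGIN
        MERGE INTO orders t USING raw_orders s ON t.order_id = s.order_id
        WHEN MATCHED THEN UPDATE SET t.status = s.status
        WHEN NOT MATCHED THEN INSERT (order_id, status) VALUES (s.order_id, s.status);
        RETURN 'ok';
    END;
    $$
""")
cur.close()
conn.close()
```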

Posted 1 day ago

Apply

4.0 - 8.0 years

20 - 25 Lacs

Hyderabad

Work from Office

Source: Naukri

Role & responsibilities:
- Build automated pipelines and solutions for data migration, data import, and other operations requiring data ETL.
- Perform analysis on core products to support migration planning and development.
- Work closely with the Team Lead and collaborate with other stakeholders to gather requirements and build well-architected data solutions.
- Produce supporting documentation, such as specifications, data models, and relationships between data, required for the effective development, usage, and communication of the data operations solutions with different stakeholders.

Competencies, characteristics, and traits:
Mandatory skills - total experience of 5 years, of which a minimum of 3 years of SnapLogic pipeline development and a minimum of 2 years building ETL/ELT pipelines.
- Experience working with databases in on-premises and/or cloud-based environments such as MSSQL, MySQL, PostgreSQL, Azure SQL, Aurora MySQL & PostgreSQL, AWS RDS, etc.
- Experience working with API sources and destinations
- Strong problem-solving and analytical skills, with high attention to detail
- Passion for analytics, real-time data, and monitoring
- Critical thinking, good communication, and collaboration skills
- Self-starter and quick learner, ready to pick up new technologies and tools as the job demands

Preferred candidate profile:
Essential:
- Strong experience working with databases in on-premises and/or cloud-based environments (as above)
- Strong knowledge of databases, data modeling, and the data life cycle
- Proficient in understanding data and writing complex SQL
- Experience working with REST APIs in data pipelines
- Focus on high performance and quality delivery
- Highly self-motivated and a continuous learner
Desirable:
- Experience working with NoSQL databases like MongoDB
- Experience with SnapLogic administration is preferable
- Experience working with Microsoft Power Platform (Power Automate and Power Apps) or a similar automation/RPA tool
- Experience with cloud data platforms like Snowflake, Databricks, AWS, Azure, etc.
- Awareness of emerging ETL and cloud concepts such as Amazon AWS or Microsoft Azure
- Experience working with scripting languages such as Python, R, JavaScript, etc.
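The listing centres on SnapLogic, a low-code tool, but the underlying REST-to-database ETL pattern it describes can be sketched in plain Python. The endpoint URL, staging table, and connection string below are assumptions for illustration only.

```python
# Illustrative REST-API-to-database ETL pattern (not SnapLogic itself).
import pandas as pd
import requests
from sqlalchemy import create_engine

def extract(url: str) -> pd.DataFrame:
    """Pull records from a paginated REST API source."""
    rows, page = [], 1
    while True:
        resp = requests.get(url, params={"page": page}, timeout=30)
        resp.raise_for_status()
        batch = resp.json()
        if not batch:
            break
        rows.extend(batch)
        page += 1
    return pd.DataFrame(rows)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Light cleansing before load."""
    df = df.drop_duplicates(subset="id")
    df["loaded_at"] = pd.Timestamp.now(tz="UTC")
    return df

def load(df: pd.DataFrame, conn_str: str) -> None:
    engine = create_engine(conn_str)  # e.g. PostgreSQL, MSSQL, Aurora
    df.to_sql("customers_staging", engine, if_exists="append", index=False)

if __name__ == "__main__":
    load(transform(extract("https://api.example.com/customers")),
         "postgresql://etl_user:***@db-host:5432/warehouse")
```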

Posted 1 day ago

Apply

0 years

0 Lacs

New Delhi, Delhi, India

On-site

Source: LinkedIn

The purpose of this role is to work with the business using Adobe Analytics, Google Analytics, WebTrends, and other technologies. This role serves as a subject matter expert on tag management and audience platforms, and provides guidance and oversight to other web audience resources. The role also provides direction and guidance on the integration of marketing technologies and tools.

Job Description:
Mandatory (top 5):
- Experience leading Customer Data Platform (CDP) related development
- Experience in Python, JavaScript, or Node.js
- Experience with APIs (REST, Open APIs, cURL)
- Data analysis, ingestion, modelling, and mapping
- Experience with unstructured data using JSON/Parquet file formats

Preferred (top 5):
- Understanding of CCPA, GDPR, and other data protection acts
- Client-facing experience
- Experience with reporting technologies
- Experience with AEP, ActionIQ, Lytics, Segment, Tealium, or C360
- Big data ETLs; leading the team in the implementation of technical solutions and driving proofs of concept

Responsibilities:
- Build the data model based on the gathered requirements and data architecture.
- Ingest data into the CDP platform via batch and real-time streaming modes, using ETL, APIs, JavaScript, etc.
- Handle data extraction/outbound dataflows to reporting tools, other Adobe products, or third-party systems.
- Work on defined business rules, transformations, and BRDs.
- Collaborate with internal teams to develop CDP solutions for marketing activation use cases.
- Work on queries, segment development, audience creation/activation, and customer journey orchestration; define segments and orchestrate customer journeys based on the BRD.

Location: Bangalore
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
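To make the batch-ingestion and JSON/Parquet requirements concrete, here is a minimal sketch of mapping raw events onto a CDP-style profile model with pandas. The file paths and field names (user_id, traits, gdpr_consent) are assumptions, not details from the posting.

```python
# Minimal sketch: batch ingestion of unstructured JSON into a profile model.
import pandas as pd

# Source: newline-delimited JSON events from a web analytics export.
events = pd.read_json("web_events.jsonl", lines=True)

# Map raw fields onto the agreed profile data model (per the BRD).
profiles = pd.DataFrame({
    "customer_id": events["user_id"],
    "email":       events["traits"].str.get("email"),
    "last_seen":   pd.to_datetime(events["timestamp"]),
    "consent":     events["context"].str.get("gdpr_consent"),  # CCPA/GDPR flag
})

# Deduplicate to one row per customer, keeping the most recent event.
profiles = (profiles.sort_values("last_seen")
                    .drop_duplicates("customer_id", keep="last"))

# Land as Parquet for downstream segmentation/activation jobs.
profiles.to_parquet("profiles.parquet", index=False)
```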

Posted 1 day ago

Apply

3.0 - 8.0 years

6 - 18 Lacs

Hyderabad

Work from Office

Source: Naukri

Mandatory skills for Data Engineer: Python/PySpark, AWS Glue, Lambda, Redshift, SQL. Expert knowledge in AWS Data Lake implementation and support (S3, Glue, DMS, Athena, Lambda, API Gateway, Redshift).
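A skeletal AWS Glue job (PySpark) of the kind this role describes might look like the sketch below: read from the Glue Data Catalog, transform, and write to the lake. The database, table, and S3 paths are placeholders.

```python
# Hedged sketch of an AWS Glue PySpark job; names are hypothetical.
import sys
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Extract from the data lake via the catalog.
orders = glue_context.create_dynamic_frame.from_catalog(
    database="datalake_db", table_name="raw_orders")

# Transform with plain Spark DataFrame semantics.
df = orders.toDF().filter("order_status = 'COMPLETE'").dropDuplicates(["order_id"])

# Load to a curated S3 prefix (a Redshift COPY or JDBC sink would follow the same shape).
df.write.mode("overwrite").parquet("s3://my-datalake/curated/orders/")

job.commit()
```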

Posted 1 day ago

Apply

5.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Source: LinkedIn

Position Overview: We are seeking a highly skilled and experienced Senior Microsoft SQL Database Developer to join our team. The ideal candidate will be responsible for designing, developing, and maintaining complex Microsoft SQL databases, ensuring their efficiency, security, and scalability. This position requires deep expertise in Microsoft SQL development, performance tuning, database design, and optimization, along with a solid understanding of database architecture and troubleshooting. The Senior Microsoft SQL Database Developer will work closely with cross-functional teams to ensure the successful delivery of high-quality data solutions.

Key Responsibilities:
Database Development & Design: Develop and maintain complex SQL queries, stored procedures, triggers, functions, and views. Design, implement, and optimize relational database schemas. Create and maintain database objects (tables, indexes, views, etc.), ensuring data integrity and optimization.
Performance Tuning & Optimization: Analyse query performance and optimize SQL statements for maximum efficiency. Use indexing, partitioning, and other techniques to improve database performance. Monitor and resolve database performance issues using profiling tools and techniques.
Data Integration & Migration: Lead data integration efforts between multiple data sources, ensuring accurate data flow and consistency. Support data migration projects, converting and importing data between systems.
Collaboration & Communication: Work closely with application developers, data architects, business analysts, and other stakeholders to deliver database solutions. Provide technical guidance and mentoring to junior developers. Participate in code reviews, ensuring SQL best practices are followed.
Troubleshooting & Issue Resolution: Identify and resolve complex database issues related to performance, functionality, and data integrity. Provide support for production database environments and resolve urgent database issues quickly.
Documentation & Reporting: Document database structures, processes, and procedures to ensure consistency and compliance with internal standards. Provide status reports, performance insights, and recommendations to management.

Key Qualifications:
Education: Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience).
Experience: Minimum of 5-7 years of experience in SQL development and database management. Proven experience working with large-scale SQL databases (Microsoft SQL Server, Azure SQL Server).
Skills & Expertise: Strong proficiency in SQL, including complex queries, joins, subqueries, indexing, and optimization. Experience with SQL Server, T-SQL, or other relational database management systems. Expertise in database performance tuning, troubleshooting, and query optimization. Strong knowledge of database security best practices (user roles, encryption, etc.). Familiarity with cloud databases and technologies (e.g., Azure SQL) is a plus. Experience with data integration and ETL processes. Familiarity with version control tools such as Git.
Soft Skills: Strong analytical and problem-solving abilities. Ability to work independently and as part of a team. Excellent communication skills, both written and verbal. Strong attention to detail and ability to handle complex tasks with minimal supervision.
Preferred Skills: Experience with Azure SQL Server and SQL Server. Knowledge of database monitoring tools (e.g., SQL Profiler, New Relic, SolarWinds). Familiarity with DevOps practices and CI/CD pipelines related to database development. Experience in Agile or Scrum development environments.

Working Conditions: Full-time position with flexible work hours. Work from office only. Occasional on-call support for production database systems may be required.

Why Join Us? Competitive salary and benefits package. Opportunities for career growth and professional development. A dynamic, collaborative, and innovative work environment. Exposure to cutting-edge database technologies and large-scale systems.
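As a hedged sketch of the day-to-day work described (stored procedures, indexing, tuning), here is a T-SQL example driven from Python via pyodbc. The server, database, and object names are hypothetical.

```python
# Hypothetical sketch: create a covering index and a stored procedure via pyodbc.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sql-prod;DATABASE=Sales;UID=dev_user;PWD=***"
)
conn.autocommit = True
cur = conn.cursor()

# A covering index to support a hot query path (performance tuning).
cur.execute("""
CREATE NONCLUSTERED INDEX IX_Orders_CustomerId_Date
ON dbo.Orders (CustomerId, OrderDate) INCLUDE (TotalDue);
""")

# A parameterised stored procedure with a set-based query.
cur.execute("""
CREATE OR ALTER PROCEDURE dbo.usp_CustomerRevenue @CustomerId INT AS
BEGIN
    SET NOCOUNT ON;
    SELECT c.CustomerId, SUM(o.TotalDue) AS Revenue
    FROM dbo.Customers AS c
    JOIN dbo.Orders    AS o ON o.CustomerId = c.CustomerId
    WHERE c.CustomerId = @CustomerId
    GROUP BY c.CustomerId;
END;
""")

# Call it and fetch results.
for row in cur.execute("EXEC dbo.usp_CustomerRevenue @CustomerId = ?", 42):
    print(row.CustomerId, row.Revenue)
```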

Posted 1 day ago

Apply

4.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Source: LinkedIn

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: SAP
Management Level: Senior Associate
Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation, and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients run their business more effectively and understand what business questions can be answered and how to unlock the answers.

Required Skills:
- Degree in Computer Science or a related discipline
- Minimum 4 years of relevant experience
- Fluency in Python or shell scripting
- Experience with data mining, modeling, mapping, and ETL processes
- Experience with Azure Data Factory, Data Lake, Databricks, Synapse Analytics, BI dashboards, and BI implementation projects
- Hands-on experience in Hadoop, PySpark, and Spark SQL
- Knowledge of Azure/AWS, RESTful web services, SOAP, SOA, Microsoft SQL Server, MySQL Server, and Agile methodology is an advantage
- Strong analytical, problem-solving, and communication skills
- Excellent command of both written and spoken English
- Able to design, develop, deliver, and maintain data infrastructures

Mandatory Skill Set: Hadoop, PySpark
Preferred Skill Set: Hadoop, PySpark
Years of Experience Required: 4-8
Qualifications: B.E/B.Tech
Required Skills: Hadoop Cluster, PySpark
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
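For the mandatory Hadoop/PySpark skill set, a minimal PySpark job might look like the sketch below: read raw data from HDFS, aggregate with Spark SQL functions, and write curated Parquet. The paths and column names are illustrative assumptions.

```python
# Minimal PySpark sketch: HDFS extract, Spark SQL transform, Parquet load.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-demo").getOrCreate()

# Extract: raw CSV landed on the Hadoop cluster.
raw = spark.read.option("header", True).csv("hdfs:///data/raw/transactions/")

# Transform: type the columns and aggregate.
daily = (raw.withColumn("amount", F.col("amount").cast("double"))
            .groupBy("txn_date", "merchant_id")
            .agg(F.sum("amount").alias("daily_total"),
                 F.count("*").alias("txn_count")))

# Load: partitioned Parquet for downstream BI dashboards.
daily.write.mode("overwrite").partitionBy("txn_date").parquet(
    "hdfs:///data/curated/daily/")
```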

Posted 1 day ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Description: At Amazon, we strive to be the most innovative and customer-centric company on the planet. Come work with us to develop innovative products, tools, and research-driven solutions in a fast-paced environment by collaborating with smart and passionate leaders, program managers, and software developers. This role is based out of our Bangalore corporate office and is for a passionate, dynamic, analytical, innovative, hands-on, and customer-centric Business Analyst.

Key job responsibilities: This role primarily focuses on deep-dives, creating dashboards for the business, and working with different teams to develop and track metrics and bridges.
- Design, develop, and maintain scalable, automated, user-friendly systems, reports, and dashboards that will support our analytical and business needs
- In-depth research of drivers of the Localization business
- Analyze key metrics to uncover trends and root causes of issues
- Suggest and build new metrics and analyses that enable a better perspective on the business
- Capture the right metrics to influence stakeholders and measure success
- Develop domain expertise and apply it to operational problems to find solutions
- Work across teams with different stakeholders to prioritize and deliver data and reporting
- Recognize and adopt best practices in reporting and analysis: data integrity, test design, analysis, validation, and documentation

Basic Qualifications:
- 5+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
- Experience with data visualization using Tableau, QuickSight, or similar tools
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience using advanced SQL to pull data from a database or data warehouse, and scripting experience (Python) to process data for modeling

Preferred Qualifications:
- Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift
- Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company: ADCI - BLR 14 SEZ
Job ID: A2992205
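One plausible shape of the "advanced SQL plus Python" requirement is pulling an aggregate from Redshift (which speaks the Postgres wire protocol, so psycopg2 works) and post-processing with pandas. The host, schema, and metric names below are made up for illustration.

```python
# Assumed sketch: Redshift query + pandas trend analysis.
import pandas as pd
import psycopg2

conn = psycopg2.connect(
    host="redshift-cluster.example.com", port=5439,
    dbname="analytics", user="bi_user", password="***",
)

sql = """
    SELECT marketplace, locale,
           DATE_TRUNC('week', event_ts) AS week,
           COUNT(DISTINCT session_id)   AS sessions
    FROM   localization.events
    WHERE  event_ts >= DATEADD(week, -12, GETDATE())
    GROUP  BY 1, 2, 3
"""
df = pd.read_sql(sql, conn)  # pandas warns on raw DBAPI connections but works

# Week-over-week trend per marketplace: the kind of bridge metric the role owns.
df = df.sort_values(["marketplace", "locale", "week"])
df["wow_change"] = df.groupby(["marketplace", "locale"])["sessions"].pct_change()
print(df.tail())
```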

Posted 1 day ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Source: LinkedIn

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate
Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation, and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients run their business more effectively and understand what business questions can be answered and how to unlock the answers.

1. Hands-on experience in Qlik Sense development, dashboarding, data modeling, and reporting (ad hoc report generation) techniques.
2. Must be good at data transformation, the creation of QVD files, and set analysis.
3. Experienced in application design, architecture, development, and deployment using Qlik Sense. Must be efficient in front-end development and know visualization best practices.
4. Strong database design and SQL skills. Experienced in RDBMSs such as MS SQL Server, Oracle, MySQL, etc.
5. Strong communication skills (verbal/written) to deliver technical insights and interpret data reports for clients, and to understand and serve the client's requirements.
6. Leadership qualities and thoughtful implementation of Qlik Sense best practices to deliver effective Qlik Sense solutions to users.
7. Able to comprehend and translate complex and advanced functional, technical, and business requirements into executable architectural designs, and to create and maintain technical documentation.
8. Experienced in data integration through extracting, transforming, and loading (ETL) data from various sources.
9. Experience in working on and designing NPrinting reports.
10. Exposure to the latest products in the Qlik product suite (such as Replicate and Alerting) would be a huge plus.

Mandatory Skill Set: Qlik
Preferred Skill Set: Qlik
Years of Experience Required: 4-6
Qualifications: B.Tech
Required Skills: QlikView

Posted 1 day ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Years of experience: 3 to 5
Location: Bengaluru

We are seeking a highly motivated and analytical Business Analyst to join our Data Analytics Team. In this role, you will play a critical part in turning raw data into actionable insights that support business decisions and strategic initiatives. You will work closely with cross-functional teams and directly engage with business stakeholders to understand data requirements, design robust data pipelines, and deliver impactful analyses.

Responsibilities:
- Collaborate with stakeholders across departments to gather and translate business requirements into data models and analytical solutions.
- Act as a key point of contact for business teams, ensuring their analytical needs are clearly understood and addressed effectively.
- Design, develop, and maintain ETL pipelines to ensure seamless data flow across systems.
- Perform advanced SQL queries to extract, manipulate, and analyze large datasets from multiple sources.
- Utilize Python to automate data workflows, perform exploratory data analysis (EDA), and build data transformation scripts.
- Leverage AWS tools (such as S3, Redshift, Glue, and Lambda) for data storage, processing, and pipeline orchestration.
- Develop dashboards and reports to visualize key metrics and insights for business leadership.
- Conduct deep-dive analyses on business performance, customer behavior, and operational efficiencies to identify growth opportunities.
- Ensure data accuracy, integrity, and security throughout all analytics processes.

Ideal Candidate:
- Bachelor's degree in Computer Science, Data Science, Engineering, Business Analytics, or a related field.
- 2+ years of experience in data analytics, business intelligence, or a similar role.
- Proficient in advanced SQL for complex data manipulation and performance optimization.
- Intermediate proficiency in Python for data processing and automation (Pandas, NumPy, etc.).
- Experience with building and maintaining ETL pipelines.
- Familiarity with AWS data services (e.g., S3, Glue, Lambda, Athena).
- Strong analytical skills with a solid understanding of statistical methods and business performance metrics.
- Experience with data visualization tools like Tableau or Metabase.
- Excellent communication and interpersonal skills, with the ability to engage directly with business stakeholders and translate their needs into actionable data solutions.
- Strong problem-solving skills and the ability to work in a fast-paced, collaborative environment.

Perks, Benefits, and Work Culture:
- Work with cutting-edge technologies on high-impact systems.
- Be part of a collaborative and technically driven team.
- Enjoy flexible work options and a culture that values learning.
- Competitive salary, benefits, and growth opportunities.
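For the Python/EDA and data-quality parts of this role, a small pandas sketch might look like the following. The file, column names, and thresholds are illustrative assumptions.

```python
# Illustrative sketch: profile a dataset and gate it with automated checks.
import numpy as np
import pandas as pd

orders = pd.read_csv("orders_extract.csv", parse_dates=["order_date"])

# Quick exploratory profile.
print(orders.describe(include="all"))
print(orders.isna().mean().sort_values(ascending=False))  # null ratios

# Simple automated checks that gate the downstream report.
checks = {
    "no_duplicate_ids": orders["order_id"].is_unique,
    "no_negative_amounts": (orders["amount"] >= 0).all(),
    "dates_in_range": orders["order_date"]
        .between("2020-01-01", pd.Timestamp.now()).all(),
}
failed = [name for name, ok in checks.items() if not ok]
if failed:
    raise ValueError(f"Data-quality checks failed: {failed}")

# Outlier flag for the deep-dive analysis.
z = (orders["amount"] - orders["amount"].mean()) / orders["amount"].std()
orders["is_outlier"] = np.abs(z) > 3
```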

Posted 1 day ago

Apply

7.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Source: LinkedIn

Immediate Hire - Core Java + AWS
Location: Mumbai (Powai) - work from office
Salary: up to 30 LPA, full time with benefits
Experience: 7-10 years
Interested candidates, please share your updated CV to hr@hmforward.com

Inviting applications for the role of Principal Consultant - AWS Developer. We are seeking an experienced developer with expertise in AWS-based big data solutions, particularly leveraging Apache Spark on AWS EMR, along with strong backend development skills in Java and Spring. The ideal candidate will also possess a solid background in data warehousing, ETL pipelines, and large-scale data processing systems.

Responsibilities:
- Design and implement scalable data processing solutions using Apache Spark on AWS EMR.
- Develop microservices and backend components using Java and the Spring framework.
- Build, optimize, and maintain ETL pipelines for structured and unstructured data.
- Integrate data pipelines with AWS services such as S3, Lambda, Glue, Redshift, and Athena.
- Collaborate with data architects, analysts, and DevOps teams to support data warehousing initiatives.
- Write efficient, reusable, and reliable code following best practices.
- Ensure data quality, governance, and lineage across the architecture.
- Troubleshoot and optimize Spark jobs and cloud-based processing workflows.
- Participate in code reviews, testing, and deployments in Agile environments.

Qualifications we seek in you:
Minimum Qualifications:
- Bachelor's degree
Preferred Qualifications/Skills:
- Strong experience with Apache Spark and AWS EMR in production environments.
- Solid understanding of the AWS ecosystem, including services like S3, Lambda, Glue, Redshift, and CloudWatch.
- Proven experience in designing and managing large-scale data warehousing systems.
- Expertise in building and maintaining ETL pipelines and data transformation workflows.
- Strong SQL skills and familiarity with performance tuning for analytical queries.
- Experience working in Agile development environments using tools such as Git, JIRA, and CI/CD pipelines.
- Familiarity with data modeling concepts and tools (e.g., star schema, snowflake schema).
- Knowledge of data governance tools and metadata management.
- Experience with containerization (Docker, Kubernetes) and serverless architectures.
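The posting centres on Java/Spring application code, but the orchestration side of Spark on EMR can be sketched in Python with boto3: submitting a Spark step to an existing cluster. The cluster ID, S3 paths, and job script are placeholders.

```python
# Hedged sketch: submit a Spark job to an existing EMR cluster via boto3.
import boto3

emr = boto3.client("emr", region_name="ap-south-1")

response = emr.add_job_flow_steps(
    JobFlowId="j-XXXXXXXXXXXXX",  # existing EMR cluster (placeholder ID)
    Steps=[{
        "Name": "nightly-etl",
        "ActionOnFailure": "CONTINUE",
        "HadoopJarStep": {
            "Jar": "command-runner.jar",
            "Args": [
                "spark-submit",
                "--deploy-mode", "cluster",
                "s3://my-bucket/jobs/nightly_etl.py",  # Spark entry point
                "--input",  "s3://my-bucket/raw/",
                "--output", "s3://my-bucket/curated/",
            ],
        },
    }],
)
print("Step IDs:", response["StepIds"])
```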

Posted 1 day ago

Apply

5.0 years

15 Lacs

Cochin

On-site

Source: Glassdoor

Job Title: Database Lead (DB Lead)
Location: Kochi
Experience: 5+ years
Compensation: 20-25% hike on current CTC
Employment Type: Full-Time

Roles & Responsibilities:
1. Hands-on experience in writing complex SQL queries, stored procedures, packages, functions, and leveraging SQL analytical functions.
2. Expertise with Microsoft SQL Server tools and services, particularly SSIS (ETL processes).
3. Troubleshoot and support existing Data Warehouse (DW) processes.
4. Perform production-level performance tuning for MS SQL databases.
5. Monitor and report on SQL environment performance and availability metrics; implement best practices for performance optimization.
6. Participate in SQL code reviews with application teams to enforce SQL coding standards.
7. Manage database backup and restore operations, including scheduled Disaster Recovery (DR) tests. Should be well-versed in clustering, replication, and MS SQL restoration techniques.
8. Exhibit strong communication and coordination skills, with the ability to work efficiently under pressure.

Desired Candidate Profile:
- Bachelor's Degree in Engineering (B.Tech) or Master of Computer Applications (MCA).
- Minimum 5 years of relevant work experience in database development/administration.
- Professional certifications in Database Development or Management are highly preferred.
- Experience working in Agile/Scrum environments. Familiarity with JIRA is a plus.

Job Types: Full-time, Permanent
Pay: From ₹1,500,000.00 per year
Schedule: Day shift
Application Questions:
- Do you have at least 5 years of hands-on experience with Microsoft SQL Server, including writing complex queries, stored procedures, and using SSIS (ETL processes)?
- Do you have experience with database backup/restoration, clustering, and Disaster Recovery (DR) testing in a production environment?
- Are you willing to work from Kochi and open to joining full-time with a 20-25% hike on your current CTC?
Work Location: In person
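For the backup/restore duties in item 7, here is an assumed sketch of driving a full SQL Server backup from Python via pyodbc. BACKUP cannot run inside a transaction, hence autocommit; the server, database, and share path are placeholders.

```python
# Hypothetical sketch: full database backup via pyodbc.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sql-prod;DATABASE=master;UID=dba_user;PWD=***",
    autocommit=True,  # required for BACKUP/RESTORE
)
cur = conn.cursor()

cur.execute(r"""
    BACKUP DATABASE SalesDW
    TO DISK = N'\\backup-share\sql\SalesDW_full.bak'
    WITH COMPRESSION, CHECKSUM, INIT;
""")
# BACKUP emits informational messages; drain them so the call completes.
while cur.nextset():
    pass
print("Backup completed")
```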

Posted 1 day ago

Apply

10.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Source: LinkedIn

Requisition Number: 101362 - Architect II
Location: Hybrid, based in Delhi NCR, Hyderabad, Pune, Trivandrum, or Bangalore, India

Insight at a Glance:
- 14,000+ engaged teammates globally
- #20 on Fortune's World's Best Workplaces list
- $9.2 billion in revenue
- Received 35+ industry and partner awards in the past year
- $1.4M+ total charitable contributions in 2023 by Insight globally

Now is the time to bring your expertise to Insight. We are not just a tech company; we are a people-first company. We believe that by unlocking the power of people and technology, we can accelerate transformation and achieve extraordinary results. As a Fortune 500 Solutions Integrator with deep expertise in cloud, data, AI, cybersecurity, and intelligent edge, we guide organisations through complex digital decisions.

About the Role: The Architect II - Data will focus on leading our Business Intelligence (BI) and Data Warehousing (DW) initiatives. This role involves designing and implementing end-to-end data pipelines using cloud services and data frameworks, and collaborating with stakeholders and ETL/BI developers in an agile environment to create scalable, secure data architectures that align with business requirements, industry best practices, and regulatory compliance.

Responsibilities:
- Architect and implement end-to-end data pipelines, data lakes, and warehouses using modern cloud services and architectural patterns.
- Develop and build analytics tools that deliver actionable insights to the business.
- Integrate and manage large, complex data sets to meet strategic business requirements.
- Optimize data processing workflows using frameworks such as PySpark.
- Establish and enforce best practices for data quality, integrity, security, and performance across the entire data ecosystem.
- Collaborate with cross-functional teams to prioritize deliverables and design solutions.
- Develop compelling business cases and return on investment (ROI) analyses to support strategic initiatives.
- Drive process improvements for enhanced data delivery speed and reliability.
- Provide technical leadership, training, and mentorship to team members, promoting a culture of excellence.

Qualifications:
- 10+ years in Business Intelligence (BI) solution design, with 8+ years specializing in ETL processes and data warehouse architecture.
- 8+ years of hands-on experience with Azure data services, including Azure Data Factory, Azure Databricks, Azure Data Lake Gen2, Azure SQL DB, Synapse, Power BI, and MS Fabric (knowledge).
- Strong Python and PySpark software engineering proficiency, coupled with a proven track record of building and optimizing big data pipelines, architectures, and datasets.
- Proficient in transforming, processing, and extracting insights from vast, disparate datasets, and building robust data pipelines for metadata, dependency, and workload management.
- Familiarity with software development lifecycles/methodologies, particularly Agile.
- Experience with SAP/ERP/Datasphere data modeling is a significant plus.
- Excellent presentation and collaboration skills, capable of creating formal documentation and supporting cross-functional teams in a dynamic environment.
- Strong problem-solving, time management, and organizational abilities.
- Keen to learn new languages and technologies continually.
- Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or an equivalent field.

What you can expect: We're legendary for taking care of you and your family, and for helping you engage with your local community. We want you to enjoy a full, meaningful life and own your career at Insight. Some of our benefits include the freedom to work from another location, even an international destination, for up to 30 consecutive calendar days per year. But what really sets us apart are our core values of Hunger, Heart, and Harmony, which guide everything we do, from building relationships with teammates, partners, and clients to making a positive impact in our communities. Join us today; your ambITious journey starts here.

Insight is an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, sexual orientation, or any other characteristic protected by law. When you apply, please tell us the pronouns you use and any reasonable adjustments you may need during the interview process. At Insight, we celebrate diversity of skills and experience, so even if you don't feel your skills are a perfect match, we still want to hear from you!

Insight India Location: Level 16, Tower B, Building No 14, DLF Cyber City IT/ITES SEZ, Sector 24 & 25A, Gurugram, Gurgaon, HR 122002, India
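One way the pipeline and data-quality responsibilities above might look in PySpark is a bronze-to-silver job with validation and quarantine, as sketched below. The storage paths and rules are assumptions; on Azure these would typically be ADLS Gen2 paths run from Databricks or Data Factory.

```python
# Assumed sketch: bronze (raw) to silver (validated) with quality enforcement.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze-to-silver").getOrCreate()

bronze = spark.read.parquet(
    "abfss://lake@account.dfs.core.windows.net/bronze/sales/")

# Validation rules: well-formed keys, sane amounts, parseable dates.
validated = (bronze
    .withColumn("sale_date", F.to_date("sale_date", "yyyy-MM-dd"))
    .withColumn("is_valid",
                F.col("sale_id").isNotNull()
                & (F.col("amount") > 0)
                & F.col("sale_date").isNotNull()))

good = validated.filter("is_valid").drop("is_valid")
bad = validated.filter("NOT is_valid")

# Quarantine rejects for stewardship review instead of silently dropping them.
bad.write.mode("append").parquet(
    "abfss://lake@account.dfs.core.windows.net/quarantine/sales/")
good.write.mode("overwrite").partitionBy("sale_date").parquet(
    "abfss://lake@account.dfs.core.windows.net/silver/sales/")
```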

Posted 1 day ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Hyderabad

Work from Office

Source: Naukri

Dear Candidate, greetings from Tata Consultancy Services (TCS). We are pleased to invite you to our in-person interview drive for professionals with expertise in Snowflake development in Hyderabad.

Interview Drive Details:
Date: 21-Jun-2025
Time: 9:00 AM to 5:00 PM
Venue: TCS Deccan Park - LS1 Zone, Plot No 1, Survey No. 64/2, Software Units Layout, Serilingampally Mandal, Madhapur, Hyderabad - 500081, Telangana

Role: Snowflake Developer
Required Technical Skill Set: Snowflake
Desired Experience Range: 5 to 10 years
Location of Requirement: Hyderabad

Desired Competencies (Technical/Behavioral):
Must-Have:
- At least 5+ years of relevant work experience in any data warehouse technologies
- At least 2+ years of experience in designing, implementing, and migrating data/enterprise/engineering workloads to the Snowflake DWH
- Able to take requirements from the business and coordinate with business and IT teams on clarifications, dependencies, and status reporting
- As an individual contributor, able to create, test, and implement business solutions in Snowflake
- Experience implementing DevOps/CI/CD using Azure DevOps or GitLab Actions is preferred
- Hands-on experience in data modeling
- Expert in SQL and query performance tuning techniques
- Experience with ingestion techniques using ETL tools (IICS) and Snowflake's COPY, Snowpipe, and Streamlit utilities
- Strong in writing Snowflake stored procedures, views, UDFs, etc.
- Good exposure to handling CDC using Streams and Time Travel
- Proficient in working with Snowflake Tasks, Data Sharing, and data replication
Good-to-Have:
- DBT

We look forward to your confirmation and participation in the interview drive.
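To make the Streams/Time Travel CDC requirement concrete, here is a hedged sketch issued through the Snowflake Python connector: a stream consumed by a MERGE, plus a Time Travel query. All object names and credentials are placeholders.

```python
# Hypothetical sketch: Snowflake Streams CDC apply and Time Travel lookup.
import snowflake.connector

conn = snowflake.connector.connect(
    account="acct", user="dev", password="***",
    warehouse="DEV_WH", database="EDW", schema="SALES")
cur = conn.cursor()

# A stream tracks inserts/updates/deletes on the source table.
cur.execute("CREATE STREAM IF NOT EXISTS orders_stream ON TABLE raw_orders")

# Consume the stream: MERGE the delta into the target (classic CDC apply).
cur.execute("""
    MERGE INTO dim_orders t
    USING orders_stream s ON t.order_id = s.order_id
    WHEN MATCHED AND s.METADATA$ACTION = 'DELETE' THEN DELETE
    WHEN MATCHED THEN UPDATE SET t.status = s.status
    WHEN NOT MATCHED THEN INSERT (order_id, status) VALUES (s.order_id, s.status)
""")

# Time Travel: inspect the table as it stood an hour ago.
cur.execute("SELECT COUNT(*) FROM dim_orders AT(OFFSET => -3600)")
print("Row count one hour ago:", cur.fetchone()[0])
```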

Posted 1 day ago

Apply

10.0 years

0 Lacs

Hyderābād

On-site

Source: Glassdoor

Do you want to help one of the most respected companies in the world reinvent its approach to data? At Thomson Reuters, we are recruiting a team of motivated data professionals to transform how we manage and leverage our commercial data assets. It is a unique opportunity to join a diverse and global team with centers of excellence in Toronto, London, and Bangalore. Are you excited about working at the forefront of the data-driven revolution that will change the way a company works? The Thomson Reuters Data and Analytics team is seeking an experienced Lead Engineer, Test Data Management, with a passion for engineering quality-assurance solutions for cloud-based data warehouse systems.

About the Role: As Lead Engineer, Test Data Management, you play a crucial role in ensuring the quality and reliability of our enterprise data systems. Your expertise in testing methods, data validation, and automation is essential to bring best-in-class standards to our data products. In this opportunity you will:
- Design test data management frameworks, apply data masking and data sub-setting, and generate synthetic data to create robust test data solutions for enterprise-wide teams.
- Collaborate with engineers, database architects, and data quality stewards to build logical data models, execute data validation, and design manual and automated testing.
- Mentor and lead the testing of key data development projects related to the data warehouse and other systems.
- Lead engineering team members in implementing test data best practices and delivering test data solutions.
- Be a thought leader investigating leading-edge quality technology for test data management and systems functionality, including performance testing for data pipelines.
- Innovate: create ETL mappings, workflows, and functions to move data from multiple sources into target areas.
- Partner across the company with analytics teams, engineering managers, architecture teams, and others to design and agree on solutions that meet business requirements.
- Effectively communicate and liaise with other engineering groups across the organization, data consumers, and business analytics groups.

You will utilize your experience in the following areas:
- SQL for data querying, validation, and analysis
- Knowledge of database management systems (e.g., SQL Server, PostgreSQL, MySQL)
- Test data management tools (e.g., K2View, qTest, ALM, Zephyr)
- Proficiency in Python for test automation and data manipulation
- PySpark for big data testing
- Test case design, execution, and defect management
- AWS cloud data practices and DevOps tooling
- Performance testing for data management solutions, especially for complex data flows
- Data security, privacy, and data governance compliance principles

About You: You're a fit for the role of Lead Engineer if your experience includes:
- 10+ years as a tester, developer, or data analyst, with experience establishing end-to-end test strategies and planning for data validation, transformation, and analytics
- Advanced SQL knowledge
- Designing and executing test procedures and documenting best practices
- Planning and executing regression testing, data validation, and quality assurance
- Advanced command of data warehouse creation, management, and performance strategies
- Engineering and implementing data quality systems in the cloud
- Proficiency in a scripting language such as Python
- Hands-on experience with data test automation applications (preference for K2View)
- Identification and remediation of data quality issues
- Data management tools such as K2View, Immuta, Alation, and Informatica
- Agile development
- Business intelligence and data warehousing concepts
- Familiarity with SAP and Salesforce systems
- Intermediate understanding of big data technologies
- AWS services and management, including serverless, container, queueing, and monitoring services
- Experience creating manual or automated tests on data pipelines
- Programming languages: Python
- Data interchange formats: Parquet, JSON, CSV
- Version control with GitHub
- Cloud security and compliance, privacy, GDPR

What's in it for you?
- Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office, depending on the role) for our office-based roles, while delivering a seamless experience that is digitally and physically connected.
- Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
- Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensure you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
- Industry-Competitive Benefits: We offer comprehensive benefit plans including flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
- Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
- Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
- Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

About Us: Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world-leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward.

As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information about Thomson Reuters can be found on thomsonreuters.com.
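A small pytest sketch of the test-data-validation work this role describes: asserting row-level and aggregate expectations on a pipeline output. The fixture loads a hypothetical extract; in practice it would query the warehouse, and the masking convention shown is an assumption.

```python
# Illustrative pytest data-validation suite; paths and columns are made up.
import pandas as pd
import pytest

@pytest.fixture
def customer_extract() -> pd.DataFrame:
    return pd.read_parquet("test_data/customers.parquet")

def test_primary_key_is_unique(customer_extract):
    assert customer_extract["customer_id"].is_unique

def test_no_nulls_in_required_fields(customer_extract):
    required = ["customer_id", "country_code", "created_at"]
    assert customer_extract[required].notna().all().all()

def test_masked_emails_are_not_real(customer_extract):
    # Test data management: PII should be masked in non-prod extracts.
    assert customer_extract["email"].str.endswith("@example.invalid").all()

def test_row_count_matches_source(customer_extract):
    # Reconciliation against a control total captured at extract time.
    control = int(open("test_data/control_count.txt").read())
    assert len(customer_extract) == control
```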

Posted 1 day ago

Apply

8.0 years

0 Lacs

Hyderābād

On-site

Source: Glassdoor

The people here at Apple don't just build products; we craft the kind of wonder that's revolutionized entire industries. It's the diversity of those people and their ideas that supports the innovation that runs through everything we do, from amazing technology to industry-leading environmental efforts. Join Apple, and help us leave the world better than we found it. Imagine what you could do here! At Apple, new ideas have a way of becoming extraordinary products, services, and customer experiences very quickly. Bring passion and dedication to your job and there's no telling what you could accomplish. A passion for product ownership and a strong track record will prove critical to success on our team. Be ready to make something extraordinary here. Multifaceted, encouraging people and innovative, industry-defining technologies are the norm at Apple. Would you like to work in a fast-paced environment where your technical abilities will be challenged on a day-to-day basis? If so, Apple's IS&T (Information Systems and Technology) team is seeking a Software Engineer to work on building and scaling best-in-class data and reporting apps that present metrics and performance indicators with the least latency and an outstanding user experience. We are looking for a team member who can think creatively and has a real passion for building highly scalable analytical and reporting apps with end users in focus. You will engage directly with key business partners to understand business strategies and solution needs. You will drive and lead functional and technical discussions with development teams and will be expected to design and own end-to-end applications. You will enjoy the benefits of working in a fast-growing business where you are inspired to "Think Different" and where your efforts play a key role in the success of Apple's business.

Description: We're looking for an individual who loves challenges and takes on problems with imaginative solutions, works well in collaborative teams, and can produce high-quality software under tight constraints. You should be a self-starter, self-motivated, able to work independently, collaborate with multiple cross-functional teams across the globe (US, Singapore, India, and Europe), and work on solutions that have a larger impact on Apple's business. You will interact with many other groups and internal teams at Apple to lead and deliver best-in-class products in an exciting, constantly evolving environment.

Minimum Qualifications:
- 8+ years of experience developing enterprise applications using Java/J2EE, including web services (e.g., RESTful, SOAP), the Spring Framework and Spring Boot, and ORM (e.g., Hibernate)
- Experience with microservices architectures and container-based deployment (e.g., Docker, Kubernetes)
- Strong web development skills (React); hands-on experience in designing and developing user interfaces ensuring responsiveness, accessibility, and a user-friendly experience
- Experience with relational database management systems (RDBMS) and SQL, as well as multi-modal NoSQL databases, including DocumentDB and GraphDB

Preferred Qualifications:
- Experience working with distributed teams using collaboration tools for software configuration management (e.g., Git/GitHub), agile project management (e.g., Jira), and knowledge repositories (e.g., Confluence/wikis)
- Experience with Extraction, Transformation, and Load (ETL) technologies, data replication, and event streaming
- Experience with cloud solutions, such as Infrastructure as Code (e.g., CloudFormation), Configuration as Code (e.g., Ansible), elastic computing, and Virtual Private Clouds (VPCs)
- Proficiency in Test-Driven Development (TDD), Continuous Integration/Continuous Deployment (CI/CD), and DevOps best practices
- Working experience in Agile development methodology
- Effective interpersonal, analytical, and communication skills
- Results-oriented, demonstrating ownership and accountability
- Bachelor's degree in Computer Science or a related field

Submit CV

Posted 1 day ago

Apply

10.0 - 12.0 years

9 - 9 Lacs

Hyderābād

On-site

Source: Glassdoor

Title: Data Integration Developer - Manager
Department: Alpha Data Platform
Reports To: Data Integration Lead, Engineering

Summary: State Street's Global Alpha Data Platform lets you load, enrich, and aggregate investment data. Alpha clients are able to manage multi-asset-class data from any service provider or data vendor for a more holistic and integrated view of their holdings. This platform reflects State Street's years of experience servicing complex instruments for our global client base and our investments in building advanced data management technologies. Reporting to the Alpha Development delivery manager in <>, the Data Integration Developer is responsible for the overall development life cycle leading to successful delivery and support of Alpha Data Platform (ADP) services to clients.

Responsibilities:
- As a Data Integration Developer, be hands-on with ETL/ELT data pipelines (Talend DI), the Snowflake DWH, CI/CD deployment pipelines, and data-readiness (data quality) design, development, and implementation, and address code or data issues.
- Experience in designing and implementing modern data pipelines for a variety of data sets, including internal/external data sources, complex relationships, various data formats, and high volume.
- Experience and understanding of ETL job performance techniques, exception handling, query performance tuning/optimization, and data loads meeting runtime/schedule SLAs for both batch and real-time use cases.
- Demonstrate strong collaborative experience across regions (APAC, EMEA, and NA) to come up with design standards and high-level design solution documents, and to run cross-training and resource onboarding activities.
- Good understanding of the SDLC process: governance clearance, peer code reviews, unit test results, code deployments, code security scanning, and Confluence/Jira Kanban stories.
- Strong attention to detail during root cause analysis, SQL query debugging, and defect resolution, working with multiple business/IT stakeholders.

Qualifications:
Education: B.S. degree (or foreign education equivalent) in Computer Science, Engineering, Mathematics, Physics, or another technical course of study required. M.S. degree strongly preferred.
Experience:
- A minimum of 10-12 years of experience in data integration/orchestration services, data architecture, design, development, and implementation, providing data-driven solutions for client requirements
- Experience in Snowflake DWH SQL and SQL Server database query/performance tuning
- Strong data warehousing concepts and ETL tools such as the Talend Cloud Data Integration tool
- Exposure to financial domain knowledge is considered a plus
- Cloud managed services such as source control (GitHub) and MS Azure/DevOps are considered a plus
- Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus
- Exposure to third-party data providers such as FactSet, Opturo, Bloomberg, Reuters, MSCI, and other rating agencies is a plus

Supervisory Responsibility: Individual Contributor / Team Lead / Manager of Managers
Travel: May be required on a limited basis.

Posted 1 day ago

Apply

4.0+ years

3 - 6 Lacs

Hyderābād

On-site

Source: Glassdoor

Job Description: Manager, Quality Engineer

The Opportunity: Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Be part of an organisation driven by digital technology and data-backed approaches that support a diversified portfolio of prescription medicines, vaccines, and animal health products. Drive innovation and execution excellence. Be part of a team with a passion for using data, analytics, and insights to drive decision-making, and which creates custom software, allowing us to tackle some of the world's greatest health threats.

Our Technology Centres focus on creating a space where teams can come together to deliver business solutions that save and improve lives. An integral part of our company's IT operating model, Tech Centres are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to the other sites, are essential to supporting our business and strategy. A focused group of leaders in each Tech Centre helps ensure we can manage and improve each location, from investing in the growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging, to managing critical emergencies. And together, we must leverage the strength of our team to collaborate globally to optimize connections and share best practices across the Tech Centres.

What will you do in this role:
- Develop and implement advanced automated testing frameworks: architect, design, and maintain sophisticated automated testing frameworks for data pipelines and ETL processes, ensuring robust data quality and reliability.
- Conduct comprehensive quality assurance testing: lead the execution of extensive testing strategies, including functional, regression, performance, and security testing, to validate data accuracy and integrity across the bronze layer.
- Monitor and enhance data reliability: collaborate with the data engineering team to establish and refine monitoring and alerting systems that proactively identify data quality issues and system failures, implementing corrective actions as needed.
- Leverage generative AI: innovate and apply generative AI techniques to enhance testing processes, automate complex data validation scenarios, and improve overall data quality assurance workflows.
- Collaborate with cross-functional teams: serve as a key liaison between Data Engineers, Product Analysts, and other stakeholders to deeply understand data requirements and ensure that testing aligns with strategic business objectives.
- Document and standardize testing processes: create and maintain comprehensive documentation of testing procedures, results, and best practices, facilitating knowledge sharing and continuous improvement across the organization.
- Drive continuous improvement initiatives: lead efforts to develop and implement best practices for QA automation and reliability, including conducting code reviews, mentoring junior team members, and optimizing testing processes.

What You Should Have:
- Educational background: Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field
- Experience: 4+ years of experience in QA automation, with a strong focus on data quality and reliability testing in complex data engineering environments
- Technical skills: advanced proficiency in programming languages such as Python or Java for writing and optimizing automated tests; extensive experience with testing frameworks and tools (e.g., Selenium, JUnit, pytest) and data validation tools, with a focus on scalability and performance; deep familiarity with data processing frameworks (e.g., Apache Spark) and data storage solutions (e.g., SQL, NoSQL), including performance tuning and optimization; strong understanding of generative AI concepts and tools and their application in enhancing data quality and testing methodologies; proficiency in using Jira Xray for advanced test management, including creating, executing, and tracking complex test cases and defects
- Analytical skills: exceptional analytical and problem-solving skills, with a proven ability to identify, troubleshoot, and resolve intricate data quality issues effectively
- Communication skills: outstanding verbal and written communication skills, with the ability to articulate complex technical concepts to both technical and non-technical stakeholders

Preferred Qualifications:
- Experience with cloud platforms: extensive familiarity with cloud data services (e.g., AWS, Azure, Google Cloud) and their QA tools, including experience in cloud-based testing environments
- Knowledge of data governance: in-depth understanding of data governance principles and practices, including data lineage, metadata management, and compliance requirements
- Experience with CI/CD pipelines: strong knowledge of continuous integration and continuous deployment (CI/CD) practices and tools (e.g., Jenkins, GitLab CI), with experience automating testing within CI/CD workflows
- Certifications: relevant certifications in QA automation or data engineering (e.g., ISTQB, AWS Certified Data Analytics) are highly regarded
- Agile methodologies: proven experience working in Agile/Scrum environments, with a strong understanding of Agile testing practices and principles

Our technology teams operate as business partners, proposing ideas and innovative solutions that enable new organizational capabilities. We collaborate internationally to deliver services and solutions that help everyone be more productive and enable innovation.

Who we are: We are known as Merck & Co., Inc., Rahway, New Jersey, USA in the United States and Canada, and MSD everywhere else. For more than a century, we have been inventing for life, bringing forward medicines and vaccines for many of the world's most challenging diseases. Today, our company continues to be at the forefront of research to deliver innovative health solutions and advance the prevention and treatment of diseases that threaten people and animals around the world.
What we look for: Imagine getting up in the morning for a job as important as helping to save and improve lives around the world. Here, you have that opportunity. You can put your empathy, creativity, digital mastery, or scientific genius to work in collaboration with a diverse group of colleagues who pursue and bring hope to countless people who are battling some of the most challenging diseases of our time. Our team is constantly evolving, so if you are among the intellectually curious, join us and start making your impact today.

Search Firm Representatives, Please Read Carefully: Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs/resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.

Employee Status: Regular
Flexible Work Arrangements: Hybrid
Required Skills: Business Intelligence (BI), Database Administration, Data Engineering, Data Management, Data Modeling, Data Visualization, Design Applications, Information Management, Software Development, Software Development Life Cycle (SDLC), System Designs
Job Posting End Date: 08/31/2025 (a posting is effective until 11:59:59 PM on the day before the listed end date; please apply no later than the day before the end date)
Requisition ID: R345312
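An assumed sketch of the bronze-layer checks this role automates: schema and volume assertions on a PySpark dataset, suitable for raising failures from a CI pipeline. The path, expected schema, and thresholds are illustrative only.

```python
# Hypothetical bronze-layer data-quality checks in PySpark.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("bronze-dq").getOrCreate()

df = spark.read.parquet("s3://pharma-lake/bronze/measurements/")

# Schema drift check: fail fast if upstream contracts change.
actual = {f.name: f.dataType.simpleString() for f in df.schema.fields}
expected = {"record_id": "string", "site_code": "string",
            "measure_value": "double", "ingested_at": "timestamp"}
assert actual == expected, f"Schema drift detected: {actual}"

# Completeness checks.
row_count = df.count()
assert row_count > 0, "Bronze load produced zero rows"

null_keys = df.filter(df.record_id.isNull()).count()
assert null_keys == 0, f"{null_keys} rows missing primary key"
```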

Posted 1 day ago

Apply

5.0 years

6 - 8 Lacs

Hyderābād

Remote


• 5+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
• Experience with data visualization using Tableau, Quicksight, or similar tools
• Experience with data modeling, warehousing, and building ETL pipelines
• Experience with statistical analysis packages such as R, SAS, and Matlab
• Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling

Want to join Earth's most customer-centric company? Do you like to dive deep to understand problems? Are you someone who likes to challenge the status quo? Do you strive to excel at the goals assigned to you? If yes, we have opportunities for you. Global Operations – Artificial Intelligence (GO-AI) at Amazon is looking to hire candidates who can excel in a fast-paced, dynamic environment. Are you somebody who likes to use and analyze big data to drive business decisions? Do you enjoy converting data into insights that business leaders will use to enhance customer decisions worldwide? Do you want to be part of the data team which measures the pulse of innovative machine-vision-based projects? If your answer is yes, join our team. GO-AI is looking for a motivated individual with strong skills and experience in resource utilization planning, process optimization, and execution of scalable and robust operational mechanisms to join the GO-AI Ops DnA team. In this position you will be responsible for supporting our sites to build solutions for the rapidly expanding GO-AI team. The role requires the ability to work with a variety of key stakeholders across job functions and multiple sites. We are looking for an entrepreneurial and analytical program manager who is passionate about their work, understands how to manage service levels across multiple skills/programs, and is willing to move fast and experiment often.

Key job responsibilities
• Maintain and refine straightforward ETL, and write secure, stable, testable, maintainable code with minimal defects while automating manual processes.
• Use one or more industry analytics visualization tools (e.g., Excel, Tableau, Quicksight, PowerBI) and, as needed, statistical methods (e.g., t-test, Chi-squared) to deliver actionable insights to stakeholders.
• Build and own small to mid-size BI solutions with high accuracy and on-time delivery, using data sets, queries, reports, dashboards, analyses, or components of larger solutions to answer straightforward business questions with data, incorporating business intelligence best practices, data management fundamentals, and analysis principles.
• Maintain a good understanding of the relevant data lineage: the sources of data, how metrics are aggregated, and how the resulting business intelligence is consumed, interpreted, and acted upon by the business, so that the end product enables effective, data-driven business decisions.
• Take responsibility for the code, queries, reports, and analyses that are inherited or produced, and have analyses and code reviewed periodically.
• Partner effectively with peer BIEs and others on your team to troubleshoot, research root causes, and propose solutions, either taking ownership of their resolution or ensuring a clear hand-off to the right owner.

About the team
The Global Operations – Artificial Intelligence (GO-AI) team is an initiative which remotely handles exceptions in the Amazon Robotic Fulfillment Centers globally. GO-AI seeks to complement automated vision-based decision-making technologies by providing remote human support for the subset of tasks which require higher cognitive ability and cannot be processed through automated decision making with high confidence. This team provides end-to-end solutions through inbuilt competencies of Operations and strong central specialized teams to deliver programs at Amazon scale. It is operating multiple programs and other new initiatives in partnership with global technology and operations teams.

• Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift
• Experience in data mining, ETL, etc., and using databases in a business environment with large-scale, complex datasets

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
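
As a rough illustration of the SQL-plus-Python workflow this listing describes, the sketch below pulls a weekly summary from a warehouse and reshapes it for a dashboard extract. The cluster endpoint, credentials, and the fc_exceptions table are placeholders invented for the example, not details from the posting.

```python
# Minimal sketch: query Redshift with SQL, reshape with pandas for reporting.
import pandas as pd
import psycopg2  # Redshift speaks the Postgres wire protocol

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439,
    dbname="analytics",
    user="report_user",
    password="...",  # in practice, fetched from a secrets manager
)

query = """
    SELECT site_id, exception_type, COUNT(*) AS exception_count
    FROM fc_exceptions
    WHERE event_date >= CURRENT_DATE - 7
    GROUP BY site_id, exception_type
"""
weekly = pd.read_sql_query(query, conn)

# Pivot into a site-by-type matrix suitable for a Quicksight/Tableau extract.
summary = weekly.pivot_table(
    index="site_id",
    columns="exception_type",
    values="exception_count",
    fill_value=0,
)
print(summary.head())
conn.close()
```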

Posted 1 day ago

Apply

5.0 years

1 - 9 Lacs

Hyderābād

On-site


We have an opportunity to impact your career and provide an adventure where you can push the limits of what's possible. As a Lead Software Engineer at JPMorgan Chase within Consumer and Community Banking - Data Technology, you are an integral part of an agile team that works to enhance, build, and deliver trusted market-leading technology products in a secure, stable, and scalable way. As a core technical contributor, you are responsible for conducting critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job responsibilities
• Executes creative software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems
• Develops secure, high-quality production code, and reviews and debugs code written by others
• Identifies opportunities to eliminate or automate remediation of recurring issues to improve the overall operational stability of software applications and systems
• Leads evaluation sessions with external vendors, startups, and internal teams to drive outcomes-oriented probing of architectural designs, technical credentials, and applicability for use within existing systems and information architecture
• Leads communities of practice across Software Engineering to drive awareness and use of new and leading-edge technologies
• Becomes a technical mentor in the team

Required qualifications, capabilities, and skills
• Formal training or certification on software engineering concepts and 5+ years of applied experience
• Experience in software engineering, including hands-on expertise in ETL/data pipelines and data lake platforms like Teradata and Snowflake
• Hands-on practical experience delivering system design, application development, testing, and operational stability
• Proficiency in AWS services, especially Aurora Postgres RDS
• Proficiency in automation and continuous delivery methods
• Proficient in all aspects of the Software Development Life Cycle
• Advanced understanding of agile methodologies such as CI/CD, Application Resiliency, and Security
• Demonstrated proficiency in software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.)
• In-depth knowledge of the financial services industry and its IT systems

Preferred qualifications, capabilities, and skills
• Experience in re-engineering and migrating on-premises data solutions to and for the cloud
• Experience in Infrastructure as Code (Terraform) for cloud-based data infrastructure
• Experience in building on emerging cloud serverless managed services to minimize/eliminate physical/virtual server footprint
• Advanced Java skills, plus Python (nice to have)

Posted 1 day ago

Apply

5.0 years

0 Lacs

Hyderābād

On-site


JOB DESCRIPTION
We have an opportunity to impact your career and provide an adventure where you can push the limits of what's possible. As a Lead Software Engineer at JPMorgan Chase within Consumer and Community Banking - Data Technology, you are an integral part of an agile team that works to enhance, build, and deliver trusted market-leading technology products in a secure, stable, and scalable way. As a core technical contributor, you are responsible for conducting critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job responsibilities
• Executes creative software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems
• Develops secure, high-quality production code, and reviews and debugs code written by others
• Identifies opportunities to eliminate or automate remediation of recurring issues to improve the overall operational stability of software applications and systems
• Leads evaluation sessions with external vendors, startups, and internal teams to drive outcomes-oriented probing of architectural designs, technical credentials, and applicability for use within existing systems and information architecture
• Leads communities of practice across Software Engineering to drive awareness and use of new and leading-edge technologies
• Becomes a technical mentor in the team

Required qualifications, capabilities, and skills
• Formal training or certification on software engineering concepts and 5+ years of applied experience
• Experience in software engineering, including hands-on expertise in ETL/data pipelines and data lake platforms like Teradata and Snowflake
• Hands-on practical experience delivering system design, application development, testing, and operational stability
• Proficiency in AWS services, especially Aurora Postgres RDS
• Proficiency in automation and continuous delivery methods
• Proficient in all aspects of the Software Development Life Cycle
• Advanced understanding of agile methodologies such as CI/CD, Application Resiliency, and Security
• Demonstrated proficiency in software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.)
• In-depth knowledge of the financial services industry and its IT systems

Preferred qualifications, capabilities, and skills
• Experience in re-engineering and migrating on-premises data solutions to and for the cloud
• Experience in Infrastructure as Code (Terraform) for cloud-based data infrastructure
• Experience in building on emerging cloud serverless managed services to minimize/eliminate physical/virtual server footprint
• Advanced Java skills, plus Python (nice to have)

ABOUT US

Posted 1 day ago

Apply

6.0 - 8.0 years

0 Lacs

Gurugram, Haryana, India

On-site


You should apply if you have:
• Experience in building analytics platforms from scratch for a product-based company.
• Strong proficiency in Power BI, SQL, and data visualization tools.
• Expertise in data modeling, ETL processes, and business intelligence.
• Ability to analyze large datasets and translate insights into actionable recommendations.
• Experience working with AWS, Redshift, BigQuery.
• A passion for data-driven decision-making and problem-solving.
• Excellent communication skills to present insights to stakeholders effectively.

You should not apply if you:
• Lack experience in data visualization tools like Power BI.
• Are unfamiliar with SQL and cloud-based data warehouses.
• Haven't worked with ETL pipelines and data modeling.
• Struggle with translating complex data into business insights.
• Are not comfortable in a fast-paced, product-based company environment.

Skills Required:
• Power BI (DAX, Power Query, Report Optimization)
• SQL (Query Optimization, Data Manipulation)
• ETL Processes & Data Warehousing
• AWS Redshift / Google BigQuery
• Python (preferred but not mandatory)
• Business Intelligence & Data Storytelling
• Stakeholder Communication & Data-Driven Decision-Making

What will you do?
• Build and manage an end-to-end analytics platform for Nutrabay.
• Develop interactive dashboards and reports for business insights.
• Work with large datasets to ensure data integrity and efficiency.
• Collaborate with engineering, product, and marketing teams to define key metrics.
• Implement ETL processes to extract and transform data from multiple sources.
• Ensure data security and governance within the analytics ecosystem.
• Conduct deep-dive analyses on performance metrics, user behavior, and market trends.
• Optimize Power BI reports for performance and scalability.
• Support decision-making with real-time and historical data analysis.

Work Experience:
• 6-8 years of experience in data analysis, business intelligence, or related roles.
• Prior experience in a product-based or e-commerce company is a plus.

Working Days: Monday - Friday
Location: Golf Course Road, Gurugram, Haryana (Work from Office)

Perks:
• Opportunity to build the analytics infrastructure from scratch.
• Learning and development opportunities in a fast-growing company.
• Work alongside a collaborative and talented team.

Why Nutrabay:
We believe in an open, intellectually honest culture where everyone is given the autonomy to contribute and do their life's best work. As part of the dynamic team at Nutrabay, you will have a chance to learn new things, solve new problems, build your competence, and be part of an innovative marketing-and-tech startup that's revolutionising the health industry. Working with Nutrabay can be fun and a unique growth opportunity. Here you will learn how to maximise the potential of your available resources, and you will get the opportunity to do work that helps you master a variety of transferable skills: skills that are relevant across roles and departments. You will feel appreciated and valued for the work you deliver. We are creating a unique company culture that embodies respect and honesty, one that creates more loyal employees than a company that simply shells out cash. We trust our employees and their voice, and ask for their opinions on important business issues.

About Nutrabay:
Nutrabay is the largest health & nutrition store in India. Our vision is to keep growing, maintain a sustainable business model, and continue to be the market leader in this segment by launching many innovative products. We are proud to have served over 1 million customers until now, and our family is constantly growing. We have built a complex and high-converting eCommerce system, and our monthly traffic has grown to a million. We are looking to build a visionary and agile team to help fuel our growth and contribute towards further advancing the continuously evolving product.

Funding: We raised $5 Million in a Series A funding round.

Posted 1 day ago

Apply

3.0 years

5 - 12 Lacs

Hyderābād

On-site


Objectives of this role
• Work with data to solve business problems, building and maintaining the infrastructure to answer questions and improve processes
• Help streamline our data science workflows, adding value to our product offerings and building out the customer lifecycle and retention models
• Work closely with the data science and business intelligence teams to develop data models and pipelines for research, reporting, and machine learning
• Be an advocate for best practices and continued learning

Responsibilities
• Work closely with our data science team to help build complex algorithms that provide unique insights into our data
• Use agile software development processes to make iterative improvements to our back-end systems
• Model front-end and back-end data sources to help draw a more comprehensive picture of user flows throughout the system and to enable powerful data analysis
• Build data pipelines that clean, transform, and aggregate data from disparate sources
• Develop models that can be used to make predictions and answer questions for the overall business

Required skills and qualifications
• Three or more years of experience with Python, SQL, and data visualization/exploration tools
• Familiarity with the AWS ecosystem, specifically Redshift and RDS
• Communication skills, especially for explaining technical concepts to nontechnical business leaders
• Ability to work on a dynamic, research-oriented team that has concurrent projects

Preferred skills and qualifications
• Bachelor's degree (or equivalent) in computer science, information technology, engineering, or a related discipline
• Experience in building or maintaining ETL processes
• Professional certification

Job Types: Full-time, Permanent
Pay: ₹500,000.00 - ₹1,200,000.00 per year
Benefits:
• Health insurance
• Provident Fund
Schedule:
• Day shift, Monday to Friday
• Morning shift
Experience:
• Data engineer: 3 years (Preferred)
• Python: 3 years (Preferred)
Work Location: In person
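
To illustrate the "clean, transform, and aggregate" responsibility above, here is a small self-contained pandas sketch; the event data and column names are invented for the example.

```python
# Toy pipeline: clean raw events, then aggregate per user.
import pandas as pd

raw = pd.DataFrame(
    {
        "user_id": [1, 1, 2, None, 3],
        "event": ["login", "purchase", "login", "login", "purchase"],
        "amount": ["10.50", None, None, "3.20", "99.99"],
    }
)

# Clean: drop rows without a user, coerce amounts to numbers.
clean = raw.dropna(subset=["user_id"]).assign(
    user_id=lambda d: d["user_id"].astype(int),
    amount=lambda d: pd.to_numeric(d["amount"]).fillna(0.0),
)

# Aggregate: one row per user with an event count and total spend.
per_user = clean.groupby("user_id").agg(
    events=("event", "count"),
    total_spend=("amount", "sum"),
)
print(per_user)
```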

Posted 1 day ago

Apply

7.0 years

5 - 7 Lacs

Hyderābād

On-site


About the Role:
Grade Level (for internal use): 10

Role: As a Senior Database Engineer, you will work on multiple datasets that will enable S&P CapitalIQ Pro to serve up value-added Ratings, Research, and related information to institutional clients.

The Team: Our team is responsible for gathering data from multiple sources spread across the globe using different mechanisms (ETL/GG/SQL Rep/Informatica/Data Pipeline) and converting it to a common format which can be used by client-facing UI tools and other data-providing applications. This application is the backbone of many S&P applications and is critical to our clients' needs. You will get to work on a wide range of technologies and tools like Oracle/SQL/.Net/Informatica/Kafka/Sonic. You will have the opportunity every day to work with people from a wide variety of backgrounds and will be able to develop a close team dynamic with coworkers from around the globe. We craft strategic implementations by using the broader capacity of the data and product. Do you want to be part of a team that executes cross-business solutions within S&P Global?

Impact: Our team is responsible for delivering essential and business-critical data with applied intelligence to power the market of the future. This enables our customers to make decisions with conviction. Contribute significantly to the growth of the firm by:
• Developing innovative functionality in existing and new products
• Supporting and maintaining high-revenue productionized products
• Achieving the above intelligently and economically using best practices

Career: This is the place to hone your existing database skills while having the chance to become exposed to fresh technologies. As an experienced member of the team, you will have the opportunity to mentor and coach developers who have recently graduated, and collaborate with developers, business analysts, and product managers who are experts in their domain.

Your skills: You should be able to demonstrate outstanding knowledge and hands-on experience in the areas below:
• Complete SDLC: architecture, design, development, and support of tech solutions
• Playing a key role in the development team to build high-quality, high-performance, scalable code
• Engineering components and common services based on standard corporate development models, languages, and tools
• Producing technical design documents and conducting technical walkthroughs
• Collaborating effectively with technical and non-technical stakeholders
• Being part of a culture that continuously improves the technical design and code base
• Documenting and demonstrating solutions using technical design docs, diagrams, and stubbed code

Our Hiring Manager says: I'm looking for a person who gets excited about technology and is motivated by seeing how our individual contributions and teamwork on world-class web products affect the workflow of thousands of clients, resulting in revenue for the company.

Qualifications
Required:
• Bachelor's degree in Computer Science, Information Systems, or Engineering
• 7+ years of experience with transactional databases like SQL Server, Oracle, PostgreSQL, and other NoSQL databases like Amazon DynamoDB, MongoDB
• Strong database development skills on SQL Server, Oracle
• Strong knowledge of database architecture, data modeling, and data warehousing
• Knowledge of object-oriented design and design patterns
• Familiarity with various design and architectural patterns
• Strong development experience with Microsoft SQL Server
• Experience in cloud-native development and AWS is a big plus
• Experience with Kafka/Sonic broker messaging systems

Nice to have:
• Experience in developing data pipelines using Java or C# is a significant advantage
• Strong knowledge of ETL tools (Informatica, SSIS); exposure to Informatica is an advantage
• Familiarity with Agile and Scrum models
• Working knowledge of VSTS
• Working knowledge of AWS cloud is an added advantage
• Understanding of fundamental design principles for building a scalable system
• Understanding of financial markets and asset classes like Equity, Commodity, Fixed Income, Options, Index/Benchmarks is desirable
• Additionally, experience with Scala, Python, and Spark applications is a plus

About S&P Global Market Intelligence
At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep, and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What's In It For You?
Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all: from finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global. Our benefits include:
• Health & Wellness: Health care coverage designed for the mind and body.
• Flexible Downtime: Generous time off helps keep you energized for your time on.
• Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
• Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
• Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
• Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global:
At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Recruitment Fraud Alert:
If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, "pre-employment training" or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.

Equal Opportunity Employer
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment.

If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group)
Job ID: 316332
Posted On: 2025-06-16
Location: Gurgaon, Haryana, India

Posted 1 day ago

Apply

3.0 years

4 - 6 Lacs

Hyderābād

On-site


• 3+ years of data engineering experience
• Experience with data modeling, warehousing, and building ETL pipelines
• Experience with one or more scripting languages (e.g., Python, KornShell)
• 3+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.

As part of the Last Mile Science & Technology organization, you'll partner closely with Product Managers, Data Scientists, and Software Engineers to drive improvements in Amazon's Last Mile delivery network. You will leverage data and analytics to generate insights that accelerate the scale, efficiency, and quality of the routes we build for our drivers through our end-to-end last mile planning systems. You will develop complex data engineering solutions using the AWS technology stack (S3, Glue, IAM, Redshift, Athena). You should have deep expertise in, and passion for, working with large data sets, building complex data processes, performance tuning, bringing data together from disparate data stores, and programmatically identifying patterns. You will work with business owners to develop and define key business questions and requirements. You will provide guidance and support for other engineers with industry best practices and direction. Analytical ingenuity and leadership, business acumen, effective communication capabilities, and the ability to work effectively with cross-functional teams in a fast-paced environment are critical skills for this role.

Key job responsibilities
• Design, implement, and support data warehouse / data lake infrastructure using the AWS big data stack: Python, Redshift, Quicksight, Glue/Lake Formation, EMR/Spark/Scala, Athena, etc.
• Extract huge volumes of structured and unstructured data from various sources (relational/non-relational/NoSQL databases) and message streams, and construct complex analyses.
• Develop and manage ETLs to source data from various systems and create a unified data model for analytics and reporting.
• Perform detailed source-system analysis, source-to-target data analysis, and transformation analysis.
• Participate in the full development cycle for ETL: design, implementation, validation, documentation, and maintenance.

• Experience with big data technologies such as Hadoop, Hive, Spark, EMR
• Experience with big data processing technology (e.g., Hadoop or Apache Spark), data warehouse technical architecture, infrastructure components, ETL, and reporting/analytic tools and environments

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
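
As a hedged illustration of the Glue-based ETL work this posting describes, below is a minimal AWS Glue PySpark job skeleton. The catalog database, table, status column, and S3 path are hypothetical, and the script only runs inside a Glue job environment.

```python
# Skeleton Glue job: read a cataloged table, filter/aggregate, write Parquet.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Extract: read a (hypothetical) cataloged source table.
routes = glue_context.create_dynamic_frame.from_catalog(
    database="last_mile", table_name="raw_routes"
).toDF()

# Transform: keep completed routes, count stops per station.
stops_per_station = (
    routes.where(routes.status == "COMPLETED")
    .groupBy("station_id")
    .count()
)

# Load: write Parquet to S3 for Athena / Redshift Spectrum to query.
stops_per_station.write.mode("overwrite").parquet(
    "s3://example-bucket/curated/stops_per_station/"
)
job.commit()
```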

Posted 1 day ago

Apply

10.0 years

5 - 10 Lacs

Hyderābād

On-site


Job Information
Date Opened: 06/17/2025
Job Type: Full time
Industry: IT Services
City: Hyderabad
State/Province: Telangana
Country: India
Zip/Postal Code: 500081

About Us
About DATAECONOMY: We are a fast-growing data & analytics company headquartered in Dublin, with offices in Dublin, OH, and Providence, RI, and an advanced technology center in Hyderabad, India. We are clearly differentiated in the data & analytics space via our suite of solutions, accelerators, frameworks, and thought leadership.

Job Description
Job Title: Technical Project Manager
Location: Hyderabad
Employment Type: Full-time
Experience: 10+ years
Domain: Banking and Insurance

We are seeking a Technical Project Manager to lead and coordinate the delivery of data-centric projects. This role bridges the gap between engineering teams and business stakeholders, ensuring the successful execution of technical initiatives, particularly in data infrastructure, pipelines, analytics, and platform integration.

Responsibilities:
• Lead end-to-end project management for data-driven initiatives, including planning, execution, delivery, and stakeholder communication.
• Work closely with data engineers, analysts, and software developers to ensure technical accuracy and timely delivery of projects.
• Translate business requirements into technical specifications and work plans.
• Manage project timelines, risks, resources, and dependencies using Agile, Scrum, or Kanban methodologies.
• Drive the development and maintenance of scalable ETL pipelines, data models, and data integration workflows.
• Oversee code reviews and ensure adherence to data engineering best practices.
• Provide hands-on support, when necessary, in Python-based development or debugging.
• Collaborate with cross-functional teams including Product, Data Science, DevOps, and QA.
• Track project metrics and prepare progress reports for stakeholders.

Requirements
Required Qualifications:
• Bachelor's or master's degree in Computer Science, Information Systems, Engineering, or a related field.
• 10+ years of experience in project management or technical leadership roles.
• Strong understanding of modern data architectures (e.g., data lakes, warehousing, streaming).
• Experience working with cloud platforms like AWS, GCP, or Azure.
• Familiarity with tools such as JIRA, Confluence, Git, and CI/CD pipelines.
• Strong communication and stakeholder management skills.

Benefits
Company standard benefits.

Posted 1 day ago

Apply

Exploring ETL Jobs in India

The ETL (Extract, Transform, Load) job market in India is thriving with numerous opportunities for job seekers. ETL professionals play a crucial role in managing and analyzing data effectively for organizations across various industries. If you are considering a career in ETL, this article will provide you with valuable insights into the job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Chennai

These cities are known for their thriving tech industries and often have a high demand for ETL professionals.

Average Salary Range

The average salary range for ETL professionals in India varies based on experience levels. Entry-level positions typically start at around ₹3-5 lakhs per annum, while experienced professionals can earn upwards of ₹10-15 lakhs per annum.

Career Path

In the ETL field, a typical career path may include roles such as:
  • Junior ETL Developer
  • ETL Developer
  • Senior ETL Developer
  • ETL Tech Lead
  • ETL Architect

As you gain experience and expertise, you can progress to higher-level roles within the ETL domain.

Related Skills

Alongside ETL, professionals in this field are often expected to have skills in:
  • SQL
  • Data Warehousing
  • Data Modeling
  • ETL Tools (e.g., Informatica, Talend)
  • Database Management Systems (e.g., Oracle, SQL Server)

Having a strong foundation in these related skills can enhance your capabilities as an ETL professional.

Interview Questions

Here are 25 interview questions that you may encounter in ETL job interviews:

  • What is ETL and why is it important? (basic)
  • Explain the difference between ETL and ELT processes. (medium)
  • How do you handle incremental loads in ETL processes? (medium) (a worked sketch follows this list)
  • What is a surrogate key in the context of ETL? (basic)
  • Can you explain the concept of data profiling in ETL? (medium)
  • How do you handle data quality issues in ETL processes? (medium)
  • What are some common ETL tools you have worked with? (basic)
  • Explain the difference between a full load and an incremental load. (basic)
  • How do you optimize ETL processes for performance? (medium)
  • Can you describe a challenging ETL project you worked on and how you overcame obstacles? (advanced)
  • What is the significance of data cleansing in ETL? (basic)
  • How do you ensure data security and compliance in ETL processes? (medium)
  • Have you worked with real-time data integration in ETL? If so, how did you approach it? (advanced)
  • What are the key components of an ETL architecture? (basic)
  • How do you handle data transformation requirements in ETL processes? (medium)
  • What are some best practices for ETL development? (medium)
  • Can you explain the concept of change data capture in ETL? (medium)
  • How do you troubleshoot ETL job failures? (medium)
  • What role does metadata play in ETL processes? (basic)
  • How do you handle complex transformations in ETL processes? (medium)
  • What is the importance of data lineage in ETL? (basic)
  • Have you worked with parallel processing in ETL? If so, explain your experience. (advanced)
  • How do you ensure data consistency across different ETL jobs? (medium)
  • Can you explain the concept of slowly changing dimensions in ETL? (medium)
  • How do you document ETL processes for knowledge sharing and future reference? (basic)
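
For the incremental-load question above, here is a runnable sketch of the high-watermark pattern, using SQLite from the Python standard library so the example is self-contained; the tables and dates are invented.

```python
# High-watermark incremental load: only rows newer than the last load move.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE source_orders (id INTEGER, updated_at TEXT);
    CREATE TABLE target_orders (id INTEGER, updated_at TEXT);
    CREATE TABLE etl_watermark (last_loaded TEXT);
    INSERT INTO etl_watermark VALUES ('2024-01-01');
    INSERT INTO source_orders VALUES (1, '2023-12-31'), (2, '2024-01-15');
    """
)

# Extract only rows changed since the last successful load.
(watermark,) = conn.execute("SELECT last_loaded FROM etl_watermark").fetchone()
rows = conn.execute(
    "SELECT id, updated_at FROM source_orders WHERE updated_at > ?",
    (watermark,),
).fetchall()

# Load them, then advance the watermark; COALESCE guards an empty batch.
conn.executemany("INSERT INTO target_orders VALUES (?, ?)", rows)
conn.execute(
    "UPDATE etl_watermark SET last_loaded = "
    "COALESCE((SELECT MAX(updated_at) FROM target_orders), last_loaded)"
)
conn.commit()
print(rows)  # only order 2, dated after the watermark, is picked up
```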

Closing Remarks

As you explore ETL jobs in India, remember to showcase your skills and expertise confidently during interviews. With the right preparation and a solid understanding of ETL concepts, you can embark on a rewarding career in this dynamic field. Good luck with your job search!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
