Home
Jobs

10660 ETL Jobs - Page 26

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

0 years

0 Lacs

Gurugram, Haryana, India

On-site


Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation, and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients run their business more effectively, understand which business questions can be answered, and unlock the answers.

1. Hands-on experience in Qlik Sense development, dashboarding, data modelling, and reporting (ad hoc report generation) techniques.
2. Must be good at data transformation, creation of QVD files, and set analysis.
3. Experienced in application design, architecture, development, and deployment using Qlik Sense. Must be efficient in front-end development and know visualisation best practices.
4. Strong database design and SQL skills. Experienced in RDBMS such as MS SQL Server, Oracle, MySQL, etc.
5. Strong communication skills (verbal/written) to deliver technical insights and interpret data reports for clients, and to understand and serve clients' requirements.
6. Leadership qualities and thoughtful implementation of Qlik Sense best practices to deliver effective Qlik Sense solutions to users.
7. Able to comprehend and translate complex and advanced functional, technical, and business requirements into executable architectural designs. Creating and maintaining technical documentation.
8. Experienced in data integration through extracting, transforming, and loading (ETL) data from various sources.
9. Experience in working on and designing NPrinting reports.
10. Exposure to the latest products in the Qlik suite (such as Replicate and Alerting) would be a huge plus.

Mandatory Skill Set: Qlik
Preferred Skill Set: Qlik
Years of experience required: 4-6
Qualifications: BTech
Required Skills: QlikView

Posted 2 days ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


YOE: 3 to 5 years
Location: Bengaluru

We are seeking a highly motivated and analytical Business Analyst to join our Data Analytics Team. In this role, you will play a critical part in turning raw data into actionable insights that support business decisions and strategic initiatives. You will work closely with cross-functional teams and directly engage with business stakeholders to understand data requirements, design robust data pipelines, and deliver impactful analyses.

• Collaborate with stakeholders across departments to gather and translate business requirements into data models and analytical solutions.
• Act as a key point of contact for business teams, ensuring their analytical needs are clearly understood and addressed effectively.
• Design, develop, and maintain ETL pipelines to ensure seamless data flow across systems.
• Perform advanced SQL queries to extract, manipulate, and analyze large datasets from multiple sources.
• Utilize Python to automate data workflows, perform exploratory data analysis (EDA), and build data transformation scripts.
• Leverage AWS tools (such as S3, Redshift, Glue, Lambda) for data storage, processing, and pipeline orchestration.
• Develop dashboards and reports to visualize key metrics and insights for business leadership.
• Conduct deep-dive analyses on business performance, customer behavior, and operational efficiencies to identify growth opportunities.
• Ensure data accuracy, integrity, and security throughout all analytics processes.

Ideal Candidate
• Bachelor's degree in Computer Science, Data Science, Engineering, Business Analytics, or a related field.
• 2+ years of experience in data analytics, business intelligence, or a similar role.
• Proficient in advanced SQL for complex data manipulation and performance optimization.
• Intermediate proficiency in Python for data processing and automation (Pandas, NumPy, etc.).
• Experience with building and maintaining ETL pipelines.
• Familiarity with AWS data services (e.g., S3, Glue, Lambda, Athena).
• Strong analytical skills with a solid understanding of statistical methods and business performance metrics.
• Experience with data visualization tools such as Tableau and Metabase.
• Excellent communication and interpersonal skills, with the ability to engage directly with business stakeholders and translate their needs into actionable data solutions.
• Strong problem-solving skills and ability to work in a fast-paced, collaborative environment.

Perks, Benefits and Work Culture
• Work with cutting-edge technologies on high-impact systems.
• Be part of a collaborative and technically driven team.
• Enjoy flexible work options and a culture that values learning.
• Competitive salary, benefits, and growth opportunities.
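As a rough illustration of the "advanced SQL for complex data manipulation" this posting asks for, here is a window-function query (month-over-month revenue change per region) runnable with Python's bundled sqlite3 module; the table and values are invented purely for demonstration:

```python
import sqlite3

# Hypothetical sales data, used only to illustrate the query pattern.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, month TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [("South", "2024-01", 100.0), ("South", "2024-02", 150.0),
     ("North", "2024-01", 80.0), ("North", "2024-02", 60.0)],
)

# LAG() over a partition gives the prior month's revenue for each region,
# so the delta can be computed in a single pass, no self-join needed.
rows = conn.execute("""
    SELECT region, month, revenue,
           revenue - LAG(revenue) OVER (
               PARTITION BY region ORDER BY month
           ) AS mom_change
    FROM orders
    ORDER BY region, month
""").fetchall()
```

The same pattern (LAG, LEAD, ROW_NUMBER over partitions) carries over to Redshift and Athena mentioned above; only the connection layer differs.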

Posted 2 days ago

Apply

7.0 years

0 Lacs

Mumbai Metropolitan Region

On-site


Immediate Hire - Core Java + AWS
Location: Mumbai (Powai) - Work from office
Salary: Up to 30 LPA, full time with benefits
Experience: 7-10 years
Interested candidates, please share your updated CV to hr@hmforward.com

Inviting applications for the role of Principal Consultant - AWS Developer. We are seeking an experienced developer with expertise in AWS-based big data solutions, particularly leveraging Apache Spark on AWS EMR, along with strong backend development skills in Java and Spring. The ideal candidate will also possess a solid background in data warehousing, ETL pipelines, and large-scale data processing systems.

Responsibilities
• Design and implement scalable data processing solutions using Apache Spark on AWS EMR.
• Develop microservices and backend components using Java and the Spring framework.
• Build, optimize, and maintain ETL pipelines for structured and unstructured data.
• Integrate data pipelines with AWS services such as S3, Lambda, Glue, Redshift, and Athena.
• Collaborate with data architects, analysts, and DevOps teams to support data warehousing initiatives.
• Write efficient, reusable, and reliable code following best practices.
• Ensure data quality, governance, and lineage across the architecture.
• Troubleshoot and optimize Spark jobs and cloud-based processing workflows.
• Participate in code reviews, testing, and deployments in Agile environments.

Qualifications we seek in you!
Minimum Qualifications
• Bachelor's degree
Preferred Qualifications/Skills
• Strong experience with Apache Spark and AWS EMR in production environments.
• Solid understanding of the AWS ecosystem, including services like S3, Lambda, Glue, Redshift, and CloudWatch.
• Proven experience in designing and managing large-scale data warehousing systems.
• Expertise in building and maintaining ETL pipelines and data transformation workflows.
• Strong SQL skills and familiarity with performance tuning for analytical queries.
• Experience working in Agile development environments using tools such as Git, JIRA, and CI/CD pipelines.
• Familiarity with data modeling concepts and tools (e.g., star schema, snowflake schema).
• Knowledge of data governance tools and metadata management.
• Experience with containerization (Docker, Kubernetes) and serverless architectures.
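The extract-transform-load flow this role centers on can be sketched in miniature. This is plain Python standing in for a Spark/EMR job, with a CSV string standing in for files landed in S3; all names and values are hypothetical:

```python
import csv
import io
import sqlite3

# Toy source data standing in for raw files landed in S3.
raw = "id,amount\n1,10.5\n2,\n3,7.25\n"

def extract(text):
    """Parse the raw landing-zone data into records."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(records):
    """Apply a basic data-quality gate: drop rows with missing
    amounts and cast fields to their target types."""
    return [(int(r["id"]), float(r["amount"])) for r in records if r["amount"]]

def load(rows, conn):
    """Load cleaned rows into the warehouse table (sqlite3 here,
    standing in for Redshift or a Hive table on EMR)."""
    conn.execute("CREATE TABLE IF NOT EXISTS facts (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO facts VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM facts").fetchone()
```

In a real Spark job the three stages map onto `spark.read`, DataFrame transformations, and `DataFrame.write`; the structure (and the quality gate between extract and load) is the part that carries over.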

Posted 2 days ago

Apply

5.0 years

15 Lacs

Cochin

On-site


Job Title: Database Lead (DB Lead)
Location: Kochi
Experience: 5+ years
Compensation: 20–25% hike on current CTC
Employment Type: Full-Time

Roles & Responsibilities:
1. Hands-on experience in writing complex SQL queries, stored procedures, packages, functions, and leveraging SQL analytical functions.
2. Expertise with Microsoft SQL Server tools and services, particularly SSIS (ETL processes).
3. Troubleshoot and support existing Data Warehouse (DW) processes.
4. Perform production-level performance tuning for MS SQL databases.
5. Monitor and report on SQL environment performance and availability metrics; implement best practices for performance optimization.
6. Participate in SQL code reviews with application teams to enforce SQL coding standards.
7. Manage database backup and restore operations, including scheduled Disaster Recovery (DR) tests. Should be well-versed in clustering, replication, and MS SQL restoration techniques.
8. Exhibit strong communication and coordination skills, with the ability to work efficiently under pressure.

Desired Candidate Profile:
· Bachelor's Degree in Engineering (B.Tech) or Master of Computer Applications (MCA).
· Minimum 5 years of relevant work experience in database development/administration.
· Professional certifications in Database Development or Management are highly preferred.
· Experience working in Agile/Scrum environments. Familiarity with JIRA is a plus.

Job Types: Full-time, Permanent
Pay: From ₹1,500,000.00 per year
Schedule: Day shift

Application Question(s):
· Do you have at least 5 years of hands-on experience with Microsoft SQL Server, including writing complex queries, stored procedures, and using SSIS (ETL processes)?
· Do you have experience with database backup/restoration, clustering, and Disaster Recovery (DR) testing in a production environment?
· Are you willing to work from Kochi and open to joining full-time with a 20–25% hike on your current CTC?

Work Location: In person

Posted 2 days ago

Apply

10.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Requisition Number: 101362
Architect II
Location: This will be a hybrid position located in Delhi NCR, Hyderabad, Pune, Trivandrum, or Bangalore, India.

Insight at a Glance
• 14,000+ engaged teammates globally
• #20 on Fortune's World's Best Workplaces™ list
• $9.2 billion in revenue
• Received 35+ industry and partner awards in the past year
• $1.4M+ total charitable contributions in 2023 by Insight globally

Now is the time to bring your expertise to Insight. We are not just a tech company; we are a people-first company. We believe that by unlocking the power of people and technology, we can accelerate transformation and achieve extraordinary results. As a Fortune 500 Solutions Integrator with deep expertise in cloud, data, AI, cybersecurity, and intelligent edge, we guide organisations through complex digital decisions.

About The Role
The Architect II (Data) will focus on leading our Business Intelligence (BI) and Data Warehousing (DW) initiatives. This role involves designing and implementing end-to-end data pipelines using cloud services and data frameworks, collaborating with stakeholders and ETL/BI developers in an agile environment to create scalable, secure data architectures, and ensuring alignment with business requirements, industry best practices, and regulatory compliance.

Responsibilities
• Architect and implement end-to-end data pipelines, data lakes, and warehouses using modern cloud services and architectural patterns.
• Develop and build analytics tools that deliver actionable insights to the business.
• Integrate and manage large, complex data sets to meet strategic business requirements.
• Optimize data processing workflows using frameworks such as PySpark.
• Establish and enforce best practices for data quality, integrity, security, and performance across the entire data ecosystem.
• Collaborate with cross-functional teams to prioritize deliverables and design solutions.
• Develop compelling business cases and return on investment (ROI) analyses to support strategic initiatives.
• Drive process improvements for enhanced data delivery speed and reliability.
• Provide technical leadership, training, and mentorship to team members, promoting a culture of excellence.

Qualifications
• 10+ years in Business Intelligence (BI) solution design, with 8+ years specializing in ETL processes and data warehouse architecture.
• 8+ years of hands-on experience with Azure data services, including Azure Data Factory, Azure Databricks, Azure Data Lake Gen2, Azure SQL DB, Synapse, Power BI, and MS Fabric (knowledge).
• Strong Python and PySpark software engineering proficiency, coupled with a proven track record of building and optimizing big data pipelines, architectures, and datasets.
• Proficient in transforming, processing, and extracting insights from vast, disparate datasets, and building robust data pipelines for metadata, dependency, and workload management.
• Familiarity with software development lifecycles/methodologies, particularly Agile.
• Experience with SAP/ERP/Datasphere data modeling is a significant plus.
• Excellent presentation and collaboration skills, capable of creating formal documentation and supporting cross-functional teams in a dynamic environment.
• Strong problem-solving, time management, and organizational abilities.
• Keen to learn new languages and technologies continually.
• Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or an equivalent field.

What You Can Expect
We're legendary for taking care of you and your family, and for helping you engage with your local community. We want you to enjoy a full, meaningful life and own your career at Insight. Some of our benefits include:
• Freedom to work from another location—even an international destination—for up to 30 consecutive calendar days per year.

But what really sets us apart are our core values of Hunger, Heart, and Harmony, which guide everything we do, from building relationships with teammates, partners, and clients to making a positive impact in our communities. Join us today; your ambITious journey starts here.

Insight is an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, sexual orientation, or any other characteristic protected by law. When you apply, please tell us the pronouns you use and any reasonable adjustments you may need during the interview process. At Insight, we celebrate diversity of skills and experience, so even if you don't feel like your skills are a perfect match, we still want to hear from you!

Insight India Location: Level 16, Tower B, Building No 14, DLF Cyber City IT/ITES SEZ, Sector 24 & 25A, Gurugram, Haryana 122002, India

Posted 2 days ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Hyderabad

Work from Office


Dear Candidate,

Greetings from Tata Consultancy Services (TCS). We are pleased to invite you to our in-person interview drive for professionals with expertise in Snowflake development in Hyderabad.

Interview Drive Details:
Date: 21-Jun-2025
Time: 9:00 AM to 5:00 PM
Venue: TCS Deccan Park - LS1 Zone, Plot No 1, Survey No. 64/2, Software Units Layout, Serilingampally Mandal, Madhapur, Hyderabad - 500081, Telangana.

Role: Snowflake Developer
Required Technical Skill Set: Snowflake
Desired Experience Range: 5 to 10 years
Location of Requirement: Hyderabad

Desired Competencies (Technical/Behavioral Competency)
Must-Have:
• At least 5+ years of relevant work experience in any data warehouse technologies.
• At least 2+ years of experience in designing, implementing, and migrating data/enterprise/engineering workloads onto the Snowflake DWH.
• Should be able to take requirements from the business and coordinate with business and IT teams on clarifications, dependencies, and status reporting.
• As an individual contributor, should be able to create, test, and implement business solutions in Snowflake.
• Experience in implementing DevOps/CI/CD using Azure DevOps or GitLab Actions is preferred.
• Hands-on experience in data modeling.
• Expert in SQL and query performance tuning techniques.
• Experience with ingestion techniques using ETL tools (IICS) and Snowflake's COPY, Snowpipe, and Streamlit utilities.
• Strong in writing Snowflake stored procedures, views, UDFs, etc.
• Good exposure to handling CDC using Streams and Time Travel.
• Proficient in working with Snowflake Tasks, Data Sharing, and data replication.
Good-to-Have: DBT

Responsibility of / Expectations from the Role
1. Good exposure to handling CDC using Streams and Time Travel.
2. Expert in SQL and query performance tuning techniques.
3. Experience with ingestion techniques using ETL tools (IICS) and Snowflake's COPY, Snowpipe, and Streamlit utilities.
4. Strong in writing Snowflake stored procedures, views, UDFs, etc.

We look forward to your confirmation and participation in the interview drive.
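The change data capture (CDC) concept this posting emphasizes can be sketched outside Snowflake. A Snowflake Stream records inserts, updates, and deletes against a table automatically; the plain-Python function below only illustrates the underlying idea by diffing two keyed snapshots (all data is invented for the example):

```python
def cdc_delta(old, new):
    """Compute insert/update/delete sets between two keyed snapshots.

    Snowflake Streams track these change sets on a table natively
    (with Time Travel providing the historical versions); this
    function just demonstrates what such a delta contains.
    """
    inserts = {k: v for k, v in new.items() if k not in old}
    deletes = {k: v for k, v in old.items() if k not in new}
    updates = {k: new[k] for k in new.keys() & old.keys() if new[k] != old[k]}
    return inserts, updates, deletes

# Two snapshots of a hypothetical customer table, keyed by id.
old = {1: "alice", 2: "bob"}
new = {1: "alice", 2: "bobby", 3: "carol"}
ins, upd, dele = cdc_delta(old, new)
```

Downstream, a merge job would apply exactly these three sets to the target table, which is what a `MERGE` statement consuming a Stream does in Snowflake.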

Posted 2 days ago

Apply

10.0 years

0 Lacs

Hyderābād

On-site


Do you want to help one of the most respected companies in the world reinvent its approach to data? At Thomson Reuters, we are recruiting a team of motivated data professionals to transform how we manage and leverage our commercial data assets. It is a unique opportunity to join a diverse and global team with centers of excellence in Toronto, London, and Bangalore. Are you excited about working at the forefront of the data-driven revolution that will change the way a company works? The Thomson Reuters Data and Analytics team is seeking an experienced Lead Engineer, Test Data Management, with a passion for engineering quality assurance solutions for cloud-based data warehouse systems.

About the Role
As Lead Engineer, Test Data Management, you play a crucial role in ensuring the quality and reliability of our enterprise data systems. Your expertise in testing methods, data validation, and automation is essential to bring best-in-class standards to our data products. In this opportunity you will:
• Design test data management frameworks; apply data masking and data sub-setting, and generate synthetic data to create robust test data solutions for enterprise-wide teams.
• Collaborate with engineers, database architects, and data quality stewards to build logical data models, execute data validation, and design manual and automated testing.
• Mentor and lead the testing of key data development projects related to the Data Warehouse and other systems.
• Lead engineering team members in the implementation of test data best practices and the delivery of test data solutions.
• Be a thought leader investigating leading-edge quality technology for test data management and systems functionality, including performance testing for data pipelines.
• Innovate: create ETL mappings, workflows, and functions to move data from multiple sources into target areas.
• Partner across the company with analytics teams, engineering managers, architecture teams, and others to design and agree on solutions that meet business requirements.
• Effectively communicate and liaise with other engineering groups across the organization, data consumers, and business analytic groups.

Utilize your experience in the following areas:
• SQL for data querying, validation, and analysis
• Knowledge of database management systems (e.g., SQL Server, PostgreSQL, MySQL)
• Test data management tools (e.g., K2View, qTest, ALM, Zephyr)
• Proficiency in Python for test automation and data manipulation
• PySpark for big data testing
• Test case design, execution, and defect management
• AWS cloud data practices and DevOps tooling
• Performance testing for data management solutions, especially for complex data flows
• Data security, privacy, and data governance compliance principles

About You
You're a fit for the role of Lead Engineer if you have:
• 10+ years of experience as a tester, developer, or data analyst, with experience establishing end-to-end test strategies and planning for data validation, transformation, and analytics
• Advanced SQL knowledge
• Experience designing and executing test procedures and documenting best practices
• Experience planning and executing regression testing, data validation, and quality assurance
• Advanced command of data warehouse creation, management, and performance strategies
• Experience engineering and implementing data quality systems in the cloud
• Proficiency in a scripting language such as Python
• Hands-on experience with data test automation applications (preference for K2View)
• Identification and remediation of data quality issues
• Data management tools such as K2View, Immuta, Alation, and Informatica
• Agile development
• Business Intelligence and Data Warehousing concepts
• Familiarity with SAP and Salesforce systems
• Intermediate understanding of Big Data technologies
• AWS services and management, including serverless, container, queueing, and monitoring services
• Experience creating manual or automated tests on data pipelines
• Programming languages: Python
• Data interchange formats: Parquet, JSON, CSV
• Version control with GitHub
• Cloud security and compliance, privacy, GDPR

What's in it For You?
• Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office, depending on the role) for our office-based roles, while delivering a seamless experience that is digitally and physically connected.
• Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
• Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensure you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
• Industry Competitive Benefits: We offer comprehensive benefit plans that include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
• Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
• Social Impact: Make an impact in your community with our Social Impact Institute.
We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives. Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world. About Us Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. 
To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
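The data masking this posting lists among its test-data responsibilities can be illustrated with a deterministic pseudonymizer. Dedicated tools (K2View is named above) offer far richer masking; this sketch, using only Python's hashlib with invented inputs, shows the core property: the same real value always maps to the same fake value, so joins and referential integrity survive masking:

```python
import hashlib

def mask_email(email, salt="demo-salt"):
    """Deterministically pseudonymize an email address for test data.

    Hashing (salt + email) and keeping a short digest yields a stable
    but non-reversible substitute; the salt (a hypothetical parameter)
    keeps masked values from being matched across environments.
    """
    digest = hashlib.sha256((salt + email).encode()).hexdigest()[:10]
    return f"user_{digest}@example.com"

# The same source value masks to the same substitute every time,
# so a masked orders table still joins to a masked customers table.
a = mask_email("jane.doe@corp.com")
b = mask_email("jane.doe@corp.com")
```

Sub-setting and synthetic data generation complement this: masking protects real records, while synthetic generation produces records that never existed at all.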

Posted 2 days ago

Apply

8.0 years

0 Lacs

Hyderābād

On-site


The people here at Apple don't just build products; we craft the kind of wonder that's revolutionized entire industries. It's the diversity of those people and their ideas that supports the innovation that runs through everything we do, from amazing technology to industry-leading environmental efforts. Join Apple, and help us leave the world better than we found it. Imagine what you could do here! At Apple, new ideas have a way of becoming extraordinary products, services, and customer experiences very quickly. Bring passion and dedication to your job and there's no telling what you could accomplish. A passion for product ownership and a strong track record will prove critical to success on our team. Be ready to make something extraordinary when here. Multifaceted, encouraging people and innovative, industry-defining technologies are the norm at Apple. Would you like to work in a fast-paced environment where your technical abilities will be challenged on a day-to-day basis? If so, Apple's IS&T (Information Systems and Technology) team is seeking a Software Engineer to work on building and scaling best-in-class data and reporting apps presenting metrics and performance indicators with the least latency and an outstanding user experience. We are looking for a team member who can think creatively and has a real passion for building highly scalable analytical and reporting apps with end users in focus. You will engage directly with key business partners to understand business strategies and solution needs. You will drive and lead functional and technical discussions with development teams and will be expected to design and own end-to-end applications. You will enjoy the benefits of working in a fast-growing business where you are inspired to "Think Different" and where your efforts play a key role in the success of Apple's business.

Description
We're looking for an individual who loves challenges and taking on problems with imaginative solutions, works well in collaborative teams, and can produce high-quality software under tight constraints. You should be a self-starter: self-motivated, able to work independently, collaborate with multiple cross-functional teams across the globe (US, Singapore, India, and Europe), and work on solutions that have a larger impact on Apple's business. You will interact with many other groups and internal teams at Apple to lead and deliver best-in-class products in an exciting, constantly evolving environment.

Minimum Qualifications
• 8+ years of experience developing enterprise applications using Java/J2EE, including Web Services (e.g., RESTful, SOAP), the Spring Framework and Spring Boot, and ORM (e.g., Hibernate).
• Experience with microservices architectures and container-based deployment (e.g., Docker, Kubernetes).
• Strong web development skills (React). Hands-on experience in designing and developing user interfaces, ensuring responsiveness, accessibility, and a user-friendly experience.
• Experience with Relational Database Management Systems (RDBMS) and SQL, as well as multi-modal NoSQL databases, including DocumentDB and GraphDB.

Preferred Qualifications
• Experience working with distributed teams using collaboration tools for software configuration management (e.g., Git/GitHub), agile project management (e.g., Jira), and knowledge repositories (e.g., Confluence/wikis).
• Experience with Extraction, Transformation, and Load (ETL) technologies, data replication, and event streaming.
• Experience with cloud solutions, like Infrastructure as Code (e.g., CloudFormation), Configuration as Code (e.g., Ansible), Elastic Computing, and Virtual Private Clouds (VPCs).
• Proficiency in Test-Driven Development (TDD), Continuous Integration/Continuous Deployment (CI/CD), and DevOps best practices.
• Working experience in Agile development methodology.
• Effective interpersonal, analytical, and communication skills.
• Results-oriented; demonstrates ownership and accountability.
• Bachelor's degree in Computer Science or a related field.

Submit CV

Posted 2 days ago

Apply

10.0 - 12.0 years

9 - 9 Lacs

Hyderābād

On-site


Title: Data Integration Developer – Manager Department: Alpha Data Platform Reports To: Data Integration Lead, Engineering Summary: State Street Global Alpha Data Platform , lets you load, enrich and aggregate investment data. Alpha Clients will be able to manage multi-asset class data from any service provider or data vendor for a more holistic and integrated view of their holdings. This platform reflects State Street’s years of experience servicing complex instruments for our global client base and our investments in building advanced data management technologies. Reporting to the Alpha Development delivery manager in <>, Data Integration Developer is responsible for overall development life cycle leading to successful delivery and support of Alpha Data Platform(ADP) Services to clients. Responsibilities: As a Data Integration Developer, be hands-on ETL/ELT data pipelines (Talend DI), Snowflake DWH, CI/CD deployment Pipelines and data-readiness (data quality) design, development, implementation and address code or data issues. Experience in designing and implementing modern data pipelines for a variety of data sets which include internal/external data sources, complex relationships, various data formats and high-volume. Experience and understanding of ETL Job performance techniques, Exception handling, Query performance tuning/optimizations and data loads meeting the runtime/schedule time SLAs both batch and real-time data uses cases. Demonstrate strong collaborative experience across regions (APAC, EMEA and NA) to come up with design standards, High level design solutions document, cross training and resource onboarding activities. Good understanding of SDLC process, Governance clearance, Peer Code reviews, Unit Test Results, Code deployments, Code Security Scanning, Confluence Jira/Kanban stories. Strong attention to detail during root cause analysis, SQL query debugging and defect issue resolution by working with multiple business/IT stakeholders. 
Qualifications: Education: B.S. degree (or foreign education equivalent) in Computer Science, Engineering, Mathematics, Physics or another technical course of study required. MS degree strongly preferred. Experience: A minimum of 10 to 12 years of experience in data integration/orchestration services, data architecture, design, development and implementation, providing data-driven solutions for client requirements. Experience in Snowflake DWH SQL and SQL Server database query/performance tuning. Strong data warehousing concepts and ETL tools such as Talend Cloud Data Integration. Exposure to financial domain knowledge is considered a plus. Cloud managed services such as source control with GitHub and MS Azure/DevOps are considered a plus. Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus. Exposure to third-party data providers such as FactSet, Opturo, Bloomberg, Reuters, MSCI and other rating agencies is a plus. Supervisory Responsibility: Individual Contributor / Team Lead / Manager of Managers. Travel: May be required on a limited basis.
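The exception handling and data-readiness checks this posting describes can be sketched in a few lines. This is a minimal, hypothetical illustration using SQLite as a stand-in for the real Snowflake/Talend target; the table, column names, and reconciliation rule are invented for the example:

```python
import sqlite3

def load_batch(conn, rows):
    """Load one batch inside a transaction; roll back and re-raise on any
    error so a partial batch never lands in the target table."""
    try:
        with conn:  # commits on success, rolls back on exception
            conn.executemany(
                "INSERT INTO positions (account, symbol, qty) VALUES (?, ?, ?)",
                rows,
            )
    except sqlite3.DatabaseError:
        # A real pipeline would log and alert here to protect the load SLA.
        raise

def reconcile(conn, expected):
    """Post-load data-readiness check: target row count must match the
    source extract before the load is marked complete."""
    (actual,) = conn.execute("SELECT COUNT(*) FROM positions").fetchone()
    return actual == expected

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE positions (account TEXT, symbol TEXT, qty REAL)")

batch = [("A1", "IBM", 100.0), ("A1", "MSFT", 50.0), ("A2", "IBM", 25.0)]
load_batch(conn, batch)
ok = reconcile(conn, expected=len(batch))
print(ok)
```

In practice the reconcile step would compare source-extract counts or checksums against the warehouse table, and a failure would feed the monitoring that keeps batch and real-time loads inside their SLAs.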

Posted 2 days ago

Apply

130.0 years

3 - 6 Lacs

Hyderābād

On-site

GlassDoor logo

Job Description Manager, Quality Engineer The Opportunity Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Be part of an organisation driven by digital technology and data-backed approaches that support a diversified portfolio of prescription medicines, vaccines, and animal health products. Drive innovation and execution excellence. Be a part of a team with a passion for using data, analytics, and insights to drive decision-making, and which creates custom software, allowing us to tackle some of the world's greatest health threats. Our Technology Centres focus on creating a space where teams can come together to deliver business solutions that save and improve lives. An integral part of our company's IT operating model, Tech Centres are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to the other sites, are essential to supporting our business and strategy. A focused group of leaders in each Tech Centre helps ensure we can manage and improve each location, from investing in the growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging, to managing critical emergencies. Together, we leverage the strength of our team to collaborate globally, optimize connections and share best practices across the Tech Centres. Role Overview As Manager, Quality Engineer, you will architect automated testing frameworks for data pipelines and ETL processes, lead comprehensive quality assurance testing, and partner with data engineering to monitor and enhance data reliability.
What will you do in this role: Develop and Implement Advanced Automated Testing Frameworks: Architect, design, and maintain sophisticated automated testing frameworks for data pipelines and ETL processes, ensuring robust data quality and reliability. Conduct Comprehensive Quality Assurance Testing: Lead the execution of extensive testing strategies, including functional, regression, performance, and security testing, to validate data accuracy and integrity across the bronze layer. Monitor and Enhance Data Reliability: Collaborate with the data engineering team to establish and refine monitoring and alerting systems that proactively identify data quality issues and system failures, implementing corrective actions as needed. Leverage Generative AI: Innovate and apply generative AI techniques to enhance testing processes, automate complex data validation scenarios, and improve overall data quality assurance workflows. Collaborate with Cross-Functional Teams: Serve as a key liaison between Data Engineers, Product Analysts, and other stakeholders to deeply understand data requirements and ensure that testing aligns with strategic business objectives. Document and Standardize Testing Processes: Create and maintain comprehensive documentation of testing procedures, results, and best practices, facilitating knowledge sharing and continuous improvement across the organization.
Drive Continuous Improvement Initiatives: Lead efforts to develop and implement best practices for QA automation and reliability, including conducting code reviews, mentoring junior team members, and optimizing testing processes. What You Should Have: Educational Background: Bachelor's degree in computer science, Engineering, Information Technology, or a related field Experience: 4+ years of experience in QA automation, with a strong focus on data quality and reliability testing in complex data engineering environments. Technical Skills: Advanced proficiency in programming languages such as Python, Java, or similar for writing and optimizing automated tests. Extensive experience with testing frameworks and tools (e.g., Selenium, JUnit, pytest) and data validation tools, with a focus on scalability and performance. Deep familiarity with data processing frameworks (e.g., Apache Spark) and data storage solutions (e.g., SQL, NoSQL), including performance tuning and optimization. Strong understanding of generative AI concepts and tools, and their application in enhancing data quality and testing methodologies. Proficiency in using Jira Xray for advanced test management, including creating, executing, and tracking complex test cases and defects. Analytical Skills: Exceptional analytical and problem-solving skills, with a proven ability to identify, troubleshoot, and resolve intricate data quality issues effectively. Communication Skills: Outstanding verbal and written communication skills, with the ability to articulate complex technical concepts to both technical and non-technical stakeholders. Preferred Qualifications Experience with Cloud Platforms: Extensive familiarity with cloud data services (e.g., AWS, Azure, Google Cloud) and their QA tools, including experience in cloud-based testing environments. 
Knowledge of Data Governance: In-depth understanding of data governance principles and practices, including data lineage, metadata management, and compliance requirements. Experience with CI/CD Pipelines: Strong knowledge of continuous integration and continuous deployment (CI/CD) practices and tools (e.g., Jenkins, GitLab CI), with experience in automating testing within CI/CD workflows. Certifications: Relevant certifications in QA automation or data engineering (e.g., ISTQB, AWS Certified Data Analytics) are highly regarded. Agile Methodologies: Proven experience working in Agile/Scrum environments, with a strong understanding of Agile testing practices and principles. Our technology teams operate as business partners, proposing ideas and innovative solutions that enable new organizational capabilities. We collaborate internationally to deliver services and solutions that help everyone be more productive and enable innovation. Who we are We are known as Merck & Co., Inc., Rahway, New Jersey, USA in the United States and Canada and MSD everywhere else. For more than a century, we have been inventing for life, bringing forward medicines and vaccines for many of the world's most challenging diseases. Today, our company continues to be at the forefront of research to deliver innovative health solutions and advance the prevention and treatment of diseases that threaten people and animals around the world. What we look for Imagine getting up in the morning for a job as important as helping to save and improve lives around the world. Here, you have that opportunity. You can put your empathy, creativity, digital mastery, or scientific genius to work in collaboration with a diverse group of colleagues who pursue and bring hope to countless people who are battling some of the most challenging diseases of our time. Our team is constantly evolving, so if you are among the intellectually curious, join us—and start making your impact today. 
#HYDIT2025 Current Employees apply HERE Current Contingent Workers apply HERE Search Firm Representatives Please Read Carefully Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs / resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails. Employee Status: Regular Relocation: VISA Sponsorship: Travel Requirements: Flexible Work Arrangements: Hybrid Shift: Valid Driving License: Hazardous Material(s): Required Skills: Business Intelligence (BI), Database Administration, Data Engineering, Data Management, Data Modeling, Data Visualization, Design Applications, Information Management, Software Development, Software Development Life Cycle (SDLC), System Designs Preferred Skills: Job Posting End Date: 08/31/2025 A job posting is effective until 11:59:59PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date. Requisition ID: R345312
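Data-quality checks of the kind this role automates can be written as plain test functions that a framework like pytest (named in the qualifications) would collect and run. A minimal sketch; the records, field names, and validation rules are invented for illustration:

```python
# Hypothetical bronze-layer extract; in practice this would come from a
# pipeline output table, not an in-memory list.
RECORDS = [
    {"batch_id": "B001", "site": "HYD", "units": 125, "released": True},
    {"batch_id": "B002", "site": "HYD", "units": 90,  "released": False},
    {"batch_id": "B003", "site": "SGP", "units": 210, "released": True},
]

def test_required_fields_present():
    required = {"batch_id", "site", "units", "released"}
    for rec in RECORDS:
        assert required <= rec.keys(), f"missing fields in {rec}"

def test_batch_ids_unique():
    ids = [rec["batch_id"] for rec in RECORDS]
    assert len(ids) == len(set(ids)), "duplicate batch_id found"

def test_units_in_valid_range():
    assert all(0 < rec["units"] < 10_000 for rec in RECORDS)

# pytest would discover these automatically; run directly for a smoke test.
for check in (test_required_fields_present,
              test_batch_ids_unique,
              test_units_in_valid_range):
    check()
print("all checks passed")
```

In a real framework, each check would run against a fresh extract on every pipeline run, and failures would feed the monitoring and alerting systems the role describes.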

Posted 2 days ago

Apply

5.0 years

6 - 8 Lacs

Hyderābād

Remote

GlassDoor logo

- 5+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc. - Experience with data visualization using Tableau, QuickSight, or similar tools - Experience with data modeling, warehousing and building ETL pipelines - Experience in statistical analysis packages such as R, SAS and MATLAB - Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling Want to join Earth's most customer-centric company? Do you like to dive deep to understand problems? Are you someone who likes to challenge the status quo? Do you strive to excel at the goals assigned to you? If yes, we have opportunities for you. Global Operations – Artificial Intelligence (GO-AI) at Amazon is looking to hire candidates who can excel in a fast-paced, dynamic environment. Are you somebody who likes to use and analyze big data to drive business decisions? Do you enjoy converting data into insights that will be used to enhance customer decisions worldwide for business leaders? Do you want to be part of the data team which measures the pulse of innovative machine-vision-based projects? If your answer is yes, join our team. GO-AI is looking for a motivated individual with strong skills and experience in resource utilization planning, process optimization and execution of scalable and robust operational mechanisms to join the GO-AI Ops DnA team. In this position you will be responsible for supporting our sites to build solutions for the rapidly expanding GO-AI team. The role requires the ability to work with a variety of key stakeholders across job functions and multiple sites. We are looking for an entrepreneurial and analytical program manager who is passionate about their work, understands how to manage service levels across multiple skills/programs, and is willing to move fast and experiment often.
Key job responsibilities • Ability to maintain and refine straightforward ETL and write secure, stable, testable, maintainable code with minimal defects, and automate manual processes. • Proficiency in one or more industry analytics visualization tools (e.g. Excel, Tableau/QuickSight/Power BI) and, as needed, statistical methods (e.g. t-test, chi-squared) to deliver actionable insights to stakeholders. • Building and owning small to mid-size BI solutions with high accuracy and on-time delivery, using data sets, queries, reports, dashboards, analyses or components of larger solutions to answer straightforward business questions with data, incorporating business intelligence best practices, data management fundamentals, and analysis principles. • Good understanding of the relevant data lineage: including sources of data; how metrics are aggregated; and how the resulting business intelligence is consumed, interpreted and acted upon by the business, where the end product enables effective, data-driven business decisions. • Taking responsibility for the code, queries, reports and analyses that are inherited or produced, and having analyses and code reviewed periodically. • Effective partnering with peer BIEs and others in your team to troubleshoot, research root causes and propose solutions, either taking ownership of their resolution or ensuring a clear hand-off to the right owner. About the team The Global Operations – Artificial Intelligence (GO-AI) team is an initiative that remotely handles exceptions in Amazon Robotic Fulfillment Centers globally. GO-AI seeks to complement automated vision-based decision-making technologies by providing remote human support for the subset of tasks which require higher cognitive ability and cannot be processed through automated decision making with high confidence. This team provides end-to-end solutions through inbuilt competencies of Operations and strong central specialized teams to deliver programs at Amazon scale.
The team operates multiple programs and other new initiatives in partnership with global technology and operations teams. Preferred qualifications: Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift. Experience in data mining, ETL, etc., and using databases in a business environment with large-scale, complex datasets. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
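The statistical methods this posting names (t-test, chi-squared) reduce to a few lines of arithmetic. A self-contained sketch of a chi-squared test of independence on a 2x2 table, with made-up counts and only the standard library:

```python
# Did variant B change the resolution rate vs. variant A?
# All counts below are invented for the example.
a_yes, a_no = 120, 880   # variant A: resolved / not resolved
b_yes, b_no = 150, 850   # variant B

n = a_yes + a_no + b_yes + b_no
row_a, row_b = a_yes + a_no, b_yes + b_no
col_yes, col_no = a_yes + b_yes, a_no + b_no

# Expected counts under independence: (row total * column total) / grand total
expected = [
    (row_a * col_yes / n, row_a * col_no / n),
    (row_b * col_yes / n, row_b * col_no / n),
]
observed = [(a_yes, a_no), (b_yes, b_no)]

chi2 = sum(
    (o - e) ** 2 / e
    for obs_row, exp_row in zip(observed, expected)
    for o, e in zip(obs_row, exp_row)
)

CRITICAL_95 = 3.841  # chi-squared critical value for df = 1, alpha = 0.05
print(round(chi2, 3), chi2 > CRITICAL_95)  # → 3.854 True
```

With df = 1 the statistic of about 3.85 just exceeds the 5% critical value, so this example would (barely) reject independence; a library such as scipy.stats would return the exact p-value instead of a threshold comparison.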

Posted 2 days ago

Apply

5.0 years

1 - 9 Lacs

Hyderābād

On-site

GlassDoor logo

We have an opportunity to impact your career and provide an adventure where you can push the limits of what's possible. As a Lead Software Engineer at JPMorgan Chase within the Consumer and community banking - Data technology, you are an integral part of an agile team that works to enhance, build, and deliver trusted market-leading technology products in a secure, stable, and scalable way. As a core technical contributor, you are responsible for conducting critical technology solutions across multiple technical areas within various business functions in support of the firm’s business objectives. Job responsibilities Executes creative software solutions, design, development, and technical troubleshooting with ability to think beyond routine or conventional approaches to build solutions or break down technical problems Develops secure high-quality production code, and reviews and debugs code written by others Identifies opportunities to eliminate or automate remediation of recurring issues to improve overall operational stability of software applications and systems Leads evaluation sessions with external vendors, startups, and internal teams to drive outcomes-oriented probing of architectural designs, technical credentials, and applicability for use within existing systems and information architecture Leads communities of practice across Software Engineering to drive awareness and use of new and leading-edge technologies Becomes a technical mentor in the team Required qualifications, capabilities, and skills Formal training or certification on software engineering concepts and 5+ years applied experience Experience in software engineering, including hands-on expertise in ETL/Data pipeline and data lake platforms like Teradata and Snowflake Hands-on practical experience delivering system design, application development, testing, and operational stability Proficiency in AWS services especially in Aurora Postgres RDS Proficiency in automation and continuous delivery 
methods. Proficient in all aspects of the Software Development Life Cycle. Advanced understanding of agile methodologies such as CI/CD, Application Resiliency, and Security. Demonstrated proficiency in software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.). In-depth knowledge of the financial services industry and its IT systems. Preferred qualifications, capabilities, and skills: Experience in re-engineering and migrating on-premises data solutions to and for the cloud. Experience in Infrastructure as Code (Terraform) for cloud-based data infrastructure. Experience in building on emerging cloud serverless managed services to minimize/eliminate physical/virtual server footprint. Advanced Java, plus Python (nice to have).

Posted 2 days ago

Apply

5.0 years

0 Lacs

Hyderābād

On-site

GlassDoor logo

JOB DESCRIPTION We have an opportunity to impact your career and provide an adventure where you can push the limits of what's possible. As a Lead Software Engineer at JPMorgan Chase within the Consumer and community banking - Data technology, you are an integral part of an agile team that works to enhance, build, and deliver trusted market-leading technology products in a secure, stable, and scalable way. As a core technical contributor, you are responsible for conducting critical technology solutions across multiple technical areas within various business functions in support of the firm’s business objectives. Job responsibilities Executes creative software solutions, design, development, and technical troubleshooting with ability to think beyond routine or conventional approaches to build solutions or break down technical problems Develops secure high-quality production code, and reviews and debugs code written by others Identifies opportunities to eliminate or automate remediation of recurring issues to improve overall operational stability of software applications and systems Leads evaluation sessions with external vendors, startups, and internal teams to drive outcomes-oriented probing of architectural designs, technical credentials, and applicability for use within existing systems and information architecture Leads communities of practice across Software Engineering to drive awareness and use of new and leading-edge technologies Becomes a technical mentor in the team Required qualifications, capabilities, and skills Formal training or certification on software engineering concepts and 5+ years applied experience Experience in software engineering, including hands-on expertise in ETL/Data pipeline and data lake platforms like Teradata and Snowflake Hands-on practical experience delivering system design, application development, testing, and operational stability Proficiency in AWS services especially in Aurora Postgres RDS Proficiency in automation and 
continuous delivery methods. Proficient in all aspects of the Software Development Life Cycle. Advanced understanding of agile methodologies such as CI/CD, Application Resiliency, and Security. Demonstrated proficiency in software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.). In-depth knowledge of the financial services industry and its IT systems. Preferred qualifications, capabilities, and skills: Experience in re-engineering and migrating on-premises data solutions to and for the cloud. Experience in Infrastructure as Code (Terraform) for cloud-based data infrastructure. Experience in building on emerging cloud serverless managed services to minimize/eliminate physical/virtual server footprint. Advanced Java, plus Python (nice to have).

Posted 2 days ago

Apply

6.0 - 8.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

You should apply if you have: Experience in building analytics platforms from scratch for a product-based company. Strong proficiency in Power BI, SQL, and data visualization tools. Expertise in data modeling, ETL processes, and business intelligence. Ability to analyze large datasets and translate insights into actionable recommendations. Experience working with AWS, Redshift, BigQuery. A passion for data-driven decision-making and problem-solving. Excellent communication skills to present insights to stakeholders effectively. You should not apply if you: Lack experience in data visualization tools like Power BI. Are unfamiliar with SQL and cloud-based data warehouses. Haven’t worked with ETL pipelines and data modeling. Struggle with translating complex data into business insights. Are not comfortable in a fast-paced, product-based company environment. Skills Required: Power BI (DAX, Power Query, Report Optimization), SQL (Query Optimization, Data Manipulation), ETL Processes & Data Warehousing, AWS Redshift / Google BigQuery, Python (preferred but not mandatory), Business Intelligence & Data Storytelling, Stakeholder Communication & Data-Driven Decision-Making. What will you do? Build and manage an end-to-end analytics platform for Nutrabay. Develop interactive dashboards and reports for business insights. Work with large datasets to ensure data integrity and efficiency. Collaborate with engineering, product, and marketing teams to define key metrics. Implement ETL processes to extract and transform data from multiple sources. Ensure data security and governance within the analytics ecosystem. Conduct deep-dive analyses on performance metrics, user behavior, and market trends. Optimize Power BI reports for performance and scalability. Support decision-making with real-time and historical data analysis. Work Experience: 6-8 years of experience in data analysis, business intelligence, or related roles.
Prior experience in a product-based or e-commerce company is a plus. Working Days: Monday - Friday. Location: Golf Course Road, Gurugram, Haryana (Work from Office). Perks: Opportunity to build the analytics infrastructure from scratch. Learning and development opportunities in a fast-growing company. Work alongside a collaborative and talented team. Why Nutrabay: We believe in an open, intellectually honest culture where everyone is given the autonomy to contribute and do their life's best work. As a part of the dynamic team at Nutrabay, you will have a chance to learn new things, solve new problems, build your competence and be a part of an innovative marketing-and-tech startup that's revolutionising the health industry. Working with Nutrabay can be fun and a unique growth opportunity. Here you will learn how to maximise the potential of your available resources. You will get the opportunity to do work that helps you master a variety of transferable skills, that is, skills that are relevant across roles and departments. You will feel appreciated and valued for the work you deliver. We are creating a unique company culture that embodies respect and honesty, which creates more loyal employees than simply shelling out cash. We trust our employees and their voice, and ask for their opinions on important business issues. About Nutrabay: Nutrabay is the largest health & nutrition store in India. Our vision is to keep growing, maintain a sustainable business model and continue to be the market leader in this segment by launching many innovative products. We are proud to have served over 1 million customers so far, and our family is constantly growing. We have built a complex and high-converting eCommerce system, and our monthly traffic has grown to a million. We are looking to build a visionary and agile team to help fuel our growth and contribute towards further advancing the continuously evolving product.
Funding: We raised $5 Million in a Series A funding round.

Posted 2 days ago

Apply

3.0 years

5 - 12 Lacs

Hyderābād

On-site

GlassDoor logo

Objectives of this role Work with data to solve business problems, building and maintaining the infrastructure to answer questions and improve processes Help streamline our data science workflows, adding value to our product offerings and building out the customer lifecycle and retention models Work closely with the data science and business intelligence teams to develop data models and pipelines for research, reporting, and machine learning Be an advocate for best practices and continued learning Responsibilities Work closely with our data science team to help build complex algorithms that provide unique insights into our data Use agile software development processes to make iterative improvements to our back-end systems Model front-end and back-end data sources to help draw a more comprehensive picture of user flows throughout the system and to enable powerful data analysis Build data pipelines that clean, transform, and aggregate data from disparate sources Develop models that can be used to make predictions and answer questions for the overall business Required skills and qualifications Three or more years of experience with Python, SQL, and data visualization/exploration tools Familiarity with the AWS ecosystem, specifically Redshift and RDS Communication skills, especially for explaining technical concepts to nontechnical business leaders Ability to work on a dynamic, research-oriented team that has concurrent projects Preferred skills and qualifications Bachelor’s degree (or equivalent) in computer science, information technology, engineering, or related discipline Experience in building or maintaining ETL processes Professional certification Job Types: Full-time, Permanent Pay: ₹500,000.00 - ₹1,200,000.00 per year Benefits: Health insurance Provident Fund Schedule: Day shift Monday to Friday Morning shift Experience: data engineer: 3 years (Preferred) Python: 3 years (Preferred) Work Location: In person
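A "clean, transform, and aggregate" pipeline step like the one this role describes can be prototyped in plain Python before committing to a framework. A minimal, hypothetical sketch; the records and cleaning rules are invented for illustration:

```python
from collections import defaultdict

# Toy records from two "disparate sources" with inconsistent formats.
raw = [
    {"user": " alice ", "event": "purchase", "amount": "19.99"},
    {"user": "BOB",     "event": "purchase", "amount": "5.00"},
    {"user": "alice",   "event": "purchase", "amount": None},   # unparseable row
    {"user": "bob",     "event": "refund",   "amount": "-5.00"},
]

def clean(rows):
    """Drop rows that cannot be parsed and normalize types and casing."""
    for r in rows:
        if r["amount"] is None:
            continue
        yield {"user": r["user"].strip().lower(),
               "event": r["event"],
               "amount": float(r["amount"])}

def aggregate(rows):
    """Roll cleaned events up to a per-user total."""
    totals = defaultdict(float)
    for r in rows:
        totals[r["user"]] += r["amount"]
    return dict(totals)

totals = aggregate(clean(raw))
print(totals)  # → {'alice': 19.99, 'bob': 0.0}
```

In production the same shape survives, with the in-memory list replaced by reads from Redshift/RDS and the result written back to a reporting table.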

Posted 2 days ago

Apply

7.0 years

5 - 7 Lacs

Hyderābād

On-site

GlassDoor logo

About the Role: Grade Level (for internal use): 10 Role: As a Senior Database Engineer, you will work on multiple datasets that enable S&P Capital IQ Pro to serve up value-added ratings, research and related information to institutional clients. The Team: Our team is responsible for gathering data from multiple sources spread across the globe using different mechanisms (ETL/GG/SQL Rep/Informatica/Data Pipeline) and converting them to a common format which can be used by client-facing UI tools and other data-providing applications. This application is the backbone of many S&P applications and is critical to our client needs. You will get to work on a wide range of technologies and tools like Oracle/SQL/.NET/Informatica/Kafka/Sonic. You will have the opportunity every day to work with people from a wide variety of backgrounds and will be able to develop a close team dynamic with coworkers from around the globe. We craft strategic implementations by using the broader capacity of the data and product. Do you want to be part of a team that executes cross-business solutions within S&P Global? Impact: Our team is responsible for delivering essential, business-critical data with applied intelligence to power the market of the future. This enables our customers to make decisions with conviction. Contribute significantly to the growth of the firm by: developing innovative functionality in existing and new products; supporting and maintaining high-revenue productionized products; achieving the above intelligently and economically using best practices. Career: This is the place to hone your existing database skills while having the chance to become exposed to fresh technologies. As an experienced member of the team, you will have the opportunity to mentor and coach developers who have recently graduated, and collaborate with developers, business analysts and product managers who are experts in their domain.
Your skills: You should be able to demonstrate outstanding knowledge and hands-on experience in the areas below: Complete SDLC: architecture, design, development and support of tech solutions. Play a key role in the development team to build high-quality, high-performance, scalable code. Engineer components and common services based on standard corporate development models, languages and tools. Produce technical design documents and conduct technical walkthroughs. Collaborate effectively with technical and non-technical stakeholders. Be part of a culture of continuous improvement of the technical design and code base. Document and demonstrate solutions using technical design docs, diagrams and stubbed code. Our Hiring Manager says: I’m looking for a person who gets excited about technology and is motivated by seeing how our individual contributions and teamwork on world-class web products affect the workflows of thousands of clients, resulting in revenue for the company. Qualifications Required: Bachelor’s degree in Computer Science, Information Systems or Engineering. 7+ years of experience on transactional databases like SQL Server, Oracle and PostgreSQL, and NoSQL databases like Amazon DynamoDB and MongoDB. Strong database development skills on SQL Server, Oracle. Strong knowledge of database architecture, data modeling and data warehousing. Knowledge of object-oriented design and design patterns; familiarity with various design and architectural patterns. Strong development experience with Microsoft SQL Server. Experience in cloud-native development and AWS is a big plus. Experience with Kafka/Sonic broker messaging systems. Nice to have: Experience in developing data pipelines using Java or C# is a significant advantage. Strong knowledge of ETL tools such as Informatica and SSIS; exposure to Informatica is an advantage. Familiarity with Agile and Scrum models. Working knowledge of VSTS. Working knowledge of AWS cloud is an added advantage.
Understanding of fundamental design principles for building a scalable system. Understanding of financial markets and asset classes like Equity, Commodity, Fixed Income, Options, Index/Benchmarks is desirable. Additionally, experience with Scala, Python and Spark applications is a plus. About S&P Global Market Intelligence At S&P Global Market Intelligence, a division of S&P Global we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence . What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. 
We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.
Our Values: Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.
Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include:
Health & Wellness: Health care coverage designed for the mind and body.
Flexible Downtime: Generous time off helps keep you energized for your time on.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference.
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries
Global Hiring and Opportunity at S&P Global:
At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent.
By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
Recruitment Fraud Alert:
If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.
-----------------------------------------------------------
Equal Opportunity Employer
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment.
If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.
US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf
-----------------------------------------------------------
20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group)
Job ID: 316332
Posted On: 2025-06-16
Location: Gurgaon, Haryana, India
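The transactional-database work described in this posting leans heavily on idempotent write patterns. Below is a minimal sketch of an upsert, with sqlite3 standing in for SQL Server/Oracle (SQL Server itself would typically use MERGE); the table and column names are illustrative assumptions, not from the posting.

```python
import sqlite3

# Hypothetical prices table; ON CONFLICT ... DO UPDATE is SQLite/PostgreSQL
# syntax (SQLite >= 3.24), shown here as a stand-in for a T-SQL MERGE.
def upsert_price(conn, symbol, price):
    conn.execute(
        """
        INSERT INTO prices (symbol, price) VALUES (?, ?)
        ON CONFLICT(symbol) DO UPDATE SET price = excluded.price
        """,
        (symbol, price),
    )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE prices (symbol TEXT PRIMARY KEY, price REAL)")
upsert_price(conn, "SPX", 5300.0)
upsert_price(conn, "SPX", 5315.5)  # second call updates; no duplicate row
rows = conn.execute("SELECT symbol, price FROM prices").fetchall()
print(rows)  # [('SPX', 5315.5)]
```

The primary-key conflict clause is what makes replayed loads safe: re-running the same feed converges on the latest value instead of accumulating duplicates.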

Posted 2 days ago


3.0 years

4 - 6 Lacs

Hyderābād

On-site


- 3+ years of data engineering experience
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with one or more scripting languages (e.g., Python, KornShell)
- 3+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
As part of the Last Mile Science & Technology organization, you’ll partner closely with Product Managers, Data Scientists, and Software Engineers to drive improvements in Amazon's Last Mile delivery network. You will leverage data and analytics to generate insights that accelerate the scale, efficiency, and quality of the routes we build for our drivers through our end-to-end last mile planning systems. You will develop complex data engineering solutions using the AWS technology stack (S3, Glue, IAM, Redshift, Athena). You should have deep expertise in and passion for working with large data sets, building complex data processes, performance tuning, bringing data in from disparate data stores, and programmatically identifying patterns. You will work with business owners to develop and define key business questions and requirements. You will provide guidance and support for other engineers with industry best practices and direction. Analytical ingenuity and leadership, business acumen, effective communication capabilities, and the ability to work effectively with cross-functional teams in a fast-paced environment are critical skills for this role.
Key job responsibilities
• Design, implement, and support data warehouse / data lake infrastructure using the AWS big data stack, Python, Redshift, QuickSight, Glue/Lake Formation, EMR/Spark/Scala, Athena, etc.
• Extract huge volumes of structured and unstructured data from various sources (relational/non-relational/NoSQL databases) and message streams, and construct complex analyses.
• Develop and manage ETLs to source data from various systems and create a unified data model for analytics and reporting
• Perform detailed source-system analysis, source-to-target data analysis, and transformation analysis
• Participate in the full development cycle for ETL: design, implementation, validation, documentation, and maintenance.
- Experience with big data technologies such as Hadoop, Hive, Spark, EMR
- Experience with big data processing technology (e.g., Hadoop or Apache Spark), data warehouse technical architecture, infrastructure components, ETL, and reporting/analytic tools and environments
Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
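The cleanse-and-validate step called out in the responsibilities above can be sketched in a few lines. This is a hedged illustration only: records from two disparate sources (CSV and JSON here) are normalized to one schema, and rows that fail validation are dropped; the field names are invented for the example, not Amazon's schema.

```python
import csv
import io
import json

# Normalize one raw record to the target schema, or return None if it fails
# validation (a real pipeline would route failures to a dead-letter store).
def normalize(record):
    try:
        return {"route_id": str(record["route_id"]),
                "stops": int(record["stops"])}
    except (KeyError, ValueError):
        return None

# Two stand-in sources: a CSV extract and a JSON message payload.
csv_src = io.StringIO("route_id,stops\nR1,12\nR2,notanumber\n")
json_src = '[{"route_id": "R3", "stops": 7}]'

records = list(csv.DictReader(csv_src)) + json.loads(json_src)
clean = [r for r in (normalize(rec) for rec in records) if r is not None]
print(clean)  # [{'route_id': 'R1', 'stops': 12}, {'route_id': 'R3', 'stops': 7}]
```

The bad row (`stops` = "notanumber") is filtered out rather than crashing the load, which is the usual trade-off when unifying disparate feeds into one model.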

Posted 2 days ago


10.0 years

5 - 10 Lacs

Hyderābād

On-site


Job Information
Date Opened: 06/17/2025
Job Type: Full time
Industry: IT Services
City: Hyderabad
State/Province: Telangana
Country: India
Zip/Postal Code: 500081
About Us
About DATAECONOMY: We are a fast-growing data & analytics company headquartered in Dublin, OH, with offices in Providence, RI, and an advanced technology center in Hyderabad, India. We are clearly differentiated in the data & analytics space via our suite of solutions, accelerators, frameworks, and thought leadership.
Job Description
Job Title: Technical Project Manager
Location: Hyderabad
Employment Type: Full-time
Experience: 10+ years
Domain: Banking and Insurance
We are seeking a Technical Project Manager to lead and coordinate the delivery of data-centric projects. This role bridges the gap between engineering teams and business stakeholders, ensuring the successful execution of technical initiatives, particularly in data infrastructure, pipelines, analytics, and platform integration.
Responsibilities:
Lead end-to-end project management for data-driven initiatives, including planning, execution, delivery, and stakeholder communication.
Work closely with data engineers, analysts, and software developers to ensure technical accuracy and timely delivery of projects.
Translate business requirements into technical specifications and work plans.
Manage project timelines, risks, resources, and dependencies using Agile, Scrum, or Kanban methodologies.
Drive the development and maintenance of scalable ETL pipelines, data models, and data integration workflows.
Oversee code reviews and ensure adherence to data engineering best practices.
Provide hands-on support, when necessary, in Python-based development or debugging.
Collaborate with cross-functional teams including Product, Data Science, DevOps, and QA.
Track project metrics and prepare progress reports for stakeholders.
Requirements
Required Qualifications:
Bachelor’s or master’s degree in Computer Science, Information Systems, Engineering, or a related field.
10+ years of experience in project management or technical leadership roles.
Strong understanding of modern data architectures (e.g., data lakes, warehousing, streaming).
Experience working with cloud platforms like AWS, GCP, or Azure.
Familiarity with tools such as JIRA, Confluence, Git, and CI/CD pipelines.
Strong communication and stakeholder management skills.
Benefits
Company standard benefits.

Posted 2 days ago


3.0 years

0 Lacs

Hyderābād

On-site


JOB DESCRIPTION
You’re ready to gain the skills and experience needed to grow within your role and advance your career — and we have the perfect software engineering opportunity for you. As a Data Engineer III at JPMorgan Chase within the Consumer & Community Banking Technology Team, you are part of an agile team that works to enhance, design, and deliver the software components of the firm’s state-of-the-art technology products in a secure, stable, and scalable way. As an emerging member of a software engineering team, you execute software solutions through the design, development, and technical troubleshooting of multiple components within a technical product, application, or system, while gaining the skills and experience needed to grow within your role.
Job responsibilities
Executes standard software solutions, design, development, and technical troubleshooting
Writes secure and high-quality code using the syntax of at least one programming language with limited guidance
Designs, develops, codes, and troubleshoots with consideration of upstream and downstream systems and technical implications
Applies knowledge of tools within the Software Development Life Cycle toolchain to improve the value realized by automation
Applies technical troubleshooting to break down solutions and solve technical problems of basic complexity
Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems
Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture
Designs and develops data pipelines end to end using PySpark, Java, Python and AWS services
Utilizes container orchestration services, including Kubernetes, and a variety of AWS tools and services
Learns and applies system processes, methodologies, and skills for the development of secure, stable code and systems
Adds to team culture of diversity, equity, inclusion, and respect
Required qualifications, capabilities, and skills
Formal training or certification on software engineering concepts and 3+ years of applied experience
Hands-on practical experience in system design, application development, testing, and operational stability
Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages
Hands-on practical experience in developing Spark-based frameworks for end-to-end ETL, ELT and reporting solutions using key components like Spark and Spark Streaming
Proficiency in coding in one or more languages – Core Java, Python and PySpark
Experience with relational and data warehouse databases, and cloud implementation experience with AWS, including:
AWS Data Services: proficiency in Lake Formation, Glue ETL (or EMR), S3, Glue Catalog, Athena, Airflow (or Lambda + Step Functions + EventBridge), ECS clusters and ECS apps
Data De/Serialization: expertise in at least 2 of the formats: Parquet, Iceberg, Avro, JSON
AWS Data Security: good understanding of security concepts such as Lake Formation, IAM, service roles, encryption, KMS, Secrets Manager
Proficiency in automation and continuous delivery methods
Preferred qualifications, capabilities, and skills
Experience in Snowflake is nice to have
Solid understanding of agile methodologies such as CI/CD, Application Resiliency, and Security
In-depth knowledge of the financial services industry and their IT systems
Practical cloud-native experience, preferably AWS
ABOUT US
JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world’s most prominent corporate, institutional and government clients under the J.P.
Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management. We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation. ABOUT THE TEAM Our Consumer & Community Banking division serves our Chase customers through a range of financial services, including personal banking, credit cards, mortgages, auto financing, investment advice, small business loans and payment processing. We’re proud to lead the U.S. in credit card sales and deposit growth and have the most-used digital solutions – all while ranking first in customer satisfaction.
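The qualifications in this posting call out data de/serialization formats (Parquet, Iceberg, Avro, JSON). As a minimal, dependency-free illustration of the simplest case, the sketch below serializes a record to JSON and verifies the round trip; the field names are invented for the example.

```python
import json

# An illustrative record; real pipeline payloads would follow a governed schema.
record = {"txn_id": 1001, "amount": 42.50, "currency": "USD"}

payload = json.dumps(record, sort_keys=True)  # serialize for the wire / storage
restored = json.loads(payload)                # deserialize downstream

# The round trip must preserve the payload exactly for JSON-safe types.
assert restored == record
print(payload)  # {"amount": 42.5, "currency": "USD", "txn_id": 1001}
```

Binary columnar formats like Parquet or Avro add schemas and compression on top of this same contract: whatever a producer writes, a consumer must be able to reconstruct losslessly.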

Posted 2 days ago


0 years

0 Lacs

Telangana

On-site


About Chubb
Chubb is a world leader in insurance. With operations in 54 countries and territories, Chubb provides commercial and personal property and casualty insurance, personal accident and supplemental health insurance, reinsurance and life insurance to a diverse group of clients. The company is defined by its extensive product and service offerings, broad distribution capabilities, exceptional financial strength and local operations globally. Parent company Chubb Limited is listed on the New York Stock Exchange (NYSE: CB) and is a component of the S&P 500 index. Chubb employs approximately 40,000 people worldwide. Additional information can be found at: www.chubb.com.
About Chubb India
At Chubb India, we are on an exciting journey of digital transformation driven by a commitment to engineering excellence and analytics. We are proud to share that we have been officially certified as a Great Place to Work® for the third consecutive year, a reflection of the culture at Chubb where we believe in fostering an environment where everyone can thrive, innovate, and grow. With a team of over 2500 talented professionals, we encourage a start-up mindset that promotes collaboration, diverse perspectives, and a solution-driven attitude. We are dedicated to building expertise in engineering, analytics, and automation, empowering our teams to excel in a dynamic digital landscape. We offer an environment where you will be part of an organization that is dedicated to solving real-world challenges in the insurance industry. Together, we will work to shape the future through innovation and continuous learning.
Key Responsibilities:
Design, implement, and maintain database systems using SQL and Azure Synapse Analytics.
Monitor database performance, implement changes, and apply new patches and versions when required.
Ensure data integrity and security by implementing and managing appropriate access controls and backup/recovery procedures.
Collaborate with development teams to design and optimize database queries and structures.
Troubleshoot and resolve database issues, ensuring minimal downtime and data loss.
Develop and maintain documentation related to database configurations, processes, and service records.
Assist in the design and implementation of data warehousing solutions using Azure Synapse.
Provide support for data migration and integration projects.
Stay updated with the latest industry trends and technologies to ensure our database systems are current and efficient.
Qualifications:
Bachelor’s degree in Computer Science, Information Technology, or a related field.
Proven experience as a Database Administrator with a focus on SQL and Azure Synapse Analytics.
Strong knowledge of database structure systems and data mining.
Experience with database management tools and software.
Excellent problem-solving skills and ability to work independently.
Strong communication skills to collaborate effectively with team members and stakeholders.
Familiarity with cloud-based database solutions and services, particularly within the Azure ecosystem.
Preferred Skills:
Experience with other database technologies such as Oracle, MySQL, or PostgreSQL.
Knowledge of data warehousing concepts and ETL processes.
Certification in SQL Server or Azure Synapse Analytics is a plus.
Why Chubb?
Join Chubb to be part of a leading global insurance company! Our constant focus on employee experience along with a start-up-like culture empowers you to achieve impactful results.
Industry leader: Chubb is a world leader in the insurance industry, powered by underwriting and engineering excellence.
A Great Place to Work: Chubb India has been recognized as a Great Place to Work® for the years 2023-2024, 2024-2025 and 2025-2026.
Laser focus on excellence: At Chubb we pride ourselves on our culture of greatness, where excellence is a mindset and a way of being.
We constantly seek new and innovative ways to excel at work and deliver outstanding results.
Start-Up Culture: Embracing the spirit of a start-up, our focus on speed and agility enables us to respond swiftly to market requirements, while a culture of ownership empowers employees to drive results that matter.
Growth and success: As we continue to grow, we are steadfast in our commitment to provide our employees with the best work experience, enabling them to advance their careers in a conducive environment.
Employee Benefits
Our company offers a comprehensive benefits package designed to support our employees’ health, well-being, and professional growth. Employees enjoy flexible work options, generous paid time off, and robust health coverage, including treatment for dental and vision related requirements. We invest in the future of our employees through continuous learning opportunities and career advancement programs, while fostering a supportive and inclusive work environment. Our benefits include:
Savings and Investment plans: We provide specialized benefits like Corporate NPS (National Pension Scheme), Employee Stock Purchase Plan (ESPP), Long-Term Incentive Plan (LTIP), Retiral Benefits and Car Lease that help employees optimally plan their finances.
Upskilling and career growth opportunities: With a focus on continuous learning, we offer customized programs that support upskilling, like Education Reimbursement Programs, Certification programs and access to global learning programs.
Health and Welfare Benefits: We care about our employees’ well-being in and out of work and have benefits like Employee Assistance Program (EAP), yearly free health campaigns and comprehensive insurance benefits.
Application Process
Our recruitment process is designed to be transparent and inclusive.
Step 1: Submit your application via the Chubb Careers Portal.
Step 2: Engage with our recruitment team for an initial discussion.
Step 3: Participate in HackerRank assessments and technical/functional interviews (if applicable).
Step 4: Final interaction with Chubb leadership.
Join Us
With you, Chubb is better. Whether you are solving challenges on a global stage or creating innovative solutions for local markets, your contributions will help shape the future. If you value integrity, innovation, and inclusion, and are ready to make a difference, we invite you to be part of Chubb India’s journey.
Apply Now: Chubb External Careers TBD
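The DBA responsibilities above include implementing backup/recovery procedures. A minimal sketch of the idea, using sqlite3's online-backup API purely as a stand-in for SQL Server or Azure Synapse tooling; in production this would be a managed, scheduled backup job, and the schema here is invented for illustration.

```python
import sqlite3

# Source database with some data to protect.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE policies (id INTEGER PRIMARY KEY, holder TEXT)")
src.execute("INSERT INTO policies VALUES (1, 'A. Kumar')")
src.commit()

# Destination for the backup (a file path in practice; in-memory here so the
# sketch is self-contained).
dest = sqlite3.connect(":memory:")
src.backup(dest)  # online backup: copies schema and rows while src stays live

# Recovery check: the restored copy must serve the same data.
rows = dest.execute("SELECT holder FROM policies").fetchall()
print(rows)  # [('A. Kumar',)]
```

The key point carries over to any engine: a backup is only as good as the restore you have verified, so validation queries against the restored copy belong in the procedure itself.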

Posted 2 days ago


2.0 years

0 Lacs

Telangana

On-site


Design and develop QlikView and Qlik Sense dashboards and reports.
Collaborate with business stakeholders to gather and understand requirements.
Perform data extraction, transformation, and loading (ETL) processes.
Optimize Qlik applications for performance and usability.
Ensure data accuracy and consistency across all BI solutions.
Conduct testing and validation of Qlik applications.
Provide ongoing support and troubleshooting for Qlik solutions.
Stay up-to-date with the latest Qlik technologies and industry trends.
Qualifications
Bachelor’s degree in Computer Science, Information Technology, or a related field.
2+ years of experience in Qlik development (QlikView and Qlik Sense).
Strong understanding of data visualization best practices.
Proficiency in SQL and data modeling.
Experience with ETL processes and tools.
Excellent problem-solving and analytical skills.
Strong communication and interpersonal skills.
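One duty listed here is ensuring data accuracy and consistency across BI solutions. A common way to do that is a reconciliation check between the source table and the extract a dashboard loads; the sketch below uses sqlite3 and an invented schema as stand-ins for the real source systems and Qlik extract.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales_src (region TEXT, amount REAL);
CREATE TABLE sales_extract (region TEXT, amount REAL);
INSERT INTO sales_src VALUES ('north', 10.0), ('south', 20.0);
INSERT INTO sales_extract SELECT * FROM sales_src;  -- the ETL step under test
""")

def profile(conn, table):
    # Row count plus a rounded control total guards against partial loads.
    n, total = conn.execute(
        f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {table}").fetchone()
    return n, round(total, 2)

match = profile(conn, "sales_src") == profile(conn, "sales_extract")
print(match)  # True when the extract reconciles with the source
```

Comparing counts and control totals is cheap enough to run after every reload, and catches the most common failure mode of BI extracts: a silently truncated load.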

Posted 2 days ago


15.0 years

0 Lacs

Hyderābād

On-site


Project Role: Software Development Lead
Project Role Description: Develop and configure software systems either end-to-end or for a specific stage of the product lifecycle. Apply knowledge of technologies, applications, methodologies, processes and tools to support a client, project or entity.
Must have skills: SAP BusinessObjects Data Services
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Software Development Lead, you will develop and configure software systems, either end-to-end or for specific stages of the product lifecycle. Your typical day will involve collaborating with various teams to ensure the successful implementation of software solutions, applying your knowledge of technologies and methodologies to support projects and clients effectively. You will engage in problem-solving activities, guiding your team through challenges while ensuring that project goals are met efficiently and effectively.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with strategic objectives.
Professional & Technical Skills:
- Must To Have Skills: Proficiency in SAP BusinessObjects Data Services.
- Strong understanding of data integration and transformation processes.
- Experience with ETL (Extract, Transform, Load) methodologies.
- Familiarity with database management systems and SQL.
- Ability to troubleshoot and optimize data workflows.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in SAP BusinessObjects Data Services.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.

Posted 2 days ago


5.0 years

6 - 9 Lacs

Hyderābād

Remote


Job Description
Role Overview: A Data Engineer is responsible for designing, building, and maintaining robust data pipelines and infrastructure that facilitate the collection, storage, and processing of large datasets. They collaborate with data scientists and analysts to ensure data is accessible, reliable, and optimized for analysis. Key tasks include data integration, ETL (Extract, Transform, Load) processes, and managing databases and cloud-based systems. Data engineers play a crucial role in enabling data-driven decision-making and ensuring data quality across organizations.
What will you do in this role:
Develop comprehensive High-Level Technical Design and Data Mapping documents to meet specific business integration requirements.
Own the data integration and ingestion solutions throughout the project lifecycle, delivering key artifacts such as data flow diagrams and source system inventories.
Provide end-to-end delivery ownership for assigned data pipelines, performing cleansing, processing, and validation on the data to ensure its quality.
Define and implement robust Test Strategies and Test Plans, ensuring end-to-end accountability for middleware testing and evidence management.
Collaborate with the Solutions Architecture and Business Analyst teams to analyze system requirements and prototype innovative integration methods.
Exhibit a hands-on leadership approach, ready to engage in coding, debugging, and all necessary actions to ensure the delivery of high-quality, scalable products.
Influence and drive cross-product teams and collaboration while coordinating the execution of complex, technology-driven initiatives within distributed and remote teams.
Work closely with various platforms and competencies to enrich the purpose of Enterprise Integration and guide their roadmaps to address current and emerging data integration and ingestion capabilities.
Design ETL/ELT solutions, lead comprehensive system and integration testing, and outline standards and architectural toolkits to underpin our data integration efforts.
Analyze data requirements and translate them into technical specifications for ETL processes.
Develop and maintain ETL workflows, ensuring optimal performance and error-handling mechanisms are in place.
Monitor and troubleshoot ETL processes to ensure timely and successful data delivery.
Collaborate with data analysts and other stakeholders to ensure alignment between data architecture and integration strategies.
Document integration processes, data mappings, and ETL workflows to maintain clear communication and ensure knowledge transfer.
What should you have:
Bachelor’s degree in Information Technology, Computer Science or any technology stream
5+ years of working experience with enterprise data integration technologies – Informatica PowerCenter, Informatica Intelligent Data Management Cloud Services (CDI, CAI, Mass Ingest, Orchestration)
Integration experience utilizing REST and custom API integration
Experience with relational database technologies and cloud data stores from AWS, GCP & Azure
Experience utilizing the AWS Well-Architected Framework, deployment & integration, and data engineering
Preferred experience with CI/CD processes and related tools, including Terraform, GitHub Actions, Artifactory, etc.
Proven expertise in Python and shell scripting, with a strong focus on leveraging these languages for data integration and orchestration to optimize workflows and enhance data processing efficiency
Extensive experience in designing reusable integration patterns using cloud-native technologies
Extensive experience with process orchestration and scheduling integration jobs in AutoSys or Airflow
Experience in Agile development methodologies and release management techniques
Excellent analytical and problem-solving skills
Good understanding of data modeling and data architecture principles
Current Employees apply HERE
Current Contingent Workers apply HERE
Search Firm Representatives Please Read Carefully
Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs / resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.
Employee Status: Regular
Relocation:
VISA Sponsorship:
Travel Requirements:
Flexible Work Arrangements: Hybrid
Shift:
Valid Driving License:
Hazardous Material(s):
Required Skills: Business, Business Intelligence (BI), Database Administration, Data Engineering, Data Management, Data Modeling, Data Visualization, Design Applications, Information Management, Management Process, Social Collaboration, Software Development, Software Development Life Cycle (SDLC), System Designs
Preferred Skills:
Job Posting End Date: 07/31/2025
A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date.
Requisition ID: R353285
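The REST-based data ingestion this role describes almost always needs a retry-with-backoff wrapper around transient upstream failures. Below is a hedged sketch of that pattern: `flaky_fetch` is a stand-in for a real HTTP call (e.g. via urllib or an Informatica CAI process), and all names are illustrative.

```python
import time

def with_retries(fetch, attempts=3, backoff=0.01):
    """Call fetch(), retrying on ConnectionError with exponential backoff."""
    for i in range(attempts):
        try:
            return fetch()
        except ConnectionError:
            if i == attempts - 1:
                raise  # retries exhausted; surface the failure to the caller
            time.sleep(backoff * 2 ** i)  # 0.01s, 0.02s, ... between tries

calls = {"n": 0}

def flaky_fetch():
    # Fails twice, then succeeds -- simulates a transient upstream outage.
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient upstream failure")
    return {"status": "ok", "rows": 128}

result = with_retries(flaky_fetch)
print(result)  # {'status': 'ok', 'rows': 128}
```

Orchestrators like Airflow offer task-level retries too; a wrapper like this handles the finer-grained case where only one API call in a task should be retried, not the whole task.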

Posted 2 days ago


3.0 years

2 - 7 Lacs

Hyderābād

On-site


About Us:
Location - Hyderabad, India
Department - Product R&D
Level - Professional
Working Pattern - Work from office
Benefits - Benefits at Ideagen
DEI - DEI strategy
Salary - this will be discussed at the next stage of the process; if you do have any questions, please feel free to reach out!
We are seeking an experienced Data Engineer with strong problem-solving and analytical skills, high attention to detail, a passion for analytics, real-time data, and monitoring, and critical thinking and collaboration skills. The candidate should be a self-starter and a quick learner, ready to learn new technologies and tools that the job demands.
Responsibilities:
Building automated pipelines and solutions for data migration/data import or other operations requiring data ETL.
Performing analysis on core products to support migration planning and development.
Working closely with the Team Lead and collaborating with other stakeholders to gather requirements and build well-architected data solutions.
Producing supporting documentation, such as specifications, data models, and relationships between data, required for the effective development, usage and communication of the data operations solutions with different stakeholders.
Competencies, Characteristics and Traits:
Mandatory Skills - Minimum 3 years of experience with SnapLogic pipeline development, including a minimum of 2 years building ETL/ELT pipelines.
Experience working with databases in on-premises and/or cloud-based environments such as MSSQL, MySQL, PostgreSQL, Azure SQL, Aurora MySQL & PostgreSQL, AWS RDS, etc.
Experience working with API sources and destinations.
Skills and Experience:

Essential:

  • Strong experience working with databases in on-premises and/or cloud-based environments such as MSSQL, MySQL, PostgreSQL, AzureSQL, Aurora MySQL & PostgreSQL, AWS RDS, etc.
  • Strong knowledge of databases, data modeling and the data life cycle
  • Proficient in understanding data and writing complex SQL
  • Mandatory Skills - minimum 3 years of experience with SnapLogic pipeline development, including a minimum of 2 years building ETL/ELT pipelines
  • Experience working with REST APIs in data pipelines
  • Strong problem solving and high attention to detail
  • Passion for analytics, real-time data, and monitoring
  • Critical thinking, good communication and collaboration skills
  • Focus on high performance and quality delivery
  • Highly self-motivated and a continuous learner

Desirable:

  • Experience working with NoSQL databases like MongoDB
  • Experience with SnapLogic administration is preferable
  • Experience working with Microsoft Power Platform (Power Automate and Power Apps) or any similar automation/RPA tool
  • Experience with cloud data platforms like Snowflake, Databricks, AWS, Azure, etc.
  • Awareness of emerging ETL and cloud concepts such as Amazon AWS or Microsoft Azure
  • Experience working with scripting languages such as Python, R, JavaScript, etc.

About Ideagen

Ideagen is the invisible force behind many things we rely on every day - from keeping airplanes soaring in the sky, to ensuring the food on our tables is safe, to helping doctors and nurses care for the sick. So, when you think of Ideagen, think of it as the silent teammate that's always working behind the scenes to help the people who make our lives safer and better. Every day, millions of people are kept safe using Ideagen software. We have offices all over the world, including America, Australia, Malaysia and India, with people doing lots of different and exciting jobs.

What is next?

If your application meets the requirements for this role, our Talent Acquisition team will be in touch to guide you through the next steps.
To ensure a flexible and inclusive process, please let us know if you require any reasonable adjustments by contacting us at recruitment@ideagen.com. All matters will be treated with strict confidence. At Ideagen, we value the importance of work-life balance and welcome candidates seeking flexible or part-time working arrangements. If this is something you are interested in, please let us know during the application process. Enhance your career and make the world a safer place!
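Setting specific tooling like SnapLogic aside, the extract-transform-load flow these roles describe can be sketched in plain Python. This is a minimal illustration only, not any employer's actual pipeline: the records, field names, and the in-memory SQLite target are hypothetical stand-ins for a REST API source and a production database such as MSSQL or PostgreSQL.

```python
import sqlite3

# Hypothetical records standing in for a REST API response; a real pipeline
# (SnapLogic or hand-rolled) would fetch these over HTTP.
def extract():
    return [
        {"id": 1, "name": "alice", "signup": "2024-01-05"},
        {"id": 2, "name": "BOB", "signup": "2024-02-11"},
    ]

def transform(rows):
    # Normalise casing and keep only the fields the target table expects.
    return [(r["id"], r["name"].title(), r["signup"]) for r in rows]

def load(rows, conn):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, name TEXT, signup TEXT)"
    )
    # INSERT OR REPLACE keeps the load idempotent, so re-runs don't duplicate rows.
    conn.executemany("INSERT OR REPLACE INTO users VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")  # stand-in for a real target database
load(transform(extract()), conn)
print(conn.execute("SELECT name FROM users ORDER BY id").fetchall())
```

Keeping extract, transform, and load as separate functions is what lets each stage be tested and swapped independently, which is the property interviewers usually probe for.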

Posted 2 days ago

Apply

4.0 - 8.0 years

0 Lacs

Hyderābād

On-site


We are seeking a skilled Data Engineer with strong experience in Azure Data Services, Databricks, SQL, and PySpark to join our data engineering team. The ideal candidate will be responsible for building robust and scalable data pipelines and solutions to support advanced analytics and business intelligence initiatives.

Key Responsibilities:

  • Design and implement scalable and secure data pipelines using Azure Data Factory, Databricks, and Synapse Analytics.
  • Develop and maintain efficient ETL/ELT workflows into and within Databricks.
  • Write complex SQL queries for data extraction, transformation, and analysis.
  • Develop and optimize data transformation scripts using PySpark.
  • Ensure data quality, data governance, and performance optimization across all pipelines.
  • Collaborate with data architects, analysts, and business stakeholders to deliver reliable data solutions.
  • Perform data modelling and design for both structured and semi-structured data.
  • Monitor data pipelines and troubleshoot issues to ensure data integrity and timely delivery.
  • Contribute to best practices in cloud data architecture and engineering.

Required Skills:

  • 4-8 years of experience in data engineering or related fields.
  • Strong experience with Azure Data Services (ADF, Synapse, Databricks, Azure Storage).
  • Proficient with the Snowflake data warehouse - including data ingestion, Snowpipe, streams & tasks.
  • Advanced SQL skills, including performance tuning and complex query building.
  • Hands-on experience with PySpark for large-scale data processing and transformation.
  • Experience with ETL/ELT frameworks, orchestration, and scheduling.
  • Familiarity with data modelling concepts (dimensional/star schema).
  • Good understanding of data security, role-based access, and auditing in Snowflake and Azure.

Preferred/Good to Have:

  • Experience with CI/CD pipelines and DevOps for data workflows.
  • Exposure to Power BI or similar BI tools.
  • Familiarity with Git, Terraform, or infrastructure-as-code (IaC) in cloud environments.
  • Experience with Agile/Scrum methodologies.

Job Type: Full-time
Work Location: In person
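The dimensional/star-schema modelling these postings ask about can be made concrete with a toy example. The sketch below uses SQLite purely as a stand-in for a warehouse such as Snowflake or Synapse; the table and column names are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Toy star schema: one fact table joined to a single date dimension.
cur.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, month TEXT);
CREATE TABLE fact_sales (sale_id INTEGER PRIMARY KEY, date_key INTEGER, amount REAL);
INSERT INTO dim_date VALUES (20240101, '2024-01'), (20240201, '2024-02');
INSERT INTO fact_sales VALUES (1, 20240101, 100.0), (2, 20240101, 50.0), (3, 20240201, 75.0);
""")
# The typical warehouse query shape: aggregate facts by a dimension attribute.
rows = cur.execute("""
    SELECT d.month, SUM(f.amount)
    FROM fact_sales f JOIN dim_date d USING (date_key)
    GROUP BY d.month ORDER BY d.month
""").fetchall()
print(rows)  # [('2024-01', 150.0), ('2024-02', 75.0)]
```

The same join-and-aggregate pattern carries over directly to Spark SQL or Snowflake; only the connection setup and SQL dialect details change.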

Posted 2 days ago

Apply

Exploring ETL Jobs in India

The ETL (Extract, Transform, Load) job market in India is thriving with numerous opportunities for job seekers. ETL professionals play a crucial role in managing and analyzing data effectively for organizations across various industries. If you are considering a career in ETL, this article will provide you with valuable insights into the job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Chennai

These cities are known for their strong tech industries and often have a high demand for ETL professionals.

Average Salary Range

The average salary range for ETL professionals in India varies based on experience levels. Entry-level positions typically start at around ₹3-5 lakhs per annum, while experienced professionals can earn upwards of ₹10-15 lakhs per annum.

Career Path

In the ETL field, a typical career path may include roles such as:

  • Junior ETL Developer
  • ETL Developer
  • Senior ETL Developer
  • ETL Tech Lead
  • ETL Architect

As you gain experience and expertise, you can progress to higher-level roles within the ETL domain.

Related Skills

Alongside ETL, professionals in this field are often expected to have skills in:

  • SQL
  • Data Warehousing
  • Data Modeling
  • ETL Tools (e.g., Informatica, Talend)
  • Database Management Systems (e.g., Oracle, SQL Server)

Having a strong foundation in these related skills can enhance your capabilities as an ETL professional.

Interview Questions

Here are 25 interview questions that you may encounter in ETL job interviews:

  • What is ETL and why is it important? (basic)
  • Explain the difference between ETL and ELT processes. (medium)
  • How do you handle incremental loads in ETL processes? (medium)
  • What is a surrogate key in the context of ETL? (basic)
  • Can you explain the concept of data profiling in ETL? (medium)
  • How do you handle data quality issues in ETL processes? (medium)
  • What are some common ETL tools you have worked with? (basic)
  • Explain the difference between a full load and an incremental load. (basic)
  • How do you optimize ETL processes for performance? (medium)
  • Can you describe a challenging ETL project you worked on and how you overcame obstacles? (advanced)
  • What is the significance of data cleansing in ETL? (basic)
  • How do you ensure data security and compliance in ETL processes? (medium)
  • Have you worked with real-time data integration in ETL? If so, how did you approach it? (advanced)
  • What are the key components of an ETL architecture? (basic)
  • How do you handle data transformation requirements in ETL processes? (medium)
  • What are some best practices for ETL development? (medium)
  • Can you explain the concept of change data capture in ETL? (medium)
  • How do you troubleshoot ETL job failures? (medium)
  • What role does metadata play in ETL processes? (basic)
  • How do you handle complex transformations in ETL processes? (medium)
  • What is the importance of data lineage in ETL? (basic)
  • Have you worked with parallel processing in ETL? If so, explain your experience. (advanced)
  • How do you ensure data consistency across different ETL jobs? (medium)
  • Can you explain the concept of slowly changing dimensions in ETL? (medium)
  • How do you document ETL processes for knowledge sharing and future reference? (basic)
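Several of these questions (incremental loads, change data capture, full vs. incremental) revolve around the same pattern: extract only the rows changed since a stored high-watermark, then upsert them into the target. A minimal Python sketch of that pattern, with made-up records:

```python
# Incremental-load sketch: pull only rows newer than the stored high-watermark,
# then upsert them into the target keyed on the business id. All records here
# are invented for illustration.
source = [
    {"id": 1, "updated_at": "2024-03-01"},
    {"id": 2, "updated_at": "2024-03-05"},
    {"id": 3, "updated_at": "2024-03-09"},
]

def incremental_extract(rows, watermark):
    # ISO-8601 date strings compare correctly as plain strings.
    return [r for r in rows if r["updated_at"] > watermark]

target = {}
watermark = "2024-03-04"  # persisted from the previous run
for row in incremental_extract(source, watermark):
    target[row["id"]] = row                        # upsert: insert or overwrite
    watermark = max(watermark, row["updated_at"])  # advance the watermark

print(sorted(target), watermark)  # [2, 3] 2024-03-09
```

A full load would simply skip the watermark filter and rewrite the target; the trade-off interviewers look for is that incremental loads are cheaper but require reliable change tracking (timestamps, versions, or CDC) in the source.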

Closing Remarks

As you explore ETL jobs in India, remember to showcase your skills and expertise confidently during interviews. With the right preparation and a solid understanding of ETL concepts, you can embark on a rewarding career in this dynamic field. Good luck with your job search!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies