
1016 ETL Process Jobs - Page 10

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

8.0 - 13.0 years

8 - 13 Lacs

Telangana

Work from Office

Key Responsibilities:

Team Leadership: Lead and mentor a team of Azure Data Engineers, providing technical guidance and support. Foster a collaborative and innovative team environment. Conduct regular performance reviews and set development goals for team members. Organize training sessions to enhance team skills and technical capabilities.

Azure Data Platform: Design, implement, and optimize scalable data solutions using Azure data services such as Azure Databricks, Azure Data Factory, Azure SQL Database, and Azure Synapse Analytics. Ensure data engineering best practices and data governance are followed. Stay up to date with Azure data technologies and recommend improvements to enhance data processing capabilities.

Data Architecture: Collaborate with data architects to design efficient and scalable data architectures. Define data modeling standards and ensure data integrity, security, and governance compliance.

Project Management: Work with project managers to define project scope, goals, and deliverables. Develop project timelines, allocate resources, and track progress. Identify and mitigate risks to ensure successful project delivery.

Collaboration & Communication: Collaborate with cross-functional teams including data scientists, analysts, and business stakeholders to deliver data-driven solutions. Communicate effectively with stakeholders to understand requirements and provide updates.

Qualifications: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Proven experience as a team lead or manager in data engineering. Extensive experience with Azure data services and cloud technologies. Expertise in Azure Databricks, PySpark, and SQL. Strong understanding of data engineering best practices, data modeling, and ETL processes. Experience with agile development methodologies. Certifications in Azure data services preferred.

Preferred Skills: Experience with big data technologies and data warehousing solutions. Familiarity with industry standards and compliance requirements. Ability to lead and mentor a team.
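For illustration only (not part of the posting): a minimal PySpark sketch of the kind of Databricks ETL work this role covers. The paths, column names, and cleansing rules are hypothetical assumptions.

```python
# Minimal PySpark ETL sketch; paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sales-etl").getOrCreate()

# Ingest raw CSVs from the lake's landing zone.
raw = spark.read.option("header", True).csv("/mnt/raw/sales/*.csv")

# Basic cleansing: cast types, drop duplicates, filter null keys.
clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
       .dropDuplicates(["order_id"])
       .filter(F.col("order_id").isNotNull())
)

# Persist to the curated zone, partitioned for downstream SQL.
clean.write.mode("overwrite").partitionBy("order_date").parquet("/mnt/curated/sales")
```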

Posted 2 weeks ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Chennai

Hybrid

We are looking for a Senior Data Platform Engineer to lead the design, development, and optimization of our data platform infrastructure. In this role, you will drive scalability, reliability, and performance across our data systems, working closely with data engineers, analysts, and product teams to enable data-driven decision-making at scale.

Required Skills & Experience: Architect and implement scalable, secure, and high-performance data platforms on AWS using Databricks. Build and manage data pipelines and ETL processes using modern data engineering tools (AWS RDS, REST APIs, and S3-based ingestions). Monitor and maintain the production data pipelines and work on enhancements. Optimize data systems for performance, reliability, and cost efficiency. Implement data governance, quality, and observability best practices per Freshworks standards. Collaborate with cross-functional teams to support data needs.

Qualifications:
1. Bachelor's degree in Computer Science, Information Technology, or a related field.
2. Good exposure to data structures and algorithms.
3. Proven backend development experience using Scala, Spark, or Python.
4. Strong understanding of REST API development, web services, and microservices architecture.
5. Good to have: experience with Kubernetes and containerized deployment.
6. Proficient in working with relational databases like MySQL, PostgreSQL, or similar platforms.
7. Solid understanding and hands-on experience with AWS cloud services.
8. Strong knowledge of code versioning tools, such as Git and Jenkins.
9. Excellent problem-solving skills, critical thinking, and a keen attention to detail.
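As a rough illustration of the S3-based ingestion mentioned above (the bucket, key, and quality checks are assumptions, not details from the posting):

```python
# Sketch of an S3-based ingestion step with a fail-fast quality check.
import boto3
import pandas as pd

s3 = boto3.client("s3")

def ingest_csv(bucket: str, key: str) -> pd.DataFrame:
    """Pull a CSV object from S3 and apply basic quality checks."""
    obj = s3.get_object(Bucket=bucket, Key=key)
    df = pd.read_csv(obj["Body"])
    if df.empty:
        # Observability hook: surface empty extracts immediately.
        raise ValueError(f"s3://{bucket}/{key} produced an empty frame")
    return df.drop_duplicates()

# Hypothetical usage:
events = ingest_csv("example-data-platform", "landing/events/2024-01-01.csv")
```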

Posted 2 weeks ago

Apply

4.0 - 9.0 years

4 - 8 Lacs

Pune, Bengaluru

Work from Office

Your Role: 4+ years' experience in development of reports in AxiomSL Controller View 10. Good understanding of Data Sources, Data Models, Shorthands, Portfolios, Aggregations, Free Form and Tabular Reports, and Workflows using the Axiom Controller View tool, based on regulatory reporting requirements. Solid understanding of SQL and ETL technologies. 3+ years of experience in a financial institution, ideally within Regulatory Reporting. Excellent verbal/written communication and collaborative skills required.

Your Profile: Experience in implementing Axiom solutions (US/UK regulatory framework). Experience in Python and shell scripting. Database programming experience, ideally Sybase. Axiom's ASL language. Understanding of the regulatory landscape across various jurisdictions, e.g. BOE/PRA/CCAR/Prime reports. Knowledge of software development best practices, including coding standards, code reviews, source control management, build process, continuous integration, and continuous delivery. Experience with Agile methodologies and development tools like Jira, Git, Jenkins, etc.

What You'll Love About Working Here: We recognize the significance of flexible work arrangements, be it remote work or flexible work hours, and provide an environment that supports a healthy work-life balance. At the heart of our mission is your career growth: our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Unix and SQL.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Gurugram, Bengaluru

Work from Office

About the Role: Grade Level (for internal use): 10. S&P Global - Mobility. The Role: Senior Business Analyst - Data Engineering.

The Team: We are seeking a Senior Business Analyst for the Data Engineering team. You will be responsible for bridging the gap between business needs and technical solutions, collaborating with stakeholders to gather requirements, analyze data workflows, and ensure the successful delivery of data-driven projects.

The Impact: In this role, you will work in an Agile team, ensuring we meet our customer requirements and deliver impactful, quality data. Using your technical skills, you will contribute to data analysis, design and implement complex solutions, and support the business strategy.

Responsibilities: Collaborate with business stakeholders to identify and document requirements for data engineering projects. Analyze existing data processes and workflows to identify opportunities for improvement and optimization. Work closely with data engineers and data scientists to translate business requirements into technical specifications. Conduct data analysis and data validation to ensure accuracy and consistency of data outputs. Develop and maintain documentation related to data processes, requirements, and project deliverables. Facilitate communication between technical teams and business stakeholders to ensure alignment on project goals and timelines. Participate in project planning and prioritization discussions, providing insights based on business needs. Support user acceptance testing (UAT) and ensure that solutions meet business requirements before deployment. Utilize Jira for project tracking, issue management, and Agile project management practices. Stay updated on industry trends and best practices in data engineering and analytics.

What We're Looking For: Minimum of 6 years of experience as a Business Analyst in a data engineering environment. Strong understanding of data engineering concepts, data modeling, and ETL processes. Proficiency in data visualization tools (e.g., Tableau, Power BI) and SQL for data analysis. Experience with Jira for project management and tracking. Excellent analytical and problem-solving skills, with a keen attention to detail. Strong communication and interpersonal skills, with the ability to work collaboratively in a team environment. Experience with Agile methodologies and project management tools is a must.

Return to Work: Have you taken time out for caring responsibilities and are now looking to return to work? As part of our Return-to-Work initiative, we encourage enthusiastic and talented returners to apply, and will actively support your return to the workplace.

Statement: S&P Global delivers essential intelligence that powers decision making. We provide the world's leading organizations with the right data, connected technologies, and expertise they need to move ahead. As part of our team, you'll help solve complex challenges that equip businesses, governments, and individuals with the knowledge to adapt to a changing economic landscape. S&P Global Mobility uses invaluable insights captured from automotive data to help our clients understand today's market, reach more customers, and shape the future of automotive mobility.

About S&P Global Mobility: At S&P Global Mobility, we provide invaluable insights derived from unmatched automotive data, enabling our customers to anticipate change and make decisions with conviction. Our expertise helps them to optimize their businesses, reach the right consumers, and shape the future of mobility. We open the door to automotive innovation, revealing the buying patterns of today and helping customers plan for the emerging technologies of tomorrow. For more information, visit www.spglobal.com/mobility.

What's In It For You:

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People, Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people; that's why we provide everything you and your career need to thrive at S&P Global. Health & Wellness: health care coverage designed for the mind and body. Continuous Learning: access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: it's not just about you; S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families. Beyond the Basics: from retail discounts to referral incentive awards, small perks can make a big difference. For more information on benefits by country, visit https://spgbenefits.com/benefit-summaries.

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domain, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, pre-employment training, or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity.

Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

Posted 2 weeks ago

Apply

7.0 - 8.0 years

17 - 20 Lacs

Hyderabad

Work from Office

Overview: PepsiCo is embarking on a significant initiative of digitalization and standardization of the FP&A solution across all its divisions to make the finance organization more capable, more agile, and more efficient. The MOSAIC program is a key enabler of that vision. It is the FP&A solution of the PepsiCo Global Template (PGT) that, for the first time, aims to integrate vertical planning for Operating Units (OUs) or markets, and horizontal planning for functions (e.g., Global Procurement, Compensation and Benefits) that have accountability across markets. The program aims to harmonize data, planning processes, and ways of working across PepsiCo markets.

The Finance Application Developer / Architect (TM1) is a key contributor in designing, developing, and maintaining financial planning and analytics solutions using IBM Planning Analytics (TM1). This role combines technical expertise with a deep understanding of finance processes to create robust, scalable, and efficient systems that enable data-driven decision-making. The ideal candidate will excel in solution design, stakeholder collaboration, and aligning technical implementations with strategic business goals.

Responsibilities:
Design, Enhance, and Maintain the Mosaic Solution: Develop, troubleshoot, and maintain robust TM1/Planning Analytics applications, including cubes, rules, and TurboIntegrator (TI) processes, to support financial planning, forecasting, and reporting. Collaborate with stakeholders to design and implement scalable, future-proof solutions that meet business requirements.
Business Incident Triage: Engage with finance and business teams to understand objectives, gather requirements, and translate them into effective technical designs. Provide advisory support to optimize financial processes and restore the solution.
Optimize System Performance: Ensure the stability and performance of TM1 models, performing optimization and tuning to handle growing data and user demands efficiently.
Data Integration and Automation: Manage data flows between TM1 and other systems, automating processes for data loading, transformation, and reconciliation.
Governance and Standards: Implement best practices for data governance, model development, documentation, and version control to maintain system reliability and accuracy.
Training and Support: Deliver training and support to finance teams, empowering them to leverage TM1 solutions effectively for business insights.

Qualifications:
Preferred Qualifications: Bachelor's degree required; Master's degree preferred. 7-8+ years of experience configuring, deploying, and managing TM1 (preferred) or SAP-based financial planning & analysis solutions, with a focus on topline planning.
Technical Expertise: Advanced proficiency in TM1/IBM Planning Analytics, including TurboIntegrator, rules, feeders, and cube development. Strong knowledge of data modelling, relational databases, and ETL processes. Familiarity with TM1 add-ins (e.g., Planning Analytics for Excel) and integration with other tools. Experience integrating TM1 with ERP systems (e.g., SAP, Oracle) and visualization tools (e.g., Power BI, Tableau).
Financial Process Knowledge: Understanding of budgeting, forecasting, consolidation, and financial reporting processes. Experience designing solutions aligned with finance-driven priorities and goals.
Solution Design and Advisory Skills: Expertise in analyzing business requirements and providing innovative, strategic solutions. Ability to design architecture for scalability, reliability, and future growth.
Communication and Collaboration: Strong interpersonal skills to work with cross-functional teams and present complex technical ideas clearly to non-technical audiences. Proactive engagement with stakeholders to align on priorities and outcomes. Familiarity with Agile methodologies. Understanding of project test phases and testing automation tools. A consulting background is a plus.

Posted 2 weeks ago

Apply

0.0 years

25 - 30 Lacs

Hyderabad

Work from Office

Overview: The MDG Master Data Harmonization Senior Manager is a key contributor in sustaining, developing, and maintaining the PGT SAP master data solution. This role combines technical expertise with a deep understanding of master data processes to create robust, scalable, and efficient systems that enable data-driven decision-making. The ideal candidate will excel in master data harmonization, stakeholder collaboration, and aligning technical implementations with strategic business goals.

Responsibilities:
Sustain, Design, and Maintain Harmonized SAP PGT Master Data: Develop, troubleshoot, and maintain robust SAP master data, including business partner (customer and vendor), material, and finance master data. Collaborate with stakeholders to design and implement scalable, future-proof solutions that meet business requirements.
Support Master Data Harmonization Reports: Engage with business teams to highlight data differences across the landscapes, create a synchronization plan for master data, gather requirements, and translate them into effective technical designs. Provide advisory support to harmonize master data processes.
Ensure Master Data Is Optimized for Better System Performance: Ensure the stability and performance of SAP master data, performing optimization and tuning to handle growing data and user demands efficiently.
Data Integration and Automation: Manage data flows between PGT SAP and other systems, automating processes for data loading, transformation, and reconciliation.
Governance and Standards: Implement best practices for data governance, model development, documentation, and version control to maintain system reliability and accuracy.
Stakeholder Collaboration and Communication: Act as a liaison between technical teams and business stakeholders, translating complex technical solutions into clear, actionable outcomes for non-technical users.
Training and Support: Deliver training and support to business teams, empowering them to leverage the master data solution effectively for business insights.

Qualifications:
Technical Expertise: Advanced proficiency in SAP master data processes, MDG, and supporting tools. Strong knowledge of data modeling, relational databases, and ETL processes. Familiarity with data harmonization add-ins (e.g., GDQ sustain reports) and integration with other tools (e.g., ERP systems, visualization tools).
SAP Process Knowledge: Solid understanding of OTC, MTD, and R2R processes and the dependencies of master data. Experience designing solutions aligned with SAP MDG-driven priorities and goals.
Solution Design and Advisory Skills: Expertise in analyzing business requirements and providing innovative, strategic solutions. Ability to design architecture for scalability, reliability, and future growth.

Posted 2 weeks ago

Apply

9.0 - 14.0 years

20 - 25 Lacs

Hyderabad

Work from Office

Overview: The primary focus is to lead development work within the Azure Data Lake environment and other related ETL technologies, with responsibility for on-time and on-budget delivery, satisfying project requirements while adhering to enterprise architecture standards. The role will lead key data lake projects and resources, including innovation-related initiatives (e.g. adoption of technologies like Databricks, Presto, Denodo, Python, Azure Data Factory; database encryption; enabling rapid experimentation). This role will also have L3 and release management responsibilities for ETL processes.

Responsibilities: Lead delivery of key Enterprise Data Warehouse and Azure Data Lake projects within time and budget. Drive solution design and build to ensure scalability, performance, and reuse of data and other components. Ensure on-time and on-budget delivery that satisfies project requirements while adhering to enterprise architecture standards. Manage work intake, prioritization, and release timing, balancing demand and available resources. Ensure tactical initiatives are aligned with the strategic vision and business needs. Oversee coordination and partnerships with Business Relationship Managers, Architecture, and IT services teams to develop and maintain EDW and data lake best practices and standards, along with appropriate quality assurance policies and procedures. May lead a team of employee and contract resources to meet build requirements: set priorities for the team to ensure task completion, coordinate work activities with other IT services and business teams, and hold the team accountable for milestone deliverables. Provide L3 support for existing applications. Release management.

Qualifications:
Experience: Bachelor's degree in Computer Science, MIS, Business Management, or a related field. 9+ years of experience in Information Technology or Business Relationship Management. 5+ years of experience in Data Warehouse/Azure Data Lake. 3 years of experience in Azure Data Lake. 2 years of experience in project management.
Technical Skills: Thorough knowledge of data warehousing / data lake concepts. Hands-on experience with tools like Azure Data Factory, Databricks, PySpark, and other data management tools on Azure. Proven experience in managing Data, BI, or Analytics projects. Solutions delivery experience: expertise in system development lifecycle, integration, and sustainability. Experience in data modeling or database work.
Non-Technical Skills: Excellent remote collaboration skills. Experience working in a matrix organization with diverse priorities. Experience dealing with and managing multiple vendors. Exceptional written and verbal communication skills, along with collaboration and listening skills. Ability to work with agile delivery methodologies. Ability to iterate on requirements and design with business partners without formal requirements documentation. Ability to budget resources and funding to meet project deliverables.

Posted 2 weeks ago

Apply

2.0 - 4.0 years

8 - 12 Lacs

Hyderabad

Work from Office

Overview: We are seeking a skilled and proactive business analyst with expertise in Azure Data Engineering to join our dynamic team. In this role, you will bridge the gap between business needs and technical solutions, leveraging your analytical skills and Azure platform knowledge to design and implement robust data solutions. You will collaborate closely with stakeholders to gather and translate requirements, develop data pipelines, and ensure data quality and governance. This position requires a strong understanding of Azure services, data modeling, and ETL processes, along with the ability to thrive in a fast-paced, evolving environment.

Responsibilities: Collaborate with stakeholders to understand business needs and translate them into technical requirements. Design, develop, and implement data solutions using Azure Data Engineering technologies. Analyze complex data sets to identify trends, patterns, and insights that drive business decisions. Create and maintain detailed documentation of business requirements, data models, and data flows. Work in an environment where requirements are not always clearly defined, demonstrating flexibility and adaptability. Conduct data quality assessments and implement data governance practices. Provide training and support to end-users on data tools and solutions. Continuously monitor and optimize data processes for efficiency and performance.

Qualifications: Minimum of 2-4 years of experience as a data analyst with hands-on experience in Azure Data Engineering. Proficiency in Azure Data Factory, Azure Databricks, Azure SQL Database, and other Azure data services. Strong analytical and problem-solving skills with the ability to work in a fast-paced, ambiguous environment. Excellent communication and interpersonal skills to effectively collaborate with cross-functional teams. Experience with data modeling, ETL processes, and data warehousing. Knowledge of data governance and data quality best practices. Ability to manage multiple projects and priorities simultaneously.

Preferred Skills: Experience with other cloud platforms and data engineering tools. Certification in Azure Data Engineering or related fields.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

3 - 6 Lacs

Bengaluru

Work from Office

Job Title: ETL Developer - SnapLogic. Experience: 6-12 Years. Location: Bangalore.

We are seeking a highly skilled and experienced SnapLogic contractor to join our team. The ideal candidate will have deep expertise in SnapLogic, including API development and use of the platform's agent functionality. This role is instrumental in driving seamless integrations and delivering robust data solutions for our organization.

Experience and Education Required: 6+ years of relevant experience in ETL development with SnapLogic.

Technical Skills: Design, develop, and maintain SnapLogic pipelines to support integration projects. Build and manage APIs using SnapLogic to connect various data sources and systems. Leverage SnapLogic agent functionality to enable secure and efficient data integration. Collaborate with cross-functional teams to gather requirements and ensure solutions meet business needs. Troubleshoot and optimize existing SnapLogic integrations to improve performance and reliability. Document integration processes and provide guidance to team members on best practices. Proven experience with SnapLogic, including API builds and agent functionality. Strong understanding of integration patterns and best practices. Proficiency in data integration and ETL processes. Expertise in relational databases (Oracle, SSMS) and familiarity with NoSQL databases (MongoDB). Knowledge of data warehousing concepts and data modelling. Experience performing validations on large-scale data. Strong REST API, JSON, and data transformation experience. Experience with unit testing and integration testing. Familiarity with large language models (LLMs) and their integration with data pipelines. Experience in database architecture and optimization. Knowledge of U.S. healthcare systems, data standards (e.g., HL7, FHIR), and compliance requirements (e.g., HIPAA).

Behavioral Skills: Excellent documentation and presentation skills, analytical and critical thinking skills, and the ability to identify needs and take initiative. Follow engineering best practices and principles within your organisation. Work closely with a Lead Software Engineer. Be an active member of the MMC Technology community: contribute, collaborate, and learn. Build strong relationships with members of your engineering squad.
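SnapLogic pipelines themselves are built in the platform's designer, but the REST/JSON transformation skills the posting asks for look roughly like this outside the tool; the endpoint and field names are hypothetical:

```python
# Fetch records from a REST API and flatten nested JSON for an ETL target.
import requests

def fetch_and_flatten(url: str) -> list[dict]:
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    payload = resp.json()
    # Project nested attributes into a flat, tabular shape.
    return [
        {
            "id": rec.get("id"),
            "name": rec.get("attributes", {}).get("name"),
            "updated": rec.get("attributes", {}).get("updated_at"),
        }
        for rec in payload.get("data", [])
    ]

rows = fetch_and_flatten("https://api.example.com/v1/records")
```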

Posted 2 weeks ago

Apply

10.0 - 15.0 years

4 - 8 Lacs

Noida

Work from Office

A highly skilled and experienced Data Modeler is needed to join our Enterprise Data Modelling team. The candidate will be responsible for creating and maintaining conceptual, logical, and physical data models, ensuring alignment with industry best practices and standards. Working closely with business and functional teams, the Data Modeler will play a pivotal role in standardizing data models at portfolio and domain levels, driving efficiencies and maximizing the value of clients' data assets. Preference will be given to candidates with prior experience within an Enterprise Data Modeling team. The ideal domain experience would be Insurance or Investment Banking.

Roles and Responsibilities: Develop comprehensive conceptual, logical, and physical data models for multiple domains within the organization, leveraging industry best practices and standards. Collaborate with business and functional teams to understand their data requirements and translate them into effective data models that support their strategic objectives. Serve as a subject matter expert in data modeling tools such as ERwin Data Modeler, providing guidance and support to other team members and stakeholders. Establish and maintain standardized data models across portfolios and domains, ensuring consistency, governance, and alignment with organizational objectives. Identify opportunities to optimize existing data models and enhance data utilization, particularly in critical areas such as fraud banking and AML. Provide consulting services to internal groups on data modeling tool usage, administration, and issue resolution, promoting seamless data flow and application connections. Develop and deliver training content and support materials for data models, ensuring that stakeholders have the necessary resources to understand and utilize them effectively. Collaborate with the enterprise data modeling group to develop and implement a robust governance framework and metrics for model standardization, with a focus on long-term automated monitoring solutions.

Qualifications: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. 10 years of experience working as a Data Modeler or in a similar role, preferably within a large enterprise environment. Expertise in data modeling concepts and methodologies, with demonstrated proficiency in creating conceptual, logical, and physical data models. Hands-on experience with data modeling tools such as ERwin Data Modeler, as well as proficiency in database environments such as Snowflake and Netezza. Strong analytical and problem-solving skills, with the ability to understand complex data requirements and translate them into effective data models. Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams and stakeholders.

Posted 2 weeks ago

Apply

6.0 - 8.0 years

3 - 6 Lacs

Bengaluru

Work from Office

Job Title: ETL Developer - SnapLogic. Experience: 6-8 Years. Location: Bangalore.

Technical Skills: Design, develop, and maintain SnapLogic pipelines to support integration projects. Build and manage APIs using SnapLogic to connect various data sources and systems. Leverage SnapLogic agent functionality to enable secure and efficient data integration. Collaborate with cross-functional teams to gather requirements and ensure solutions meet business needs. Troubleshoot and optimize existing SnapLogic integrations to improve performance and reliability. Document integration processes and provide guidance to team members on best practices. Proven experience with SnapLogic, including API builds and agent functionality. Strong understanding of integration patterns and best practices. Proficiency in data integration and ETL processes. Expertise in relational databases (Oracle, SSMS) and familiarity with NoSQL databases (MongoDB). Knowledge of data warehousing concepts and data modelling. Experience performing validations on large-scale data. Strong REST API, JSON, and data transformation experience. Experience with unit testing and integration testing. Familiarity with large language models (LLMs) and their integration with data pipelines. Experience in database architecture and optimization. Knowledge of U.S. healthcare systems, data standards (e.g., HL7, FHIR), and compliance requirements (e.g., HIPAA).

Behavioral Skills: Excellent documentation and presentation skills, analytical and critical thinking skills, and the ability to identify needs and take initiative. Follow engineering best practices and principles within your organisation. Work closely with a Lead Software Engineer. Be an active member of the MMC Technology community: contribute, collaborate, and learn. Build strong relationships with members of your engineering squad.

Posted 2 weeks ago

Apply

4.0 - 7.0 years

8 - 12 Lacs

Noida

Work from Office

Strong knowledge of data warehouse and ETL concepts, Unix commands, and shell/bash scripting. Strong knowledge of Oracle database and SQL queries. Good understanding of processing data using XML and JSON. Good communication skills. Additional skills in GitHub, Jenkins, and shell scripting would be an added advantage.

Mandatory Competencies: DevOps/Configuration Mgmt - Jenkins; Behavioral - Communication and collaboration; ETL - Ab Initio; Database - Database Programming - SQL; DevOps/Configuration Mgmt - GitLab/GitHub/Bitbucket; DevOps/Configuration Mgmt - Basic Bash/Shell script writing.

Posted 2 weeks ago

Apply

3.0 - 6.0 years

7 - 17 Lacs

Noida, Greater Noida

Work from Office

About CloudKeeper: CloudKeeper is a cloud cost optimization partner that combines the power of group buying & commitments management, expert cloud consulting & support, and an enhanced visibility & analytics platform to reduce cloud cost & help businesses maximize the value from AWS, Microsoft Azure, & Google Cloud. A certified AWS Premier Partner, Azure Technology Consulting Partner, Google Cloud Partner, and FinOps Foundation Premier Member, CloudKeeper has helped 400+ global companies save an average of 20% on their cloud bills, modernize their cloud set-up, and maximize value, all while maintaining flexibility and avoiding any long-term commitments or cost. CloudKeeper hived off from TO THE NEW, a digital technology services company with 2500+ employees and an 8-time GPTW winner.

Position Overview: We are looking for an experienced and driven Data Engineer to join our team. The ideal candidate will have a strong foundation in big data technologies, particularly Spark, and a basic understanding of Scala to design and implement efficient data pipelines. As a Data Engineer at CloudKeeper, you will be responsible for building and maintaining robust data infrastructure, integrating large datasets, and ensuring seamless data flow for analytical and operational purposes.

Key Responsibilities: Design, develop, and maintain scalable data pipelines and ETL processes to collect, process, and store data from various sources. Work with Apache Spark to process large datasets in a distributed environment, ensuring optimal performance and scalability. Develop and optimize Spark jobs and data transformations using Scala for large-scale data processing. Collaborate with data analysts and other stakeholders to ensure data pipelines meet business and technical requirements. Integrate data from different sources (databases, APIs, cloud storage, etc.) into a unified data platform. Ensure data quality, consistency, and accuracy by building robust data validation and cleansing mechanisms (a sketch follows after the qualifications below). Use cloud platforms (AWS, Azure, or GCP) to deploy and manage data processing and storage solutions. Automate data workflows and tasks using appropriate tools and frameworks. Monitor and troubleshoot data pipeline performance, optimizing for efficiency and cost-effectiveness. Implement data security best practices, ensuring data privacy and compliance with industry standards. Stay updated with new data engineering tools and technologies to continuously improve the data infrastructure.

Required Qualifications: 4-6 years of experience as a Data Engineer or in an equivalent role. Strong experience working with Apache Spark and Scala for distributed data processing and big data handling. Basic knowledge of Python and its application in Spark for writing efficient data transformations and processing jobs. Proficiency in SQL for querying and manipulating large datasets. Experience with cloud data platforms, preferably AWS (e.g., S3, EC2, EMR, Redshift) or other cloud-based solutions. Strong knowledge of data modeling, ETL processes, and data pipeline orchestration. Familiarity with containerization (Docker) and cloud-native tools for deploying data solutions. Knowledge of data warehousing concepts and experience with tools like AWS Redshift, Google BigQuery, or Snowflake is a plus. Experience with version control systems such as Git. Strong problem-solving abilities and a proactive approach to resolving technical challenges. Excellent communication skills and the ability to work collaboratively within cross-functional teams.

Preferred Qualifications: Experience with additional programming languages like Python, Java, or Scala for data engineering tasks. Familiarity with orchestration tools like Apache Airflow, Luigi, or similar frameworks. Basic understanding of data governance, security practices, and compliance regulations.
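A minimal sketch of the validation-and-cleansing mechanism referenced above, written in PySpark (the posting emphasizes Scala, but the pattern is the same; the rules and rejection threshold are assumptions):

```python
# Quality gate: reject a batch when too many rows violate null-checks.
from functools import reduce
from pyspark.sql import DataFrame, functions as F

def validate(df: DataFrame, not_null: list[str], max_bad_ratio: float = 0.01) -> DataFrame:
    """Return the clean rows; fail the pipeline if the bad ratio exceeds the threshold."""
    bad_cond = reduce(lambda acc, c: acc | F.col(c).isNull(), not_null, F.lit(False))
    total = df.count()
    bad = df.filter(bad_cond).count()
    if total and bad / total > max_bad_ratio:
        raise ValueError(f"{bad} of {total} rows failed validation")
    # Cleanse: keep only rows that pass every rule.
    return df.filter(~bad_cond)
```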

Posted 2 weeks ago

Apply

5.0 - 10.0 years

16 - 20 Lacs

Navi Mumbai

Work from Office

Analyze existing reports and dashboards built in IBM Cognos and design equivalent solutions in Qlik Sense. Work closely with business stakeholders to understand reporting requirements and ensure a seamless transition. Optimize Qlik Sense reports for performance, scalability, and user experience. Provide technical guidance to junior developers and support them on best practices in performance tuning and visual design. Ensure data accuracy, consistency, and quality across migrated reports. Collaborate with data engineering teams to ensure proper data sourcing and modeling. Leverage knowledge of Power BI and Tableau to support cross-platform initiatives and provide comparative insights when necessary. Follow industry standards and guidelines in BI dashboard design, data storytelling, and data visualization principles.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: 5+ years of experience in Business Intelligence / Data Visualization roles. Experience working with IBM Cognos and migrating reports to modern BI platforms. Good understanding of data modeling, ETL processes, and SQL. Strong knowledge of Power BI and Tableau is a must. Experience in report optimization techniques (performance tuning, load time improvement).

Preferred technical and professional experience: Experience in enterprise-level BI deployments and dashboard governance. Certification in Qlik Sense / Power BI / Tableau is a plus. Background in financial services, healthcare, or other data-intensive industries is an advantage.

Posted 2 weeks ago

Apply

8.0 - 10.0 years

11 - 15 Lacs

Noida

Work from Office

8 to 10 years of relevant experience. Strong understanding of ETL concepts and database principles. Strong knowledge of the Azure cloud platform. Strong knowledge of Unix. Proficiency in SQL for data querying and validation. Experience with ETL tools. Excellent analytical and problem-solving skills. Strong communication and collaboration skills. Experience with data validation and data quality checks. Design, develop, and execute detailed test cases, test plans, and test scripts based on requirements and specifications. Agile methodology along with Jira experience. Very good communication and interaction skills. Strong collaboration skills and the ability to work as a team player. Experience working with onshore, multi-geography teams.

Must have: Understanding of ETL processes/concepts. Good knowledge of common ETL tools (e.g. Informatica, Azure Data Factory). Database testing with SQL/Oracle queries. Azure basics. Agile. Good communication.

Mandatory Competencies: ETL - Informatica PowerCenter; ETL - Tester; ETL - Azure Data Factory; Database - SQL; Database - Oracle; Agile - Agile QA - Agile Methodology; QA - Testing Process; QA Automation; QA - Test Tool and Automation; Cloud - Azure; Unix; Behavioral - Communication and collaboration; Database - Database Programming - SQL; ETL - Ab Initio; Agile - SCRUM; QA Automation - Selenium; Cloud - Azure - Azure Data Factory (ADF), Azure Databricks, Azure Data Lake Storage, Event Hubs, HDInsight; Operating System - Unix.
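One concrete example of the ETL validation described above: a source-vs-target row-count reconciliation that works over any Python DB-API connection (the table names are hypothetical):

```python
# Reconcile row counts between a source system and the ETL target.
def assert_row_counts_match(src_conn, tgt_conn, src_table: str, tgt_table: str) -> None:
    src_cur = src_conn.cursor()
    tgt_cur = tgt_conn.cursor()
    src_cur.execute(f"SELECT COUNT(*) FROM {src_table}")
    tgt_cur.execute(f"SELECT COUNT(*) FROM {tgt_table}")
    src_count, tgt_count = src_cur.fetchone()[0], tgt_cur.fetchone()[0]
    assert src_count == tgt_count, f"count mismatch: {src_count} vs {tgt_count}"
```

In practice a tester would run checks like this from a test framework (e.g. pytest) against the staging and target databases, alongside column-level data quality checks.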

Posted 2 weeks ago

Apply

8.0 - 13.0 years

5 - 9 Lacs

Mumbai

Work from Office

Project Role: Application Developer. Project Role Description: Design

Posted 2 weeks ago

Apply

5.0 - 10.0 years

2 - 2 Lacs

Bengaluru

Remote

Job Title: Cloud Data Architect. Duration: 12 months.

Must Have: Seeking candidates with at least 5-7 years' experience. A strong data architect who has previously done both data engineering and data architecture. Must be able to review already-developed ETL processes and work to lead integrations between multiple systems. Strong SAP experience required.

Job Description: Design and implement end-to-end SAP ECC, BW, and HANA data architectures, ensuring scalable and robust solutions. Develop and optimize data models, ETL processes, and reporting frameworks across SAP landscapes. Lead integration efforts, defining and applying best practices for connecting SAP systems with external platforms and cloud services. Collaborate with business stakeholders to translate requirements into technical solutions, focusing on data quality and governance. Provide technical leadership and mentorship to project teams, ensuring alignment with enterprise integration patterns and standards.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Bengaluru

Work from Office

The purpose of this role is to design, test, and maintain software programs for operating systems or applications which need to be deployed at a client end, and to ensure they meet 100% quality assurance parameters.

Position Overview: Seeking a skilled Python and Airflow developer with a strong data engineering background.

Key Responsibilities: Develop and maintain data pipelines using Apache Airflow and Python (a minimal DAG sketch follows below). Collaborate with business managers to understand data requirements. Optimize existing data workflows for performance and reliability. Manage SQL and NoSQL databases for business intelligence. Ensure data quality and security. Troubleshoot data pipeline issues.

Required Qualifications: Proven experience in Python programming and data engineering. Hands-on experience with Apache Airflow. Strong understanding of SQL and experience with SQL databases. Familiarity with NoSQL databases. Experience in data modeling and ETL processes. Must have: Python and Apache Airflow, preferably 5+ years of experience.
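A minimal Airflow DAG sketch of the pipeline work described above; the task bodies, DAG id, and schedule are placeholders, not project specifics:

```python
# Three-step ETL DAG: extract -> transform -> load.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    ...  # pull from source systems

def transform():
    ...  # clean and reshape

def load():
    ...  # write to the warehouse

with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_t = PythonOperator(task_id="extract", python_callable=extract)
    transform_t = PythonOperator(task_id="transform", python_callable=transform)
    load_t = PythonOperator(task_id="load", python_callable=load)
    extract_t >> transform_t >> load_t
```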

Posted 2 weeks ago

Apply

3.0 - 7.0 years

6 - 10 Lacs

Hyderabad, Pune

Work from Office

Senior Data Engineer 2

We are looking for a Senior Data Platform Engineer to lead the design, development, and optimization of our data platform infrastructure. In this role, you will drive scalability, reliability, and performance across our data systems, working closely with data engineers, analysts, and product teams to enable data-driven decision-making at scale.

Required Skills & Experience: Architect and implement scalable, secure, and high-performance data platforms on AWS using Databricks. Build and manage data pipelines and ETL processes using modern data engineering tools (AWS RDS, REST APIs, and S3-based ingestions). Monitor and maintain the production data pipelines and work on enhancements. Optimize data systems for performance, reliability, and cost efficiency. Implement data governance, quality, and observability best practices per Freshworks standards. Collaborate with cross-functional teams to support data needs.

Qualifications:
1. Bachelor's/Master's degree in Computer Science, Information Technology, or a related field.
2. Good exposure to data structures and algorithms.
3. Proven backend development experience using Scala, Spark, or Python.
4. Strong understanding of REST API development, web services, and microservices architecture.
5. Good to have: experience with Kubernetes and containerized deployment.
6. Proficient in working with relational databases like MySQL, PostgreSQL, or similar platforms.
7. Solid understanding and hands-on experience with AWS cloud services.
8. Strong knowledge of code versioning tools, such as Git and Jenkins.
9. Excellent problem-solving skills, critical thinking, and a keen attention to detail.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

10 - 18 Lacs

Chennai

Remote

We are looking for ETL Developers for our US-based insurance customer project.

Required Skills: A minimum of 3-8 years of experience in SSIS. Insurance domain experience is a plus.
Secondary Skills: Oracle Database experience.
Optional: Attitude to take on challenging tasks and risk mitigation planning. Good logical and analytical skills, ability to think out of the box, strong technical skills. Good communication skills.
Other Tech Stacks: Informatica Data Integration, ETL process, workflow automation, graphical tools knowledge, Microsoft SQL Server, Oracle Database 12c/18c, framework designing.

Interested applicants, please respond with your resume to mohanaprasath.ponnusamy@kumaran.com

Posted 3 weeks ago

Apply

8.0 - 12.0 years

18 - 25 Lacs

Chennai

Remote

Required Skills: A minimum of 9+ years of experience in SSIS, with architecting and framework design experience.
Secondary Skills: Oracle Database experience.
Optional: Attitude to take on challenging tasks and risk mitigation planning. Good logical and analytical skills, ability to think out of the box, strong technical skills. Good communication skills.
Other Tech Stacks: Informatica Data Integration, ETL process, workflow automation, graphical tools knowledge, Microsoft SQL Server, Oracle Database 12c/18c, framework designing.

Posted 3 weeks ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

We're looking for a Senior Data Analyst to join our data-driven team at an ad-tech company that thrives on turning complexity into clarity. Our analysts play a critical role in transforming raw, noisy data into accurate, actionable signals that drive real-time decision-making and long-term strategy. You'll work closely with product, engineering, and business teams to uncover insights, shape KPIs, and guide performance optimization.

Responsibilities: Analyze large-scale datasets from multiple sources to uncover actionable insights and drive business impact. Design, monitor, and maintain key performance indicators (KPIs) across ad delivery, bidding, and monetization systems. Partner with product, engineering, and operations teams to define metrics, run deep-dive analyses, and influence strategic decisions. Develop and maintain dashboards, automated reports, and data pipelines to ensure data accessibility and accuracy. Lead investigative analysis of anomalies or unexpected trends in campaign performance, traffic quality, or platform behavior.

Requirements: BA/BSc in Industrial Engineering and Management, Information Systems Engineering, Economics, Statistics, Mathematics, or a similar background. 3+ years of experience in data analysis and interpretation (marketing/business/product). High proficiency in SQL. Experience with data visualization of large data sets using BI systems (Qlik Sense, Sisense, Tableau, Looker, etc.). Experience working with data warehouse/data lake tools like Athena, Redshift, Snowflake, or BigQuery. Knowledge of Python: an advantage. Experience building ETL processes: an advantage. Fluent in English, both written and spoken: a must.

Posted 3 weeks ago

Apply

2.0 years

11 - 17 Lacs

Pune

Work from Office

The Role: A Technical Data Analyst is responsible for performing data migration, data conversion, and data validation projects for Addepar clients using existing tools and established processes. The ideal candidate will have a good understanding of financial portfolio data, a foundational level of Python programming skills, exceptional communication skills, and the ability to deliver results in alignment with project deadlines while meeting high quality standards.

What You'll Do: Convert, migrate, and validate data from external or internal sources using existing tooling with defined processes and workflows. Complete data projects on time, meeting project deadlines while adhering to high quality standards. Coordinate across project teams, communicating regular status updates for assigned data projects while effectively setting expectations. Run Python ETL scripts and, at times, modify, fix, or debug them as needed (a small sketch follows below). Raise key issues to project team members and senior leadership as necessary. Prioritize and context-switch effectively to complete simultaneous projects, seeing each through to the finish line. Adhere to project management standard processes. Identify and drive opportunities to improve current processes, workflows, and tools to increase efficiency and automation.

Who You Are: Minimum 2+ years' experience working in technology and finance. Experience working with colleagues spread across multiple global locations. Must have domain experience in wealth/portfolio/investment management. Proficient in the Python programming language and well versed in ETL concepts. Understands financial markets and has experience with financial products and portfolio data. Excellent written and oral communication skills, with the ability to convey complex information in an understandable manner. Solution-oriented with a passion for problem solving. Highly organized, with close attention to detail and a drive to make processes more efficient. Positive attitude, good work ethic, proactive, and a highly contributing teammate. Independent, adaptable, and able to work with minimal supervision. Proven ability to manage expectations and provide regular updates to the project team.

P.S. This role will require you to work from the Pune office 3 days a week in the UK shift, i.e. 2:30 PM to 11:30 PM IST (hybrid role).
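For flavor, a small sketch of the kind of Python ETL/validation script this role runs and debugs; the file layout and column names are hypothetical assumptions:

```python
# Convert a portfolio extract: normalize types, validate, deduplicate, write out.
import pandas as pd

def convert_positions(src_path: str, out_path: str) -> None:
    df = pd.read_csv(src_path)
    # Normalize the incoming extract's types.
    df["market_value"] = pd.to_numeric(df["market_value"], errors="coerce")
    df["as_of_date"] = pd.to_datetime(df["as_of_date"], errors="coerce")
    # Fail fast on unparseable rows before handing data to downstream tooling.
    if df["market_value"].isna().any() or df["as_of_date"].isna().any():
        raise ValueError("unparseable rows in source extract")
    df = df.drop_duplicates(subset=["account_id", "security_id", "as_of_date"])
    df.to_csv(out_path, index=False)
```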

Posted 3 weeks ago

Apply

3.0 - 8.0 years

10 - 18 Lacs

Chennai, Bengaluru

Hybrid

We are looking for ETL Developers for our US-based insurance customer project.

Required Skills: A minimum of 3-8 years of experience in Informatica / SSIS. Insurance domain experience is a plus.
Secondary Skills: Oracle Database experience.
Optional: Attitude to take on challenging tasks and risk mitigation planning. Good logical and analytical skills, ability to think out of the box, strong technical skills. Good communication skills.
Other Tech Stacks: Informatica Data Integration, ETL process, workflow automation, graphical tools knowledge, Microsoft SQL Server, Oracle Database 12c/18c, framework designing.

Interested applicants, please respond with your resume to mohanaprasath.ponnusamy@kumaran.com

Posted 3 weeks ago

Apply